I tried to run LLaMA using EasyLM, following the README for llama. The first step is converting the raw LLaMA parameters:
```shell
python -m EasyLM.models.llama.convert_torch_to_easylm.py \
    --checkpoint_dir='path/to/torch/llama/checkpoint' \
    --output_dir='path/to/output/easylm/checkpoint' \
    --streaming=True
```
The arg `output_dir` does not appear in `convert_torch_to_easylm.py`; it should be `output_file` now, as the code shows. I wonder if the doc is outdated?
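For reference, assuming `--output_file` is indeed the current flag name (per the converter's code), a working invocation would presumably look like the following; the paths are placeholders, and note that with `python -m` the module path is normally given without the `.py` suffix:

```shell
# Convert a raw PyTorch LLaMA checkpoint to EasyLM's format.
# Assumes the flag was renamed from --output_dir to --output_file.
python -m EasyLM.models.llama.convert_torch_to_easylm \
    --checkpoint_dir='path/to/torch/llama/checkpoint' \
    --output_file='path/to/output/easylm/checkpoint' \
    --streaming=True
```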
Yeah, the doc is a bit outdated because of a lot of changes I made recently. I will try to update it more frequently.