Hi. Thanks for the great work.
You provided this instruction: "For using the LLaMA model weights, follow [pyllama](https://github.com/juncongmoo/pyllama) to download the original LLaMA model, then follow [Lit-LLaMA](https://github.com/Lightning-AI/lit-llama) to convert the weights to the Lit-LLaMA format. After this process, please move the lit-llama/ directory under the checkpoints/ directory."
However, when I check the Lit-LLaMA repository, I cannot find any explanation of how to convert the pyllama 7B weights (consolidated.00.pth) into the Lit-LLaMA format. Could you share a link or instructions?
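For reference, the Lit-LLaMA repository ships a conversion script, scripts/convert_checkpoint.py. The flags and directory layout below are assumptions based on the pyllama and Lit-LLaMA READMEs at the time of writing and may differ between versions, so treat this as a sketch rather than the definitive procedure:

```shell
# Sketch of the download + convert flow; paths and flags are assumptions
# and may need adjusting to your lit-llama version.

# 1. Download the original 7B weights with pyllama
python -m llama.download --model_size 7B --folder checkpoints/llama

# 2. From inside the lit-llama repo, convert to the Lit-LLaMA format
python scripts/convert_checkpoint.py \
    --model_size 7B \
    --ckpt_dir checkpoints/llama \
    --output_dir checkpoints/lit-llama

# 3. checkpoints/lit-llama/ now holds the converted weights, as the
#    instructions above expect.
```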
Thank you
Mohammad
@41xu Thanks.
The downloaded LLaMA 7B consolidated.00.pth is around 12 GB, but after converting it to the Lit-LLaMA format it becomes 26 GB. Is this correct?
I downloaded the model from Hugging Face (command below) because python -m llama.download --model_size 7B gets stuck:
wget https://huggingface.co/nyanko7/LLaMA-7B/resolve/main/consolidated.00.pth
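On the size doubling: a plausible explanation (an assumption on my part, not confirmed by the maintainers) is that the original consolidated.00.pth stores weights in fp16 (2 bytes per parameter) while the converted checkpoint is written out in fp32 (4 bytes per parameter), which would roughly double the file size. A back-of-the-envelope check with LLaMA 7B's approximate parameter count:

```python
# Rough checkpoint-size estimate for LLaMA 7B (~6.7B parameters).
# Assumption: the original file is fp16 and the converted one is fp32.
n_params = 6.7e9  # approximate parameter count of LLaMA 7B

fp16_gb = n_params * 2 / 1024**3  # 2 bytes per parameter
fp32_gb = n_params * 4 / 1024**3  # 4 bytes per parameter

print(f"fp16: ~{fp16_gb:.1f} GB, fp32: ~{fp32_gb:.1f} GB")
```

This lands near the observed ~12 GB and ~26 GB, so the doubling is consistent with an fp16-to-fp32 conversion rather than a corrupted download.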
Hi @mmdrahmani
The situation is the same for me. The downloading and converting process made me wonder whether my network had been interrupted; it takes a long time (if I remember correctly).