This repository has been archived by the owner on Oct 19, 2024. It is now read-only.

Any solution to support llama2 finetune? #966

Open
LeiWang1999 opened this issue Nov 12, 2023 · 0 comments


@LeiWang1999

Hi all, I noticed that the latest transformers repo doesn't have a Flax version of the Llama model. Are there any solutions to support Llama fine-tuning?

There is a previous PR (#923) that supports Llama model conversion, but it doesn't appear to support GQA (Llama 2).

Thanks for all your suggestions :)
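For context on why GQA matters here: in Llama 2's larger variants, the number of key/value heads is smaller than the number of query heads, so a weight converter or attention implementation written for plain multi-head attention won't line the heads up correctly. Below is a minimal NumPy sketch of the head-sharing idea only; the function name and shapes are illustrative and are not the transformers API:

```python
import numpy as np

def grouped_query_attention(q, k, v, num_heads, num_kv_heads):
    """Toy grouped-query attention (illustrative, not the transformers API).

    num_kv_heads < num_heads; each KV head is shared by a group of
    num_heads // num_kv_heads query heads.
    Shapes: q: (seq, num_heads, head_dim); k, v: (seq, num_kv_heads, head_dim).
    """
    group = num_heads // num_kv_heads
    # Repeat each KV head so it lines up with its group of query heads.
    k = np.repeat(k, group, axis=1)  # -> (seq, num_heads, head_dim)
    v = np.repeat(v, group, axis=1)
    scale = q.shape[-1] ** -0.5
    # Per-head attention scores: (num_heads, seq_q, seq_k).
    scores = np.einsum("qhd,khd->hqk", q, k) * scale
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values, back to (seq, num_heads, head_dim).
    return np.einsum("hqk,khd->qhd", weights, v)

# Example: 8 query heads sharing 2 KV heads (group size 4).
q = np.random.randn(5, 8, 16)
k = np.random.randn(5, 2, 16)
v = np.random.randn(5, 2, 16)
out = grouped_query_attention(q, k, v, num_heads=8, num_kv_heads=2)
print(out.shape)  # (5, 8, 16)
```

Supporting Llama 2 on top of PR #923 would mean handling this KV-head repetition (or grouped indexing) in both the conversion script and the Flax attention module.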
