
Will it be possible to train the whole model using the 7B LLM? #274

@ElegantLin

Dear authors,

Thanks for your great work. I wonder whether it is possible to fine-tune the whole model, whose LLM is 7B, on a single 80 GB GPU, using settings such as FSDP, bf16, etc.

Thanks!
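For context, here is a minimal sketch of the kind of setup being asked about, assuming PyTorch FSDP with a Hugging Face 7B causal LM. The checkpoint name and every value below are illustrative placeholders, not the authors' configuration:

```python
# Minimal single-GPU FSDP + bf16 sketch (not the authors' training script).
# The checkpoint name is a placeholder assumption.
# Launch with: torchrun --nproc_per_node=1 train.py
import functools

import torch
import torch.distributed as dist
from torch.distributed.fsdp import (
    CPUOffload,
    FullyShardedDataParallel as FSDP,
    MixedPrecision,
    ShardingStrategy,
)
from torch.distributed.fsdp.wrap import size_based_auto_wrap_policy
from transformers import AutoModelForCausalLM

dist.init_process_group("nccl")  # required even for a single process
torch.cuda.set_device(dist.get_rank())

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # placeholder 7B checkpoint
    torch_dtype=torch.bfloat16,
)

# Keep parameters, gradients, and all-reduce/buffer dtypes in bf16.
bf16 = MixedPrecision(
    param_dtype=torch.bfloat16,
    reduce_dtype=torch.bfloat16,
    buffer_dtype=torch.bfloat16,
)

model = FSDP(
    model,
    auto_wrap_policy=functools.partial(
        size_based_auto_wrap_policy, min_num_params=int(1e7)
    ),
    mixed_precision=bf16,
    sharding_strategy=ShardingStrategy.FULL_SHARD,
    # With only one GPU there is nothing to shard across, so extra memory
    # headroom has to come from offloading parameters to CPU (and/or
    # activation checkpointing and gradient accumulation).
    cpu_offload=CPUOffload(offload_params=True),
    device_id=torch.cuda.current_device(),
)
```

As rough back-of-envelope arithmetic, pure-bf16 full fine-tuning of a 7B model stores about 14 GB of parameters, 14 GB of gradients, and roughly 28 GB of bf16 Adam state (around 56 GB before activations), which is why offloading or activation checkpointing is usually needed to fit comfortably in 80 GB.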
