PEFT integration #27

@ananthsub

Description

Compatibility with https://docs.nvidia.com/nemo-framework/user-guide/latest/nemo-2.0/features/peft.html

  1. Port over walk utilities to run functions over modules: [peft] Port walking utils for peft integration #43
  2. Port over AdapterWrapper and LoRA linear layers and utilities: [peft] Port base adapter wrapper and lora utils #44
  3. Port over ModuleMatcher: [peft] Port module matcher over from NeMo #68
  4. Define the PEFT base class + LoRA to apply PEFT over the whole model: [peft] Define PEFT base class and LoRA transform #71
  5. Integrate into the config container / model initialization: load from pretrained, then apply the PEFT transformation: Integrate peft into training loop #94
  6. Optimize the checkpointing flow: save only adapter params; load only adapter params + optimizer states: Integrate peft into training loop #94
  7. Add DoRA: [peft] Port DoRA support #82
  8. Add Canonical LoRA: [peft] Port canonical LoRA #83
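To make steps 1 and 3 concrete, here is a minimal, hypothetical sketch of the walk-and-replace pattern they describe: a depth-first walk over a module tree that swaps out submodules selected by a name-pattern matcher. The `Module`, `ModuleMatcher`, and `walk` names below are illustrative stand-ins (torch-free, so the snippet is self-contained), not the actual ported API.

```python
# Hypothetical sketch of the "walk utilities" + "ModuleMatcher" steps above.
# A real implementation would operate on torch.nn.Module children; here a
# minimal stand-in tree keeps the example self-contained.
import fnmatch


class Module:
    """Minimal stand-in for an nn.Module tree (children stored by name)."""

    def __init__(self, **children):
        self.children = dict(children)


class Linear(Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.in_features = in_features
        self.out_features = out_features


class ModuleMatcher:
    """Select submodules by glob pattern on their dotted path."""

    def __init__(self, patterns):
        self.patterns = patterns

    def match(self, path, module):
        return any(fnmatch.fnmatch(path, p) for p in self.patterns)


def walk(module, transform, matcher, prefix=""):
    """Depth-first walk; replace each child the matcher selects."""
    for name, child in list(module.children.items()):
        path = f"{prefix}.{name}" if prefix else name
        if matcher.match(path, child):
            module.children[name] = transform(child)
        else:
            walk(child, transform, matcher, prefix=path)
    return module


# Usage: collect (and here, trivially re-insert) every matched projection.
model = Module(
    layers=Module(attn=Module(q_proj=Linear(16, 16), k_proj=Linear(16, 16))),
)
wrapped = []
walk(model, lambda m: (wrapped.append(m) or m), ModuleMatcher(["*.q_proj"]))
```

In a real PEFT pass, `transform` would return an adapter-wrapped copy of the matched layer instead of the layer itself; the walk is what lets LoRA target only, say, attention projections.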

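Steps 2 and 4 revolve around the adapter-wrapper idea: keep the frozen base linear, and add a scaled low-rank update y = W x + (alpha / r) * B (A x), with B zero-initialized so the wrapped module starts out numerically identical to the base. The sketch below (plain Python, hypothetical `LoRALinear`/`lora_a`/`lora_b` names, not the actual ported classes) shows just that contract under these assumptions.

```python
# Hypothetical AdapterWrapper-style LoRA linear:
#   y = W x + (alpha / rank) * B (A x)
# B starts at zero, so the adapter is initially a no-op over the base layer.

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]


class Linear:
    def __init__(self, weight):
        self.weight = weight  # frozen base weight W (out x in)

    def __call__(self, x):
        return matvec(self.weight, x)


class LoRALinear:
    """Wrapper holding the frozen base plus a trainable low-rank delta."""

    def __init__(self, base, lora_a, rank, alpha):
        self.base = base
        self.lora_a = lora_a  # A: rank x in, trainable
        # B: out x rank, zero-initialized so the initial delta vanishes.
        self.lora_b = [[0.0] * rank for _ in range(len(base.weight))]
        self.scale = alpha / rank

    def __call__(self, x):
        delta = matvec(self.lora_b, matvec(self.lora_a, x))
        return [y + self.scale * d for y, d in zip(self.base(x), delta)]


base = Linear([[1.0, 2.0], [3.0, 4.0]])
lora = LoRALinear(base, lora_a=[[0.5, 0.5]], rank=1, alpha=2)
x = [1.0, 1.0]
# With B = 0, lora(x) == base(x) == [3.0, 7.0]
```

This structure is also what makes step 6 cheap: since W is frozen, a checkpoint only needs the small `lora_a`/`lora_b` tensors (plus their optimizer states), not the full base weights.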