Add a LoRA fine-tuning Colab
PiperOrigin-RevId: 726086542
Conchylicultor authored and The gemma Authors committed Feb 12, 2025
1 parent 05376a4 commit 69588dc
Showing 8 changed files with 991 additions and 56 deletions.
5 changes: 3 additions & 2 deletions README.md
Original file line number Diff line number Diff line change
@@ -44,11 +44,12 @@ To download the model weights. See

## Examples

Our documentation contain various Colabs and tutorial for:
Our documentation contains various Colabs and tutorials, including:

* [Sampling](https://gemma-llm.readthedocs.io/en/latest/colab_sampling.html)
* [Fine-tuning](https://gemma-llm.readthedocs.io/en/latest/colab_finetuning.html)
* [LoRA](https://gemma-llm.readthedocs.io/en/latest/lora.html)
* [LoRA](https://gemma-llm.readthedocs.io/en/latest/colab_lora_sampling.html)
* ...

Additionally, our
[examples/](https://github.com/google-deepmind/gemma/tree/main/examples) folder
979 changes: 979 additions & 0 deletions colabs/lora_finetuning.ipynb

Large diffs are not rendered by default.

6 changes: 3 additions & 3 deletions colabs/lora.ipynb → colabs/lora_sampling.ipynb
@@ -6,11 +6,11 @@
"id": "qKlB5QTDIV6S"
},
"source": [
"# LoRA example\n",
"# LoRA (Sampling)\n",
"\n",
"[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google-deepmind/gemma/blob/main/colabs/lora.ipynb)\n",
"[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google-deepmind/gemma/blob/main/colabs/lora_sampling.ipynb)\n",
"\n",
"Example on using LoRA with Gemma (for both training and inference)."
"Example on using LoRA with Gemma (for inference). For an example of fine-tuning with LoRA, see [LoRA finetuning](https://github.com/google-deepmind/gemma/blob/main/docs/lora_finetuning.md) example."
]
},
{
3 changes: 2 additions & 1 deletion docs/conf.py
@@ -35,7 +35,8 @@
# `'colab/finetuning.ipynb'` as output.
includes_paths={
'colabs/finetuning.ipynb': 'colab_finetuning.ipynb',
'colabs/lora.ipynb': 'colab_lora.ipynb',
'colabs/lora_sampling.ipynb': 'colab_lora_sampling.ipynb',
'colabs/lora_finetuning.ipynb': 'colab_lora_finetuning.ipynb',
'colabs/sampling.ipynb': 'colab_sampling.ipynb',
'colabs/tokenizer.ipynb': 'colab_tokenizer.ipynb',
'gemma/peft/README.md': 'peft.md',
3 changes: 2 additions & 1 deletion docs/index.md
@@ -7,7 +7,8 @@
colab_finetuning
colab_sampling
colab_tokenizer
lora
colab_lora_sampling
colab_lora_finetuning
peft
checkpoints
49 changes: 0 additions & 49 deletions docs/lora.md

This file was deleted.

1 change: 1 addition & 0 deletions docs/lora_finetuning.md
@@ -0,0 +1 @@
# LoRA (Finetuning)
1 change: 1 addition & 0 deletions docs/lora_sampling.md
@@ -0,0 +1 @@
# LoRA (Sampling)
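For context, the technique these Colabs cover (LoRA, low-rank adaptation) keeps the pretrained weight frozen and learns only a low-rank delta `B @ A`. The sketch below is a minimal numpy illustration of that idea only; it does not use the gemma or `gemma.peft` API, and all variable names are hypothetical.

```python
import numpy as np

# Illustrative LoRA-style layer: frozen weight W plus a trainable
# low-rank adapter of rank r << min(d_in, d_out).
rng = np.random.default_rng(0)
d_in, d_out, r = 16, 8, 2

W = rng.normal(size=(d_in, d_out))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection, zero-init

def lora_forward(x):
    # Base path plus the low-rank adapter path.
    return x @ W + (x @ A.T) @ B.T

x = rng.normal(size=(4, d_in))
# With B initialized to zero, the adapter is an exact no-op at the
# start of fine-tuning, so sampling matches the base model.
assert np.allclose(lora_forward(x), x @ W)
```

Only `A` and `B` (`r * (d_in + d_out)` parameters here) would be trained, which is what makes LoRA fine-tuning cheap relative to updating `W` itself.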
