
Commit cc59505

[training docs] smol update to README files (#11616)

add comment to install prodigy

1 parent 5f5d02f

File tree

3 files changed: +4 -2 lines changed


examples/advanced_diffusion_training/README.md
Lines changed: 1 addition & 0 deletions

@@ -128,6 +128,7 @@ You can also load a dataset straight from by specifying it's name in `dataset_name`
 Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
 
 - **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+- To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
 - **pivotal tuning**
 - **min SNR gamma**
 
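The note added here is the only new requirement. Putting it together with the surrounding instructions, a minimal sketch of the workflow might look like this; the script name and every flag other than `--optimizer` are assumptions based on the diffusers advanced-training examples, not part of this commit:

```bash
# Prodigy is not a diffusers dependency; install it separately, as the new note says
pip install prodigyopt

# Hypothetical launch of the advanced SDXL LoRA script with the adaptive optimizer;
# Prodigy tunes its own step size, so the learning rate is conventionally set to 1.0
accelerate launch train_dreambooth_lora_sdxl_advanced.py \
  --pretrained_model_name_or_path="stabilityai/stable-diffusion-xl-base-1.0" \
  --optimizer="prodigy" \
  --learning_rate=1.0
```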

examples/advanced_diffusion_training/README_flux.md
Lines changed: 2 additions & 1 deletion

@@ -143,7 +143,8 @@ Now we'll simply specify the name of the dataset and caption column (in this case
 You can also load a dataset straight from by specifying it's name in `dataset_name`.
 Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
 
-- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+- To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
 - **pivotal tuning**
 
 ### Example #1: Pivotal tuning

examples/dreambooth/README_flux.md
Lines changed: 1 addition & 1 deletion

@@ -134,7 +134,7 @@ Note also that we use PEFT library as backend for LoRA training, make sure to have
 Prodigy is an adaptive optimizer that dynamically adjusts the learning rate learned parameters based on past gradients, allowing for more efficient convergence.
 By using prodigy we can "eliminate" the need for manual learning rate tuning. read more [here](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers).
 
-to use prodigy, specify
+to use prodigy, first make sure to install the prodigyopt library: `pip install prodigyopt`, and then specify -
 ```bash
 --optimizer="prodigy"
 ```
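The same two-step pattern applies to the Flux DreamBooth script this hunk documents. A minimal sketch, assuming the `train_dreambooth_lora_flux.py` entry point and standard diffusers training flags; only `--optimizer="prodigy"` appears in this commit:

```bash
# Step 1: install the optimizer package, as the updated README now instructs
pip install prodigyopt

# Step 2: pass the optimizer flag at launch; with Prodigy the learning rate is
# typically left at 1.0 because the optimizer adapts it from past gradients
accelerate launch train_dreambooth_lora_flux.py \
  --pretrained_model_name_or_path="black-forest-labs/FLUX.1-dev" \
  --instance_prompt="a photo of sks dog" \
  --optimizer="prodigy" \
  --learning_rate=1.0
```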
