From 975f4ac5de518c75ae0a59b044d5c3ef3a32520e Mon Sep 17 00:00:00 2001
From: linoytsaban
Date: Tue, 27 May 2025 14:35:38 +0300
Subject: [PATCH] add comment to install prodigy

---
 examples/advanced_diffusion_training/README.md      | 1 +
 examples/advanced_diffusion_training/README_flux.md | 3 ++-
 examples/dreambooth/README_flux.md                  | 2 +-
 3 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/examples/advanced_diffusion_training/README.md b/examples/advanced_diffusion_training/README.md
index 6e63b4712fbf..eedb1c96e459 100644
--- a/examples/advanced_diffusion_training/README.md
+++ b/examples/advanced_diffusion_training/README.md
@@ -128,6 +128,7 @@ You can also load a dataset straight from by specifying it's name in `dataset_na
 Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
 
 - **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+  - To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
 - **pivotal tuning**
 - **min SNR gamma**
 
diff --git a/examples/advanced_diffusion_training/README_flux.md b/examples/advanced_diffusion_training/README_flux.md
index c5f66013f5bc..c05fa26cf9de 100644
--- a/examples/advanced_diffusion_training/README_flux.md
+++ b/examples/advanced_diffusion_training/README_flux.md
@@ -143,7 +143,8 @@ Now we'll simply specify the name of the dataset and caption column (in this cas
 You can also load a dataset straight from by specifying it's name in `dataset_name`.
 Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
 
-- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
+  - To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
 - **pivotal tuning**
 
 ### Example #1: Pivotal tuning
diff --git a/examples/dreambooth/README_flux.md b/examples/dreambooth/README_flux.md
index a0bd8cc1bdc0..aa43b00fafb3 100644
--- a/examples/dreambooth/README_flux.md
+++ b/examples/dreambooth/README_flux.md
@@ -134,7 +134,7 @@ Note also that we use PEFT library as backend for LoRA training, make sure to ha
 Prodigy is an adaptive optimizer that dynamically adjusts the learning rate learned parameters based on past gradients, allowing for more efficient convergence.
 By using prodigy we can "eliminate" the need for manual learning rate tuning. read more [here](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers).
 
-to use prodigy, specify
+to use prodigy, first make sure to install the prodigyopt library: `pip install prodigyopt`, and then specify:
 ```bash
 --optimizer="prodigy"
 ```
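
For reference, here is a minimal sketch of how the step this patch documents fits into a training run, assuming the existing `train_dreambooth_lora_flux.py` script and the flags already shown in these READMEs; the data directory, output directory, prompt, and hyperparameter values below are illustrative placeholders rather than part of the patch:

```bash
# Install the Prodigy optimizer package (the step this patch adds to the docs).
pip install prodigyopt

# Illustrative launch command; paths, prompt, and hyperparameter values are placeholders.
accelerate launch train_dreambooth_lora_flux.py \
  --pretrained_model_name_or_path="black-forest-labs/FLUX.1-dev" \
  --instance_data_dir="dog" \
  --output_dir="trained-flux-lora" \
  --instance_prompt="a photo of sks dog" \
  --resolution=512 \
  --train_batch_size=1 \
  --optimizer="prodigy" \
  --learning_rate=1.0 \
  --max_train_steps=500
```

Because Prodigy adapts its own step size from past gradients, `--learning_rate` acts as a multiplier on the adapted step rather than a hand-tuned rate, which is why it is commonly left at 1.0 when using Prodigy.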