
[LoRA] minor fix for load_lora_weights() for Flux and a test #11595


Merged
sayakpaul merged 4 commits into main from flux-delete-adapters on May 22, 2025

Conversation

sayakpaul
Member

What does this PR do?

Fixes #11592

Additionally, adds a test to check that the following flow of operations works as expected:

load_lora_weights() -> delete_adapters() -> load_lora_weights().
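
A minimal sketch of that flow, assuming a Flux pipeline; the LoRA repository id and adapter name below are hypothetical placeholders, not the checkpoints used by the actual test:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# Load a LoRA, delete it, then load it again; the second load is the step
# that regressed in #11592 and is exercised by the new test.
pipe.load_lora_weights("user/some-flux-lora", adapter_name="my_lora")  # hypothetical repo id
pipe.delete_adapters("my_lora")
pipe.load_lora_weights("user/some-flux-lora", adapter_name="my_lora")
```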

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@bghira
Contributor

bghira commented May 21, 2025

Hi, thank you so much for the fast fix. I have verified it resolved the issue.

@@ -2545,14 +2545,13 @@ def _maybe_expand_lora_state_dict(cls, transformer, lora_state_dict):
        if unexpected_modules:
            logger.debug(f"Found unexpected modules: {unexpected_modules}. These will be ignored.")

        is_peft_loaded = getattr(transformer, "peft_config", None) is not None
Collaborator

Okay to merge. Just a question. Why is this condition causing the issue?

Member Author (sayakpaul)

delete_adapters() doesn't permanently delete the adapter layers. As a result, even if there's no peft_config in the base model, the base model state dict still has the base_layer substring present inside of it. Cc: @BenjaminBossan is this expected?

For the case this PR fixes (#11592), we call load_lora_weights() -> delete_adapters() -> load_lora_weights(), so this change resolves the problem.
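
A hedged illustration of that state, continuing the pipeline sketch from the PR description above (repo id and adapter name are hypothetical):

```python
pipe.load_lora_weights("user/some-flux-lora", adapter_name="my_lora")  # hypothetical repo id
pipe.delete_adapters("my_lora")

# peft_config is gone after the only adapter is deleted...
print(getattr(pipe.transformer, "peft_config", None))  # None

# ...but the injected LoRA wrappers remain, so "base_layer" still appears
# in the transformer's state dict keys.
print(any("base_layer" in k for k in pipe.transformer.state_dict()))  # True
```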

Member (BenjaminBossan)

> even if there's no peft_config in the base model, the base model state dict still has the base_layer substring present inside of it

Yes, that's expected. PEFT layers wrap the base layer. When deleting adapters, even the last, the PEFT layer does not "unwrap" itself. The only way to achieve that is via peft_model.unload(). Since diffusers does not make use of PeftModel, if this is desired, it would need to be re-implemented.
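
A small standalone PEFT sketch of that behaviour (not diffusers code; the toy model, rank, and target module name are arbitrary, and delete_adapter/unload are called directly on the PEFT wrapper):

```python
import torch
from peft import LoraConfig, get_peft_model

base = torch.nn.Sequential(torch.nn.Linear(8, 8))
peft_model = get_peft_model(base, LoraConfig(r=4, target_modules=["0"]))

# Deleting the only adapter does not unwrap the LoRA layer: the original
# Linear is still stored as "base_layer" inside the wrapper module.
peft_model.base_model.delete_adapter("default")
print(any("base_layer" in k for k in peft_model.state_dict()))  # True

# unload() is what actually restores the plain base modules.
unwrapped = peft_model.unload()
print(any("base_layer" in k for k in unwrapped.state_dict()))   # False
```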

@sayakpaul sayakpaul merged commit a5f4cc7 into main May 22, 2025
33 checks passed
@sayakpaul sayakpaul deleted the flux-delete-adapters branch May 22, 2025 10:15
@DN6 DN6 added the roadmap Add to current release roadmap label Jun 5, 2025
@DN6 DN6 moved this from In Progress to Done in Diffusers Roadmap 0.34 Jun 5, 2025

Successfully merging this pull request may close these issues.

0.33.X FluxLoraLoaderMixin delete_adapters Regression
5 participants