
Commit a451c0e (parent: 37c9697)

removing redundant requires_grad = False (#10628)

We already set the UNet to requires_grad=False at line 506.

Co-authored-by: Aryan <aryan@huggingface.co>

1 file changed: 0 additions, 4 deletions

examples/text_to_image/train_text_to_image_lora.py
@@ -515,10 +515,6 @@ def main():
     elif accelerator.mixed_precision == "bf16":
         weight_dtype = torch.bfloat16
 
-    # Freeze the unet parameters before adding adapters
-    for param in unet.parameters():
-        param.requires_grad_(False)
-
     unet_lora_config = LoraConfig(
         r=args.rank,
         lora_alpha=args.rank,
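The removed loop was redundant because torch's nn.Module.requires_grad_(False) already freezes every parameter a module owns, which is what the script does to the UNet earlier (the line 506 cited in the commit message). A minimal runnable sketch of why the loop is a no-op, using a small stand-in module rather than the actual diffusers UNet:

import torch.nn as nn

# Stand-in for the UNet; any nn.Module behaves the same way.
unet = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4))

# One call on the module freezes every parameter it owns
# (this is what the script already does earlier, per the commit message).
unet.requires_grad_(False)

# The removed block repeated the same work parameter by parameter,
# so it changed nothing:
for param in unet.parameters():
    param.requires_grad_(False)  # already False

assert not any(p.requires_grad for p in unet.parameters())

The LoRA adapter weights added afterwards via LoraConfig are created with requires_grad=True, so only the adapter parameters train while the base UNet stays frozen either way.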
