Conversation

@kozistr (Contributor) commented Dec 21, 2025

Note

Related to kozistr/pytorch_optimizer#462.

LoRA training in Kohya_ss could fail when module dropout was enabled together with pytorch-optimizer optimizers, because their lazy initialization scheme only creates per-parameter optimizer state once a parameter actually receives a gradient. This has been fixed and released in v3.9.0.
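To make the failure mode concrete, here is a minimal sketch (not pytorch-optimizer's actual code; `LazySGD` and its `step` signature are invented for illustration) of lazy per-parameter optimizer state. With module dropout, a dropped module's parameters receive no gradient on a given step, so their state must be created on whichever later step the module is active; an optimizer that assumes state exists for every parameter after the first step would break.

```python
class LazySGD:
    """Toy SGD-with-momentum that initializes per-parameter state lazily,
    i.e. the first time a parameter actually has a gradient."""

    def __init__(self, params, lr=0.1, momentum=0.9):
        self.params = list(params)
        self.lr = lr
        self.momentum = momentum
        self.state = {}  # param index -> {"buf": momentum buffer}

    def step(self, grads):
        # grads[i] is the gradient for params[i], or None if its module
        # was dropped this step (as with module dropout in LoRA training).
        for i, (p, g) in enumerate(zip(self.params, grads)):
            if g is None:
                continue  # dropped module: no gradient, no update, no state
            if i not in self.state:
                # Lazy init: state appears only when a grad first arrives.
                self.state[i] = {"buf": 0.0}
            buf = self.state[i]["buf"] * self.momentum + g
            self.state[i]["buf"] = buf
            self.params[i] = p - self.lr * buf


opt = LazySGD([1.0, 1.0])
opt.step([0.5, None])  # step 0: second module dropped -> no state for it yet
opt.step([0.5, 0.5])   # step 1: state for param 1 is created only now
```

The key point is the `if i not in self.state` check inside `step`: initializing state only once, up front, would miss parameters whose modules happened to be dropped on the first step.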

You can also check out the changelog here!

Thanks to @JHawkley for bringing this up!

@kohya-ss (Owner)

Thank you! This fixes the issue with module_dropout!

@kohya-ss kohya-ss merged commit 94e3dbe into kohya-ss:sd3 Dec 21, 2025
3 checks passed
@kozistr kozistr deleted the deps/pytorch-optimizer branch December 21, 2025 20:45
