Can we reduce the rank of a high-rank LoRA? #2095
sayakpaul started this conversation in Show and tell
Can we lower the rank of a high-rank LoRA to reduce memory requirements, and bring multiple LoRAs to the same rank to enable torch.compile()?
Turns out we can, with good ol' random projections and SVD!
[Figure: Original vs. Reduced Rank 🐶]
Code, more results, and a detailed write-up are at: https://huggingface.co/sayakpaul/lower-rank-flux-lora
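
The linked repo has the full pipeline; below is a minimal sketch of the idea, assuming the usual LoRA factor shapes (A: rank × in_features, B: out_features × rank) and ignoring any alpha/rank scaling. It uses `torch.svd_lowrank`, which performs a randomized (random-projection-based) SVD, to re-factor the merged update at a lower rank. The function name `reduce_lora_rank` and the tensor names are illustrative, not from the repo.

```python
import torch

def reduce_lora_rank(lora_A: torch.Tensor, lora_B: torch.Tensor, new_rank: int, niter: int = 4):
    """Approximate the LoRA update B @ A with a lower-rank factor pair.

    lora_A: (rank, in_features)   -- "down" projection
    lora_B: (out_features, rank)  -- "up" projection
    Returns (new_A, new_B) with shapes (new_rank, in_features) and (out_features, new_rank).
    """
    delta_w = lora_B @ lora_A  # merged update, rank <= original rank
    # Randomized SVD (random projections under the hood) of the merged update.
    U, S, V = torch.svd_lowrank(delta_w, q=new_rank, niter=niter)
    # Split the singular values evenly between the two new factors.
    new_B = U * S.sqrt()        # (out_features, new_rank)
    new_A = (V * S.sqrt()).T    # (new_rank, in_features)
    return new_A, new_B

# Quick mechanics check on random factors. Note: random factors have a flat
# singular spectrum, so this error is pessimistic; trained LoRA updates tend
# to have decaying spectra and compress much better.
rank, in_features, out_features = 128, 3072, 3072
A = torch.randn(rank, in_features) / rank
B = torch.randn(out_features, rank)
new_A, new_B = reduce_lora_rank(A, B, new_rank=16)
rel_err = torch.linalg.norm(B @ A - new_B @ new_A) / torch.linalg.norm(B @ A)
print(f"relative reconstruction error at rank 16: {rel_err:.4f}")
```

In practice one would apply this per layer across a LoRA state dict and pick a common target rank, so every LoRA ends up with matching shapes.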