What's the difference between DreamBooth LoRA and traditional LoRA? #12942
Original post:

I see a lot of examples using DreamBooth LoRA training code. What's the difference between this and traditional LoRA training? Can this DreamBooth LoRA training code be adapted to standard SFT LoRA code? Does disabling with_prior_preservation return normal LoRA training?

Replies: 4 comments
- Yes, I have the same question!
- Hi, are you asking this after reading the DreamBooth paper? Prior preservation is optional, and it does what the name says: it preserves the model's prior knowledge. For example, if you're training on a special breed of dog, you use prior preservation so that not every generated dog turns into the breed you trained on; if you don't care about that, which is common, you don't have to use prior preservation.

  DreamBooth is a very different kind of training target than normal SFT training, LoRA or not. The main differences are that it uses a very small dataset and relies on instance and class tokens, which is why you sometimes see prompts along the lines of "a photo of sks dog".

  A shorter answer to your questions, just in case:
  Yes (the code can be adapted)
  No (disabling with_prior_preservation does not give you normal LoRA training)
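To make the prior-preservation point above concrete, here is a minimal sketch of the kind of objective a DreamBooth LoRA script computes when prior preservation is enabled. It is not the actual diffusers training code; the helper name, tensor shapes, and the prior_loss_weight value of 1.0 are illustrative. The idea is that the batch is the concatenation of instance examples and class ("prior") examples, and the prior half gets its own weighted reconstruction loss so the model keeps being able to generate ordinary members of the class.

```python
# Minimal sketch (not the actual diffusers script) of the DreamBooth
# prior-preservation objective.
import torch
import torch.nn.functional as F

def dreambooth_loss(model_pred: torch.Tensor,
                    target: torch.Tensor,
                    with_prior_preservation: bool = True,
                    prior_loss_weight: float = 1.0) -> torch.Tensor:
    if with_prior_preservation:
        # Assumes the first half of the batch holds instance examples
        # ("a photo of sks dog") and the second half holds class examples
        # generated by the base model ("a photo of a dog").
        model_pred, model_pred_prior = torch.chunk(model_pred, 2, dim=0)
        target, target_prior = torch.chunk(target, 2, dim=0)

        instance_loss = F.mse_loss(model_pred, target)
        # The prior term keeps the model able to draw generic dogs,
        # not only the new subject.
        prior_loss = F.mse_loss(model_pred_prior, target_prior)
        return instance_loss + prior_loss_weight * prior_loss

    # Without prior preservation the objective is just the usual
    # denoising MSE, computed on the tiny instance dataset.
    return F.mse_loss(model_pred, target)

# Toy usage with random tensors standing in for noise predictions/targets.
pred = torch.randn(4, 4, 64, 64)  # 2 instance + 2 prior samples
tgt = torch.randn(4, 4, 64, 64)
print(dreambooth_loss(pred, tgt).item())
```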
- Thank you very much for your reply.
- This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
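A closing note on the original question: the LoRA part (the injected low-rank adapter weights) is the same in both cases; what differs is the data and prompt setup. The sketch below is hypothetical, with illustrative file names and field names rather than the exact diffusers CLI flags, but it shows why simply turning off prior preservation does not recover standard LoRA fine-tuning.

```python
# Hypothetical configuration sketch contrasting the two training styles.
# Field names and file names are illustrative, not exact diffusers flags.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DreamBoothLoRAConfig:
    # A handful of photos of ONE subject, all paired with the same
    # instance prompt built from a rare token plus a class noun.
    instance_images: List[str] = field(
        default_factory=lambda: ["dog_01.jpg", "dog_02.jpg", "dog_03.jpg"])
    instance_prompt: str = "a photo of sks dog"
    # Optional prior preservation: class images sampled from the base model.
    with_prior_preservation: bool = True
    class_prompt: str = "a photo of a dog"
    prior_loss_weight: float = 1.0

@dataclass
class StandardLoRASFTConfig:
    # A larger dataset in which every image has its own caption;
    # no rare token, no class/prior images.
    dataset_name: str = "your-image-caption-dataset"
    caption_column: str = "text"

# Setting with_prior_preservation=False in the first config still trains on a
# few images of one subject with a single fixed instance prompt, which is the
# essence of DreamBooth, so it is not the same as the second setup.
db = DreamBoothLoRAConfig(with_prior_preservation=False)
sft = StandardLoRASFTConfig()
print(db.instance_prompt, "|", sft.dataset_name)
```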