Replies: 1 comment 7 replies
- Hey, do you have your config available to share? Did you set
- My base model (llama3.2-3b-instruct) is in bfloat16, but the LoRA adapter ends up in float32 after training, which causes a dtype mismatch when merging the adapter into the base model weights. Is there a way to get the LoRA adapter weights in bfloat16?
  I train the model using FSDP + LoRA.
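  In case it helps, here is a minimal sketch of casting the adapter to bfloat16 before merging. It assumes the adapter was saved in Hugging Face PEFT format; the model ID and paths are placeholders, not taken from the original setup.

  ```python
  import torch
  from transformers import AutoModelForCausalLM
  from peft import PeftModel

  # Load the base model directly in bfloat16.
  base = AutoModelForCausalLM.from_pretrained(
      "meta-llama/Llama-3.2-3B-Instruct",  # placeholder model ID
      torch_dtype=torch.bfloat16,
  )

  # Attach the trained LoRA adapter (stored in float32).
  model = PeftModel.from_pretrained(base, "path/to/lora-adapter")  # placeholder path

  # Cast the adapter weights down to bfloat16 so the merge happens
  # entirely in the base model's dtype.
  model = model.to(torch.bfloat16)

  # Fold the LoRA weights into the base weights and drop the adapter modules.
  merged = model.merge_and_unload()
  merged.save_pretrained("path/to/merged-model")  # placeholder path
  ```

  Alternatively, you could upcast the base model to float32, merge there, and cast the merged model back to bfloat16 at the end; that avoids rounding the low-rank update before it is added to the base weights.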