Loss is always 0 during LoRA training #921
Replies: 2 comments
-
I ran into this before. It is a version problem with the peft library; peft needs to be version 0.7.1.
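A minimal sketch of applying the fix suggested above, assuming a standard pip environment (the exact environment and any extra dependency constraints of ChatGLM3 are not stated in this thread):

```shell
# Pin peft to 0.7.1, the version reported in this thread to fix the zero-loss issue
pip install peft==0.7.1

# Verify which version actually ended up installed
python -c "import peft; print(peft.__version__)"
```

If other packages in the environment require a newer peft, pip will report the conflict; in that case a separate virtual environment for the ChatGLM3 finetuning scripts is the safer route.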
-
OK, after I re-pinned the PEFT library to that version as well, the loss during training returned to normal.
-
Why is the loss stuck at 0.0 from the second step onward when I train with LoRA?