Replies: 1 comment
- @zrzrzrzrzrzrzrzr, any guidance would be appreciated, thanks!
- The official fine-tuning page gives the following instructions for using a fine-tuned model: "Note that for LORA and P-TuningV2 we do not merge the trained model; instead, the path of the fine-tuned model is recorded in adapter_config.json. If the location of your original model changes, you should update the base_model_name_or_path entry in adapter_config.json." However, I could not find any adapter_config.json file in the official project I downloaded. Does that mean the checkpoint under the output directory produced by fine-tuning is already merged? If not, how do I obtain the merged model? Do I need to use the merge procedure that WE-IOT described in the discussion area? Thanks!
https://zhuanlan.zhihu.com/p/683583816#showWechatShareTip?utm_source=wechat_session&utm_medium=social&s_r=0
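For context on what "merging" means here: a LoRA adapter stores two low-rank matrices A (r x k) and B (d x r) per target layer, and merging folds the update W' = W + (alpha / r) * B @ A back into the base weight, so inference no longer needs the adapter files. The sketch below illustrates that arithmetic on a toy 2x2 weight in plain Python; the function names and toy values are illustrative only, not part of any official project (in practice, a PEFT-style adapter is typically merged with `merge_and_unload()`).

```python
# Toy illustration of merging a LoRA adapter into a base weight:
# merged weight W' = W + (alpha / r) * B @ A.
# All names and values here are illustrative, not from the official repo.

def matmul(X, Y):
    """Plain-Python matrix multiply for the toy example."""
    rows, inner, cols = len(X), len(Y), len(Y[0])
    return [[sum(X[i][t] * Y[t][j] for t in range(inner)) for j in range(cols)]
            for i in range(rows)]

def merge_lora(W, A, B, alpha, r):
    """Return W + (alpha / r) * (B @ A), the merged weight matrix."""
    delta = matmul(B, A)          # d x k low-rank update
    scale = alpha / r             # standard LoRA scaling factor
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# 2x2 identity base weight, rank-1 adapter (r = 1), alpha = 2.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]            # r x k = 1 x 2
B = [[0.5], [0.25]]         # d x r = 2 x 1
merged = merge_lora(W, A, B, alpha=2.0, r=1)
print(merged)  # [[2.0, 1.0], [0.5, 1.5]]
```

After a merge like this is applied to every adapted layer, the resulting weights can be saved as a standalone checkpoint, which is why a merged model no longer needs adapter_config.json at load time.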