How can I increase the maximum output length per generation for a model fine-tuned with P-Tuning (PT)? #538
Replies: 2 comments
- Just use the official inference.py: change the model path and the path to the fine-tuned PT checkpoint, and it will work.
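For reference, below is a minimal sketch of what such an inference script usually looks like for a ChatGLM-style model with a P-Tuning v2 checkpoint. The paths, the `pre_seq_len` value, and the use of `max_length` in `model.chat` to allow longer outputs are illustrative assumptions; adjust them to match the repo's official inference.py and your own training settings.

```python
# Minimal sketch (assumptions: ChatGLM-style model, P-Tuning v2 prefix checkpoint,
# illustrative paths). Adjust to match the repo's official inference.py.
import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

MODEL_PATH = "THUDM/chatglm-6b"             # base model path (assumed)
CHECKPOINT_PATH = "output/checkpoint-3000"  # fine-tuned PT checkpoint dir (assumed)

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
# pre_seq_len must match the value used during P-Tuning (128 is assumed here)
config = AutoConfig.from_pretrained(MODEL_PATH, trust_remote_code=True, pre_seq_len=128)
model = AutoModel.from_pretrained(MODEL_PATH, config=config, trust_remote_code=True)

# Load only the prefix-encoder weights from the fine-tuned checkpoint
prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin"))
new_prefix_state_dict = {
    k[len("transformer.prefix_encoder."):]: v
    for k, v in prefix_state_dict.items()
    if k.startswith("transformer.prefix_encoder.")
}
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

model = model.half().cuda().eval()

# max_length bounds prompt + response tokens; raise it to allow longer outputs
response, history = model.chat(tokenizer, "你好", history=[], max_length=4096)
print(response)
```

If outputs are still truncated, check whether the script caps `max_length` (or a similar generation argument) somewhere else before the `model.chat` call.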
- This is discussed in #253.
- How can the PT fine-tuned model be deployed locally and called through an API? Could you share a demo?
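No official demo is given in this thread, but a local HTTP wrapper along the following lines is a common approach. This is a minimal sketch assuming FastAPI/uvicorn, a hypothetical `/chat` endpoint, and the same model/tokenizer loading as in the inference sketch above; it is not the project's official api.py.

```python
# Minimal local API sketch (assumptions: FastAPI + uvicorn, illustrative
# endpoint and schema). Load the PT checkpoint the same way as in the
# inference sketch above before serving.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoModel, AutoTokenizer
import uvicorn

MODEL_PATH = "THUDM/chatglm-6b"  # assumed base model path

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True)
# ...load the P-Tuning prefix-encoder weights here, as in the sketch above...
model = model.half().cuda().eval()

app = FastAPI()

class ChatRequest(BaseModel):
    prompt: str
    max_length: int = 2048  # raise per request to allow longer outputs

@app.post("/chat")
def chat(req: ChatRequest):
    response, _history = model.chat(
        tokenizer, req.prompt, history=[], max_length=req.max_length
    )
    return {"response": response}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

A client can then call it with, for example, `curl -X POST http://localhost:8000/chat -H "Content-Type: application/json" -d '{"prompt": "你好"}'`.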