Wondering: why were global variables chosen here (`global model, tokenizer` ...)? #824
In the openai API part of the code, there is the following line, which makes the model and tokenizer global:

`global model, tokenizer`

While using the server, if an input exceeds the maximum length, the model "crashes", and nothing works until the model is restarted.

I then disabled the global declaration:

`#global model, tokenizer`

Now an over-length input seems to crash only the current conversation, without affecting subsequent new ones. So what was the reason for using global variables in the original code? As far as I can tell, a conversation does not modify the underlying model, so why make it global? If I drop the global variables, are there any potential side effects?
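For context, here is a minimal sketch of how Python's `global` statement behaves in this kind of server (the layout is an assumption about the real openai API code, and `reload_model` is a hypothetical helper): `global` is only required when a function rebinds a module-level name; plain reads resolve to module scope automatically.

```python
# Minimal sketch: `global` affects name REBINDING only, not reading.
# Assumed layout: module-level model/tokenizer loaded once at startup.

model = None       # loaded once at startup in the real server
tokenizer = None

def handle_request(prompt: str) -> str:
    # Reading module-level names needs no `global` declaration:
    # Python resolves `model` and `tokenizer` to module scope.
    return f"would run {model!r} on {prompt!r}"

def reload_model(path: str) -> None:
    # Hypothetical helper: rebinding the module-level name DOES
    # require `global`; without it, `model = ...` would just create
    # a local variable and the shared one would stay unchanged.
    global model
    model = f"model loaded from {path}"
```

So if the request handler only ever reads `model` and `tokenizer`, declaring them `global` is redundant but harmless; the declaration only matters on a code path that reassigns them.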
Replies: 1 comment

Just tried it again: even without the global declaration, later conversations still end up being affected...
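One plausible explanation for this follow-up observation (an assumption, not confirmed in the thread): `global` controls name rebinding, not object sharing, so every request still touches the same model object, and state damaged by one failed request remains visible to the next. A minimal sketch:

```python
# Minimal sketch: handlers share ONE model object whether or not
# `global` is declared, so in-place damage from a failed request
# persists into later requests. The dict and the length check are
# stand-ins for the real model and its input-length limit.

model = {"state": "healthy"}

def handler(prompt: str) -> str:
    if len(prompt) > 10:                # stand-in for the length limit
        model["state"] = "corrupted"    # in-place mutation, shared
        raise RuntimeError("input too long")
    return f"ok, model is {model['state']}"

try:
    handler("x" * 100)    # first request "crashes"
except RuntimeError:
    pass
print(handler("hi"))      # prints: ok, model is corrupted
```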