About Connect Custom APIs #283
Unanswered
lhx20011226 asked this question in Q&A

I have a Qianwen (Qwen) large language model deployed on an internal Linux server. After adding it under "Connect Custom APIs," the app still only calls the DeepSeek model that runs locally on my Windows 10 machine via Ollama; the Qwen model on the server is never called. What should I do?
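Before changing anything in the app, it can help to confirm that the Windows 10 machine can actually reach the Linux server at all. Below is a minimal sketch in Python, assuming the Qwen deployment exposes an OpenAI-compatible HTTP API; the address, port, and API key are placeholders, not values from this thread.

```python
import requests

# Placeholders: substitute your own server address, port, and key.
BASE_URL = "http://<linux-server-ip>:8020/v1"  # OpenAI-compatible base URL, ending in /v1
API_KEY = "sk-placeholder"                     # many local deployments ignore the key entirely

# Ask the server which models it exposes. If this call times out or is refused,
# the problem is network reachability (firewall, VPN, wrong IP or port),
# not the "Connect Custom APIs" configuration in the app.
resp = requests.get(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```

If this request succeeds, note the exact model id it returns; that is the name to enter in the app alongside the base URL.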
Replies: 1 comment
I don't believe you have the server connected correctly. It should be "http://10.8.10.9:8020/v1", not "http://10.8.10.9:8020/v1/chat/co…". After that it should work (assuming the URL and model name are correct).
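For context, this is roughly why the base URL has to stop at /v1: an OpenAI-compatible client appends /chat/completions to the base URL itself on every request. A hedged sketch using the openai Python package; the model name and API key below are placeholders, and it assumes the server speaks the OpenAI-compatible protocol.

```python
from openai import OpenAI  # pip install openai

# The base URL ends at /v1. The client adds /chat/completions per request,
# so a base URL that already contains /chat/completions would produce a broken path.
client = OpenAI(
    base_url="http://10.8.10.9:8020/v1",
    api_key="sk-placeholder",  # often unused by self-hosted deployments
)

# "qwen" is a placeholder; use the exact model id your server reports under GET /v1/models.
reply = client.chat.completions.create(
    model="qwen",
    messages=[{"role": "user", "content": "Hello"}],
)
print(reply.choices[0].message.content)
```

If the base URL already ends in /chat/completions, the client typically ends up requesting a doubled path such as .../chat/completions/chat/completions, which the server rejects.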