Agents do not work with Self-hosted Models #5508
Unanswered · sreevatsank1999 asked this question in Troubleshooting
Replies: 1 comment · 3 replies
-
It works with Ollama models if you name your endpoint "Ollama". Otherwise, the endpoint needs to be fully OpenAI-compatible as a custom endpoint. Did you set a baseURL? Please share your |
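For reference, a custom endpoint pointing at Ollama's OpenAI-compatible API might look roughly like the excerpt below. This is a hypothetical sketch, not a verified configuration: the baseURL and model name assume a default local Ollama install, and the fields shown are only the commonly used parts of LibreChat's custom-endpoint schema, so check them against the official docs and your own setup.

```yaml
# Hypothetical excerpt of librechat.yaml (not a complete file).
endpoints:
  custom:
    - name: "Ollama"          # naming the endpoint "Ollama" enables Ollama-specific handling
      apiKey: "ollama"        # placeholder; Ollama ignores the key, but the field must be set
      baseURL: "http://localhost:11434/v1/"   # Ollama's OpenAI-compatible API
      models:
        default: ["llama3.3:70b"]
        fetch: true           # fetch the available model list from the endpoint
      titleConvo: true
      titleModel: "current_model"
```

If the Agents endpoint still routes requests to the OpenAI backend with a setup like this, sharing the (redacted) custom-endpoint section of your config usually makes the misconfiguration easier to spot.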
-
What happened?
The Agents endpoint attempts to use the OpenAI backend when set to use a self-hosted model, and fails with a MODEL_NOT_FOUND error because llama3.3:70b is not a valid OpenAI model. It works as expected when using an OpenAI or Gemini model.
Steps to Reproduce
What browsers are you seeing the problem on?
Microsoft Edge
Relevant log output
Screenshots
Code of Conduct