Is there an example of using ollama with the openai proxy server to host mistral?

Replies: 2 comments
-
Yep, right here: https://docs.litellm.ai/docs/proxy/quick_start#supported-llms

e.g. to call ollama's codellama model (by default this will assume it's on port 11434), or to change the api base, just run the commands sketched below.
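A rough sketch of both commands, assuming the proxy CLI is installed (e.g. `pip install 'litellm[proxy]'`) and Ollama is already running locally; the linked quick start has the exact invocation:

```shell
# start the OpenAI-compatible proxy against ollama's codellama
# (assumes ollama is serving on its default port 11434)
litellm --model ollama/codellama

# same thing, but pointing the proxy at an explicit api base,
# e.g. an ollama instance on another host or port
litellm --model ollama/codellama --api_base http://localhost:11434
```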
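And for the mistral case from the original question, a sketch under the same assumptions: launch the proxy against ollama/mistral, then call it like any OpenAI-compatible endpoint. The port and path below (8000, /chat/completions) are placeholders; use whatever the proxy prints when it starts:

```shell
# host mistral behind the OpenAI-compatible proxy
litellm --model ollama/mistral

# then call it like any OpenAI endpoint; replace 8000 with the port
# the proxy actually reports on startup
curl http://0.0.0.0:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ollama/mistral",
        "messages": [{"role": "user", "content": "Hello from the proxy"}]
      }'
```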
-
@jxnl let me know if this solves your problem