Docs - Add example using LiteLLM Proxy to call Mistral AI Models#75
ishaan-jaff wants to merge 7 commits into mistralai:main
Conversation
Hi @sophiamyang, can you review this PR? Happy to make any changes necessary.
third_party/LiteLLM/README.md (outdated)
| [Use with Langchain, LlamaIndex, Instructor, etc.](https://docs.litellm.ai/docs/proxy/user_keys) |

```python
import openai
```
Could you use Mistral SDK instead of OpenAI please?
Updated, @sophiamyang: the examples now use the Mistral SDK.
Following up on this, @sophiamyang: any other changes?
Hi Ishaan, I don't think this will work: our current Python SDK client does not have the same methods and behavior as OpenAI's SDK. It looks like you are calling chat.completions.create and other methods that do not currently exist in our SDK. Could you update it? And thank you for the notebook! 🙏
Hi @ishaan-jaff, I think your code with the Mistral client still has issues. Could you help update it? You can see our docs here.
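For context on the mismatch discussed above: LiteLLM Proxy exposes an OpenAI-compatible `/chat/completions` route, so whatever client is used must produce the OpenAI-style JSON body. A minimal stdlib-only sketch of that request shape (the proxy URL and model name are illustrative placeholders, not taken from the PR):

```python
import json

# Hypothetical local LiteLLM Proxy endpoint; adjust host/port to your deployment.
PROXY_URL = "http://0.0.0.0:4000/chat/completions"

def build_chat_payload(model, user_message):
    """Build the JSON body an OpenAI-compatible proxy expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_payload("mistral/mistral-small-latest", "Hello!")
body = json.dumps(payload)  # ready to POST to PROXY_URL
```

Any SDK pointed at the proxy (OpenAI's, or a Mistral client with its endpoint overridden) ultimately needs to emit this structure, which is why a client without a `chat.completions.create`-style method needs adapted example code.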
Acknowledging this; I'll add it to my backlog, @sophiamyang.
Doc - Add example on using Mistral models with LiteLLM Proxy
Hi, I'm the maintainer of LiteLLM. This PR adds an example showing how to use LiteLLM Proxy to call Mistral AI models.
Why use LiteLLM Proxy?
Use LiteLLM Proxy for:
- Mistral AI API + Codestral API + Bedrock
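A minimal sketch of what such a proxy setup could look like, assuming LiteLLM's `model_list` YAML config format; the model alias, model id, and env-var reference below are illustrative, not taken from the PR:

```yaml
# config.yaml (illustrative)
model_list:
  - model_name: mistral-small          # alias clients will request
    litellm_params:
      model: mistral/mistral-small-latest
      api_key: os.environ/MISTRAL_API_KEY
```

The proxy would then be started with something like `litellm --config config.yaml`, after which clients send OpenAI-format chat requests to the local endpoint using the `mistral-small` alias.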