Using Mistral models via API #464
Unanswered
procoprobocop asked this question in Q&A
Replies: 1 comment · 2 replies
-
Is there a possibility in "text_to_sql_pipelines" to use models other than Ollama? For example, can models that are available via an API, such as the models from Mistral, be used? I know that OpenAI models can be used via the API, but that does not suit me.
-
LlamaIndex also supports MistralAI, see https://docs.llamaindex.ai/en/stable/examples/llm/mistralai/ (In a few weeks I will do the same as you, then I will know more.)
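For reference, here is a minimal sketch of wiring Mistral's hosted API into a LlamaIndex text-to-SQL query engine. Whether this maps directly onto the project's "text_to_sql_pipelines" is an assumption on my part; the database URL, table name, and model name below are placeholders.

```python
# Sketch: Mistral's hosted API as the LLM behind a LlamaIndex
# text-to-SQL query engine, instead of a local Ollama model.
# Assumes llama-index, llama-index-llms-mistralai and sqlalchemy are
# installed and MISTRAL_API_KEY is set in the environment.
from sqlalchemy import create_engine

from llama_index.core import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine
from llama_index.llms.mistralai import MistralAI

# Mistral model served via their API (reads MISTRAL_API_KEY by default).
llm = MistralAI(model="mistral-large-latest")

# Point LlamaIndex at an existing SQL database (placeholder URL/table).
engine = create_engine("sqlite:///example.db")
sql_database = SQLDatabase(engine, include_tables=["orders"])

# Natural-language question in, generated SQL and answer out.
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, llm=llm)
response = query_engine.query("How many orders were placed last month?")
print(response)
```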