`apps/baseai.dev/content/docs/docs/supported-models-and-providers.mdx`

9 lines changed: 9 additions & 0 deletions
```diff
@@ -29,6 +29,7 @@ We currently support the following LLM providers.
 - Fireworks AI
 - Perplexity
 - Mistral AI
+- xAI
 - Ollama *(local-only)*

 You can use any of these providers to build your Pipe, by adding your provider's key. Please feel free to request any specific provider you would like to use.
```
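As a sketch of the "adding your provider's key" step for the newly added provider, a BaseAI project would typically get the key from its `.env` file. The variable name `XAI_API_KEY` is an assumption based on the naming pattern used for the other providers; check the BaseAI docs for the exact name.

```shell
# Hypothetical .env entry enabling the xAI provider in a BaseAI project.
# XAI_API_KEY is an assumed variable name following the usual
# <PROVIDER>_API_KEY convention; verify against the BaseAI docs.
XAI_API_KEY=your-xai-api-key
```

With the key in place, a Pipe configured to use an xAI model should pick it up automatically, the same way keys for the other listed providers are resolved.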
```diff
@@ -173,6 +174,14 @@ Learn more about [using Ollama models](/docs/guides/using-ollama-models) in Base

 <span className='text-xs'>* USD per Million tokens</span>
```