- Implement Perplexity as OpenAI-compatible provider
- Add 5 Perplexity models with pricing information
- Handle HTML error responses from Perplexity API
- Add test coverage with proper skips for unsupported features
- Update documentation and configuration
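One of the bullets above notes that Perplexity can return HTML error pages instead of JSON (for example, from an upstream proxy). A minimal sketch of how such a response might be detected; the helper name `html_response?` is illustrative, not the provider's actual method:

```ruby
# Hypothetical helper: detect an HTML error page returned in place of JSON.
# Real provider code also inspects status codes; this only shows the idea.
def html_response?(body)
  trimmed = body.to_s.lstrip
  trimmed.start_with?("<!DOCTYPE", "<html")
end

puts html_response?("<html><body>502 Bad Gateway</body></html>") # => true
puts html_response?('{"error": {"message": "rate limited"}}')    # => false
```

Detecting this early lets the provider raise a clear error rather than failing inside a JSON parser.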
-* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
+* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Perplexity, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
 * 👁️ **Vision:** Analyze images within chats.
 * 🔊 **Audio:** Transcribe and understand audio content.
 * 📄 **Document Analysis:** Extract information from PDFs, text files, CSV, JSON, XML, Markdown, and code files.
 # Uses standard AWS credential chain (environment, shared config, IAM role)
@@ -108,7 +109,8 @@ Set the corresponding `*_api_key` attribute for each provider you want to enable
 * `gemini_api_key`
 * `deepseek_api_key`
 * `openrouter_api_key`
-* `gpustack_api_key` (Available in v1.4.0)
+* `gpustack_api_key`
+* `perplexity_api_key` (Available in v1.5.0)
 * `bedrock_api_key`, `bedrock_secret_key`, `bedrock_region`, `bedrock_session_token` (See AWS documentation for standard credential methods if not set explicitly).
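With the new `perplexity_api_key` attribute, enabling Perplexity looks like any other provider. A hedged configuration sketch (the environment variable name is a convention of this example, not mandated by the gem):

```ruby
require "ruby_llm"

RubyLLM.configure do |config|
  # Existing providers keep working as before.
  config.openai_api_key     = ENV["OPENAI_API_KEY"]
  # New in v1.5.0; the ENV var name here is just this example's convention.
  config.perplexity_api_key = ENV["PERPLEXITY_API_KEY"]
end
```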
docs/guides/chat.md: 4 lines changed (0 additions & 4 deletions)
@@ -251,8 +251,6 @@ You can set the temperature using `with_temperature`, which returns the `Chat` i
 ## Custom Request Parameters (`with_params`)
 {: .d-inline-block }
-Available in v1.4.0
-{: .label .label-yellow }
 
 You can configure additional provider-specific features by adding custom fields to each API request. Use the `with_params` method.
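Conceptually, `with_params` folds caller-supplied fields into the provider request body. A plain-Ruby sketch of that merge; the payload shape and the `search_domain_filter` field are illustrative assumptions, not RubyLLM internals:

```ruby
# Illustrative only: how extra params could fold into a request payload.
base_payload = { model: "sonar-pro", messages: [{ role: "user", content: "Hi" }] }
extra_params = { search_domain_filter: ["wikipedia.org"] } # provider-specific field
request_body = base_payload.merge(extra_params)

p request_body[:search_domain_filter] # => ["wikipedia.org"]
```

Because the fields pass through unvalidated, whether a given parameter is accepted depends entirely on the provider and model.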
@@ -268,8 +266,6 @@ Allowed parameters vary widely by provider and model. Please consult the provide
 ## Structured Output with JSON Schemas (`with_schema`)
 {: .d-inline-block }
-Available in v1.4.0
-{: .label .label-yellow }
 
 RubyLLM supports structured output, which guarantees that AI responses conform to your specified JSON schema. This is different from JSON mode – while JSON mode guarantees valid JSON syntax, structured output enforces the exact schema you define.
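A schema passed to `with_schema` is ordinary JSON Schema expressed as a Ruby hash. A small hedged example of the kind of shape one might define (the field names here are invented for illustration):

```ruby
# Hypothetical schema: structured output must return exactly this shape,
# not merely syntactically valid JSON.
person_schema = {
  type: "object",
  properties: {
    name: { type: "string" },
    age:  { type: "integer" }
  },
  required: %w[name age],
  additionalProperties: false
}

p person_schema[:required] # => ["name", "age"]
```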
-* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
+* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Perplexity, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
 * 👁️ **Vision:** Analyze images within chats.
 * 🔊 **Audio:** Transcribe and understand audio content.
 * 📄 **Document Analysis:** Extract information from PDFs, text files, CSV, JSON, XML, Markdown, and code files.