Commit 6f2e2a1 (parent: 4701a45)

Add Perplexity provider support

- Implement Perplexity as an OpenAI-compatible provider
- Add 5 Perplexity models with pricing information
- Handle HTML error responses from the Perplexity API
- Add test coverage with proper skips for unsupported features
- Update documentation and configuration

30 files changed: +2710 / -245 lines
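The commit message mentions handling HTML error responses from the Perplexity API, i.e. cases where a gateway returns an HTML error page instead of a JSON body. A self-contained sketch of how such responses could be detected and normalized into a plain error message (the method name and heuristics are illustrative assumptions, not RubyLLM's actual implementation):

```ruby
require 'json'

# Perplexity can return an HTML error page (e.g. from its gateway) instead of
# a JSON body. This sketch detects that case and extracts a readable message.
# The method name and heuristics are assumptions, not RubyLLM's actual code.
def extract_error_message(body)
  return nil if body.nil? || body.strip.empty?

  if body.strip.start_with?('<!DOCTYPE', '<html', '<')
    # HTML error page: use the <title> (or first <h1>) as the message.
    title = body[%r{<title[^>]*>(.*?)</title>}mi, 1] ||
            body[%r{<h1[^>]*>(.*?)</h1>}mi, 1]
    (title || 'Unexpected HTML response from Perplexity API').strip
  else
    # JSON error body in the common OpenAI-style {"error": {"message": ...}} shape.
    JSON.parse(body).dig('error', 'message') || body
  end
rescue JSON::ParserError
  body
end
```

Branching on the leading `<` keeps the JSON happy path untouched while still producing a usable message when the upstream proxy serves a "502 Bad Gateway" page.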

README.md

Lines changed: 6 additions & 3 deletions

````diff
@@ -24,6 +24,9 @@
     <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter.svg" alt="OpenRouter" class="logo-medium">
     <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter-text.svg" alt="OpenRouter" class="logo-small">
     &nbsp;
+    <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/perplexity-color.svg" alt="Perplexity" class="logo-medium">
+    <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/perplexity-text.svg" alt="Perplexity" class="logo-small">
+    &nbsp;
   </div>

   <div class="badge-container">
@@ -102,7 +105,7 @@ response = chat.with_schema(ProductSchema)

 ## Core Capabilities

-* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
+* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Perplexity, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
 * 👁️ **Vision:** Analyze images within chats.
 * 🔊 **Audio:** Transcribe and understand audio content.
 * 📄 **Document Analysis:** Extract information from PDFs, text files, CSV, JSON, XML, Markdown, and code files.
@@ -114,7 +117,7 @@ response = chat.with_schema(ProductSchema)
 * 🌊 **Streaming:** Process responses in real-time with idiomatic Ruby blocks.
 * **Async Support:** Built-in fiber-based concurrency for high-performance operations.
 * 🎯 **Smart Configuration:** Global and scoped configs with automatic retries and proxy support.
-* 📚 **Model Registry:** Access 100+ models with capability detection and pricing info.
+* 📚 **Model Registry:** Access 500+ models with capability detection and pricing info.

 ## Installation

@@ -141,7 +144,7 @@ See the [Installation Guide](https://rubyllm.com/installation) for full details.
 Add persistence to your chat models effortlessly:

 ```bash
-# Generate models and migrations (available in v1.4.0)
+# Generate models and migrations
 rails generate ruby_llm:install
 ```
````

bin/console

Lines changed: 1 addition & 0 deletions

```diff
@@ -13,6 +13,7 @@ RubyLLM.configure do |config|
   config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
   config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
   config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+  config.perplexity_api_key = ENV.fetch('PERPLEXITY_API_KEY', nil)
   config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
   config.ollama_api_base = ENV.fetch('OLLAMA_API_BASE', nil)
   config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
```
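The configuration lines above all follow the same pattern: a plain attribute populated via `ENV.fetch('KEY', nil)`, which returns `nil` when the variable is unset instead of raising `KeyError`, so unconfigured providers simply stay disabled. A stand-alone sketch of the pattern (the class here is a stand-in mirroring the diff, not the gem's real `Configuration`):

```ruby
# Stand-in for a RubyLLM-style configuration object: each provider key is a
# plain attr_accessor, populated from the environment at setup time.
# ENV.fetch('KEY', nil) yields nil for an unset variable rather than raising
# KeyError, so a missing key just leaves that provider unconfigured.
# This class is a sketch, not the gem's actual Configuration.
class SketchConfiguration
  attr_accessor :deepseek_api_key, :perplexity_api_key, :openrouter_api_key
end

config = SketchConfiguration.new
config.perplexity_api_key = ENV.fetch('PERPLEXITY_API_KEY', nil)
config.deepseek_api_key   = ENV.fetch('DEEPSEEK_API_KEY', nil)
```

The explicit `nil` default documents that every provider key is optional, which is why `bin/console` can list all providers without requiring any particular one to be set.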

docs/configuration.md

Lines changed: 5 additions & 3 deletions

```diff
@@ -52,9 +52,10 @@ RubyLLM.configure do |config|
   config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
   config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
   config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
+  config.perplexity_api_key = ENV.fetch('PERPLEXITY_API_KEY', nil) # Available in v1.5.0
   config.ollama_api_base = ENV.fetch('OLLAMA_API_BASE', nil)
-  config.gpustack_api_base = ENV.fetch('GPUSTACK_API_BASE', nil) # Available in v1.4.0
-  config.gpustack_api_key = ENV.fetch('GPUSTACK_API_KEY', nil) # Available in v1.4.0
+  config.gpustack_api_base = ENV.fetch('GPUSTACK_API_BASE', nil)
+  config.gpustack_api_key = ENV.fetch('GPUSTACK_API_KEY', nil)

   # --- AWS Bedrock Credentials ---
   # Uses standard AWS credential chain (environment, shared config, IAM role)
@@ -108,7 +109,8 @@ Set the corresponding `*_api_key` attribute for each provider you want to enable
 * `gemini_api_key`
 * `deepseek_api_key`
 * `openrouter_api_key`
-* `gpustack_api_key` (Available in v1.4.0)
+* `gpustack_api_key`
+* `perplexity_api_key` (Available in v1.5.0)
 * `bedrock_api_key`, `bedrock_secret_key`, `bedrock_region`, `bedrock_session_token` (See AWS documentation for standard credential methods if not set explicitly).

 ## Ollama API Base (`ollama_api_base`)
```

docs/guides/available-models.md

Lines changed: 85 additions & 50 deletions

(Large diff not rendered.)

docs/guides/chat.md

Lines changed: 0 additions & 4 deletions

```diff
@@ -251,8 +251,6 @@ You can set the temperature using `with_temperature`, which returns the `Chat` i
 ## Custom Request Parameters (`with_params`)
 {: .d-inline-block }

-Available in v1.4.0
-{: .label .label-yellow }

 You can configure additional provider-specific features by adding custom fields to each API request. Use the `with_params` method.

@@ -268,8 +266,6 @@ Allowed parameters vary widely by provider and model. Please consult the provide
 ## Structured Output with JSON Schemas (`with_schema`)
 {: .d-inline-block }

-Available in v1.4.0
-{: .label .label-yellow }

 RubyLLM supports structured output, which guarantees that AI responses conform to your specified JSON schema. This is different from JSON mode – while JSON mode guarantees valid JSON syntax, structured output enforces the exact schema you define.
```

docs/guides/rails.md

Lines changed: 0 additions & 4 deletions

```diff
@@ -62,8 +62,6 @@ This approach has one important consequence: **you cannot use `validates :conten
 ### Quick Setup with Generator
 {: .d-inline-block }

-Available in v1.4.0
-{: .label .label-yellow }

 The easiest way to get started is using the provided Rails generator:

@@ -362,8 +360,6 @@ The attachment API automatically detects file types based on file extension or c
 ### Structured Output with Schemas
 {: .d-inline-block }

-Available in v1.4.0
-{: .label .label-yellow }

 Structured output works seamlessly with Rails persistence:
```

docs/index.md

Lines changed: 7 additions & 3 deletions

````diff
@@ -49,6 +49,10 @@ One beautiful API for ChatGPT, Claude, Gemini, and more. Chat, images, embedding
     <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter.svg" alt="OpenRouter" class="logo-medium">
     <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter-text.svg" alt="OpenRouter" class="logo-small">
   </div>
+  <div class="provider-logo">
+    <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/perplexity-color.svg" alt="Perplexity" class="logo-medium">
+    <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/perplexity-text.svg" alt="Perplexity" class="logo-small">
+  </div>
 </div>

 <div class="badge-container">
@@ -129,7 +133,7 @@ response = chat.with_schema(ProductSchema)

 ## Core Capabilities

-* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
+* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Perplexity, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
 * 👁️ **Vision:** Analyze images within chats.
 * 🔊 **Audio:** Transcribe and understand audio content.
 * 📄 **Document Analysis:** Extract information from PDFs, text files, CSV, JSON, XML, Markdown, and code files.
@@ -141,7 +145,7 @@ response = chat.with_schema(ProductSchema)
 * 🌊 **Streaming:** Process responses in real-time with idiomatic Ruby blocks.
 * **Async Support:** Built-in fiber-based concurrency for high-performance operations.
 * 🎯 **Smart Configuration:** Global and scoped configs with automatic retries and proxy support.
-* 📚 **Model Registry:** Access 100+ models with capability detection and pricing info.
+* 📚 **Model Registry:** Access 500+ models with capability detection and pricing info.

 ## Installation

@@ -168,7 +172,7 @@ See the [Installation Guide](https://rubyllm.com/installation) for full details.
 Add persistence to your chat models effortlessly:

 ```bash
-# Generate models and migrations (available in v1.4.0)
+# Generate models and migrations
 rails generate ruby_llm:install
 ```
````

lib/ruby_llm.rb

Lines changed: 2 additions & 0 deletions

```diff
@@ -16,6 +16,7 @@
   'openai' => 'OpenAI',
   'api' => 'API',
   'deepseek' => 'DeepSeek',
+  'perplexity' => 'Perplexity',
   'bedrock' => 'Bedrock',
   'openrouter' => 'OpenRouter',
   'gpustack' => 'GPUStack',
@@ -82,6 +83,7 @@ def logger
 RubyLLM::Provider.register :anthropic, RubyLLM::Providers::Anthropic
 RubyLLM::Provider.register :gemini, RubyLLM::Providers::Gemini
 RubyLLM::Provider.register :deepseek, RubyLLM::Providers::DeepSeek
+RubyLLM::Provider.register :perplexity, RubyLLM::Providers::Perplexity
 RubyLLM::Provider.register :bedrock, RubyLLM::Providers::Bedrock
 RubyLLM::Provider.register :openrouter, RubyLLM::Providers::OpenRouter
 RubyLLM::Provider.register :ollama, RubyLLM::Providers::Ollama
```
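The `Provider.register` calls above imply a simple symbol-to-module registry. A toy sketch of how such a registry might be structured (the `register` signature follows the diff, but the `resolve` method and the internals are invented for illustration; RubyLLM's real implementation differs):

```ruby
# Toy provider registry: `register` maps a symbol to a provider module and
# `resolve` looks one up, raising on unknown names. Only the register
# signature comes from the diff; the rest is an assumption for illustration.
module SketchProvider
  @registry = {}

  def self.register(name, provider_module)
    @registry[name.to_sym] = provider_module
  end

  def self.resolve(name)
    @registry.fetch(name.to_sym) do
      raise ArgumentError, "unknown provider: #{name}"
    end
  end
end

# Hypothetical provider modules standing in for RubyLLM::Providers::*.
module SketchProviders
  module DeepSeek; end
  module Perplexity; end
end

SketchProvider.register :deepseek,   SketchProviders::DeepSeek
SketchProvider.register :perplexity, SketchProviders::Perplexity
```

A registry like this keeps the core library open to new providers: adding Perplexity only requires defining its module and one `register` call, which matches the shape of this commit.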

lib/ruby_llm/configuration.rb

Lines changed: 1 addition & 0 deletions

```diff
@@ -18,6 +18,7 @@ class Configuration
   :anthropic_api_key,
   :gemini_api_key,
   :deepseek_api_key,
+  :perplexity_api_key,
   :bedrock_api_key,
   :bedrock_secret_key,
   :bedrock_region,
```
