
Commit eea2b04

Enhance documentation for RubyLLM
- Updated the "Streaming Responses" guide to improve clarity and structure, including a new table of contents and detailed explanations of chunk handling and integration with web frameworks.
- Revised the "Using Tools" guide to clarify tool creation, usage, and error handling, along with new examples and best practices.
- Improved the "Installation" guide with clearer prerequisites, installation methods, and configuration options, including advanced settings for API keys and connection settings.
- Added a table of contents to the models documentation for better navigation and updated the last-updated timestamp display.
1 parent 48f2faa commit eea2b04

File tree

12 files changed

+1510
-1980
lines changed


docs/guides/available-models.md

Lines changed: 244 additions & 145 deletions
Large diffs are not rendered by default.

docs/guides/chat.md

Lines changed: 168 additions & 195 deletions

docs/guides/embeddings.md

Lines changed: 92 additions & 222 deletions

docs/guides/error-handling.md

Lines changed: 150 additions & 223 deletions

docs/guides/getting-started.md

Lines changed: 74 additions & 109 deletions
@@ -7,164 +7,129 @@ permalink: /guides/getting-started
 ---
 
 # Getting Started with RubyLLM
+{: .no_toc }
 
-This guide will help you get up and running with RubyLLM, showing you the basics of chatting with AI models, generating images, and creating embeddings.
+Welcome to RubyLLM! This guide will get you up and running quickly. We'll cover installing the gem, configuring your first API key, and making basic chat, image, and embedding requests.
+{: .fs-6 .fw-300 }
 
-## Prerequisites
+## Table of contents
+{: .no_toc .text-delta }
 
-Before starting, make sure you have:
+1. TOC
+{:toc}
 
-1. Installed the RubyLLM gem (see the [Installation guide]({% link installation.md %}))
-2. At least one API key from a supported provider (OpenAI, Anthropic, Google, AWS Bedrock, or DeepSeek)
+---
+
+After reading this guide, you will know:
+
+* How to install RubyLLM.
+* How to configure API keys.
+* How to start a simple chat conversation.
+* How to generate an image.
+* How to create text embeddings.
 
-## Basic Configuration
+## Installation
 
-Let's start by setting up RubyLLM with your API keys:
+Add RubyLLM to your Gemfile:
+
+```ruby
+gem 'ruby_llm'
+```
+
+Then run `bundle install`.
+
+Alternatively, install it manually: `gem install ruby_llm`
+
+(For full details, see the [Installation Guide]({% link installation.md %}).)
+
+## Configuration
+
+RubyLLM needs API keys for the AI providers you want to use. Configure them, typically in an initializer (`config/initializers/ruby_llm.rb` in Rails) or at the start of your script.
 
 ```ruby
 require 'ruby_llm'
 
 RubyLLM.configure do |config|
-  # Add the API keys you have available
+  # Add keys for the providers you plan to use.
+  # Using environment variables is recommended.
   config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
-  config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-  config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
-  config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
-
-  # Bedrock
-  config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
-  config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
-  config.bedrock_region = ENV.fetch('AWS_REGION', nil)
-  config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+  # config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
+  # ... add other provider keys as needed
 end
 ```
 
+You only need to configure keys for the providers you intend to use. See the [Installation Guide]({% link installation.md %}#configuration) for all configuration options.
+
 ## Your First Chat
 
-Let's start with a simple chat interaction:
+The primary way to interact with language models is through the `RubyLLM.chat` interface.
 
 ```ruby
-# Create a chat (uses the default model)
+# Create a chat instance (uses the default model, usually GPT)
 chat = RubyLLM.chat
 
 # Ask a question
-response = chat.ask "What's the capital of France?"
-puts response.content
-# => "The capital of France is Paris."
+response = chat.ask "What is Ruby on Rails?"
 
-# Continue the conversation
-response = chat.ask "What's the population of that city?"
+# The response is a RubyLLM::Message object
 puts response.content
-# => "Paris has a population of approximately 2.1 million people..."
-```
-
-### Using a Specific Model
+# => "Ruby on Rails, often shortened to Rails, is a server-side web application..."
 
-You can specify which model you want to use:
-
-```ruby
-# Use Claude
-claude_chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')
-claude_chat.ask "Tell me about Ruby programming language"
-
-# Use Gemini
-gemini_chat = RubyLLM.chat(model: 'gemini-2.0-flash')
-gemini_chat.ask "What are the best Ruby gems for machine learning?"
+# Continue the conversation naturally
+response = chat.ask "What are its main advantages?"
+puts response.content
+# => "Some key advantages of Ruby on Rails include..."
 ```
 
-## Exploring Available Models
+RubyLLM automatically handles conversation history. Dive deeper in the [Chatting with AI Models Guide]({% link guides/chat.md %}).
 
-RubyLLM gives you access to models from multiple providers. You can see what's available:
+## Generating an Image
 
-```ruby
-# List all models
-all_models = RubyLLM.models.all
-puts "Total models: #{all_models.count}"
-
-# List chat models
-chat_models = RubyLLM.models.chat_models
-puts "Chat models:"
-chat_models.each do |model|
-  puts "- #{model.id} (#{model.provider})"
-end
-
-# List embedding models
-RubyLLM.models.embedding_models.each do |model|
-  puts "- #{model.id} (#{model.provider})"
-end
-
-# Find info about a specific model
-gpt = RubyLLM.models.find('gpt-4.1-nano')
-puts "Context window: #{gpt.context_window}"
-puts "Max tokens: #{gpt.max_tokens}"
-puts "Pricing: $#{gpt.input_price_per_million} per million input tokens"
-```
-
-## Generating Images
-
-RubyLLM makes it easy to generate images with DALL-E:
+You can generate images using models like DALL-E 3 via the `RubyLLM.paint` method.
 
 ```ruby
-# Generate an image
-image = RubyLLM.paint("a sunset over mountains")
+# Generate an image (uses the default image model, usually DALL-E 3)
+image = RubyLLM.paint("A futuristic cityscape at sunset, watercolor style")
 
-# The URL where you can view/download the image
+# Access the image URL
 puts image.url
+# => "https://oaidalleapiprodscus.blob.core.windows.net/..."
 
-# How the model interpreted your prompt
+# See the potentially revised prompt the model used
 puts image.revised_prompt
-
-# Generate a larger image
-large_image = RubyLLM.paint(
-  "a cyberpunk city at night with neon lights",
-  size: "1792x1024"
-)
+# => "A watercolor painting of a futuristic cityscape bathed in the warm hues of a setting sun..."
 ```
 
+Learn more in the [Image Generation Guide]({% link guides/image-generation.md %}).
+
 ## Creating Embeddings
 
-Embeddings are vector representations of text that can be used for semantic search, classification, and more:
+Embeddings represent text as numerical vectors, useful for tasks like semantic search. Use `RubyLLM.embed`.
 
 ```ruby
-# Create an embedding for a single text
-embedding = RubyLLM.embed("Ruby is a programmer's best friend")
+# Create an embedding for a single piece of text
+embedding = RubyLLM.embed("Ruby is optimized for programmer happiness.")
 
-# The vector representation
+# Access the vector (an array of floats)
 vector = embedding.vectors
-puts "Vector dimension: #{vector.length}"
+puts "Vector dimension: #{vector.length}" # e.g., 1536 for text-embedding-3-small
 
-# Create embeddings for multiple texts
-texts = ["Ruby", "Python", "JavaScript"]
+# Embed multiple texts at once
+texts = ["Convention over configuration", "Model-View-Controller", "Metaprogramming"]
 embeddings = RubyLLM.embed(texts)
 
-# Each text gets its own vector
-puts "Number of vectors: #{embeddings.vectors.length}"
+puts "Generated #{embeddings.vectors.length} vectors." # => 3
 ```
 
-## Working with Conversations
-
-Here's how to have a multi-turn conversation:
-
-```ruby
-chat = RubyLLM.chat
-
-# First message
-chat.ask "What are the benefits of Ruby on Rails?"
-
-# Follow-up questions
-chat.ask "How does that compare to Django?"
-chat.ask "Which one would you recommend for a new web project?"
-
-# You can check all messages in the conversation
-chat.messages.each do |message|
-  puts "#{message.role}: #{message.content[0..100]}..."
-end
-```
+Explore further in the [Embeddings Guide]({% link guides/embeddings.md %}).
 
 ## What's Next?
 
-Now that you've got the basics down, you're ready to explore more advanced features:
+You've seen the basics! Now you're ready to explore RubyLLM's features in more detail:
 
-- [Chatting with AI]({% link guides/chat.md %}) - Learn more about chat capabilities
-- [Using Tools]({% link guides/tools.md %}) - Let AI use your Ruby code
-- [Rails Integration]({% link guides/rails.md %}) - Persist chats in your Rails apps
+* [Chatting with AI Models]({% link guides/chat.md %})
+* [Working with Models]({% link guides/models.md %}) (Choosing models, custom endpoints)
+* [Using Tools]({% link guides/tools.md %}) (Letting AI call your code)
+* [Streaming Responses]({% link guides/streaming.md %})
+* [Rails Integration]({% link guides/rails.md %})
+* [Error Handling]({% link guides/error-handling.md %})

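Aside: the revised embeddings section above stops at generating vectors. A common next step, mentioned in its lead-in ("useful for tasks like semantic search"), is comparing two vectors with cosine similarity. Here is a minimal sketch in plain Ruby; the `cosine_similarity` helper is illustrative (not part of RubyLLM's API), and the short sample vectors are made up stand-ins for the much longer arrays `RubyLLM.embed` returns.

```ruby
# Cosine similarity between two embedding vectors (plain Ruby, no gems).
# In practice the arrays would come from RubyLLM.embed(...).vectors and be
# hundreds or thousands of floats long; these tiny vectors are for illustration.
def cosine_similarity(a, b)
  dot    = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |x| x * x })
  dot / (norm_a * norm_b)
end

v1 = [0.1, 0.3, 0.5]
v2 = [0.2, 0.6, 1.0] # v1 scaled by 2: same direction
v3 = [0.5, -0.1, 0.0]

puts cosine_similarity(v1, v2).round(4) # parallel vectors => 1.0
puts cosine_similarity(v1, v3).round(4) # dissimilar vectors score much lower
```

Scores near 1.0 mean the texts behind the vectors are semantically close; ranking documents by this score against a query vector is the core of simple semantic search.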