
Commit 1f8865a

Provide access to raw response object from Faraday (#304)
## What this does

Give callers access to the Faraday response on a property of `Message` called `raw`.

## Type of change

- [x] New feature

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [x] New public methods/classes

## Related issues

Resolves #301

---------

Co-authored-by: Mike Robbins <mrobbins@alum.mit.edu>
1 parent b1238d5 commit 1f8865a

18 files changed (+574 −13)

docs/guides/chat.md

Lines changed: 11 additions & 0 deletions

@@ -487,6 +487,17 @@ end
 chat.ask "What is metaprogramming in Ruby?"
 ```
 
+## Raw Responses
+
+You can access the raw response from the API provider with `response.raw`.
+
+```ruby
+response = chat.ask("What is the capital of France?")
+puts response.raw.body
+```
+
+The raw response is a `Faraday::Response` object, which you can use to access the headers, body, and status code.
+
 ## Next Steps
 
 This guide covered the core `Chat` interface. Now you might want to explore:
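The documented behavior can be pictured with a minimal, self-contained sketch (plain Ruby only — `FakeResponse` and `SketchMessage` below are illustrative stand-ins for `Faraday::Response` and RubyLLM's `Message`, not library classes):

```ruby
# Stand-in for Faraday::Response: exposes status, headers, and body.
FakeResponse = Struct.new(:status, :headers, :body)

# Minimal message object that carries the transport response alongside
# the parsed content, mirroring the new Message#raw accessor.
class SketchMessage
  attr_reader :content, :raw

  def initialize(content:, raw:)
    @content = content
    @raw = raw
  end
end

response = FakeResponse.new(200, { 'content-type' => 'application/json' }, '{"answer":"Paris"}')
message  = SketchMessage.new(content: 'Paris', raw: response)

puts message.raw.status   # => 200
puts message.raw.body     # => {"answer":"Paris"}
```

The point of the design is that the parsed content and the transport-level details travel together, so callers can inspect headers or status codes without a second request.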

lib/ruby_llm/message.rb

Lines changed: 2 additions & 1 deletion

@@ -7,7 +7,7 @@ module RubyLLM
   class Message
     ROLES = %i[system user assistant tool].freeze
 
-    attr_reader :role, :tool_calls, :tool_call_id, :input_tokens, :output_tokens, :model_id
+    attr_reader :role, :tool_calls, :tool_call_id, :input_tokens, :output_tokens, :model_id, :raw
     attr_writer :content
 
     def initialize(options = {})
@@ -18,6 +18,7 @@ def initialize(options = {})
       @output_tokens = options[:output_tokens]
       @model_id = options[:model_id]
       @tool_call_id = options[:tool_call_id]
+      @raw = options[:raw]
 
       ensure_valid_role
     end
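Because `initialize` reads from an options hash, existing call sites that never pass `:raw` keep working — the new reader simply returns `nil`. A sketch of that pattern (`OptMessage` is a made-up name, not the library class):

```ruby
class OptMessage
  attr_reader :role, :model_id, :raw

  def initialize(options = {})
    @role = options[:role]
    @model_id = options[:model_id]
    @raw = options[:raw]   # nil when the caller doesn't supply it
  end
end

with_raw    = OptMessage.new(role: :assistant, raw: 'faraday-response')
without_raw = OptMessage.new(role: :assistant)

p with_raw.raw      # => "faraday-response"
p without_raw.raw   # => nil
```

This is why the change is backward compatible: no existing constructor call has to be touched.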

lib/ruby_llm/providers/anthropic/chat.rb

Lines changed: 4 additions & 3 deletions

@@ -57,22 +57,23 @@ def parse_completion_response(response)
   text_content = extract_text_content(content_blocks)
   tool_use_blocks = Tools.find_tool_uses(content_blocks)
 
-  build_message(data, text_content, tool_use_blocks)
+  build_message(data, text_content, tool_use_blocks, response)
 end
 
 def extract_text_content(blocks)
   text_blocks = blocks.select { |c| c['type'] == 'text' }
   text_blocks.map { |c| c['text'] }.join
 end
 
-def build_message(data, content, tool_use_blocks)
+def build_message(data, content, tool_use_blocks, response)
   Message.new(
     role: :assistant,
     content: content,
     tool_calls: Tools.parse_tool_calls(tool_use_blocks),
     input_tokens: data.dig('usage', 'input_tokens'),
     output_tokens: data.dig('usage', 'output_tokens'),
-    model_id: data['model']
+    model_id: data['model'],
+    raw: response
   )
 end
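The change threads the transport response through the builder as an extra argument. A runnable approximation of that data flow (a plain hash stands in for `Message.new`; `build_message` here is a simplified top-level method and the token values are invented):

```ruby
require 'json'

# Parsed provider payload in the Anthropic shape (illustrative values).
data = JSON.parse('{"model":"claude-3-5-haiku","usage":{"input_tokens":12,"output_tokens":34}}')

# The builder now takes the transport response as an extra argument and
# forwards it into the message under :raw.
def build_message(data, content, response)
  {
    role: :assistant,
    content: content,
    input_tokens: data.dig('usage', 'input_tokens'),
    output_tokens: data.dig('usage', 'output_tokens'),
    model_id: data['model'],
    raw: response
  }
end

msg = build_message(data, 'Hello!', :the_faraday_response)
p msg[:input_tokens]  # => 12
p msg[:raw]           # => :the_faraday_response
```

The Gemini and OpenAI parsers below make the same one-line change: the `response` object that `parse_completion_response` already receives is simply passed along as `raw:`.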

lib/ruby_llm/providers/bedrock/streaming/base.rb

Lines changed: 2 additions & 2 deletions

@@ -34,15 +34,15 @@ def stream_response(connection, payload, &block)
                        payload:)
   accumulator = StreamAccumulator.new
 
-  connection.post stream_url, payload do |req|
+  response = connection.post stream_url, payload do |req|
     req.headers.merge! build_headers(signature.headers, streaming: block_given?)
     req.options.on_data = handle_stream do |chunk|
       accumulator.add chunk
       block.call chunk
     end
   end
 
-  accumulator.to_message
+  accumulator.to_message(response)
 end
 
 def handle_stream(&block)

lib/ruby_llm/providers/gemini/chat.rb

Lines changed: 2 additions & 1 deletion

@@ -81,7 +81,8 @@ def parse_completion_response(response)
     tool_calls: tool_calls,
     input_tokens: data.dig('usageMetadata', 'promptTokenCount'),
     output_tokens: data.dig('usageMetadata', 'candidatesTokenCount'),
-    model_id: data['modelVersion'] || response.env.url.path.split('/')[3].split(':')[0]
+    model_id: data['modelVersion'] || response.env.url.path.split('/')[3].split(':')[0],
+    raw: response
   )
 end

lib/ruby_llm/providers/openai/chat.rb

Lines changed: 2 additions & 1 deletion

@@ -59,7 +59,8 @@ def parse_completion_response(response)
     tool_calls: parse_tool_calls(message_data['tool_calls']),
     input_tokens: data['usage']['prompt_tokens'],
     output_tokens: data['usage']['completion_tokens'],
-    model_id: data['model']
+    model_id: data['model'],
+    raw: response
   )
 end

lib/ruby_llm/stream_accumulator.rb

Lines changed: 3 additions & 2 deletions

@@ -29,14 +29,15 @@ def add(chunk)
   RubyLLM.logger.debug inspect
 end
 
-def to_message
+def to_message(response)
   Message.new(
     role: :assistant,
     content: content.empty? ? nil : content,
     model_id: model_id,
     tool_calls: tool_calls_from_stream,
     input_tokens: @input_tokens.positive? ? @input_tokens : nil,
-    output_tokens: @output_tokens.positive? ? @output_tokens : nil
+    output_tokens: @output_tokens.positive? ? @output_tokens : nil,
+    raw: response
   )
 end
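The accumulator change follows the same shape: chunks are gathered during streaming, and the final response object is handed in only when the message is built. A self-contained sketch (`SketchAccumulator` is illustrative, not the library's `StreamAccumulator`; a hash stands in for `Message.new`):

```ruby
# Minimal accumulator: gathers streamed text chunks, then folds them into
# a message hash together with the transport response.
class SketchAccumulator
  def initialize
    @parts = []
  end

  def add(chunk)
    @parts << chunk
  end

  def to_message(response)
    content = @parts.join
    { role: :assistant, content: content.empty? ? nil : content, raw: response }
  end
end

acc = SketchAccumulator.new
%w[Par is].each { |chunk| acc.add(chunk) }
msg = acc.to_message(:streaming_response)

p msg[:content]  # => "Paris"
p msg[:raw]      # => :streaming_response
```

Passing the response at `to_message` time (rather than at construction) matches how the streaming code works: the response object only exists once `connection.post` has returned.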

lib/ruby_llm/streaming.rb

Lines changed: 2 additions & 2 deletions

@@ -11,7 +11,7 @@ module Streaming
 def stream_response(connection, payload, &block)
   accumulator = StreamAccumulator.new
 
-  connection.post stream_url, payload do |req|
+  response = connection.post stream_url, payload do |req|
     if req.options.respond_to?(:on_data)
       # Handle Faraday 2.x streaming with on_data method
       req.options.on_data = handle_stream do |chunk|
@@ -27,7 +27,7 @@ def stream_response(connection, payload, &block)
     end
   end
 
-  accumulator.to_message
+  accumulator.to_message(response)
 end
 
 def handle_stream(&block)
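The key move in both streaming paths is capturing the return value of `connection.post`: Faraday still returns the response object even when the body is delivered incrementally through `on_data` callbacks. A stdlib-only simulation of that capture (`FakeConnection` is a stand-in, not Faraday):

```ruby
# Fake connection whose #post both streams chunks to a callback and
# returns a response object -- capturing that return value is the point.
class FakeConnection
  Response = Struct.new(:status)

  def post(_url, _payload)
    on_data = nil
    req = Object.new
    # Give the request object an on_data= setter, like Faraday's req.options.
    req.define_singleton_method(:on_data=) { |cb| on_data = cb }
    yield req
    # Simulate the server streaming two chunks before the call returns.
    %w[Hel lo].each { |c| on_data.call(c) }
    Response.new(200)
  end
end

chunks = []
response = FakeConnection.new.post('/stream', '{}') do |req|
  req.on_data = ->(chunk) { chunks << chunk }
end

p chunks.join      # => "Hello"
p response.status  # => 200
```

Before this commit the return value was discarded, so the accumulated message had no way to expose headers or status; assigning it to `response` and passing it into `to_message` closes that gap.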

spec/fixtures/vcr_cassettes/chat_basic_chat_functionality_anthropic_claude-3-5-haiku-20241022_returns_raw_responses.yml

Lines changed: 80 additions & 0 deletions (generated VCR fixture; not rendered)

spec/fixtures/vcr_cassettes/chat_basic_chat_functionality_bedrock_anthropic_claude-3-5-haiku-20241022-v1_0_returns_raw_responses.yml

Lines changed: 53 additions & 0 deletions (generated VCR fixture; not rendered)
