2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "0.15.0"
".": "0.16.0"
}
6 changes: 3 additions & 3 deletions .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 109
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-b2a451656ca64d30d174391ebfd94806b4de3ab76dc55b92843cfb7f1a54ecb6.yml
openapi_spec_hash: 27d9691b400f28c17ef063a1374048b0
config_hash: e822d0c9082c8b312264403949243179
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-721e6ccaa72205ee14c71f8163129920464fb814b95d3df9567a9476bbd9b7fb.yml
openapi_spec_hash: 2115413a21df8b5bf9e4552a74df4312
config_hash: 9606bb315a193bfd8da0459040143242
23 changes: 23 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,28 @@
# Changelog

## 0.16.0 (2025-07-30)

Full Changelog: [v0.15.0...v0.16.0](https://github.com/openai/openai-ruby/compare/v0.15.0...v0.16.0)

### Features

* add output_text method for non-streaming responses ([#757](https://github.com/openai/openai-ruby/issues/757)) ([50cf119](https://github.com/openai/openai-ruby/commit/50cf119106f9e16d9ac6a9898028b6d563a6f809))
* **api:** manual updates ([e9fa8a0](https://github.com/openai/openai-ruby/commit/e9fa8a08d6ecebdd06212eaf6b9103082b7d67aa))
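
The `output_text` convenience noted above can be approximated with a small plain-Ruby sketch. The `Response`/`Message`/`Content` structs here are illustrative stand-ins, not the SDK's real classes: the idea is simply to concatenate every `output_text` content part across a response's output messages, in order.

```ruby
# Stand-in structs approximating the shape of a Responses API payload.
Content = Struct.new(:type, :text)
Message = Struct.new(:content)
Response = Struct.new(:output) do
  # Concatenate the text of every :output_text content part, in order.
  def output_text
    output.flat_map(&:content)
          .select { |c| c.type == :output_text }
          .map(&:text)
          .join
  end
end

response = Response.new([
  Message.new([Content.new(:output_text, "Hello, ")]),
  Message.new([Content.new(:output_text, "world!")])
])

puts response.output_text # prints: Hello, world!
```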


### Bug Fixes

* **internal:** ensure sorbet test always runs serially ([0601061](https://github.com/openai/openai-ruby/commit/0601061047525d16cc2afac64e5a4de0dd9de2e5))
* provide parsed outputs for resumed streams ([#756](https://github.com/openai/openai-ruby/issues/756)) ([82254f9](https://github.com/openai/openai-ruby/commit/82254f980ccc0affa2555a81b0d8ed5aa0290835))
* union definition re-using ([#760](https://github.com/openai/openai-ruby/issues/760)) ([3046c28](https://github.com/openai/openai-ruby/commit/3046c28935ca925c2f399f0350937d04eab54c0a))


### Chores

* extract reused JSON schema references even in unions ([#761](https://github.com/openai/openai-ruby/issues/761)) ([e17d3bf](https://github.com/openai/openai-ruby/commit/e17d3bf1fdf241f7a78ed72a39ddecabeb5877c8))
* **internal:** refactor variable name ([#762](https://github.com/openai/openai-ruby/issues/762)) ([7e15b07](https://github.com/openai/openai-ruby/commit/7e15b0745dcbd3bf7fc4c1899d9d76e0a9ab1e48))
* update contribute.md ([b4a0297](https://github.com/openai/openai-ruby/commit/b4a029775bb52d5db2f3fac235595f37b6746a61))

## 0.15.0 (2025-07-21)

Full Changelog: [v0.14.0...v0.15.0](https://github.com/openai/openai-ruby/compare/v0.14.0...v0.15.0)
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -78,7 +78,7 @@ $ bundle exec rake test

## Linting and formatting

This repository uses [rubocop](https://github.com/rubocop/rubocop) for linting and formatting of `*.rb` and `*.rbi` files. [syntax_tree](https://github.com/ruby-syntax-tree/syntax_tree) is used for formatting `*.rbs` files.
This repository uses [rubocop](https://github.com/rubocop/rubocop) for linting and formatting of `*.rb` files, while [syntax_tree](https://github.com/ruby-syntax-tree/syntax_tree) is used for formatting both `*.rbi` and `*.rbs` files.

There are two separate type checkers supported by this library: [sorbet](https://github.com/sorbet/sorbet) and [steep](https://github.com/soutaro/steep) are used for verifying `*.rbi` and `*.rbs` files respectively.

2 changes: 1 addition & 1 deletion Gemfile.lock
@@ -11,7 +11,7 @@ GIT
PATH
remote: .
specs:
openai (0.15.0)
openai (0.16.0)
connection_pool

GEM
2 changes: 1 addition & 1 deletion README.md
@@ -15,7 +15,7 @@ To use this gem, install via Bundler by adding the following to your application
<!-- x-release-please-start-version -->

```ruby
gem "openai", "~> 0.15.0"
gem "openai", "~> 0.16.0"
```

<!-- x-release-please-end -->
197 changes: 137 additions & 60 deletions examples/responses/streaming_previous_response.rb
@@ -7,73 +7,150 @@

client = OpenAI::Client.new

# Request 1: Create a new streaming response with store=true
puts "Creating a new streaming response..."
stream = client.responses.stream(
model: "o4-mini",
input: "Tell me a short story about a robot learning to paint.",
instructions: "You are a creative storyteller.",
background: true
)

events = []
response_id = ""

stream.each do |event|
events << event
puts "Event from initial stream: #{event.type} (seq: #{event.sequence_number})"
case event

when OpenAI::Models::Responses::ResponseCreatedEvent
response_id = event.response.id if response_id.empty?
puts("Captured response ID: #{response_id}")
begin
puts "----- resuming stream from a previous response -----"

# Request 1: Create a new streaming response with background=true
puts "Creating a new streaming response..."
stream = client.responses.stream(
model: "o4-mini",
input: "Tell me a short story about a robot learning to paint.",
instructions: "You are a creative storyteller.",
background: true
)

events = []
response_id = ""

stream.each do |event|
events << event
puts "Event from initial stream: #{event.type} (seq: #{event.sequence_number})"
case event

when OpenAI::Models::Responses::ResponseCreatedEvent
response_id = event.response.id if response_id.empty?
puts("Captured response ID: #{response_id}")
end

# Simulate stopping after a few events
if events.length >= 5
puts "Terminating after #{events.length} events"
break
end
end

# Simulate stopping after a few events
if events.length >= 5
puts "Terminating after #{events.length} events"
break
puts "Collected #{events.length} events"
puts "Response ID: #{response_id}"
puts "Last event sequence number: #{events.last.sequence_number}.\n"

# Give the background response some time to process more events.
puts "Waiting a moment for the background response to progress...\n"
sleep(3)

# Request 2: Resume the stream using the captured response_id.
puts
puts "Resuming stream from sequence #{events.last.sequence_number}..."

resumed_stream = client.responses.stream(
previous_response_id: response_id,
starting_after: events.last.sequence_number
)

resumed_events = []
resumed_stream.each do |event|
resumed_events << event
puts "Event from resumed stream: #{event.type} (seq: #{event.sequence_number})"
# Stop when we get the completed event or collect enough events.
if event.is_a?(OpenAI::Models::Responses::ResponseCompletedEvent)
puts "Response completed!"
break
end

break if resumed_events.length >= 10
end

puts "Collected #{resumed_events.length} additional events"

# Show that we properly resumed from where we left off.
if resumed_events.any?
first_resumed_event = resumed_events.first
last_initial_event = events.last
puts "First resumed event sequence: #{first_resumed_event.sequence_number}"
puts "Should be greater than last initial event: #{last_initial_event.sequence_number}"
end
end

stream.close

puts
puts "Collected #{events.length} events"
puts "Response ID: #{response_id}"
puts "Last event sequence number: #{events.last.sequence_number}.\n"

# Give the background response some time to process more events.
puts "Waiting a moment for the background response to progress...\n"
sleep(2)

# Request 2: Resume the stream using the captured response_id.
puts "Resuming stream from sequence #{events.last.sequence_number}..."

resumed_stream = client.responses.stream(
previous_response_id: response_id,
starting_after: events.last.sequence_number
)

resumed_events = []
resumed_stream.each do |event|
resumed_events << event
puts "Event from resumed stream: #{event.type} (seq: #{event.sequence_number})"
# Stop when we get the completed event or collect enough events.
if event.is_a?(OpenAI::Models::Responses::ResponseCompletedEvent)
puts "Response completed!"
break
begin
puts "\n----- resuming stream with structured outputs -----"

class Step < OpenAI::BaseModel
required :explanation, String
required :output, String
end

break if resumed_events.length >= 10
end
class MathResponse < OpenAI::BaseModel
required :steps, OpenAI::ArrayOf[Step]
required :final_answer, String
end

puts "Creating a background streaming response with structured output..."
stream = client.responses.stream(
input: "solve 8x + 31 = 2",
model: "gpt-4o-2024-08-06",
text: MathResponse,
background: true
)

puts "\nCollected #{resumed_events.length} additional events"
events = []
response_id = ""

stream.each do |event|
events << event

case event
when OpenAI::Models::Responses::ResponseCreatedEvent
response_id = event.response.id if response_id.empty?
end

if events.length >= 5
break
end
end

puts "Waiting for the background response to complete...\n"
sleep(3)

puts
puts "Resuming stream from sequence #{events.last.sequence_number}..."

resumed_stream = client.responses.stream(
previous_response_id: response_id,
starting_after: events.last.sequence_number,
# NOTE: You must pass the structured output format when resuming to access parsed
# outputs in the resumed stream.
text: MathResponse
)

resumed_stream.each do |event|
case event
when OpenAI::Streaming::ResponseTextDeltaEvent
print(event.delta)
when OpenAI::Streaming::ResponseTextDoneEvent
puts
puts("--- Parsed object from resumed stream ---")
pp(event.parsed)
when OpenAI::Models::Responses::ResponseCompletedEvent
puts("Response completed.")
break
end
end

# Show that we properly resumed from where we left off.
if resumed_events.any?
first_resumed_event = resumed_events.first
last_initial_event = events.last
puts "First resumed event sequence: #{first_resumed_event.sequence_number}"
puts "Should be greater than last initial event: #{last_initial_event.sequence_number}"
puts "\nFinal response parsed outputs:"
response = resumed_stream.get_final_response
response
.output
.flat_map { _1.content }
.each do |content|
pp(content.parsed)
end
end
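
The resume logic in the example above hinges on `starting_after`: when a stream is resumed, only events whose `sequence_number` exceeds the cursor are replayed. A minimal plain-Ruby sketch of that filtering semantics (not the SDK's or the server's actual implementation):

```ruby
Event = Struct.new(:type, :sequence_number)

# Replay only the events strictly after the given cursor, mirroring the
# semantics of the `starting_after` parameter when resuming a stream.
def resume_events(all_events, starting_after:)
  all_events.select { |e| e.sequence_number > starting_after }
end

all_events = (1..8).map { |n| Event.new(:delta, n) }
resumed = resume_events(all_events, starting_after: 5)

p resumed.map(&:sequence_number) # prints: [6, 7, 8]
```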
41 changes: 20 additions & 21 deletions lib/openai/helpers/structured_output/json_schema_converter.rb
@@ -6,15 +6,9 @@ module StructuredOutput
# To customize the JSON schema conversion for a type, implement the `JsonSchemaConverter` interface.
module JsonSchemaConverter
# @api private
POINTER = Object.new.tap do
POINTERS = Object.new.tap do
_1.define_singleton_method(:inspect) do
"#<#{OpenAI::Helpers::StructuredOutput::JsonSchemaConverter}::POINTER>"
end
end.freeze
# @api private
COUNTER = Object.new.tap do
_1.define_singleton_method(:inspect) do
"#<#{OpenAI::Helpers::StructuredOutput::JsonSchemaConverter}::COUNTER>"
"#<#{OpenAI::Helpers::StructuredOutput::JsonSchemaConverter}::POINTERS>"
end
end.freeze
# @api private
@@ -81,14 +75,15 @@ def to_nilable(schema)
def cache_def!(state, type:, &blk)
defs, path = state.fetch_values(:defs, :path)
if (stored = defs[type])
stored[OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::COUNTER] += 1
stored.fetch(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTER)
pointers = stored.fetch(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTERS)
pointers.first.except(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::NO_REF).tap do
pointers << _1
end
else
ref_path = String.new
ref = {"$ref": ref_path}
stored = {
OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTER => ref,
OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::COUNTER => 1
OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTERS => [ref]
}
defs.store(type, stored)
schema = blk.call
@@ -112,17 +107,21 @@ def to_json_schema(type)
)
reused_defs = {}
defs.each_value do |acc|
ref = acc.fetch(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTER)
if (no_ref = ref.delete(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::NO_REF))
acc[OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::COUNTER] -= 1
sch = acc.except(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTERS)
pointers = acc.fetch(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTERS)

no_refs, refs = pointers.partition do
_1.delete(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::NO_REF)
end
cnt = acc.fetch(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::COUNTER)

sch = acc.except(
OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::POINTER,
OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::COUNTER
)
cnt > 1 && !no_ref ? reused_defs.store(ref.fetch(:$ref), sch) : ref.replace(sch)
case refs
in [ref]
ref.replace(sch)
in [_, ref, *]
reused_defs.store(ref.fetch(:$ref), sch)
else
end
no_refs.each { _1.replace(sch) }
end

xformed = reused_defs.transform_keys { _1.delete_prefix("#/$defs/") }
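
The refactor above replaces the `POINTER`/`COUNTER` pair with a single `POINTERS` list: every `$ref` placeholder created for a type is collected, and in the final pass a type referenced more than once is hoisted into `$defs` while a single-use type is inlined where its lone placeholder sits. A simplified plain-Ruby sketch of that decision (illustrative only, not the converter's actual code):

```ruby
# Each def entry tracks the schema plus every $ref placeholder pointing at it.
def resolve_defs!(defs)
  reused = {}
  defs.each do |name, entry|
    schema, pointers = entry.values_at(:schema, :pointers)
    if pointers.length > 1
      # Reused: leave the $ref placeholders and hoist the schema into $defs.
      reused[name] = schema
    else
      # Single use: inline the schema where the lone placeholder sits.
      pointers.each { |ref| ref.replace(schema) }
    end
  end
  reused
end

step = {type: "object", properties: {explanation: {type: "string"}}}
ref_a = {"$ref": "#/$defs/Step"}
ref_b = {"$ref": "#/$defs/Step"}
defs = {"Step" => {schema: step, pointers: [ref_a, ref_b]}}

p resolve_defs!(defs).keys # prints: ["Step"]
```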
12 changes: 11 additions & 1 deletion lib/openai/helpers/structured_output/union_of.rb
@@ -36,7 +36,17 @@ def to_json_schema_inner(state:)
mergeable_keys.each_key { mergeable_keys[_1] += 1 if schema.keys == _1 }
end
mergeable = mergeable_keys.any? { _1.last == schemas.length }
mergeable ? OpenAI::Internal::Util.deep_merge(*schemas, concat: true) : {anyOf: schemas}
if mergeable
OpenAI::Internal::Util.deep_merge(*schemas, concat: true)
else
{
anyOf: schemas.each do
if _1.key?(:$ref)
_1.update(OpenAI::Helpers::StructuredOutput::JsonSchemaConverter::NO_REF => true)
end
end
}
end
end
end

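
The union fix above tags any bare `$ref` member of an `anyOf` with the `NO_REF` marker; the converter's final pass then always inlines those tagged pointers, so a reference that appears only inside unions no longer participates in `$defs` extraction. A simplified sketch of the tagging step (`NO_REF` here is a stand-in symbol, not the library's constant):

```ruby
NO_REF = :__no_ref__

# Tag every bare $ref member of a union so a later pass knows to inline it
# rather than count it toward $defs reuse.
def tag_union_members(schemas)
  {anyOf: schemas.each { |s| s[NO_REF] = true if s.key?(:"$ref") }}
end

members = [{"$ref": "#/$defs/Step"}, {type: "string"}]
union = tag_union_members(members)

p union[:anyOf][0][NO_REF]      # prints: true
p union[:anyOf][1].key?(NO_REF) # prints: false
```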