
Integrate StreamCallbacks.jl for streaming #72


Open · wants to merge 7 commits into base: main
5 changes: 4 additions & 1 deletion Project.toml
@@ -7,17 +7,20 @@ version = "0.11.0"
Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
HTTP = "cd3eb016-35fb-5094-929b-558a96fad6f3"
JSON3 = "0f8b85d8-7281-11e9-16c2-39a750bddbf1"
StreamCallbacks = "c1b9e933-98a0-46fc-8ea7-3b58b195fb0a"

[compat]
Dates = "1"
HTTP = "1"
JSON3 = "1"
StreamCallbacks = "0.6"
julia = "1"

[extras]
JET = "c3a54625-cd67-489e-a8e7-0a5a0ff4e31b"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Sockets = "6462fe0b-24de-5631-8697-dd941f90decc"

[targets]
test = ["JET", "Pkg", "Test", "Sockets"]
54 changes: 52 additions & 2 deletions README.md
@@ -28,7 +28,7 @@ __⚠️ We strongly suggest setting up your API key as an ENV variable__.

```julia
secret_key = ENV["OPENAI_API_KEY"]
model = "gpt-5-mini"
prompt = "Say \"this is a test\""

r = create_chat(
@@ -57,13 +57,63 @@ provider = OpenAI.OpenAIProvider(
)
response = create_chat(
    provider,
    "gpt-5-mini",
    [Dict("role" => "user", "content" => "Write some ancient Greek poetry")]
)
```

For more use cases [see tests](https://github.com/JuliaML/OpenAI.jl/tree/main/test).

## Streaming with StreamCallbacks

OpenAI.jl integrates [StreamCallbacks.jl](https://github.com/svilupp/StreamCallbacks.jl) for
streaming responses.

### 1. Stream to any `IO`

```julia
create_chat(secret_key, model, messages; streamcallback=stdout)
```

### 2. Capture stream chunks

```julia
using OpenAI
cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)
cb.chunks
```

### 3. Customize printing

```julia
using OpenAI
import StreamCallbacks: print_content

function print_content(io::IO, content; kwargs...)
    printstyled(io, "🌊 $content"; color=:cyan)
end

cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)
```

To fully customize processing, you can overload `StreamCallbacks.callback`:

```julia
using OpenAI
import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk, extract_content, print_content

@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
    processed_text = extract_content(cb.flavor, chunk; kwargs...)
    isnothing(processed_text) && return nothing
    print_content(cb.out, processed_text; kwargs...)
    return nothing
end
```

See [`examples/streamcallbacks.jl`](examples/streamcallbacks.jl) for a full walkthrough.

## Feature requests

Feel free to open a PR, or file an issue if that's out of reach!
6 changes: 5 additions & 1 deletion docs/src/index.md
@@ -22,4 +22,8 @@ create_embeddings(api_key::String, input, model_id::String=DEFAULT_EMBEDDING_MODEL_ID; http_kwargs::NamedTuple=NamedTuple(), kwargs...)

```@docs
create_images(api_key::String, prompt, n::Integer=1, size::String="256x256"; http_kwargs::NamedTuple=NamedTuple(), kwargs...)
```

## Streaming

See [Streaming](streaming.md) for examples using StreamCallbacks.
44 changes: 44 additions & 0 deletions docs/src/streaming.md
@@ -0,0 +1,44 @@
# Streaming

OpenAI.jl integrates [StreamCallbacks.jl](https://github.com/svilupp/StreamCallbacks.jl) for streaming responses.

## 1. Stream to any `IO`
```julia
create_chat(secret_key, model, messages; streamcallback=stdout)
```

## 2. Capture stream chunks
```julia
using OpenAI
cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)
cb.chunks
```

## 3. Customize printing
```julia
using OpenAI
import StreamCallbacks: print_content

function print_content(io::IO, content; kwargs...)
    printstyled(io, "🌊 $content"; color=:cyan)
end

cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)
```

For complete control you can overload `StreamCallbacks.callback`:
```julia
using OpenAI
import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk, extract_content, print_content

@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
    processed_text = extract_content(cb.flavor, chunk; kwargs...)
    isnothing(processed_text) && return nothing
    print_content(cb.out, processed_text; kwargs...)
    return nothing
end
```

See the `examples/streamcallbacks.jl` script for a full walkthrough.
24 changes: 12 additions & 12 deletions examples/functions_example.jl
@@ -1,24 +1,24 @@
## Example of using Functions for Julia


## Functions
tools = [
    Dict(
        "type" => "function",
        "name" => "get_avg_temperature",
        "description" => "Get average temperature in a given location",
        "parameters" => Dict(
            "type" => "object",
            "properties" => Dict(
                "location" => Dict(
                    "type" => "string",
                    "description" => "The city with no spaces, e.g. SanFrancisco"
                )
            ),
            "required" => ["location"]
        )
    )
]
resp = create_responses(ENV["OPENAI_API_KEY"], "What is the avg temp in New York?";
    tools = tools, tool_choice = "auto")

resp.response.output
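The items in `resp.response.output` can then be routed back to local Julia functions. A minimal sketch of that dispatch step, using a mock output item: the item shape (`"type"`, `"name"`, and a JSON-encoded `"arguments"` string, mirroring the Responses API function-call format) and the stub `get_avg_temperature` are assumptions for illustration, not part of this PR.

```julia
using JSON3

# Hypothetical local implementation of the tool declared above
get_avg_temperature(location) = 13.0

# Mock output item shaped like a Responses API function call (assumed shape)
output = [Dict("type" => "function_call",
               "name" => "get_avg_temperature",
               "arguments" => "{\"location\":\"NewYork\"}")]

results = Dict{String,Any}()
for item in output
    item["type"] == "function_call" || continue
    args = JSON3.read(item["arguments"])  # parse the JSON-encoded argument string
    results[item["name"]] = get_avg_temperature(args.location)
end
results
```

In a real script you would iterate over the actual `resp.response.output`, call the matching Julia function, and send the result back in a follow-up request.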
34 changes: 34 additions & 0 deletions examples/streamcallbacks.jl
@@ -0,0 +1,34 @@
# Streaming examples using StreamCallbacks.jl
using OpenAI

api_key = get(ENV, "OPENAI_API_KEY", "")
model = "gpt-5-mini"
messages = [Dict("role" => "user", "content" => "Write a short haiku about streams.")]

# 1. Stream directly to stdout (default printing, no customization)
create_chat(api_key, model, messages; streamcallback = stdout)

# 2. Stream with explicit StreamCallback to capture chunks
cb = StreamCallback()
create_chat(api_key, model, messages; streamcallback = cb)
@info "Received $(length(cb.chunks)) chunks"

# 3. Customize printing via `print_content`
import StreamCallbacks: print_content
function print_content(io::IO, content; kwargs...)
    printstyled(io, "🌊 $content"; color = :cyan)
end
cb2 = StreamCallback()
create_chat(api_key, model, messages; streamcallback = cb2)

# 4. Overload `callback` to change chunk handling
import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk,
                        extract_content
@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
    processed_text = extract_content(cb.flavor, chunk; kwargs...)
    isnothing(processed_text) && return nothing
    print_content(cb.out, reverse(processed_text); kwargs...)
    return nothing
end
cb3 = StreamCallback()
create_chat(api_key, model, messages; streamcallback = cb3)