
Commit fabf099

Authored by svilupp, web-flow, and roryl23
Integrate StreamCallbacks.jl for streaming (#72)
* Refine StreamCallbacks integration
* Document StreamCallbacks examples
* Use StreamCallbacks for streaming
* Format code with JuliaFormatter
* Run live API tests unconditionally
* change model to gpt-5-mini
* update model
* Update test/chatcompletion.jl
  Co-authored-by: roryl23 <[email protected]>

---------

Co-authored-by: svilupp <[email protected]>
Co-authored-by: roryl23 <[email protected]>
1 parent cf47a00 commit fabf099

13 files changed: +453 −290 lines
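Before the per-file diffs, a minimal sketch of the streaming interface this commit wires up, mirroring the README example below (it assumes `OPENAI_API_KEY` is set and makes a live API call):

```julia
using OpenAI

# Assumes OPENAI_API_KEY is exported in the environment.
secret_key = ENV["OPENAI_API_KEY"]
model = "gpt-5-mini"
messages = [Dict("role" => "user", "content" => "Say \"this is a test\"")]

# New in this commit: the streamcallback keyword accepts any IO (or a StreamCallback),
# so tokens are printed to stdout as they arrive.
create_chat(secret_key, model, messages; streamcallback = stdout)
```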

Project.toml

Lines changed: 4 additions & 1 deletion
```diff
@@ -7,17 +7,20 @@ version = "0.11.0"
 Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
 HTTP = "cd3eb016-35fb-5094-929b-558a96fad6f3"
 JSON3 = "0f8b85d8-7281-11e9-16c2-39a750bddbf1"
+StreamCallbacks = "c1b9e933-98a0-46fc-8ea7-3b58b195fb0a"
 
 [compat]
 Dates = "1"
 HTTP = "1"
 JSON3 = "1"
+StreamCallbacks = "0.6"
 julia = "1"
 
 [extras]
 JET = "c3a54625-cd67-489e-a8e7-0a5a0ff4e31b"
 Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
 Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
+Sockets = "6462fe0b-24de-5631-8697-dd941f90decc"
 
 [targets]
-test = ["JET", "Pkg", "Test"]
+test = ["JET", "Pkg", "Test", "Sockets"]
```
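For downstream users this is just a routine package update; a quick sketch (assuming the release carrying this commit is registered):

```julia
using Pkg

# StreamCallbacks is resolved automatically as a dependency of OpenAI.jl,
# pinned to the 0.6 series by the [compat] entry above.
Pkg.update("OpenAI")
Pkg.status("OpenAI")
```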

README.md

Lines changed: 52 additions & 2 deletions
````diff
@@ -28,7 +28,7 @@ __⚠️ We strongly suggest setting up your API key as an ENV variable__.
 
 ```julia
 secret_key = ENV["OPENAI_API_KEY"]
-model = "gpt-4o-mini"
+model = "gpt-5-mini"
 prompt = "Say \"this is a test\""
 
 r = create_chat(
@@ -57,13 +57,63 @@ provider = OpenAI.OpenAIProvider(
 )
 response = create_chat(
     provider,
-    "gpt-4o-mini",
+    "gpt-5-mini",
     [Dict("role" => "user", "content" => "Write some ancient Greek poetry")]
 )
 ```
 
 For more use cases [see tests](https://github.com/JuliaML/OpenAI.jl/tree/main/test).
 
+## Streaming with StreamCallbacks
+
+OpenAI.jl integrates [StreamCallbacks.jl](https://github.com/svilupp/StreamCallbacks.jl) for
+streaming responses.
+
+### 1. Stream to any `IO`
+
+```julia
+create_chat(secret_key, model, messages; streamcallback=stdout)
+```
+
+### 2. Capture stream chunks
+
+```julia
+using OpenAI
+cb = StreamCallback()
+create_chat(secret_key, model, messages; streamcallback=cb)
+cb.chunks
+```
+
+### 3. Customize printing
+
+```julia
+using OpenAI
+import StreamCallbacks: print_content
+
+function print_content(io::IO, content; kwargs...)
+    printstyled(io, "🌊 $content"; color=:cyan)
+end
+
+cb = StreamCallback()
+create_chat(secret_key, model, messages; streamcallback=cb)
+```
+
+To fully customize processing, you can overload `StreamCallbacks.callback`:
+
+```julia
+using OpenAI
+import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk, extract_content, print_content
+
+@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
+    processed_text = extract_content(cb.flavor, chunk; kwargs...)
+    isnothing(processed_text) && return nothing
+    print_content(cb.out, processed_text; kwargs...)
+    return nothing
+end
+```
+
+See [`examples/streamcallbacks.jl`](examples/streamcallbacks.jl) for a full walkthrough.
+
 ## Feature requests
 
 Feel free to open a PR, or file an issue if that's out of reach!
````
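A small variation not spelled out in the README: since the first pattern streams to any `IO`, one can just as well pass an `IOBuffer` to capture the streamed text programmatically. A sketch under that assumption, reusing `secret_key`, `model`, and `messages` from the snippets above:

```julia
using OpenAI

# "Stream to any IO" above suggests any writable IO works, so an IOBuffer
# lets us collect the streamed text instead of printing it.
buf = IOBuffer()
create_chat(secret_key, model, messages; streamcallback = buf)
streamed_text = String(take!(buf))
```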

docs/src/index.md

Lines changed: 5 additions & 1 deletion
````diff
@@ -22,4 +22,8 @@ create_embeddings(api_key::String, input, model_id::String=DEFAULT_EMBEDDING_MOD
 
 ```@docs
 create_images(api_key::String, prompt, n::Integer=1, size::String="256x256"; http_kwargs::NamedTuple=NamedTuple(), kwargs...)
-```
+```
+
+## Streaming
+
+See [Streaming](streaming.md) for examples using StreamCallbacks.
````

docs/src/streaming.md

Lines changed: 44 additions & 0 deletions
````diff
@@ -0,0 +1,44 @@
+# Streaming
+
+OpenAI.jl integrates [StreamCallbacks.jl](https://github.com/svilupp/StreamCallbacks.jl) for streaming responses.
+
+## 1. Stream to any `IO`
+```julia
+create_chat(secret_key, model, messages; streamcallback=stdout)
+```
+
+## 2. Capture stream chunks
+```julia
+using OpenAI
+cb = StreamCallback()
+create_chat(secret_key, model, messages; streamcallback=cb)
+cb.chunks
+```
+
+## 3. Customize printing
+```julia
+using OpenAI
+import StreamCallbacks: print_content
+
+function print_content(io::IO, content; kwargs...)
+    printstyled(io, "🌊 $content"; color=:cyan)
+end
+
+cb = StreamCallback()
+create_chat(secret_key, model, messages; streamcallback=cb)
+```
+
+For complete control you can overload `StreamCallbacks.callback`:
+```julia
+using OpenAI
+import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk, extract_content, print_content
+
+@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
+    processed_text = extract_content(cb.flavor, chunk; kwargs...)
+    isnothing(processed_text) && return nothing
+    print_content(cb.out, processed_text; kwargs...)
+    return nothing
+end
+```
+
+See the `examples/streamcallbacks.jl` script for a full walkthrough.
````

examples/functions_example.jl

Lines changed: 12 additions & 12 deletions
```diff
@@ -1,24 +1,24 @@
 ## Example of using Functions for Julia
 
-
 ## Functions
 tools = [
     Dict(
-    "type" => "function",
-    "name" => "get_avg_temperature",
-    "description" => "Get average temperature in a given location",
-    "parameters" => Dict(
+        "type" => "function",
+        "name" => "get_avg_temperature",
+        "description" => "Get average temperature in a given location",
+        "parameters" => Dict(
            "type" => "object",
            "properties" => Dict(
                "location" => Dict(
-                "type" => "string",
-                "description" => "The city with no spaces, e.g. SanFrancisco",
-            )
-        ),
-        "required" => ["location"],
+                    "type" => "string",
+                    "description" => "The city with no spaces, e.g. SanFrancisco"
                )
+            ),
+            "required" => ["location"]
        )
+    )
 ]
-resp = create_responses(ENV["OPENAI_API_KEY"], "What is the avg temp in New York?"; tools=tools, tool_choice="auto")
+resp = create_responses(ENV["OPENAI_API_KEY"], "What is the avg temp in New York?";
+    tools = tools, tool_choice = "auto")
 
-resp.response.output
+resp.response.output
```
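The example stops at `resp.response.output`; a hedged sketch of dispatching the returned tool call could follow it. The `function_call` item shape (`type`, `name`, `arguments` as a JSON string) is an assumption about the Responses API payload, not something this diff shows, and `get_avg_temperature` is a hypothetical local helper:

```julia
using JSON3  # already a dependency of OpenAI.jl (see Project.toml above)

# Hypothetical local implementation backing the "get_avg_temperature" tool.
get_avg_temperature(location) = "Average temperature in $location: 22°C"

for item in resp.response.output
    # Assumption: tool calls arrive as output items with type == "function_call".
    if get(item, :type, "") == "function_call"
        args = JSON3.read(item.arguments)
        @info "Model requested $(item.name)" result = get_avg_temperature(args.location)
    end
end
```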

examples/streamcallbacks.jl

Lines changed: 34 additions & 0 deletions
```diff
@@ -0,0 +1,34 @@
+# Streaming examples using StreamCallbacks.jl
+using OpenAI
+
+api_key = get(ENV, "OPENAI_API_KEY", "")
+model = "gpt-5-mini"
+messages = [Dict("role" => "user", "content" => "Write a short haiku about streams.")]
+
+# 1. Stream to stdout (no differences)
+create_chat(api_key, model, messages; streamcallback = stdout)
+
+# 2. Stream with explicit StreamCallback to capture chunks
+cb = StreamCallback()
+create_chat(api_key, model, messages; streamcallback = cb)
+@info "Received $(length(cb.chunks)) chunks"
+
+# 3. Customize printing via `print_content`
+import StreamCallbacks: print_content
+function print_content(io::IO, content; kwargs...)
+    printstyled(io, "🌊 $content"; color = :cyan)
+end
+cb2 = StreamCallback()
+create_chat(api_key, model, messages; streamcallback = cb2)
+
+# 4. Overload `callback` to change chunk handling
+import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk,
+                        extract_content
+@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
+    processed_text = extract_content(cb.flavor, chunk; kwargs...)
+    isnothing(processed_text) && return nothing
+    print_content(cb.out, reverse(processed_text); kwargs...)
+    return nothing
+end
+cb3 = StreamCallback()
+create_chat(api_key, model, messages; streamcallback = cb3)
```
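A hedged note on trying this script locally: the activation path assumes you are at the repository root, and every call hits the live API (it needs a valid key and spends tokens):

```julia
using Pkg

# Activate the project at the repository root, pull its dependencies,
# then run the streaming walkthrough end to end.
Pkg.activate(".")
Pkg.instantiate()
include("examples/streamcallbacks.jl")
```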
