fix: direct ReadableStream perf blow-up (issue #1741)#1772
aymaneallaoui wants to merge 1 commit into elysiajs:main
Conversation
Pull request overview
This PR fixes a severe performance regression (issue #1741) where returning a ReadableStream directly from a route handler caused 100% CPU usage and 100x+ latency compared to wrapping the stream in new Response(body). The root cause was that binary chunks (Uint8Array, ArrayBuffer, typed views) were being passed through JSON.stringify before being enqueued, instead of being streamed as raw bytes.
Changes:
- Adds `isBinaryChunk`, `enqueueBinaryChunk`, and `isPromiseLike` helpers to `src/adapter/utils.ts` for fast-path binary chunk handling in `createStreamHandler`
- Refactors the stream loop's chunk dispatch into an `enqueueChunk` closure that routes binary types as raw bytes and text/SSE types through the existing JSON serialization path
- Adds three regression tests covering large binary payloads, mixed typed-view/Blob chunks, and generator-function binary yields
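The helper names above come from the PR description; the bodies below are a sketch of what such a fast path could look like, not the merged implementation:

```typescript
// Sketch of the binary fast-path helpers. Only the names isBinaryChunk and
// enqueueBinaryChunk come from the PR; these bodies are an assumption.

// True for chunk types that should bypass JSON serialization entirely.
const isBinaryChunk = (
    chunk: unknown
): chunk is Uint8Array | ArrayBuffer | ArrayBufferView | Blob =>
    chunk instanceof Uint8Array ||
    chunk instanceof ArrayBuffer ||
    ArrayBuffer.isView(chunk) ||
    chunk instanceof Blob

// Enqueue binary data as raw bytes instead of JSON.stringify-ing it.
const enqueueBinaryChunk = async (
    controller: ReadableStreamDefaultController<Uint8Array>,
    chunk: Uint8Array | ArrayBuffer | ArrayBufferView | Blob
): Promise<void> => {
    if (chunk instanceof Uint8Array) controller.enqueue(chunk)
    else if (chunk instanceof ArrayBuffer)
        controller.enqueue(new Uint8Array(chunk))
    else if (ArrayBuffer.isView(chunk))
        controller.enqueue(
            new Uint8Array(chunk.buffer, chunk.byteOffset, chunk.byteLength)
        )
    // Blob: resolve to bytes once, then enqueue.
    else controller.enqueue(new Uint8Array(await chunk.arrayBuffer()))
}
```

The key point is that none of these branches touch `JSON.stringify`, which is what made large binary streams so slow before.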
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| `src/adapter/utils.ts` | Adds binary chunk helpers and refactors `createStreamHandler`'s enqueue logic to bypass JSON serialization for binary data |
| `test/response/stream.test.ts` | Adds three tests covering ReadableStream binary chunks, mixed view/Blob chunks, and generator binary output |
```ts
if (!isSSE && isBinaryChunk(chunk))
    return enqueueBinaryChunk(controller, chunk)
```
When a generator function yields a binary type (Uint8Array, ArrayBuffer, Blob, etc.) as its first value, init.value is that binary object. Since typeof on a Uint8Array instance is 'object' (and likewise for all ArrayBufferView, ArrayBuffer, and Blob instances), the contentType determination (which reads init?.value && typeof init?.value === 'object') evaluates to 'application/json', even though enqueueChunk now correctly emits raw bytes for that chunk. This means the response will advertise Content-Type: application/json while delivering binary bytes, which is incorrect.
The isBinaryChunk predicate should be used in the contentType conditional to detect binary init values and return 'application/octet-stream' instead.
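The suggested fix could look roughly like this; the branch shape is inferred from the comment above, not copied from `src/adapter/utils.ts`:

```typescript
// Hypothetical shape of the contentType decision with the binary check
// added first. Not the actual Elysia source.
const isBinaryChunk = (value: unknown): boolean =>
    value instanceof Uint8Array ||
    value instanceof ArrayBuffer ||
    ArrayBuffer.isView(value) ||
    value instanceof Blob

const contentTypeFor = (value: unknown): string => {
    // Binary values are objects too, so they must be tested before the
    // generic object check or they would be labelled application/json.
    if (isBinaryChunk(value)) return 'application/octet-stream'
    if (value && typeof value === 'object') return 'application/json'
    return 'text/plain'
}
```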
```ts
const response = await app.handle(req('/'))
const result = new Uint8Array(await response.arrayBuffer())
```
The test verifies that binary bytes from a generator are transmitted correctly but does not assert the Content-Type header. Because Uint8Array is an object, the existing contentType logic (lines 227-231 in src/adapter/utils.ts) sets Content-Type: application/json for a generator that yields a Uint8Array as its first chunk, even though the body now contains raw bytes. A content-type assertion would catch this regression.
```ts
expect(response.headers.get('content-type')).toMatch(
    /^application\/octet-stream\b/i
)
```
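Outside the Elysia test harness, the same check can be exercised against a plain Response wrapping a generator-fed stream. This is a self-contained sketch: the header here is set by hand, whereas in the framework it would come from the fixed contentType logic:

```typescript
// Build a ReadableStream from a generator that yields binary chunks,
// wrap it in a Response, and check both the bytes and the Content-Type.
function* binaryChunks(): Generator<Uint8Array> {
    yield new Uint8Array([1, 2, 3])
    yield new Uint8Array([4, 5])
}

const streamFromGenerator = (gen: Generator<Uint8Array>) =>
    new ReadableStream<Uint8Array>({
        pull(controller) {
            const r = gen.next()
            if (r.done) controller.close()
            else controller.enqueue(r.value)
        }
    })

const checkBinaryResponse = async (): Promise<[string | null, number[]]> => {
    const response = new Response(streamFromGenerator(binaryChunks()), {
        headers: { 'content-type': 'application/octet-stream' }
    })
    const result = new Uint8Array(await response.arrayBuffer())
    return [response.headers.get('content-type'), Array.from(result)]
}
```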
Due to a merge conflict with #1803, I have rewritten your code to merge with the backpressure-based stream, included the test cases, and mentioned this in CHANGELOG.md. Thanks for your contribution!
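For context, a backpressure-based stream produces chunks from `pull()` only when the consumer reads, rather than eagerly enqueueing everything up front. A minimal illustration of the idea (not the merged Elysia code):

```typescript
// Pull-based underlying source: the stream only produces the next chunk
// when the reader asks for it, so a slow consumer naturally throttles
// the producer instead of buffering unboundedly.
const makeBackpressuredStream = (chunks: Uint8Array[]) => {
    let i = 0
    return new ReadableStream<Uint8Array>({
        pull(controller) {
            if (i < chunks.length) controller.enqueue(chunks[i++])
            else controller.close()
        }
    })
}

// Drain a stream and count the bytes delivered.
const totalBytes = async (stream: ReadableStream<Uint8Array>) => {
    let total = 0
    const reader = stream.getReader()
    for (;;) {
        const { value, done } = await reader.read()
        if (done || !value) return total
        total += value.byteLength
    }
}
```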
Fixes #1741

Direct `ReadableStream` responses were much slower than `new Response(body)` because binary chunks were run through `JSON.stringify` instead of being streamed as raw bytes.

Changes

- Streams binary chunks (`Blob`/`ArrayBuffer`/typed views) as raw bytes

Result

Direct `ReadableStream` perf is now in the same range as `new Response(body)` on large real-world streams, without breaking existing stream behavior.

Some perf metrics:
Also did randomized binary fuzz-style checks and concurrent large-stream stress runs to make sure this doesn’t regress under load.
PS: all benchmarks were run on the same machine.
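A minimal version of such a fuzz-style round-trip check could look like the following (an illustration only; it drives a plain Response rather than Elysia's handler):

```typescript
// Random binary chunks -> ReadableStream -> Response -> arrayBuffer,
// then verify the bytes survive the round trip unchanged.
const randomChunk = (len: number): Uint8Array => {
    const buf = new Uint8Array(len)
    for (let i = 0; i < len; i++) buf[i] = Math.floor(Math.random() * 256)
    return buf
}

const roundTrip = async (chunks: Uint8Array[]): Promise<boolean> => {
    const stream = new ReadableStream<Uint8Array>({
        start(controller) {
            for (const c of chunks) controller.enqueue(c)
            controller.close()
        }
    })
    const got = new Uint8Array(await new Response(stream).arrayBuffer())

    // Concatenate the input chunks for comparison.
    const expected = new Uint8Array(
        chunks.reduce((n, c) => n + c.byteLength, 0)
    )
    let offset = 0
    for (const c of chunks) {
        expected.set(c, offset)
        offset += c.byteLength
    }
    return (
        got.length === expected.length &&
        got.every((b, i) => b === expected[i])
    )
}
```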