
fix: direct ReadableStream perf blow-up (issue #1741)#1772

Closed
aymaneallaoui wants to merge 1 commit into elysiajs:main from aymaneallaoui:fix/readable-stream-cpu-latency

Conversation

Contributor

@aymaneallaoui aymaneallaoui commented Mar 1, 2026

Fixes #1741

Direct ReadableStream responses were extremely slow because binary chunks were passed through JSON.stringify instead of being streamed as raw bytes.

Changes

  • treat binary chunks (Blob/ArrayBuffer/typed views) as raw bytes
  • keep SSE + object stream behavior the same
  • add regression tests for binary stream cases
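The first change above hinges on detecting binary chunks and enqueueing them without serialization. A minimal sketch of what that fast path can look like, using the helper names this PR adds (`isBinaryChunk`, `enqueueBinaryChunk`) but not the actual code from `src/adapter/utils.ts`:

```typescript
// Sketch only: the real helpers live in src/adapter/utils.ts.
const isBinaryChunk = (
	chunk: unknown
): chunk is Uint8Array | ArrayBuffer | Blob | ArrayBufferView =>
	chunk instanceof Uint8Array ||
	chunk instanceof ArrayBuffer ||
	chunk instanceof Blob ||
	ArrayBuffer.isView(chunk)

const enqueueBinaryChunk = async (
	controller: ReadableStreamDefaultController<Uint8Array>,
	chunk: Uint8Array | ArrayBuffer | Blob | ArrayBufferView
) => {
	// Raw bytes go straight through — no JSON.stringify on the hot path
	if (chunk instanceof Uint8Array) controller.enqueue(chunk)
	else if (chunk instanceof ArrayBuffer)
		controller.enqueue(new Uint8Array(chunk))
	else if (chunk instanceof Blob)
		controller.enqueue(new Uint8Array(await chunk.arrayBuffer()))
	// Any other ArrayBufferView (e.g. DataView): re-wrap its backing buffer
	else
		controller.enqueue(
			new Uint8Array(chunk.buffer, chunk.byteOffset, chunk.byteLength)
		)
}
```

Stringifying a 100 MB typed array allocates a JSON text several times its size, which is exactly the CPU and latency blow-up the issue describes; copying views byte-for-byte avoids that entirely.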

Result

Direct ReadableStream perf is now in the same range as new Response(body) on large real-world streams, without breaking existing stream behavior.

Performance metrics

  • before: a direct 100 MB stream could take ~26 s at 100% CPU (as reported in the issue), while the wrapped response took ~0.2 s
  • after: direct and wrapped are in the same ballpark (~0.14 s vs ~0.13 s in local checks)

I also ran randomized binary fuzz-style checks and concurrent large-stream stress runs to make sure this doesn't regress under load.
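An illustrative fuzz-style check (not the PR's actual test code): push random byte chunks through a ReadableStream, read them back the way a client would, and verify byte-for-byte equality. Chunk sizes and counts here are arbitrary.

```typescript
// Generate n random bytes (illustrative helper, not from the PR)
const randomBytes = (n: number) => {
	const out = new Uint8Array(n)
	for (let i = 0; i < n; i++) out[i] = (Math.random() * 256) | 0
	return out
}

// Stream the chunks, collect them back, compare against the concatenation
const roundTrip = async (chunks: Uint8Array[]) => {
	const stream = new ReadableStream<Uint8Array>({
		start(controller) {
			for (const chunk of chunks) controller.enqueue(chunk)
			controller.close()
		}
	})
	const received = new Uint8Array(await new Response(stream).arrayBuffer())

	const expected = new Uint8Array(chunks.reduce((n, c) => n + c.length, 0))
	let offset = 0
	for (const chunk of chunks) {
		expected.set(chunk, offset)
		offset += chunk.length
	}
	return (
		received.length === expected.length &&
		received.every((b, i) => b === expected[i])
	)
}
```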

Note: all measurements were taken on the same machine.

Summary by CodeRabbit

Release Notes

  • Improvements

    • Enhanced binary data streaming support with improved handling of various binary formats (Uint8Array, Blob, ArrayBuffer, and buffer views).
    • Optimized stream initialization and chunk processing for better reliability and consistency across SSE, binary, and JSON data types.
  • Tests

    • Expanded streaming test coverage with validation of binary data handling and mixed binary source scenarios.

Copilot AI review requested due to automatic review settings March 1, 2026 23:31
@coderabbitai
Contributor

coderabbitai bot commented Mar 1, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2c4c117 and 437729e.

📒 Files selected for processing (2)
  • src/adapter/utils.ts
  • test/response/stream.test.ts

Walkthrough

The PR refactors stream handling in adapter utilities by introducing a centralized enqueueChunk function that standardizes processing of different chunk types (binary, SSE, JSON/string). Helpers for Promise detection and binary data typing are added, along with three new test cases validating binary stream handling across various data types.

Changes

Cohort / File(s) Summary
Stream handling refactoring
src/adapter/utils.ts
Introduces enqueueChunk helper to centralize chunk processing logic. Adds utilities for Promise-like detection and binary chunk typing. Reworks stream initialization and generator item processing to use unified enqueue flow with conditional promise awaiting. Extends support for Blob, ArrayBuffer, Uint8Array, and ArrayBuffer views while maintaining SSE formatting.
Binary stream test coverage
test/response/stream.test.ts
Adds three test cases: streaming large Uint8Array in subchunks, mixed binary views (Uint8Array slice, DataView, Blob), and generator-yielded Uint8Array chunks. All tests validate exact byte-for-byte correctness of binary data handling.
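A self-contained sketch of the byte-for-byte validation these tests perform. The Elysia route handling is replaced here by a plain `Response` so the example runs on its own; the real tests go through `app.handle()`.

```typescript
// Deterministic 1 KiB payload, yielded in 256-byte sub-chunks as in the
// large-Uint8Array test case (sizes here are illustrative).
const source = new Uint8Array(1024).map((_, i) => i % 256)

const stream = new ReadableStream<Uint8Array>({
	start(controller) {
		for (let i = 0; i < source.length; i += 256)
			controller.enqueue(source.subarray(i, i + 256))
		controller.close()
	}
})

// Read the body back and compare every byte against the source
const verify = async () => {
	const result = new Uint8Array(await new Response(stream).arrayBuffer())
	if (result.length !== source.length) return false
	return result.every((b, i) => b === source[i])
}
```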

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes


Poem

🐰 Hopping through streams with organized care,
Chunks enqueued in a unified flow,
Binary data now knows where to go,
Promises await, and tests verify so,
A refactored path—efficient and fair!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title directly addresses the main change: fixing a performance issue with direct ReadableStream responses by preventing unnecessary JSON stringification of binary chunks.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.


@pkg-pr-new

pkg-pr-new bot commented Mar 1, 2026


npm i https://pkg.pr.new/elysia@1772

commit: 3bf7566

Copilot AI left a comment

Pull request overview

This PR fixes a severe performance regression (issue #1741) where returning a ReadableStream directly from a route handler caused 100% CPU usage and 100x+ latency compared to wrapping the stream in new Response(body). The root cause was that binary chunks (Uint8Array, ArrayBuffer, typed views) were being passed through JSON.stringify before being enqueued, instead of being streamed as raw bytes.

Changes:

  • Adds isBinaryChunk, enqueueBinaryChunk, and isPromiseLike helpers to src/adapter/utils.ts for fast-path binary chunk handling in createStreamHandler
  • Refactors the stream loop's chunk dispatch into an enqueueChunk closure that routes binary types as raw bytes and text/SSE types through the existing JSON serialization path
  • Adds three regression tests covering large binary payloads, mixed typed-view/Blob chunks, and generator-function binary yields
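The dispatch described in the second bullet can be sketched roughly as follows. This is a hypothetical reconstruction under the names used in this review, not the actual `createStreamHandler` closure from `src/adapter/utils.ts`:

```typescript
const encoder = new TextEncoder()

// Sketch of the enqueueChunk dispatch: binary types as raw bytes,
// everything else through the text/SSE serialization path.
const enqueueChunk = (
	controller: ReadableStreamDefaultController<Uint8Array>,
	chunk: unknown,
	isSSE: boolean
) => {
	if (!isSSE) {
		// Fast path: raw bytes, no JSON.stringify
		if (chunk instanceof Uint8Array) return controller.enqueue(chunk)
		if (chunk instanceof ArrayBuffer)
			return controller.enqueue(new Uint8Array(chunk))
		if (ArrayBuffer.isView(chunk))
			return controller.enqueue(
				new Uint8Array(chunk.buffer, chunk.byteOffset, chunk.byteLength)
			)
	}
	// Text/SSE path: serialize objects, keep SSE framing
	const text = typeof chunk === 'string' ? chunk : JSON.stringify(chunk)
	controller.enqueue(encoder.encode(isSSE ? `data: ${text}\n\n` : text))
}
```

(Blob is omitted here because converting it requires an async `arrayBuffer()` call, which is where the PR's conditional promise awaiting comes in.)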

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

File Description
src/adapter/utils.ts Adds binary chunk helpers and refactors createStreamHandler's enqueue logic to bypass JSON serialization for binary data
test/response/stream.test.ts Adds three tests covering ReadableStream binary chunks, mixed view/Blob chunks, and generator binary output


Comment on lines +266 to +267
if (!isSSE && isBinaryChunk(chunk))
return enqueueBinaryChunk(controller, chunk)

Copilot AI Mar 1, 2026


When a generator function yields a binary type (Uint8Array, ArrayBuffer, Blob, etc.) as its first value, init.value is that binary object. Since typeof Uint8Array === 'object' (and similarly for all ArrayBufferView/ArrayBuffer/Blob), the contentType determination (which reads init?.value && typeof init?.value === 'object') evaluates to 'application/json' — even though enqueueChunk now correctly emits raw bytes for that chunk. This means the response will advertise Content-Type: application/json while delivering binary bytes, which is incorrect.

The isBinaryChunk predicate should be used in the contentType conditional to detect binary init values and return 'application/octet-stream' instead.
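A sketch of that suggested fix: route binary first-chunk values to `application/octet-stream` before the generic object check. The surrounding conditional is paraphrased from the review's description of `src/adapter/utils.ts`, not copied from it, and the fallback value is an assumption:

```typescript
// Mirrors the isBinaryChunk helper this PR adds (sketch only)
const isBinaryChunk = (value: unknown) =>
	value instanceof Uint8Array ||
	value instanceof ArrayBuffer ||
	value instanceof Blob ||
	ArrayBuffer.isView(value)

// Hypothetical content-type selection for the stream's first value;
// the 'text/event-stream' fallback is assumed, not from the source.
const contentTypeFor = (firstValue: unknown) => {
	// Check binary first: typed arrays are objects, so the generic
	// object check below would otherwise mislabel them as JSON
	if (isBinaryChunk(firstValue)) return 'application/octet-stream'
	if (firstValue && typeof firstValue === 'object') return 'application/json'
	return 'text/event-stream'
}
```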


const response = await app.handle(req('/'))
const result = new Uint8Array(await response.arrayBuffer())


Copilot AI Mar 1, 2026


The test verifies that binary bytes from a generator are transmitted correctly but does not assert the Content-Type header. Because Uint8Array is an object, the existing contentType logic (lines 227-231 in src/adapter/utils.ts) sets Content-Type: application/json for a generator that yields a Uint8Array as its first chunk, even though the body now contains raw bytes. A content-type assertion would catch this regression.

Suggested change
expect(response.headers.get('content-type')).toMatch(
/^application\/octet-stream\b/i
)

@SaltyAom SaltyAom closed this in ccbd9e4 Mar 16, 2026
@SaltyAom
Member

Due to a merge conflict with #1803, I have rewritten your code to merge it with the backpressure-based stream, included the test cases, and mentioned this in CHANGELOG.md.

Thanks for your contribution


Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Returning a ReadableStream results in 100% CPU and >100x latency

3 participants