
Conversation

@ihabadham
Contributor

@ihabadham ihabadham commented Nov 26, 2025

Summary

Adds error handling for client disconnects during streaming, which was missing after the architecture change in #2111.

Background

Problem

When a client disconnects mid-stream (browser closed, network drop), response.stream.write raises IOError or Errno::EPIPE. Without handling:

  1. The exception crashes the request
  2. Producer tasks continue processing and enqueueing chunks unnecessarily, wasting CPU

Changes

1. Writer error handling (stream.rb)

  • Catch IOError/Errno::EPIPE in writing task
  • Set client_disconnected flag and stop barrier to cancel producer tasks
  • Log disconnect for debugging (when logging_on_server enabled)

2. Producer early termination (react_on_rails_pro_helper.rb)

  • Add stream.closed? check before processing each chunk
  • Prevents deadlock when producer blocks on full queue after writer dies
  • Prevents wasted CPU when producer runs ahead of failed writer
  • Extract process_stream_chunks method for clarity

3. Configuration validation (configuration.rb)

  • Add setter validation for concurrent_component_streaming_buffer_size
  • Must be a positive integer

4. Tests

  • Add client disconnect handling describe block with IOError and EPIPE tests
  • Add buffer size validation tests
  • Add closed? stub to stream test setup
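
Taken together, the writer-side changes above boil down to roughly the following shape (a minimal sketch pieced together from the code excerpts quoted in the reviews below, not the literal implementation; names such as drain_streams_concurrently, @main_output_queue, @async_barrier, and log_client_disconnect mirror those excerpts):

# Sketch only: writer task plus shutdown ordering for client disconnects.
def drain_streams_concurrently(parent_task)
  client_disconnected = false

  # Consumer: the single task that writes queued chunks to the Rails live stream.
  writing_task = parent_task.async do
    while (chunk = @main_output_queue.dequeue)
      response.stream.write(chunk)
    end
  rescue IOError, Errno::EPIPE => e
    # Browser closed or network dropped: remember it and log, but don't crash the request.
    client_disconnected = true
    log_client_disconnect("writer", e)
  end

  @async_barrier.wait # wait for producer tasks to finish enqueueing
ensure
  @main_output_queue.close # unblocks the writer if it is blocked on dequeue
  writing_task.wait        # guarantees the rescue above ran before the flag is read
  @async_barrier.stop if client_disconnected # cancel producers; skip wasted work
end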

Test Plan

  • bundle exec rspec spec/react_on_rails_pro/ - all 152 tests pass
  • bundle exec rubocop - no offenses
  • Verified tests fail on master, pass on this branch

🤖 Generated with Claude Code

Summary by CodeRabbit

  • Bug Fixes

    • Improved streaming stability: better client-disconnect detection, logging, and shutdown to avoid wasted work while preserving previous chunk delivery semantics.
  • Configuration

    • Added a validated setting for concurrent streaming buffer size; only positive integers are accepted.
  • Enhancements

    • Centralized stream chunk processing for more consistent handling of first and subsequent chunks.
  • Tests

    • Added tests covering disconnect scenarios and buffer-size validation.


ihabadham and others added 2 commits November 26, 2025 21:51
When a client disconnects during streaming (browser closed, network drop),
the response stream write raises IOError or Errno::EPIPE. Without handling,
this crashes the request and wastes CPU on continued processing.

Changes:

1. Writer error handling (stream.rb):
   - Catch IOError/EPIPE in writing task
   - Stop barrier to cancel producer tasks on disconnect
   - Log disconnect for debugging (when logging_on_server enabled)

2. Producer early termination (react_on_rails_pro_helper.rb):
   - Add stream.closed? check before processing each chunk
   - Prevents deadlock when producer blocks on full queue
   - Prevents wasted CPU when producer runs ahead of failed writer

3. Configuration validation (configuration.rb):
   - Add setter validation for concurrent_component_streaming_buffer_size
   - Must be a positive integer

4. Tests:
   - Add "stops writing when client disconnects" test
   - Add buffer size validation tests
   - Add closed? stub to stream test setup

Based on error handling from PRs #2017 and #2026, adapted for the
new Async::Barrier architecture introduced in PR #2111.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
- Reorganize disconnect tests under "client disconnect handling" describe block
- Add test for Errno::EPIPE (broken pipe) in addition to IOError
- Simplify test names: "stops writing on IOError/Errno::EPIPE"
- Clean up comments and remove redundant assertions

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
@coderabbitai
Contributor

coderabbitai bot commented Nov 26, 2025

Walkthrough

Refactors streaming chunk processing into a reusable helper, adds client-disconnect detection and conditional logging to concurrent streaming, and enforces validation on the concurrent streaming buffer size configuration.

Changes

Cohort / File(s) — Summary

  • Helper refactor (react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb): Added public process_stream_chunks(stream, first_chunk_var, all_chunks) and replaced inline chunk-processing in consumer_stream_async with calls to this helper, centralizing iteration, first-chunk capture, callback collection, closed? checks, and enqueueing.
  • Concurrent stream disconnect handling (react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb): In drain_streams_concurrently, added a client_disconnected flag, a rescue for IOError and Errno::EPIPE in the writer path, a log_client_disconnect call, and adjusted ensure logic to close the main queue, wait for the writer task, and stop the async barrier on disconnect.
  • Configuration validation (react_on_rails_pro/lib/react_on_rails_pro/configuration.rb): Replaced attr_accessor with attr_reader for concurrent_component_streaming_buffer_size and added a validated setter concurrent_component_streaming_buffer_size=(value) that requires a positive Integer and raises ReactOnRailsPro::Error on invalid input.
  • Specs / test scaffolding (react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb, react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb, react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb): Added tests for buffer-size validation and client-disconnect scenarios (IOError / Errno::EPIPE); introduced response.stream mocks and closed? stubs; client-disconnect tests appear duplicated in the stream spec.

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant Consumer as consumer_stream_async
    participant Helper as process_stream_chunks
    participant Queue as MainOutputQueue
    participant Drain as drain_streams_concurrently
    rect rgb(235,248,255)
      Note right of Helper: Centralized chunk iteration\n(first chunk capture, enqueue, closed? checks)
    end

    Client->>Consumer: request streaming
    Consumer->>Helper: iterate over stream chunks
    alt chunk produced
        Helper->>Queue: enqueue chunk
        Helper-->>Consumer: return first chunk if first
    else stream closed or no chunks
        Helper-->>Consumer: stop iteration / return nil
    end

    Consumer->>Drain: run concurrent draining (writer task)
    alt writer raises IOError or Errno::EPIPE
        Drain->>Drain: set client_disconnected = true
        Drain->>Drain: log_client_disconnect()
        Drain->>Drain: stop async barrier early
    else normal completion
        Drain->>Drain: close queue and complete barrier
    end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

  • Focus areas: concurrency semantics in drain_streams_concurrently (barrier stop conditions and ordering), correctness of process_stream_chunks preserving first-chunk vs callback behavior, and duplicated tests in stream_spec.rb.

Possibly related PRs

Suggested reviewers

  • justin808
  • AbanoubGhadban

Poem

🐇 I hopped through streams both wide and deep,
I caught the first nibble, then saved the heap,
When pipes went silent I flagged the scene,
Buffers checked neatly — tidy and keen,
Hooray — safe streams for every sleep! 🎉

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage — ⚠️ Warning: Docstring coverage is 30.77%, which is below the required threshold of 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Description Check — ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check — ✅ Passed: The title 'Add client disconnect handling for concurrent component streaming' accurately captures the main objective of the pull request, which is to restore client-disconnect error handling after a streaming architecture rewrite.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b0c86d0 and c023053.

📒 Files selected for processing (1)
  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (2 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{rb,rake}

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS run bundle exec rubocop and fix ALL violations before every commit/push

Files:

  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
**/*

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS ensure files end with a newline character before committing/pushing

Files:

  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
🧠 Learnings (1)
📓 Common learnings
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.
🧬 Code graph analysis (1)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (3)
packages/react-on-rails-pro/tests/AsyncQueue.ts (1)
  • dequeue (27-54)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (1)
  • response (18-18)
react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1)
  • configuration (9-38)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: pro-lint-js-and-ruby
  • GitHub Check: examples (3.2, minimum)
  • GitHub Check: examples (3.4, latest)
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: claude-review
🔇 Additional comments (4)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (4)

63-73: Excellent documentation of the producer-consumer pattern.

The comment clearly explains the streaming architecture and client disconnect handling behavior, making it easy for maintainers to understand the design decisions.


74-106: Robust client disconnect handling with proper race condition fix.

The implementation correctly handles client disconnects:

  1. Graceful error handling: Catches IOError/Errno::EPIPE without re-raising, preventing client disconnects from crashing the request.

  2. Race condition fix: Waiting for writing_task (line 102) before checking client_disconnected ensures the flag is set before evaluation—excellent catch and clear documentation.

  3. Proper cleanup order: The ensure block closes the queue, waits for the writer, then stops the barrier if needed. Calling stop on the barrier actively cancels all producer tasks, preventing wasted work after disconnect.

  4. Clear comments: The inline documentation explains the reasoning for each step, making the code maintainable.


108-114: Clean conditional logging helper.

The method appropriately:

  • Checks the logging configuration before logging
  • Uses debug level for disconnect events
  • Includes relevant context and exception class in the message

The context parameter provides flexibility for future use cases.
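
For reference, a helper matching that description could look roughly like this (a sketch based on the excerpts quoted later in this thread; the exact implementation may differ):

def log_client_disconnect(context, exception)
  return unless ReactOnRails.configuration.logging_on_server

  Rails.logger.debug do
    "[React on Rails Pro] Client disconnected during streaming (#{context}): #{exception.class}"
  end
end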


1-116: The verification task is complete. The original review comment is accurate and no changes are needed.

Summary of verification:

  • ✓ File ends with proper newline character
  • ✓ File structure is complete (116 lines)
  • ✓ Rubocop configuration exists in react_on_rails_pro directory
  • ✓ Review comment correctly suggests running bundle exec rubocop per coding guidelines
  • ✓ No issues found with the review comment


@claude

claude bot commented Nov 26, 2025

Code Review

✅ Overall Assessment

This is a high-quality PR that adds critical error handling for client disconnects during concurrent streaming. The implementation is well-thought-out, properly tested, and follows the project's conventions.

🌟 Strengths

  1. Clear motivation and context - The PR description excellently traces the history from "Concurrently drain streaming fibers" (#2015) through "Replace manual Fibers with Async gem primitives for component streaming" (#2111) and explains why this was needed
  2. Comprehensive error handling - Catches both IOError and Errno::EPIPE (covers different OS behaviors)
  3. Producer-consumer coordination - Smart use of client_disconnected flag + @async_barrier.stop to gracefully cancel producer tasks
  4. Test coverage - Both disconnect scenarios (IOError, EPIPE) are tested with realistic timing
  5. Code organization - Extracting process_stream_chunks method improves clarity
  6. Documentation - Excellent inline comments explaining the producer-consumer pattern

🔍 Detailed Analysis

1. Writer Error Handling (stream.rb:82-86)

rescue IOError, Errno::EPIPE => e
  client_disconnected = true
  log_client_disconnect("writer", e)
end

✅ Excellent: Graceful degradation without propagating exceptions to controller. Logging is conditional on logging_on_server config.

2. Producer Early Termination (react_on_rails_pro_helper.rb:330)

break if response.stream.closed?

✅ Critical optimization: Prevents:

  • Deadlock when queue is full and writer has died
  • Wasted CPU cycles processing chunks that will never be sent
  • Producer blocking on @main_output_queue.enqueue after writer exits
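
Pieced together from the fragments quoted in this thread, the producer side looks roughly like this (a sketch; the actual process_stream_chunks body may differ in details such as how the first chunk is handed back to the caller):

def process_stream_chunks(stream, first_chunk_var, all_chunks)
  first_chunk_seen = false
  stream.each_chunk do |chunk|
    # Stop producing once the client is gone; this also avoids blocking forever
    # on a full bounded queue after the writer has died.
    break if response.stream.closed?

    all_chunks&.push(chunk) # collected only when an on_complete callback needs them

    if first_chunk_seen
      @main_output_queue.enqueue(chunk)
    else
      first_chunk_seen = true
      first_chunk_var.value = chunk # first chunk is returned to the caller, not enqueued
    end
  end
  first_chunk_var.value = nil unless first_chunk_seen # empty-stream case
end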

3. Barrier Cleanup (stream.rb:97)

@async_barrier.stop if client_disconnected

✅ Proper resource cleanup: Ensures producer tasks are cancelled in the ensure block, preventing resource leaks.

4. Configuration Validation (configuration.rb:78-82)

unless value.is_a?(Integer) && value.positive?
  raise ReactOnRailsPro::Error, "..."
end

✅ Defensive programming: Prevents configuration errors at setup time rather than runtime.


🐛 Potential Issues & Questions

⚠️ Critical: Missing Test Coverage for Producer Early Termination

The tests verify that the writer stops on disconnect, but they don't verify that producers stop early via the response.stream.closed? check.

Current tests:

  • ✅ Verify writer stops after IOError/EPIPE
  • ❌ Don't verify producers check closed? and break early

Recommendation: Add a test that:

  1. Simulates client disconnect (set stream.closed? to return true)
  2. Enqueues many chunks to a producer
  3. Verifies the producer stops processing chunks after closed? returns true
  4. Confirms not all chunks were processed (proving early termination worked)

Example test structure:

it "stops producer processing when stream is closed" do
  queues, controller, stream = setup_stream_test(component_count: 1)
  
  # Simulate client disconnect after 2 writes
  allow(stream).to receive(:closed?).and_return(false, false, true)
  
  processed_chunks = []
  allow(stream).to receive(:write) { |chunk| processed_chunks << chunk }
  
  run_stream(controller) do
  10.times { |i| queues[0].enqueue("Chunk#{i}") }
    queues[0].close
    sleep 0.1
  end
  
  # Should have stopped early, not processed all 10 chunks
  expect(processed_chunks.length).to be < 12  # TEMPLATE + partial chunks
end

⚠️ Minor: all_chunks&.push(chunk) Behavior Change

Before: all_chunks << chunk if on_complete
After: all_chunks&.push(chunk)

These are functionally equivalent when all_chunks is [] or nil, but:

  • The change from << to push is stylistic (both work identically)
  • Using safe navigation &. is more idiomatic Ruby

✅ No issue, but note this is a behavior-preserving refactor.
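
For illustration:

all_chunks = nil           # no on_complete callback: nothing to collect
all_chunks&.push("chunk")  # => nil, a no-op

all_chunks = []            # on_complete callback present: chunks are collected
all_chunks&.push("chunk")  # => ["chunk"], same result as all_chunks << "chunk"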


⚠️ Minor: Potential Race Condition in Barrier Stop

In stream.rb:97:

@async_barrier.stop if client_disconnected

Question: Is there a race where:

  1. Writer catches disconnect, sets client_disconnected = true
  2. @async_barrier.wait (line 90) completes normally before ensure block runs
  3. Barrier is stopped after producers have already finished?

Answer: This is harmless - Async::Barrier#stop is idempotent and safe to call on finished barriers. Not a bug, but worth documenting.

Recommendation: Add comment:

# Safe to call even if barrier already finished - idempotent
@async_barrier.stop if client_disconnected

⚠️ Minor: writing_task.wait Could Hang if Writer Crashed

In stream.rb:101:

ensure
  @async_barrier.stop if client_disconnected
  @main_output_queue.close
  writing_task.wait  # Could this hang?
end

Scenario: If writing_task crashes with an exception other than IOError/EPIPE:

  • Exception is not caught in writing_task
  • writing_task.wait will re-raise the exception
  • This is correct behavior for unexpected errors

Question: Should this explicitly handle writing_task failures?

Current behavior: ✅ Correct - unexpected errors should propagate
Recommendation: No change needed, but consider adding integration test for non-disconnect write errors.


🔒 Security Considerations

No security concerns:

  • No user input processed in error paths
  • Logging uses safe string interpolation
  • Error handling doesn't expose sensitive information

⚡ Performance Considerations

Excellent optimizations:

  1. Early termination prevents CPU waste - producers stop immediately when client disconnects
  2. Bounded queue prevents memory bloat - validation ensures buffer size is positive
  3. Minimal overhead - Single closed? check per chunk is negligible

Potential micro-optimization (not necessary now):

  • Currently checks response.stream.closed? on every chunk
  • Could batch check every N chunks for high-frequency streams
  • But single method call overhead is ~nanoseconds, not worth optimizing yet
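
If batching were ever wanted, it might look something like the following (purely illustrative; CLOSED_CHECK_INTERVAL and the loop shape are hypothetical and not part of this PR):

CLOSED_CHECK_INTERVAL = 32 # hypothetical batch size

chunk_index = 0
stream.each_chunk do |chunk|
  chunk_index += 1
  # Only consult the stream every N chunks instead of on every iteration.
  break if (chunk_index % CLOSED_CHECK_INTERVAL).zero? && response.stream.closed?

  @main_output_queue.enqueue(chunk)
end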

📝 Documentation & Code Quality

Excellent:

  • Clear method documentation with producer-consumer pattern explanation
  • Inline comments explain why, not just what
  • RuboCop clean (as verified in test plan)

Minor suggestion: Add YARD @raise documentation to concurrent_component_streaming_buffer_size=:

# @raise [ReactOnRailsPro::Error] if value is not a positive integer
def concurrent_component_streaming_buffer_size=(value)

🧪 Test Quality

Good test coverage with room for improvement:

Covered:

  • ✅ IOError handling
  • ✅ Errno::EPIPE handling
  • ✅ Configuration validation (all edge cases)
  • ✅ Chunks written before disconnect

Missing:

  • ❌ Producer early termination (see detailed recommendation above)
  • ❌ Verify @async_barrier.stop is called on disconnect
  • ❌ Verify logging behavior when logging_on_server = true
  • ❌ Multiple concurrent producers with disconnect

Recommendation: Add at least one test verifying producer-side closed? check works as intended.


🎯 Final Recommendations

  1. MUST FIX: Add test coverage for producer early termination via response.stream.closed?
  2. NICE TO HAVE: Add comment about Async::Barrier#stop idempotency
  3. NICE TO HAVE: Add YARD @raise doc to setter method
  4. OPTIONAL: Integration test for non-disconnect write errors

✅ Approval Status

Approved with minor recommendations. The critical functionality is correct and well-implemented. The missing test coverage for producer early termination should be added, but the code itself is sound.

Excellent work on tracing the historical context and implementing a complete solution! 🎉

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (2)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1)

494-549: Consider adding test for logging behavior.

The disconnect handling tests verify the write stopping behavior, but there's no coverage for log_client_disconnect. Consider adding a test that verifies the debug log is written when ReactOnRails.configuration.logging_on_server is enabled.
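
A rough sketch of such a spec, assuming a controller instance that includes the Stream concern is available as in the surrounding disconnect examples (the direct send to the private helper keeps the sketch small; the spec's real wiring may differ):

it "logs the disconnect when logging_on_server is enabled" do
  allow(ReactOnRails.configuration).to receive(:logging_on_server).and_return(true)
  allow(Rails.logger).to receive(:debug)

  # `controller` stands in for an object including the Stream concern.
  controller.send(:log_client_disconnect, "writer", IOError.new("client disconnected"))

  expect(Rails.logger).to have_received(:debug)
end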

react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1)

232-238: Redundant validation method.

The validate_concurrent_component_streaming_buffer_size method is now redundant since the custom setter (lines 86-92) already performs the same validation at assignment time. The setter is triggered both during initialization (line 131) and when users configure via ReactOnRailsPro.configure.

Consider removing this method and the call on line 140:

 def setup_config_values
   configure_default_url_if_not_provided
   validate_url
   validate_remote_bundle_cache_adapter
   setup_renderer_password
   setup_assets_to_copy
-  validate_concurrent_component_streaming_buffer_size
   setup_execjs_profiler_if_needed
   check_react_on_rails_support_for_rsc
 end
 
 # ... later in the file ...
 
-def validate_concurrent_component_streaming_buffer_size
-  return if concurrent_component_streaming_buffer_size.is_a?(Integer) &&
-            concurrent_component_streaming_buffer_size.positive?
-
-  raise ReactOnRailsPro::Error,
-        "config.concurrent_component_streaming_buffer_size must be a positive integer"
-end
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 16b3908 and 47ca8b1.

📒 Files selected for processing (5)
  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2 hunks)
  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (2 hunks)
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1 hunks)
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb (1 hunks)
  • react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (2 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{rb,rake}

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS run bundle exec rubocop and fix ALL violations before every commit/push

Files:

  • react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb
  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
**/*

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS ensure files end with a newline character before committing/pushing

Files:

  • react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb
  • react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
🧠 Learnings (12)
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to /CHANGELOG.md : Update `/CHANGELOG.md` for user-visible changes (features, bug fixes, breaking changes, deprecations, performance improvements) to the open-source React on Rails gem and npm package

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-04-09T13:02:16.009Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to lib/react_on_rails/*.rb : Create RBS signature files for new Ruby files in `lib/react_on_rails/` by adding corresponding `.rbs` file in `sig/react_on_rails/`, adding to Steepfile, and validating before committing

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation in React on Rails Pro is handled through a chain of validations:
1. Pro version check in `run_stream_inside_fiber`
2. RSC support check during pack generation via `ReactOnRailsPro.configuration.enable_rsc_support`
3. RSC support validation during component registration
This makes additional validation in the helper methods unnecessary.

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-09-16T08:01:11.146Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1770
File: lib/generators/react_on_rails/templates/base/base/app/javascript/src/HelloWorld/ror_components/HelloWorld.client.jsx:2-2
Timestamp: 2025-09-16T08:01:11.146Z
Learning: React on Rails uses webpack CSS Modules configuration with namedExports: true, which requires the import syntax `import * as style from './file.module.css'` rather than the default export pattern. This configuration enables better tree shaking and bundle size optimization for CSS modules.

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-04-26T21:55:55.874Z
Learnt from: alexeyr-ci2
Repo: shakacode/react_on_rails PR: 1732
File: spec/dummy/client/app-react16/startup/ReduxSharedStoreApp.client.jsx:40-44
Timestamp: 2025-04-26T21:55:55.874Z
Learning: In the react_on_rails project, files under `app-react16` directories are copied/moved to corresponding `/app` directories during the conversion process (removing the `-react16` suffix), which affects their relative import paths at runtime.

Applied to files:

  • react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-10-23T17:22:01.074Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1875
File: lib/react_on_rails/utils.rb:112-124
Timestamp: 2025-10-23T17:22:01.074Z
Learning: In React on Rails, when Pro is installed but not licensed, the intended behavior is to raise an error on boot. The `react_on_rails_pro?` method validates licenses and should raise errors early (including during path resolution in methods like `server_bundle?`) to enforce licensing requirements rather than failing later with obscure errors.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation is handled in deeper level calls of the React on Rails Pro codebase, so it doesn't need to be validated again in the `rsc_payload_react_component` helper method.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-02-13T14:29:49.267Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: spec/react_on_rails/utils_spec.rb:218-218
Timestamp: 2025-02-13T14:29:49.267Z
Learning: In RSpec tests, prefer using local variables over constants within test blocks to avoid constant redefinition warnings and maintain better test isolation.

Applied to files:

  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: The `react_on_rails_pro/` directory has its own Prettier/ESLint configuration and must be linted separately

Applied to files:

  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to sig/react_on_rails/**/*.rbs : In RBS type signature files, define type signatures that correspond to Ruby files in `lib/react_on_rails/`

Applied to files:

  • react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb
🧬 Code graph analysis (3)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (3)
react_on_rails_pro/spec/dummy/client/app/ror-auto-load-components/RouterApp.server.jsx (1)
  • controller (10-10)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (1)
  • run_stream (615-618)
packages/react-on-rails-pro/tests/AsyncQueue.ts (1)
  • enqueue (10-20)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (1)
packages/react-on-rails-pro/tests/AsyncQueue.ts (1)
  • dequeue (27-54)
react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb (1)
react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (2)
  • concurrent_component_streaming_buffer_size (86-92)
  • configuration (9-38)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: examples (3.4, latest)
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: pro-lint-js-and-ruby
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: claude-review
🔇 Additional comments (9)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (3)

63-86: LGTM! Client disconnect handling is well-implemented.

The rescue block correctly catches IOError and Errno::EPIPE, which are the standard exceptions raised when writing to a disconnected client. Setting client_disconnected allows the ensure block to stop producers and avoid wasted work.

One consideration: the client_disconnected variable is set inside the async block. Ruby closures capture the variable binding, so this works correctly, but be aware that the variable must be read only after writing_task.wait completes to ensure visibility of the updated value.


95-102: Correct shutdown sequence on disconnect.

The order of operations is appropriate: stopping the barrier first cancels producer tasks to prevent wasted work, then closing the queue signals end of streaming, and finally waiting for the writing task ensures cleanup completes.


104-110: Good logging implementation.

Using a block for Rails.logger.debug is the right approach for lazy evaluation. The debug level is appropriate since client disconnects are expected occurrences in production, not errors.

react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2)

325-346: Good extraction with proper disconnect handling.

The process_stream_chunks method cleanly encapsulates the streaming logic. The response.stream.closed? check on line 330 correctly prevents wasted work by breaking early when the client disconnects.

The first chunk / subsequent chunks separation is clear, and the empty stream case (line 345) is handled correctly by setting first_chunk_var.value = nil only when no chunks were received.


314-318: Clean refactor.

Extracting the chunk processing logic into process_stream_chunks improves the readability of consumer_stream_async while maintaining the same behavior.

react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (2)

395-395: Necessary stub for disconnect detection.

Adding the closed? stub is required because process_stream_chunks in the helper now checks response.stream.closed? to detect client disconnects and break early.


494-548: Good test coverage for client disconnect scenarios.

The tests correctly verify behavior for both IOError and Errno::EPIPE exceptions:

  • Write 1: TEMPLATE (from render_to_string)
  • Write 2: Chunk1 (first enqueued chunk)
  • Write 3: Raises exception on Chunk2

The assertions expect(written_chunks).to eq(%w[TEMPLATE Chunk1]) correctly verify that streaming stops gracefully after the disconnect, with no exception propagating to the caller.

react_on_rails_pro/spec/react_on_rails_pro/configuration_spec.rb (1)

264-299: Comprehensive validation tests.

Good coverage of the validation edge cases:

  • Positive integer (valid) ✓
  • Zero (invalid boundary) ✓
  • Negative integer (invalid) ✓
  • Non-integer string (type check) ✓

The tests correctly verify both successful configuration and error messages for invalid values.

react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1)

73-92: Well-documented setter with proper validation.

The YARD documentation clearly explains the purpose, behavior, and constraints of the buffer size configuration. The validation correctly rejects non-integers and non-positive values.

ihabadham and others added 2 commits November 26, 2025 23:49
Test that response.stream.closed? check in process_stream_chunks
actually stops processing when client disconnects early. Also adds
required response mocks to existing streaming tests.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
The concurrent_component_streaming_buffer_size setter already validates
the value when assigned (lines 86-92), making the separate validation
method and its call in setup_config_values unnecessary.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
@claude

claude bot commented Nov 26, 2025

Code Review: Client Disconnect Handling for Concurrent Component Streaming

Summary

This PR addresses a critical gap in error handling for client disconnects during streaming, which was missing after the architectural changes in #2111. The implementation is well-designed and thoroughly tested. ✅


Strengths

1. Excellent Architecture & Design 🎯

  • Producer-consumer pattern: Clean separation between streaming tasks (producers) and the writer (consumer)
  • Graceful degradation: Client disconnects don't crash the server, they're handled silently
  • Resource conservation: Stopping the barrier when clients disconnect prevents wasted CPU on producer tasks
  • Clear documentation: The new comments in drain_streams_concurrently perfectly explain the producer-consumer pattern

2. Comprehensive Test Coverage

The tests cover all critical scenarios:

  • IOError and Errno::EPIPE handling in stream_spec.rb
  • Early termination when client disconnects mid-stream in helper_spec.rb
  • Configuration validation for buffer size (positive integers only)
  • Proper mocking of response.stream.closed? throughout the test suite

3. Good Refactoring 🔧

  • Extracting process_stream_chunks method improves readability
  • Moving validation from initialization-time to setter-time (configuration.rb) provides better error messages

Issues & Recommendations

Critical Issues

1. Potential Race Condition in drain_streams_concurrently ⚠️

Location: react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb:96-97

ensure
  # If client disconnected, stop all producer tasks to avoid wasted work
  @async_barrier.stop if client_disconnected

Problem: The client_disconnected flag is set in the writing_task (line 84), but the ensure block runs on the parent task. There's a race condition where the ensure block might execute before writing_task sets client_disconnected = true.

Timeline:

  1. Writer raises IOError at line 82
  2. Parent task's @async_barrier.wait completes (line 90)
  3. Parent task enters ensure block (line 95)
  4. client_disconnected might still be false if writing_task hasn't executed line 84 yet
  5. Barrier doesn't get stopped → producers continue wastefully

Suggested Fix:

def drain_streams_concurrently(parent_task)
  client_disconnected = Async::Variable.new

  writing_task = parent_task.async do
    while (chunk = @main_output_queue.dequeue)
      response.stream.write(chunk)
    end
  rescue IOError, Errno::EPIPE => e
    client_disconnected.value = true
    log_client_disconnect("writer", e)
  end

  begin
    @async_barrier.wait
  rescue StandardError => e
    @async_barrier.stop
    raise e
  end
ensure
  # Wait briefly for writing_task to set the flag if it errored
  writing_task.wait
  @async_barrier.stop if client_disconnected.value
  @main_output_queue.close
end

Alternatively, check writing_task.status for failure state.


2. Test Flakiness Risk ⚠️

Location: react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb:462-496

# Should have stopped early - not all chunks processed
# The exact count depends on timing, but should be less than 9 (all remaining)
expect(collected_chunks.length).to be < 9

Problem: This test relies on timing. The comment acknowledges "exact count depends on timing," which is a red flag for flaky tests.

Why it's flaky:

  • If the async task processes chunks faster than expected, it might process all 9 before closed? returns true
  • If the system is under load, timing assumptions break

Suggested Fix:

it "stops processing chunks when client disconnects" do
  many_chunks = Array.new(10) do |i|
    { html: "<div>Chunk #{i}</div>", consoleReplayScript: "" }
  end
  mock_request_and_response(many_chunks)

  # Use a counter to deterministically disconnect after N chunks
  chunks_checked = 0
  allow(mocked_rails_stream).to receive(:closed?) do
    chunks_checked += 1
    chunks_checked > 2 # Disconnect after checking 2 chunks
  end

  initial_result = stream_react_component(component_name, props: props, **component_options)
  expect(initial_result).to include("<div>Chunk 0</div>")

  @async_barrier.wait
  @main_output_queue.close

  collected_chunks = []
  while (chunk = @main_output_queue.dequeue)
    collected_chunks << chunk
  end

  # Should have exactly 1 chunk (Chunk 1), since we disconnect after 2 checks:
  # - Check 1: returns false, processes Chunk 1, enqueues it
  # - Check 2: returns true, breaks before Chunk 2
  expect(collected_chunks.length).to eq(1)
  expect(collected_chunks[0]).to include("<div>Chunk 1</div>")
end

Minor Issues

3. Inconsistent Null-Safety Operator 📝

Location: react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb:332

all_chunks&.push(chunk)  # Safe navigation

vs. previous code:

all_chunks << chunk if on_complete  # Conditional guard

Observation: The new code uses safe navigation (&.), which is fine, but the old code used explicit if on_complete guard. Both work, but:

  • Safe navigation is more concise
  • Explicit guard is more readable for developers unfamiliar with &.

Not a bug, just noting the style change. The safe navigation is preferable here.


4. Documentation: Missing Edge Case 📖

Location: react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb:70-73

The comment says:

No exception propagates to the controller for client disconnects

Question: What happens if a producer task raises an error unrelated to client disconnect?

Answer from code: Line 90-94 handles this:

begin
  @async_barrier.wait
rescue StandardError => e
  @async_barrier.stop
  raise e
end

Suggestion: Update the documentation to clarify:

# Client disconnect handling:
# - If client disconnects (IOError/Errno::EPIPE), writer stops gracefully
# - Barrier is stopped to cancel all producer tasks, preventing wasted work
# - Client disconnects are logged (if logging enabled) but don't propagate to controller
# - Other producer task errors ARE propagated to allow proper error handling

Nitpicks

5. Test Organization 📝

The test file has multiple let(:mocked_stream) declarations scattered across different describe blocks. Consider moving common mocking setup to a shared context or helper method to reduce duplication.


Performance Considerations

✅ Good: Early Termination

The break if response.stream.closed? check (line 330) prevents deadlock scenarios where:

  • Producer blocks on @main_output_queue.enqueue(chunk) when queue is full
  • Writer has died and will never dequeue
  • Without the check, producer would hang forever
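
The stall half of that scenario is easy to reproduce with Ruby's stdlib SizedQueue as an analogy (the real code uses the async gem's bounded queue, but the blocking behavior is the same idea):

queue = SizedQueue.new(2) # bounded buffer, like the streaming output queue
producer = Thread.new { 5.times { |i| queue.push("chunk #{i}") } }

# Nothing ever dequeues (the "writer" is dead), so the producer blocks on push:
producer.join(0.5) # => nil: the thread is still stuck after the timeout

# In the real helper, the `break if response.stream.closed?` guard is what lets
# the producer escape instead of hanging like this.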

✅ Good: Barrier Stopping

Stopping the barrier when client disconnects saves CPU by canceling all in-flight producer tasks.


Security Considerations

✅ No Security Issues

  • No new attack vectors introduced
  • Error handling doesn't leak sensitive information
  • Logging is behind a configuration flag (logging_on_server)

Recommendations

  1. Fix the race condition in drain_streams_concurrently by waiting for writing_task before checking client_disconnected
  2. Make the disconnect test deterministic by using a counter instead of timing
  3. Enhance documentation to clarify error propagation behavior
  4. Consider extracting test setup to reduce duplication across describe blocks

Conclusion

This is a high-quality PR that addresses a real problem with a clean, well-tested solution. The architecture is sound, the tests are comprehensive, and the code is maintainable.

The race condition in drain_streams_concurrently is the only critical issue, but it's an edge case that may not manifest in practice (the ensure block typically runs after the writing_task completes). Still, it's worth fixing for robustness.

Recommendation: ✅ Approve with requested changes (fix race condition & test flakiness)


Great work on this PR! The producer-consumer pattern is exactly the right approach, and the attention to detail in testing is excellent. 🚀

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (4)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (3)

383-389: Consistent response.stream mocking for new closed? checks

Stubbing response.stream.closed? to false by default ensures the new helper logic that consults stream.closed? can run in these specs without spurious failures. If this pattern grows further, consider extracting a small helper (e.g., stub_open_stream) to DRY up similar setups across streaming specs, but it’s optional at this scale.
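
One possible shape for such a helper (the stub_open_stream name comes from the suggestion above; the rest is illustrative):

def stub_open_stream
  stream = double("response.stream", closed?: false)
  allow(stream).to receive(:write)
  allow(response).to receive(:stream).and_return(stream)
  stream
end

Individual examples could then override the stubs as needed, e.g. allow(stream).to receive(:closed?).and_return(true) to simulate a disconnect.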


466-496: Client-disconnect test is sound; consider avoiding a magic number

The “stops processing chunks when client disconnects” example correctly simulates a disconnect by toggling closed? after the first call and asserting that fewer than all remaining chunks are processed, which guards the new stream.closed? checks in process_stream_chunks.

To make the test a bit more future-proof and self-documenting, you could express the expectation in terms of many_chunks.length instead of the literal 9:

-        # Should have stopped early - not all chunks processed
-        # The exact count depends on timing, but should be less than 9 (all remaining)
-        expect(collected_chunks.length).to be < 9
+        # Should have stopped early - not all non-initial chunks processed
+        expect(collected_chunks.length).to be < (many_chunks.length - 1)

769-775: Ensure mocked_stream stubs stay minimal but sufficient

Stubbing only closed? and wiring response.stream to mocked_stream is appropriate as long as the RandomValue integration path doesn’t invoke write/close on the stream in these tests (which seems true given current behavior). If future changes add writes here, you may need to extend the stub:

allow(mocked_stream).to receive(:write)
allow(mocked_stream).to receive(:close)

For now, the setup is lean and adequate.

react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1)

73-92: Validation on concurrent_component_streaming_buffer_size is appropriate and fail-fast

Switching from an attr_accessor to an attr_reader plus a custom setter that enforces a positive Integer and raises ReactOnRailsPro::Error on invalid values is a solid improvement: it prevents silent misconfiguration and constrains this tuning knob to sane values.

If you expect users to often source this from environment variables or YAML (so you get strings), you might optionally consider a small coercion step (e.g., Integer(value) with a rescue that raises the same error) instead of requiring is_a?(Integer), but strict typing is also a reasonable choice if that’s the intended API.
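
If coercion were adopted, the setter might look something like this (illustrative only; the PR deliberately keeps the strict Integer check):

def concurrent_component_streaming_buffer_size=(value)
  coerced = Integer(value)
  raise ArgumentError, "must be positive" unless coerced.positive?

  @concurrent_component_streaming_buffer_size = coerced
rescue ArgumentError, TypeError
  raise ReactOnRailsPro::Error,
        "config.concurrent_component_streaming_buffer_size must be a positive integer"
end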

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 47ca8b1 and b0c86d0.

📒 Files selected for processing (2)
  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1 hunks)
  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (9 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{rb,rake}

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS run bundle exec rubocop and fix ALL violations before every commit/push

Files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
**/*

📄 CodeRabbit inference engine (CLAUDE.md)

ALWAYS ensure files end with a newline character before committing/pushing

Files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
🧠 Learnings (13)
📓 Common learnings
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to /CHANGELOG.md : Update `/CHANGELOG.md` for user-visible changes (features, bug fixes, breaking changes, deprecations, performance improvements) to the open-source React on Rails gem and npm package

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation in React on Rails Pro is handled through a chain of validations:
1. Pro version check in `run_stream_inside_fiber`
2. RSC support check during pack generation via `ReactOnRailsPro.configuration.enable_rsc_support`
3. RSC support validation during component registration
This makes additional validation in the helper methods unnecessary.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-10-23T17:22:01.074Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1875
File: lib/react_on_rails/utils.rb:112-124
Timestamp: 2025-10-23T17:22:01.074Z
Learning: In React on Rails, when Pro is installed but not licensed, the intended behavior is to raise an error on boot. The `react_on_rails_pro?` method validates licenses and should raise errors early (including during path resolution in methods like `server_bundle?`) to enforce licensing requirements rather than failing later with obscure errors.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-09-16T08:01:11.146Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1770
File: lib/generators/react_on_rails/templates/base/base/app/javascript/src/HelloWorld/ror_components/HelloWorld.client.jsx:2-2
Timestamp: 2025-09-16T08:01:11.146Z
Learning: React on Rails uses webpack CSS Modules configuration with namedExports: true, which requires the import syntax `import * as style from './file.module.css'` rather than the default export pattern. This configuration enables better tree shaking and bundle size optimization for CSS modules.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation is handled in deeper level calls of the React on Rails Pro codebase, so it doesn't need to be validated again in the `rsc_payload_react_component` helper method.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to lib/react_on_rails/*.rb : Create RBS signature files for new Ruby files in `lib/react_on_rails/` by adding corresponding `.rbs` file in `sig/react_on_rails/`, adding to Steepfile, and validating before committing

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: The `react_on_rails_pro/` directory has its own Prettier/ESLint configuration and must be linted separately

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/configuration.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: theforestvn88
Repo: shakacode/react_on_rails PR: 1620
File: spec/dummy/client/app/startup/HelloTurboStream.jsx:3-3
Timestamp: 2024-10-08T20:53:47.076Z
Learning: The `RailsContext` import in `spec/dummy/client/app/startup/HelloTurboStream.jsx` is used later in the project, as clarified by the user theforestvn88.

Applied to files:

  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-11-25T08:05:17.804Z
Learnt from: CR
Repo: shakacode/react_on_rails PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-25T08:05:17.804Z
Learning: Applies to spec/dummy/e2e/playwright/**/*.spec.js : Use Rails helpers (`appFactories`, `appEval`) in Playwright tests instead of UI-based setup to leverage factory_bot and arbitrary Ruby code execution

Applied to files:

  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-04-09T13:02:16.009Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.

Applied to files:

  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-13T14:29:49.267Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: spec/react_on_rails/utils_spec.rb:218-218
Timestamp: 2025-02-13T14:29:49.267Z
Learning: In RSpec tests, prefer using local variables over constants within test blocks to avoid constant redefinition warnings and maintain better test isolation.

Applied to files:

  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1622
File: spec/dummy/spec/rake/assets_precompile_rake_spec.rb:12-12
Timestamp: 2024-10-08T20:53:47.076Z
Learning: When stubbing environment variables in RSpec tests, use `before` and `after` hooks to ensure that the original values are restored after the tests, preventing any side effects on other tests. Example provided by justin808:
```ruby
describe "My test" do
  before do
    original_value = ENV["VARIABLE_NAME"]
    allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_return("stubbed_value")
  end

  after do
    allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_call_original
    ENV["VARIABLE_NAME"] = original_value
  end

  it "tests something" do
    # Your test code here
  end
end
```
This practice ensures test isolation and reliability.

Applied to files:

  • react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
🧬 Code graph analysis (1)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (2)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
  • stream_react_component (126-138)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1)
  • include (8-35)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: examples (3.2, minimum)
  • GitHub Check: examples (3.4, latest)
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: build-dummy-app-webpack-test-bundles
  • GitHub Check: pro-lint-js-and-ruby
  • GitHub Check: claude-review
🔇 Additional comments (6)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (5)

365-367: Good isolation for stream_react_component streaming tests

The dedicated describe "#stream_react_component" with its own mocked_rails_stream double cleanly scopes the streaming tests and keeps the setup localized. This aligns well with the new stream.closed? behavior.


521-521: closed? stub keeps stream_view_containing_react_components specs aligned with new behavior

Adding allow(mocked_stream).to receive(:closed?).and_return(false) here is necessary now that the streaming helpers may consult stream.closed?. This keeps the existing view-level streaming specs focused on behavior rather than low-level stream state and looks correct.


611-611: Cached streaming specs correctly mock closed? on the stream

The additional closed? stub for the cached streaming tests ensures that the new client-disconnect-aware logic doesn’t inadvertently terminate cached streams during these examples. The change is minimal and appropriate.


756-758: Integration describe for cached streaming plus RandomValue is well-scoped

The new describe "cached_stream_react_component integration with RandomValue" and its mocked_stream double carve out a focused space to validate cache behavior under the concurrent streaming architecture. This structure is clear and consistent with the earlier streaming specs.


836-852: Attribution tests correctly adapt to closed?-aware streaming

The attribution specs now mock response.stream.closed? and route response.stream through a double, which keeps them compatible with the new disconnect-aware streaming logic while still focusing purely on the presence and multiplicity of the attribution comment. This is a good, minimal adjustment.

react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1)

105-132: Initializer correctly routes through the validated setter

Passing concurrent_component_streaming_buffer_size: into the initializer with a default of DEFAULT_CONCURRENT_COMPONENT_STREAMING_BUFFER_SIZE and then assigning via self.concurrent_component_streaming_buffer_size = ... ensures the validation is applied at construction time as well, not just on later mutations. That’s a good fail-fast behavior for bad configuration.

Given this is a user-visible configuration validation (raising a specific ReactOnRailsPro::Error), it’s worth confirming the CHANGELOG entry reflects this stricter behavior so upgrades don’t surprise existing users. Based on learnings, this should be documented in /CHANGELOG.md.

The ensure block was checking client_disconnected before waiting for
writing_task to complete. This caused a race where the flag might still
be false when checked, even though writing_task would set it to true
moments later.

Fixed by reordering the ensure block:
1. Close queue (unblocks writing_task if waiting on dequeue)
2. Wait for writing_task (ensures flag is set if disconnect occurred)
3. Check flag and stop barrier if needed

This ensures producers are properly stopped when clients disconnect,
preventing wasted CPU cycles.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
@claude

claude bot commented Nov 26, 2025

Code Review - PR #2137: Client Disconnect Handling for Concurrent Component Streaming

I've reviewed this PR and it's a solid implementation. Here's my feedback:


✅ Strengths

1. Well-Structured Error Handling

The error handling in stream.rb is clean and follows best practices:

  • Catches specific exceptions (IOError, Errno::EPIPE) rather than generic errors
  • Uses a flag-based approach (client_disconnected) to coordinate shutdown (see the sketch after this list)
  • Gracefully stops producer tasks via @async_barrier.stop to prevent wasted CPU
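
Putting those bullets together, a minimal writer-side sketch might look like the following. Method and variable names here are illustrative placeholders, not the PR's exact code, and the barrier stop itself happens later in the ensure block (covered in the next section):

```ruby
# Illustrative sketch only -- not the PR's implementation.
def write_chunks_to_client(response, queue)
  while (chunk = queue.dequeue) # assume nil signals the queue was closed
    response.stream.write(chunk)
  end
rescue IOError, Errno::EPIPE => e
  # Client went away (closed tab, dropped connection): remember it so the
  # ensure block can stop the producer barrier, and log quietly.
  @client_disconnected = true
  Rails.logger.debug { "[React on Rails Pro] Client disconnected during streaming: #{e.class}" }
end
```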

2. Race Condition Fix

The ensure block correctly handles the race condition:

# Close the queue first to unblock writing_task (it may be waiting on dequeue)
@main_output_queue.close

# Wait for writing_task to ensure client_disconnected flag is set
# before we check it (fixes race condition where ensure runs before
# writing_task's rescue block sets the flag)
writing_task.wait

# If client disconnected, stop all producer tasks to avoid wasted work
@async_barrier.stop if client_disconnected

This is excellent - the comment clearly explains the race condition and the fix is correct.

3. Producer-Side Early Termination

The closed? check in process_stream_chunks prevents deadlock:

stream.each_chunk do |chunk|
  # Check if client disconnected before processing chunk
  break if response.stream.closed?
  # ...
end

This is important when the queue is full and the producer would otherwise block.
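
To see why the check matters, the same hazard can be reproduced with Ruby's stdlib SizedQueue as a rough analogy (the PR uses an async limited queue, not SizedQueue, so this is purely illustrative):

```ruby
# Analogy only: a bounded queue whose consumer has stopped draining it.
queue = SizedQueue.new(2)                 # bounded buffer, like the streaming buffer
2.times { |i| queue.push("chunk #{i}") }  # buffer is now full

# Without a "has the client gone away?" check, the next push blocks forever
# once the writer has died -- which is exactly what the closed? check avoids.
producer = Thread.new { queue.push("one more chunk") }
sleep 0.1
puts producer.status # => "sleep" (blocked on the full queue)
producer.kill
```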

4. Configuration Validation

Moving validation to a setter is the right approach:

def concurrent_component_streaming_buffer_size=(value)
  unless value.is_a?(Integer) && value.positive?
    raise ReactOnRailsPro::Error,
          "config.concurrent_component_streaming_buffer_size must be a positive integer"
  end
  @concurrent_component_streaming_buffer_size = value
end

This provides immediate feedback on invalid configuration.
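
As a quick usage illustration (hypothetical initializer snippet, assuming the gem's usual ReactOnRailsPro.configure block):

```ruby
# config/initializers/react_on_rails_pro.rb -- illustrative
ReactOnRailsPro.configure do |config|
  config.concurrent_component_streaming_buffer_size = 64 # valid: positive integer

  # Any of these would raise ReactOnRailsPro::Error immediately at assignment,
  # pointing at the offending config line instead of failing mid-stream later:
  # config.concurrent_component_streaming_buffer_size = 0
  # config.concurrent_component_streaming_buffer_size = -1
  # config.concurrent_component_streaming_buffer_size = "64"
end
```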

5. Excellent Test Coverage

  • Tests both IOError and Errno::EPIPE scenarios
  • Tests buffer size validation (positive, zero, negative, non-integer)
  • Tests early termination in producer (stops processing chunks when client disconnects)
  • All tests have clear expectations and meaningful assertions

6. Code Organization

Extracting process_stream_chunks improves readability and makes the disconnect check more maintainable.


💡 Suggestions for Improvement

1. Consider Logging Level

The disconnect logging uses Rails.logger.debug:

Rails.logger.debug do
  "[React on Rails Pro] Client disconnected during streaming (#{context}): #{exception.class}"
end

Consideration: Client disconnects are expected behavior (user closes browser, network issues). debug is appropriate for development, but you might want to:

  • Keep it at debug if you want minimal noise in production logs
  • OR use info if you want to track disconnect frequency for monitoring

The current choice seems reasonable, but worth documenting the rationale.

2. Potential Race in Producer Check

In process_stream_chunks:

break if response.stream.closed?

Minor concern: There's a small window between checking closed? and calling @main_output_queue.enqueue(chunk) where the writer could fail. However:

  • The writer catches the error gracefully
  • The queue will be closed in the ensure block
  • Producer tasks will be stopped via barrier

So this is likely fine, but worth noting for future debugging if weird edge cases appear.

3. Test Determinism

In the test:

# Should have stopped early - not all chunks processed
# The exact count depends on timing, but should be less than 9 (all remaining)
expect(collected_chunks.length).to be < 9

The comment acknowledges timing dependency. This is acceptable for this type of integration test, but if it becomes flaky, consider:

  • Using a more aggressive disconnect simulation (returning true immediately; see the sketch after this list)
  • Or just verify that NOT all 10 chunks were processed
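
A deterministic variant of the first suggestion could look roughly like this (RSpec sketch; render_streaming_view stands in for whatever helper the surrounding spec already uses):

```ruby
# Illustrative: make the disconnect deterministic instead of timing-dependent.
it "writes no chunks when the client is already gone" do
  # Pretend the client disconnected before the first chunk is considered.
  allow(mocked_stream).to receive(:closed?).and_return(true)

  collected_chunks = []
  allow(mocked_stream).to receive(:write) { |chunk| collected_chunks << chunk }

  render_streaming_view # placeholder for the spec's existing render helper

  expect(collected_chunks).to be_empty
end
```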

🔒 Security Considerations

No security concerns. The error handling doesn't expose sensitive information, and the logging respects the logging_on_server configuration.


⚡ Performance Considerations

Excellent performance characteristics:

  1. Prevents CPU waste by stopping producer tasks on disconnect
  2. Prevents memory bloat by breaking out of chunk processing early
  3. Avoids unnecessary network calls when client is gone
  4. Buffer size validation ensures reasonable memory bounds

The overhead of checking closed? per chunk is negligible compared to the savings from early termination.


📋 Additional Notes

Missing from #2017/#2026

As noted in the PR description, this implementation differs from the closed PRs because it's adapted for the #2111 architecture (Async::Barrier + Async::Variable + Async::LimitedQueue). The approach is sound and appropriate for the current architecture.

Architecture Fit

This implementation fits well with the producer-consumer pattern:

  • Writer task handles disconnect exceptions
  • Producer tasks check for early termination
  • Barrier coordinates graceful shutdown

✅ Final Verdict

LGTM - Ready to merge pending minor considerations above.

The code is well-written, thoroughly tested, and solves a real problem (wasted CPU on disconnect). The race condition fix is particularly well-handled. Great work on this implementation! 🎉


📝 Checklist Review

Based on CLAUDE.md requirements:

  • ✅ Code follows best practices
  • ✅ No potential bugs identified (minor race noted but handled gracefully)
  • ✅ Performance considerations addressed (prevents wasted work)
  • ✅ No security concerns
  • ✅ Excellent test coverage (152 tests passing, new tests for all scenarios)
  • ✅ RuboCop clean (as confirmed in test plan)
  • ✅ Configuration validation properly implemented

@claude

claude bot commented Nov 26, 2025

Code Review: Client Disconnect Handling for Concurrent Component Streaming

Summary

Excellent work! This PR adds critical error handling for client disconnects that was missing after the architectural changes in #2111. The implementation is well-designed, thoroughly tested, and follows best practices.

✅ Strengths

1. Correct Error Handling

  • Catches the right exceptions: IOError and Errno::EPIPE are exactly what Rails raises on client disconnects
  • Graceful degradation: Errors don't crash the request or log alarming stack traces
  • Clean flag-based coordination between writer and producers

2. Resource Management

  • Producer early termination via stream.closed? prevents wasted CPU
  • Barrier stop ensures all producer tasks are cancelled promptly (illustrated after this list)
  • Prevents deadlock when writer dies while producers are blocked on full queue
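
For readers less familiar with the async gem, here is a tiny standalone illustration of how a barrier groups tasks so they can be cancelled together (not the PR's code; assumes the async 2.x gem on Ruby 3+):

```ruby
require "async"
require "async/barrier"

Async do
  barrier = Async::Barrier.new

  # Producers grouped under the barrier, like the per-component streaming tasks.
  3.times do
    barrier.async do
      loop { sleep 0.05 } # pretend to render and enqueue chunks forever
    end
  end

  sleep 0.1
  barrier.stop # cancels every task in the group, mirroring the disconnect path
end
```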

3. Code Quality

  • process_stream_chunks extraction improves readability (single responsibility)
  • Excellent inline documentation explaining producer-consumer pattern
  • Configuration validation moved to setter (better user experience - fails early)

4. Test Coverage

  • Tests both IOError and Errno::EPIPE scenarios
  • Tests early termination behavior (verifies producer stops mid-stream)
  • Adds buffer size validation tests
  • All tests verify actual behavior changes (not just code coverage)

5. Race Condition Handling

The ensure block in drain_streams_concurrently has excellent sequencing:

ensure
  @main_output_queue.close  # Unblock writer first
  writing_task.wait         # Wait for flag to be set
  @async_barrier.stop if client_disconnected  # Then cancel producers
end

This prevents the race where ensure runs before rescue sets the flag. Nice attention to detail!

🔍 Observations & Suggestions

1. Optional Logging Consideration

The logging is gated behind logging_on_server, which is good, but consider whether disconnect events warrant a separate log level or config option:

# Current approach is fine, but you could consider:
Rails.logger.info "[React on Rails Pro] Client disconnected during streaming"
# instead of debug level, since this is normal operational info

However, debug is probably correct since disconnects are frequent and not actionable.

2. Test Determinism

The early termination test has this comment:

# Should have stopped early - not all chunks processed
# The exact count depends on timing, but should be less than 9
expect(collected_chunks.length).to be < 9

This is reasonable given async timing, but if flakiness appears, consider:

  • Using a deterministic signal (e.g., close stream after N chunks dequeued)
  • Or using expect(collected_chunks.length).to be_between(1, 8)

Current approach is fine for now - just flagging for future reference.

3. Configuration Validation Timing

Moving validation from setup_config_values to the setter is excellent:

  • Fails immediately when misconfigured (better DX)
  • Clear error message points to exact config line
  • Removes initialization ordering concerns

This is a best practice improvement!

🎯 Architecture Alignment

This PR correctly adapts the client disconnect handling to the new Barrier + Queue architecture:

| Aspect | Old (Fibers) | New (Barrier/Queue) | This PR |
| --- | --- | --- | --- |
| Writer error handling | In writer task ✅ | In writer task ✅ | Preserved |
| Producer cancellation | Stop fibers | Stop barrier | ✅ Correct |
| Early termination check | fiber.alive? | stream.closed? | ✅ Correct |

🔒 Security & Performance

  • No security concerns: Error handling doesn't expose internal state
  • Performance: Prevents wasted CPU on disconnected clients (positive impact)
  • Resource leaks: Properly stops all tasks and closes queue

📝 Documentation

The inline comments in stream.rb explaining the producer-consumer pattern and disconnect handling are excellent. Consider adding this to CHANGELOG_PRO.md as:

#### Fixed
- Fixed missing client disconnect handling in concurrent component streaming that could waste CPU resources when clients disconnect mid-stream. [PR 2137](https://github.com/shakacode/react_on_rails/pull/2137) by [ihabadham](https://github.com/ihabadham)

✅ Final Verdict

LGTM! Ready to merge.

This is high-quality work. Great job on the thorough testing and attention to edge cases like the race condition in the ensure block!


Review by Claude Code - all 152 Pro tests passing, RuboCop clean
