Replace manual Fibers with Async gem primitives for component streaming #2111
base: master
Conversation
## Problem

We need to run async gem code inside the `each_chunk` function for the upcoming React on Rails async props implementation. However, this raises the error "Running scheduler on non-blocking fiber" because `each_chunk` executes within a manually created Fiber instead of a Fiber managed by the async gem.

## Solution

Replace manual Fiber management with async gem primitives:

- **Async::Barrier**: Tracks all component streaming tasks
- **Async::Variable**: Synchronizes first chunk delivery
- **Async::LimitedQueue**: Single output queue for all components

## Changes

1. **stream_view_containing_react_components**:
   - Wrap entire method in `Sync do` block
   - Initialize `@async_barrier` and `@main_output_queue` before rendering
   - Components can now start async tasks during view rendering
2. **run_stream_inside_fiber**:
   - Replace `Fiber.new` with `@async_barrier.async`
   - Use `Async::Variable` to wait for first chunk synchronously
   - First chunk stored in variable, remaining chunks enqueued to main queue
3. **drain_streams_concurrently**:
   - Simplified from 45 lines to 11 lines
   - Just wait on barrier, close queue, drain chunks
   - Removed `build_producer_tasks` and `build_writer_task` methods

## Benefits

- ✅ Unified concurrency model (all async gem, no manual Fibers)
- ✅ Enables async gem code to run inside component streaming
- ✅ Simpler code (-40 lines total)
- ✅ Same functionality: concurrent streaming, backpressure, error handling

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
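As a hedged illustration of the model described above (not the Pro implementation itself), the same first-chunk/shared-queue flow can be sketched with Ruby's stdlib `Thread` and `Queue` standing in for `Async::Barrier` tasks, `Async::Variable`, and `Async::LimitedQueue`:

```ruby
# Stand-ins: Thread ~ Async::Barrier task, Queue ~ Async::LimitedQueue,
# a one-shot Queue ~ Async::Variable (first-chunk synchronization).
main_output_queue = Queue.new
producers = []

def stream_component(chunks, main_output_queue, producers)
  first_chunk_var = Queue.new # one-shot: only the first chunk goes here
  producers << Thread.new do
    chunks.each_with_index do |chunk, i|
      if i.zero?
        first_chunk_var << chunk # deliver the first chunk to the waiting caller
      else
        main_output_queue << chunk # remaining chunks share one output queue
      end
    end
  end
  first_chunk_var.pop # block until this component's first chunk arrives
end

first_a = stream_component(%w[A1 A2 A3], main_output_queue, producers)
first_b = stream_component(%w[B1], main_output_queue, producers)

producers.each(&:join)  # "barrier wait": all component tasks finished
main_output_queue.close # signal end of streaming

rest = []
while (chunk = main_output_queue.pop) # nil once closed and empty
  rest << chunk
end

puts first_a            # "A1"
puts first_b            # "B1"
puts rest.sort.inspect  # ["A2", "A3"]
```

Each call returns its component's first chunk synchronously (so it can be embedded in the rendered template), while the rest of the chunks drain through the single shared queue after the barrier wait, mirroring the three-primitive design above.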
Walkthrough

Switches streaming from fiber-yield sequencing to an Async::Barrier + main output queue model: returns the first chunk synchronously via an Async::Variable, streams remaining chunks asynchronously into a main queue, and supports an optional on_complete callback to receive collected chunks (used for post-stream caching).

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Caller
    participant Helper as run_stream_inside_fiber(on_complete:)
    participant Var as Async::Variable (first_chunk_var)
    participant Barrier as Async::Barrier
    participant ComponentTasks as Component async tasks
    participant MainQ as Main Output Queue
    participant Response as response.stream
    participant onComplete as on_complete (optional)

    Caller->>Helper: start streaming (opts, maybe on_complete)
    Helper->>Var: create first_chunk_var
    Helper->>Barrier: create barrier
    Helper->>Response: render template head & write initial head
    par component enqueue (concurrent)
        ComponentTasks->>MainQ: enqueue subsequent chunks
        ComponentTasks->>Var: deliver first chunk (if produced)
    and synchronous first-chunk return
        Var-->>Helper: first chunk value (or nil)
        Helper-->>Caller: return first chunk synchronously
    end
    Helper->>Barrier: wait for async component tasks to finish
    MainQ->>Response: write enqueued subsequent chunks
    Helper->>onComplete: invoke with collected chunks (if provided)
    Helper->>Response: close stream
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
Pre-merge checks and finishing touches

- ❌ Failed checks (1 warning)
- ✅ Passed checks (2 passed)
### Code Review: Replace manual Fibers with Async gem primitives

**Summary**

This PR successfully migrates from manual Fiber management to async gem primitives (Barrier, Variable, LimitedQueue) to enable async code execution within `each_chunk`.

**✅ Strengths**

**1. Clear Problem Statement & Architecture**

The PR description excellently documents:

**2. Simplified Code (-40 lines net)**

**3. Correct Async Patterns**

The use of async primitives is well-designed:
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- `react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb` (1 hunks)
- `react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb` (1 hunks)
🧰 Additional context used
🧠 Learnings (6)
📚 Learning: 2025-04-26T21:55:55.874Z
Learnt from: alexeyr-ci2
Repo: shakacode/react_on_rails PR: 1732
File: spec/dummy/client/app-react16/startup/ReduxSharedStoreApp.client.jsx:40-44
Timestamp: 2025-04-26T21:55:55.874Z
Learning: In the react_on_rails project, files under `app-react16` directories are copied/moved to corresponding `/app` directories during the conversion process (removing the `-react16` suffix), which affects their relative import paths at runtime.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-09-16T08:01:11.146Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1770
File: lib/generators/react_on_rails/templates/base/base/app/javascript/src/HelloWorld/ror_components/HelloWorld.client.jsx:2-2
Timestamp: 2025-09-16T08:01:11.146Z
Learning: React on Rails uses webpack CSS Modules configuration with namedExports: true, which requires the import syntax `import * as style from './file.module.css'` rather than the default export pattern. This configuration enables better tree shaking and bundle size optimization for CSS modules.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-02-12T16:38:06.537Z
Learnt from: Romex91
Repo: shakacode/react_on_rails PR: 1697
File: package-scripts.yml:28-28
Timestamp: 2025-02-12T16:38:06.537Z
Learning: The file `node_package/lib/ReactOnRails.full.js` is autogenerated during the build process and should not be present in the repository.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: theforestvn88
Repo: shakacode/react_on_rails PR: 1620
File: spec/dummy/client/app/startup/HelloTurboStream.jsx:3-3
Timestamp: 2024-10-08T20:53:47.076Z
Learning: The `RailsContext` import in `spec/dummy/client/app/startup/HelloTurboStream.jsx` is used later in the project, as clarified by the user theforestvn88.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-09-15T21:24:48.207Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1781
File: node_package/src/ClientSideRenderer.ts:82-95
Timestamp: 2025-09-15T21:24:48.207Z
Learning: In React on Rails, the force_load feature includes both explicit `data-force-load="true"` usage and the ability to hydrate components during the page loading state (`document.readyState === 'loading'`). Both capabilities require a Pro license, so the condition `!railsContext.rorPro && (isComponentForceLoaded || document.readyState === 'loading')` correctly gates both scenarios.
Applied to files:
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
🧬 Code graph analysis (2)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
- packages/react-on-rails-pro/tests/AsyncQueue.ts (1): `enqueue` (10-20)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (4)
- react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1): `configuration` (9-38)
- react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (2): `render_to_string` (14-14), `response` (15-15)
- react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1): `render_to_string` (17-30)
- packages/react-on-rails-pro/tests/AsyncQueue.ts (1): `dequeue` (27-54)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: examples (3.4, latest)
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: pro-lint-js-and-ruby
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: claude-review
🔇 Additional comments (2)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (2)
**38-49**: LGTM! Clean initialization and early template rendering.

The refactor correctly initializes the async primitives before template rendering and immediately writes the template to the response stream. This ensures components can start streaming under the barrier as soon as they're encountered during rendering.

**69-71**: The review comment is incorrect; the code correctly handles queue closure.

When an `Async::LimitedQueue` is closed, `dequeue` returns nil once the queue is closed and empty, which is exactly what the code expects. The while loop condition `while (chunk = @main_output_queue.dequeue)` exits gracefully when `dequeue` returns nil, a falsy value. This pattern is also validated in the test files (stream_spec.rb lines 21-22), which use the same `dequeue` → `break if nil?` pattern, confirming the expected behavior.

Likely an incorrect or invalid review comment.
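For reference, Ruby's stdlib `Queue` exposes the same closed-and-empty ⇒ `nil` termination contract that this loop relies on (`Queue#close` makes `pop` return `nil` once the queue is drained), so the pattern can be checked in isolation:

```ruby
queue = Queue.new
queue << "<div>chunk 1</div>"
queue << "<div>chunk 2</div>"
queue.close # no more chunks will be enqueued

written = []
# Same shape as `while (chunk = @main_output_queue.dequeue)`:
# pop yields each remaining chunk, then nil once closed and empty,
# and nil (falsy) ends the loop.
while (chunk = queue.pop)
  written << chunk
end

puts written.length # 2
```

This is a stdlib analogue only; the production code uses the async gem's `Async::LimitedQueue`, which the review above reports behaves the same way for a closed, empty queue.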
Previously, chunks were written to the response stream only after all component streaming tasks completed. Now the writer task starts immediately and writes chunks concurrently as they arrive in the queue. This improves streaming performance by eliminating the delay between chunk production and delivery to the client.

Changes:

- Start writer task before waiting for barrier
- Writer dequeues and writes chunks concurrently with producers
- Added debug logging for development troubleshooting
- Disabled prerender_caching in dummy config for testing

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
Replace Fiber-based caching with async gem primitives for both cache hit replay and cache miss buffering.

Changes:

1. **Cache Hit Replay** (handle_stream_cache_hit):
   - Replace Fiber.new with @async_barrier.async task
   - Use Async::Variable for first chunk synchronization
   - Enqueue cached chunks to main output queue
2. **Cache Miss Buffering** (handle_stream_cache_miss):
   - Set @rorp_cache_buffering flag before streaming
   - Normal streaming buffers chunks automatically
   - Simplified from nested fiber wrapping to simple flag
3. **run_stream_inside_fiber**:
   - Check for @rorp_cache_buffering flag
   - Buffer chunks if caching enabled
   - Write to Rails.cache after streaming completes

Benefits:

- ✅ Caching works with async primitives
- ✅ No manual Fiber management
- ✅ Simpler cache miss logic (no fiber wrapping)
- ✅ Cache hit replay uses same pattern as normal streaming

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
### Code Review: Replace Manual Fibers with Async Gem Primitives

**Overall Assessment**

This is a well-architected refactor that successfully achieves its goal of enabling async gem code execution within component streaming. The approach is solid, the code is cleaner, and the architectural shift from manual Fiber management to async primitives is the right solution for the stated problem.

**✅ Strengths**

**1. Clear Problem Statement & Solution**

**2. Code Simplification**

**3. Architectural Improvements**

**4. Preserved Functionality**
The StreamController test class was still using the old Fiber-based approach. Updated it to use the new async primitives:

- Replace @rorp_rendering_fibers with @async_barrier tasks
- Enqueue all chunks to @main_output_queue
- Add require for Async::Variable
- Simplify test controller to just create barrier tasks

All streaming tests now pass (20 examples, 0 failures).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
### Code Review: Replace manual Fibers with Async gem primitives

I've completed a thorough review of this PR. Overall, this is a well-architected refactoring that achieves its primary goal of enabling async gem code within component streaming. Here's my detailed feedback:

**✅ Strengths**

**1. Clear Architecture & Motivation**

**2. Code Simplification**

**3. Unified Concurrency Model**

**🔍 Issues & Concerns**
### PR Review: Replace manual Fibers with Async gem primitives

I've reviewed this PR and it represents a well-thought-out architectural improvement. Here's my detailed feedback:

**✅ Strengths**

**1. Clear Problem Statement & Solution**

The PR description excellently explains:

**2. Simplified Code Structure**

**3. Unified Concurrency Model**

Replacing manual Fiber management with async gem primitives creates a consistent concurrency model.

**4. Test Coverage Updated**
| Priority | Issue | Action |
|---|---|---|
| 🔴 Critical | Instance variable lifecycle | Add ensure cleanup or refactor to parameters |
| 🟡 Medium | Error handling in async tasks | Ensure first_chunk_var set even on error |
| 🟡 Medium | Test doesn't match real implementation | Add test for first-chunk-variable behavior |
| 🟢 Low | Race condition in cache hit | Document or add comment about ordering |
| 🟢 Low | Caching closure capture | Add clarifying comment |
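The critical row above concerns per-request instance-variable lifecycle. A minimal hedged sketch of the suggested `ensure` cleanup, using a hypothetical plain-Ruby controller that borrows the PR's instance-variable names (stdlib stand-ins, not the actual Rails concern):

```ruby
class StreamingController
  attr_reader :async_barrier, :main_output_queue

  def stream_view
    @async_barrier = []          # stand-in for Async::Barrier
    @main_output_queue = Queue.new # stand-in for Async::LimitedQueue
    yield if block_given?
  ensure
    # Always reset per-request state so a reused controller or thread
    # never observes another request's barrier or queue, even if the
    # streaming block raised.
    @async_barrier = nil
    @main_output_queue = nil
  end
end

c = StreamingController.new
c.stream_view { } # state exists only inside the block
p c.async_barrier      # nil after cleanup
p c.main_output_queue  # nil after cleanup
```

The method-level `ensure` guarantees cleanup on both the success and error paths, which is the fix the table's critical row is asking for.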
🎯 Verdict
This is a well-designed refactoring that achieves its stated goal of enabling async code in streaming contexts. The code simplification is real and meaningful. However, the instance variable lifecycle issue should be addressed before merging to avoid potential memory leaks or state pollution in threaded environments.
Recommendation: Request changes to address the critical instance variable cleanup issue. The other concerns are enhancements that could be addressed in follow-up PRs if time is tight.
Great work on the clear documentation and systematic approach to this refactoring! 🚀
Reviewed with attention to code quality, thread safety, async patterns, and Rails conventions per CLAUDE.md guidelines.
Actionable comments posted: 1
♻️ Duplicate comments (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
**299-344**: Prevent deadlock in `run_stream_inside_fiber` when streaming fails before first chunk

`run_stream_inside_fiber` uses `Async::Variable` to synchronize the first chunk and then blocks with `first_chunk_var.wait`. If the async task raises before assigning to `first_chunk_var` (e.g., `yield` fails, the returned `stream` raises in `each_chunk`, or there's an early error before any chunk is produced), `first_chunk_var.wait` will never be satisfied, and the calling code can deadlock.

This is the same issue noted in the previous review; it's still present here since there's no rescue/ensure around the async block to guarantee that `first_chunk_var` is set.

A minimal fix is to ensure `first_chunk_var` is always assigned, even on error, and to keep the cache write consistent:

```ruby
def run_stream_inside_fiber
  require "async/variable"

  if @async_barrier.nil?
    raise ReactOnRails::Error,
          "You must call stream_view_containing_react_components to render the view containing the react component"
  end

  first_chunk_var = Async::Variable.new
  cache_config = @rorp_cache_buffering

  @async_barrier.async do
    is_first = true
    begin
      stream = yield
      stream.each_chunk do |chunk|
        cache_config[:chunks] << chunk if cache_config
        if is_first
          first_chunk_var.value = chunk
          is_first = false
        else
          @main_output_queue.enqueue(chunk)
        end
      end
      # No chunks case:
      first_chunk_var.value = nil if is_first
      if cache_config
        Rails.cache.write(cache_config[:key], cache_config[:chunks], cache_config[:options])
      end
    rescue => e
      # Unblock caller even on error:
      first_chunk_var.value = nil if is_first
      # Optionally log or re-raise to propagate the failure:
      raise
    end
  end

  first_chunk_var.wait
  first_chunk_var.value
end
```

This keeps the happy-path behavior (first chunk returned synchronously, rest enqueued, cache populated) but guarantees the caller never waits forever if the underlying stream fails early.

You can verify the fix by forcing an exception inside the streaming block (e.g., raising from `server_rendered_react_component` before any chunks are produced) and confirming that the request fails fast rather than hanging.
🧹 Nitpick comments (3)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (1)
**34-55**: Barrier + LimitedQueue integration looks solid but newline comment is stale

Using `Sync` to initialize a per-request `Async::Barrier` and `Async::LimitedQueue`, then rendering the template and streaming the rest via `drain_streams_concurrently`, is a clean replacement for the previous fiber setup and should preserve concurrent streaming and backpressure.

The inline comment about stripping extra newlines no longer matches the implementation (`template_string` is written as-is). Either update the comment to describe the current behavior or implement the trimming logic to avoid future confusion.

react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2)
**241-265**: Cache hit replays look correct; consider guarding against missing async context

The cache-hit path correctly:

- Recomputes `render_options` and calls `load_pack_for_generated_component`.
- Splits `cached_chunks` into `initial_result` and `rest_chunks`.
- Uses an async task to enqueue the remaining chunks onto `@main_output_queue` while returning `initial_result` synchronously via `Async::Variable`.

Two small robustness points:

- This method implicitly assumes `@async_barrier` and `@main_output_queue` are already set up by `stream_view_containing_react_components`. If someone accidentally calls `cached_stream_react_component` outside a streaming view, they'll get a less obvious failure later. Mirroring `run_stream_inside_fiber`'s guard with a friendly error would make this safer:

  ```ruby
  raise ReactOnRails::Error, "You must call stream_view_containing_react_components ..." if @async_barrier.nil? || @main_output_queue.nil?
  ```

- If `cached_chunks` is unexpectedly empty, `initial_result` will be `nil`. That will still work (ERB treats it as empty string), but you may want an explicit check with a clearer error or log in that case so cache corruption doesn't silently degrade streaming.
**267-278**: Cache miss buffering via `@rorp_cache_buffering` is sound but relies on convention

The cache-miss path enables buffering by setting `@rorp_cache_buffering`, then delegates to `render_stream_component_with_props`, which ultimately uses `run_stream_inside_fiber`. Because `run_stream_inside_fiber` snapshots `cache_config = @rorp_cache_buffering` into a local before starting the async task, resetting `@rorp_cache_buffering` to `nil` afterwards is safe even with multiple concurrent streaming components.

Given that this helper is intended only for use inside streaming views, it would still be worth:

- Adding the same async-context guard as `run_stream_inside_fiber` (and as suggested for `handle_stream_cache_hit`) to fail fast if it's called from a non-streamed template.
- Optionally documenting that `@rorp_cache_buffering` is an internal flag and not part of the public surface, to discourage accidental reuse.

Functionally, the current implementation is correct.
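The safety argument above rests on Ruby closure capture: a local assigned before the async task keeps its own reference, so resetting the instance variable afterwards cannot affect the running task. A hedged stdlib illustration, with `Thread` standing in for the async task and hypothetical names mirroring the PR:

```ruby
class Helper
  def initialize
    @rorp_cache_buffering = { chunks: [] } # per-request cache config
  end

  def start_buffered_task
    cache_config = @rorp_cache_buffering # snapshot into a local first
    task = Thread.new do
      sleep 0.01 # task runs after the instance variable is reset below
      cache_config ? :buffered : :not_buffered
    end
    @rorp_cache_buffering = nil # safe: the task holds its own reference
    task
  end
end

p Helper.new.start_buffered_task.value # => :buffered
```

Had the thread read `@rorp_cache_buffering` directly instead of the local, it would observe the post-reset `nil`, which is exactly the race the snapshot avoids.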
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- `react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb` (2 hunks)
- `react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb` (1 hunks)
- `react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb` (2 hunks)
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-09-15T21:24:48.207Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1781
File: node_package/src/ClientSideRenderer.ts:82-95
Timestamp: 2025-09-15T21:24:48.207Z
Learning: In React on Rails, the force_load feature includes both explicit `data-force-load="true"` usage and the ability to hydrate components during the page loading state (`document.readyState === 'loading'`). Both capabilities require a Pro license, so the condition `!railsContext.rorPro && (isComponentForceLoaded || document.readyState === 'loading')` correctly gates both scenarios.
Applied to files:
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-04-09T12:56:10.756Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/RSCPayloadContainer.ts:0-0
Timestamp: 2025-04-09T12:56:10.756Z
Learning: In the react_on_rails codebase, RSC payloads are already stringified using `JSON.stringify()` before being processed by the `escapeScript` function, which handles escaping of special characters. The function only needs to handle specific HTML markers like comments and closing script tags.
Applied to files:
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb
🧬 Code graph analysis (3)
react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb (4)
- react_on_rails_pro/lib/react_on_rails_pro/configuration.rb (1): `configuration` (9-38)
- react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1): `render_to_string` (18-34)
- react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (2): `render_to_string` (14-14), `response` (15-15)
- packages/react-on-rails-pro/tests/AsyncQueue.ts (1): `dequeue` (27-54)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1)
packages/react-on-rails-pro/tests/AsyncQueue.ts (2)
- `dequeue` (27-54)
- `enqueue` (10-20)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2)
- lib/react_on_rails/react_component/render_options.rb (1): `auto_load_bundle` (74-76)
- lib/react_on_rails/helper.rb (1): `load_pack_for_generated_component` (333-349)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: examples (3.4, latest)
- GitHub Check: examples (3.2, minimum)
- GitHub Check: pro-lint-js-and-ruby
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: claude-review
🔇 Additional comments (1)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1)
**17-33**: StreamController test harness correctly mirrors barrier/queue streaming flow

The `render_to_string` test helper now drives per-component `Async::Queue`s into `@main_output_queue` under `@async_barrier`, matching the production `ReactOnRailsPro::Stream` pattern. Closing the per-component queues in the specs ensures the `dequeue` loop terminates once they're empty, so the async tasks won't hang.

This looks consistent with Async's documented behavior (closed-and-empty queues returning `nil` from `dequeue`) and with the new barrier-based drain model in the concern.

If you want to double-check the queue semantics against your installed `async` version, you can run something like:

```ruby
require "async"
require "async/queue"

Async do |task|
  q = Async::Queue.new
  consumer = task.async { q.dequeue }
  q.close
  puts consumer.wait.inspect # Expect nil once closed and empty
end
```
```ruby
def drain_streams_concurrently(parent_task)
  writing_task = parent_task.async do
    # Drain all remaining chunks from the queue to the response stream
    while (chunk = @main_output_queue.dequeue)
      response.stream.write(chunk)
    end
  end

  # Wait for all component streaming tasks to complete
  @async_barrier.wait

  # Close the queue to signal end of streaming
  @main_output_queue.close
  writing_task.wait
end
```

Removed in the same hunk: the old `build_writer_task(parent:, queue:)` helper, whose writer loop dequeued indexed pairs (`pair = queue.dequeue; break if pair.nil?; _idx_from_queue, item = pair; response.stream.write(item)`).
🧩 Analysis chain
Ensure queues and writer task are cleaned up even if a component task fails
`drain_streams_concurrently` assumes `@async_barrier.wait` completes normally before closing `@main_output_queue` and waiting on `writing_task`. If any async component task raises, `@async_barrier.wait` can raise as well, which means:

- `@main_output_queue.close` won't run, leaving the queue open.
- `writing_task` may remain blocked in `dequeue` on an open, now-idle queue.
Depending on how Async::Barrier and task failure propagation behave in your exact async version, this could manifest as a hung streaming response or leaked task rather than a clean failure.
Consider wrapping the wait/close sequence so the queue is always closed and the writer task joined, even when @async_barrier.wait raises, e.g.:
```ruby
def drain_streams_concurrently(parent_task)
  writing_task = parent_task.async do
    while (chunk = @main_output_queue.dequeue)
      response.stream.write(chunk)
    end
  end

  begin
    @async_barrier.wait
  ensure
    @main_output_queue.close
    writing_task.wait
  end
end
```

This keeps the steady-state behavior but guarantees termination of the writer loop in error paths.
To validate this, you can inject a failure into one component task (e.g., raise inside stream.each_chunk) and confirm the request doesn’t hang and all tasks complete.
I'll verify the review comment by examining the actual implementation and understanding the async behavior.

Good! I found critical information. According to the documentation, `Async::Barrier#wait` will invoke `Async::Task#wait` on each waiting task, which may raise an error. This confirms the review concern. Now let me search the codebase to verify the actual implementation.
🏁 Script executed:

```bash
#!/bin/bash
# Find the file and verify it exists
find . -name "stream.rb" -path "*/react_on_rails_pro/*" 2>/dev/null | head -5
```

Length of output: 62

🏁 Script executed:

```bash
#!/bin/bash
# View the exact method and surrounding context
sed -n '50,85p' ./react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
```

Length of output: 666

🏁 Script executed:

```bash
#!/bin/bash
# Search for error handling or rescue blocks in this file
grep -n -B2 -A2 "rescue\|ensure\|raise" ./react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
```

Length of output: 160

🏁 Script executed:

```bash
#!/bin/bash
# Search for tests related to drain_streams_concurrently or error handling
find . -name "*.rb" -path "*/test/*" -o -path "*/spec/*" | xargs grep -l "drain_streams_concurrently\|stream.*fail\|async.*barrier" 2>/dev/null
```

Length of output: 59

🏁 Script executed:

```bash
#!/bin/bash
# Examine the test file to see if error scenarios are covered
cat ./react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb
```

Length of output: 15492

🏁 Script executed:

```bash
#!/bin/bash
# Check if there are any other usages or dependencies on @async_barrier or @main_output_queue
grep -n "@async_barrier\|@main_output_queue" ./react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb
```

Length of output: 232
Ensure queues and writer task are cleaned up even if a component task fails
The concern is valid. If @async_barrier.wait raises an exception (propagated from a failed component task), the subsequent lines closing @main_output_queue and joining writing_task won't execute. This leaves the queue open and the writer task blocked on an idle queue, causing potential resource leaks and hung streaming responses. The outer begin/ensure block (lines 51–55) only closes response.stream and doesn't address internal state cleanup.
The proposed fix is correct and necessary:
```ruby
def drain_streams_concurrently(parent_task)
  writing_task = parent_task.async do
    while (chunk = @main_output_queue.dequeue)
      response.stream.write(chunk)
    end
  end

  begin
    @async_barrier.wait
  ensure
    @main_output_queue.close
    writing_task.wait
  end
end
```

This ensures termination of the writer loop and proper task cleanup even in error paths.
🤖 Prompt for AI Agents
In react_on_rails_pro/lib/react_on_rails_pro/concerns/stream.rb around lines 61
to 75, the current flow calls @async_barrier.wait and then closes
@main_output_queue and waits on writing_task, but if @async_barrier.wait raises
an exception the queue and writer are left open/blocked; wrap the
@async_barrier.wait call in a begin/ensure block so that in the ensure you
always call @main_output_queue.close and writing_task.wait to guarantee the
writer loop terminates and resources are cleaned up even when a component task
fails.
## Problem
Previously used instance variable @rorp_cache_buffering to communicate
between handle_stream_cache_miss and run_stream_inside_fiber. This was:
- Implicit and hard to follow
- Used unnecessary Async::Variable in handle_stream_cache_hit
- More complex than needed
## Solution
Introduce clean `on_complete` callback pattern:
- stream_react_component extracts on_complete from options
- run_stream_inside_fiber accepts on_complete parameter
- Callback receives all chunks when streaming completes
- Only collects chunks when callback provided (memory efficient)
## Changes
1. **stream_react_component**:
- Extract on_complete callback from options
- Pass to run_stream_inside_fiber
2. **run_stream_inside_fiber**:
- Accept on_complete: nil parameter
- Replace @rorp_cache_buffering with local all_chunks array
- Call on_complete&.call(all_chunks) when streaming completes
3. **handle_stream_cache_miss**:
- Use on_complete: ->(chunks) { Rails.cache.write(...) }
- Much simpler than instance variable approach
4. **handle_stream_cache_hit**:
- Remove unnecessary Async::Variable
- Return initial_result directly (already have it)
- Simpler async task to enqueue remaining chunks
## Benefits
✅ Clean callback API: on_complete: ->(chunks) { ... }
✅ No instance variables for communication
✅ Memory efficient (only collects when callback provided)
✅ Clear lifecycle (callback fires after all chunks received)
✅ Net -14 lines of code
✅ All tests passing (20 examples, 0 failures)
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
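The `on_complete` contract described in this commit (collect chunks only when a callback is provided, fire it once after streaming finishes) can be sketched in plain Ruby; `stream_chunks` here is a hypothetical stand-in for `run_stream_inside_fiber`, not the actual helper:

```ruby
def stream_chunks(chunks, on_complete: nil)
  # Memory efficient: only allocate the collection when a callback wants it.
  all_chunks = on_complete ? [] : nil
  delivered = []
  chunks.each do |chunk|
    all_chunks << chunk if all_chunks
    delivered << chunk # stands in for first-chunk return + queue enqueue
  end
  on_complete&.call(all_chunks) # fires exactly once, after all chunks
  delivered.first               # mimic returning the first chunk
end

cached = nil
first = stream_chunks(%w[head body tail],
                      on_complete: ->(chunks) { cached = chunks })

puts first          # "head"
puts cached.inspect # ["head", "body", "tail"]
```

The cache-miss path in the commit uses exactly this shape: it passes a lambda that writes the collected chunks to `Rails.cache`, so no instance variable is needed to communicate between the helpers.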
### Code Review: Replace Manual Fibers with Async Gem Primitives

Thank you for this PR! This is a well-thought-out refactoring that successfully achieves its stated goal of enabling async gem code within streaming components. Here's my detailed review:

**✅ Strengths**

**1. Clear Architecture Improvement**

The migration from manual Fiber management to async gem primitives creates a unified concurrency model. This is a solid architectural decision that:

**2. Well-Documented PR Description**

The PR description excellently explains:

**3. Maintains Existing Functionality**

Good job preserving:
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
**249-280**: Cached streaming currently ignores or inconsistently applies `on_complete`.

In the cached streaming helpers, `on_complete` behavior diverges between cache MISS and HIT, and a user-supplied `on_complete` can be silently overwritten:

- In `handle_stream_cache_miss`, `cache_aware_options = raw_options.merge(on_complete: ->(chunks) { ... })` overwrites any user-provided `raw_options[:on_complete]`, so callers of `cached_stream_react_component` won't see their callback invoked on the initial (miss) render.
- In `handle_stream_cache_hit`, `raw_options[:on_complete]` is never used, so callbacks are never invoked on cache hits either.
- This contradicts the documented contract that "all the same options as `stream_react_component` apply" to `cached_stream_react_component`.

A small change can preserve the user callback and make HIT/MISS semantics consistent while still using `on_complete` internally for caching:

```diff
 def handle_stream_cache_hit(component_name, raw_options, auto_load_bundle, cached_chunks)
   render_options = ReactOnRails::ReactComponent::RenderOptions.new(
     react_component_name: component_name,
     options: { auto_load_bundle: auto_load_bundle }.merge(raw_options)
   )
   load_pack_for_generated_component(component_name, render_options)

   initial_result, *rest_chunks = cached_chunks

   # Enqueue remaining chunks asynchronously
   @async_barrier.async do
     rest_chunks.each { |chunk| @main_output_queue.enqueue(chunk) }
   end
+
+  # Invoke any caller-provided on_complete callback with all chunks
+  if (on_complete = raw_options[:on_complete])
+    on_complete.call(cached_chunks)
+  end
+
   # Return first chunk directly
   initial_result
 end

 def handle_stream_cache_miss(component_name, raw_options, auto_load_bundle, view_cache_key, &block)
-  cache_aware_options = raw_options.merge(
-    on_complete: ->(chunks) {
-      Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
-    }
-  )
+  user_on_complete = raw_options[:on_complete]
+  cache_aware_options = raw_options.merge(
+    on_complete: ->(chunks) {
+      Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
+      user_on_complete&.call(chunks)
+    }
+  )

   render_stream_component_with_props(
     component_name, cache_aware_options, auto_load_bundle, &block
   )
 end
```

This way:
- On a MISS, chunks are cached and then any user callback runs once the full set of chunks is known.
- On a HIT, all cached chunks are already available, so the user callback can be invoked immediately while the remaining chunks are asynchronously enqueued to the main output queue.
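As an aside, the wrap-the-callback idea above is plain Ruby closure composition and can be sanity-checked in isolation. The following standalone sketch uses made-up names (`compose_on_complete`, `cache_writer`) purely for illustration; only the composition pattern mirrors the suggestion:

```ruby
# Standalone sketch: compose a caching callback with a user-supplied one.
# All names here are hypothetical.
def compose_on_complete(options, cache_writer)
  user_on_complete = options[:on_complete]
  options.merge(
    on_complete: lambda { |chunks|
      cache_writer.call(chunks)       # stands in for Rails.cache.write(...)
      user_on_complete&.call(chunks)  # preserve the caller's callback
    }
  )
end

written = nil
seen = nil

opts = compose_on_complete(
  { on_complete: ->(chunks) { seen = chunks } },
  ->(chunks) { written = chunks }
)
opts[:on_complete].call(%w[a b c])
# Both the cache writer and the original callback observe all chunks.
```

Because the user callback is captured before the `merge`, the caching lambda can safely shadow the `:on_complete` key without losing it.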
♻️ Duplicate comments (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
301-341: Ensure `run_stream_inside_fiber` never deadlocks waiting for the first chunk.

As written, if the block passed to `run_stream_inside_fiber` or `stream.each_chunk` raises before assigning `first_chunk_var.value`, the final `first_chunk_var.wait` will block indefinitely and the request will hang. Examples include:
- The internal renderer raising before returning a stream.
- The stream raising before yielding any chunks (e.g., upstream error fetching data).
This is the same class of issue that was highlighted in an earlier review; the new async implementation still needs an explicit guarantee that `first_chunk_var` is set on all code paths (success and failure) so the caller is always unblocked.

You can wrap the async block in a `begin`/`rescue` and set `first_chunk_var` in the error path when we haven't produced a first chunk yet:

```diff
 def run_stream_inside_fiber(on_complete:)
   require "async/variable"

   if @async_barrier.nil?
     raise ReactOnRails::Error,
           "You must call stream_view_containing_react_components to render the view containing the react component"
   end

   # Create a variable to hold the first chunk for synchronous return
   first_chunk_var = Async::Variable.new
-  all_chunks = [] if on_complete # Only collect if callback provided
-
-  # Start an async task on the barrier to stream all chunks
-  @async_barrier.async do
-    stream = yield
-    is_first = true
-
-    stream.each_chunk do |chunk|
-      all_chunks << chunk if on_complete # Collect for callback
-
-      if is_first
-        # Store first chunk in variable for synchronous access
-        first_chunk_var.value = chunk
-        is_first = false
-      else
-        # Enqueue remaining chunks to main output queue
-        @main_output_queue.enqueue(chunk)
-      end
-    end
-
-    # Handle case where stream has no chunks
-    first_chunk_var.value = nil if is_first
-
-    # Call callback with all chunks when streaming completes
-    on_complete&.call(all_chunks) if on_complete
-  end
+  all_chunks = [] if on_complete # Only collect if callback provided
+
+  # Start an async task on the barrier to stream all chunks
+  @async_barrier.async do
+    is_first = true
+
+    begin
+      stream = yield
+
+      stream.each_chunk do |chunk|
+        all_chunks << chunk if on_complete # Collect for callback
+
+        if is_first
+          # Store first chunk in variable for synchronous access
+          first_chunk_var.value = chunk
+          is_first = false
+        else
+          # Enqueue remaining chunks to main output queue
+          @main_output_queue.enqueue(chunk)
+        end
+      end
+
+      # Handle case where stream has no chunks
+      if is_first
+        first_chunk_var.value = nil
+        is_first = false
+      end
+
+      # Call callback with all chunks when streaming completes
+      on_complete&.call(all_chunks) if on_complete
+    rescue => e
+      # Ensure the waiting caller is always unblocked on error
+      if is_first
+        first_chunk_var.value = nil
+        is_first = false
+      end
+      raise
+    end
+  end

   # Wait for and return the first chunk (blocking)
   first_chunk_var.wait
   first_chunk_var.value
 end
```

This keeps the existing behavior (including `on_complete`) while guaranteeing the caller never blocks forever and still allowing errors to propagate via the barrier.

Please double-check this against the current `async` gem docs to confirm that setting `Async::Variable#value` in both success and rescue paths is safe and that `wait` behaves as expected when the value is set before or during the wait.
🧹 Nitpick comments (2)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2)
126-138: On-complete plumbing through `stream_react_component` looks solid; consider documenting callback shape.

Extracting `:on_complete` and keeping it out of the options passed to `internal_stream_react_component` is correct and keeps the React renderer options clean. To make this discoverable, you may want to document `:on_complete` in the method's YARD-style param docs (e.g., that it receives an array of all streamed chunks once streaming completes).
188-199: RSC payload streaming now supports `on_complete`; no extra RSC checks needed here.

Plumbing `:on_complete` through `rsc_payload_react_component` into `run_stream_inside_fiber` mirrors the HTML streaming path and looks consistent. Given prior decisions that RSC enablement is validated deeper in the stack, it's good that no extra configuration checks were added here (avoids redundant validation). If you intend `on_complete` to be public for RSC as well, consider briefly documenting it alongside the other options.

Based on learnings, RSC support validation is intentionally handled in lower-level code paths rather than in this helper.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (4 hunks)
🧰 Additional context used
🧠 Learnings (8)
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation in React on Rails Pro is handled through a chain of validations:
1. Pro version check in `run_stream_inside_fiber`
2. RSC support check during pack generation via `ReactOnRailsPro.configuration.enable_rsc_support`
3. RSC support validation during component registration
This makes additional validation in the helper methods unnecessary.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-02-13T16:50:47.848Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: node_package/src/clientStartup.ts:18-21
Timestamp: 2025-02-13T16:50:47.848Z
Learning: In the react_on_rails module, the `reactOnRailsPageUnloaded` function in clientStartup.ts is intentionally kept private as it's only used internally as a callback for `onPageUnloaded`.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation is handled in deeper level calls of the React on Rails Pro codebase, so it doesn't need to be validated again in the `rsc_payload_react_component` helper method.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-09-15T21:24:48.207Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1781
File: node_package/src/ClientSideRenderer.ts:82-95
Timestamp: 2025-09-15T21:24:48.207Z
Learning: In React on Rails, the force_load feature includes both explicit `data-force-load="true"` usage and the ability to hydrate components during the page loading state (`document.readyState === 'loading'`). Both capabilities require a Pro license, so the condition `!railsContext.rorPro && (isComponentForceLoaded || document.readyState === 'loading')` correctly gates both scenarios.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-09-16T08:01:11.146Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1770
File: lib/generators/react_on_rails/templates/base/base/app/javascript/src/HelloWorld/ror_components/HelloWorld.client.jsx:2-2
Timestamp: 2025-09-16T08:01:11.146Z
Learning: React on Rails uses webpack CSS Modules configuration with namedExports: true, which requires the import syntax `import * as style from './file.module.css'` rather than the default export pattern. This configuration enables better tree shaking and bundle size optimization for CSS modules.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-04-09T12:56:10.756Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/RSCPayloadContainer.ts:0-0
Timestamp: 2025-04-09T12:56:10.756Z
Learning: In the react_on_rails codebase, RSC payloads are already stringified using `JSON.stringify()` before being processed by the `escapeScript` function, which handles escaping of special characters. The function only needs to handle specific HTML markers like comments and closing script tags.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-04-26T21:55:55.874Z
Learnt from: alexeyr-ci2
Repo: shakacode/react_on_rails PR: 1732
File: spec/dummy/client/app-react16/startup/ReduxSharedStoreApp.client.jsx:40-44
Timestamp: 2025-04-26T21:55:55.874Z
Learning: In the react_on_rails project, files under `app-react16` directories are copied/moved to corresponding `/app` directories during the conversion process (removing the `-react16` suffix), which affects their relative import paths at runtime.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: examples (3.2, minimum)
- GitHub Check: examples (3.4, latest)
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: pro-lint-js-and-ruby
- GitHub Check: claude-review
Replace @rorp_rendering_fibers with @async_barrier and @main_output_queue in test setup. Update all tests to: - Wrap in Sync blocks to provide async context - Use @async_barrier.wait and @main_output_queue.close for synchronization - Collect chunks from queue instead of calling fiber.resume - Remove timing-dependent assertions on chunks_read.count This completes the migration from manual Fiber management to async gem primitives for component streaming tests. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude <[email protected]>
Code Review: Replace manual Fibers with Async gem primitives

I've reviewed this PR in detail. Overall, this is a well-executed architectural improvement with significant benefits. Here's my detailed feedback:

Strengths

1. Clear Problem Definition and Solution
2. Code Quality
3. Test Coverage
Critical Issues

1. RuboCop Compliance (BLOCKING)

Per CLAUDE.md, you MUST run `bundle exec rubocop` in `react_on_rails_pro` before merging. This is a hard requirement.

Medium Priority Recommendations

2. Memory Management Concern

In `run_stream_inside_fiber`, `all_chunks` buffers ALL chunks in memory when caching is enabled. For large streams, this could consume significant memory. Recommendation: Add documentation comments about memory implications and consider adding warnings for large streams.

3. Error Handling Edge Case

What happens if `stream.each_chunk` raises an exception mid-stream? The `on_complete` callback won't be called with partial chunks, and partial chunks may remain in the queue. Recommendation: Consider wrapping streaming logic in a rescue block for proper cleanup.

Low Priority Recommendations

4. Race Condition Documentation

In `drain_streams_concurrently`, the synchronization logic is correct but deserves comments explaining that `barrier.wait` ensures all producers finish before signaling EOF.

5. Test Implementation Details

Test expectations changed from `eq(1)` to `eq(3)`, suggesting tests now verify implementation details rather than observable behavior.

6. Documentation Gaps

Missing YARD docs for the `on_complete` parameter and no migration guide for users who may have customized the old implementation.

Security and Performance

Security: No obvious issues. Backpressure via `LimitedQueue` is maintained (good for DoS protection).

Performance: Positive changes include concurrent chunk writing and simplified code paths. Potential regression from async gem overhead should be benchmarked.

Testing Recommendations

Before merging:
Final Verdict

Overall Assessment: Approve with minor recommendations

This is high-quality work that:
Before merging:
Risk Level: Low-Medium

Per CLAUDE.md guidelines, also consider adding an entry to CHANGELOG_PRO.md as this is a user-visible architectural change.

Great work overall! Address the RuboCop check and consider the recommendations before merging.

Generated with Claude Code
Actionable comments posted: 0
🧹 Nitpick comments (1)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (1)
364-377: Async streaming patterns verified as correct; optional DRY refactor recommended

The async patterns you've used are confirmed to be correct and follow the async gem's intended approach. After `Async::Queue#close`, calling `#dequeue` returns `nil` once the queue is drained, and the sequence `barrier.wait` → `queue.close` → `while (chunk = queue.dequeue)` is the recommended way to consume all remaining items.

The test scaffolding correctly simulates the streaming context that `stream_view_containing_react_components` provides, and your drain pattern mirrors the `drain_streams_concurrently` behavior for deterministic chunk collection.

Optional refactor: The Sync + barrier/queue initialization and wait/close/dequeue loops repeat across lines 364–377, 390–392, 394–423, 438–451, 721–742, and 782–788. Extracting these into small helper methods (e.g., `with_streaming_async_context` and `wait_and_collect_stream_chunks`) would reduce duplication and make future streaming changes easier to propagate across the spec.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (6 hunks)
🧰 Additional context used
🧠 Learnings (7)
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: theforestvn88
Repo: shakacode/react_on_rails PR: 1620
File: spec/dummy/client/app/startup/HelloTurboStream.jsx:3-3
Timestamp: 2024-10-08T20:53:47.076Z
Learning: The `RailsContext` import in `spec/dummy/client/app/startup/HelloTurboStream.jsx` is used later in the project, as clarified by the user theforestvn88.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation in React on Rails Pro is handled through a chain of validations:
1. Pro version check in `run_stream_inside_fiber`
2. RSC support check during pack generation via `ReactOnRailsPro.configuration.enable_rsc_support`
3. RSC support validation during component registration
This makes additional validation in the helper methods unnecessary.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-09-15T21:24:48.207Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1781
File: node_package/src/ClientSideRenderer.ts:82-95
Timestamp: 2025-09-15T21:24:48.207Z
Learning: In React on Rails, the force_load feature includes both explicit `data-force-load="true"` usage and the ability to hydrate components during the page loading state (`document.readyState === 'loading'`). Both capabilities require a Pro license, so the condition `!railsContext.rorPro && (isComponentForceLoaded || document.readyState === 'loading')` correctly gates both scenarios.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-04-09T13:02:16.009Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-04-09T12:56:10.756Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/RSCPayloadContainer.ts:0-0
Timestamp: 2025-04-09T12:56:10.756Z
Learning: In the react_on_rails codebase, RSC payloads are already stringified using `JSON.stringify()` before being processed by the `escapeScript` function, which handles escaping of special characters. The function only needs to handle specific HTML markers like comments and closing script tags.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1622
File: spec/dummy/spec/rake/assets_precompile_rake_spec.rb:12-12
Timestamp: 2024-10-08T20:53:47.076Z
Learning: When stubbing environment variables in RSpec tests, use `before` and `after` hooks to ensure that the original values are restored after the tests, preventing any side effects on other tests. Example provided by justin808:
```ruby
describe "My test" do
before do
original_value = ENV["VARIABLE_NAME"]
allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_return("stubbed_value")
end
after do
allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_call_original
ENV["VARIABLE_NAME"] = original_value
end
it "tests something" do
# Your test code here
end
end
```
This practice ensures test isolation and reliability.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
🧬 Code graph analysis (1)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (2)
react_on_rails_pro/spec/react_on_rails_pro/stream_spec.rb (1)
`include` (8-35)

spec/dummy/spec/helpers/react_on_rails_helper_spec.rb (1)

`include` (6-9)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: examples (3.4, latest)
- GitHub Check: examples (3.2, minimum)
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: pro-lint-js-and-ruby
- GitHub Check: claude-review
🔇 Additional comments (1)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (1)
3-6: `Async::Barrier` require is appropriate and verified

The canonical require path for using `Async::Barrier` is `require 'async/barrier'`, confirming that line 5 is correct. No issues found.
Code Review: Replace manual Fibers with Async gem primitives

I've thoroughly reviewed this PR. Overall, this is a well-architected refactoring that modernizes the concurrency model. The code quality is high, and the approach is sound. Here are my detailed findings:

✅ Strengths

1. Excellent Architecture Decision
2. Code Simplification
3. Smart Caching Design
4. Test Coverage
🔍 Potential Issues & Recommendations

1. Critical: Memory Leak Risk with
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
249-261: Add guard to `handle_stream_cache_hit` for consistent error messaging

The cache hit path (line 259) calls `@async_barrier.async` without checking if the async context exists, creating an asymmetry with the cache miss path, which eventually reaches `run_stream_inside_fiber` with its explicit guard at lines 304-307. If `cached_stream_react_component` is called outside `stream_view_containing_react_components`, the cache hit returns a cryptic `NoMethodError`, while cache miss would raise a clear error message.

Add this guard before line 259:

```ruby
raise ReactOnRails::Error, "You must call stream_view_containing_react_components to render the view containing the react component" if @async_barrier.nil?
```

This ensures consistent, actionable error messages regardless of cache state.
♻️ Duplicate comments (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
301-341: Ensure `run_stream_inside_fiber` cannot hang if streaming fails before setting `first_chunk_var`

The new Async-based `run_stream_inside_fiber` has a solid structure (barrier-scoped task, `Async::Variable` for sync first-chunk return, and optional `all_chunks` collection for `on_complete`). However, there's a subtle failure mode:
- If `yield` (e.g., `internal_stream_react_component` / `internal_rsc_payload_react_component`) or `stream.each_chunk` raises before any chunk is processed, `first_chunk_var.value` is never set.
- The outer call to `first_chunk_var.wait` will then block indefinitely unless the Async runtime aborts the current task in a way that raises here, which is not guaranteed by this method's local logic.

This is the same deadlock risk previously flagged in an earlier review on this PR; the current version appears to have reintroduced it by removing the defensive assignment in the exception path. To make this bulletproof, the async block should guarantee that `first_chunk_var.value` is set exactly once, even on error, before re-raising:

```diff
 @async_barrier.async do
-  stream = yield
-  is_first = true
-
-  stream.each_chunk do |chunk|
-    all_chunks << chunk if on_complete
-    if is_first
-      first_chunk_var.value = chunk
-      is_first = false
-    else
-      @main_output_queue.enqueue(chunk)
-    end
-  end
-
-  first_chunk_var.value = nil if is_first
-  on_complete&.call(all_chunks)
+  begin
+    stream = yield
+    is_first = true
+
+    stream.each_chunk do |chunk|
+      all_chunks << chunk if on_complete
+
+      if is_first
+        first_chunk_var.value = chunk
+        is_first = false
+      else
+        @main_output_queue.enqueue(chunk)
+      end
+    end
+
+    # Handle case where stream has no chunks
+    first_chunk_var.value = nil if is_first
+    on_complete&.call(all_chunks)
+  rescue => e
+    # Ensure waiter is unblocked even on failure
+    first_chunk_var.value = nil if first_chunk_var.value.nil?
+    raise
+  end
 end
```

You can also simplify the caller to `return first_chunk_var.wait` and drop the separate `.value` read.

This ensures that:

- callers never hang on `wait`, and
- failures still propagate (so error handling/circuit-breaking happens as expected).
In the ruby 'async' gem, does `Async::Variable#wait` automatically raise if the task that was supposed to set its value fails before assignment, or can it block indefinitely unless the code explicitly sets a value in error paths?
🧹 Nitpick comments (4)
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (3)
5-5: Rubocop pragma and async require look appropriate, but scope could be narrower

The added `require "async/barrier"` matches the new Async usage in this spec, and the `RSpec/InstanceVariable` disable/enable is justified given the shared `@async_barrier`, `@main_output_queue`, and other ivars across examples. If you want to keep cops tighter, consider limiting the disable/enable region to just the streaming-related `describe` blocks rather than the entire file, but it's not strictly necessary.

Also applies to: 11-12, 820-820
784-790: Around hook for attribution specs is consistent but duplicated with prior setup

The attribution specs now use their own `around` hook to set up `Sync`, `@async_barrier`, and `@main_output_queue`, which is consistent with the earlier streaming specs. Given this pattern is repeated in multiple contexts (`#stream_react_component` examples and here), you may want to factor out a small shared helper (e.g., `with_stream_async_context`) to reduce duplication and keep future changes to the async harness in one place.
365-379: Use `Async::LimitedQueue` in test setup to match production implementation

The queue drain logic is sound—`Async::Queue#dequeue` correctly returns `nil` when closed and empty, so both the current `while` pattern and the suggested explicit `break if nil` pattern work without risk of hanging. However, there's a queue type mismatch: production uses `Async::LimitedQueue` (stream.rb line 42), but the test initializes a plain `Async::Queue` (line 371). Using the same queue type reduces behavioral drift between tests and runtime.
@main_output_queue = Async::Queue.newto:
@main_output_queue = Async::LimitedQueue.new(buffer_size)Alternatively, if buffer sizing isn't needed for tests, you may document why the plain
Queueis preferred. The explicitnil-break style suggestion is optional—both patterns are sound and idiomatic.react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
132-135: Newon_completecallback plumbing is clean but undocumented in the helper docsExtracting
:on_completefromoptionsand passing it intorun_stream_inside_fiber(on_complete: ...)is a tidy way to introduce a streaming-completion callback without leaking this option down into the React renderer.To make this discoverable and avoid confusion for users, consider updating the yard-style documentation above
stream_react_componentandrsc_payload_react_componentto mention that:
- `:on_complete` is an accepted option,
- it runs once after the stream has fully completed.
This keeps the public API and implementation in sync.
Also applies to: 193-196
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (4 hunks)

react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (7 hunks)
🧰 Additional context used
🧠 Learnings (10)
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation in React on Rails Pro is handled through a chain of validations:
1. Pro version check in `run_stream_inside_fiber`
2. RSC support check during pack generation via `ReactOnRailsPro.configuration.enable_rsc_support`
3. RSC support validation during component registration
This makes additional validation in the helper methods unnecessary.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rbreact_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-13T16:50:47.848Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: node_package/src/clientStartup.ts:18-21
Timestamp: 2025-02-13T16:50:47.848Z
Learning: In the react_on_rails module, the `reactOnRailsPageUnloaded` function in clientStartup.ts is intentionally kept private as it's only used internally as a callback for `onPageUnloaded`.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-09-15T21:24:48.207Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1781
File: node_package/src/ClientSideRenderer.ts:82-95
Timestamp: 2025-09-15T21:24:48.207Z
Learning: In React on Rails, the force_load feature includes both explicit `data-force-load="true"` usage and the ability to hydrate components during the page loading state (`document.readyState === 'loading'`). Both capabilities require a Pro license, so the condition `!railsContext.rorPro && (isComponentForceLoaded || document.readyState === 'loading')` correctly gates both scenarios.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rbreact_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-18T13:08:01.477Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: lib/react_on_rails/helper.rb:190-197
Timestamp: 2025-02-18T13:08:01.477Z
Learning: RSC support validation is handled in deeper level calls of the React on Rails Pro codebase, so it doesn't need to be validated again in the `rsc_payload_react_component` helper method.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb
📚 Learning: 2025-04-09T12:56:10.756Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/RSCPayloadContainer.ts:0-0
Timestamp: 2025-04-09T12:56:10.756Z
Learning: In the react_on_rails codebase, RSC payloads are already stringified using `JSON.stringify()` before being processed by the `escapeScript` function, which handles escaping of special characters. The function only needs to handle specific HTML markers like comments and closing script tags.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rbreact_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.
Applied to files:
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rbreact_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: theforestvn88
Repo: shakacode/react_on_rails PR: 1620
File: spec/dummy/client/app/startup/HelloTurboStream.jsx:3-3
Timestamp: 2024-10-08T20:53:47.076Z
Learning: The `RailsContext` import in `spec/dummy/client/app/startup/HelloTurboStream.jsx` is used later in the project, as clarified by the user theforestvn88.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-04-09T13:02:16.009Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1696
File: node_package/src/transformRSCNodeStream.ts:3-31
Timestamp: 2025-04-09T13:02:16.009Z
Learning: In the React Server Components (RSC) stream processing implementation, all chunks end with a newline character (`\n`), so the `lastIncompleteChunk` variable is expected to be empty by the time the stream ends, making a flush handler unnecessary in the Transform stream.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2024-10-08T20:53:47.076Z
Learnt from: justin808
Repo: shakacode/react_on_rails PR: 1622
File: spec/dummy/spec/rake/assets_precompile_rake_spec.rb:12-12
Timestamp: 2024-10-08T20:53:47.076Z
Learning: When stubbing environment variables in RSpec tests, use `before` and `after` hooks to ensure that the original values are restored after the tests, preventing any side effects on other tests. Example provided by justin808:
```ruby
describe "My test" do
before do
original_value = ENV["VARIABLE_NAME"]
allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_return("stubbed_value")
end
after do
allow(ENV).to receive(:[]).with("VARIABLE_NAME").and_call_original
ENV["VARIABLE_NAME"] = original_value
end
it "tests something" do
# Your test code here
end
end
```
This practice ensures test isolation and reliability.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
📚 Learning: 2025-02-13T14:29:49.267Z
Learnt from: AbanoubGhadban
Repo: shakacode/react_on_rails PR: 1644
File: spec/react_on_rails/utils_spec.rb:218-218
Timestamp: 2025-02-13T14:29:49.267Z
Learning: In RSpec tests, prefer using local variables over constants within test blocks to avoid constant redefinition warnings and maintain better test isolation.
Applied to files:
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb
🧬 Code graph analysis (1)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (2)
packages/react-on-rails-pro/tests/AsyncQueue.ts (1): `enqueue` (10-20)
lib/react_on_rails/react_component/render_options.rb (1): `auto_load_bundle` (74-76)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: examples (3.2, minimum)
- GitHub Check: examples (3.4, latest)
- GitHub Check: pro-lint-js-and-ruby
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: build-dummy-app-webpack-test-bundles
- GitHub Check: claude-review
🔇 Additional comments (2)
react_on_rails_pro/app/helpers/react_on_rails_pro_helper.rb (1)
268-279: Write-through caching via `on_complete` is well-integrated. The MISS path's `cache_aware_options`:

```ruby
cache_aware_options = raw_options.merge(
  on_complete: lambda { |chunks| Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {}) }
)
```

plugs neatly into `run_stream_inside_fiber`'s `on_complete` handling so that:
- all streamed chunks are collected server-side, and
- the cache is populated only after the full stream completes.

This design avoids eager evaluation of props, keeps the cache shape consistent with the HIT path (`Array` of chunks), and correctly threads through `cache_options`. Nicely done.
react_on_rails_pro/spec/dummy/spec/helpers/react_on_rails_pro_helper_spec.rb (1)
723-745: Current code correctly follows async gem idioms; the review concern is unfounded. The verification confirms that using `nil` as a termination sentinel after `Async::Queue#close` is the intended, idiomatic behaviour in the Ruby async gem. The bare `while @main_output_queue.dequeue` loop is not fragile; it is the established pattern. No refactoring is necessary. Likely an incorrect or invalid review comment.
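The sentinel behaviour in question can be illustrated with the stdlib `Thread::Queue`, which behaves the same way here: after `close`, `pop` returns `nil` once the queue is drained. This is a stand-in sketch, not the PR's code:

```ruby
queue = Thread::Queue.new

producer = Thread.new do
  3.times { |i| queue << "chunk-#{i}" }
  queue.close # after close, pop returns nil once the queue is empty
end

written = []
while (chunk = queue.pop) # nil sentinel ends the loop
  written << chunk
end
producer.join
```

The same bare `while` shape is what the helper's drain loop relies on, which is why no explicit sentinel value needs to be enqueued.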
Code Review: Replace manual Fibers with Async gem primitives
Great work on this architectural refactoring! This is a well-thought-out solution that simplifies the code while enabling future async capabilities. Here's my detailed review:
✅ Strengths
1. Architectural Improvement
2. Smart Use of Async Primitives
3. Test Coverage
🔍 Potential Issues & Suggestions
1. Critical: Thread Safety of Instance Variables
Issue:
```ruby
# In stream.rb - line 40
@async_barrier = Async::Barrier.new
@main_output_queue = Async::LimitedQueue.new
```
Risk: If the same controller instance handles concurrent requests (e.g., in threaded servers like Puma), these could be shared across requests.
Recommendation:
2. Memory Management: `on_complete` Callback
Issue: when `on_complete` is provided, every chunk is buffered:
```ruby
# helper.rb line 310
all_chunks = [] if on_complete # Collects all chunks
```
Risk: For large responses or high concurrency, this could consume significant memory.
Recommendations:
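One hedged way to bound that buffer is a cap on collected chunks; the `max_cached_chunks` limit and `ChunkCollector` class below are illustrative only, not an option the PR provides:

```ruby
# Hypothetical cap on the chunks buffered for the on_complete callback.
# `max_cached_chunks` is illustrative only; it is not an option in the PR.
class ChunkCollector
  def initialize(max_cached_chunks: 1000)
    @max = max_cached_chunks
    @chunks = []
    @overflowed = false
  end

  def <<(chunk)
    if @chunks.size < @max
      @chunks << chunk
    else
      @overflowed = true # too large to cache; remember so the write is skipped
    end
    self
  end

  # Yields the collected chunks only when the whole stream fit under the cap,
  # so an oversized stream is still streamed but never cached.
  def complete
    yield @chunks unless @overflowed
  end
end
```

Under this design, a response that exceeds the cap simply skips the `Rails.cache.write`, trading cache coverage for bounded memory.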
3. Error Handling: Empty Stream Case
Code:
```ruby
# helper.rb line 331
first_chunk_var.value = nil if is_first
```
Question: What happens when a component has no chunks? Does returning `nil` as the first chunk behave correctly for callers?
Recommendation:
4. Queue Cleanup on Errors
Issue: If an error occurs during streaming, the queue might not be properly closed.
Recommendation:
```ruby
# In drain_streams_concurrently
begin
  @async_barrier.wait
ensure
  @main_output_queue.close
  writing_task.wait
end
```
This ensures cleanup even on exceptions. (Current code does have an `ensure` that closes the response stream.)
5. Race Condition: First Chunk Variable
Subtle issue: There's a potential race between:
If the async scheduler doesn't immediately start the task, the wait could happen before the task runs, though this is unlikely with the async gem's scheduling.
Recommendation: Add a comment explaining the synchronization guarantee, or consider using a timeout as a safeguard.
🎯 Code Quality & Best Practices
1. Good: Backward Compatibility
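The timeout safeguard can be sketched with a stdlib condition variable and a monotonic deadline. This is a stand-in only; it does not claim that the async gem's `Async::Variable#wait` accepts a timeout argument:

```ruby
# Stand-in variable with a deadline-aware wait. Illustrative only; I am not
# asserting that Async::Variable#wait accepts a timeout argument.
class TimedVariable
  def initialize
    @mutex = Mutex.new
    @cond = ConditionVariable.new
    @resolved = false
  end

  def value=(val)
    @mutex.synchronize do
      @value = val
      @resolved = true
      @cond.broadcast
    end
  end

  # Returns [:ok, value], or [:timeout, nil] once the deadline passes.
  def wait(timeout)
    deadline = Process.clock_gettime(Process::CLOCK_MONOTONIC) + timeout
    @mutex.synchronize do
      until @resolved
        remaining = deadline - Process.clock_gettime(Process::CLOCK_MONOTONIC)
        return [:timeout, nil] if remaining <= 0
        @cond.wait(@mutex, remaining)
      end
      [:ok, @value]
    end
  end
end
```

A helper built this way could fail fast with a clear error instead of hanging a request when a producer never delivers its first chunk.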
2. Good: Documentation
3. Testing Suggestions
Missing test cases:
Example test to add:
```ruby
it "handles empty streams gracefully" do
  allow_any_instance_of(Stream).to receive(:each_chunk).and_return(enum_for(:each_chunk))
  result = stream_react_component(component_name, props: props)
  expect(result).to be_nil # or expect appropriate fallback
end
```
🔒 Security Considerations
⚡ Performance Considerations
📋 Minor Issues
1. Rubocop Disable Comment
`# rubocop:disable RSpec/InstanceVariable`
Suggestion: Consider refactoring to avoid instance variables in tests, e.g. by using `let`.
2. Test Comment Accuracy
```ruby
# NOTE: With async architecture, chunks are consumed in background immediately,
expect(chunks_read.count).to eq(3)
```
This comment explains the behavior change but could be clearer about why this is expected/acceptable.
✅ Pre-Merge Checklist
Before merging, ensure:
🎉 Conclusion
This is a well-designed refactoring that achieves its goals. The main concerns are around thread-safety of instance variables and memory management for the `on_complete` chunk buffer.
Recommendation: Approve with minor changes (address thread-safety documentation and add test cases for edge cases). The architectural direction is sound and this unblocks important future work. Nice job! 🚀 |
Code Review: Async Gem Migration for Component Streaming
I've reviewed this PR and have several observations about code quality, architecture, and potential issues.
✅ Strengths
1. Clear Architectural Improvement
2. Good Use of Async Primitives
3. Test Coverage
The test updates properly reflect the architectural changes, switching from Fiber-based assertions to barrier/queue-based ones.
|
PR Review: Replace manual Fibers with Async gem primitives for component streaming
Summary
This PR refactors the component streaming implementation from manual Fiber management to async gem primitives (`Async::Barrier`, `Async::Variable`, `Async::LimitedQueue`).
Code Quality & Architecture
✅ Strengths
Potential Issues & Concerns
🔴 CRITICAL: Missing Error Handling
Location:
```ruby
@async_barrier.async do
  stream.each_chunk do |chunk|
    all_chunks << chunk if on_complete
    if is_first
      first_chunk_var.value = chunk
      is_first = false
    else
      @main_output_queue.enqueue(chunk)
    end
  end
  first_chunk_var.value = nil if is_first
  on_complete&.call(all_chunks)
end
```
Problem: There's no error handling in the async task. If `stream.each_chunk` raises, `first_chunk_var` may never be set and the waiting caller can hang.
Recommendation:
```ruby
@async_barrier.async do
  begin
    stream.each_chunk do |chunk|
      # ... existing logic ...
    end
    first_chunk_var.value = nil if is_first
    on_complete&.call(all_chunks)
  rescue => e
    # Ensure first_chunk_var is set even on error to unblock waiting code
    first_chunk_var.value = nil unless first_chunk_var.resolved?
    raise # Re-raise to propagate to barrier
  end
end
```
🟡 MEDIUM: Race Condition with Empty Streams
Location: Same location as above (`first_chunk_var.value = nil if is_first`)
Problem: If `each_chunk` never yields, this line is the only thing that resolves the variable, and it only runs after `each_chunk` returns.
Recommendation: Add a test case for empty streams and document expected behavior.
🟡 MEDIUM: Sync Block Placement Change
Location: The
Question: Has this been tested with complex templates that might do database queries, external API calls, or other blocking operations? The PR description doesn't mention testing this scenario.
🟡 MEDIUM: Behavioral Change (Not Just Refactoring)
Location: The test comment says "NOTE: With async architecture, chunks are consumed in background immediately". This is a behavioral change, not just a refactoring. The old implementation had lazy evaluation (chunks consumed on-demand); the new one is eager (all chunks consumed immediately).
Question: What are the implications for:
The PR claims "Same Functionality" but this appears to be a semantic difference.
Test Coverage
✅ Good Coverage
The PR updates tests comprehensively for the new async architecture.
❌ Missing Test Cases
Recommendation: Add tests for error scenarios:
```ruby
it "handles errors during streaming gracefully" do
  allow(stream).to receive(:each_chunk).and_raise(StandardError, "Stream error")
  expect { stream_react_component(...) }.to raise_error(StandardError, "Stream error")
end

it "handles empty streams without hanging" do
  allow(stream).to receive(:each_chunk) # Never yields
  result = stream_react_component(...)
  expect(result).to be_nil
end
```
Compliance with CLAUDE.md
❌ Potential Issues
Recommendations
Must Fix Before Merge
Should Consider
Overall Assessment
Quality: Good. The refactoring simplifies the codebase and achieves the stated goal.
Completeness: Medium. Core functionality is there, but error handling and edge cases need work.
Risk Level: Medium. The behavioral change from lazy to eager evaluation could have unintended consequences.
Recommendation: Request changes, primarily adding error handling and tests for failure scenarios before merge. |
Code Review: Replace manual Fibers with Async gem primitives
I've reviewed this PR and overall it's a well-architected refactoring that successfully modernizes the concurrency model. Here's my detailed feedback:
✅ Strengths
1. Excellent Architectural Improvement
2. Significant Code Simplification
3. Proper Test Updates
4. Smart Caching Refactor
|
Code Review: Replace manual Fibers with Async gem primitives
All CI checks are passing. Here is my detailed review of PR #2111.
Architecture and Design: Excellent Improvements
1. Unified Concurrency Model
2. Simplified Code Structure
3. Callback-Based Cache Writes
Potential Issues
CRITICAL: Empty Stream Handling
In `run_stream_inside_fiber` at react_on_rails_pro_helper.rb:313-337, if `stream.each_chunk` yields zero chunks, `first_chunk_var.wait` will block forever because the variable is never set.
Root cause: The line `first_chunk_var.value = nil if is_first` only runs AFTER `each_chunk` completes. If `each_chunk` blocks or returns without yielding, we deadlock.
Fix: Track whether any chunks were yielded with a `has_chunks` variable, set to false initially and flipped on the first chunk, then check after the loop.
MODERATE: Initialization Order
Instance variables `@async_barrier` and `@main_output_queue` are set before `render_to_string`, which is correct. Worth documenting this ordering requirement prominently, since helpers depend on these being initialized first.
Suggestion: Add a comment explaining that the async primitives must be initialized BEFORE `render_to_string`.
MINOR: Error Propagation
Verify: Does the writing task handle exceptions from `response.stream.write`? What if the client disconnects mid-stream?
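The invariant behind that fix is that the first-chunk variable resolves exactly once, even for an empty or failing stream. It can be sketched without the async gem, using a stdlib stand-in for `Async::Variable`; the `FirstChunkVar` class and thread-based producer are illustrative, not the PR's code:

```ruby
# Minimal stand-in for Async::Variable: resolves at most once, wakes waiters.
class FirstChunkVar
  def initialize
    @mutex = Mutex.new
    @cond = ConditionVariable.new
    @resolved = false
  end

  def resolved?
    @mutex.synchronize { @resolved }
  end

  def value=(val)
    @mutex.synchronize do
      unless @resolved # resolve at most once
        @value = val
        @resolved = true
        @cond.broadcast
      end
    end
  end

  def wait
    @mutex.synchronize do
      @cond.wait(@mutex) until @resolved
      @value
    end
  end
end

# Producer that always resolves the variable: with the first chunk, with a nil
# sentinel for an empty stream, or with nil when the stream raises mid-way.
def stream_first_chunk(chunks, var)
  Thread.new do
    first = true
    chunks.each do |chunk|
      if first
        var.value = chunk
        first = false
      end
    end
    var.value = nil if first # empty stream: still resolve, with a nil sentinel
  rescue StandardError
    var.value = nil # failing stream: never leave the waiter hanging
  end
end
```

Whichever concrete shape the PR adopts, the waiter should be unblocked on all three paths: first chunk delivered, zero chunks, and an exception inside `each_chunk`.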
Note behavior change: Before tests explicitly controlled chunk consumption via fiber.resume. Now all chunks consumed immediately in background. This is fine but consider if you have integration tests verifying actual streaming behavior over time. PerformancePositive:
Consideration:
Suggestions
SummaryThis is a well-architected refactoring achieving its primary goal of enabling async gem code inside component streaming. Strengths:
Action Items (priority order):
Overall: Strong PR - just needs empty stream edge case addressed before merging. |
PR #2111 Review: Replace manual Fibers with Async gem primitivesOverall Assessment: ✅ Strong RefactoringThis PR successfully modernizes the concurrent streaming implementation by replacing manual Fiber management with async gem primitives. The changes are well-motivated, properly tested, and significantly simplify the codebase. ✅ Strengths1. Clear Problem & Solution
2. Code Quality Improvements
3. Excellent Test Coverage
4. Backward Compatibility
🔍 Code Quality Observations1. Error Handling - Good# Helper initialization check
if @async_barrier.nil?
raise ReactOnRails::Error,
"You must call stream_view_containing_react_components..."
end✅ Proper guard clause to catch misuse early 2. Resource Management - Goodbegin
drain_streams_concurrently(parent_task)
ensure
response.stream.close if close_stream_at_end
end✅ Ensures streams are properly closed even on errors 3. Cache Integration - ExcellentThe new Before (complex): wrapper_fiber = Fiber.new do
while (chunk = original_fiber.resume)
buffered_chunks << chunk
Fiber.yield(chunk)
end
Rails.cache.write(view_cache_key, buffered_chunks, ...)
endAfter (clean): on_complete: lambda { |chunks|
Rails.cache.write(view_cache_key, chunks, ...)
}
|
Code Review: Replace Manual Fibers with Async Gem PrimitivesGreat work on this refactoring! The shift from manual Fiber management to async gem primitives is a solid architectural improvement. ✅ Strengths1. Clear Architecture
2. Good Error Handling
3. Test Coverage
🔍 Potential Issues1. CRITICAL: Race Condition Risk in First Chunk DeliveryIn run_stream_inside_fiber, the code calls first_chunk_var.wait BEFORE the async task has started executing. This creates a potential race where if stream.each_chunk hasn't yielded the first chunk yet, wait might block indefinitely. Why this might work in tests but fail in production: Tests may have very fast synchronous streams. Production streams might have network delays or complex rendering. Recommendation: Verify that Async::Variable#wait blocks correctly until a value is set (according to async gem docs it should). Consider adding a timeout as a safeguard. 2. Error Handling: What if stream.each_chunk Raises?If each_chunk raises an exception:
Recommended fix: Add rescue block to ensure first_chunk_var is set even on error, and re-raise to propagate to barrier. 3. Memory Management: all_chunks ArrayFor large streams with many chunks, the all_chunks array accumulates all chunks in memory simultaneously when caching is enabled. Recommendation: Add documentation about memory implications of caching large components. Consider if there should be a configurable chunk limit. 4. Missing on_complete DocumentationThe on_complete callback is a new feature but lacks:
Recommendation: Add YARD docs with examples. 5. Testing Gap: Edge CasesGreat coverage overall, but missing:
📊 Performance ConsiderationsMemory Usage Increased:
Impact: For cached components, memory usage is now O(n) where n = total chunk size. Verdict: Performance tradeoff is acceptable but should be documented. ✅ Final VerdictApprove with minor fixes recommended Must Fix Before Merge:
Should Fix Before Merge:
Nice to Have:
Overall: Excellent refactoring that achieves its goals. The architecture is cleaner and enables future async props work. Issues identified are mostly edge cases and documentation gaps, not fundamental flaws. With suggested fixes, this PR will be solid. Great work! 🚀 |
PR Review: Replace Manual Fibers with Async Gem PrimitivesGreat work on this refactoring! This is a well-structured PR that successfully replaces manual Fiber management with async gem primitives. The code is cleaner, more maintainable, and enables the critical feature of running async code inside streaming components. ✅ Strengths1. Clear Architecture & Motivation
2. Significant Code Simplification
3. Unified Concurrency Model
4. Clean Callback Pattern
5. Comprehensive Test Updates
🔍 Code Quality ObservationsPotential Issues & Questions1. Race Condition in
|
Code Review: Async Gem Primitives for Component StreamingSummaryThis is a well-architected refactor that successfully replaces manual Fiber management with async gem primitives. The changes enable async code execution within streaming while maintaining all existing functionality. All CI checks are passing ✅ ✅ Strengths1. Clear Architecture Improvement
2. Proper Test Updates
3. Backwards Compatibility
4. Smart Callback DesignThe cache_aware_options = raw_options.merge(
on_complete: lambda { |chunks|
Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
}
)This is much cleaner than the previous wrapper fiber approach. 🔍 Potential Issues & Questions1. Require Statements Location
|
Code Review: Replace manual Fibers with Async gem primitivesThis is a well-architected refactor that successfully transitions from manual Fiber management to async gem primitives. Here's my detailed feedback: ✅ Strengths1. Clear Problem Statement & ArchitectureThe PR description excellently documents the problem ("Running scheduler on non-blocking fiber") and shows before/after architecture. This makes the change easy to review and understand. 2. Unified Concurrency ModelReplacing manual Fibers with 3. Code Simplification
4. Test Coverage UpdatesTests were properly updated to work with the new async primitives:
5. Cache Implementation ImprovementThe # Before: Complex wrapper fiber that buffers and yields chunks
wrapper_fiber = Fiber.new do
while (chunk = original_fiber.resume)
buffered_chunks << chunk
Fiber.yield(chunk)
end
Rails.cache.write(view_cache_key, buffered_chunks, raw_options[:cache_options] || {})
end
# After: Clean callback-based approach
on_complete: lambda { |chunks|
Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
}🔍 Observations & Questions1.
|
Code Review: Async Gem Primitives MigrationI've reviewed this PR that replaces manual Fiber management with Async gem primitives. Overall, this is a well-architected refactoring that achieves its stated goals. Here's my detailed feedback: ✅ Strengths1. Clear Problem & SolutionThe PR description excellently articulates:
2. Code Simplification
3. Excellent Test Coverage
4. Backward Compatibility PreservedAll existing functionality maintained:
5. Smart Caching IntegrationThe on_complete: lambda { |chunks|
Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
}This is cleaner than the previous wrapper fiber approach. 🔍 Potential Issues & Suggestions1. Error Handling & Resource Cleanup
|
Code Review: Replace manual Fibers with Async gem primitivesThis is an excellent architectural improvement that modernizes the concurrency model. The refactoring is well-motivated, thoroughly tested, and achieves significant code simplification. Strengths
Critical Issues to Fix Before Merge1. Error Handling in stream_view_containing_react_componentsProblem: render_to_string is called AFTER initializing @async_barrier but BEFORE begin/ensure. If it raises, async tasks may be orphaned. Fix: Move render_to_string inside begin block and add barrier cleanup to ensure: begin
template_string = render_to_string(template: template, **render_options)
response.stream.write(template_string)
drain_streams_concurrently(parent_task)
ensure
@async_barrier&.wait
@main_output_queue&.close
response.stream.close if close_stream_at_end
end2. First Chunk Variable Race in run_stream_inside_fiberProblem: If stream.each_chunk raises before setting first_chunk_var.value, the main thread will hang forever on first_chunk_var.wait Fix: Add rescue block to ensure variable always resolves: @async_barrier.async do
begin
# ... existing code ...
rescue => e
first_chunk_var.value = nil unless first_chunk_var.resolved?
raise
end
endRecommendations
Security & Performance
Verdict: ✅ Approve with Critical FixesGreat refactoring! Just address the two error handling issues to prevent hung requests, then this is ready to merge. 🤖 Generated with Claude Code |
Code Review: Replace manual Fibers with Async gem primitivesHi @AbanoubGhadban! Overall this is a solid refactoring that successfully migrates from manual Fiber management to Async gem primitives. ✅ Strengths
🔍 Potential Issues1.
|
Code Review - PR #2111: Replace manual Fibers with Async gem primitives✅ Overall AssessmentVerdict: LGTM with minor observations This is a well-architected refactoring that successfully replaces manual Fiber management with Async gem primitives. The code is cleaner, more maintainable, and critically enables async code execution within component streaming. All CI checks pass, and the test updates demonstrate thorough validation. 🎯 Strengths1. Architectural Improvement
2. Code Simplification
3. Excellent Test Coverage
4. Smart Callback DesignThe on_complete: lambda { |chunks|
Rails.cache.write(view_cache_key, chunks, raw_options[:cache_options] || {})
}This is much cleaner than the previous wrapper fiber approach.
|
| def run_stream_inside_fiber | ||
| if @rorp_rendering_fibers.nil? | ||
| def run_stream_inside_fiber(on_complete:) | ||
| require "async/variable" |
Any reason to do it only here and not at the beginning of the file?
Problem
We need to run async gem code inside the
`each_chunk` function for the upcoming React on Rails async props implementation. However, this raises the error "Running scheduler on non-blocking fiber". This occurs because
`each_chunk` executes within a manually created Fiber instead of a Fiber managed by the async gem. The async gem requires that any fiber running async code be created and managed by the gem itself.
Replace manual Fiber management with async gem primitives for a unified concurrency model:
Async::Barrier: Tracks all component streaming tasksAsync::Variable: Synchronizes first chunk deliveryAsync::LimitedQueue: Single output queue for all componentsArchitecture Changes
Before (Manual Fibers):
Problem: Fiber created manually, not by async gem → can't run async code inside
After (Async Primitives):
Solution: All fibers created by async gem → can safely run async code
Code Changes
1.
stream_view_containing_react_components(stream.rb)Changes:
Sync doblock@async_barrierand@main_output_queuebefore view rendering2.
run_stream_inside_fiber(helper)Changes:
Fiber.newwith@async_barrier.asyncAsync::Variablefor first chunk synchronization3.
drain_streams_concurrently(stream.rb)Changes:
barrier.wait→queue.close→dequeue loopbuild_producer_tasksandbuild_writer_taskmethodsBenefits
✅ Enables Async Props Implementation
Primary Goal: Can now run async gem code inside
each_chunkwithout errors✅ Unified Concurrency Model
All concurrency handled by async gem primitives - no manual Fiber management
✅ Simpler Code
✅ Same Functionality
LimitedQueuepreservedTesting
Why This Approach?
We explored several alternatives:
This is the minimal change that enables async code in streaming while maintaining all existing behavior.
🤖 Generated with Claude Code
Co-Authored-By: Claude [email protected]
Summary by CodeRabbit
New Features
Refactor
Tests
✏️ Tip: You can customize this high-level summary in your review settings.