28 changes: 26 additions & 2 deletions .github/workflows/pro-integration-tests.yml
@@ -230,16 +230,33 @@ jobs:
- name: Generate file-system based entrypoints
run: cd spec/dummy && bundle exec rake react_on_rails:generate_packs

- name: Run Pro Node renderer in background
# Install moreutils for the 'ts' command to add timestamps to logs
# This enables synchronized side-by-side log viewing with bin/view-synchronized-logs
- name: Install timestamp utility for log synchronization
run: sudo apt-get install -y moreutils

- name: Run Pro Node renderer in background with timestamped logging
run: |
cd spec/dummy
yarn run node-renderer &
mkdir -p tmp/ci-logs
yarn run node-renderer 2>&1 | ts '[%Y-%m-%d %H:%M:%.S]' | tee tmp/ci-logs/node-renderer.log &

- name: Run Rails server in background
run: |
cd spec/dummy
RAILS_ENV="test" rails server &

- name: Tail Rails test log with timestamps in background
run: |
cd spec/dummy
# Wait for log file to be created
timeout=10
while [ ! -f log/test.log ] && [ $timeout -gt 0 ]; do
sleep 0.5
timeout=$((timeout - 1))
done
tail -f log/test.log 2>&1 | ts '[%Y-%m-%d %H:%M:%.S]' > tmp/ci-logs/rails-server.log &

- name: Wait for Rails server to start
run: |
timeout=60
@@ -299,6 +316,13 @@ jobs:
name: pro-rspec-yarn-error-log
path: react_on_rails_pro/spec/dummy/yarn-error.log

- name: Store synchronized server logs
uses: actions/upload-artifact@v4
if: always()
with:
name: pro-rspec-synchronized-logs
path: react_on_rails_pro/spec/dummy/tmp/ci-logs/

# Playwright E2E tests with Redis service
dummy-app-node-renderer-e2e-tests:
needs:
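The ts '[%Y-%m-%d %H:%M:%.S]' pipeline in the new workflow steps above prefixes every log line with a bracketed timestamp, which is what lets the viewer script below interleave the Rails and Node logs chronologically. A minimal Ruby sketch of that idea, using two hypothetical log lines (the messages and times are illustrative, not taken from a real run):

    require "time"

    # Hypothetical lines as emitted through `ts '[%Y-%m-%d %H:%M:%.S]'` in the steps above.
    rails_line = "[2024-05-01 12:34:56.100000] Started GET \"/\" for 127.0.0.1"
    node_line  = "[2024-05-01 12:34:56.350000] Rendering request received"

    pattern = /^\[(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\] (.*)$/

    entries = [[rails_line, "RAILS"], [node_line, "NODE"]].filter_map do |line, source|
      next unless (m = line.match(pattern))

      { time: Time.parse(m[1]), content: m[2], source: source }
    end

    # Same ordering step the viewer applies before printing the two sources side by side.
    entries.sort_by { |e| e[:time] }.each do |e|
      puts "#{e[:time].strftime('%H:%M:%S.%3N')} [#{e[:source]}] #{e[:content]}"
    end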
243 changes: 243 additions & 0 deletions bin/view-synchronized-logs
@@ -0,0 +1,243 @@
#!/usr/bin/env ruby
# frozen_string_literal: true

require 'time'
require 'optparse'

# Script to view synchronized logs from CI runs side-by-side
# Usage: bin/view-synchronized-logs path/to/logs/directory
# bin/view-synchronized-logs --rails rails.log --node node.log

⚠️ Potential issue | 🔴 Critical

Missing execute permission on script file.

The script file doesn't have execute permission, which will prevent it from being run directly. Additionally, there are string literal style inconsistencies (lines 4-5).

Apply these fixes:

chmod +x bin/view-synchronized-logs

And update the string literals:

-require 'time'
-require 'optparse'
+require "time"
+require "optparse"
🧰 Tools
🪛 GitHub Actions: Lint JS and Ruby

[warning] 1-1: Lint/ScriptPermission: Script file view-synchronized-logs doesn't have execute permission.


[warning] 4-4: Style/StringLiterals: Prefer double-quoted strings unless you need single quotes to avoid extra backslashes for escaping.


[warning] 5-5: Style/StringLiterals: Prefer double-quoted strings unless you need single quotes to avoid extra backslashes for escaping.

🤖 Prompt for AI Agents
In bin/view-synchronized-logs around lines 1 to 10, the script is missing the
executable bit and has inconsistent string literal styles on lines 4-5; make the
file executable (chmod +x bin/view-synchronized-logs) and normalize the require
strings to the project style (e.g., change require 'time' and require 'optparse'
to use double quotes if the repo prefers double quotes or convert all to single
quotes to match project linting) so the script can be run directly and string
literals are consistent.

class SynchronizedLogViewer
TIMESTAMP_REGEX = /^\[(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\] (.*)$/

def initialize(rails_log_path, node_log_path, options = {})
@rails_log_path = rails_log_path
@node_log_path = node_log_path
@filter = options[:filter]
@time_window = options[:time_window] || 1.0 # seconds
@show_time_gaps = options[:show_time_gaps]
end

def view
rails_entries = parse_log_file(@rails_log_path, 'RAILS')
node_entries = parse_log_file(@node_log_path, 'NODE')

if rails_entries.empty? && node_entries.empty?
puts "No timestamped log entries found in either file."
return
end

display_synchronized_logs(rails_entries, node_entries)
end

private

def parse_log_file(path, source)
entries = []
return entries unless File.exist?(path)

current_timestamp = nil
current_lines = []

File.readlines(path).each do |line|
if line =~ TIMESTAMP_REGEX
# Save previous entry if exists
if current_timestamp && !current_lines.empty?
content = current_lines.join
entries << { time: current_timestamp, content: content, source: source } if matches_filter?(content)
end

# Start new entry
current_timestamp = Time.parse($1)
current_lines = [$2]
elsif current_timestamp
# Continuation of previous entry
current_lines << line
end
end

# Add final entry
if current_timestamp && !current_lines.empty?
content = current_lines.join
entries << { time: current_timestamp, content: content, source: source } if matches_filter?(content)
end

entries
rescue StandardError => e
warn "Error parsing #{path}: #{e.message}"
[]
end
🛠️ Refactor suggestion | 🟠 Major

Reduce cyclomatic complexity and avoid Perl-style backrefs.

The parse_log_file method has high cyclomatic (12/7) and perceived (13/10) complexity. Additionally, lines 52-53 use discouraged Perl-style backrefs ($1, $2) instead of Regexp.last_match.

Consider extracting helper methods to reduce complexity:

   def parse_log_file(path, source)
     entries = []
     return entries unless File.exist?(path)

     current_timestamp = nil
     current_lines = []

     File.readlines(path).each do |line|
       if line =~ TIMESTAMP_REGEX
-        # Save previous entry if exists
-        if current_timestamp && !current_lines.empty?
-          content = current_lines.join
-          entries << { time: current_timestamp, content: content, source: source } if matches_filter?(content)
-        end
+        save_entry(entries, current_timestamp, current_lines, source) if current_timestamp
 
         # Start new entry
-        current_timestamp = Time.parse($1)
-        current_lines = [$2]
+        current_timestamp = Time.parse(Regexp.last_match(1))
+        current_lines = [Regexp.last_match(2)]
       elsif current_timestamp
         # Continuation of previous entry
         current_lines << line
       end
     end

-    # Add final entry
-    if current_timestamp && !current_lines.empty?
-      content = current_lines.join
-      entries << { time: current_timestamp, content: content, source: source } if matches_filter?(content)
-    end
+    save_entry(entries, current_timestamp, current_lines, source) if current_timestamp
 
     entries
   rescue StandardError => e
     warn "Error parsing #{path}: #{e.message}"
     []
   end
+
+  def save_entry(entries, timestamp, lines, source)
+    return if lines.empty?
+    
+    content = lines.join
+    entries << { time: timestamp, content: content, source: source } if matches_filter?(content)
+  end
🧰 Tools
🪛 GitHub Actions: Lint JS and Ruby

[warning] 36-36: Metrics/CyclomaticComplexity: Cyclomatic complexity for parse_log_file is too high. [12/7]


[warning] 36-36: Metrics/PerceivedComplexity: Perceived complexity for parse_log_file is too high. [13/10]


[warning] 52-52: Style/PerlBackrefs: Prefer ::Regexp.last_match(1) over $1.


[warning] 53-53: Style/PerlBackrefs: Prefer ::Regexp.last_match(2) over $2.

🤖 Prompt for AI Agents
In bin/view-synchronized-logs around lines 36 to 70, parse_log_file is overly
complex and uses Perl-style backrefs ($1/$2); refactor by extracting small
helpers (e.g., parse_line_match(line) to return timestamp and body,
flush_current_entry(entries, current_timestamp, current_lines, source) to handle
pushing filtered entries) to reduce cyclomatic complexity and move
parsing/append logic out of the main loop, replace $1/$2 with Regexp.last_match
(or use named captures) to obtain the timestamp and message, and ensure early
returns and single-responsibility helpers so the main method becomes a simple
loop delegating matching, appending, and error handling.
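The prompt above also mentions named captures as an alternative to Regexp.last_match. A small illustrative sketch of what that variant of the pattern and match handling could look like (not part of the PR; the sample line is hypothetical):

    require "time"

    # Named-capture variant of TIMESTAMP_REGEX (illustrative only).
    pattern = /^\[(?<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\] (?<body>.*)$/

    line = "[2024-05-01 12:34:56.123456] Processing by PagesController#index"  # hypothetical
    if (match = pattern.match(line))
      timestamp = Time.parse(match[:time])  # named capture instead of $1 / Regexp.last_match(1)
      body      = match[:body]              # named capture instead of $2 / Regexp.last_match(2)
      puts "#{timestamp.strftime('%H:%M:%S.%3N')} #{body}"
    end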


def matches_filter?(content)
return true unless @filter
content.match?(@filter)
end

def display_synchronized_logs(rails_entries, node_entries)
all_entries = (rails_entries + node_entries).sort_by { |e| e[:time] }

if all_entries.empty?
puts "No entries to display (possibly filtered out)."
return
end

# Get terminal width
terminal_width = `tput cols`.to_i rescue 160
column_width = [(terminal_width - 5) / 2, 60].max

# Print header
print_header(column_width)

# Display entries
last_time = nil
all_entries.each do |entry|
# Show time gap if requested
if @show_time_gaps && last_time
gap = entry[:time] - last_time
if gap > @time_window
print_time_gap(gap, column_width)
end
end

print_entry(entry, column_width)
last_time = entry[:time]
end

# Print summary
print_summary(rails_entries, node_entries, all_entries)
end

def print_header(column_width)
separator = "=" * (column_width * 2 + 5)
puts separator
rails_header = "RAILS SERVER".center(column_width)
node_header = "NODE RENDERER".center(column_width)
puts "#{rails_header} | #{node_header}"
puts separator
end

def print_time_gap(gap, column_width)
gap_text = format("--- %.1f second gap ---", gap).center(column_width * 2 + 5)
puts "\e[90m#{gap_text}\e[0m" # Gray color
end

def print_entry(entry, column_width)
timestamp = entry[:time].strftime("%H:%M:%S.%3N")
content = entry[:content].strip
source = entry[:source]

# Truncate content if too long
max_content_length = column_width - 13 # Account for timestamp
if content.length > max_content_length
content = content[0...max_content_length - 3] + "..."
end

# Color codes
color = source == 'RAILS' ? "\e[34m" : "\e[32m" # Blue for Rails, Green for Node
reset = "\e[0m"
time_color = "\e[90m" # Gray for timestamp

# Format entry with timestamp
formatted_entry = "#{time_color}#{timestamp}#{reset} #{color}#{content}#{reset}"

# Print in appropriate column
if source == 'RAILS'
left_content = formatted_entry.ljust(column_width + 20) # +20 for ANSI codes
puts "#{left_content} |"
else
left_padding = " " * column_width
puts "#{left_padding} | #{formatted_entry}"
end
end

def print_summary(rails_entries, node_entries, all_entries)
puts "\n" + "=" * 80
puts "Summary:"
puts " Rails entries: #{rails_entries.size}"
puts " Node entries: #{node_entries.size}"
puts " Total entries: #{all_entries.size}"

if all_entries.any?
duration = all_entries.last[:time] - all_entries.first[:time]
puts " Time span: #{format('%.2f', duration)} seconds"
puts " First entry: #{all_entries.first[:time].strftime('%Y-%m-%d %H:%M:%S.%3N')}"
puts " Last entry: #{all_entries.last[:time].strftime('%Y-%m-%d %H:%M:%S.%3N')}"
end
end
end

# Parse command line options
options = {}
parser = OptionParser.new do |opts|
opts.banner = "Usage: bin/view-synchronized-logs [options] LOGS_DIRECTORY"
opts.separator " or: bin/view-synchronized-logs --rails RAILS_LOG --node NODE_LOG"
opts.separator ""
opts.separator "View synchronized logs from CI runs side-by-side in chronological order."
opts.separator ""
opts.separator "Options:"

opts.on("-r", "--rails FILE", "Path to Rails server log file") do |file|
options[:rails] = file
end

opts.on("-n", "--node FILE", "Path to Node renderer log file") do |file|
options[:node] = file
end

opts.on("-f", "--filter REGEX", "Filter entries matching regex pattern") do |pattern|
options[:filter] = Regexp.new(pattern, Regexp::IGNORECASE)
end

opts.on("-g", "--gaps", "Show time gaps between entries") do
options[:show_time_gaps] = true
end

opts.on("-w", "--window SECONDS", Float, "Time window for gap detection (default: 1.0)") do |seconds|
options[:time_window] = seconds
end

opts.on("-h", "--help", "Show this help message") do
puts opts
exit
end
end

parser.parse!

# Determine log file paths
rails_log = options[:rails]
node_log = options[:node]

if !rails_log || !node_log
# Try to use directory argument
if ARGV.empty?
puts "Error: Please provide either a logs directory or specify both --rails and --node options"
puts parser
exit 1
end

logs_dir = ARGV[0]
unless Dir.exist?(logs_dir)
puts "Error: Directory not found: #{logs_dir}"
exit 1
end

rails_log = File.join(logs_dir, 'rails-server.log')
node_log = File.join(logs_dir, 'node-renderer.log')
end

# Validate files exist
unless File.exist?(rails_log)
puts "Error: Rails log file not found: #{rails_log}"
exit 1
end

unless File.exist?(node_log)
puts "Error: Node log file not found: #{node_log}"
exit 1
end

# Run viewer
viewer = SynchronizedLogViewer.new(rails_log, node_log, options)
viewer.view
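For reference, the same viewer can also be driven programmatically once the SynchronizedLogViewer class above is loaded (a sketch; the paths mirror the tmp/ci-logs layout used in the workflow, and the filter value is illustrative):

    # Assumes SynchronizedLogViewer (defined above) is already loaded in the process.
    viewer = SynchronizedLogViewer.new(
      "tmp/ci-logs/rails-server.log",
      "tmp/ci-logs/node-renderer.log",
      {
        filter: /error|exception/i, # illustrative: only show entries mentioning errors
        show_time_gaps: true,       # same as passing --gaps on the CLI
        time_window: 2.0            # same as --window 2.0
      }
    )
    viewer.view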
2 changes: 1 addition & 1 deletion react_on_rails_pro/spec/dummy/client/node-renderer.js
@@ -45,7 +45,7 @@ const config = {
allWorkersRestartInterval: (env.CI ? 2 : env.RENDERER_ALL_WORKERS_RESTART_INTERVAL) || 10,

// time in minutes between each worker restarting when restarting all workers
delayBetweenIndividualWorkerRestarts: env.CI ? 0.01 : 1,
delayBetweenIndividualWorkerRestarts: 1,

// If set to true, `supportModules` enables the server-bundle code to call a default set of NodeJS modules
// that get added to the VM context: { Buffer, process, setTimeout, setInterval, clearTimeout, clearInterval }.