feat: global AI concurrency limiter (max_ai_concurrency) #362
Conversation
The normalizeNodeEval function replaced real newlines with literal \n in node -e script arguments. This broke multiline scripts loaded from YAML exec blocks (e.g. slack-send-dm workflow) because Node.js received the entire script as a single line with literal \n sequences instead of actual newlines, causing SyntaxError: Invalid or unexpected token. Commands are executed via child_process.exec() which passes them through /bin/sh -c, so multiline content inside quoted arguments is preserved correctly by the shell. The normalization was unnecessary and harmful. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
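For context, a minimal sketch of why the multiline script survives, assuming a POSIX shell (/bin/sh -c, as exec() uses): single quotes around the -e argument preserve embedded newlines. The script body below is made up for illustration and is not the slack-send-dm workflow.

```typescript
// Minimal sketch, assuming a POSIX shell; the script content is illustrative.
import { exec } from 'node:child_process';

const script = `
const words = ["multiline", "scripts", "work"];
console.log(words.join(" "));
`;

// Single quotes keep the embedded newlines intact for node. If the newlines
// were rewritten to literal "\n" sequences, node would receive one long line
// and fail with "SyntaxError: Invalid or unexpected token".
exec(`node -e '${script}'`, (err, stdout) => {
  if (err) throw err;
  console.log(stdout.trim()); // -> multiline scripts work
});
```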
Adds max_ai_concurrency config option that creates a shared DelegationManager to gate concurrent AI API calls across all checks. The limiter is created in buildEngineContextForRun, propagated through ai-check-provider and AIReviewService to ProbeAgent instances. Includes 5 unit tests and DelegationManager mock for @probelabs/probe. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
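As a rough illustration of the acquire/release pattern the shared limiter provides, here is a semaphore-style sketch. It is not the DelegationManager API from @probelabs/probe, and callAIProvider is a hypothetical placeholder.

```typescript
// Illustrative semaphore, not the actual DelegationManager from @probelabs/probe.
class ConcurrencyLimiter {
  private active = 0;
  private waiters: Array<() => void> = [];

  constructor(private readonly max: number) {}

  async acquire(): Promise<void> {
    if (this.active < this.max) {
      this.active++;
      return;
    }
    // Wait until a running call hands over its slot.
    await new Promise<void>(resolve => this.waiters.push(resolve));
  }

  release(): void {
    const next = this.waiters.shift();
    if (next) {
      next(); // pass the slot straight to the next waiter; `active` is unchanged
    } else {
      this.active--;
    }
  }
}

// Hypothetical placeholder for the real AI provider call.
async function callAIProvider(prompt: string): Promise<string> {
  return `response for: ${prompt}`;
}

// All checks in a run share one limiter, so however many checks start,
// at most `max` AI requests are in flight at any moment.
const limiter = new ConcurrencyLimiter(3); // e.g. max_ai_concurrency: 3

async function runAICheck(prompt: string): Promise<string> {
  await limiter.acquire();
  try {
    return await callAIProvider(prompt);
  } finally {
    limiter.release();
  }
}
```

The try/finally ensures a slot is released even when the provider call throws, which is the main failure mode to watch for when gating a shared resource.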
Force-pushed from 19899fc to 50abdc5
PR Overview: Global AI Concurrency Limiter

Summary

This PR introduces a global AI concurrency limiter (max_ai_concurrency) that caps concurrent AI API calls across all checks in a run.

Files Changed
Total: 311 additions, 454 deletions across 13 files

Architecture & Impact Assessment

What This PR Accomplishes
Key Technical Changes
Affected System Components

graph TD
A[VisorConfig] -->|max_ai_concurrency| B[buildEngineContextForRun]
B -->|creates| C[DelegationManager]
C -->|sharedConcurrencyLimiter| D[EngineContext]
D -->|_parentContext| E[AICheckProvider]
E -->|concurrencyLimiter| F[AIReviewService]
F -->|concurrencyLimiter| G[ProbeAgent]
G -->|acquire/release| H[AI API Calls]
style C fill:#e1f5ff
style D fill:#fff4e1
style E fill:#f0e1ff
style F fill:#ffe1f0
The concurrency limiter follows a dependency injection pattern: it is created once in buildEngineContextForRun and passed down through AICheckProvider and AIReviewService to each ProbeAgent, rather than each check constructing its own limiter. A sketch of this hand-off follows below.
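The sketch assumes simplified type shapes: the names (buildEngineContextForRun, sharedConcurrencyLimiter, _parentContext, AICheckProvider, AIReviewService) come from the PR, but the interfaces and the createLimiter factory are illustrative.

```typescript
// Simplified sketch of the dependency injection described above; shapes are assumptions.
interface ConcurrencyLimiter {
  acquire(): Promise<void>;
  release(): void;
}

interface EngineContext {
  sharedConcurrencyLimiter?: ConcurrencyLimiter; // created once per run
}

// Stand-in for the shared DelegationManager construction (hypothetical factory).
declare function createLimiter(max: number): ConcurrencyLimiter;

// The limiter is only created when max_ai_concurrency is configured.
function buildEngineContextForRun(config: { max_ai_concurrency?: number }): EngineContext {
  return {
    sharedConcurrencyLimiter: config.max_ai_concurrency
      ? createLimiter(config.max_ai_concurrency)
      : undefined,
  };
}

// The review service hands the same instance to every agent it spawns,
// so every check in the run draws from one pool of AI-call slots.
class AIReviewService {
  constructor(private readonly concurrencyLimiter?: ConcurrencyLimiter) {}
}

// The provider reads the limiter from its parent context and passes it down.
class AICheckProvider {
  constructor(private readonly _parentContext: EngineContext) {}

  createReviewService(): AIReviewService {
    return new AIReviewService(this._parentContext.sharedConcurrencyLimiter);
  }
}
```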
Scope Discovery & Context Expansion

Direct Impact
Related Files (Inferred)

Based on the changes, these files are likely related but not modified:
Potential Edge Cases
Testing Coverage

✅ Well-covered:
Reviewer Guidance

Key Areas to Review
Potential Concerns
Configuration Example

# .visor.yaml
max_parallelism: 10    # Run up to 10 checks in parallel
max_ai_concurrency: 3  # But only 3 concurrent AI API calls

checks:
  security-review:
    type: ai
    prompt: security
  code-review:
    type: ai
    prompt: review
  # ... 8 more AI checks

# All 10 checks can start, but only 3 will make AI calls at once

Metadata
Powered by Visor from Probelabs. Last updated: 2026-02-15T19:15:02.349Z | Triggered by: pr_updated | Commit: ce33a3c
✅ Security Check Passed

No security issues found – changes LGTM.

Architecture Issues (8)
Performance Issues (1)
Quality Issues (1)
Powered by Visor from Probelabs. Last updated: 2026-02-15T19:15:05.348Z | Triggered by: pr_updated | Commit: ce33a3c
Update @probelabs/probe to v0.6.0-rc232. Fix the emit-files test command that broke after removing normalizeNodeEval: the JS template literal \n was producing an actual newline, breaking the node -e command and the Liquid split filter. Use \\n so the strings contain a literal \n for node and LiquidJS to interpret. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
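A tiny sketch of the escaping difference (the strings and command are made up, not the actual emit-files test): in a JS template literal, \n is already a newline character, while \\n keeps the two characters backslash and n for node or LiquidJS to interpret later.

```typescript
// Illustrative only; not the actual emit-files test command.

// "\n" inside a template literal is a real newline, so the embedded newline
// lands inside node's double-quoted string and the -e script becomes invalid.
const broken = `node -e 'console.log("a\nb".split("\n").length)'`;

// "\\n" keeps a literal backslash-n, which node (or LiquidJS's split filter)
// interprets itself when it parses the string.
const fixed = `node -e 'console.log("a\\nb".split("\\n").length)'`;

console.log(broken.includes("\n")); // true  - the command string spans multiple lines
console.log(fixed.includes("\n"));  // false - only the characters "\" and "n" are present
```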
….visor.yaml Set max_parallelism: 3 and max_ai_concurrency: 3 in the active default config. Remove defaults/.visor.yaml, which was a stale duplicate never loaded by the code. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Summary
- max_ai_concurrency config option added to VisorConfig that caps concurrent AI API calls across all checks in a run
- Shared DelegationManager created in buildEngineContextForRun when max_ai_concurrency is set
- Limiter propagated ai-check-provider → AIReviewService → ProbeAgent via _parentContext.sharedConcurrencyLimiter
- DelegationManager added to the @probelabs/probe mock for test support

Test plan
- npx jest tests/unit/concurrency-limiter.test.ts: 5/5 tests pass

🤖 Generated with Claude Code