
Conversation

@Milly (Contributor) commented Oct 21, 2025

Summary

  • Add accumulate() function to the batch module that automatically batches multiple RPC calls
  • Support parallel execution with Promise.all() while maintaining automatic batching
  • Add AccumulateCancelledError for proper error handling in parallel execution scenarios

Background

The existing batch() and collect() functions require sequential execution and manual batching. The new accumulate() function enables more natural async/await patterns with automatic batching during microtask processing.

This function was originally developed in a separate repository and has been published on jsr.io, with proven usage in various Denops plugins. Given its maturity and utility, we're now integrating it into the standard library.

Example Usage

// Import paths shown for illustration; `denops` is the Denops instance
// passed to the plugin entry point.
import { accumulate } from "jsr:@denops/std/batch";
import * as fn from "jsr:@denops/std/function";

const results = await accumulate(denops, async (denops) => {
  const lines = await fn.getline(denops, 1, "$");
  return await Promise.all(lines.map(async (line, index) => {
    const keyword = await fn.matchstr(denops, line, "\\k\\+");
    const len = await fn.len(denops, keyword);
    return { lnum: index + 1, keyword, len };
  }));
});

In this example, only the following three RPCs are issued, because all calls awaited within the same microtask turn are accumulated into a single denops.batch() round trip (a manual collect() equivalent is sketched after the list):

  1. RPC call to getline
  2. Multiple matchstr calls in one RPC
  3. Multiple len calls in one RPC
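For comparison, a roughly equivalent manual version using the existing collect() might look like the sketch below (import paths and the exact collect() typing are assumed here, not taken from this PR; `denops` is the plugin's Denops instance):

import { collect } from "jsr:@denops/std/batch";
import * as fn from "jsr:@denops/std/function";

// RPC 1: fetch all lines.
const lines = await fn.getline(denops, 1, "$");
// RPC 2: one matchstr() per line, collected into a single round trip.
const keywords = await collect(
  denops,
  (denops) => lines.map((line) => fn.matchstr(denops, line, "\\k\\+")),
);
// RPC 3: one len() per keyword, collected into a single round trip.
const lens = await collect(
  denops,
  (denops) => keywords.map((keyword) => fn.len(denops, keyword)),
);
const results = lines.map((_, index) => ({
  lnum: index + 1,
  keyword: keywords[index],
  len: lens[index],
}));

accumulate() lets the same data flow be written as straight-line async code while still producing the same three RPCs.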

Testing

  • Comprehensive test coverage including parallel execution scenarios
  • Error handling and cancellation behavior tested
  • Integration with existing batch() and collect() functions verified

Breaking Changes

None. This is a purely additive change.

Related

Additional Context

  • A rename to gather() was suggested by AI but rejected because:
    • Historical conflict: a function with this name existed in earlier denops-std versions.
    • Breaking change: the function is already in use as accumulate() in some plugin libraries.

Summary by CodeRabbit

  • New Features

    • Added an accumulate utility to batch multiple RPC calls within Denops.
    • Added an AccumulateCancelledError to represent cancelled batched operations.
    • Batch utilities are now publicly re-exported for easier consumption.
  • Tests

    • Added a comprehensive test suite covering accumulation semantics, parallel/sequential behavior, error propagation, cancellation, and public API surface.
  • Chores

    • Updated project config to include an async utilities import alias.

@coderabbitai bot commented Oct 21, 2025

Walkthrough

Adds an accumulate batching helper for Denops RPCs, a cancellation error type, tests, public exports, and an import-map alias for std/async. Provides AccumulateHelper with guarded batched call methods and an exported accumulate(denops, executor) API.

Changes

  • Core accumulate implementation (batch/accumulate.ts)
    New module implementing AccumulateHelper that enqueues RPC Calls, resolves them via deferred ticks using denops.batch, exposes guarded call, batch, cmd, eval, and dispatch methods, normalizes results/errors, and exports accumulate(denops, executor). (A simplified sketch of this pattern follows the list.)
  • Cancellation error (batch/error.ts)
    New AccumulateCancelledError and AccumulateCancelledErrorOptions to represent cancellation of pending batched calls; carries an optional calls payload.
  • Test suite (batch/accumulate_test.ts)
    New comprehensive tests covering basic resolutions, collections, nested accumulate/batch/collect behavior, parallel/sequential flows, many error paths (executor errors, denops.batch failures, cancellations), API surface guards, and stack/trace expectations.
  • Module exports and docs (batch/mod.ts)
    Re-exports the new accumulate and error modules and updates the example usage to import the accumulate utility.
  • Import map / config (deno.jsonc)
    Adds the import alias @std/async (version ^1.0.15) to the import map for async utilities used by the accumulate implementation.
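The core idea behind the helper is to enqueue calls, return a promise per call, and flush everything queued in the same turn through a single denops.batch(). The following is a simplified, illustrative sketch of that pattern, not the actual batch/accumulate.ts implementation (the import path and all names other than denops.batch are assumptions):

import type { Denops } from "jsr:@denops/std";

type Call = [fn: string, ...args: unknown[]];
type Pending = {
  call: Call;
  resolve: (value: unknown) => void;
  reject: (reason: unknown) => void;
};

class MiniAccumulateHelper {
  #denops: Denops;
  #pending: Pending[] = [];
  #scheduled = false;

  constructor(denops: Denops) {
    this.#denops = denops;
  }

  call(fn: string, ...args: unknown[]): Promise<unknown> {
    return new Promise((resolve, reject) => {
      this.#pending.push({ call: [fn, ...args], resolve, reject });
      if (!this.#scheduled) {
        this.#scheduled = true;
        // Defer the flush so sibling awaits (e.g. inside Promise.all)
        // can enqueue their calls in the same turn first.
        queueMicrotask(() => this.#flush());
      }
    });
  }

  async #flush(): Promise<void> {
    const batch = this.#pending.splice(0);
    this.#scheduled = false;
    try {
      // One RPC round trip for everything queued in this turn.
      const results = await this.#denops.batch(...batch.map((p) => p.call));
      results.forEach((result, i) => batch[i].resolve(result));
    } catch (err) {
      // The real implementation distinguishes partial BatchError results
      // from calls cancelled by the failure (AccumulateCancelledError);
      // this sketch simply rejects all pending promises.
      batch.forEach((p) => p.reject(err));
    }
  }
}

The real module also layers batch, cmd, eval, and dispatch on top of the same queue and guards against use after the accumulate() block has finished.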

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Accumulate as accumulate()
    participant Helper as AccumulateHelper
    participant Scheduler as nextTick
    participant DenopsBatch as denops.batch()

    User->>Accumulate: accumulate(denops, executor)
    Accumulate->>Helper: new Helper(denops)
    Accumulate->>User: run executor(helper)

    loop executor schedules batched calls
        User->>Helper: helper.call / .batch / .cmd / .eval / .dispatch
        Helper->>Helper: enqueue Call, return Promise
        Helper->>Scheduler: schedule `#resolvePendingCalls`
    end

    Scheduler->>Helper: `#resolvePendingCalls`
    Helper->>DenopsBatch: denops.batch(enqueued calls)
    DenopsBatch-->>Helper: results[] or BatchError

    alt success
        Helper->>Helper: map results, resolve Promises
        Helper-->>User: Promises resolved
    else batch error / cancellation
        Helper->>Helper: wrap/convert errors (BatchError | AccumulateCancelledError)
        Helper-->>User: Promises rejected
    end

    User->>Accumulate: executor returns
    Accumulate->>Helper: close() / cleanup
    Accumulate-->>User: final awaited result

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

  • Areas needing extra attention:
    • Error wrapping/propagation logic in accumulate.ts (BatchError vs generic errors vs AccumulateCancelledError).
    • Promise resolution queueing and nextTick scheduling (#waitResolved / #resolvePendingCalls).
    • Tests asserting stack/trace contents and cancellation semantics in batch/accumulate_test.ts.
    • Public API guards preventing helper usage outside the accumulate() block (a sketch of the idea follows).
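On that last point, conceptually the guard is just a closed flag that accumulate() sets once the executor settles, so any later use of a captured helper fails fast instead of being silently batched. A hypothetical sketch (names and the exact error are assumptions, not the PR's code):

class GuardedHelper {
  #closed = false;

  close(): void {
    // Called by accumulate() after the executor has settled.
    this.#closed = true;
  }

  call(fn: string, ...args: unknown[]): Promise<unknown> {
    if (this.#closed) {
      // Reject usage outside the accumulate() block.
      return Promise.reject(
        new Error(`helper is not available outside of accumulate(): ${fn}`),
      );
    }
    // ... enqueue [fn, ...args] and return the pending promise here ...
    return Promise.resolve(undefined);
  }
}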

Poem

🐰 I queued my hops in tidy rows,

Batched each call where soft wind blows.
If one fell down, I marked the rest—
Cancelled gently, laid to rest.
Hops resolved, the rabbit knows.

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check (Passed): check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check (Passed): the title accurately describes the main change, introducing an accumulate() function for automatic RPC batching with parallel execution support, which aligns with the primary objective and implementation of the PR.
  • Docstring Coverage (Passed): no functions found in the changed files to evaluate docstring coverage; check skipped.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c4da175 and 5d6645e.

📒 Files selected for processing (1)
  • batch/mod.ts (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • batch/mod.ts
⏰ Checks skipped due to the 90000ms timeout (6); you can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms).
  • GitHub Check: test (windows-latest, 2.x, v9.1.1646, v0.11.3)
  • GitHub Check: test (ubuntu-latest, 2.x, v9.1.1646, v0.11.3)
  • GitHub Check: test (ubuntu-latest, ~2.3, v9.1.1646, v0.11.3)
  • GitHub Check: test (macos-latest, 2.x, v9.1.1646, v0.11.3)
  • GitHub Check: test (windows-latest, ~2.3, v9.1.1646, v0.11.3)
  • GitHub Check: test (macos-latest, ~2.3, v9.1.1646, v0.11.3)


Add accumulate() function that automatically batches multiple RPC calls
that occur at the same timing during microtask processing. This enables
parallel RPC execution with automatic batching and proper error handling.
@codecov bot commented Oct 21, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 85.73%. Comparing base (d8bfd9d) to head (5d6645e).
⚠️ Report is 3 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #297      +/-   ##
==========================================
+ Coverage   84.82%   85.73%   +0.91%     
==========================================
  Files          64       66       +2     
  Lines        2879     3064     +185     
  Branches      281      306      +25     
==========================================
+ Hits         2442     2627     +185     
  Misses        435      435              
  Partials        2        2              



@coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (3)
batch/accumulate.ts (3)

1-1: Prefer queueMicrotask for portability (Deno-first) over node:process nextTick.

Using node:process couples to Node polyfills and has different ordering vs Promise microtasks. queueMicrotask keeps semantics simple and avoids Node dependency.

Apply this diff:

-import { nextTick } from "node:process";
+// Prefer runtime-agnostic microtask scheduling
+const schedule = queueMicrotask;
...
-    nextTick(() => {
+    schedule(() => {
       if (end === this.#calls.length) {
         this.#resolvePendingCalls();
       }
     });

Also applies to: 139-144


19-22: Consider std/async deferred() instead of Promise.withResolvers() for broader compatibility.

Some environments lag Promise.withResolvers(). std/async’s deferred is stable across Deno versions.

-  readonly #disposer = Promise.withResolvers<void>();
+  // Using std/async deferred for compatibility
+  readonly #disposer = deferred<void>();
...
-  #resolvedWaiter = Promise.withResolvers<void>();
+  #resolvedWaiter = deferred<void>();
...
-    const { resolve } = this.#resolvedWaiter;
-    this.#resolvedWaiter = Promise.withResolvers();
+    const { resolve } = this.#resolvedWaiter;
+    this.#resolvedWaiter = deferred<void>();

Add import:

+import { deferred } from "@std/async/deferred";

Also applies to: 160-162


84-109: Batch error plumbing is correct; minor message polish (optional).

Current repr uses the first fn and total count; optional: include the index at which failure occurred for quicker triage.

-        const [[fn]] = calls;
-        const repr = `[['${fn}', ...], ... total ${calls.length} calls]`;
+        const [[fn]] = calls;
+        const repr = `[['${fn}', ...], ... total ${calls.length} calls] (failed at index ${errorIndex})`;
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d8bfd9d and c4da175.

📒 Files selected for processing (5)
  • batch/accumulate.ts (1 hunks)
  • batch/accumulate_test.ts (1 hunks)
  • batch/error.ts (1 hunks)
  • batch/mod.ts (2 hunks)
  • deno.jsonc (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
batch/error.ts (1)
batch/accumulate.ts (2)
  • calls (135-154)
  • calls (169-191)
batch/accumulate_test.ts (3)
batch/accumulate.ts (5)
  • calls (135-154)
  • calls (169-191)
  • accumulate (254-264)
  • batch (84-109)
  • TypeError (127-133)
batch/collect.ts (1)
  • collect (138-151)
batch/error.ts (1)
  • AccumulateCancelledError (22-43)
batch/accumulate.ts (1)
batch/error.ts (1)
  • AccumulateCancelledError (22-43)
🔇 Additional comments (8)
deno.jsonc (1)

47-47: Add std/async alias — LGTM.

Alignment with tests (delay) and future use is fine. No action needed.

batch/error.ts (1)

1-43: Error type design looks solid.

Clear payload (calls) and stable name via static initializer. Good fit for cancellation signaling.

batch/mod.ts (1)

6-8: Public surface expansion and docs — LGTM.

Re-exports and example are coherent with accumulate() usage.

Also applies to: 46-49

batch/accumulate.ts (2)

61-83: Error propagation from call() reads clean.

Good separation of error/cancel/unknown cases; cancellation message includes call repr and carries cause/calls.


169-191: Edge-cases handled well in #resolveCalls().

  • BatchError path composes results + error + cancels correctly.
  • Non-BatchError path maps to “unknown” errors with cause.
batch/accumulate_test.ts (3)

320-346: Nested accumulate/batch/collect scenarios — great coverage.

These steps validate interop and batching boundaries thoroughly.

Also applies to: 347-369, 370-394


425-437: Outside-scope rejection and concurrent disposal behavior — excellent.

Covers disposer race, ensures no stray denops.batch calls. Nicely prevents unhandled rejections.

Also applies to: 438-466, 467-495, 496-543


1014-1021: Review comment is incorrect — code already rejects as intended.

The review misunderstands how resolvesNext from @std/testing/mock works. When resolvesNext receives an Error object in the iterable, it automatically throws/rejects that Error when the stubbed function is awaited.

At lines 1014-1021 and 691-697, resolvesNext<unknown[]>([underlyingError]) with underlyingError = new Error("Network error") is correct and will reject as expected. The test assertions (e.g., assertRejects(() => actual, Error, "Network error") at line 1032) confirm the intended rejection behavior occurs.

The pattern difference at line 730—using () => Promise.reject(underlyingError) instead—exists because that test uses a non-Error value (42), which resolvesNext would not automatically reject.

No changes are needed.

Likely an incorrect or invalid review comment.
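For readers unfamiliar with resolvesNext, a minimal standalone illustration of the rejection behavior described above (assuming @std/testing/mock and @std/assert are available):

import { resolvesNext } from "@std/testing/mock";
import { assertRejects } from "@std/assert";

const underlyingError = new Error("Network error");
// Error instances in the iterable are rejected by the produced function,
// not resolved as plain values.
const fake = resolvesNext<unknown[]>([underlyingError]);
await assertRejects(() => fake(), Error, "Network error");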

@Milly requested review from Shougo and lambdalisue on October 24, 2025 09:43
@lambdalisue (Member) left a comment

How should we distinguish this from the existing ones?
Is it something that replaces them, or is it meant to be used in a different context?
If it’s something that replaces the existing ones, we don’t need to delete them, but I’d like to mark them as deprecated.

@Milly (Contributor, Author) commented Nov 1, 2025

accumulate() is not a replacement for batch() and collect(), but rather a complementary function for different use cases.

Each serves a specific purpose:

  • batch() - Manual batching for side-effects only (commands, settings). Most efficient when return values aren't needed.
  • collect() - Simple parallel value collection with type-safe returns. Cleanest API for fetching multiple fixed values at once.
  • accumulate() - Automatic batching for complex async flows with loops and conditions. Writes natural async code while automatically optimizing RPC calls in the background.

They complement each other nicely - developers can choose based on their specific needs (a short illustration follows below). I don't think we need to deprecate the existing functions. What do you think?
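For illustration (a sketch only; import paths are assumed and the snippet is not part of this PR; `denops` is the plugin's Denops instance), the choice roughly looks like this:

import { accumulate, batch } from "jsr:@denops/std/batch";
import * as fn from "jsr:@denops/std/function";

// batch(): side effects only, no return values needed.
await batch(denops, async (denops) => {
  await denops.cmd("setlocal nomodifiable");
  await denops.cmd("redraw");
});

// accumulate(): value-dependent flow with loops and conditions,
// still batched automatically per microtask turn.
const lengths = await accumulate(denops, async (denops) => {
  const lines = await fn.getline(denops, 1, "$");
  return await Promise.all(lines.map((line) => fn.strlen(denops, line)));
});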

@Milly requested a review from lambdalisue on November 1, 2025 18:41
@lambdalisue (Member) commented

> They complement each other nicely - developers can choose based on their specific needs. I don't think we need to deprecate the existing functions. What do you think?

OK. Then, we should have clear documentation for that difference.

@Milly (Contributor, Author) commented Nov 15, 2025

> > They complement each other nicely - developers can choose based on their specific needs. I don't think we need to deprecate the existing functions. What do you think?
>
> OK. Then, we should have clear documentation for that difference.

Documented the difference in the module documentation (JSDoc in batch/mod.ts).

@lambdalisue (Member) left a comment

LGTM

@lambdalisue merged commit f0eb284 into main on Nov 15, 2025
9 of 10 checks passed
@lambdalisue deleted the batch-accumulate branch on November 15, 2025 10:50