Description
Problem
When calling getLogs over a large block range, viem internally subdivides the range into smaller RPC requests (which is great), but it accumulates all parsed results into a single in-memory array before returning. On high-throughput chains like Arbitrum, this causes a JavaScript heap out-of-memory crash during JSON.parse of the response, even with --max-old-space-size=8192.
For context, indexing Morpho Blue events on Arbitrum (~140M blocks) consistently OOMs at ~66% progress, with the V8 heap at ~4GB.
Expected behavior
A way to process logs incrementally (e.g. via a callback, async iterator, or built-in chunking) so that results from previous sub-requests can be garbage collected before the next batch is parsed.
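As one possible shape, a streaming variant could yield each sub-request's logs as soon as they are parsed, so earlier batches become collectible while later ones are still being fetched. This is a hypothetical sketch, not current viem API: getLogsStream and its injected fetchChunk parameter are invented names, and the chunk splitting here stands in for the subdivision viem already does internally.

```typescript
// Hypothetical API sketch (not in viem today): yield each chunk's logs as
// they arrive, so the caller never holds more than one batch in memory.
// fetchChunk is injected to keep the sketch self-contained; in viem it would
// be the internal per-chunk eth_getLogs request.
export async function* getLogsStream<T>(
  fromBlock: bigint,
  toBlock: bigint,
  step: bigint,
  fetchChunk: (from: bigint, to: bigint) => Promise<T[]>,
): AsyncGenerator<T[], void, void> {
  for (let from = fromBlock; from <= toBlock; from += step) {
    // Clamp the last chunk to toBlock (ranges are inclusive on both ends).
    const to = from + step - 1n < toBlock ? from + step - 1n : toBlock
    yield await fetchChunk(from, to)
  }
}
```

A consumer would then write `for await (const batch of getLogsStream(...)) { ... }`, and each batch can be garbage collected once the loop body finishes with it.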
Workaround
Manually chunking the block range in the caller and calling getLogs per chunk, which allows GC between iterations. This works but shifts the responsibility to every consumer.
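For reference, the manual workaround looks roughly like the following. This is an illustrative sketch, not the indexer's actual code: blockRanges and the 10,000-block default chunk size are assumptions, and LogsClient is a minimal structural stand-in for the slice of viem's PublicClient being used (in real code, a client from createPublicClient).

```typescript
// Yield inclusive [from, to] sub-ranges covering [fromBlock, toBlock].
export function* blockRanges(
  fromBlock: bigint,
  toBlock: bigint,
  step: bigint,
): Generator<readonly [bigint, bigint]> {
  for (let from = fromBlock; from <= toBlock; from += step) {
    const to = from + step - 1n < toBlock ? from + step - 1n : toBlock
    yield [from, to] as const
  }
}

// Minimal structural stand-in for the part of viem's PublicClient used here.
type LogsClient = {
  getLogs(args: { fromBlock: bigint; toBlock: bigint }): Promise<unknown[]>
}

// Fetch and process one chunk per iteration; the previous chunk's `logs`
// array goes out of scope on each pass, so V8 can collect it before the
// next response is parsed.
export async function indexLogs(
  client: LogsClient,
  fromBlock: bigint,
  toBlock: bigint,
  chunkSize = 10_000n, // illustrative; tune per chain and provider limits
): Promise<void> {
  for (const [from, to] of blockRanges(fromBlock, toBlock, chunkSize)) {
    const logs = await client.getLogs({ fromBlock: from, toBlock: to })
    // ...persist or process `logs` here, then let it fall out of scope
    void logs
  }
}
```

The downside, as noted above, is that every consumer has to reimplement this loop instead of viem handling it once internally.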