getLogs accumulates all results in memory, causing OOM on large block ranges #1

@Jean-Grimal

Description

Problem

When calling getLogs over a large block range, viem internally subdivides the range into smaller RPC requests (which is great), but accumulates all parsed results into a single in-memory array before returning. On high-throughput chains like Arbitrum, this causes a JavaScript heap out of memory crash during JSON.parse of the response, even with --max-old-space-size=8192.

For context, indexing Morpho Blue events on Arbitrum (~140M blocks) consistently OOMs at ~66% progress, with the V8 heap at ~4GB.

Expected behavior

A way to process logs incrementally (e.g. via a callback, async iterator, or built-in chunking) so that results from previous sub-requests can be garbage collected before the next batch is parsed.
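One possible shape for such an API is an async iterator that yields each sub-request's logs as they arrive, so the consumer never holds more than one batch. This is a hypothetical sketch, not an existing viem API; `fetchLogs` stands in for a single bounded `eth_getLogs` call:

```typescript
// Hypothetical incremental API (not part of viem today).
// Yields one batch of logs per sub-range; each batch becomes
// garbage-collectable as soon as the consumer advances.
async function* getLogsIterator<T>(
  fromBlock: bigint,
  toBlock: bigint,
  step: bigint,
  fetchLogs: (from: bigint, to: bigint) => Promise<T[]>,
): AsyncGenerator<T[]> {
  for (let start = fromBlock; start <= toBlock; start += step) {
    const end = start + step - 1n < toBlock ? start + step - 1n : toBlock;
    yield await fetchLogs(start, end);
  }
}
```

A callback-based variant would work equally well; the key property is that no single array ever holds the full range's results.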

Workaround

Manually chunk the block range in the caller and call getLogs once per chunk, which allows GC between iterations. This works, but it shifts the responsibility onto every consumer.
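For reference, the workaround looks roughly like this. The helper splits the range into inclusive sub-ranges; block numbers are bigints, matching viem's `fromBlock`/`toBlock` parameters. The commented usage assumes a viem `PublicClient` named `client` and a caller-supplied `processLogs`:

```typescript
// Split [fromBlock, toBlock] into inclusive chunks of at most chunkSize blocks.
function chunkBlockRange(
  fromBlock: bigint,
  toBlock: bigint,
  chunkSize: bigint,
): Array<{ fromBlock: bigint; toBlock: bigint }> {
  const chunks: Array<{ fromBlock: bigint; toBlock: bigint }> = [];
  for (let start = fromBlock; start <= toBlock; start += chunkSize) {
    const end = start + chunkSize - 1n;
    chunks.push({ fromBlock: start, toBlock: end < toBlock ? end : toBlock });
  }
  return chunks;
}

// Usage sketch with viem (client and processLogs are assumptions):
// for (const { fromBlock, toBlock } of chunkBlockRange(0n, 140_000_000n, 10_000n)) {
//   const logs = await client.getLogs({ fromBlock, toBlock });
//   await processLogs(logs); // handle and drop each batch before fetching the next
// }
```

Because each `logs` array goes out of scope before the next chunk is fetched, the heap stays bounded by the largest single batch rather than the whole range.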
