Skip to content
Closed
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
90 changes: 90 additions & 0 deletions SECURITY.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,90 @@
<!-- GITHUB SECURITY ADVISORY SUBMISSION -->
<!-- Title: Denial of Service via Uncontrolled Memory Allocation and Event Loop Blocking in lowdb -->
<!-- Severity: High -->
<!-- Ecosystem: npm -->
<!-- Package name: lowdb -->
<!-- Affected versions: <= 7.0.1 -->
<!-- Patched versions: (leave blank) -->
<!-- CWE: CWE-770 -->

## Summary

lowdb v7.0.1 loads the entire database into memory and serializes it synchronously via `JSON.stringify` on every write. There are no limits on database size, record count, or nesting depth. An attacker who can insert records can exhaust server memory (OOM kill) or block the Node.js event loop for seconds, making the server unresponsive to all clients.

## Details

`Low.read()` and `Low.write()` (`lib/core/Low.js`, lines 15-23) operate on the entire dataset:
Copy link

Copilot AI Mar 7, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The advisory cites lib/core/Low.js with specific line numbers, but this repository doesn't contain a lib/ directory (source is under src/). Consider updating the reference to the actual source file (e.g., src/core/Low.ts) and adjusting/removing line numbers so the pointers remain accurate in-repo.

Copilot uses AI. Check for mistakes.

```javascript
async read() {
const data = await this.adapter.read(); // Read ENTIRE file into RAM
if (data) this.data = data;
}
async write() {
if (this.data) await this.adapter.write(this.data); // Serialize ENTIRE dataset
}
```

`JSONFile`'s stringify function (`lib/adapters/node/JSONFile.js`, line 6) calls `JSON.stringify(data, null, 2)`, which is **synchronous** and runs on the main thread. For large datasets, this blocks the event loop.
Copy link

Copilot AI Mar 7, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Similarly, this references lib/adapters/node/JSONFile.js with a specific line number, but the code in this repo lives under src/adapters/node/JSONFile.ts. Please update the file/line reference (or remove line numbers) to avoid sending readers to a non-existent path.

Suggested change
`JSONFile`'s stringify function (`lib/adapters/node/JSONFile.js`, line 6) calls `JSON.stringify(data, null, 2)`, which is **synchronous** and runs on the main thread. For large datasets, this blocks the event loop.
`JSONFile`'s stringify function (see `src/adapters/node/JSONFile.ts`) calls `JSON.stringify(data, null, 2)`, which is **synchronous** and runs on the main thread. For large datasets, this blocks the event loop.

Copilot uses AI. Check for mistakes.

There are no safeguards:
- No maximum database/file size limit
- No maximum record count
- No maximum object depth
- No rate limiting on write operations
- No streaming serialization

## PoC

```javascript
import { Low, Memory } from 'lowdb';

const adapter = new Memory();
const db = new Low(adapter, { items: [] });

// Simulate attacker inserting large records
for (let i = 0; i < 5000; i++) {
db.data.items.push({ id: i, payload: 'A'.repeat(10000) });
}

// JSON.stringify blocks the event loop
const start = performance.now();
const serialized = JSON.stringify(db.data);
const elapsed = performance.now() - start;

console.log(`${(Buffer.byteLength(serialized) / 1024 / 1024).toFixed(2)} MB`); // 47.80 MB
console.log(`${elapsed.toFixed(0)} ms blocked`); // 138ms — ALL clients unresponsive
Comment on lines +40 to +56
Copy link

Copilot AI Mar 7, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The PoC uses the Memory adapter, which never calls JSON.stringify internally. As written it doesn't demonstrate lowdb's write path (await db.write() with JSONFile/JSONFilePreset) or the on-write serialization behavior described above.

Suggested change
import { Low, Memory } from 'lowdb';
const adapter = new Memory();
const db = new Low(adapter, { items: [] });
// Simulate attacker inserting large records
for (let i = 0; i < 5000; i++) {
db.data.items.push({ id: i, payload: 'A'.repeat(10000) });
}
// JSON.stringify blocks the event loop
const start = performance.now();
const serialized = JSON.stringify(db.data);
const elapsed = performance.now() - start;
console.log(`${(Buffer.byteLength(serialized) / 1024 / 1024).toFixed(2)} MB`); // 47.80 MB
console.log(`${elapsed.toFixed(0)} ms blocked`); // 138ms — ALL clients unresponsive
import { Low } from 'lowdb';
import { JSONFile } from 'lowdb/node';
const adapter = new JSONFile('db.json');
const db = new Low(adapter, { items: [] });
// Simulate attacker inserting large records
for (let i = 0; i < 5000; i++) {
db.data.items.push({ id: i, payload: 'A'.repeat(10000) });
}
// Low.write() -> JSONFile.write() -> JSON.stringify(data, null, 2) blocks the event loop
const start = performance.now();
await db.write();
const elapsed = performance.now() - start;
const serialized = JSON.stringify(db.data);
console.log(`${(Buffer.byteLength(serialized) / 1024 / 1024).toFixed(2)} MB`); // ~47.80 MB
console.log(`${elapsed.toFixed(0)} ms blocked`); // e.g., 138ms — ALL clients unresponsive

Copilot uses AI. Check for mistakes.
Comment on lines +50 to +56
Copy link

Copilot AI Mar 7, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This measures JSON.stringify(db.data) directly (and without the null, 2 formatting used by JSONFile), so the size/timing numbers won't match the actual JSONFile write behavior. Consider updating the PoC to call await db.write() using the real adapter (or adjust the stringify call to match JSONFile's implementation).

Copilot uses AI. Check for mistakes.
```

**Tested and confirmed:**
- 47.80 MB dataset blocked event loop for **138ms**
- Memory grew unboundedly from 5.10 MB to 7.36 MB (heap) during insertion
- `Low.js` source contains zero resource limit logic

## Impact

- **Memory exhaustion:** Unbounded dataset growth leads to Node.js OOM kill
- **Event loop blocking:** `JSON.stringify` on large datasets blocks all request handling for 100ms+
- **Disk exhaustion:** Each `write()` creates a full `.tmp` copy on disk via steno — at peak, two full copies exist simultaneously
- **Cascading failure:** A single attacker growing the database degrades service for ALL users

## Remediation

Add configurable size limits in the adapter:

```javascript
class BoundedJSONFile extends JSONFile {
#maxSizeBytes;
constructor(filename, maxSizeBytes = 10 * 1024 * 1024) {
super(filename);
this.#maxSizeBytes = maxSizeBytes;
}
async write(data) {
const str = JSON.stringify(data);
if (Buffer.byteLength(str) > this.#maxSizeBytes) {
throw new Error(`Database exceeds ${this.#maxSizeBytes} byte limit`);
}
return super.write(data);
}
Comment on lines +79 to +88
Copy link

Copilot AI Mar 7, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

In the remediation example, write() calls JSON.stringify to measure size and then super.write(data) will stringify again (via DataFile), doubling CPU/memory and still blocking the event loop. Prefer enforcing the size limit inside the stringify function you pass to DataFile/JSONFile so serialization happens only once.

Suggested change
super(filename);
this.#maxSizeBytes = maxSizeBytes;
}
async write(data) {
const str = JSON.stringify(data);
if (Buffer.byteLength(str) > this.#maxSizeBytes) {
throw new Error(`Database exceeds ${this.#maxSizeBytes} byte limit`);
}
return super.write(data);
}
const limit = maxSizeBytes;
super(filename, {
stringify(data) {
const str = JSON.stringify(data);
if (Buffer.byteLength(str) > limit) {
throw new Error(`Database exceeds ${limit} byte limit`);
}
return str;
},
});
this.#maxSizeBytes = limit;
}

Copilot uses AI. Check for mistakes.
}
```
Loading