
Commit 63b600a

fix: cap stream size to node.js max buffer size (#715)
Now that we can handle scanning larger files, there's a very small chance of a failure if the file size is greater than the max buffer size allowed by node.js AND the file has an ELF header. The issue here is that the scan can fail for large ELF files (> 4GB). The chance of this is near zero; I'm not aware of any go modules this large, but if someone has another executable that is this large (still pretty unlikely), it could crash the scan. So I've just added one more check.
1 parent 05548c8 commit 63b600a
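
For context, a minimal sketch of the idea behind the check (not the plugin's actual code; the helper name and fallback value here are made up for illustration): clamp whatever size the stream reports to buffer.constants.MAX_LENGTH before allocating, since Buffer.alloc() throws a RangeError for anything larger.

// Hypothetical helper, not part of this commit: cap a reported stream size
// at Node.js's maximum Buffer length before allocating.
import { constants } from "buffer";

function cappedBufferSize(
  streamSize: number | undefined,
  fallbackSize: number,
): number {
  // Buffer.alloc() throws a RangeError for lengths above
  // buffer.constants.MAX_LENGTH, so never request more than that.
  return Math.min(streamSize ?? fallbackSize, constants.MAX_LENGTH);
}

// A 5GB reported size is clamped rather than crashing the scan on Node.js
// versions where MAX_LENGTH is about 4GB.
const fiveGb = 5 * 1024 * 1024 * 1024;
console.log(cappedBufferSize(fiveGb, 64 * 1024) <= constants.MAX_LENGTH); // true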

File tree

2 files changed: +46 -1 lines changed

lib/go-parser/index.ts

Lines changed: 9 additions & 1 deletion
@@ -132,7 +132,15 @@ async function findGoBinaries(
 
     if (first4Bytes === elfHeaderMagic) {
       // Now that we know it's an ELF file, allocate the buffer
-      buffer = Buffer.alloc(streamSize ?? elfBuildInfoSize);
+      // If the streamSize is larger than node.js's max buffer length
+      // we should cap the size at that value. The likelihood
+      // of a node module being this size is near zero, so we should
+      // be okay doing this
+      const bufferSize = Math.min(
+        streamSize ?? elfBuildInfoSize,
+        require("buffer").constants.MAX_LENGTH,
+      );
+      buffer = Buffer.alloc(bufferSize);
 
       bytesWritten += Buffer.from(chunk).copy(buffer, bytesWritten, 0);

test/unit/go-parser-memory-stream.spec.ts

Lines changed: 37 additions & 0 deletions
@@ -32,6 +32,43 @@ describe("Go parser memory allocation fix", () => {
     // If we reach here without memory errors, the fix works
   });
 
+  it("should cap buffer size at Node.js max buffer length for large ELF files", async () => {
+    // Create ELF content that would trigger large buffer allocation
+    const elfContent = Buffer.concat([
+      Buffer.from("\x7FELF"), // ELF magic
+      Buffer.alloc(1000, 0), // Some content
+    ]);
+    const stream = createStreamFromBuffer(elfContent);
+    const bigOlFileSize = 5 * 1024 * 1024 * 1024; // 5GB - exceeds max buffer length
+
+    // Mock elf.parse to avoid complexity and track buffer allocation
+    const originalParse = elf.parse;
+    let allocatedBufferSize: number | undefined;
+
+    // Mock Buffer.alloc to capture the size being allocated
+    const originalAlloc = Buffer.alloc;
+    Buffer.alloc = jest.fn().mockImplementation((size: number) => {
+      allocatedBufferSize = size;
+      return originalAlloc.call(Buffer, Math.min(size, 1024)); // Allocate small buffer for test
+    });
+
+    elf.parse = jest.fn().mockReturnValue({ body: { sections: [] } });
+
+    // Act
+    await findGoBinaries(stream, bigOlFileSize);
+
+    // Assert - buffer size should be capped at Node.js max, not the huge reported size
+    expect(allocatedBufferSize).toBeDefined();
+    expect(allocatedBufferSize).toEqual(
+      require("buffer").constants.MAX_LENGTH,
+    );
+    expect(allocatedBufferSize).toBeLessThan(bigOlFileSize);
+
+    // Restore mocks
+    Buffer.alloc = originalAlloc;
+    elf.parse = originalParse;
+  });
+
   it("should still process legitimate ELF files", async () => {
     // Ensure we didn't break existing functionality
     const goBinaryPath = path.join(
