
Potential memory leak in @aws-sdk/client-dynamodb #7442

@danielv-funnel

Description

Checkboxes for prior research

Describe the bug

We are running a Fargate service using Bun with Elysia, connecting to DynamoDB, and have noticed a slow build-up of memory. We debugged this down to the simple script below.

Using getItems, the script queries a DynamoDB table and gets back 341 items per call.

  • This shows a steady memory increase for the entire duration of the loop, with no indication of stopping.
  • We also ran it under Node.js, where memory is visibly garbage collected (it fluctuates a bit) but still always increases, albeit at a slower rate than with Bun.
    For comparison, there is a getRandomItems function whose memory usage gets capped fairly quickly under both Bun and Node.

I have also tried creating the client inside the loop, and that does not change the underlying issue of memory growing.
Tested with Bun 1.3.0 and 1.2.16, Node 22.18.0, and @aws-sdk/client-dynamodb 3.859.0, both with and without @aws-sdk/lib-dynamodb.
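
For what it's worth, when a client is created per iteration, SDK v3 clients expose a destroy() method that releases the request handler's connections; a minimal sketch of that pattern (the actual commands sent are omitted here):

import { DynamoDBClient } from '@aws-sdk/client-dynamodb';

// Sketch: a per-iteration client should be destroyed afterwards,
// otherwise its connection pool lingers and sockets can accumulate.
async function queryOnce() {
  const perLoopClient = new DynamoDBClient();
  try {
    // ... send commands with perLoopClient here ...
  } finally {
    // Closes open connections held by the client's request handler.
    perLoopClient.destroy();
  }
}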

I couldn't find a way to configure this directly, but perhaps there is a way to configure, e.g., ever-growing sockets/connections, or a preferred way of working with the client in long-running processes? Or could you find out why the memory grows?
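
In case it helps, here is a sketch of bounding the connection pool explicitly via the requestHandler option; this is the documented Node.js mechanism, the maxSockets value is an arbitrary illustration, and I don't know whether Bun honors the node:https Agent the same way:

import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { NodeHttpHandler } from '@smithy/node-http-handler';
import { Agent } from 'node:https';

const boundedClient = new DynamoDBClient({
  requestHandler: new NodeHttpHandler({
    // Reuse connections rather than opening a new socket per request,
    // and cap the pool so idle sockets cannot accumulate without bound.
    httpsAgent: new Agent({ keepAlive: true, maxSockets: 50 }),
  }),
});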

Regression Issue

  • Select this option if this issue appears to be a regression.

SDK version number

@aws-sdk/client-dynamodb

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

v22.18.0

Reproduction Steps

import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, QueryCommand } from '@aws-sdk/lib-dynamodb';

// keyConditionExpression and expressionAttributeValues are defined elsewhere;
// the query matches 341 items.
const command = new QueryCommand({
  TableName: 'THE_NAME',
  KeyConditionExpression: keyConditionExpression,
  ExpressionAttributeValues: expressionAttributeValues,
});
const dClient = new DynamoDBClient();
const dynamoDb = DynamoDBDocumentClient.from(dClient);

async function main() {
  for (let i = 1; i <= 1000; i++) {
    const result = await getItems();
    console.log(
      `${i} - Mem: ${process.memoryUsage.rss()} - ItemCount: ${result?.length}`
    );
  }
}

async function getItems() {
  try {
    const result = await dynamoDb.send(command);
    return result?.Items;
  } catch (error) {
    console.error(error);
    return [];
  }
}

main();

// Comparison case: builds 100,000 items in memory without any network I/O.
async function getRandomItems() {
  return new Promise<any[]>((resolve) => {
    setTimeout(() => {
      const result: any[] = [];
      for (let i = 0; i < 100000; i++) {
        result.push({
          id: `something-${Math.random().toString()}`,
          increment: i,
          name: `Random Item ${i}`,
          variantName: `Random Item Variant ${i}`,
          unit: 'number',
          description: `Random Item Description ${i}`,
          editor: 'text',
          template: 'text',
          format: 'text',
          aggregationType: 'text',
        });
      }
      resolve(result);
    }, 100);
  });
}
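
To help tell a true leak apart from lazily collected garbage, the loop can force a full GC before each RSS sample. This is only a diagnostic sketch: under Node it requires running with --expose-gc (gc is undefined otherwise), and under Bun I believe Bun.gc(true) plays a similar role, though I haven't verified that.

// Diagnostic variant of main(): force a GC before sampling RSS.
const forceGc = (globalThis as { gc?: () => void }).gc;

async function mainWithGc() {
  for (let i = 1; i <= 1000; i++) {
    const result = await getItems();
    forceGc?.(); // no-op unless `node --expose-gc` was used
    console.log(
      `${i} - RSS after GC: ${process.memoryUsage.rss()} - ItemCount: ${result?.length}`
    );
  }
}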

Observed Behavior

Running queries against DynamoDB

bun run
1 - Mem: 97.239.040 - ItemCount: 341
100 - Mem: 155.516.928 - ItemCount: 341
300 - Mem: 172.572.672 - ItemCount: 341
500 - Mem: 187.203.584 - ItemCount: 341
700 - Mem: 206.520.320 - ItemCount: 341
900 - Mem: 221.855.744 - ItemCount: 341
~127% increase from start
~42% increase from 100

node run
1 - Mem: 88.489.984 - ItemCount: 341
100 - Mem: 130.383.872 - ItemCount: 341
300 - Mem: 122.961.920 - ItemCount: 341
500 - Mem: 131.907.584 - ItemCount: 341
700 - Mem: 133.627.904 - ItemCount: 341
900 - Mem: 135.675.904 - ItemCount: 341
1000 - Mem: 137.854.976 - ItemCount: 341
~55% increase from start
~5% increase from 100

Running the in-memory getRandomItems function

bun run
1 - Mem: 96.698.368 - ItemCount: 100000
100 - Mem: 177.127.424 - ItemCount: 100000
300 - Mem: 177.274.880 - ItemCount: 100000
500 - Mem: 177.291.264 - ItemCount: 100000
700 - Mem: 177.307.648 - ItemCount: 100000
900 - Mem: 177.307.648 - ItemCount: 100000
1000 - Mem: 177.307.648 - ItemCount: 100000
~84% increase from start
<1% increase from 100

node run
1 - Mem: 109.117.440 - ItemCount: 100000
100 - Mem: 421.871.616 - ItemCount: 100000
300 - Mem: 424.148.992 - ItemCount: 100000
500 - Mem: 424.476.672 - ItemCount: 100000
700 - Mem: 425.541.632 - ItemCount: 100000
900 - Mem: 425.951.232 - ItemCount: 100000
1000 - Mem: 431.652.864 - ItemCount: 100000
~295% increase from start
~2.3% increase from 100

Expected Behavior

I would expect a short memory bump and for usage to then stay stable.

Possible Solution

No response

Additional Information/Context

No response

Metadata

Labels

closing-soon: This issue will automatically close in 4 days unless further comments are made.
guidance: General information and guidance, answers to FAQs, or recommended best practices/resources.
third-party: This issue is related to third-party libraries or applications.
