lib-storage: upload file of 112 GB crash chrome for out of memory (after 25 GB) #5730

@cesco69

Description

Checkboxes for prior research

Describe the bug

I bundled @aws-sdk/lib-storage and @aws-sdk/client-s3 with webpack for use on the frontend.
When I upload a 112 GB file, Chrome crashes with an out-of-memory error after about 25 GB.

SDK version number

@aws-sdk/client-s3@^3.499.0, @aws-sdk/lib-storage@^3.499.0

Which JavaScript Runtime is this issue in?

Browser

Details of the browser/Node.js/ReactNative version

Latest chrome

Reproduction Steps

I bundled @aws-sdk/lib-storage and @aws-sdk/client-s3 with webpack for use on the frontend:

package.json

{
  "name": "aws-s3-upload",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "build": "webpack"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@aws-sdk/client-s3": "^3.499.0",
    "@aws-sdk/lib-storage": "^3.499.0"
  },
  "devDependencies": {
    "path-browserify": "^1.0.1",
    "webpack": "^5.90.0",
    "webpack-cli": "^5.1.4"
  }
}

browser.js

// Entry point: expose the SDK on window for use from plain <script> tags.
// Use ESM imports consistently (mixing require() with ESM export breaks the webpack build).
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
window.AWS = { S3Client, Upload };
export { S3Client, Upload };

webpack.config.js

// Import path for resolving file paths
var path = require("path");
module.exports = {
  mode: 'production',
  performance: {
    hints: false
  },
  // Specify the entry point for our app.
  entry: [path.join(__dirname, "browser.js")],
  // Specify the output file containing our bundled code.
  output: {
    path: __dirname,
    filename: 'bundle.js'
  },
  // Enable WebPack to use the 'path' package.
  resolve: {
    fallback: { path: require.resolve("path-browserify") }
  }
};

Build produce bundle.js

npm run build

usage

index.html

<script src="bundle.js"></script>
<script>
  // Wrapped in an async IIFE: top-level await is not available in a classic script tag.
  (async () => {
    const s3Client = new AWS.S3Client({
      region: ...,
      endpoint: ...,
      credentials: {
        accessKeyId: ...,
        secretAccessKey: ...,
        sessionToken: ...,
      }
    });
    const uploadParams = {
      Bucket: ...,
      Key: ...,
      Body: file // file from <input type="file">
    };
    const upload = new AWS.Upload({
      client: s3Client,
      params: uploadParams,
      queueSize: 4, // optional concurrency configuration
      partSize: 1024 * 1024 * 5, // optional size of each part
      leavePartsOnError: false, // optional manually handle dropped parts
    });

    upload.on("httpUploadProgress", (progress) => {
      console.log(progress);
    });
    await upload.done();
  })();
</script>
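A side observation on the `partSize` used above: S3 caps a multipart upload at 10,000 parts, so a fixed 5 MiB part size cannot cover a 112 GiB object on its own (that would take roughly 23,000 parts). The SDK may adjust the part size internally, but the arithmetic the limit implies can be sketched with a hypothetical helper (not part of @aws-sdk/lib-storage):

```javascript
// Hypothetical helper (not part of @aws-sdk/lib-storage): smallest partSize
// that keeps a multipart upload within S3's 10,000-part limit.
const MiB = 1024 * 1024;
const S3_MAX_PARTS = 10000;       // S3 part-count limit per multipart upload
const S3_MIN_PART_SIZE = 5 * MiB; // S3 minimum part size (all but the last part)

function minPartSize(fileSizeBytes) {
  return Math.max(S3_MIN_PART_SIZE, Math.ceil(fileSizeBytes / S3_MAX_PARTS));
}

// For the 112 GiB file from this report:
const partSize = minPartSize(112 * 1024 * MiB);
console.log(partSize); // 12025909 bytes, about 11.5 MiB per part
```

So even if the memory issue were fixed, a 112 GB upload would need a `partSize` of at least ~12 MB to stay under the part-count limit.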

When I upload a 112 GB file, Chrome crashes with an out-of-memory error after about 25 GB.
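For context on where the memory could be going: slicing the File into parts is not by itself the cost, because Blob.slice returns a lazy view rather than a copy; bytes are only materialized when a part is actually read. A minimal sketch (runs anywhere the Blob global exists, e.g. modern browsers or Node 18+):

```javascript
// Blob.slice does not copy the underlying bytes: it returns a new Blob that
// references a range of the original, so creating part descriptors for a huge
// file is cheap. Memory is only consumed when a part is read (arrayBuffer(),
// stream(), ...) and the resulting buffer is retained.
const tenMiB = new Blob([new Uint8Array(10 * 1024 * 1024)]);
const part = tenMiB.slice(0, 5 * 1024 * 1024); // a view, not a copy
console.log(part.size); // 5242880
```

If slicing is lazy, the growth to ~25 GB of resident memory suggests part buffers are being retained somewhere after they are read, rather than the file itself being loaded up front.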

Observed Behavior

Chrome crashes with an out-of-memory error after roughly 25 GB of the file has been uploaded.

Expected Behavior

No crash; the upload should complete without exhausting browser memory.

Possible Solution

No response

Additional Information/Context

Side note: the same upload works with the v2 SDK.

Metadata

Assignees

No one assigned

Labels

bug (This issue is a bug.), p2 (This is a standard priority issue.), queued (This issue is on the AWS team's backlog.)

Milestone

No milestone

Development

No branches or pull requests