Labels: bug (This issue is a bug), p2 (This is a standard priority issue), queued (This issue is on the AWS team's backlog)
Description
Checkboxes for prior research
- I've gone through the Developer Guide and API reference.
- I've checked AWS Forums and StackOverflow.
- I've searched for previous similar issues and didn't find any solution.
Describe the bug
I have bundled @aws-sdk/lib-storage and @aws-sdk/client-s3 with webpack for use on the frontend. When uploading a 112 GB file, Chrome crashes with an out-of-memory error after about 25 GB.
SDK version number
@aws-sdk/client-s3@3.499.0, @aws-sdk/lib-storage@3.499.0
Which JavaScript Runtime is this issue in?
Browser
Details of the browser/Node.js/ReactNative version
Latest chrome
Reproduction Steps
I have bundled @aws-sdk/lib-storage and @aws-sdk/client-s3 with webpack for use on the frontend.
package.json
{
  "name": "aws-s3-upload",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "build": "webpack"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@aws-sdk/client-s3": "^3.499.0",
    "@aws-sdk/lib-storage": "^3.499.0"
  },
  "devDependencies": {
    "path-browserify": "^1.0.1",
    "webpack": "^5.90.0",
    "webpack-cli": "^5.1.4"
  }
}
browser.js
// Re-export the SDK classes and expose them on window for plain <script> use.
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

window.AWS = { S3Client, Upload };
export { S3Client, Upload };
webpack.config.js
// Import path for resolving file paths
var path = require("path");

module.exports = {
  mode: 'production',
  performance: {
    hints: false
  },
  // Specify the entry point for our app.
  entry: [path.join(__dirname, "browser.js")],
  // Specify the output file containing our bundled code.
  output: {
    path: __dirname,
    filename: 'bundle.js'
  },
  // Enable webpack to use the 'path' package in the browser.
  resolve: {
    fallback: { path: require.resolve("path-browserify") }
  }
};
Build produces bundle.js:
npm run build
Usage
index.html
<script src="bundle.js"></script>
<script type="module">
  // type="module" is needed for the top-level await on upload.done()
  const s3Client = new AWS.S3Client({
    region: ...,
    endpoint: ...,
    credentials: {
      accessKeyId: ...,
      secretAccessKey: ...,
      sessionToken: ...,
    }
  });

  // file from <input type="file">
  const file = document.querySelector('input[type="file"]').files[0];

  const uploadParams = {
    Bucket: ...,
    Key: ...,
    Body: file
  };

  const upload = new AWS.Upload({
    client: s3Client,
    params: uploadParams,
    queueSize: 4, // optional concurrency configuration
    partSize: 1024 * 1024 * 5, // optional size of each part
    leavePartsOnError: false, // optional manually handle dropped parts
  });

  upload.on("httpUploadProgress", (progress) => {
    console.log(progress);
  });

  await upload.done();
</script>
When uploading a 112 GB file, Chrome crashes with an out-of-memory error after about 25 GB.
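As a back-of-the-envelope aside (my own arithmetic, not a confirmed cause of the crash): S3 caps a multipart upload at 10,000 parts, so the 5 MiB partSize shown above cannot cover a 112 GB file on its own:

```javascript
// S3 multipart uploads allow at most 10,000 parts per object,
// so the chosen part size bounds the maximum object size.
const MAX_PARTS = 10000;
const partSize = 5 * 1024 * 1024;           // 5 MiB, as in the snippet above
const fileSize = 112 * 1024 ** 3;           // 112 GiB

// Largest file a 5 MiB part size can cover:
const maxUploadable = MAX_PARTS * partSize; // 52,428,800,000 bytes ≈ 48.8 GiB

// Smallest part size that fits a 112 GiB file into 10,000 parts:
const minPartSize = Math.ceil(fileSize / MAX_PARTS); // ≈ 11.5 MiB

console.log(maxUploadable, minPartSize);
```

Whether lib-storage adjusts the part size automatically in this situation is not something this sketch assumes either way.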
Observed Behavior
Chrome crashes with an out-of-memory error after about 25 GB of the file has been uploaded.
Expected Behavior
No crash; the upload completes.
Possible Solution
No response
Additional Information/Context
Side note: it works with SDK v2.