Checkboxes for prior research
- I've gone through Developer Guide and API reference
- I've checked AWS Forums and StackOverflow.
- I've searched for previous similar issues and didn't find any solution.
Describe the bug
When using the require('@aws-sdk/lib-storage').Upload class, the value of .Location on the returned object is encoded differently depending on the size of the file.
SDK version number
@aws-sdk/[email protected], @aws-sdk/[email protected]
Which JavaScript Runtime is this issue in?
Node.js
Details of the browser/Node.js/ReactNative version
v16.20.0
Reproduction Steps
const S3 = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const fs = require('fs');

const bucket = 'bucket';
const csvStream = fs.createReadStream('./file.csv');
const s3Client = new S3.S3Client({ region: 'us-east-1' });

const csvUpload = new Upload({
  client: s3Client,
  params: {
    Body: csvStream,
    Bucket: bucket,
    ContentEncoding: 'gzip',
    ContentType: 'text/csv',
    Key: 'foo/bar.csv',
  },
});

(async () => {
  const csvResponse = await csvUpload.done();
  console.log(csvResponse.Location);
})();
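Note: to reproduce without a ~300 MB file, lowering partSize should force the multipart path; a minimal sketch, assuming partSize at the S3 minimum of 5 MiB so any Body larger than that gets split (file-over-5mb.csv is a hypothetical test file):

// Sketch: force the multipart code path with a much smaller file by using
// the minimum part size S3 accepts (5 MiB). A Body larger than this should
// go through the multipart upload path rather than a single PUT.
const multipartUpload = new Upload({
  client: s3Client,
  partSize: 5 * 1024 * 1024, // minimum part size S3 accepts
  params: {
    Body: fs.createReadStream('./file-over-5mb.csv'), // hypothetical test file
    Bucket: bucket,
    ContentType: 'text/csv',
    Key: 'foo/bar.csv',
  },
});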
Observed Behavior
If file.csv is small enough to fit into one chunk, so that it is uploaded via the this.__uploadUsingPut method, then https://bucket.s3.us-east-1.amazonaws.com/foo/bar.csv is printed as expected. However, if file.csv is large (e.g. ~300 MB in our case) and is uploaded via this.__createMultipartUpload, then I get back https://bucket.s3.us-east-1.amazonaws.com/foo%2Fbar.csv, where the / in the key has been improperly percent-encoded as %2F.
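As a stopgap on our side, the two shapes can be reconciled by decoding the returned value; a minimal sketch, assuming the only divergence is the percent-encoding of the key (it would misbehave for keys containing characters that legitimately need encoding):

const csvResponse = await csvUpload.done();
// decodeURIComponent turns %2F back into '/', matching the single-PUT output.
const location = decodeURIComponent(csvResponse.Location);
console.log(location); // https://bucket.s3.us-east-1.amazonaws.com/foo/bar.csv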
Expected Behavior
Regardless of the size of file.csv, I'd expect to get back the same location: https://bucket.s3.us-east-1.amazonaws.com/foo/bar.csv
Possible Solution
No response
Additional Information/Context
Looking at the source code, this looks like a bug in the CompleteMultipartUploadOutput returned by the @aws-sdk/client-s3 library; since that Location value is deserialized straight from the S3 CompleteMultipartUpload response, I think that would mean it's actually a bug within the AWS S3 service itself?
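Until the field is consistent, another option is to not rely on Location at all and build the URL from inputs we already have; a hypothetical helper, not part of the SDK, assuming the virtual-hosted-style endpoint shown above:

// Hypothetical helper: derive the object URL from bucket/region/key instead
// of trusting Location. Encodes each path segment but keeps '/' separators.
const objectUrl = (bucket, region, key) =>
  `https://${bucket}.s3.${region}.amazonaws.com/` +
  key.split('/').map(encodeURIComponent).join('/');

console.log(objectUrl('bucket', 'us-east-1', 'foo/bar.csv'));
// => https://bucket.s3.us-east-1.amazonaws.com/foo/bar.csv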