
[CRITICAL] [lib-storage] Uploading in Safari is in a non-working state #2365


Description

@ffxsam

Describe the bug

When uploading from Safari directly to S3 (credentials supplied via Cognito), I get this error:

Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob

I'm passing a File object like so:

import { Upload } from "@aws-sdk/lib-storage";

// `s3` is an S3Client whose credentials come from Cognito;
// `file` is a DOM File object selected by the user.
const upload = new Upload({
  client: s3,
  params: {
    Bucket: process.env.VUE_APP_UPLOAD_BUCKET,
    Body: file,
    ContentType: file.type,
    Key: destKey,
    Metadata: metadata,
  },
});

Your environment

SDK version number

@aws-sdk/[email protected]

Is the issue in the browser/Node.js/ReactNative?

Browser

Details of the browser/Node.js/ReactNative version

Safari 14.0.2 and under

Steps to reproduce

(see code snippet above)

Observed behavior

Uploading a file using the Upload class fails with error:
Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob

Expected behavior

I would expect Webkit/Safari to follow W3C standards better. 😉 Also, uploads worked totally fine in SDK v2.

Additional context

Pour yourself a coffee and pull up a chair...

So, lib-storage is checking for the existence of data.stream as a function, which is totally legit:

https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunker.ts#L19-L21

File, inheriting from Blob, should implement that. Unfortunately, in Safari, Blob.prototype.stream() does not exist, even though this is a W3C standard drafted way back in May 2019. This wasn't supported in Safari until 14.1.
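For reference, this is the shape of the gap (a rough sketch, not part of the SDK). Some people polyfill Blob.prototype.stream() via a Fetch Response body; I haven't verified that approach against lib-storage's chunker, so treat it as an idea rather than a fix:

// Sketch only: detect the missing API and optionally patch it.
// The Response-based polyfill is an assumption on my part, not something I've tested with Upload.
if (typeof Blob !== "undefined" && typeof Blob.prototype.stream !== "function") {
  (Blob.prototype as any).stream = function (this: Blob): ReadableStream<Uint8Array> {
    // Response accepts a Blob body and exposes it as a ReadableStream,
    // which is what chunker.ts expects .stream() to return.
    return new Response(this).body as ReadableStream<Uint8Array>;
  };
}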

I'm working around this with the following code, but it's not ideal: it introduces a slight lag for the user, since very large files (potentially 70-150 MB) must be converted to a Uint8Array before being passed to the Upload constructor:

import type { PutObjectCommandInput } from "@aws-sdk/client-s3";
import { Upload as AWSUpload } from "@aws-sdk/lib-storage";

// Read the entire File into memory up front, since Safari < 14.1 can't stream Blobs.
const fileBuffer = await file.arrayBuffer();
const uploadParams: PutObjectCommandInput = {
  Bucket: process.env.VUE_APP_UPLOAD_BUCKET,
  Body: new Uint8Array(fileBuffer),
  ContentType: file.type,
  Key: destKey,
  Metadata: metadata,
};
const upload = new AWSUpload({
  client: s3,
  params: uploadParams,
});

I don't think there's an ideal workaround at this point, but it would be helpful if this library handled the scenario itself instead of putting it on the developer (so they don't wind up wasting time like I did). For example, the getChunk function could check for data that is an instance of Blob but whose stream property is undefined, and if so, call data.arrayBuffer() to read the data and return it. Roughly what I have in mind is sketched below.
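Just a sketch; toSupportedBody is an illustrative name, not an existing lib-storage function, and the real fix would live wherever the chunker inspects the body:

// Hypothetical fallback applied to the body before chunking.
async function toSupportedBody(data: unknown): Promise<unknown> {
  if (
    typeof Blob !== "undefined" &&
    data instanceof Blob &&
    typeof (data as any).stream !== "function"
  ) {
    // Safari < 14.1: Blob.prototype.stream() is missing, so read the whole
    // Blob into memory instead (same trade-off as my workaround above).
    return new Uint8Array(await data.arrayBuffer());
  }
  return data;
}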

If there's a better way I can handle this, I'm open to hearing about it! But if Safari doesn't support Blob streaming, it seems like reading the entire file into memory is the only alternative at this point.

Metadata

Labels

dependencies: This issue is a problem in a dependency.
p3: This is a minor priority issue.
response-requested: Waiting on additional info and feedback. Will move to "closing-soon" in 7 days.
workaround-available: This issue has a work around available.
