feat: parallel chunkedUpload in web and node #1146


Draft · wants to merge 5 commits into master
113 changes: 90 additions & 23 deletions templates/node/src/client.ts.twig
@@ -67,6 +67,7 @@ function getUserAgent() {

class Client {
static CHUNK_SIZE = 1024 * 1024 * 5;
Copilot AI Aug 10, 2025

The MAX_CONCURRENCY constant lacks documentation explaining why 6 was chosen as the default value and how it affects performance or server load.

Suggested change
static CHUNK_SIZE = 1024 * 1024 * 5;
static CHUNK_SIZE = 1024 * 1024 * 5;
/**
* The maximum number of concurrent upload or network operations.
*
* The default value of 6 is chosen as a conservative balance between
* maximizing throughput and avoiding excessive server or client resource usage.
* Increasing this value may improve performance on high-bandwidth or high-CPU environments,
* but can also increase server load and risk rate limiting or throttling.
* Adjust as needed based on your application's requirements and server capabilities.
*/


static MAX_CONCURRENCY = 6;

config = {
endpoint: '{{ spec.endpoint }}',
@@ -211,38 +212,104 @@ class Client {
return await this.call(method, url, headers, originalPayload);
}

let start = 0;
let response = null;
const totalChunks = Math.ceil(file.size / Client.CHUNK_SIZE);

while (start < file.size) {
let end = start + Client.CHUNK_SIZE; // Prepare end for the next chunk
if (end >= file.size) {
end = file.size; // Adjust for the last chunk to include the last byte
}
const firstChunkStart = 0;
const firstChunkEnd = Math.min(Client.CHUNK_SIZE, file.size);
const firstChunk = file.slice(firstChunkStart, firstChunkEnd);

headers['content-range'] = `bytes ${start}-${end-1}/${file.size}`;
const chunk = file.slice(start, end);
const firstChunkHeaders = { ...headers };
firstChunkHeaders['content-range'] = `bytes ${firstChunkStart}-${firstChunkEnd - 1}/${file.size}`;

const firstPayload = { ...originalPayload };
firstPayload[fileParam] = new File([firstChunk], file.name);

const firstResponse = await this.call(method, url, firstChunkHeaders, firstPayload);

if (!firstResponse?.$id) {
throw new Error('First chunk upload failed - no ID returned');
Copilot AI Aug 10, 2025

The error message 'First chunk upload failed - no ID returned' could be more helpful by including the actual response or HTTP status code to aid debugging.

Suggested change
throw new Error('First chunk upload failed - no ID returned');
throw new Error(
`First chunk upload failed - no ID returned. Response: ${JSON.stringify(firstResponse)}`
);


}

let payload = { ...originalPayload };
payload[fileParam] = new File([chunk], file.name);
let completedChunks = 1;
let totalUploaded = firstChunkEnd;

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: firstResponse.$id,
progress: Math.round((totalUploaded / file.size) * 100),
sizeUploaded: totalUploaded,
chunksTotal: totalChunks,
chunksUploaded: completedChunks
});
}

response = await this.call(method, url, headers, payload);
if (totalChunks === 1) {
return firstResponse;
}

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: response.$id,
progress: Math.round((end / file.size) * 100),
sizeUploaded: end,
chunksTotal: Math.ceil(file.size / Client.CHUNK_SIZE),
chunksUploaded: Math.ceil(end / Client.CHUNK_SIZE)
});
let response = firstResponse;

for (let chunkIndex = 1; chunkIndex < totalChunks; chunkIndex += Client.MAX_CONCURRENCY) {
const batchEnd = Math.min(chunkIndex + Client.MAX_CONCURRENCY, totalChunks);

const batchPromises = [];
for (let i = chunkIndex; i < batchEnd; i++) {
const start = i * Client.CHUNK_SIZE;
const end = Math.min(start + Client.CHUNK_SIZE, file.size);

batchPromises.push((async () => {
const chunk = file.slice(start, end);
const chunkHeaders = { ...headers };
chunkHeaders['content-range'] = `bytes ${start}-${end - 1}/${file.size}`;
chunkHeaders['x-{{spec.title | caseLower}}-id'] = firstResponse.$id;

const payload = { ...originalPayload };
payload[fileParam] = new File([chunk], file.name);

try {
const chunkResponse = await this.call(method, url, chunkHeaders, payload);
return {
success: true,
response: chunkResponse,
chunkInfo: { index: i, start, end },
error: null
};
} catch (error) {
return {
success: false,
response: null,
chunkInfo: { index: i, start, end },
error
};
}
})());
}

if (response && response.$id) {
headers['x-{{spec.title | caseLower }}-id'] = response.$id;
const batchResults = await Promise.all(batchPromises);

const failures = batchResults.filter(result => !result.success);
if (failures.length > 0) {
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error}`);
Copilot AI Aug 10, 2025

The error object is being directly concatenated to a string, which may not provide meaningful error information. Consider using f.error.message || f.error.toString() for better error reporting.

Suggested change
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error}`);
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error?.message || f.error?.toString()}`);


throw new Error(`Chunk upload failures: ${errorMessages.join(', ')}`);
}

start = end;
for (const result of batchResults) {
if (result.success) {
completedChunks++;
totalUploaded += (result.chunkInfo.end - result.chunkInfo.start);
response = result.response;

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: firstResponse.$id,
progress: Math.round((totalUploaded / file.size) * 100),
sizeUploaded: totalUploaded,
chunksTotal: totalChunks,
chunksUploaded: completedChunks
});
}
}
}
}

return response;
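The new upload path above sends chunk 0 on its own (the server's response supplies the file's `$id`), then pushes the remaining chunks out in windows of `MAX_CONCURRENCY` and awaits each window with `Promise.all`. A minimal standalone sketch of that windowing pattern, with toy sizes and a hypothetical `uploadChunk` stand-in for `this.call` (not the SDK code):

```typescript
// Toy constants so the arithmetic is easy to follow; the SDK uses
// 5 MiB chunks and a concurrency window of 6.
const CHUNK_SIZE = 5;
const MAX_CONCURRENCY = 2;

interface ChunkResult {
  index: number;
  start: number;
  end: number; // exclusive; a content-range header would be `bytes ${start}-${end - 1}/${fileSize}`
}

// Hypothetical stand-in for one this.call(...) chunk request.
async function uploadChunk(index: number, start: number, end: number): Promise<ChunkResult> {
  return { index, start, end };
}

async function uploadInBatches(fileSize: number): Promise<ChunkResult[]> {
  const totalChunks = Math.ceil(fileSize / CHUNK_SIZE);
  const results: ChunkResult[] = [];

  // Chunk 0 goes alone: its response carries the ID the later chunks attach to.
  results.push(await uploadChunk(0, 0, Math.min(CHUNK_SIZE, fileSize)));

  // Remaining chunks go out in windows of MAX_CONCURRENCY.
  for (let chunkIndex = 1; chunkIndex < totalChunks; chunkIndex += MAX_CONCURRENCY) {
    const batchEnd = Math.min(chunkIndex + MAX_CONCURRENCY, totalChunks);
    const batch: Promise<ChunkResult>[] = [];
    for (let i = chunkIndex; i < batchEnd; i++) {
      const start = i * CHUNK_SIZE;
      const end = Math.min(start + CHUNK_SIZE, fileSize);
      batch.push(uploadChunk(i, start, end));
    }
    results.push(...(await Promise.all(batch))); // wait for the whole window before starting the next
  }
  return results;
}
```

One consequence of the windowed `Promise.all` (true of the diff as well): a single slow chunk stalls its entire window, so a fixed-width work queue would keep the pipe fuller at the cost of extra bookkeeping.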
115 changes: 91 additions & 24 deletions templates/web/src/client.ts.twig
@@ -296,6 +296,7 @@ class {{spec.title | caseUcfirst}}Exception extends Error {
*/
class Client {
static CHUNK_SIZE = 1024 * 1024 * 5;
Copilot AI Aug 10, 2025

The MAX_CONCURRENCY constant lacks documentation explaining why 6 was chosen as the default value and how it affects performance or server load.

Suggested change
static CHUNK_SIZE = 1024 * 1024 * 5;
static CHUNK_SIZE = 1024 * 1024 * 5;
/**
* The maximum number of concurrent requests allowed.
*
* The default value of 6 is chosen as a balance between maximizing throughput
* and minimizing server load or rate-limiting issues. Increasing this value
* may improve performance for high-bandwidth clients or servers, but can
* also lead to higher resource usage and potential throttling by the server.
* Decreasing it can reduce load but may slow down operations that require
* multiple concurrent requests (such as multipart uploads).
*/


static MAX_CONCURRENCY = 6;

/**
* Holds configuration such as project.
@@ -639,38 +640,104 @@ class Client {
return await this.call(method, url, headers, originalPayload);
}

let start = 0;
let response = null;
const totalChunks = Math.ceil(file.size / Client.CHUNK_SIZE);

const firstChunkStart = 0;
const firstChunkEnd = Math.min(Client.CHUNK_SIZE, file.size);
const firstChunk = file.slice(firstChunkStart, firstChunkEnd);

const firstChunkHeaders = { ...headers };
firstChunkHeaders['content-range'] = `bytes ${firstChunkStart}-${firstChunkEnd - 1}/${file.size}`;

const firstPayload = { ...originalPayload };
firstPayload[fileParam] = new File([firstChunk], file.name);

const firstResponse = await this.call(method, url, firstChunkHeaders, firstPayload);

if (!firstResponse?.$id) {
throw new Error('First chunk upload failed - no ID returned');
Copilot AI Aug 10, 2025

The error message 'First chunk upload failed - no ID returned' could be more helpful by including the actual response or HTTP status code to aid debugging.

Suggested change
throw new Error('First chunk upload failed - no ID returned');
throw new Error(
`First chunk upload failed - no ID returned. Response: ${JSON.stringify(firstResponse)}`
);


}

while (start < file.size) {
let end = start + Client.CHUNK_SIZE; // Prepare end for the next chunk
if (end >= file.size) {
end = file.size; // Adjust for the last chunk to include the last byte
}
let completedChunks = 1;
let totalUploaded = firstChunkEnd;

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: firstResponse.$id,
progress: Math.round((totalUploaded / file.size) * 100),
sizeUploaded: totalUploaded,
chunksTotal: totalChunks,
chunksUploaded: completedChunks
});
}

headers['content-range'] = `bytes ${start}-${end-1}/${file.size}`;
const chunk = file.slice(start, end);
if (totalChunks === 1) {
return firstResponse;
}

let payload = { ...originalPayload };
payload[fileParam] = new File([chunk], file.name);
let response = firstResponse;

for (let chunkIndex = 1; chunkIndex < totalChunks; chunkIndex += Client.MAX_CONCURRENCY) {
const batchEnd = Math.min(chunkIndex + Client.MAX_CONCURRENCY, totalChunks);

const batchPromises = [];
for (let i = chunkIndex; i < batchEnd; i++) {
const start = i * Client.CHUNK_SIZE;
const end = Math.min(start + Client.CHUNK_SIZE, file.size);

batchPromises.push((async () => {
const chunk = file.slice(start, end);
const chunkHeaders = { ...headers };
chunkHeaders['content-range'] = `bytes ${start}-${end - 1}/${file.size}`;
chunkHeaders['x-{{spec.title | caseLower}}-id'] = firstResponse.$id;

const payload = { ...originalPayload };
payload[fileParam] = new File([chunk], file.name);

try {
const chunkResponse = await this.call(method, url, chunkHeaders, payload);
return {
success: true,
response: chunkResponse,
chunkInfo: { index: i, start, end },
error: null
};
} catch (error) {
return {
success: false,
response: null,
chunkInfo: { index: i, start, end },
error
};
}
})());
}

response = await this.call(method, url, headers, payload);
const batchResults = await Promise.all(batchPromises);

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: response.$id,
progress: Math.round((end / file.size) * 100),
sizeUploaded: end,
chunksTotal: Math.ceil(file.size / Client.CHUNK_SIZE),
chunksUploaded: Math.ceil(end / Client.CHUNK_SIZE)
});
const failures = batchResults.filter(result => !result.success);
if (failures.length > 0) {
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error}`);
Copilot AI Aug 10, 2025

The error object is being directly concatenated to a string, which may not provide meaningful error information. Consider using f.error.message || f.error.toString() for better error reporting.

Suggested change
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error}`);
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error && (f.error.message || f.error.toString())}`);


throw new Error(`Chunk upload failures: ${errorMessages.join(', ')}`);
}

if (response && response.$id) {
headers['x-{{spec.title | caseLower }}-id'] = response.$id;
for (const result of batchResults) {
if (result.success) {
completedChunks++;
totalUploaded += (result.chunkInfo.end - result.chunkInfo.start);
response = result.response;

if (onProgress && typeof onProgress === 'function') {
onProgress({
$id: firstResponse.$id,
progress: Math.round((totalUploaded / file.size) * 100),
sizeUploaded: totalUploaded,
chunksTotal: totalChunks,
chunksUploaded: completedChunks
});
}
}
}

start = end;
}

return response;
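Both stringification comments above come down to how JavaScript renders a caught value inside a template literal: an `Error` interpolates as `"Error: <message>"`, while a thrown non-Error object becomes `"[object Object]"`. A small standalone illustration (not SDK code):

```typescript
// Direct interpolation relies on the caught value's toString().
function describeRaw(error: unknown): string {
  return `Chunk 3: ${error}`;
}

// Pulling out .message drops the "Error: " prefix and reads better when
// several failures are joined into one message; note that a thrown plain
// object is still unhelpful either way.
function describeMessage(error: unknown): string {
  const message = error instanceof Error ? error.message : String(error);
  return `Chunk 3: ${message}`;
}
```

This is why the suggested `f.error?.message || f.error?.toString()` mainly pays off when the rejection reason is a real `Error`; for arbitrary thrown values the output is equally opaque under either approach.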