[FEATURE] Recursive directories upload for S3 - Fixed #15700
Conversation
Walkthrough
This update augments the S3 file upload functionality by introducing a new method, getFilesRecursive, which recursively collects file paths from a directory so that entire folders can be uploaded to S3.
Changes
Sequence Diagram(s)
```mermaid
sequenceDiagram
    participant C as Caller
    participant U as uploadFolderFiles
    participant R as getFilesRecursive
    participant FS as FileSystem (fs)
    C->>U: Call uploadFolderFiles(folderPath)
    U->>R: getFilesRecursive(folderPath)
    R->>FS: readdirSync(dir)
    Note right of R: Check each entry for directory or file
    FS-->>R: Return directory entries and stats
    R-->>U: Return full list of file paths (recursive)
    U->>U: Construct S3 key using relative paths
    U->>C: Proceed with file uploads to S3
```
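In code terms, the flow in the diagram amounts to roughly the following standalone sketch. This is illustrative only, not the component's exact implementation: `uploadFolder` is a hypothetical wrapper name, `bucket` and `prefix` are assumed parameters, and the injected `uploadFile` callback stands in for the component's S3 upload helper that accepts `{ Bucket, Key, Body }`.

```js
import fs from "fs";
import { join } from "path";

// Recursively collect the absolute path of every file under `dir`
// (synchronous walk, mirroring the component's approach; symlinks are not special-cased).
function getFilesRecursive(dir) {
  let results = [];
  for (const item of fs.readdirSync(dir)) {
    const itemPath = join(dir, item);
    if (fs.statSync(itemPath).isDirectory()) {
      results = results.concat(getFilesRecursive(itemPath));
    } else {
      results.push(itemPath);
    }
  }
  return results;
}

// Upload every file under `folderPath`, keying each S3 object by its path
// relative to the folder, optionally nested under `prefix`.
async function uploadFolder({ uploadFile, bucket, folderPath, prefix = "" }) {
  const files = getFilesRecursive(folderPath);
  return Promise.all(files.map(async (filePath) => {
    const relativePath = filePath.substring(folderPath.length + 1);
    const s3Key = join(prefix, relativePath);
    await uploadFile({
      Bucket: bucket,
      Key: s3Key,
      Body: fs.readFileSync(filePath),
    });
    return { filePath, s3Key, status: "uploaded" };
  }));
}
```

The key detail is that each S3 key is derived from the file's path relative to the uploaded folder, so the directory structure is preserved in the bucket.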
Warning
There were issues while running some tools. Please review the errors and either fix the tool’s configuration or disable the tool if it’s a critical failure.
🔧 ESLint
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs
Oops! Something went wrong! :( ESLint: 8.57.1
Error [ERR_MODULE_NOT_FOUND]: Cannot find package 'jsonc-eslint-parser' imported from /eslint.config.mjs
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (2)
- components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (2 hunks)
- components/aws/package.json (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- components/aws/package.json
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: pnpm publish
- GitHub Check: Verify TypeScript components
- GitHub Check: Publish TypeScript components
🔇 Additional comments (1)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (1)
15-15: LGTM! Version increment is appropriate.
The version bump from 1.0.2 to 1.0.3 correctly reflects the addition of the recursive directory upload feature.
Force-pushed from 7743fed to 143ac68 (Compare)
Actionable comments posted: 0
♻️ Duplicate comments (2)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (2)
40-53: 🛠️ Refactor suggestion
Consider improving robustness and performance of recursive file listing.
While the implementation is functional, it has several limitations:
- Uses synchronous file operations that can block the event loop
- Lacks error handling for file system operations
- Could encounter infinite loops with symbolic links
- Missing input validation
```diff
-    getFilesRecursive(dir) {
-      let results = [];
-      const items = fs.readdirSync(dir);
-      for (const item of items) {
-        const itemPath = join(dir, item);
-        const stat = fs.statSync(itemPath);
-        if (stat.isDirectory()) {
-          results = results.concat(this.getFilesRecursive(itemPath));
-        } else {
-          results.push(itemPath);
-        }
-      }
-      return results;
+    async getFilesRecursive(dir) {
+      if (!dir) throw new Error('Directory path is required');
+
+      try {
+        const results = [];
+        const items = await fs.promises.readdir(dir);
+
+        await Promise.all(items.map(async (item) => {
+          const itemPath = join(dir, item);
+          const stat = await fs.promises.lstat(itemPath);
+
+          if (stat.isSymbolicLink()) {
+            // Skip symbolic links to prevent infinite loops
+            return;
+          }
+
+          if (stat.isDirectory()) {
+            const subDirFiles = await this.getFilesRecursive(itemPath);
+            results.push(...subDirFiles);
+          } else {
+            results.push(itemPath);
+          }
+        }));
+
+        return results;
+      } catch (error) {
+        throw new Error(`Failed to read directory: ${error.message}`);
+      }
+    },
```
54-81: 🛠️ Refactor suggestion
Enhance reliability and user experience of folder uploads.
The current implementation has several limitations:
- No error handling for individual file uploads
- No progress tracking for large directories
- No size limits or validation
- Using Promise.all for all files could cause memory issues with large directories
```diff
-      const files = this.getFilesRecursive(folderPath);
-      const response = await Promise.all(files.map(async (filePath) => {
-        const fileContent = fs.readFileSync(filePath, {
-          encoding: "base64",
-        });
-        const relativePath = filePath.substring(folderPath.length + 1);
-        const s3Key = join(prefix, relativePath);
-
-        await uploadFile({
-          Bucket: bucket,
-          Key: s3Key,
-          Body: Buffer.from(fileContent, "base64"),
-        });
-        return {
-          filePath,
-          s3Key,
-          status: "uploaded",
-        };
-      }));
+      const MAX_BATCH_SIZE = 10; // Process 10 files at a time
+      const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100MB limit per file
+
+      const files = await this.getFilesRecursive(folderPath);
+      const response = [];
+
+      // Process files in batches
+      for (let i = 0; i < files.length; i += MAX_BATCH_SIZE) {
+        const batch = files.slice(i, i + MAX_BATCH_SIZE);
+        const batchResults = await Promise.all(batch.map(async (filePath) => {
+          try {
+            const stat = await fs.promises.stat(filePath);
+            if (stat.size > MAX_FILE_SIZE) {
+              return {
+                filePath,
+                status: "skipped",
+                error: "File exceeds size limit"
+              };
+            }
+
+            const fileContent = await fs.promises.readFile(filePath, {
+              encoding: "base64",
+            });
+            const relativePath = filePath.substring(folderPath.length + 1);
+            const s3Key = join(prefix, relativePath);
+
+            await uploadFile({
+              Bucket: bucket,
+              Key: s3Key,
+              Body: Buffer.from(fileContent, "base64"),
+            });
+
+            // Update progress
+            $.export("$summary", `Uploaded ${i + batch.indexOf(filePath) + 1}/${files.length} files`);
+
+            return {
+              filePath,
+              s3Key,
+              status: "uploaded",
+            };
+          } catch (error) {
+            return {
+              filePath,
+              status: "failed",
+              error: error.message
+            };
+          }
+        }));
+        response.push(...batchResults);
+      }
```
🧹 Nitpick comments (1)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (1)
90-94: Maintain consistency in file reading approach.
The uploadSingleFile method uses synchronous file reading while we're recommending asynchronous operations in other methods. For consistency, consider using async file operations here as well.
```diff
-      const file = fs.readFileSync(filePath, {
-        encoding: "base64",
-      });
+      const file = await fs.promises.readFile(filePath, {
+        encoding: "base64",
+      });
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (2)
- components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (3 hunks)
- components/aws/package.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- components/aws/package.json
⏰ Context from checks skipped due to timeout of 90000ms (4)
- GitHub Check: pnpm publish
- GitHub Check: Verify TypeScript components
- GitHub Check: Lint Code Base
- GitHub Check: Publish TypeScript components
🔇 Additional comments (6)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (6)
10-10: Component name updated and version incremented accordingly.
The name change from "Upload File" to "Upload Files" accurately reflects the new capability to handle multiple files. Version increment from 1.0.2 to 1.0.3 follows semantic versioning for feature additions.
Also applies to: 15-15
28-29: Props descriptions updated to match new functionality.
The label and description updates clearly communicate that the component now accepts both file and folder paths, which aligns with the new recursive directory upload feature.
111-113: Good error handling for non-existent paths.
The explicit check for file existence with a clear error message improves user experience.
114-117: Good implementation of conditional path handling.
The implementation correctly determines whether to use folder or file upload based on the path type.
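Taken together, the checks praised in these two comments can be pictured roughly as the following sketch of the run() body, assembled from the review's own snippets. The destructured names and the ConfigurationError import (from @pipedream/platform) are assumptions drawn from context, not a quote of the component's exact code.

```js
async run({ $ }) {
  // Props and methods assumed to be destructured from the component instance.
  const { path, uploadSingleFile, uploadFolderFiles } = this;

  // 111-113: fail fast with a clear message when the path does not exist.
  if (!fs.existsSync(path)) {
    throw new ConfigurationError(`The file or directory path \`${path}\` does not exist. Please verify the path and include the leading /tmp if needed.`);
  }

  // 114-117: branch on the path type; directories are uploaded recursively.
  const stat = fs.statSync(path);
  return stat.isDirectory()
    ? await uploadFolderFiles($, path)
    : await uploadSingleFile($, path);
}
```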
60-60: Ensure synchronization between method signatures.
If you implement the async version of getFilesRecursive, you'll need to update this line to use await:
```diff
-      const files = this.getFilesRecursive(folderPath);
+      const files = await this.getFilesRecursive(folderPath);
```
106-110: Props destructuring is clean and concise.
The destructuring of methods and props is a good practice that makes the code more readable.
Force-pushed from 143ac68 to 3dd34e9 (Compare)
/approve
Actionable comments posted: 0
♻️ Duplicate comments (2)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (2)
40-53: 🛠️ Refactor suggestion
Consider improving robustness and performance of recursive file listing.
While the implementation is functional, there are several improvements that would make it more robust:
- Use async file operations to avoid blocking the event loop
- Add error handling for file system operations
- Add input validation for the directory path
- Handle symbolic links to prevent infinite loops
```diff
-    getFilesRecursive(dir) {
-      let results = [];
-      const items = fs.readdirSync(dir);
-      for (const item of items) {
-        const itemPath = join(dir, item);
-        const stat = fs.statSync(itemPath);
-        if (stat.isDirectory()) {
-          results = results.concat(this.getFilesRecursive(itemPath));
-        } else {
-          results.push(itemPath);
-        }
-      }
-      return results;
+    async getFilesRecursive(dir) {
+      if (!dir) throw new Error('Directory path is required');
+
+      try {
+        const results = [];
+        const items = await fs.promises.readdir(dir);
+
+        await Promise.all(items.map(async (item) => {
+          const itemPath = join(dir, item);
+          try {
+            const stat = await fs.promises.lstat(itemPath);
+
+            if (stat.isSymbolicLink()) {
+              // Skip symbolic links to prevent infinite loops
+              return;
+            }
+
+            if (stat.isDirectory()) {
+              const subDirFiles = await this.getFilesRecursive(itemPath);
+              results.push(...subDirFiles);
+            } else {
+              results.push(itemPath);
+            }
+          } catch (error) {
+            // Log error but continue processing other files
+            console.error(`Error processing ${itemPath}: ${error.message}`);
+          }
+        }));
+
+        return results;
+      } catch (error) {
+        throw new Error(`Failed to read directory: ${error.message}`);
+      }
+    },
```
54-78: 🛠️ Refactor suggestion
Enhance reliability and user experience of folder uploads.
The implementation could benefit from:
- Error handling for individual file uploads
- Progress tracking for large directories
- Size limits and validation
- Batched uploads instead of Promise.all to manage memory usage
```diff
-      const files = this.getFilesRecursive(folderPath);
-      const response = await Promise.all(files.map(async (filePath) => {
-        const fileContent = fs.readFileSync(filePath, {
-          encoding: "base64",
-        });
-        const relativePath = filePath.substring(folderPath.length + 1);
-        const s3Key = join(prefix, relativePath);
-
-        await uploadFile({
-          Bucket: bucket,
-          Key: s3Key,
-          Body: Buffer.from(fileContent, "base64"),
-        });
-        return {
-          filePath,
-          s3Key,
-          status: "uploaded",
-        };
-      }));
+      const files = await this.getFilesRecursive(folderPath);
+      const totalFiles = files.length;
+      const response = [];
+
+      // Process files in batches to manage memory
+      const BATCH_SIZE = 10;
+
+      for (let i = 0; i < files.length; i += BATCH_SIZE) {
+        const batch = files.slice(i, i + BATCH_SIZE);
+        const batchResults = await Promise.all(batch.map(async (filePath) => {
+          try {
+            // Update progress
+            $.export("$summary", `Uploading file ${i + batch.indexOf(filePath) + 1} of ${totalFiles}`);
+
+            const stat = await fs.promises.stat(filePath);
+            const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100MB limit
+
+            if (stat.size > MAX_FILE_SIZE) {
+              return {
+                filePath,
+                status: "skipped",
+                error: "File exceeds size limit"
+              };
+            }
+
+            const fileContent = await fs.promises.readFile(filePath, {
+              encoding: "base64",
+            });
+            const relativePath = filePath.substring(folderPath.length + 1);
+            const s3Key = join(prefix, relativePath);
+
+            await uploadFile({
+              Bucket: bucket,
+              Key: s3Key,
+              Body: Buffer.from(fileContent, "base64"),
+            });
+
+            return {
+              filePath,
+              s3Key,
+              status: "uploaded",
+            };
+          } catch (error) {
+            console.error(`Failed to upload ${filePath}: ${error.message}`);
+            return {
+              filePath,
+              status: "failed",
+              error: error.message
+            };
+          }
+        }));
+        response.push(...batchResults);
+      }
```
🧹 Nitpick comments (3)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (3)
90-94: Consider using async file operations for consistency.
For consistency with the proposed changes in other methods, consider using async file operations here too.
```diff
-      const file = fs.readFileSync(filePath, {
-        encoding: "base64",
-      });
-      const filename = customFilename || filePath.split("/").pop();
+      const file = await fs.promises.readFile(filePath, {
+        encoding: "base64",
+      });
+      const filename = customFilename || filePath.split("/").pop();
```
111-113: Consider using async file operations for consistency.
The file existence check could also be converted to use promises for consistency.
```diff
-      if (!fs.existsSync(path)) {
-        throw new ConfigurationError(`The file or directory path \`${path}\` does not exist. Please verify the path and include the leading /tmp if needed.`);
-      }
+      try {
+        await fs.promises.access(path);
+      } catch (error) {
+        throw new ConfigurationError(`The file or directory path \`${path}\` does not exist. Please verify the path and include the leading /tmp if needed.`);
+      }
```
10-29: Add file type restrictions to prevent security issues.
Consider adding validation to ensure only safe file types are being uploaded, especially when handling recursive uploads which could include unexpected file types.
You could add a safeFileTypes array and validation logic in both upload methods:
```js
// Add to props or methods
safeFileTypes: ['.jpg', '.jpeg', '.png', '.pdf', '.txt', '.csv', '.json', '.xml', '.html', '.css', '.js'],

// Example validation function to add to methods
isFileTypeAllowed(filePath) {
  const ext = filePath.toLowerCase().split('.').pop();
  return this.safeFileTypes.includes(`.${ext}`);
},

// Then in your upload methods, add a check:
if (!this.isFileTypeAllowed(filePath)) {
  return {
    filePath,
    status: "skipped",
    error: "File type not allowed"
  };
}
```
This helps prevent the upload of potentially harmful files and provides better security.
Also applies to: 105-118
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (3 hunks)
- components/aws/package.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- components/aws/package.json
⏰ Context from checks skipped due to timeout of 90000ms (4)
- GitHub Check: pnpm publish
- GitHub Check: Verify TypeScript components
- GitHub Check: Lint Code Base
- GitHub Check: Publish TypeScript components
🔇 Additional comments (4)
components/aws/actions/s3-upload-file-tmp/s3-upload-file-tmp.mjs (4)
10-10: Name updated to reflect new multi-file capabilities.
The component's name has been updated from "S3 - Upload File - /tmp" to "S3 - Upload Files - /tmp" to reflect the new capability of handling multiple files through recursive directory uploads.
15-15: Version number incremented appropriately.
Version bumped from "1.0.2" to "1.0.3" to reflect the new feature addition.
28-29: Updated label and description to indicate directory support.
The label and description now clearly indicate that the path can be either a file or a directory, which aligns with the new functionality.
114-117: Ensure consistent async/await usage.
If you implement the async version of getFilesRecursive, make sure to properly await it here and update this code accordingly.
```diff
-      const stat = fs.statSync(path);
-      return stat.isDirectory()
-        ? await uploadFolderFiles($, path)
-        : await uploadSingleFile($, path);
+      const stat = await fs.promises.stat(path);
+      return stat.isDirectory()
+        ? await uploadFolderFiles($, path)
+        : await uploadSingleFile($, path);
```
WHY
Resolves #15679
Summary by CodeRabbit
New Features
Chores