
chore: setup changesets for automated versioning and npm publishing (#128)#1

Merged
Cybersoulja merged 27 commits into main from claude/add-claude-documentation-KjgQv
Apr 6, 2026

Conversation

@Cybersoulja
Owner

@Cybersoulja Cybersoulja commented Apr 6, 2026

Replaces tag-based publishing with changesets workflow. Adds publint
for package validation, changelog-github plugin for PR-linked changelogs,
and npm trusted publishing via OIDC provenance.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

Summary by Sourcery

Introduce object duplication capabilities in the dashboard and backend, strengthen metadata and share-link handling in the worker API, and expand automated testing and release workflows with Changesets-driven publishing.

New Features:

  • Add file and folder duplication from the dashboard context menu backed by a new bucket copy API endpoint.
  • Introduce Playwright-based E2E test suite and Vitest-powered component/unit tests for the dashboard, plus helpers and configuration for local E2E runs.

Bug Fixes:

  • Prevent share-link download counters from incrementing when the underlying file is missing.
  • Return HTTP 400 responses when upload endpoints receive malformed base64-encoded custom or HTTP metadata instead of generic server errors.
  • Harden key decoding logic for head and metadata endpoints to better handle edge-case encodings and align with object retrieval behavior.
  • Sanitize Content-Disposition filenames to avoid header injection and ensure proper Unicode handling.
  • Guard the email reception handler against missing R2 bucket bindings to avoid runtime crashes.

Enhancements:

  • Type share-link metadata via a new ShareMetadata interface and tighten middleware typing for read-only handling.
  • Expose documentation files in the published worker package and include publint validation in the release pipeline.
  • Refine dashboard downloads to use the centralized API handler with error notifications, and improve bucket listing by removing an incorrect CORS header.

Build:

  • Switch release workflow from tag-based publishing to a Changesets-driven pipeline that validates, versions, and publishes via pnpm with npm provenance.
  • Add publint as a package validation step in CI and wire changesets commands into the root and worker package scripts.
  • Extend the build workflow to run dashboard tests, introduce a dedicated E2E workflow job with Playwright reports, and upgrade artifact upload actions.

CI:

  • Add a Changesets check workflow on pull requests to enforce presence of a changeset for relevant changes.

Deployment:

  • Provide a dedicated Wrangler config and dev entrypoint for running the worker in E2E environments with the built dashboard attached.

Documentation:

  • Document project architecture, workflows, and contributor guidelines in a new CLAUDE.md file and mark internal packages as private in their manifests.

Tests:

  • Add extensive Vitest coverage for dashboard routing, core file UI components, and stores, along with shared testing helpers and setup.
  • Introduce integration tests for copy-object behavior and for malformed metadata handling in upload and multipart endpoints.

G4brym and others added 27 commits February 28, 2026 16:49
…4brym#128)

Replaces tag-based publishing with changesets workflow. Adds publint
for package validation, changelog-github plugin for PR-linked changelogs,
and npm trusted publishing via OIDC provenance.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
)

* fix: use authenticated fetch for context menu downloads

When basic auth is enabled, the downloadObject method in FileContextMenu
created a direct <a> link to the API endpoint. Browser-initiated downloads
via <a> links do not include custom Authorization headers, causing 401
errors (especially in Chrome).

This changes the download to use apiHandler.downloadFile() which goes
through the Axios instance with proper auth headers already configured,
then creates a temporary blob URL for the browser download.

Fixes G4brym#63
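The fix described above can be sketched roughly as follows. This is a hypothetical illustration, not the project's verbatim code: the `apiHandler.downloadFile` signature and the helper names are assumptions.

```typescript
// Pure helper: derive the download filename from an object key.
function fileNameFromKey(key: string): string {
  return key.split("/").pop() ?? key;
}

declare const document: any; // browser global, absent in Node

// Hypothetical sketch: route the download through the authenticated
// axios-backed handler, then hand the bytes to the browser via a
// temporary object URL. A plain <a href> navigation to the API would
// not carry the Authorization header; this path does.
async function downloadObject(
  apiHandler: { downloadFile: (bucket: string, key: string) => Promise<Blob> },
  bucket: string,
  key: string,
): Promise<void> {
  const blob = await apiHandler.downloadFile(bucket, key); // auth headers applied
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = fileNameFromKey(key);
  a.click();
  URL.revokeObjectURL(url); // release the blob once the click is dispatched
}
```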

* Add changeset for basic auth download fix

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Debian <openclaw@openclaw.1.1.1.1>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Ensures every PR includes a changeset file, preventing merges
that skip versioning. Runs `changeset status --since=origin/main`
on all PRs.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Updates the engines field in the dashboard package.json from
"^20 || ^18 || ^16 || ^14.19" to "^24 || ^22 || ^20 || ^18",
fixing the publish CI which uses Node 24.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
* fix: add missing publint script to package.json

The release workflow runs `pnpm --filter r2-explorer publint` but the
package had no "publint" script defined, causing CI to fail with
ERR_PNPM_RECURSIVE_RUN_NO_SCRIPT.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: add empty changeset

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
* test: add comprehensive dashboard UI testing infrastructure

Add Vitest component tests (110 tests across 13 files) covering utilities,
Pinia stores, Vue components, and router configuration. Add Playwright E2E
test scaffolding and integrate dashboard tests into CI pipeline.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: add empty changeset for testing infrastructure

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: add Playwright E2E tests to GitHub Actions

Add a separate CI job that builds the dashboard, installs Chromium, and
runs Playwright E2E tests against a local wrangler dev server. Includes
a dedicated E2E wrangler config without auth for CI compatibility.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: make E2E tests work by using worker dev directory for deps

Move E2E worker entry and wrangler config to packages/worker/dev/ so
worker dependencies (hono, chanfana, zod) resolve correctly. Fix app-loads
test to use getByRole for strict element matching. Remove login E2E tests
since auth is not configured in the E2E environment.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* test: add comprehensive E2E tests covering all dashboard features

Add 35 Playwright E2E tests across 10 spec files covering:
- File operations: create folder/file, delete, rename
- File preview: text, JSON, markdown with filename header
- File browsing: breadcrumbs, table columns, sorting
- Navigation: folder drill-down, breadcrumb back, sidebar
- Context menu: file vs folder menu items, open actions
- Search: prefix search and clear
- Metadata: open dialog, add custom metadata
- Share links: create dialog, generate link, manage shares
- Email: list with sender/subject, detail view, navigation

Also adds API helpers (helpers.ts) for seeding test data via
the worker upload/folder/delete endpoints.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: add testing requirements to CLAUDE.md

Require E2E test coverage for any UI changes and document
test commands for component and E2E tests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* test: expand E2E coverage with upload, download, preview, email, and share link tests

Add 11 new E2E tests covering file upload/download, CSV/HTML preview,
file edit/save/cancel, email body content, mark read/unread, share link
options (expiration/password/max downloads), manage dialog, and revocation.
Fix strict mode violations in app-loads and share-links tests.
Set workers to 1 to prevent parallel race conditions on shared R2 bucket.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
…m#134)

* chore: include CHANGELOG.md and markdown files in docs package

Add "files" field to docs package.json to ensure CHANGELOG.md and all
.md files are included in the published package.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: include CHANGELOG.md in worker package, revert docs change

Add CHANGELOG.md to the worker package "files" field so it's included
in the published npm package. Revert the mistaken docs package change.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: copy CHANGELOG.md and docs markdown files into worker package

Update the worker build script to copy CHANGELOG.md from root and all
docs markdown files into a docs/ directory, so they are bundled in the
published npm package. Add the generated paths to .gitignore.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
)

The build script was copying CHANGELOG.md from the repo root, but
changesets generates it directly in packages/worker/. Removed the
redundant copy that referenced a non-existent root file.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Mark docs-new and r2-explorer-dev-worker as private to prevent
changesets from attempting to publish them to npm. Also add
r2-explorer-dev-worker to the changeset ignore list.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…G4brym#144)

The `c.header("Access-Control-Allow-Credentials", "asads")` line in the
ListObjects handler sets an invalid value for the header (valid values
are "true" or the header should be omitted). This appears to be a debug
leftover. CORS is already properly handled by the hono cors() middleware
when enabled via config.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
…#149)

Wrap base64 + JSON decoding of customMetadata and httpMetadata query
parameters in try-catch blocks for PutObject and CreateUpload endpoints.
Previously, malformed input would crash with an unhandled exception
returning a 500 error. Now returns a descriptive 400 error instead.
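A minimal sketch of the hardened parsing, assuming the dashboard sends metadata as base64-encoded JSON; the function name and the result shape here are illustrative, not the worker's actual code:

```typescript
// Hypothetical sketch: decode the metadata query parameter, mapping any
// decode/parse failure to a descriptive error instead of letting the
// exception escape as a 500.
type ParseResult =
  | { ok: true; value: Record<string, string> }
  | { ok: false; error: string };

function parseEncodedMetadata(param: string): ParseResult {
  try {
    const json = Buffer.from(param, "base64").toString("utf-8");
    const value = JSON.parse(json);
    if (typeof value !== "object" || value === null) {
      return { ok: false, error: "metadata must be a JSON object" };
    }
    return { ok: true, value };
  } catch {
    // The endpoint would turn this into an HTTP 400 response.
    return { ok: false, error: "invalid base64-encoded JSON metadata" };
  }
}
```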

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
)

If no R2 bucket is configured via emailRouting.targetBucket and no
bucket binding exists in the environment, the email handler would
crash with a TypeError when calling bucket.put() on undefined.

This adds a guard that throws a descriptive error message instead.
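The guard might look roughly like this (a sketch — the binding-lookup details and names are assumptions based on the commit message):

```typescript
// Minimal stand-in for the R2 bucket binding surface used here.
interface R2BucketLike {
  put(key: string, body: unknown): Promise<unknown>;
}

// Hypothetical sketch: resolve the email target bucket up front and fail
// with a descriptive error instead of a TypeError from calling
// bucket.put() on undefined deeper in the handler.
function resolveEmailBucket(
  env: Record<string, R2BucketLike | undefined>,
  targetBucket?: string,
): R2BucketLike {
  const bucket = targetBucket ? env[targetBucket] : undefined;
  if (!bucket) {
    throw new Error(
      `No R2 bucket binding available for email routing (targetBucket: ${targetBucket ?? "unset"})`,
    );
  }
  return bucket;
}
```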

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
…G4brym#147)

- Fix OpenAPI summary in HeadObject from "Get Object" to "Head Object"
- Add missing third-level fallback for base64 key decoding in HeadObject
  and PutMetadata to match GetObject's decoding logic, preventing potential
  failures with certain encoded file keys
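The layered decode can be sketched like this, assuming keys arrive base64-encoded as UTF-8 bytes (the way the dashboard encodes them); the real fallback chain in GetObject may differ in its exact steps:

```typescript
// Hypothetical sketch of a multi-step key decode with fallbacks.
function decodeKey(encoded: string): string {
  // 1st attempt: key was base64-encoded UTF-8 (the dashboard's encoding).
  try {
    const utf8 = Buffer.from(encoded, "base64").toString("utf-8");
    if (utf8.length > 0 && !utf8.includes("\uFFFD")) return utf8;
  } catch {
    /* fall through */
  }
  // 2nd attempt: latin1 interpretation for bytes that are not valid UTF-8.
  try {
    const latin1 = Buffer.from(encoded, "base64").toString("latin1");
    if (latin1.length > 0) return latin1;
  } catch {
    /* fall through */
  }
  // 3rd attempt: treat the value as an already-decoded plain key.
  return encoded;
}
```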

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
…eck (G4brym#146)

The download counter was incremented before verifying the shared file
still exists in the bucket. If the file was deleted, accessing the share
link would increment the counter and eventually exhaust the download
limit without any actual downloads. Move file retrieval before counter
increment so the count only increases on successful downloads.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 6 to 7.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](actions/upload-artifact@v6...v7)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: '7'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 4 to 6.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](actions/setup-node@v4...v6)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](actions/checkout@v4...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
* Fix Content-Disposition header injection in GetObject

The filename in the Content-Disposition header was interpolated
directly without sanitization. Filenames containing double quotes
could break header parsing or enable header injection.

This fix:
- Strips non-ASCII characters and replaces double quotes in the
  ASCII `filename` parameter for compatibility
- Adds RFC 5987 `filename*` parameter with proper UTF-8 percent
  encoding, matching the approach used in getShareLink.ts
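The two-parameter header described above can be sketched as follows — a hedged illustration of the approach (ASCII fallback plus RFC 5987 `filename*`), not the project's exact code:

```typescript
// Hypothetical sketch: build a Content-Disposition header that cannot be
// broken out of by quotes and still carries the full Unicode name.
function contentDisposition(filename: string): string {
  // ASCII fallback: drop non-ASCII characters and neutralize double
  // quotes so the quoted-string parameter stays well-formed.
  const ascii = filename.replace(/[^\x20-\x7e]/g, "").replace(/"/g, "'");
  // RFC 5987 extended parameter: full UTF-8 name, percent-encoded.
  const utf8 = encodeURIComponent(filename);
  return `attachment; filename="${ascii}"; filename*=UTF-8''${utf8}`;
}
```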

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Fix CI: update test for new Content-Disposition format and add changeset

- Update GetObject test to expect the sanitized filename with RFC 5987
  filename* parameter added by the header injection fix
- Add required changeset for the patch release

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
- Add ShareMetadata interface to types.d.ts for share link metadata
- Replace `any` type in getShareLink.ts with ShareMetadata
- Type parsed JSON in listShares.ts with ShareMetadata
- Type share metadata object in createShareLink.ts with ShareMetadata
- Fix readOnlyMiddleware to use Hono's Next type instead of CallableFunction
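Going by the field names listed in the reviewer's guide for this PR, the interface presumably looks something like this sketch (the optionality of each field is an assumption):

```typescript
// Sketch of the ShareMetadata shape replacing `any` in the share handlers.
interface ShareMetadata {
  bucket: string;
  key: string;
  expiresAt?: number; // absent = link never expires
  passwordHash?: string; // set only for password-protected links
  maxDownloads?: number;
  currentDownloads: number;
  createdBy?: string;
  createdAt: number;
}

// Typed parse at the boundary instead of an untyped JSON.parse result.
function parseShareMetadata(json: string): ShareMetadata {
  return JSON.parse(json) as ShareMetadata;
}
```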
Add a "Duplicate" context menu option that copies files and folders within
the same bucket, appending " (copy)" to the name while preserving all metadata.

- New CopyObject backend endpoint (POST /api/buckets/:bucket/copy)
- Frontend duplicate logic with progress tracking for folder duplication
- Backend integration tests (6 tests) and E2E test for file duplication
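The " (copy)" naming can be sketched like this — a hypothetical version of the logic; the real `generateCopyName` in FileOptions.vue may differ in details:

```typescript
// Hypothetical sketch: append " (copy)" to a key, before the extension
// for files and before the trailing slash for folder prefixes.
function generateCopyName(key: string, isFolder: boolean): string {
  if (isFolder) {
    return `${key.replace(/\/$/, "")} (copy)/`;
  }
  const dot = key.lastIndexOf(".");
  const slash = key.lastIndexOf("/");
  if (dot > slash) {
    // Preserve the extension: "a/report.pdf" -> "a/report (copy).pdf"
    return `${key.slice(0, dot)} (copy)${key.slice(dot)}`;
  }
  return `${key} (copy)`;
}
```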

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
…tants

Replaces the minimal 20-line CLAUDE.md with a full reference covering
repository structure, all development commands, package details, API
routes, build pipeline, code style conventions, important file locations,
and share link implementation notes.

https://claude.ai/code/session_018aqegB3K211sW1c1xf8Eux
@gitnotebooks

gitnotebooks bot commented Apr 6, 2026

Review these changes at https://app.gitnotebooks.com/Cybersoulja/R2-Explorer/pull/1

@sourcery-ai

sourcery-ai bot commented Apr 6, 2026

Reviewer's Guide

Replaces tag-triggered npm publishing with a Changesets-based release workflow (including publint validation and GitHub releases), adds comprehensive dashboard unit/E2E testing, and introduces several backend/frontend features and robustness fixes such as object copy/duplicate, safer metadata parsing, improved share link typing/behavior, and more resilient key decoding and email handling.

Sequence diagram for file and folder duplicate/copy operation

sequenceDiagram
  actor User
  participant FilesFolderPage
  participant FileContextMenu
  participant FileOptions
  participant apiHandler
  participant WorkerAPI
  participant R2Bucket

  User->>FileContextMenu: Right-click row
  User->>FileContextMenu: Click Duplicate
  FileContextMenu-->>FilesFolderPage: emit duplicateObject(row)
  FilesFolderPage-->>FileOptions: call duplicateObject(row)

  alt row.type == folder
    FileOptions->>FileOptions: generateCopyName(row.key, true)
    FileOptions->>apiHandler: fetchFile(selectedBucket, row.key, "")
    apiHandler->>WorkerAPI: GET /api/buckets/:bucket?prefix=...
    WorkerAPI->>R2Bucket: list(prefix)
    R2Bucket-->>WorkerAPI: folderContents
    WorkerAPI-->>apiHandler: folderContents
    apiHandler-->>FileOptions: folderContents

    FileOptions->>apiHandler: createFolder(destPrefix, selectedBucket)
    apiHandler->>WorkerAPI: POST /api/buckets/:bucket/folder
    WorkerAPI->>R2Bucket: put(destPrefix)
    R2Bucket-->>WorkerAPI: ok

    loop each innerFile
      FileOptions->>FileOptions: newKey = innerFile.key.replace(sourcePrefix, destPrefix)
      FileOptions->>apiHandler: copyObject(selectedBucket, innerFile.key, newKey)
      apiHandler->>WorkerAPI: POST /api/buckets/:bucket/copy { sourceKey, destinationKey }
      WorkerAPI->>R2Bucket: get(sourceKey)
      R2Bucket-->>WorkerAPI: object
      WorkerAPI->>R2Bucket: put(destinationKey, body, metadata)
      R2Bucket-->>WorkerAPI: ok
      WorkerAPI-->>apiHandler: ok
      apiHandler-->>FileOptions: ok
      FileOptions->>FileOptions: update progress notification
    end

    FileOptions->>FileOptions: show "Folder duplicated" notification
  else row.type == file
    FileOptions->>FileOptions: destKey = generateCopyName(row.key, false)
    FileOptions->>apiHandler: copyObject(selectedBucket, row.key, destKey)
    apiHandler->>WorkerAPI: POST /api/buckets/:bucket/copy { sourceKey, destinationKey }
    WorkerAPI->>R2Bucket: get(sourceKey)
    R2Bucket-->>WorkerAPI: object
    WorkerAPI->>R2Bucket: put(destKey, body, metadata)
    R2Bucket-->>WorkerAPI: ok
    WorkerAPI-->>apiHandler: ok
    apiHandler-->>FileOptions: ok
    FileOptions->>FileOptions: show "File duplicated" notification
  end

  FileOptions->>FilesFolderPage: $bus.emit(fetchFiles)
  FilesFolderPage->>WorkerAPI: refresh listing
  WorkerAPI-->>FilesFolderPage: updated objects
  FilesFolderPage-->>User: updated file list with copy

Sequence diagram for safer object download and share link access

sequenceDiagram
  actor User
  participant Dashboard
  participant apiHandler
  participant WorkerAPI
  participant R2Bucket

  User->>Dashboard: Click Download in context menu
  Dashboard->>apiHandler: downloadFile(bucket, key, { downloadType: objectUrl })
  apiHandler->>WorkerAPI: GET /api/buckets/:bucket/:key
  WorkerAPI->>R2Bucket: get(filePath)
  R2Bucket-->>WorkerAPI: object
  WorkerAPI->>WorkerAPI: build headers
  WorkerAPI->>WorkerAPI: sanitize asciiFileName
  WorkerAPI->>WorkerAPI: set Content-Disposition with ascii and UTF-8 filename*
  WorkerAPI-->>apiHandler: Response with body and headers
  apiHandler-->>Dashboard: data
  Dashboard->>Dashboard: create Blob and object URL
  Dashboard->>Dashboard: trigger anchor click
  Dashboard-->>User: File download starts

  Note over User,WorkerAPI: Public share link access
  User->>WorkerAPI: GET /share/:shareId
  WorkerAPI->>R2Bucket: search .r2-explorer/sharable-links/:shareId.json across buckets
  R2Bucket-->>WorkerAPI: share metadata object
  WorkerAPI->>WorkerAPI: parse JSON as ShareMetadata
  WorkerAPI->>R2Bucket: get(shareMetadata.key)
  alt file exists
    R2Bucket-->>WorkerAPI: file
    WorkerAPI->>WorkerAPI: increment currentDownloads
    WorkerAPI->>R2Bucket: put updated ShareMetadata
    WorkerAPI-->>User: file response
  else file missing
    R2Bucket-->>WorkerAPI: null
    WorkerAPI-->>User: 404 Shared file not found
  end

Class diagram for updated bucket operations and share metadata types

classDiagram
  class OpenAPIRoute {
    <<abstract>>
    +schema any
    +getValidatedData(schemaType)
    +handle(c)
  }

  class ShareMetadata {
    +string bucket
    +string key
    +number expiresAt
    +string passwordHash
    +number maxDownloads
    +number currentDownloads
    +string createdBy
    +number createdAt
  }

  class CreateShareLink {
    +schema
    +handle(c: AppContext)
  }

  class GetShareLink {
    +schema
    +handle(c: AppContext)
  }

  class ListShares {
    +schema
    +handle(c: AppContext)
  }

  class CopyObject {
    +schema
    +handle(c: AppContext)
  }

  class GetObject {
    +schema
    +handle(c: AppContext)
  }

  class readOnlyMiddleware {
    +readOnlyMiddleware(c: AppContext, next: Next)
  }

  OpenAPIRoute <|-- CreateShareLink
  OpenAPIRoute <|-- GetShareLink
  OpenAPIRoute <|-- ListShares
  OpenAPIRoute <|-- CopyObject
  OpenAPIRoute <|-- GetObject

  CreateShareLink --> ShareMetadata : creates
  GetShareLink --> ShareMetadata : parses and updates
  ListShares --> ShareMetadata : parses list

  class AppContext {
    +env Record~string, R2Bucket~
    +req any
    +header(name: string, value: string)
    +get(key: string) any
  }

  class R2Bucket {
    +get(key: string) Promise~R2Object~
    +put(key: string, body: any, options: R2PutOptions) Promise~R2Object~
    +list(options: any) Promise~R2Objects~
    +head(key: string) Promise~R2Object~
    +createMultipartUpload(key: string, options: any)
  }

  CopyObject --> AppContext : uses
  CopyObject --> R2Bucket : get, put
  GetObject --> R2Bucket : get

  class R2Object {
    +any body
    +number size
    +string httpEtag
    +Record~string, string~ customMetadata
    +R2HTTPMetadata httpMetadata
    +writeHttpMetadata(headers: Headers)
  }

  R2Bucket --> R2Object

  class R2HTTPMetadata {
    +string contentType
    +string contentLanguage
    +string contentEncoding
    +string cacheControl
  }

  class Next {
    +call()
  }

  readOnlyMiddleware --> AppContext : reads config
  readOnlyMiddleware --> Next : invokes

File-Level Changes

Change Details Files
Add object copy/duplicate capability across API and dashboard UI.
  • Introduce CopyObject API route that copies an object while preserving metadata and register it on /api/buckets/:bucket/copy.
  • Expose apiHandler.copyObject helper and wire a Duplicate action into FileOptions/FileContextMenu, including name generation for copies and handling both files and folders.
  • Emit fetchFiles after duplication and ensure folder duplication copies each contained object while showing progress notifications.
packages/worker/src/modules/buckets/copyObject.ts
packages/worker/src/index.ts
packages/dashboard/src/appUtils.js
packages/dashboard/src/components/files/FileOptions.vue
packages/dashboard/src/pages/files/FileContextMenu.vue
packages/dashboard/src/pages/files/FilesFolderPage.vue
Improve file download behavior and HTTP headers for object responses.
  • Update dashboard downloadObject to use apiHandler.downloadFile and Blob/object URLs with error notifications instead of a raw anchor href to the API.
  • Sanitize filenames and add RFC 5987 filename* parameter in GetObject Content-Disposition header to avoid header injection and better support Unicode names.
  • Adjust integration tests to validate the new Content-Disposition format.
packages/dashboard/src/pages/files/FileContextMenu.vue
packages/worker/src/modules/buckets/getObject.ts
packages/worker/tests/integration/object.test.ts
Harden upload endpoints and key decoding while adding tests for malformed metadata handling.
  • Wrap JSON parsing of customMetadata and httpMetadata in PutObject and CreateUpload in try/catch and return HTTP 400 with clear error messages on invalid base64/JSON.
  • Align key decoding logic in HeadObject and PutMetadata with GetObject by adding multi-step fallbacks to handle edge-case encodings and correct the HeadObject OpenAPI summary.
  • Add integration tests asserting 400 responses and error messages for malformed metadata on single and multipart uploads.
packages/worker/src/modules/buckets/putObject.ts
packages/worker/src/modules/buckets/multipart/createUpload.ts
packages/worker/src/modules/buckets/headObject.ts
packages/worker/src/modules/buckets/putMetadata.ts
packages/worker/tests/integration/buckets.test.ts
packages/worker/tests/integration/multipart.test.ts
Strengthen share-link typing/behavior and readonly/email middleware safety.
  • Introduce a ShareMetadata TypeScript type and use it in getShareLink, createShareLink, and listShares instead of any, including typed JSON.parse casts.
  • Change readOnlyMiddleware to use Hono’s Next type and keep rejecting non-GET/HEAD methods when readonly is enabled.
  • Ensure getShareLink increments the download counter only after verifying the file exists, and add a guard in receiveEmail to error clearly when no target bucket binding is configured.
packages/worker/src/types.d.ts
packages/worker/src/modules/buckets/getShareLink.ts
packages/worker/src/modules/buckets/createShareLink.ts
packages/worker/src/modules/buckets/listShares.ts
packages/worker/src/foundation/middlewares/readonly.ts
packages/worker/src/modules/emails/receiveEmail.ts
Introduce Changesets-based automated versioning/publishing and enhance CI with validation and tests.
  • Replace tag-based publish workflow with a release job on pushes to main using changesets/action, running lint/build/publint and publishing via pnpm release with npm provenance and GitHub releases.
  • Add a Changeset Check workflow that enforces presence of changesets on pull requests except for docs/metadata-only changes.
  • Wire root scripts for changeset and release commands, add Changesets/Playwright dev dependencies, and update worker package build to ship docs as well as publint script.
  • Bump worker package version and adjust files list to include docs and CHANGELOG.
  • Add multiple concrete changeset files describing feature/fix changes implemented in this PR.
.github/workflows/publish.yml
.github/workflows/changeset-check.yml
package.json
packages/worker/package.json
.changeset/add-duplicate-copy-operation.md
.changeset/add-share-metadata-type.md
.changeset/fix-content-disposition-injection.md
.changeset/fix-head-object-key-decoding.md
.changeset/fix-receive-email-guard.md
.changeset/fix-share-download-counter.md
.changeset/handle-malformed-metadata.md
.changeset/README.md
.changeset/config.json
Add and wire up dashboard unit/E2E test infrastructure and configurations.
  • Enable Vitest-based unit testing for the dashboard with happy-dom, Vue Test Utils, and @vitejs/plugin-vue, including vitest.config.ts and a Jest-like setup file that mocks Quasar, axios, Pinia, router, and provides helper mount utilities.
  • Add targeted component and unit tests for routing, context menus, file listing behavior, stores, appUtils, and key UI components.
  • Introduce Playwright E2E configuration for the dashboard with a dev worker harness (wrangler-e2e.toml and dev entry) and helper utilities for seeding and cleaning bucket state.
  • Add an e2e job to the build workflow that builds the dashboard, installs browsers, runs Playwright tests, and uploads the report as an artifact.
packages/dashboard/package.json
packages/dashboard/vitest.config.ts
packages/dashboard/tests/setup.ts
packages/dashboard/tests/helpers.ts
packages/dashboard/tests/components/FilesFolderPage.test.ts
packages/dashboard/tests/components/FileContextMenu.test.ts
packages/dashboard/tests/components/BucketPicker.test.ts
packages/dashboard/tests/components/CreateFolder.test.ts
packages/dashboard/tests/components/FilePreview.test.ts
packages/dashboard/tests/components/LeftSidebar.test.ts
packages/dashboard/tests/components/LoginPage.test.ts
packages/dashboard/tests/components/MainLayout.test.ts
packages/dashboard/tests/components/Topbar.test.ts
packages/dashboard/tests/unit/routes.test.ts
packages/dashboard/tests/unit/appUtils.test.ts
packages/dashboard/tests/unit/auth-store.test.ts
packages/dashboard/tests/unit/main-store.test.ts
packages/dashboard/e2e/helpers.ts
packages/dashboard/playwright.config.ts
packages/dashboard/e2e/app-loads.spec.ts
packages/dashboard/e2e/context-menu.spec.ts
packages/dashboard/e2e/email.spec.ts
packages/dashboard/e2e/file-browsing.spec.ts
packages/dashboard/e2e/file-operations.spec.ts
packages/dashboard/e2e/file-preview.spec.ts
packages/dashboard/e2e/file-upload.spec.ts
packages/dashboard/e2e/metadata.spec.ts
packages/dashboard/e2e/navigation.spec.ts
packages/dashboard/e2e/search.spec.ts
packages/dashboard/e2e/share-links.spec.ts
.github/workflows/build.yml
packages/worker/dev/e2e.ts
packages/worker/dev/wrangler-e2e.toml
Miscellaneous repo and packaging improvements/documentation updates.
  • Mark docs and dev worker packages as private to prevent accidental publishing and adjust pnpm-lock and .gitignore as needed.
  • Add CLAUDE.md as a contributor-oriented project guide covering architecture, workflows, and key conventions.
  • Update docs package metadata and ensure worker build now copies docs into the published package including getting-started and guides content.
packages/docs/package.json
packages/worker/dev/package.json
CLAUDE.md
packages/worker/package.json
.gitignore
pnpm-lock.yaml

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

@Cybersoulja Cybersoulja merged commit 1eeb4b6 into main Apr 6, 2026
4 checks passed
@Cybersoulja Cybersoulja deleted the claude/add-claude-documentation-KjgQv branch April 6, 2026 02:09

@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 5 issues and left some high-level feedback:

  • In the new CopyObject route, the base64 decoding of sourceKey/destinationKey will throw on malformed input and surface as a 500; consider wrapping the decode in a try/catch and returning a 400 with a clear error message (similar to the custom/http metadata handlers) and returning a JSON response instead of the raw bucket.put result for consistency with other bucket endpoints.
  • The duplicateObject folder logic in FileOptions.vue performs a sequential copyObject for every item without any per-file error handling or cancellation; for large folders this could be slow and brittle, so it may be worth adding basic error reporting (e.g., stop and notify on first failure) or batching/parallelizing copies while keeping the progress UI accurate.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- In the new CopyObject route, the base64 decoding of `sourceKey`/`destinationKey` will throw on malformed input and surface as a 500; consider wrapping the decode in a try/catch and returning a 400 with a clear error message (similar to the custom/http metadata handlers) and returning a JSON response instead of the raw `bucket.put` result for consistency with other bucket endpoints.
- The `duplicateObject` folder logic in `FileOptions.vue` performs a sequential `copyObject` for every item without any per-file error handling or cancellation; for large folders this could be slow and brittle, so it may be worth adding basic error reporting (e.g., stop and notify on first failure) or batching/parallelizing copies while keeping the progress UI accurate.
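As a sketch of what the suggested batching with error reporting could look like (illustrative only - `copyObject`, the pair list, and the return shape are assumptions, not the dashboard's actual API):

```typescript
// Hypothetical sketch: copy a folder's objects in small parallel batches,
// reporting the first failing key instead of silently continuing.
type CopyFn = (source: string, dest: string) => Promise<void>;

async function copyAll(
	pairs: Array<[string, string]>,
	copyObject: CopyFn,
	batchSize = 5,
): Promise<{ copied: number; failed?: string }> {
	let copied = 0;
	for (let i = 0; i < pairs.length; i += batchSize) {
		const batch = pairs.slice(i, i + batchSize);
		// Parallelize within a batch; allSettled keeps sibling results.
		const results = await Promise.allSettled(
			batch.map(([src, dest]) => copyObject(src, dest)),
		);
		for (let j = 0; j < results.length; j++) {
			if (results[j].status === "rejected") {
				// Stop on first failure and surface the offending source key.
				return { copied, failed: batch[j][0] };
			}
			copied++;
		}
	}
	return { copied };
}
```

The `copied` count also gives the progress UI an accurate number of completed items when a failure aborts the run.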

## Individual Comments

### Comment 1
<location path="packages/dashboard/e2e/helpers.ts" line_range="7-8" />
<code_context>
+const BUCKET = "MY_BUCKET";
+
+/** Base64-encode a key the same way the dashboard does. */
+function encodeKey(key: string): string {
+	return btoa(unescape(encodeURIComponent(key)));
+}
+
</code_context>
<issue_to_address>
**issue (bug_risk):** encodeKey relies on browser-only globals (btoa/unescape) in a Node/Playwright context

In the Playwright test runner (Node), `btoa` and `unescape` are undefined, so this will throw at runtime in e2e tests. Please switch to a Node-safe base64 implementation, e.g.:

```ts
function encodeKey(key: string): string {
  return Buffer.from(key, 'utf8').toString('base64');
}
```

Apply the same change to other `btoa`/`unescape` usages in this file (such as when encoding custom metadata in `seedEmail`).
</issue_to_address>

### Comment 2
<location path="packages/dashboard/e2e/helpers.ts" line_range="117-118" />
<code_context>
+	};
+
+	const encodedKey = encodeKey(key);
+	const encodedMeta = btoa(
+		unescape(encodeURIComponent(JSON.stringify(customMeta))),
+	);
+
</code_context>
<issue_to_address>
**issue (bug_risk):** Encoding customMetadata also uses browser-only globals and may break in Node

As with `encodeKey`, this `btoa` + `unescape` usage will fail in the Node-based Playwright runner. Consider a Node-safe alternative:

```ts
const encodedMeta = Buffer
  .from(JSON.stringify(customMeta), 'utf8')
  .toString('base64');
```

This preserves behavior without relying on browser globals.
</issue_to_address>

### Comment 3
<location path="packages/worker/src/modules/buckets/copyObject.ts" line_range="42-45" />
<code_context>
+			});
+		}
+
+		const sourceKey = decodeURIComponent(escape(atob(data.body.sourceKey)));
+		const destinationKey = decodeURIComponent(
+			escape(atob(data.body.destinationKey)),
+		);
</code_context>
<issue_to_address>
**suggestion:** Decoding of sourceKey/destinationKey lacks error handling for malformed base64/encoding

If these values aren’t valid base64 or in the expected format, `atob`/`decodeURIComponent` will throw and currently cause a 500. To align with `putObject`/`createUpload`, consider wrapping the decoding in a try/catch and returning a 400 with a clear error message (e.g. `Invalid sourceKey: expected base64-encoded key`).

```suggestion
		let sourceKey: string;
		try {
			sourceKey = decodeURIComponent(escape(atob(data.body.sourceKey)));
		} catch {
			throw new HTTPException(400, {
				message: "Invalid sourceKey: expected base64-encoded key",
			});
		}

		let destinationKey: string;
		try {
			destinationKey = decodeURIComponent(
				escape(atob(data.body.destinationKey)),
			);
		} catch {
			throw new HTTPException(400, {
				message: "Invalid destinationKey: expected base64-encoded key",
			});
		}
```
</issue_to_address>
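The two try/catch blocks above could also be factored into one shared helper. A self-contained sketch (the real route would throw hono's `HTTPException`; a minimal stand-in is defined here so the snippet runs on its own):

```typescript
// Stand-in for hono's HTTPException, just for this sketch.
class HTTPException extends Error {
	constructor(public status: number, opts: { message: string }) {
		super(opts.message);
	}
}

// Decode a base64-encoded UTF-8 key; malformed input becomes a 400
// instead of an uncaught error surfacing as a 500.
function decodeBase64Key(value: string, field: string): string {
	try {
		// Equivalent of decodeURIComponent(escape(atob(value))):
		// base64 -> bytes -> strict UTF-8 decode.
		const bytes = Uint8Array.from(atob(value), (c) => c.charCodeAt(0));
		return new TextDecoder("utf-8", { fatal: true }).decode(bytes);
	} catch {
		throw new HTTPException(400, {
			message: `Invalid ${field}: expected base64-encoded key`,
		});
	}
}
```

Both `sourceKey` and `destinationKey` could then call `decodeBase64Key(data.body.sourceKey, "sourceKey")` and so on, keeping the error messages consistent.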

### Comment 4
<location path="packages/worker/tests/integration/multipart.test.ts" line_range="128-132" />
<code_context>
+			expect(body).toContain("Invalid customMetadata");
+		});
+
+		it("should return 400 for malformed httpMetadata", async () => {
+			if (!MY_TEST_BUCKET_1)
+				throw new Error("MY_TEST_BUCKET_1 binding not available");
+
+			const objectKey = "bad-metadata.txt";
+			const base64ObjectKey = btoa(objectKey);
+			const blobBody = new Blob(["content"], {
+				type: "application/octet-stream",
+			});
+
+			const badHttpMetadata = btoa("{broken json");
+
+			const request = createTestRequest(
+				`/api/buckets/MY_TEST_BUCKET_1/upload?key=${encodeURIComponent(base64ObjectKey)}&httpMetadata=${encodeURIComponent(badHttpMetadata)}`,
+				"POST",
+				blobBody,
+				{ "Content-Type": "application/octet-stream" },
+			);
+
+			const response = await app.fetch(request, env, createExecutionContext());
+			expect(response.status).toBe(400);
+			const body = await response.text();
+			expect(body).toContain("Invalid httpMetadata");
+		});
</code_context>
<issue_to_address>
**suggestion (testing):** Add a multipart test for non-base64 httpMetadata to complete malformed-input coverage.

Similar to `customMetadata`, please add a multipart test where `httpMetadata` is not base64 at all, so multipart coverage includes both invalid base64 and invalid JSON cases and fully exercises the new validation logic for malformed inputs.

Suggested implementation:

```typescript
			expect(body).toContain("Invalid customMetadata");
		});

		it("should return 400 for non-base64 httpMetadata in multipart upload", async () => {
			if (!MY_TEST_BUCKET_1)
				throw new Error("MY_TEST_BUCKET_1 binding not available");

			const objectKey = "bad-multipart-http-metadata.txt";

			const formData = new FormData();
			formData.append("key", objectKey);
			formData.append(
				"file",
				new Blob(["content"], {
					type: "application/octet-stream",
				}),
				objectKey,
			);
			// httpMetadata is not base64 at all
			formData.append("httpMetadata", "not-base64-http-metadata");

			const request = createTestRequest(
				`/api/buckets/MY_TEST_BUCKET_1/upload`,
				"POST",
				formData,
			);

			const response = await app.fetch(request, env, createExecutionContext());
			expect(response.status).toBe(400);
			const body = await response.text();
			expect(body).toContain("Invalid httpMetadata");
		});

		it("should return 400 for malformed customMetadata", async () => {

```

1. Ensure that `FormData` and `Blob` are available in the test environment. If not globally available, you may need to import or polyfill them at the top of the test file (for example via `undici` or `@cloudflare/workers-types` setup).
2. If `createTestRequest` requires explicit headers for multipart requests in your codebase, you may need to add a `headers` argument with the appropriate `Content-Type` including the multipart boundary, or adapt the helper to accept a `FormData` body and set headers automatically (if it doesn’t already).
3. If there is an existing helper for building multipart upload requests (e.g. `createMultipartUploadRequest`), prefer using that helper instead of constructing the `FormData` directly, mirroring how other multipart tests are written in this file.
</issue_to_address>

### Comment 5
<location path="packages/worker/tests/integration/object.test.ts" line_range="150-151" />
<code_context>
 			expect(response.headers.has("etag")).toBe(true);
-			// Default R2 GetObject includes Content-Disposition: attachment; filename="key"
-			expect(response.headers.get("content-disposition")).toBe(`attachment; filename="${TEST_OBJECT_KEY}"`);
+			// Default R2 GetObject includes Content-Disposition with sanitized filename and RFC 5987 filename*
+			expect(response.headers.get("content-disposition")).toBe(`attachment; filename="${TEST_OBJECT_KEY}"; filename*=UTF-8''${encodeURIComponent(TEST_OBJECT_KEY)}`);

 			const body = await response.text();
</code_context>
<issue_to_address>
**suggestion (testing):** Consider adding a dedicated test for non-ASCII and quote-containing filenames to validate sanitization logic.

Since the code now sanitizes non-ASCII and double-quote characters, please add a test where the key includes such characters (e.g. `file-ä".txt`) and assert that:
- `filename` shows the sanitized form (non-ASCII replaced, quotes normalized), and
- `filename*` preserves the original name via UTF-8 encoding.
This will ensure the new behavior is covered and validated.

Suggested implementation:

```typescript
			expect(response.headers.get("content-type")).toBe(TEST_OBJECT_CONTENT_TYPE);
			expect(response.headers.get("content-length")).toBe(TEST_OBJECT_CONTENT.length.toString());
			expect(response.headers.has("etag")).toBe(true);
			// Default R2 GetObject includes Content-Disposition with sanitized filename and RFC 5987 filename*
			expect(response.headers.get("content-disposition")).toBe(`attachment; filename="${TEST_OBJECT_KEY}"; filename*=UTF-8''${encodeURIComponent(TEST_OBJECT_KEY)}`);

			const body = await response.text();
			expect(body).toBe(TEST_OBJECT_CONTENT);

```

Please add the following new test case in the same `describe` block that contains the existing R2 `GetObject` test (typically close to the test that uses `TEST_OBJECT_KEY`):

```ts
	it("returns sanitized Content-Disposition for non-ASCII and quoted filenames", async () => {
		const SPECIAL_KEY = 'file-ä".txt';
		const SPECIAL_CONTENT = "special content";
		const SPECIAL_CONTENT_TYPE = "text/plain";

		// Arrange: upload an object with non-ASCII and quote-containing key
		// Adjust this part to match your existing helpers for putting objects into R2.
		// For example, if you already have a helper like `putObject(TEST_OBJECT_KEY, TEST_OBJECT_CONTENT, TEST_OBJECT_CONTENT_TYPE)`,
		// call it here instead of inlining the logic.
		await fetch(`http://127.0.0.1:${port}/r2/${encodeURIComponent(SPECIAL_KEY)}`, {
			method: "PUT",
			headers: {
				"content-type": SPECIAL_CONTENT_TYPE,
				"content-length": SPECIAL_CONTENT.length.toString(),
			},
			body: SPECIAL_CONTENT,
		});

		// Act: retrieve the object
		const response = await fetch(`http://127.0.0.1:${port}/r2/${encodeURIComponent(SPECIAL_KEY)}`);

		expect(response.status).toBe(200);
		expect(response.headers.get("content-type")).toBe(SPECIAL_CONTENT_TYPE);

		const contentDisposition = response.headers.get("content-disposition");
		expect(contentDisposition).not.toBeNull();

		// Example header value:
		// attachment; filename="file-a'.txt"; filename*=UTF-8''file-%C3%A4%22.txt
		// We don't assert the exact sanitized string to avoid coupling to the implementation,
		// but we do assert on the properties we care about.

		// Extract filename and filename* components
		const filenameMatch = contentDisposition!.match(/filename="([^"]+)"/);
		const filenameStarMatch = contentDisposition!.match(/filename\*\=UTF-8''([^;]+)$/);

		expect(filenameMatch).not.toBeNull();
		expect(filenameStarMatch).not.toBeNull();

		const sanitizedFilename = filenameMatch![1];
		const encodedFilenameStar = filenameStarMatch![1];

		// Assert: `filename` is sanitized:
		// - non-ASCII characters are removed/replaced
		// - double quotes are normalized/removed
		expect(sanitizedFilename).not.toContain("ä");
		expect(sanitizedFilename).not.toContain('"');
		// It should still generally resemble the original name (same base/extension)
		expect(sanitizedFilename).toContain("file");
		expect(sanitizedFilename).toContain(".txt");

		// Assert: `filename*` preserves the original filename (RFC 5987, UTF-8 encoded)
		expect(encodedFilenameStar).toBe(encodeURIComponent(SPECIAL_KEY));
		expect(decodeURIComponent(encodedFilenameStar)).toBe(SPECIAL_KEY);

		const body = await response.text();
		expect(body).toBe(SPECIAL_CONTENT);
	});
```

You may need to:

1. Replace the inline `fetch(..., { method: "PUT", ... })` used for uploading with the existing test helper or route you use elsewhere in this file to create the R2 object (e.g. a helper that seeds R2 with `TEST_OBJECT_KEY`).
2. Use the correct base URL/path (`/r2/...` above is an example; adjust to whatever path your existing tests use to interact with R2 objects).
3. If this file already defines utilities for parsing `Content-Disposition`, you can reuse them instead of doing the regex parsing inline here.
</issue_to_address>
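For reference, a sketch of the kind of header building the test above exercises (illustrative only, not the worker's actual implementation - the sanitization rules shown are assumptions):

```typescript
// Illustrative RFC 6266/5987-style Content-Disposition builder:
// an ASCII-safe `filename` plus a UTF-8 `filename*` that preserves
// the original name.
function buildContentDisposition(filename: string): string {
	const ascii = filename
		// Drop double quotes so the quoted-string can't be broken out of.
		.replace(/"/g, "")
		// Replace non-printable-ASCII characters with a placeholder.
		.replace(/[^\x20-\x7e]/g, "_");
	const encoded = encodeURIComponent(filename);
	return `attachment; filename="${ascii}"; filename*=UTF-8''${encoded}`;
}
```

Conforming clients prefer `filename*`, so the original Unicode name survives; the sanitized `filename` is only the legacy fallback.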

