
Bulk Operations

Temp edited this page Jan 7, 2026 · 4 revisions

Guide to multi-select batch operations for databases and tables in D1 Manager.

Overview

D1 Manager supports efficient bulk operations that act on multiple databases or tables in a single action:

Database Operations:

  • 📦 Bulk Download (SQL dumps in ZIP)
  • 🗑️ Bulk Delete (with progress tracking)
  • ⚡ Bulk Optimize (ANALYZE for performance)
  • 📀 Upload/Import databases

Table Operations:

  • 📋 Bulk Clone (duplicate tables)
  • 📀 Bulk Export (SQL or CSV in ZIP)
  • 🗑️ Bulk Delete (with dependency analysis)

Database Bulk Operations

Multi-Select Databases

Select Individual Databases

Click the checkbox on any database card to select it.

┌─────────────────────────────────┐
│ [✓] 📊 my-database              │
│                                 │
│ UUID: a1b2c3d4-...              │
└─────────────────────────────────┘

Selected databases show a blue ring around the card.

Select All Databases

Click the "Select All" button in the bulk operations toolbar to select all databases at once.

Clear Selection

Click "Clear Selection" to deselect all databases.

Bulk Download

Export multiple databases in your chosen format (SQL, JSON, or CSV) as a ZIP archive.

Steps:

  1. Select databases using checkboxes
  2. Choose export format from dropdown (SQL/JSON/CSV)
  3. Click "Download Selected"
  4. Wait for export (progress shows current database name)
  5. ZIP file downloads automatically

Format Selection:

┌─────────┐ ┌──────────────────────┐
│  SQL ▼  │ │ Download Selected    │
└─────────┘ └──────────────────────┘

Export Formats:

  • SQL - Complete database dump with CREATE TABLE + INSERT (default)
  • JSON - Portable JSON with metadata + arrays of objects per table
  • CSV - ZIP containing _metadata.json + CSV per table

Progress Indicator:

┌──────────────────────────────────────────────────────┐
│ ↻ Exporting my-database...           2 / 5 databases │
│ ████████████████░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  40%  │
└──────────────────────────────────────────────────────┘

ZIP Contents (SQL):

databases-sql-1704614400000.zip
├── my-database.sql
├── test-database.sql
└── production-db.sql

ZIP Contents (JSON):

databases-json-1704614400000.zip
├── my-database.json
├── test-database.json
└── production-db.json

ZIP Contents (CSV):

databases-csv-1704614400000.zip
└── my-database/
    ├── _metadata.json
    ├── users.csv
    └── orders.csv

Rate Limiting:

  • 300ms delay between each database export
  • Exponential backoff (2s→4s→8s) on 429 errors
  • Prevents API throttling for large batch exports
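That pacing and retry schedule can be sketched as a small helper. This is illustrative only — the names (`backoffDelay`, `withBackoff`, `RateLimitError`) are assumptions, not D1 Manager's actual code:

```typescript
// Sketch of the export pacing described above: 300 ms between exports,
// exponential backoff (2 s → 4 s → 8 s) when the API answers 429.

class RateLimitError extends Error {}

// Spacing inserted between successive database exports.
const EXPORT_SPACING_MS = 300;

// Delay before retry attempt n (0-based): 2 s, 4 s, 8 s, then give up.
function backoffDelay(attempt: number): number | null {
  const delays = [2000, 4000, 8000];
  return attempt < delays.length ? delays[attempt] : null;
}

// Retry wrapper: retries fn only while it signals a rate limit.
async function withBackoff<T>(fn: () => Promise<T>): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const delay = backoffDelay(attempt);
      if (!(err instanceof RateLimitError) || delay === null) throw err;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

After three failed retries the error propagates, so one stubbornly throttled database does not stall the whole batch.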

Use Cases:

  • Regular backups (SQL)
  • Data analysis pipelines (CSV)
  • Cross-platform data transfer (JSON)
  • Archive old databases

FTS5 Limitation:

Databases containing FTS5 (Full-Text Search) virtual tables cannot be exported via the D1 REST API. This is a Cloudflare platform limitation.

When exporting multiple databases:

  • Databases with FTS5 tables will be skipped
  • A dialog shows which databases were skipped and why
  • The specific FTS5 table names are displayed
  • Other databases export normally
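One way such a pre-flight check can work — a sketch, not D1 Manager's actual code — is to scan each database's `sqlite_master` rows (e.g. from `SELECT name, sql FROM sqlite_master WHERE type = 'table'`) for `CREATE VIRTUAL TABLE ... USING fts5`:

```typescript
// Sketch: detect FTS5 virtual tables from sqlite_master rows.
// The row shape mirrors SQLite's schema table.

interface SchemaRow {
  name: string;
  sql: string | null; // CREATE statement; null for auto-created shadow tables
}

function findFts5Tables(rows: SchemaRow[]): string[] {
  return rows
    .filter((r) => r.sql !== null && /\bUSING\s+fts5\b/i.test(r.sql))
    .map((r) => r.name);
}
```

A non-empty result would mark the database as skipped and supply the table names shown in the dialog.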

Workaround:

  1. Drop the FTS5 tables from the database
  2. Export the database
  3. Recreate the FTS5 tables

See FTS5 Full Text Search for more information about FTS5 tables.

Bulk Delete

Delete multiple databases with progress tracking and error reporting.

Steps:

  1. Select databases using checkboxes
  2. Click "Delete Selected"
  3. Review confirmation dialog
  4. Click "Delete X Databases"
  5. Watch progress

Confirmation Dialog:

┌─────────────────────────────────────────────┐
│ Delete 3 Databases?                         │
├─────────────────────────────────────────────┤
│ This action cannot be undone.               │
│                                             │
│ Databases to delete:                        │
│ • my-database                               │
│ • test-database                             │
│ • old-database                              │
│                                             │
│ [Cancel] [Delete 3 Databases]               │
└─────────────────────────────────────────────┘

Progress Tracking:

Deleting database 1 of 3: my-database...
Deleting database 2 of 3: test-database...
Deleting database 3 of 3: old-database...
Complete: 3 databases deleted ✓

Error Handling:

Some databases failed to delete:
• old-database: Permission denied
• test-database: Not found

Successfully deleted: 1 of 3 databases

Important Notes:

  • ⚠️ Permanent: Deletion cannot be undone
  • ⚠️ Sequential: Deletes one at a time
  • ✅ Error Recovery: Continues even if one fails
  • ✅ Progress Tracking: Real-time status updates

Bulk Optimize

Run ANALYZE on multiple databases to update query statistics for better performance.

What it Does:

  • Executes ANALYZE (via PRAGMA optimize) on each database
  • Updates statistics for SQLite query planner
  • Improves query performance
  • No data modification

Steps:

  1. Select databases using checkboxes
  2. Click "Optimize Selected"
  3. Review optimization dialog
  4. Click "Optimize X Databases"
  5. Wait for completion

Optimization Dialog:

┌─────────────────────────────────────────────┐
│ Optimize 3 Databases?                       │
├─────────────────────────────────────────────┤
│ Run ANALYZE to update query statistics      │
│                                             │
│ Operation: ANALYZE (PRAGMA optimize)        │
│                                             │
│ Databases to optimize:                      │
│ • my-database                               │
│ • test-database                             │
│ • production-db                             │
│                                             │
│ ℹ️ Note: VACUUM is not available via D1     │
│ REST API. Use Wrangler CLI for VACUUM.      │
│                                             │
│ [Cancel] [Optimize 3 Databases]             │
└─────────────────────────────────────────────┘

Progress:

Running ANALYZE (Database 1 of 3): my-database...
Running ANALYZE (Database 2 of 3): test-database...
Running ANALYZE (Database 3 of 3): production-db...
Complete: 3 databases optimized ✓
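Each of those runs amounts to a single statement sent to Cloudflare's D1 HTTP query endpoint. A sketch of building that request — the URL shape follows Cloudflare's documented REST API, but the helper name is an assumption and auth headers and error handling are omitted:

```typescript
// Sketch: build the REST call that runs the optimize statement on one
// D1 database. Not D1 Manager's actual code.

function buildAnalyzeRequest(accountId: string, databaseId: string) {
  return {
    method: "POST" as const,
    url: `https://api.cloudflare.com/client/v4/accounts/${accountId}/d1/database/${databaseId}/query`,
    body: JSON.stringify({ sql: "PRAGMA optimize;" }),
  };
}
```

The caller would attach an `Authorization: Bearer <api token>` header and issue the request with `fetch`, one database at a time.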

When to Optimize:

  • After bulk data imports
  • After creating new indexes
  • After significant schema changes
  • Periodically for large databases (monthly)

About VACUUM:

  • VACUUM is not available via D1 REST API
  • D1 automatically manages space reclamation
  • For manual VACUUM:
    wrangler d1 execute database-name --remote --command="VACUUM"

Upload Database

Import SQL files to create new databases or update existing ones.

Steps:

  1. Click "Upload Database"
  2. Select SQL file (up to 5GB)
  3. Choose import mode:
    • Create new database - Creates fresh database
    • Import into existing - Adds to existing database
  4. Enter name or select target
  5. Click "Upload"

Import Modes:

Create New:

Mode: Create new database
Name: [imported-database     ]
File: backup.sql (2.4 MB)

Import Existing:

Mode: Import into existing database
Target: [▼ Select database    ]
File: additional-data.sql (1.2 MB)

Important Notes:

  • πŸ“ Max Size: 5GB per file
  • ⚠️ Conflicts: May fail if schema conflicts with existing data
  • βœ… Validation: SQL syntax validated before execution
  • πŸ”„ Progress: Upload progress shown during import

Clone Database

Duplicate an entire database with all its tables and data.

Steps:

  1. Find the database you want to clone
  2. Click the "Clone" button on the database card
  3. Enter a name for the new database
  4. Click "Clone Database"
  5. Wait for the clone to complete

Clone Dialog:

┌─────────────────────────────────────────────┐
│ 📋 Clone Database                           │
├─────────────────────────────────────────────┤
│ Create a copy of my-database with all its   │
│ tables and data.                            │
│                                             │
│ New Database Name:                          │
│ [my-database-copy                  ]        │
│                                             │
│ Lowercase letters, numbers, and hyphens     │
│ only. Must start with a letter.             │
│                                             │
│ [Cancel] [Clone Database]                   │
└─────────────────────────────────────────────┘
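The naming rule shown in the dialog — lowercase letters, numbers, and hyphens only, starting with a letter — can be captured in one regular expression. A minimal sketch (the function name is illustrative):

```typescript
// Sketch: validate a new database name against the dialog's rule —
// lowercase letters, numbers, and hyphens only; must start with a letter.

function isValidDatabaseName(name: string): boolean {
  return /^[a-z][a-z0-9-]*$/.test(name);
}
```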

Progress Tracking:

Exporting source database...      10%
Creating new database...          40%
Importing data...                 60%
Clone completed!                 100%

What Gets Cloned:

  • ✅ All tables (structure and data)
  • ✅ All indexes
  • ✅ Foreign key constraints
  • ✅ Check constraints
  • ❌ Database colors (metadata, not part of SQL dump)

FTS5 Limitation:

Databases containing FTS5 (Full-Text Search) virtual tables cannot be cloned due to Cloudflare D1's export API limitation.

When attempting to clone an FTS5 database:

┌─────────────────────────────────────────────┐
│ ⚠️ Cannot Clone FTS5 Database               │
├─────────────────────────────────────────────┤
│ This database contains 2 FTS5 table(s).     │
│ Cloudflare D1's export API does not         │
│ support virtual tables like FTS5.           │
│                                             │
│ To clone this database, first drop the      │
│ FTS5 tables, clone, then recreate them.     │
│                                             │
│ [Cancel] [Clone Database] (disabled)        │
└─────────────────────────────────────────────┘

Workaround for FTS5 databases:

  1. Navigate to the FTS5 tab in the source database
  2. Delete the FTS5 tables (note the configuration first)
  3. Clone the database
  4. Recreate the FTS5 tables in the cloned database

Use Cases:

  • Create development/staging copies
  • Test schema changes safely
  • Create backups before major changes
  • Share database templates

Table Bulk Operations

Multi-Select Tables

Similar to database selection:

  • Checkbox on each table card
  • "Select All" button
  • "Clear Selection" to deselect

Bulk Clone

Duplicate multiple tables with suggested names.

Steps:

  1. Navigate to a database
  2. Select tables using checkboxes
  3. Click "Clone Selected"
  4. Review/modify suggested names
  5. Click "Clone Tables"

Name Suggestions:

Original      → Suggested Name
users         → users_copy
posts         → posts_copy
comments      → comments_copy
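One way to derive such suggestions — a sketch; the numeric-suffix collision handling is an assumption, not necessarily what D1 Manager does:

```typescript
// Sketch: suggest a clone name by appending _copy, avoiding names that
// already exist in the database.

function suggestCloneName(original: string, existing: string[]): string {
  const taken = new Set(existing);
  let candidate = `${original}_copy`;
  // If users_copy is taken, try users_copy2, users_copy3, ...
  for (let i = 2; taken.has(candidate); i++) {
    candidate = `${original}_copy${i}`;
  }
  return candidate;
}
```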

Customization:

┌─────────────────────────────────────────────┐
│ Clone 3 Tables                              │
├─────────────────────────────────────────────┤
│ users         → [users_backup    ]          │
│ posts         → [posts_archive   ]          │
│ comments      → [comments_2024   ]          │
│                                             │
│ [Cancel] [Clone Tables]                     │
└─────────────────────────────────────────────┘

What Gets Cloned:

  • ✅ Table structure (columns, types, constraints)
  • ✅ All data rows
  • ✅ All indexes
  • ❌ Foreign key references (new table is independent)
  • ❌ Triggers (not carried over by the clone process)

Progress:

Cloning table 1 of 3: users → users_copy...
Cloning table 2 of 3: posts → posts_copy...
Cloning table 3 of 3: comments → comments_copy...
Complete: 3 tables cloned ✓
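Conceptually, each clone is two statements: recreate the structure, then copy the rows. A sketch that rewrites the stored CREATE statement — a deliberate simplification, since real code must also handle quoted identifiers, `IF NOT EXISTS`, and case differences:

```typescript
// Sketch: generate the two statements that clone one table.
// `createSql` is the original CREATE TABLE statement from sqlite_master.

function buildCloneSql(createSql: string, oldName: string, newName: string): string[] {
  // Naive rename of the table name inside the CREATE statement.
  const createCopy = createSql.replace(
    new RegExp(`(CREATE TABLE\\s+)("?)${oldName}\\2`, "i"),
    `$1$2${newName}$2`
  );
  return [createCopy, `INSERT INTO ${newName} SELECT * FROM ${oldName};`];
}
```

Recreating from the original CREATE statement (rather than `CREATE TABLE ... AS SELECT`) is what preserves column types, constraints, and defaults in the copy.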

Bulk Export

Export multiple tables as SQL or CSV in a ZIP archive.

Steps:

  1. Select tables using checkboxes
  2. Click "Export Selected"
  3. Choose format: SQL or CSV
  4. Click "Export"
  5. Download ZIP file

Format Selection:

┌─────────────────────────────────────────────┐
│ Export 3 Tables                             │
├─────────────────────────────────────────────┤
│ Format:                                     │
│ ○ SQL - Complete table dump                 │
│ ○ CSV - Data only                           │
│                                             │
│ Tables:                                     │
│ • users (152 rows)                          │
│ • posts (1,234 rows)                        │
│ • comments (5,678 rows)                     │
│                                             │
│ [Cancel] [Export]                           │
└─────────────────────────────────────────────┘

ZIP Contents (SQL):

tables-export.zip
├── users.sql
├── posts.sql
└── comments.sql

ZIP Contents (CSV):

tables-export.zip
├── users.csv
├── posts.csv
└── comments.csv

Use Cases:

  • Table-level backups
  • Data analysis (CSV)
  • Migration preparation
  • Sharing data with others

Bulk Delete

Delete multiple tables with dependency analysis and progress tracking.

Steps:

  1. Select tables using checkboxes
  2. Click "Delete Selected"
  3. Review dependencies for each table
  4. Check confirmation boxes (if dependencies exist)
  5. Click "Delete Tables"

Dependencies Display:

┌─────────────────────────────────────────────┐
│ Delete 3 Tables?                            │
├─────────────────────────────────────────────┤
│ ▼ users (2 dependencies)                    │
│   Inbound:                                  │
│   • posts references users (152 rows)       │
│     ON DELETE: CASCADE                      │
│   • comments references users (47 rows)     │
│     ON DELETE: CASCADE                      │
│                                             │
│   ░ I understand dependent data will be     │
│     deleted                                 │
│                                             │
│ ▼ posts (1 dependency)                      │
│   Outbound:                                 │
│   • references users (user_id)              │
│     ON DELETE: CASCADE                      │
│                                             │
│   ░ I understand dependent data will be     │
│     deleted                                 │
│                                             │
│ ▶ comments (no dependencies)                │
│                                             │
│ [Cancel] [Delete Tables] (disabled)         │
└─────────────────────────────────────────────┘

The "Delete Tables" button is enabled only when every selected table either has no dependencies or has its dependency confirmation box checked.
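That gating rule can be sketched as a pure check over the selection state. The shapes below are illustrative, not D1 Manager's actual types:

```typescript
// Sketch: the Delete button's enablement rule. A table blocks deletion
// only when it has dependencies the user has not acknowledged.

interface SelectedTable {
  name: string;
  dependencyCount: number;
  confirmed: boolean; // "I understand dependent data will be deleted"
}

function canDelete(tables: SelectedTable[]): boolean {
  return tables.every((t) => t.dependencyCount === 0 || t.confirmed);
}
```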

Progress:

Analyzing dependencies...
Deleting table 1 of 3: users...
Deleting table 2 of 3: posts...
Deleting table 3 of 3: comments...
Complete: 3 tables deleted ✓

Important Notes:

  • ⚠️ Cascade Impact: Dependent rows deleted automatically
  • ⚠️ Permanent: Cannot be undone
  • ✅ Dependency Analysis: Shows what will be affected
  • ✅ Safety Checks: Requires explicit confirmation

Best Practices

Before Bulk Operations

  1. Backup First:

    # Download backups before bulk delete
    # Use bulk download, or export with Wrangler:
    wrangler d1 export database-name --remote --output=backup.sql
  2. Test on Subset:

    Select 1-2 items first
    Verify operation works correctly
    Then select all items
    
  3. Check Dependencies:

    Review dependency analysis carefully
    Understand cascade impacts
    Confirm you want to proceed
    

During Operations

  1. Monitor Progress:

    • Watch progress indicators
    • Note any errors
    • Don't close browser during operation
  2. Be Patient:

    • Large operations take time
    • Don't refresh page
    • Wait for completion message

After Operations

  1. Verify Results:

    Check operation completed successfully
    Verify expected items created/deleted
    Test database/table functionality
    
  2. Review Errors:

    Note which items failed (if any)
    Check error messages
    Retry individual items if needed
    
  3. Clean Up:

    Clear selections
    Refresh views to see changes
    Delete temporary files/backups
    

Limitations

Database Operations

  • Sequential Processing: One database at a time
  • No Parallelization: Operations not concurrent
  • Quota Limits: Account-wide D1 quotas apply
  • Size Limits: Large databases may time out
  • Protected Databases: System database excluded
  • FTS5 Virtual Tables: Databases with FTS5 tables cannot be exported (D1 API limitation)

Table Operations

  • Same Database Only: Tables can't be cloned across databases
  • No Cross-Database Operations: Each bulk operation runs within a single database
  • Foreign Key Checks: Deletions respect FK constraints
  • Memory Limits: Very large tables may fail

API Limits

  • D1 REST API: Standard rate limits apply
  • Request Size: Large requests may time out
  • Worker Limits: 50ms CPU time per request (paid plan)

Troubleshooting

"Operation timed out"

Cause: Too many items or items too large

Solutions:

  • Select fewer items per operation
  • Use Wrangler CLI for very large operations
  • Break into smaller batches

"Some operations failed"

Cause: Individual items had errors

Solutions:

  • Check error messages for each failed item
  • Verify permissions
  • Retry failed items individually
  • Check quota limits

"Cannot delete: Dependencies exist"

Cause: Tables have foreign key relationships

Solution:

  • Review dependency analysis
  • Check confirmation boxes
  • Delete in correct order (child tables first)
  • Or accept cascade deletions

"ZIP download failed"

Cause: Browser blocked download or memory issue

Solutions:

  • Check browser pop-up blocker
  • Allow downloads from site
  • Try with fewer items
  • Clear browser cache

API Reference

Bulk Delete Databases

POST /api/databases/bulk-delete
Content-Type: application/json

{
  "databaseIds": ["uuid1", "uuid2", "uuid3"]
}

Bulk Export Databases

POST /api/databases/export
Content-Type: application/json

{
  "databases": [
    {"uuid": "uuid1", "name": "db1"},
    {"uuid": "uuid2", "name": "db2"}
  ]
}

Bulk Optimize Databases

POST /api/databases/optimize
Content-Type: application/json

{
  "databaseIds": ["uuid1", "uuid2"]
}

Bulk Export Tables

POST /api/tables/:dbId/bulk-export
Content-Type: application/json

{
  "tableNames": ["users", "posts"],
  "format": "sql"
}
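A minimal client-side sketch of calling the bulk-delete endpoint above. The builder returns the pieces a `fetch` call would use; the base URL and any auth are deployment-specific and omitted:

```typescript
// Sketch: build the request for the bulk-delete endpoint documented above.

function bulkDeleteRequest(databaseIds: string[]) {
  return {
    path: "/api/databases/bulk-delete",
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ databaseIds }),
  };
}
```

Usage: `const r = bulkDeleteRequest(ids); await fetch(r.path, r);` against a running D1 Manager instance.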

Need Help? See Troubleshooting or open an issue.