Bulk Operations
Guide to multi-select batch operations for databases and tables in D1 Manager.
D1 Manager supports efficient bulk operations for managing multiple databases or tables simultaneously:
Database Operations:
- 📦 Bulk Download (SQL dumps in ZIP)
- 🗑️ Bulk Delete (with progress tracking)
- ⚡ Bulk Optimize (ANALYZE for performance)
- 📤 Upload/Import databases
Table Operations:
- 🔄 Bulk Clone (duplicate tables)
- 📤 Bulk Export (SQL or CSV in ZIP)
- 🗑️ Bulk Delete (with dependency analysis)
Click the checkbox on any database card to select it.
┌───────────────────────────────────┐
│ [✓] 🗄️ my-database                │
│                                   │
│ UUID: a1b2c3d4-...                │
└───────────────────────────────────┘
Selected databases show a blue ring around the card.
Click the "Select All" button in the bulk operations toolbar to select all databases at once.
Click "Clear Selection" to deselect all databases.
Export multiple databases in your chosen format (SQL, JSON, or CSV) as a ZIP archive.
Steps:
- Select databases using checkboxes
- Choose export format from dropdown (SQL/JSON/CSV)
- Click "Download Selected"
- Wait for export (progress shows current database name)
- ZIP file downloads automatically
Format Selection:
┌───────────┐   ┌────────────────────────┐
│ SQL     ▼ │   │ ⬇ Download Selected    │
└───────────┘   └────────────────────────┘
Export Formats:
- SQL - Complete database dump with CREATE TABLE + INSERT (default)
- JSON - Portable JSON with metadata + arrays of objects per table
- CSV - ZIP containing _metadata.json plus one CSV file per table
Progress Indicator:
┌───────────────────────────────────────────────────────┐
│ ↻ Exporting my-database...            2 / 5 databases │
│ ██████████████████░░░░░░░░░░░░░░░░░░░░░░░░░░      40% │
└───────────────────────────────────────────────────────┘
ZIP Contents (SQL):
databases-sql-1704614400000.zip
├── my-database.sql
├── test-database.sql
└── production-db.sql
ZIP Contents (JSON):
databases-json-1704614400000.zip
├── my-database.json
├── test-database.json
└── production-db.json
ZIP Contents (CSV):
databases-csv-1704614400000.zip
└── my-database/
    ├── _metadata.json
    ├── users.csv
    └── orders.csv
Rate Limiting:
- 300ms delay between each database export
- Exponential backoff (2s→4s→8s) on 429 errors
- Prevents API throttling for large batch exports
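A minimal sketch of this pacing strategy in TypeScript, assuming a caller-supplied `exportDatabase` helper (hypothetical) that performs one export request:

```typescript
// Sketch of the pacing described above: a fixed 300 ms gap between
// exports, plus exponential backoff (2s -> 4s -> 8s) on HTTP 429.
// `exportDatabase` is a caller-supplied (hypothetical) helper that
// performs a single export request.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function exportAllSequentially(
  uuids: string[],
  exportDatabase: (uuid: string) => Promise<Response>,
): Promise<void> {
  const MAX_RETRIES = 3;
  for (const uuid of uuids) {
    for (let attempt = 0; ; attempt++) {
      const res = await exportDatabase(uuid);
      // Stop on success, on a non-rate-limit error, or when retries run out.
      if (res.status !== 429 || attempt === MAX_RETRIES) break;
      await sleep(2000 * 2 ** attempt); // 2s, 4s, 8s
    }
    await sleep(300); // fixed delay between databases
  }
}
```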
Use Cases:
- Regular backups (SQL)
- Data analysis pipelines (CSV)
- Cross-platform data transfer (JSON)
- Archive old databases
FTS5 Limitation:
Databases containing FTS5 (Full-Text Search) virtual tables cannot be exported via the D1 REST API. This is a Cloudflare platform limitation.
When exporting multiple databases:
- Databases with FTS5 tables will be skipped
- A dialog shows which databases were skipped and why
- The specific FTS5 table names are displayed
- Other databases export normally
Workaround:
- Drop the FTS5 tables from the database
- Export the database
- Recreate the FTS5 tables
See FTS5 Full Text Search for more information about FTS5 tables.
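The skip list above implies a pre-export check for FTS5 tables. One way to detect them in any SQLite-compatible database is to inspect sqlite_master; a sketch of the general technique (not necessarily how D1 Manager does it), with `runQuery` as a hypothetical SQL helper:

```typescript
// Detect FTS5 virtual tables by inspecting sqlite_master. FTS5 virtual
// tables carry "USING fts5" in their CREATE statement; the shadow
// tables they generate do not, so they are excluded automatically.
async function findFts5Tables(
  runQuery: (sql: string) => Promise<{ name: string }[]>, // hypothetical helper
): Promise<string[]> {
  const rows = await runQuery(
    "SELECT name FROM sqlite_master WHERE type = 'table' AND sql LIKE '%USING fts5%'",
  );
  return rows.map((row) => row.name);
}
```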
Delete multiple databases with progress tracking and error reporting.
Steps:
- Select databases using checkboxes
- Click "Delete Selected"
- Review confirmation dialog
- Click "Delete X Databases"
- Watch progress
Confirmation Dialog:
┌───────────────────────────────────────────────┐
│ Delete 3 Databases?                           │
├───────────────────────────────────────────────┤
│ This action cannot be undone.                 │
│                                               │
│ Databases to delete:                          │
│   • my-database                               │
│   • test-database                             │
│   • old-database                              │
│                                               │
│        [Cancel]   [Delete 3 Databases]        │
└───────────────────────────────────────────────┘
Progress Tracking:
Deleting database 1 of 3: my-database...
Deleting database 2 of 3: test-database...
Deleting database 3 of 3: old-database...
Complete: 3 databases deleted ✓
Error Handling:
Some databases failed to delete:
• old-database: Permission denied
• test-database: Not found
Successfully deleted: 1 of 3 databases
Important Notes:
- ⚠️ Permanent: Deletion cannot be undone
- ⚠️ Sequential: Deletes one database at a time
- ✅ Error Recovery: Continues even if one fails
- ✅ Progress Tracking: Real-time status updates
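A minimal sketch of this continue-on-failure loop, with `deleteDatabase` as a hypothetical helper wrapping the delete request:

```typescript
// Delete sequentially, record each failure, and report a summary at
// the end. One failure does not abort the batch.
async function bulkDeleteDatabases(
  names: string[],
  deleteDatabase: (name: string) => Promise<void>, // hypothetical helper
): Promise<{ deleted: string[]; failed: { name: string; error: string }[] }> {
  const deleted: string[] = [];
  const failed: { name: string; error: string }[] = [];
  for (const [i, name] of names.entries()) {
    console.log(`Deleting database ${i + 1} of ${names.length}: ${name}...`);
    try {
      await deleteDatabase(name);
      deleted.push(name);
    } catch (err) {
      failed.push({ name, error: String(err) }); // keep going on error
    }
  }
  return { deleted, failed };
}
```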
Run ANALYZE on multiple databases to update query statistics for better performance.
What it Does:
- Executes PRAGMA optimize on each database
- Updates statistics for the SQLite query planner
- Improves query performance
- No data modification
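If you want to script the same statement outside the UI, it can be sent through Cloudflare's D1 REST query endpoint; a sketch, with your own account ID, database UUID, and API token as placeholders:

```typescript
// Run PRAGMA optimize against one database via the D1 REST query
// endpoint. accountId, dbUuid, and apiToken are placeholders you must
// supply; error handling is kept minimal.
async function optimizeDatabase(
  accountId: string,
  dbUuid: string,
  apiToken: string,
): Promise<void> {
  const url = `https://api.cloudflare.com/client/v4/accounts/${accountId}/d1/database/${dbUuid}/query`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ sql: "PRAGMA optimize" }),
  });
  if (!res.ok) throw new Error(`Optimize failed: HTTP ${res.status}`);
}
```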
Steps:
- Select databases using checkboxes
- Click "Optimize Selected"
- Review optimization dialog
- Click "Optimize X Databases"
- Wait for completion
Optimization Dialog:
┌───────────────────────────────────────────────┐
│ Optimize 3 Databases?                         │
├───────────────────────────────────────────────┤
│ Run ANALYZE to update query statistics        │
│                                               │
│ Operation: ANALYZE (PRAGMA optimize)          │
│                                               │
│ Databases to optimize:                        │
│   • my-database                               │
│   • test-database                             │
│   • production-db                             │
│                                               │
│ ℹ️ Note: VACUUM is not available via D1       │
│    REST API. Use Wrangler CLI for VACUUM.     │
│                                               │
│        [Cancel]   [Optimize 3 Databases]      │
└───────────────────────────────────────────────┘
Progress:
Running ANALYZE (Database 1 of 3): my-database...
Running ANALYZE (Database 2 of 3): test-database...
Running ANALYZE (Database 3 of 3): production-db...
Complete: 3 databases optimized ✓
When to Optimize:
- After bulk data imports
- After creating new indexes
- After significant schema changes
- Periodically for large databases (monthly)
About VACUUM:
- VACUUM is not available via D1 REST API
- D1 automatically manages space reclamation
- For manual VACUUM:
wrangler d1 execute database-name --remote --command="VACUUM"
Import SQL files to create new databases or update existing ones.
Steps:
- Click "Upload Database"
- Select SQL file (up to 5GB)
- Choose import mode:
- Create new database - Creates fresh database
- Import into existing - Adds to existing database
- Enter name or select target
- Click "Upload"
Import Modes:
Create New:
Mode: Create new database
Name: [imported-database ]
File: backup.sql (2.4 MB)
Import Existing:
Mode: Import into existing database
Target: [▼ Select database ]
File: additional-data.sql (1.2 MB)
Important Notes:
- 📏 Max Size: 5GB per file
- ⚠️ Conflicts: May fail if schema conflicts with existing data
- ✅ Validation: SQL syntax validated before execution
- 📊 Progress: Upload progress shown during import
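For illustration, a client-side upload might look like the sketch below; the `/api/databases/import` endpoint and its form fields are hypothetical, since the actual import API is not documented here:

```typescript
// Hypothetical upload flow: send the SQL file plus the chosen import
// mode as multipart form data. Endpoint and field names are
// illustrative only.
async function uploadSqlFile(
  file: File,
  mode: "create" | "existing",
  target: string, // new database name, or the UUID of an existing database
): Promise<void> {
  const form = new FormData();
  form.append("file", file);
  form.append("mode", mode);
  form.append("target", target);
  const res = await fetch("/api/databases/import", { method: "POST", body: form });
  if (!res.ok) throw new Error(`Import failed: HTTP ${res.status}`);
}
```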
Duplicate an entire database with all its tables and data.
Steps:
- Find the database you want to clone
- Click the "Clone" button on the database card
- Enter a name for the new database
- Click "Clone Database"
- Wait for the clone to complete
Clone Dialog:
┌───────────────────────────────────────────────┐
│ 🔄 Clone Database                             │
├───────────────────────────────────────────────┤
│ Create a copy of my-database with all its     │
│ tables and data.                              │
│                                               │
│ New Database Name:                            │
│ [my-database-copy            ]                │
│                                               │
│ Lowercase letters, numbers, and hyphens       │
│ only. Must start with a letter.               │
│                                               │
│        [Cancel]   [Clone Database]            │
└───────────────────────────────────────────────┘
Progress Tracking:
Exporting source database... 10%
Creating new database... 40%
Importing data... 60%
Clone completed! 100%
What Gets Cloned:
- ✅ All tables (structure and data)
- ✅ All indexes
- ✅ Foreign key constraints
- ✅ Check constraints
- ❌ Database colors (metadata, not part of SQL dump)
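The progress stages above suggest an export → create → import pipeline. A rough sketch of that flow, where all three helpers are hypothetical wrappers around backend calls:

```typescript
// Clone = export the source, create an empty target, import the dump.
// onProgress mirrors the stages shown in the progress tracker above.
interface CloneDeps {
  exportSql: (sourceUuid: string) => Promise<string>;     // hypothetical
  createDatabase: (name: string) => Promise<string>;      // returns new UUID
  importSql: (targetUuid: string, sql: string) => Promise<void>;
}

async function cloneDatabase(
  sourceUuid: string,
  newName: string,
  deps: CloneDeps,
  onProgress: (message: string, percent: number) => void,
): Promise<string> {
  onProgress("Exporting source database...", 10);
  const dump = await deps.exportSql(sourceUuid);
  onProgress("Creating new database...", 40);
  const targetUuid = await deps.createDatabase(newName);
  onProgress("Importing data...", 60);
  await deps.importSql(targetUuid, dump);
  onProgress("Clone completed!", 100);
  return targetUuid;
}
```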
FTS5 Limitation:
Databases containing FTS5 (Full-Text Search) virtual tables cannot be cloned due to Cloudflare D1's export API limitation.
When attempting to clone an FTS5 database:
┌───────────────────────────────────────────────┐
│ ⚠️ Cannot Clone FTS5 Database                 │
├───────────────────────────────────────────────┤
│ This database contains 2 FTS5 table(s).       │
│ Cloudflare D1's export API does not           │
│ support virtual tables like FTS5.             │
│                                               │
│ To clone this database, first drop the        │
│ FTS5 tables, clone, then recreate them.       │
│                                               │
│   [Cancel]   [Clone Database] (disabled)      │
└───────────────────────────────────────────────┘
Workaround for FTS5 databases:
- Navigate to the FTS5 tab in the source database
- Delete the FTS5 tables (note the configuration first)
- Clone the database
- Recreate the FTS5 tables in the cloned database
Use Cases:
- Create development/staging copies
- Test schema changes safely
- Create backups before major changes
- Share database templates
Similar to database selection:
- Checkbox on each table card
- "Select All" button
- "Clear Selection" to deselect
Duplicate multiple tables with suggested names.
Steps:
- Navigate to a database
- Select tables using checkboxes
- Click "Clone Selected"
- Review/modify suggested names
- Click "Clone Tables"
Name Suggestions:
Original  →  Suggested Name
users     →  users_copy
posts     →  posts_copy
comments  →  comments_copy
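A small sketch of the kind of suffix logic that could produce these suggestions, avoiding collisions with existing table names (not necessarily D1 Manager's exact rule):

```typescript
// Suggest "<name>_copy", then "<name>_copy2", "<name>_copy3", ...
// until the candidate does not collide with an existing table name.
function suggestCloneName(original: string, existingNames: Set<string>): string {
  let candidate = `${original}_copy`;
  for (let n = 2; existingNames.has(candidate); n++) {
    candidate = `${original}_copy${n}`;
  }
  return candidate;
}
```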
Customization:
┌───────────────────────────────────────────────┐
│ Clone 3 Tables                                │
├───────────────────────────────────────────────┤
│ users    → [users_backup    ]                 │
│ posts    → [posts_archive   ]                 │
│ comments → [comments_2024   ]                 │
│                                               │
│        [Cancel]   [Clone Tables]              │
└───────────────────────────────────────────────┘
What Gets Cloned:
- ✅ Table structure (columns, types, constraints)
- ✅ All data rows
- ✅ All indexes
- ❌ Foreign key references (new table is independent)
- ❌ Triggers (SQLite limitation)
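One common SQLite technique that matches this behavior is to rewrite the table's CREATE statement from sqlite_master and then copy the rows; a sketch with hypothetical `runQuery`/`runExec` helpers (D1 Manager's implementation may differ):

```typescript
// Clone a table: re-issue its CREATE statement under the new name,
// then copy all rows. Indexes would need the same sqlite_master
// rewrite (omitted here); triggers are deliberately not copied.
// The naive rename assumes pre-validated identifiers.
async function cloneTable(
  runQuery: (sql: string) => Promise<{ sql: string }[]>, // hypothetical
  runExec: (sql: string) => Promise<void>,               // hypothetical
  source: string,
  target: string,
): Promise<void> {
  const [row] = await runQuery(
    `SELECT sql FROM sqlite_master WHERE type = 'table' AND name = '${source}'`,
  );
  const createSql = row.sql.replace(source, target); // naive first-occurrence rename
  await runExec(createSql);
  await runExec(`INSERT INTO "${target}" SELECT * FROM "${source}"`);
}
```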
Progress:
Cloning table 1 of 3: users → users_copy...
Cloning table 2 of 3: posts → posts_copy...
Cloning table 3 of 3: comments → comments_copy...
Complete: 3 tables cloned ✓
Export multiple tables as SQL or CSV in a ZIP archive.
Steps:
- Select tables using checkboxes
- Click "Export Selected"
- Choose format: SQL or CSV
- Click "Export"
- Download ZIP file
Format Selection:
┌───────────────────────────────────────────────┐
│ Export 3 Tables                               │
├───────────────────────────────────────────────┤
│ Format:                                       │
│   ◉ SQL - Complete table dump                 │
│   ○ CSV - Data only                           │
│                                               │
│ Tables:                                       │
│   • users (152 rows)                          │
│   • posts (1,234 rows)                        │
│   • comments (5,678 rows)                     │
│                                               │
│        [Cancel]   [Export]                    │
└───────────────────────────────────────────────┘
ZIP Contents (SQL):
tables-export.zip
├── users.sql
├── posts.sql
└── comments.sql
ZIP Contents (CSV):
tables-export.zip
├── users.csv
├── posts.csv
└── comments.csv
Use Cases:
- Table-level backups
- Data analysis (CSV)
- Migration preparation
- Sharing data with others
Delete multiple tables with dependency analysis and progress tracking.
Steps:
- Select tables using checkboxes
- Click "Delete Selected"
- Review dependencies for each table
- Check confirmation boxes (if dependencies exist)
- Click "Delete Tables"
Dependencies Display:
┌───────────────────────────────────────────────┐
│ Delete 3 Tables?                              │
├───────────────────────────────────────────────┤
│ ▼ users (2 dependencies)                      │
│   Inbound:                                    │
│   • posts references users (152 rows)         │
│     ON DELETE: CASCADE                        │
│   • comments references users (47 rows)       │
│     ON DELETE: CASCADE                        │
│                                               │
│   ☐ I understand dependent data will be       │
│     deleted                                   │
│                                               │
│ ▼ posts (1 dependency)                        │
│   Outbound:                                   │
│   • references users (user_id)                │
│     ON DELETE: CASCADE                        │
│                                               │
│   ☐ I understand dependent data will be       │
│     deleted                                   │
│                                               │
│ ▶ comments (no dependencies)                  │
│                                               │
│   [Cancel]   [Delete Tables] (disabled)       │
└───────────────────────────────────────────────┘
The Delete button is enabled only when:
- The selected tables have no dependencies, OR
- Every required confirmation box is checked, acknowledging the dependency impact
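Inbound references like those in the dialog can be discovered by scanning every table's PRAGMA foreign_key_list output; a sketch of the general technique, with `runQuery` as a hypothetical SQL helper:

```typescript
// Find tables whose foreign keys point at `target`. Each row of
// PRAGMA foreign_key_list describes one outbound foreign key of the
// table it is run against, including its ON DELETE action.
async function inboundReferences(
  runQuery: (sql: string) => Promise<Record<string, unknown>[]>, // hypothetical
  target: string,
): Promise<{ fromTable: string; onDelete: string }[]> {
  const tables = (await runQuery(
    "SELECT name FROM sqlite_master WHERE type = 'table'",
  )) as { name: string }[];
  const inbound: { fromTable: string; onDelete: string }[] = [];
  for (const { name } of tables) {
    const fks = (await runQuery(`PRAGMA foreign_key_list("${name}")`)) as {
      table: string;
      on_delete: string;
    }[];
    for (const fk of fks) {
      if (fk.table === target) inbound.push({ fromTable: name, onDelete: fk.on_delete });
    }
  }
  return inbound;
}
```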
Progress:
Analyzing dependencies...
Deleting table 1 of 3: users...
Deleting table 2 of 3: posts...
Deleting table 3 of 3: comments...
Complete: 3 tables deleted ✓
Important Notes:
- ⚠️ Cascade Impact: Dependent rows deleted automatically
- ⚠️ Permanent: Cannot be undone
- ✅ Dependency Analysis: Shows what will be affected
- ✅ Safety Checks: Requires explicit confirmation
Best Practices:
- Backup First:
  # Download backups before bulk delete
  # Use bulk download or Wrangler
  wrangler d1 execute database-name --remote --command=".dump" > backup.sql
- Test on Subset:
  - Select 1-2 items first
  - Verify the operation works correctly
  - Then select all items
- Check Dependencies:
  - Review the dependency analysis carefully
  - Understand cascade impacts
  - Confirm you want to proceed
- Monitor Progress:
  - Watch progress indicators
  - Note any errors
  - Don't close the browser during the operation
- Be Patient:
  - Large operations take time
  - Don't refresh the page
  - Wait for the completion message
- Verify Results:
  - Check that the operation completed successfully
  - Verify expected items were created/deleted
  - Test database/table functionality
- Review Errors:
  - Note which items failed (if any)
  - Check error messages
  - Retry individual items if needed
- Clean Up:
  - Clear selections
  - Refresh views to see changes
  - Delete temporary files/backups
Limitations:
- Sequential Processing: One database at a time
- No Parallelization: Operations not concurrent
- Quota Limits: Account-wide D1 quotas apply
- Size Limits: Large databases may timeout
- Protected Databases: System database excluded
- FTS5 Virtual Tables: Databases with FTS5 tables cannot be exported (D1 API limitation)
- Same Database Only: Can't clone across databases
- No Cross-Database: Bulk operations within one database
- Foreign Key Checks: Deletions respect FK constraints
- Memory Limits: Very large tables may fail
- D1 REST API: Standard rate limits apply
- Request Size: Large requests may timeout
- Worker Limits: 50ms CPU time per request (paid plan)
Troubleshooting:
Operation times out:
Cause: Too many items, or items too large
Solutions:
- Select fewer items per operation
- Use Wrangler CLI for very large operations
- Break into smaller batches
Some items failed:
Cause: Individual items had errors
Solutions:
- Check error messages for each failed item
- Verify permissions
- Retry failed items individually
- Check quota limits
Delete blocked by dependencies:
Cause: Tables have foreign key relationships
Solution:
- Review dependency analysis
- Check confirmation boxes
- Delete in correct order (child tables first)
- Or accept cascade deletions
ZIP download fails:
Cause: Browser blocked the download, or a memory issue occurred
Solutions:
- Check browser pop-up blocker
- Allow downloads from site
- Try with fewer items
- Clear browser cache
API Endpoints:

POST /api/databases/bulk-delete
Content-Type: application/json
{
  "databaseIds": ["uuid1", "uuid2", "uuid3"]
}

POST /api/databases/export
Content-Type: application/json
{
  "databases": [
    {"uuid": "uuid1", "name": "db1"},
    {"uuid": "uuid2", "name": "db2"}
  ]
}

POST /api/databases/optimize
Content-Type: application/json
{
  "databaseIds": ["uuid1", "uuid2"]
}

POST /api/tables/:dbId/bulk-export
Content-Type: application/json
{
  "tableNames": ["users", "posts"],
  "format": "sql"
}
See Also:
- Database Management - Individual database operations
- Table Operations - Individual table operations
- Foreign Key Dependencies - Understanding dependencies
- API Reference - Complete API documentation
Need Help? See Troubleshooting or open an issue.