Database Performance Optimizations #6
base: db-cleanup-baseline
… deadlocks on MySQL (#80329)

* Split subquery when cleaning annotations
* update comment
* Raise batch size, now that we pay attention to it
* Iterate in batches
* Separate cancellable batch implementation to allow for multi-statement callbacks, add overload for single-statement use
* Use split-out utility in outer batching loop so it respects context cancellation
* guard against empty queries
* Use SQL parameters
* Use same approach for tags
* drop unused function
* Work around parameter limit on sqlite for large batches
* Bulk insert test data in DB
* Refactor test to customise test data creation
* Add test for catching SQLITE_MAX_VARIABLE_NUMBER limit
* Turn annotation cleanup test to integration tests
* lint

Co-authored-by: Sofia Papagiannaki <[email protected]>
User description
PR #7
PR Type
Enhancement, Tests
Description

* Refactor annotation cleanup to use batched ID fetching and deletion
* Handle the SQLite parameter limit (999) for large batch sizes
* Convert cleanup tests to integration tests with parameterized scenarios
* Reduce the cleanup job interval from 10 minutes to 1 minute
Diagram Walkthrough

```mermaid
flowchart LR
  A["Single-statement<br/>DELETE queries"] -->|"Deadlock risk<br/>on MySQL"| B["Split into<br/>ID fetch + delete"]
  B -->|"Batch by ID"| C["fetchIDs"]
  C -->|"Delete batch"| D["deleteByIDs"]
  D -->|"Check DB type"| E{"SQLite with<br/>large batch?"}
  E -->|"Yes"| F["Direct ID<br/>insertion"]
  E -->|"No"| G["SQL parameters"]
  F --> H["Execute delete"]
  G --> H
```

File Walkthrough
cleanup_test.go (pkg/services/annotations/annotationsimpl/cleanup_test.go)

Convert to integration tests with parameterized scenarios:

* Rename tests to `TestIntegration*` and add a skip for short test runs
* Parameterize tests with customizable batch sizes
* Add a test for the SQLite parameter limit
* Add `createTestAnnotations` to bulk insert test data in batches for better performance
* Add the `errors` import for error joining
xorm_store.go (pkg/services/annotations/annotationsimpl/xorm_store.go)

Split cleanup queries into batched ID fetch and delete:

* Replace single-statement DELETE queries with a batched fetch-and-delete approach
* Add a `fetchIDs` method to load annotation IDs into memory, respecting the batch size
* Add a `deleteByIDs` method with a workaround for SQLite's parameter limit (999)
* Add an `asAny` helper to convert an `int64` slice to an `any` slice for SQL parameters
* Add `untilDoneOrCancelled` as a reusable batching loop for multi-statement callbacks
* Update `CleanAnnotations` and `CleanOrphanedAnnotationTags` to use the new batching approach
* Import the `migrator` package to detect the database type

cleanup.go (pkg/services/cleanup/cleanup.go)

Increase cleanup job frequency:

* Run the cleanup job every minute instead of every 10 minutes