
feat: 🎸 move pricing agent to use a safer delete query#877

Merged
johnyeocx merged 3 commits into main from fix/nuke-batch-delete
Mar 4, 2026
Conversation

@SirTenzin
Member

@SirTenzin commented Mar 4, 2026


Summary by cubic

Switch org-wide deletes to batched queries for customers, features, and products to avoid locks/timeouts when nuking sandbox orgs. Customer deletes remain blocked in Live, and the delete for customers now also scopes by org_id and env.

  • Refactors
    • Added CusService.safeDeleteByOrgId with 250-row batches, inArray, Live-mode guard, and org/env-scoped delete.
    • Added FeatureService.safeDeleteByOrgId and ProductService.safeDeleteByOrgId with 250-row batched deletes.
    • Updated handleNukeOrganisationConfiguration to use safe deletes for customers, features, and products in Sandbox.

Written for commit a214068. Summary will update on new commits.

Greptile Summary

This PR introduces CusService.safeDeleteByOrgId, a batched alternative to the existing deleteByOrgId that selects and deletes customers 250 rows at a time to avoid acquiring a full table lock during large deletions. The handleNukeOrganisationConfiguration handler is updated to use the new safe method.

Key changes:

  • [Improvements] Added CusService.safeDeleteByOrgId — batched (default 250) customer deletion scoped to a non-live environment, preventing long-held table locks during sandbox cleanup.
  • [Bug fixes] handleNukeOrganisationConfiguration now uses safeDeleteByOrgId instead of the single-query deleteByOrgId.
  • [Possible issue] The PR title states the pricing agent (handleSyncPreviewPricing) was also updated, but it still calls the original CusService.deleteByOrgId — this appears to be an incomplete change and the same fix should be applied there.

Confidence Score: 3/5

  • Safe to merge with caution — the new method is correct, but the PR misses the pricing agent handler it claims to fix.
  • The safeDeleteByOrgId implementation is logically sound and the live-mode guard is preserved. However, the PR title explicitly states the goal is to move the pricing agent to the safer delete, yet handleSyncPreviewPricing.ts still calls the old deleteByOrgId. This is an incomplete implementation of the stated intent, dropping the score to 3.
  • server/src/internal/misc/pricingAgent/handlers/handleSyncPreviewPricing.ts — still uses the old deleteByOrgId and should be updated to safeDeleteByOrgId.

Important Files Changed

  • server/src/internal/customers/CusService.ts: Adds safeDeleteByOrgId, which deletes customers in batches (default 250) to avoid table-wide locks; guarded against live mode. Minor: the inner delete statement lacks org_id/env filters, which would add a defensive layer of scoping.
  • server/src/internal/misc/configs/handlers/handleNukeOrganisationConfiguration.ts: Correctly swaps deleteByOrgId for safeDeleteByOrgId in the nuke-organisation handler; no issues in this file itself.

Sequence Diagram

sequenceDiagram
    participant Handler as handleNukeOrganisationConfiguration
    participant CusService
    participant DB as Database (customers table)

    Handler->>CusService: safeDeleteByOrgId(db, orgId, env=Sandbox)
    CusService->>CusService: guard: throw if env == Live

    loop Until no rows remain
        CusService->>DB: SELECT internal_id WHERE org_id=orgId AND env=Sandbox LIMIT 250
        DB-->>CusService: batch of internal_ids (up to 250)
        alt batch is empty
            CusService-->>Handler: done (break)
        else batch has rows
            CusService->>DB: DELETE WHERE internal_id IN (batch)
            DB-->>CusService: rows deleted
        end
    end
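The select-then-delete loop in the diagram can be sketched as follows. This is a minimal illustration of the batching pattern, not the actual CusService code: an in-memory array stands in for the customers table, and the names `Row` and `deleteInBatches` are illustrative, not from the PR.

```typescript
// Sketch of the batched delete pattern: select up to `batchSize` matching ids,
// delete exactly those ids, and repeat until a select returns no rows.
// An in-memory array stands in for the database table.

type Row = { internalId: number; orgId: string; env: "sandbox" | "live" };

function deleteInBatches(
  table: Row[],
  orgId: string,
  env: "sandbox" | "live",
  batchSize = 250,
): number {
  // Live-mode guard, mirroring the check described in the summaries above.
  if (env === "live") {
    throw new Error("safe delete is blocked in live mode");
  }
  let deleted = 0;
  for (;;) {
    // SELECT internal_id WHERE org_id = ? AND env = ? LIMIT batchSize
    const batch = table
      .filter((r) => r.orgId === orgId && r.env === env)
      .slice(0, batchSize)
      .map((r) => r.internalId);
    if (batch.length === 0) break; // no rows remain: done

    // DELETE WHERE internal_id IN (batch)
    const ids = new Set(batch);
    for (let i = table.length - 1; i >= 0; i--) {
      if (ids.has(table[i].internalId)) table.splice(i, 1);
    }
    deleted += batch.length;
  }
  return deleted;
}
```

Each iteration touches at most `batchSize` rows, so no single statement holds locks on the whole table, at the cost of extra round trips.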

Last reviewed commit: fc5ff77

@vercel

vercel bot commented Mar 4, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

  • autumn-vite: Ready (Preview), updated Mar 4, 2026, 0:09am UTC


Contributor

cubic-dev-ai bot left a comment


No issues found across 2 files

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.

@greptile-apps
Contributor

greptile-apps bot commented Mar 4, 2026

Additional Comments (1)

server/src/internal/misc/pricingAgent/handlers/handleSyncPreviewPricing.ts
Pricing agent not updated as stated in PR title

The PR title is "move pricing agent to use a safer delete query," but the pricing agent handler (handleSyncPreviewPricing.ts) still calls CusService.deleteByOrgId on line 74 — the original unbatched method that locks all rows at once. Only handleNukeOrganisationConfiguration.ts was updated to use safeDeleteByOrgId. If the intent is to protect the pricing agent from table-lock issues, this file needs the same fix.

```suggestion
		await CusService.safeDeleteByOrgId({
			db,
			orgId: previewOrg.id,
			env: AppEnv.Sandbox,
		});
```
johnyeocx merged commit aa22ea6 into main Mar 4, 2026
6 of 9 checks passed