Conversation
Co-authored-by: eassanassar <eassanassar@gmail.com>
Walkthrough

Project-wide refactor to split shared package exports into subpaths (types, lists, utils, country-codes) and update imports accordingly across API, Dashboard, Docs, Auth, RPC. Adds a comprehensive initial Drizzle/PostgreSQL schema migration and metadata. Minor optimization in basket user-agent bot detection by precompiling regexes.

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Dependency Review: ✅ No vulnerabilities, license issues, or OpenSSF Scorecard issues found. Scanned Files: None
Actionable comments posted: 15
🔭 Outside diff range comments (70)
packages/rpc/src/utils/cache-invalidation.ts (2)
31-87: Avoid re-creating Drizzle cache instances on every call

Each invalidation call instantiates a new cache instance. If `createDrizzleCache` is non-trivial, this adds overhead. Prefer hoisting and reusing per-namespace instances. Apply this diff to reuse instances:
```diff
@@
 export const invalidateWebsiteCaches = async (
   websiteId: string,
   userId: string,
   reason?: string
 ): Promise<void> => {
   try {
-    await Promise.all([
-      // Website caches
-      createDrizzleCache({ redis, namespace: 'websites' }).invalidateByTables([
-        'websites',
-      ]),
-      createDrizzleCache({ redis, namespace: 'websites' }).invalidateByKey(
-        `getById:${websiteId}`
-      ),
-
-      createDrizzleCache({
-        redis,
-        namespace: 'website_by_id',
-      }).invalidateByKey(`website_by_id:${websiteId}`),
-      createDrizzleCache({ redis, namespace: 'auth' }).invalidateByKey(
-        `auth:${userId}:${websiteId}`
-      ),
-
-      // Funnel caches
-      createDrizzleCache({
-        redis,
-        namespace: 'funnels',
-      }).invalidateByTables(['funnelDefinitions']),
-      createDrizzleCache({ redis, namespace: 'funnels' }).invalidateByKey(
-        `funnels:list:${websiteId}`
-      ),
-      createDrizzleCache({ redis, namespace: 'funnels' }).invalidateByKey(
-        `funnels:listPublic:${websiteId}`
-      ),
-
-      // Goals caches
-      createDrizzleCache({ redis, namespace: 'goals' }).invalidateByTables([
-        'goals',
-      ]),
-      createDrizzleCache({ redis, namespace: 'goals' }).invalidateByKey(
-        `goals:list:${websiteId}`
-      ),
-
-      // Autocomplete caches
-      createDrizzleCache({
-        redis,
-        namespace: 'autocomplete',
-      }).invalidateByTables(['websites']),
-
-      // Mini-charts caches
-      createDrizzleCache({
-        redis,
-        namespace: 'mini-charts',
-      }).invalidateByTables(['websites']),
-      createDrizzleCache({
-        redis,
-        namespace: 'mini-charts',
-      }).invalidateByKey(`mini-charts:${userId}:${websiteId}`),
-      createDrizzleCache({
-        redis,
-        namespace: 'mini-charts',
-      }).invalidateByKey(`mini-charts:public:${websiteId}`),
-    ]);
+    // Reuse per-namespace instances
+    const websitesCache = createDrizzleCache({ redis, namespace: 'websites' });
+    const websiteByIdCache = createDrizzleCache({ redis, namespace: 'website_by_id' });
+    const authCache = createDrizzleCache({ redis, namespace: 'auth' });
+    const funnelsCache = createDrizzleCache({ redis, namespace: 'funnels' });
+    const goalsCache = createDrizzleCache({ redis, namespace: 'goals' });
+    const autocompleteCache = createDrizzleCache({ redis, namespace: 'autocomplete' });
+    const miniChartsCache = createDrizzleCache({ redis, namespace: 'mini-charts' });
+
+    await Promise.all([
+      // Website caches
+      websitesCache.invalidateByTables(['websites']),
+      websitesCache.invalidateByKey(`getById:${websiteId}`),
+
+      // Note: verify `website_by_id` key format (see previous comment)
+      websiteByIdCache.invalidateByKey(`website_by_id:${websiteId}`),
+      authCache.invalidateByKey(`auth:${userId}:${websiteId}`),
+
+      // Funnel caches
+      funnelsCache.invalidateByTables(['funnelDefinitions']),
+      funnelsCache.invalidateByKey(`funnels:list:${websiteId}`),
+      funnelsCache.invalidateByKey(`funnels:listPublic:${websiteId}`),
+
+      // Goals caches
+      goalsCache.invalidateByTables(['goals']),
+      goalsCache.invalidateByKey(`goals:list:${websiteId}`),
+
+      // Autocomplete caches
+      autocompleteCache.invalidateByTables(['websites']),
+
+      // Mini-charts caches
+      miniChartsCache.invalidateByTables(['websites']),
+      miniChartsCache.invalidateByKey(`mini-charts:${userId}:${websiteId}`),
+      miniChartsCache.invalidateByKey(`mini-charts:public:${websiteId}`),
+    ]);
```

If you prefer stronger guarantees, you can even move instance creation to module top-level and reuse across invocations.
31-47: Cache invalidation mismatch — cacheable keys (`cacheable:website_by_id:...`) are not being deleted

Short summary: cacheable builds keys as `cacheable:<prefix>:<stringified args>` (see packages/redis/cacheable.ts). getWebsiteById uses `prefix: 'website_by_id'`, so its keys look like `cacheable:website_by_id:[<websiteId>]`. The invalidation call in packages/rpc/src/utils/cache-invalidation.ts calls `` createDrizzleCache({ namespace: 'website_by_id' }).invalidateByKey(`website_by_id:${websiteId}`) `` — that deletes `website_by_id:website_by_id:<id>` and will not clear the cacheable entries.

Files/locations to fix:
- packages/redis/cacheable.ts — key format: `` cachePrefix = `cacheable:${prefix}` ``; `` getKey = `${cachePrefix}:${stringify(args)}` ``.
- packages/rpc/src/utils/auth.ts — defines getWebsiteById with `cacheable(..., { prefix: 'website_by_id', ... })` (currently not exported).
- packages/rpc/src/utils/cache-invalidation.ts — the invalidation call in the Promise.all block (the `` `website_by_id:${websiteId}` `` line) does not match cacheable's key format.

Suggested fixes (pick one)
Preferred, robust: expose an invalidation API from auth.ts and use it here (keeps key logic centralized).
- In packages/rpc/src/utils/auth.ts: export the cached fn or add a helper:
- export const getWebsiteById = cacheable(...);
- or export function invalidateGetWebsiteById(id: string) { return getWebsiteById.clear(id); }
- In packages/rpc/src/utils/cache-invalidation.ts replace the mismatched invalidation with the cached API:
- // import { getWebsiteById } from './auth';
- getWebsiteById.clear(websiteId),
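To make the mismatch concrete, here is a minimal sketch. The helper names and the `stringify` serializer are illustrative stand-ins for the real cacheable/drizzle implementations, assumed to behave as described above:

```typescript
// Illustrative reconstruction of the two key formats described above.
// `stringify` stands in for cacheable's serializer (assumed JSON-like).
const stringify = (args: unknown[]) => JSON.stringify(args);

// What cacheable writes: "cacheable:<prefix>:<stringified args>"
const cacheableKey = (prefix: string, args: unknown[]) =>
  `cacheable:${prefix}:${stringify(args)}`;

// What the drizzle invalidation targets: "<namespace>:<key>"
const drizzleKey = (namespace: string, key: string) => `${namespace}:${key}`;

const websiteId = 'abc';
const stored = cacheableKey('website_by_id', [websiteId]);
// -> 'cacheable:website_by_id:["abc"]'
const deleted = drizzleKey('website_by_id', `website_by_id:${websiteId}`);
// -> 'website_by_id:website_by_id:abc'

// The keys never match, so the cached entry survives "invalidation".
console.log(stored !== deleted); // true
```

This is why the preferred fix routes invalidation through the cacheable API itself: the key format stays in one place.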
Alternative: use drizzle to delete the exact cacheable key (less ideal because it duplicates key-format logic):
- Replace the current call with:
  `` createDrizzleCache({ redis, namespace: 'cacheable' }).invalidateByKey(`website_by_id:[${websiteId}]`) ``
- (This works because createDrizzleCache will prefix `cacheable:` and produce `cacheable:website_by_id:[<id>]`, but it's brittle if stringify changes.)

Please update packages/rpc/src/utils/cache-invalidation.ts and either export/add an invalidation helper in packages/rpc/src/utils/auth.ts or change the drizzle invalidation to match
cacheable's namespace/format.

packages/rpc/src/utils/rate-limit.ts (3)
34-36: Sliding-window weight is inverted; rate limiter will miscount

`prevWindowWeight` should decay as time advances within the current window. The current code increases the previous window's weight as time passes, which is the opposite of the standard sliding-window algorithm. This makes the limiter too permissive at the start and too strict at the end of a window. Apply this diff to correct the weighting (and simplify arithmetic with windowMs):
```diff
@@
-    const windowSeconds = parseDurationToSeconds(this.config.duration);
-    const key = `${this.config.namespace}:${identifier}`;
-    const now = Date.now();
+    const windowSeconds = parseDurationToSeconds(this.config.duration);
+    const windowMs = windowSeconds * 1000;
+    const key = `${this.config.namespace}:${identifier}`;
+    const now = Date.now();
@@
-    const currentWindowStart =
-      Math.floor(now / (windowSeconds * 1000)) * (windowSeconds * 1000);
-    const prevWindowStart = currentWindowStart - windowSeconds * 1000;
+    const currentWindowStart = Math.floor(now / windowMs) * windowMs;
+    const prevWindowStart = currentWindowStart - windowMs;
@@
-    const timeInCurrentWindow = now - currentWindowStart;
-    const prevWindowWeight = timeInCurrentWindow / (windowSeconds * 1000);
+    const timeInCurrentWindow = now - currentWindowStart;
+    const prevWindowWeight = Math.max(0, Math.min(1, 1 - timeInCurrentWindow / windowMs));
@@
-    const resetTime = currentWindowStart + windowSeconds * 1000;
+    const resetTime = currentWindowStart + windowMs;
```

And in getStatus():

```diff
@@
-    const key = `${this.config.namespace}:${identifier}`;
-    const now = Date.now();
-    const windowSeconds = parseDurationToSeconds(this.config.duration);
+    const key = `${this.config.namespace}:${identifier}`;
+    const now = Date.now();
+    const windowSeconds = parseDurationToSeconds(this.config.duration);
+    const windowMs = windowSeconds * 1000;
@@
-    const currentWindowStart =
-      Math.floor(now / (windowSeconds * 1000)) * (windowSeconds * 1000);
-    const prevWindowStart = currentWindowStart - windowSeconds * 1000;
+    const currentWindowStart = Math.floor(now / windowMs) * windowMs;
+    const prevWindowStart = currentWindowStart - windowMs;
@@
-    const timeInCurrentWindow = now - currentWindowStart;
-    const prevWindowWeight = timeInCurrentWindow / (windowSeconds * 1000);
+    const timeInCurrentWindow = now - currentWindowStart;
+    const prevWindowWeight = Math.max(0, Math.min(1, 1 - timeInCurrentWindow / windowMs));
@@
-    return {
+    return {
       success: floatingWindowCount < this.config.limit,
       limit: this.config.limit,
       remaining: Math.max(0, this.config.limit - floatingWindowCount),
-      reset: currentWindowStart + windowSeconds * 1000,
+      reset: currentWindowStart + windowMs,
     };
```

Also applies to: 103-105
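For reference, the corrected weighting can be sketched in isolation. This is a minimal sketch assuming fixed adjacent windows; `estimateCount` is an illustrative name, not the module's API:

```typescript
// Weighted sliding-window estimate: the previous window's count contributes
// fully at the start of the current window and decays linearly to zero.
function estimateCount(
  now: number,
  windowMs: number,
  prevCount: number,
  currCount: number
): number {
  const currentWindowStart = Math.floor(now / windowMs) * windowMs;
  const timeInCurrentWindow = now - currentWindowStart;
  const prevWeight = Math.max(0, Math.min(1, 1 - timeInCurrentWindow / windowMs));
  return prevCount * prevWeight + currCount;
}

// At the start of a window the previous window counts in full...
console.log(estimateCount(60_000, 60_000, 10, 0)); // 10
// ...and halfway through it contributes only half its weight.
console.log(estimateCount(90_000, 60_000, 10, 4)); // 9
```

With the inverted formula, the two calls above would come out reversed in effect: near-zero weight at the window boundary and full weight just before it, which is what makes the current limiter too permissive early and too strict late.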
54-58: Defensively coerce Redis pipeline result for INCR to a number

Some Redis clients return numbers as strings from pipelines. Casting with `as number` risks runtime errors. Coerce and validate. Apply this diff:

```diff
-      const newCurrentWindowCount = results[2]?.[1] as number;
-
-      if (typeof newCurrentWindowCount !== 'number') {
-        throw new Error('Invalid Redis response');
-      }
+      const newCurrentWindowCountRaw = results[2]?.[1];
+      const newCurrentWindowCount = Number(newCurrentWindowCountRaw);
+      if (!Number.isFinite(newCurrentWindowCount)) {
+        throw new Error('Invalid Redis response');
+      }
```
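The same coercion can be isolated into a tiny helper for illustration — a sketch of the `Number`/`Number.isFinite` approach suggested above; `parseIncrResult` is a hypothetical name:

```typescript
// Coerce a pipeline INCR result that may arrive as a string (or something
// unexpected entirely) into a validated number; throw otherwise.
function parseIncrResult(raw: unknown): number {
  const count = Number(raw);
  if (!Number.isFinite(count)) {
    throw new Error('Invalid Redis response');
  }
  return count;
}

console.log(parseIncrResult('42')); // 42
console.log(parseIncrResult(7)); // 7
// parseIncrResult('not-a-number') throws Error('Invalid Redis response')
```

One caveat worth noting: `Number(null)` and `Number('')` are both `0`, so if a missing pipeline slot is a realistic failure mode you may want an explicit nullish check before coercing.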
199-223: Consider broader client IP header support (optional)

If you operate across multiple proxies/CDNs, optionally check headers like `true-client-ip`, `x-client-ip`, or provider-specific ones (e.g., Fly.io's `fly-client-ip`) before falling back to `x-forwarded-for`. I can add these conditionally without changing existing precedence.
packages/auth/src/auth.ts (1)
131-146: Replace `any` with minimal structural types in email sender callbacks

Avoid `any` in TS code per guidelines. Both places only use `user.email`, so a narrow type is sufficient.

```diff
-      }: {
-        user: any;
-        url: string;
-      }) => {
+      }: {
+        user: { email: string };
+        url: string;
+      }) => {
```

```diff
-      }: {
-        user: any;
-        url: string;
-      }) => {
+      }: {
+        user: { email: string };
+        url: string;
+      }) => {
```

Also applies to: 153-171
apps/dashboard/app/(main)/websites/[id]/funnels/_components/edit-funnel-dialog.tsx (5)
307-311: Fix controlled Sheet handler: respect the boolean argument of onOpenChange

Passing `handleClose` directly can trigger close on open toggles. Use the boolean param to close only when false.

```diff
-    <Sheet onOpenChange={handleClose} open={isOpen}>
+    <Sheet
+      onOpenChange={(open) => {
+        if (!open) handleClose();
+      }}
+      open={isOpen}
+    >
```
400-403: Replace `any` with proper DnD provided/snapshot types

Type safety matters here and prevents accidental misuse of the DnD API.

```diff
-import {
-  DragDropContext,
-  Draggable,
-  Droppable,
-  type DropResult,
-} from '@hello-pangea/dnd';
+import {
+  DragDropContext,
+  Draggable,
+  Droppable,
+  type DropResult,
+  type DroppableProvided,
+  type DroppableStateSnapshot,
+  type DraggableProvided,
+  type DraggableStateSnapshot,
+} from '@hello-pangea/dnd';
```

```diff
-          {(provided: any, snapshot: any) => (
+          {(provided: DroppableProvided, snapshot: DroppableStateSnapshot) => (
```

```diff
-              {(provided: any, snapshot: any) => (
+              {(provided: DraggableProvided, snapshot: DraggableStateSnapshot) => (
```

Also applies to: 416-433
314-316: Tailwind guideline: use `rounded`, avoid `rounded-lg`/`rounded-xl`

This file uses sizes prohibited by the repo guidelines. Replace with `rounded` for consistency.

```diff
-        <div className="rounded-xl border border-primary/20 bg-primary/10 p-3">
+        <div className="rounded border border-primary/20 bg-primary/10 p-3">
```

```diff
-                className={`space-y-4 transition-colors duration-150 ${
-                  snapshot.isDraggingOver
-                    ? 'rounded-lg bg-accent/10 p-1'
-                    : ''
-                }`}
+                className={`space-y-4 transition-colors duration-150 ${
+                  snapshot.isDraggingOver
+                    ? 'rounded bg-accent/10 p-1'
+                    : ''
+                }`}
```

Also applies to: 403-407
471-475: Keys for filters should be stable and not index-based

Index keys can cause input focus/controlled-state issues when filters are added/removed/reordered. Prefer a persistent id on each filter.
I recommend updating useFilters to include an id per filter and return it, e.g.:
- When adding a filter, attach { id: crypto.randomUUID(), ... }.
- Update this component to use key={filter.id}.
Want me to implement that across useFilters and here?
410-416: Avoid using array index for keys and draggableId in DnD list — add a stable client-only id

Confirmed: apps/dashboard/app/(main)/websites/[id]/funnels/_components/edit-funnel-dialog.tsx uses the array index for both draggableId and key in the Draggable map — this will break React reconciliation and DnD stability on reorder. Please update to use a stable id.
Files / locations to change:
- apps/dashboard/app/(main)/websites/[id]/funnels/_components/edit-funnel-dialog.tsx — the Draggable inside the Droppable mapping (where formData.steps.map is rendered).
- Add a client id to FunnelStep (update the FunnelStep type in hooks/use-funnels or make it optional locally).
Minimal quick fix (less ideal but less invasive)
```diff
-            {formData.steps.map((step, index) => (
-              <Draggable
-                draggableId={`step-${index}`}
-                index={index}
-                key={`step-${index}`}
-              >
+            {formData.steps.map((step, index) => (
+              <Draggable
+                draggableId={`step-${step.type}:${step.target || index}`}
+                index={index}
+                key={`step-${step.type}:${step.target || index}`}
+              >
```

Recommended (robust) fix — persist a client-only id for each step
- Add a small id generator near the top of the component:
```ts
const genClientId = () =>
  typeof crypto !== 'undefined' && 'randomUUID' in crypto
    ? crypto.randomUUID()
    : Math.random().toString(36).slice(2);
```
- Ensure steps always have clientId when initializing/updating formData:
```diff
-    steps: [
-      { type: 'PAGE_VIEW' as const, target: '/', name: 'Landing Page' },
-      { type: 'PAGE_VIEW' as const, target: '/signup', name: 'Sign Up Page' },
-    ],
+    steps: [
+      { type: 'PAGE_VIEW' as const, target: '/', name: 'Landing Page', clientId: genClientId() },
+      { type: 'PAGE_VIEW' as const, target: '/signup', name: 'Sign Up Page', clientId: genClientId() },
+    ],
```

When loading an existing funnel:
```ts
setFormData({
  ...funnel,
  filters: funnel.filters || [],
  steps: (funnel.steps || []).map((s) => ({
    ...s,
    clientId: (s as any).clientId ?? genClientId(),
  })),
});
```
- When adding a step:
```diff
-      { type: 'PAGE_VIEW' as const, target: '', name: '' },
+      { type: 'PAGE_VIEW' as const, target: '', name: '', clientId: genClientId() },
```
- Use clientId for Draggable id/key:
```diff
-              <Draggable
-                draggableId={`step-${index}`}
-                index={index}
-                key={`step-${index}`}
-              >
+              <Draggable
+                draggableId={`${step.clientId}`}
+                index={index}
+                key={`${step.clientId}`}
+              >
```
packages/rpc/src/routers/mini-charts.ts (1)
31-34: Minor: avoid redundant normalization/sorting of websiteIds.Ids are normalized in getMiniCharts; getAuthorizedWebsiteIds repeats sorting in the cache key. Not harmful, but you can drop one to shave a few CPU cycles.
Also applies to: 44-48
apps/dashboard/app/(main)/websites/[id]/goals/_components/edit-goal-dialog.tsx (4)
6-12: Normalize Phosphor icon usage and naming; fix non-existent component names

Per guidelines, append 'Icon' to Phosphor components and use consistent naming. Also, '@phosphor-icons/react' doesn't export PencilIcon/PlusIcon/TrashIcon as such; alias real components to the expected names. This reduces confusion and avoids runtime/type errors.
Apply these diffs to normalize imports and usage:
```diff
-import {
-  Eye,
-  MouseMiddleClick,
-  PencilIcon,
-  PlusIcon,
-  Target,
-  TrashIcon,
-} from '@phosphor-icons/react';
+import {
+  Eye as EyeIcon,
+  MouseMiddleClick as MouseMiddleClickIcon,
+  Pencil as PencilIcon,
+  Plus as PlusIcon,
+  Target as TargetIcon,
+  Trash as TrashIcon,
+} from '@phosphor-icons/react';
```

```diff
-const getGoalIcon = (type: string) => {
+const getGoalIcon = (type: string) => {
   switch (type) {
     case 'PAGE_VIEW':
-      return <Eye className="text-blue-600" size={16} weight="duotone" />;
+      return <EyeIcon className="text-blue-600" size={16} weight="duotone" />;
     case 'EVENT':
-      return (
-        <MouseMiddleClick
+      return (
+        <MouseMiddleClickIcon
           className="text-green-600"
           size={16}
           weight="duotone"
         />
       );
     default:
-      return (
-        <Target
+      return (
+        <TargetIcon
           className="text-muted-foreground"
           size={16}
           weight="duotone"
         />
       );
   }
 };
```

```diff
-      {isCreateMode ? (
-        <Target
+      {isCreateMode ? (
+        <TargetIcon
           className="h-6 w-6 text-primary"
           size={16}
           weight="duotone"
         />
       ) : (
         <PencilIcon
           className="h-6 w-6 text-primary"
           size={16}
           weight="duotone"
         />
       )}
```

```diff
-      <Target
+      <TargetIcon
         className="h-5 w-5 text-primary"
         size={16}
         weight="duotone"
       />
```

```diff
-      <MouseMiddleClick
+      <MouseMiddleClickIcon
         className="text-green-600"
         size={14}
         weight="duotone"
       />
```

Also applies to: 193-214, 229-241, 300-305, 332-350, 441-447, 460-465
478-481: Add explicit type to the primary action button

Guidelines: always include a type attribute for button elements. You already do this elsewhere; add it here for consistency.

```diff
-          <Button
+          <Button
             className="relative rounded-lg bg-gradient-to-r from-primary to-primary/90 shadow-lg transition-all duration-200 hover:from-primary/90 hover:to-primary hover:shadow-xl"
             disabled={!isFormValid || isSaving}
             onClick={handleSubmit}
+            type="button"
           >
```
387-451: Avoid using array index in keys for dynamic lists

Keys like `` key={`filter-${index}-...`} `` can cause reconciliation issues when items are added/removed/reordered. Prefer a stable ID per filter (e.g., generated when the filter is created) and use that as the key.

If useFilters doesn't assign an id to new filters yet, I can provide a small refactor to add an id field (e.g., via nanoid) and update the key usage here. Want me to draft it?
228-231: Tailwind rounding: replace `rounded-lg`/`rounded-xl` with `rounded` in edit-goal-dialog.tsx

Repo guideline requires using `rounded` only — this file contains multiple `rounded-lg`/`rounded-xl` occurrences; please normalize them.
- apps/dashboard/app/(main)/websites/[id]/goals/_components/edit-goal-dialog.tsx
- 228: rounded-xl → rounded
- 266: rounded-lg → rounded
- 285: rounded-lg → rounded
- 310: rounded-xl → rounded
- 317: rounded-lg → rounded
- 327: rounded-lg → rounded
- 330: rounded-lg → rounded
- 354: rounded-lg → rounded
- 365: rounded-lg → rounded
- 391: rounded-lg → rounded
- 400: rounded-lg → rounded
- 403: rounded-lg → rounded
- 418: rounded-lg → rounded
- 421: rounded-lg → rounded
- 431: rounded-lg → rounded
- 441: rounded-lg → rounded
- 454: rounded-lg → rounded
- 470: rounded-lg → rounded
- 478: rounded-lg → rounded
If you want, I can open a follow-up PR that replaces these instances (and updates referenced UI primitives) to keep visual consistency.
apps/dashboard/app/(main)/websites/[id]/_components/tabs/performance-tab.tsx (1)
5-12: Migrate from Lucide to Phosphor icons to comply with repository guidelines

Guidelines specify using Phosphor icons (weight="duotone") and naming them with the Icon suffix. This file imports from lucide-react. Recommend swapping to @phosphor-icons/react equivalents and aliasing names to XIcon for consistency (e.g., Zap -> LightningIcon; TrendingUp -> TrendUpIcon; AlertTriangle -> WarningIcon; CheckCircle -> CheckCircleIcon; Monitor -> MonitorIcon; Smartphone -> DeviceMobileIcon).
If you confirm preferred Phosphor equivalents for each, I can supply a precise diff updating imports and JSX usage throughout the file.
Also applies to: 205-213, 232-233, 252-253, 280-281, 548-561
apps/dashboard/app/(main)/websites/[id]/_components/utils/types.ts (3)
1-6: Import ReactNode explicitly for type usage

Avoid relying on the global React namespace; import the type directly.

```diff
 } from '@databuddy/shared/types';
+
+import type { ReactNode } from 'react';
```
39-46: Replace index signature `any` with a safer type

Using `any` undermines type-safety. Prefer `unknown` (or a constrained union if you know the value shapes).

```diff
 export interface MetricPoint {
   date: string;
   pageviews?: number;
   visitors?: number;
   sessions?: number;
   bounce_rate?: number;
-  [key: string]: any;
+  [key: string]: unknown;
 }
```
53-58: Make TableColumn generic and remove `any` usage

Typing the row/value improves safety and autocompletion across table renderers.

```diff
-export interface TableColumn {
-  accessorKey: string;
-  header: string;
-  cell?: (value: any, row?: any) => React.ReactNode;
-  className?: string;
-}
+export interface TableColumn<T = Record<string, unknown>> {
+  accessorKey: keyof T & string;
+  header: string;
+  cell?: (value: T[keyof T], row?: T) => ReactNode;
+  className?: string;
+}
```

apps/dashboard/hooks/use-filters.ts (2)
4-14: Type `operator` from operatorOptions instead of `string`

This narrows the operator field and prevents invalid values.

```diff
 export const operatorOptions = [
   { value: 'equals', label: 'equals' },
   { value: 'contains', label: 'contains' },
   { value: 'not_equals', label: 'does not equal' },
 ] as const;

+type Operator = typeof operatorOptions[number]['value'];
+
 type BaseFilterType = {
   field: DynamicQueryFilter['field'];
-  operator: string;
+  operator: Operator;
   value: DynamicQueryFilter['value'];
 };
```
46-54: Tighten updateFilter types to ensure value matches the selected field

Use a generic parameter to bind field and value types together.

```diff
-const updateFilter = useCallback(
-  (index: number, field: keyof T, value: T[keyof T]) => {
+const updateFilter = useCallback(
+  <K extends keyof T>(index: number, field: K, value: T[K]) => {
     const newFilters = filters.map((filter, i) =>
       i === index ? { ...filter, [field]: value } : filter
     );
     onFiltersChange(newFilters);
   },
   [filters, onFiltersChange]
 );
```

apps/docs/app/(home)/api/query-types-grid.tsx (1)
60-63: Mark decorative icon as hidden from screen readers

The Caret icon is purely decorative. Hide it from assistive tech to reduce noise.

```diff
-            <CaretRightIcon
+            <CaretRightIcon
+              aria-hidden="true"
               className="mt-0.5 hidden h-4 w-4 text-muted-foreground transition-transform group-hover:translate-x-0.5 sm:block"
               weight="fill"
             />
```

apps/dashboard/app/(main)/websites/[id]/map/page.tsx (2)
145-148: Fix mismatched aria-label on MapPinIcon

The label reads "Globe" but the icon is a map pin. Align the accessible name with the visual.

```diff
-          <MapPinIcon
-            aria-label="Globe"
+          <MapPinIcon
+            aria-label="Map pin"
             className="h-5 w-5 text-primary"
             weight="duotone"
           />
```
203-216: Avoid using array index as React key for skeletons

Prefer stable keys to avoid reconciliation pitfalls, even for placeholders.

```diff
-          {new Array(6).fill(0).map((_, i) => (
+          {SKELETON_KEYS.map((key) => (
             <div
               className="flex items-center justify-between p-3"
-              key={`country-skeleton-${i + 1}`}
+              key={`country-skeleton-${key}`}
             >
```

Add this near the top of the file (outside the selected range):
```ts
const SKELETON_KEYS = ['a', 'b', 'c', 'd', 'e', 'f'] as const;
```

apps/docs/app/(home)/blog/[slug]/page.tsx (5)
38-41: Next.js props should not be Promises
`params` and `searchParams` are passed as objects by the App Router. Typing them as `Promise<...>` is misleading and forces unnecessary `await`s.

```diff
-interface PageProps {
-  params: Promise<{ slug: string }>;
-  searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
-}
+interface PageProps {
+  params: { slug: string };
+  searchParams: { [key: string]: string | string[] | undefined };
+}
```
43-47: Remove unnecessary await on params in generateMetadata

Simplify slug extraction now that `params` is correctly typed.

```diff
-export async function generateMetadata({
-  params,
-}: PageProps): Promise<Metadata> {
-  const slug = (await params).slug;
+export async function generateMetadata({ params }: PageProps): Promise<Metadata> {
+  const slug = params.slug;
```
93-99: Fix PostPage params type and remove unnecessary await
`params` is not a Promise in App Router page components.

```diff
-export default async function PostPage({
-  params,
-}: {
-  params: Promise<{ slug: string }>;
-}) {
-  const { slug } = await params;
+export default async function PostPage({
+  params,
+}: {
+  params: { slug: string };
+}) {
+  const { slug } = params;
```
318-318: Tailwind rule: use `rounded` only

Project guideline specifies using `rounded` only. Replace `rounded-2xl`.

```diff
-        <div className="mx-auto mb-3 flex h-16 w-16 items-center justify-center rounded-2xl bg-muted/20">
+        <div className="mx-auto mb-3 flex h-16 w-16 items-center justify-center rounded bg-muted/20">
```
64-66: Use numeric width/height for Metadata images (fix required)

Next.js Metadata expects numbers for width/height; these are currently strings and can cause type errors.
- apps/docs/app/(home)/blog/[slug]/page.tsx — twitter.images (lines ~64–66)
- apps/docs/app/(home)/blog/[slug]/page.tsx — openGraph.images (lines ~77–79)
```diff
-        url: data.post.coverImage ?? `${SITE_URL}/og.webp`,
-        width: '1200',
-        height: '630',
+        url: data.post.coverImage ?? `${SITE_URL}/og.webp`,
+        width: 1200,
+        height: 630,
         alt: data.post.title,
```

Apply the same change to both the twitter.images and openGraph.images blocks.
apps/dashboard/components/analytics/map-component.tsx (3)
104-113: Fix per-capita calculation to match "per million people" label

You currently compute raw per-capita (count/population) but display "per million people". Multiply by 1,000,000.

```diff
-      const perCapitaValue = population > 0 ? item.count / population : 0;
+      const perCapitaValue =
+        population > 0 ? (item.count / population) * 1_000_000 : 0;
```

Also applies to: 513-519
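As a sanity check, the suggested normalization reduces to this small sketch (`perMillion` is an illustrative name, not the component's API):

```typescript
// "Per million people" = raw per-capita rate scaled by 1,000,000,
// guarding against a zero/unknown population.
function perMillion(count: number, population: number): number {
  return population > 0 ? (count / population) * 1_000_000 : 0;
}

// 50 visitors from a country of 10M is ~5 per million people.
console.log(perMillion(50, 10_000_000)); // ≈ 5
// Unknown population falls back to 0 rather than dividing by zero.
console.log(perMillion(50, 0)); // 0
```

Without the `* 1_000_000` factor, the raw per-capita values are tiny fractions (5e-6 in the example above), which is why the displayed numbers currently disagree with the "per million people" label.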
445-455: Invalid inline style values: '!important' not supported; bad hex color

React style props don't support `!important`. Also `#fffff` is invalid (should be `#ffffff`). This can break theming.

```diff
 style={{
   height: '100%',
-  backgroundColor:
-    resolvedTheme === 'dark'
-      ? '#1f2937 !important'
-      : '#fffff !important',
+  backgroundColor:
+    resolvedTheme === 'dark'
+      ? '#1f2937'
+      : '#ffffff',
   cursor: 'default',
   outline: 'none',
-  zIndex: '1',
+  zIndex: 1,
 }}
```
424-424: Remove incorrect ARIA role
`role="tablist"` is semantically wrong here and can confuse assistive tech. Drop the role.

```diff
-      role="tablist"
```

apps/api/src/agent/handlers/metric-handler.ts (1)
38-42: Limit SQL contents in logs and gate by environment

Avoid logging full SQL in production to reduce leakage risk; gate logs to non-prod.

```diff
-    console.error('❌ Metric SQL execution error', {
-      error:
-        queryError instanceof Error ? queryError.message : 'Unknown error',
-      sql: parsedAiJson.sql,
-    });
+    if (process.env.NODE_ENV !== 'production') {
+      console.error('❌ Metric SQL execution error', {
+        error:
+          queryError instanceof Error ? queryError.message : 'Unknown error',
+        // Avoid logging full SQL in production
+        sql: parsedAiJson.sql?.slice(0, 200),
+      });
+    }
```

If you prefer, I can replace this with a structured logger and redact patterns (e.g., literals) proactively. Want me to open a follow-up?
apps/dashboard/hooks/use-funnels.ts (5)
266-275: Do not call tRPC hooks inside React Query’s queryFn; this breaks Rules of Hooks and won’t work
`queryFn` must return a promise, not a React hook result. Calling `trpc.funnels.getAnalytics.useQuery` inside `useQueries` will cause runtime errors and violates React's Rules of Hooks. Refactor to use `trpc.useQueries` (tRPC helper) instead of TanStack's `useQueries` here. Apply this diff:
```diff
-  const funnels = useQueries({
-    queries: funnelIds.map((funnelId) => ({
-      queryKey: ['funnels', 'getAnalytics', { websiteId, funnelId, dateRange }],
-      queryFn: () =>
-        trpc.funnels.getAnalytics.useQuery({
-          websiteId,
-          funnelId,
-          startDate: dateRange?.start_date,
-          endDate: dateRange?.end_date,
-        }),
-      enabled: enabled && !!websiteId && !!funnelId,
-    })),
-  });
+  const queries = trpc.useQueries((t) =>
+    funnelIds.map((funnelId) =>
+      t.funnels.getAnalytics.useQuery(
+        {
+          websiteId,
+          funnelId,
+          startDate: dateRange?.start_date,
+          endDate: dateRange?.end_date,
+        },
+        { enabled: enabled && !!websiteId && !!funnelId }
+      )
+    )
+  );
```

Follow-up changes (outside this range) are required below to use `queries` instead of `funnels`.
279-289: Adjust variable names after refactor to trpc.useQueries

After switching to `trpc.useQueries`, update references accordingly.

```diff
-  return funnels.map((query, index) => {
+  return queries.map((query, index) => {
     const data = query.data;
     return {
       funnelId: funnelIds[index],
       data: data ? data : null,
       isLoading: query.isLoading,
       error: query.error,
     };
   });
```
291-294: Update isLoading aggregate to use queries collection

Reflect the rename from `funnels` to `queries`.

```diff
   return {
     data: comparisonData,
-    isLoading: funnels.some((q) => q.isLoading),
+    isLoading: queries.some((q) => q.isLoading),
   };
```
306-322: Same issue: tRPC hook used inside React Query's queryFn

This repeats the hooks misuse. Use `trpc.useQueries` to batch query definitions safely.

```diff
-  const results = useQueries({
-    queries: (funnels || []).map((funnel) => ({
-      queryKey: [
-        'funnels',
-        'getAnalytics',
-        { websiteId, funnelId: funnel.id, dateRange },
-      ],
-      queryFn: () =>
-        trpc.funnels.getAnalytics.useQuery({
-          websiteId,
-          funnelId: funnel.id,
-          startDate: dateRange?.start_date,
-          endDate: dateRange?.end_date,
-        }),
-      enabled: enabled && !!websiteId && !!funnel.id,
-    })),
-  });
+  const results = trpc.useQueries((t) =>
+    (funnels || []).map((funnel) =>
+      t.funnels.getAnalytics.useQuery(
+        {
+          websiteId,
+          funnelId: funnel.id,
+          startDate: dateRange?.start_date,
+          endDate: dateRange?.end_date,
+        },
+        { enabled: enabled && !!websiteId && !!funnel.id }
+      )
+    )
+  );
```
103-106: Improve cache invalidation: use tRPC utils rather than manual queryKey arrays

Manually crafting nested query keys like `[['funnels', 'list']]` is brittle and likely won't match tRPC's internal keys. Prefer `trpc.useUtils()` and call typed invalidators.

```diff
-  const createMutation = trpc.funnels.create.useMutation({
-    onSuccess: () => {
-      queryClient.invalidateQueries({ queryKey: [['funnels', 'list']] });
-    },
-  });
+  const createMutation = trpc.funnels.create.useMutation({
+    onSuccess: async () => {
+      await utils.funnels.list.invalidate({ websiteId });
+    },
+  });
```

```diff
-  const updateMutation = trpc.funnels.update.useMutation({
-    onSuccess: () => {
-      queryClient.invalidateQueries({ queryKey: [['funnels', 'list']] });
-      queryClient.invalidateQueries({
-        queryKey: [['funnels', 'getAnalytics']],
-      });
-    },
-  });
+  const updateMutation = trpc.funnels.update.useMutation({
+    onSuccess: async () => {
+      await Promise.all([
+        utils.funnels.list.invalidate({ websiteId }),
+        utils.funnels.getAnalytics.invalidate(),
+      ]);
+    },
+  });
```

```diff
-  const deleteMutation = trpc.funnels.delete.useMutation({
-    onSuccess: () => {
-      queryClient.invalidateQueries({ queryKey: [['funnels', 'list']] });
-      queryClient.invalidateQueries({
-        queryKey: [['funnels', 'getAnalytics']],
-      });
-    },
-  });
+  const deleteMutation = trpc.funnels.delete.useMutation({
+    onSuccess: async () => {
+      await Promise.all([
+        utils.funnels.list.invalidate({ websiteId }),
+        utils.funnels.getAnalytics.invalidate(),
+      ]);
+    },
+  });
```

Add this initialization near your other hooks (outside the selected ranges):
// Add near line 85 after queryClient
const utils = trpc.useUtils();

Also applies to: 109-115, 118-124
apps/dashboard/hooks/use-goals.ts (2)
47-50: Use tRPC utils for precise, type-safe cache invalidation

The nested array queryKey patterns are brittle and may not match tRPC’s internal keys. Use
`trpc.useUtils()` and invalidate via typed paths.

- const createMutation = trpc.goals.create.useMutation({
-   onSuccess: () => {
-     queryClient.invalidateQueries({ queryKey: [['goals', 'list']] });
-   },
- });
+ const createMutation = trpc.goals.create.useMutation({
+   onSuccess: async (_, variables) => {
+     await utils.goals.list.invalidate({ websiteId: variables.websiteId });
+   },
+ });

- const updateMutation = trpc.goals.update.useMutation({
-   onSuccess: () => {
-     queryClient.invalidateQueries({ queryKey: [['goals', 'list']] });
-     queryClient.invalidateQueries({ queryKey: [['goals', 'getAnalytics']] });
-   },
- });
+ const updateMutation = trpc.goals.update.useMutation({
+   onSuccess: async () => {
+     await Promise.all([
+       utils.goals.list.invalidate(),
+       utils.goals.getAnalytics.invalidate(),
+     ]);
+   },
+ });

- const deleteMutation = trpc.goals.delete.useMutation({
-   onSuccess: () => {
-     queryClient.invalidateQueries({ queryKey: [['goals', 'list']] });
-     queryClient.invalidateQueries({ queryKey: [['goals', 'getAnalytics']] });
-   },
- });
+ const deleteMutation = trpc.goals.delete.useMutation({
+   onSuccess: async () => {
+     await Promise.all([
+       utils.goals.list.invalidate(),
+       utils.goals.getAnalytics.invalidate(),
+     ]);
+   },
+ });

Add this (outside the selected ranges) near your other hooks:
const utils = trpc.useUtils();

Also applies to: 51-56, 57-62
100-105: Unify dateRange typing with shared DateRange type

Use the shared DateRange for consistency with funnels and other modules.
- dateRange: { start_date: string; end_date: string },
+ dateRange: DateRange,

- dateRange: { start_date: string; end_date: string },
+ dateRange: DateRange,

Add this import at the top (outside the selected ranges):
import type { DateRange } from '@databuddy/shared/types';

Also applies to: 117-122
apps/api/src/query/utils.ts (1)
289-293: Avoid generating invalid SQL when all clauses are filtered out

If every condition matches UNSAFE_CLAUSE_REGEX, this returns
`AND ()`, which is invalid SQL. Return an empty string when no safe clauses remain.

export function buildWhereClause(conditions?: string[]): string {
  if (!conditions?.length) {
    return '';
  }
  const safeClauses = conditions.filter(
    (clause) => !UNSAFE_CLAUSE_REGEX.test(clause)
  );
- return `AND (${safeClauses.join(' AND ')})`;
+ return safeClauses.length ? `AND (${safeClauses.join(' AND ')})` : '';
}

apps/api/src/routes/query.ts (3)
100-111: Critical: domainCache keying uses query parameter names instead of website IDs

The batch path builds uniqueWebsiteIds from request.parameters (query types), not from website IDs. This causes unnecessary DB calls with invalid IDs and the prefetch cache won’t be used later since executeDynamicQuery looks up by website_id from query params.
Fix by deriving the websiteId from queryParams and prefetching only that domain.
Apply this diff:
- if (Array.isArray(body)) {
-   const uniqueWebsiteIds = [
-     ...new Set(
-       body.flatMap((req) =>
-         req.parameters.map((param) =>
-           typeof param === 'string' ? param : param.name
-         )
-       )
-     ),
-   ];
-   const domainCache = await getCachedWebsiteDomain(uniqueWebsiteIds);
+ if (Array.isArray(body)) {
+   const websiteId = queryParams.website_id;
+   const domainCache: Record<string, string | null> = websiteId
+     ? { [websiteId]: await getWebsiteDomain(websiteId) }
+     : {};
-import { getCachedWebsiteDomain, getWebsiteDomain } from '../lib/website-utils'; +import { getWebsiteDomain } from '../lib/website-utils';
31-39: Micro-opt: avoid remapping filterOptions for every query builderallowedFilters fallback remaps filterOptions on every entry. Cache it once to save repeated work.
You can hoist this above to compute once:
+const defaultAllowedFilters = filterOptions.map((f) => f.value); const configs = Object.fromEntries( Object.entries(QueryBuilders).map(([key, config]) => { const baseConfig = { - allowedFilters: - config.allowedFilters ?? - filterOptions.map((filter) => filter.value), + allowedFilters: config.allowedFilters ?? defaultAllowedFilters, customizable: config.customizable, defaultLimit: config.limit, };
76-81: Avoid leaking raw error messages to clientsReturning error.message may expose internals. Consider a generic message (or gating details by environment).
Example:
- error: error instanceof Error ? error.message : 'Compilation failed', + error: 'Compilation failed',and
- error: error instanceof Error ? error.message : 'Query failed', + error: 'Query failed',Also applies to: 149-153
apps/dashboard/app/(main)/organizations/[slug]/components/website-selector.tsx (1)
19-23: Tailwind guideline: useroundedinstead ofrounded-lgThe project guidelines require using rounded only (not rounded-md/xl/lg).
Apply this diff:
- 'flex w-full items-center gap-3 rounded-lg border p-3 text-left transition-all duration-200', + 'flex w-full items-center gap-3 rounded border p-3 text-left transition-all duration-200',apps/dashboard/app/(main)/websites/[id]/errors/_components/errors-page-content.tsx (4)
160-168: Avoidanyin reducers; add minimal types for safetyThe coding guidelines discourage any. Type the reducer items to the fields you read.
Apply this diff:
- const totalErrors = errorTypes.reduce( - (sum: number, type: any) => sum + (type.count || 0), - 0 - ); + const totalErrors = errorTypes.reduce( + (sum: number, type: { count?: number }) => sum + (type.count ?? 0), + 0 + ); const uniqueErrorTypes = errorTypes.length; - const affectedUsers = errorTypes.reduce( - (sum: number, type: any) => sum + (type.users || 0), - 0 - ); + const affectedUsers = errorTypes.reduce( + (sum: number, type: { users?: number }) => sum + (type.users ?? 0), + 0 + );
268-269: Tailwind guideline: useroundedinstead ofrounded-xlConform to the repo’s Tailwind rule.
Apply this diff:
- <Card className="rounded-xl border-red-200 bg-red-50 dark:border-red-800 dark:bg-red-950"> + <Card className="rounded border-red-200 bg-red-50 dark:border-red-800 dark:bg-red-950">
288-289: Tailwind guideline: useroundedinstead ofrounded-lgAlign with style guide.
Apply this diff:
- <Button - className="gap-2 rounded-lg" + <Button + className="gap-2 rounded"
202-206: Avoidanyin chart mapping; add minimal point shapeType the trend point instead of using any.
Apply this diff:
- return processedData.error_trends.map((point: any) => ({ + return processedData.error_trends.map( + (point: { date: string | number | Date; errors?: number; users?: number }) => ({ date: safeFormatDate(point.date, 'MMM d'), 'Total Errors': point.errors || 0, 'Affected Users': point.users || 0, - })); + }) + );apps/docs/app/(home)/api/actions.ts (2)
31-38: Fail fast if API key is missing; avoid casting env var to stringCasting possibly-undefined env var to string risks sending the literal "undefined" header and hard-to-diagnose API behavior. Validate the presence of DATABUDDY_API_KEY and use a local variable.
Apply this diff:
- const response = await fetch(url.toString(), { + const apiKey = process.env.DATABUDDY_API_KEY; + if (!apiKey) { + throw new Error('DATABUDDY_API_KEY is not set'); + } + const response = await fetch(url.toString(), { method: 'GET', headers: { 'Content-Type': 'application/json', - 'X-Api-Key': process.env.DATABUDDY_API_KEY as string, + 'X-Api-Key': apiKey, }, cache: 'force-cache', });
31-38: Consider explicit revalidation over force-cacheFor server actions, prefer Next.js fetch caching hints to make staleness explicit. For example, revalidate every 5 minutes.
Apply this minimal change:
- const response = await fetch(url.toString(), { + const response = await fetch(url.toString(), { method: 'GET', headers: { 'Content-Type': 'application/json', 'X-Api-Key': apiKey, }, - cache: 'force-cache', + next: { revalidate: 300 }, });apps/dashboard/hooks/use-dynamic-query.ts (5)
35-44: Avoid passing numbers to URLSearchParams init; standardize param buildingnew URLSearchParams({ ...additionalParams }) expects string values; numbers can lead to typing issues and inconsistencies. Build params via set/append and ensure string conversion.
Apply this diff:
function buildParams( websiteId: string, dateRange?: DateRange, additionalParams?: Record<string, string | number> ): URLSearchParams { - const params = new URLSearchParams({ - website_id: websiteId, - ...additionalParams, - }); + const params = new URLSearchParams(); + params.set('website_id', websiteId); + if (additionalParams) { + for (const [k, v] of Object.entries(additionalParams)) { + params.set(k, String(v)); + } + }
201-210: Stabilize React Query keys to avoid unnecessary cache missesUsing object references (dateRange, queryData) in queryKey can cause cache churn due to identity changes. Key off primitive fields and a serialized query instead.
Apply this diff:
- const query = useQuery({
-   queryKey: ['dynamic-query', websiteId, dateRange, queryData, userTimezone],
+ const query = useQuery({
+   queryKey: [
+     'dynamic-query',
+     websiteId,
+     dateRange.start_date,
+     dateRange.end_date,
+     dateRange.granularity,
+     userTimezone,
+     JSON.stringify(queryData),
+   ],
    queryFn: fetchData,
    ...defaultQueryOptions,
    ...options,
    enabled:
      options?.enabled !== false && !!websiteId && queryData.parameters.length > 0,
  });
664-681: Guard window access for SSR safety in referrer parsing

If this hook is ever imported in a server-rendered context, referencing window will throw. Add a typeof window check.
Apply this diff:
- if (session.referrer) {
+ if (session.referrer) {
    try {
-     const url = new URL(session.referrer);
-     referrerParsed = {
-       type:
-         url.hostname === window.location.hostname ? 'internal' : 'external',
-       name: url.hostname,
-       domain: url.hostname,
-     };
+     const url = new URL(session.referrer);
+     const isBrowser =
+       typeof window !== 'undefined' && typeof window.location !== 'undefined';
+     const sameHost = isBrowser
+       ? url.hostname === window.location.hostname
+       : false;
+     referrerParsed = {
+       type: sameHost ? 'internal' : 'external',
+       name: url.hostname,
+       domain: url.hostname,
+     };
    } catch {
      referrerParsed = {
        type: 'direct',
        name: 'Direct',
        domain: null,
      };
    }
  }
684-689: Use getCountryName for consistent country naming

Elsewhere (profiles) you normalize country names via getCountryName; here you keep the raw value. For consistency, derive both code and canonical name.
Apply this diff:
- const countryCode = getCountryCode(session.country || '');
- const countryName = session.country || 'Unknown';
+ const countryCode = getCountryCode(session.country || '');
+ const countryName = getCountryName(session.country || '') || 'Unknown';
571-601: Harden pagination logic against missing meta.page

If the API ever omits meta.page, next/prev logic can break. Guard with Number.isFinite checks.
Apply this diff:
- getNextPageParam: (lastPage) => {
-   const sessions = (lastPage.data as any)?.session_list || [];
-   return sessions.length === limit ? lastPage.meta.page + 1 : undefined;
- },
- getPreviousPageParam: (firstPage) => {
-   return firstPage.meta.page > 1 ? firstPage.meta.page - 1 : undefined;
- },
+ getNextPageParam: (lastPage) => {
+   const sessions = (lastPage.data as any)?.session_list || [];
+   const page = (lastPage as any)?.meta?.page;
+   return sessions.length === limit && Number.isFinite(page)
+     ? page + 1
+     : undefined;
+ },
+ getPreviousPageParam: (firstPage) => {
+   const page = (firstPage as any)?.meta?.page;
+   return Number.isFinite(page) && page > 1 ? page - 1 : undefined;
+ },
32-32: Tailwind guideline: use `rounded` instead of `rounded-lg`

Project guidelines specify using rounded only in Tailwind. Replace rounded-lg.
Apply this diff:
- <div className="rounded-lg border border-border/50 bg-accent/30 px-2 py-2">
+ <div className="rounded border border-border/50 bg-accent/30 px-2 py-2">
81-86: Avoid logging raw SQL in production logsLogging the generated SQL unconditionally can leak sensitive data and increases risk. Gate logs behind admin/debug or redact SQL.
Apply this diff:
- } catch (queryError: unknown) {
-   console.error('❌ SQL execution error', {
-     error: queryError instanceof Error ? queryError.message : 'Unknown error',
-     sql: parsedAiJson.sql,
-   });
+ } catch (queryError: unknown) {
+   const errMsg = queryError instanceof Error ? queryError.message : 'Unknown error';
+   if (context.user?.role === 'ADMIN') {
+     console.error('❌ SQL execution error', {
+       error: errMsg,
+       // Avoid logging full SQL in non-dev environments
+       sql: process.env.NODE_ENV === 'development' ? parsedAiJson.sql : '[REDACTED]',
+     });
+   } else {
+     console.error('❌ SQL execution error', { error: errMsg });
+   }
39-47: Avoid passing numbers to URLSearchParams init; standardize param buildingLike in use-dynamic-query, initialize URLSearchParams without numeric values to satisfy DOM typings and ensure consistent serialization.
Apply this diff:
function buildParams( websiteId: string, dateRange?: DateRange, additionalParams?: Record<string, string | number> ): URLSearchParams { - const params = new URLSearchParams({ - website_id: websiteId, - ...additionalParams, - }); + const params = new URLSearchParams(); + // No website_id here since it's already in the path below + if (additionalParams) { + for (const [k, v] of Object.entries(additionalParams)) { + params.set(k, String(v)); + } + }
72-79: Remove duplicated website identifier in query string

The websiteId is already part of the path. Also, buildParams should not implicitly add website_id here to avoid duplication and potential confusion downstream.
Apply this diff (works with the param-builder change above):
- const params = buildParams(websiteId, dateRange);
+ const params = buildParams(websiteId, dateRange);
  const url = `${API_BASE_URL}/v1/revenue/analytics/website/${websiteId}?${params}`;

And ensure buildParams no longer adds website_id by default (see previous diff).
93-109: Align default query options with other hooks (optional)

Elsewhere you use refetchOnMount: true to show loading on refresh. If intentional to differ here, ignore; otherwise align for consistency.
No diff provided since this may be a deliberate product choice. If desired, change refetchOnMount to true.
5-5: Inconsistent API base-url fallbacks found — please centralize and align defaults

Confirmed: multiple different fallbacks for NEXT_PUBLIC_API_URL exist. The file under review uses http://localhost:4000 while most dashboard hooks use http://localhost:3001; other places use 8000 or the production domain. Please pick the intended default and centralize it.
Files/locations found (fallback shown):
- apps/dashboard/app/(main)/websites/[id]/revenue/hooks/use-website-revenue.ts — || 'http://localhost:4000' (line 5)
- apps/dashboard/app/(main)/sandbox/reddit-mentions/hooks/use-reddit-mentions.ts — || 'http://localhost:8000' (line 66)
- apps/dashboard/hooks/use-domain-info.ts — || 'http://localhost:3001' (line 5)
- apps/dashboard/hooks/use-dynamic-query.ts — || 'http://localhost:3001' (line 32)
- apps/dashboard/hooks/use-revenue-config.ts — || 'http://localhost:3001' (line 10)
- apps/dashboard/app/(main)/revenue/hooks/use-revenue-config.ts — || 'http://localhost:3001' (line 13)
- apps/dashboard/app/providers.tsx — || 'http://localhost:3001' (lines 51, 70)
- apps/docs/app/(home)/api/actions.ts and apps/docs/app/(home)/api/query-type-dialog.tsx — || 'https://api.databuddy.cc' (docs)
- dashboard.Dockerfile — ENV NEXT_PUBLIC_API_URL=https://localhost (lines 24, 45)
- additional usages (no fallback or dynamic): apps/dashboard/app/(main)/websites/[id]/assistant/hooks/use-chat.ts, apps/dashboard/hooks/use-organizations.ts
Recommended action (short):
- Create a single shared source for the API base URL (env-only or an exported constant/module) and update these files to consume it so defaults can't drift.
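The shared-source recommendation above can be sketched as a tiny module; the path, function name, and the 3001 default below are illustrative assumptions, not the repo's actual code:

```typescript
// Hypothetical shared module (e.g. apps/dashboard/lib/api-url.ts) acting as the
// single source of truth for the dashboard's API base URL.
const DEFAULT_API_URL = 'http://localhost:3001'; // the fallback most hooks already use

export function getApiBaseUrl(env: Record<string, string | undefined>): string {
	const raw = env.NEXT_PUBLIC_API_URL?.trim();
	if (!raw) {
		return DEFAULT_API_URL;
	}
	// Strip a trailing slash so callers can safely append `/v1/...` paths.
	return raw.endsWith('/') ? raw.slice(0, -1) : raw;
}

// Call sites would then do:
//   const API_BASE_URL = getApiBaseUrl(process.env);
```

Each hook importing this helper (instead of repeating `process.env.NEXT_PUBLIC_API_URL || '...'`) makes a future default change a one-line edit.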
apps/dashboard/app/(main)/websites/[id]/page.tsx (4)
321-324: Phosphor icon weight: use “fill” for arrows per project convention

Guideline: default to duotone; use fill for arrows. Set
`weight="fill"` on ArrowClockwiseIcon.

- <ArrowClockwiseIcon
+ <ArrowClockwiseIcon
    aria-hidden="true"
-   className={`h-4 w-4 ${isRefreshing ? 'animate-spin' : ''}`}
+   className={`h-4 w-4 ${isRefreshing ? 'animate-spin' : ''}`}
+   weight="fill"
  />
292-292: Tailwind rounding: align with “use rounded only” convention

This file uses
`rounded-md`. The guidelines specify using `rounded` only. Suggest updating these instances.

- <div className="flex h-8 overflow-hidden rounded-md border bg-background shadow-sm">
+ <div className="flex h-8 overflow-hidden rounded border bg-background shadow-sm">

- <div className="flex items-center gap-2 overflow-x-auto rounded-md border bg-background p-1 shadow-sm">
+ <div className="flex items-center gap-2 overflow-x-auto rounded border bg-background p-1 shadow-sm">
401-404: Avoid array index in React keysUsing the array index in keys can cause subtle UI bugs on re-order/removal. Build a stable key from filter properties.
- <div - className="flex items-center gap-0 rounded border bg-background py-1 pr-2 pl-3 shadow-sm" - key={`filter-${index}-${filter.field}-${filter.operator}`} - > + <div + className="flex items-center gap-0 rounded border bg-background py-1 pr-2 pl-3 shadow-sm" + key={`filter-${filter.field}-${filter.operator}-${JSON.stringify(filter.value)}`} + >
446-452: Redundant state update: rely on Tabs onValueChange and value

You already control the active tab via
Tabs’s `value` and `onValueChange`. The `onClick={() => setActiveTab(tab.id)}` on each `TabsTrigger` is redundant.

- <TabsTrigger
+ <TabsTrigger
    className="relative h-10 cursor-pointer touch-manipulation whitespace-nowrap rounded-none px-2 text-xs transition-colors hover:bg-muted/50 sm:px-4 sm:text-sm"
    key={tab.id}
-   onClick={() => setActiveTab(tab.id)}
    value={tab.id}
  >
127-133: A11y: avoid assigning combobox role to a button; use proper ARIA patternPer guidelines, don’t assign ARIA roles that don’t match the element. A Button should not be given
role="combobox". Preferaria-haspopup="listbox",aria-expanded, and link it to the popup list viaaria-controls. Give the popup list anid.- <Button - aria-expanded={isOpen} - className="w-full flex-1 justify-between overflow-x-auto overflow-y-hidden bg-transparent px-3" - role="combobox" - variant="outline" - > + <Button + aria-expanded={isOpen} + aria-haspopup="listbox" + aria-controls="filter-value-list" + className="w-full flex-1 justify-between overflow-x-auto overflow-y-hidden bg-transparent px-3" + variant="outline" + > {value === '' ? 'Select a value' : value} </Button>- <CommandList> + <CommandList id="filter-value-list"> {filteredSuggestions.map((suggestion) => ( <CommandItemAlso applies to: 143-155
305-316: Controlled Dropdown: remove manual onClick toggle on the triggerYou already control the menu via
openandonOpenChange. TogglingisOpenagain inonClickcan cause double toggles and racey UI. Let Radix control the trigger.- <Button - aria-expanded={isOpen} - aria-haspopup="menu" - aria-label="Add filter" - className="h-8" - onClick={() => setIsOpen(!isOpen)} - variant="outline" - > + <Button + aria-expanded={isOpen} + aria-haspopup="menu" + aria-label="Add filter" + className="h-8" + variant="outline" + >
258-262: Avoid index as key in lists (even for skeletons)Use a stable identifier. Since this skeleton mirrors filters, you can derive keys from
filterOptions.- {Array.from({ length: Math.min(numberOfFilters, 5) }).map( - (_, index) => ( - <Skeleton className="h-8 w-full" key={`filter-skeleton-${index}`} /> - ) - )} + {filterOptions + .slice(0, Math.min(numberOfFilters, 5)) + .map((opt) => ( + <Skeleton + className="h-8 w-full" + key={`filter-skeleton-${opt.value}`} + /> + ))}
// Precompile bot regexes once to avoid per-request overhead
const compiledBotRegexes = bots.map((bot) => ({
	name: bot.name,
	category: bot.category,
	regex: new RegExp(bot.regex, 'i'),
}));
🛠️ Refactor suggestion
Short-circuit on missing User-Agent before running regex checks
Currently, detection scans all compiled regexes before checking for a missing UA, doing unnecessary work for bot/malformed requests. Early-return to avoid the scan.
export function detectBot(
userAgent: string,
request: Request
): {
isBot: boolean;
reason?: string;
category?: string;
botName?: string;
} {
- const ua = userAgent || '';
-
- const detectedBot = compiledBotRegexes.find((bot) => bot.regex.test(ua));
- if (detectedBot) {
- return {
- isBot: true,
- reason: 'known_bot_user_agent',
- category: 'Known Bot',
- botName: detectedBot.name,
- };
- }
-
- if (!userAgent) {
+ if (!userAgent) {
return {
isBot: true,
reason: 'missing_user_agent',
category: 'Missing Headers',
};
}
+ const ua = userAgent || '';
+
+ const detectedBot = compiledBotRegexes.find((bot) => bot.regex.test(ua));
+ if (detectedBot) {
+ return {
+ isBot: true,
+ reason: 'known_bot_user_agent',
+ category: 'Known Bot',
+ botName: detectedBot.name,
+ };
+ }

Also applies to: 96-116
🧰 Tools
🪛 ast-grep (0.38.6)
[warning] 83-83: Regular expression constructed from variable input detected. This can lead to Regular Expression Denial of Service (ReDoS) attacks if the variable contains malicious patterns. Use libraries like 'recheck' to validate regex safety or use static patterns.
Context: new RegExp(bot.regex, 'i')
Note: [CWE-1333] Inefficient Regular Expression Complexity [REFERENCES]
- https://owasp.org/www-community/attacks/Regular_expression_Denial_of_Service_-_ReDoS
- https://cwe.mitre.org/data/definitions/1333.html
(regexp-from-variable)
🤖 Prompt for AI Agents
In apps/basket/src/utils/user-agent.ts around lines 80 to 86 (and also apply
same change to the logic at lines 96-116), the code compiles and iterates all
bot regexes before checking whether the incoming User-Agent header exists;
change the function(s) to short-circuit immediately when userAgent is
null/undefined/empty (e.g., if (!userAgent) return the appropriate
non-bot/malformed result) before running any regex tests, and ensure any callers
still receive the correct return shape when UA is missing.
💡 Verification agent
🧩 Analysis chain
Mitigate potential ReDoS risk from variable-driven regex patterns
Static analysis correctly flags new RegExp(bot.regex, 'i'). While these patterns are internal (not user-supplied), an adversarial UA can still trigger worst-case backtracking. Consider:
- Using a safe regex engine (e.g., RE2-backed solutions) for untrusted inputs.
- Pre-validating patterns using a linter (safe-regex) at build-time.
- Defensive guards: quick substring heuristics before regex pass to reduce scan volume.
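The substring heuristic in the last bullet can be sketched as follows; the token list and function name are illustrative assumptions, not the basket app's actual code:

```typescript
// Cheap pre-filter: a UA containing none of these tokens can skip the full
// regex scan. The token list is illustrative; in practice it would be derived
// from the bot patterns themselves so the guard never produces false negatives.
const BOT_HINT_TOKENS = ['bot', 'crawl', 'spider', 'slurp', 'fetch', 'http'];

export function mightBeBot(userAgent: string): boolean {
	const ua = userAgent.toLowerCase();
	return BOT_HINT_TOKENS.some((token) => ua.includes(token));
}

// Sketch of use inside detectBot, before the compiledBotRegexes.find(...) pass:
//   if (!mightBeBot(ua)) {
//     return { isBot: false };
//   }
```

Since typical browser UAs contain none of the tokens, most human traffic never touches the compiled regexes at all.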
I can add a build-time check that scans packages/shared/src/lists/bots.ts for potentially catastrophic patterns and surfaces a CI warning. Want me to draft that script?
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "1) Locate user-agent.ts containing 'compiledBotRegexes'..."
ua_file=$(rg --hidden -n --no-line-number "compiledBotRegexes" -S --files-with-matches | head -n1 || true)
if [ -n "$ua_file" ]; then
echo "Found: $ua_file"
echo "---- start: $ua_file (1-240) ----"
sed -n '1,240p' "$ua_file"
echo "---- end snippet ----"
else
echo "compiledBotRegexes not found."
fi
echo
echo "2) Try to locate bots list file (common exports: 'const bots =', 'export const bots', 'export default bots')..."
bots_file=$(rg --hidden -n --no-line-number "export .*bots|const bots =|export default .*bots" -S --files-with-matches | head -n1 || true)
if [ -n "$bots_file" ]; then
echo "Found bots list file: $bots_file"
echo "---- start: $bots_file (1-400) ----"
sed -n '1,400p' "$bots_file"
echo "---- end snippet ----"
else
echo "No bots list file found with those patterns. Searching for files named bots.ts..."
bots_file=$(fd --hidden -IH bots.ts || true)
if [ -n "$bots_file" ]; then
echo "fd found: $bots_file"
echo "---- start: $bots_file (1-400) ----"
sed -n '1,400p' "$bots_file"
echo "---- end snippet ----"
else
echo "No bots.ts found."
fi
fi
echo
echo "3) Search repo for other dynamic RegExp usages (`new RegExp(...)`)..."
rg --hidden -n "new RegExp\(" -S || true
echo
echo "4) Search for usages of bot.regex or similar references..."
rg --hidden -n "bot\.regex|bots\]" -S || true
echo
echo "5) If bots file found, search it for suspicious constructs (backrefs, nested quantifiers)..."
if [ -n "${bots_file:-}" ]; then
echo "Searching for backreferences (\\1, \\2, ...):"
rg -n "\\\\[0-9]" "$bots_file" || true
echo "Searching for patterns that look like nested quantifiers (e.g., '(.+)+', '(.+)*', '(.+){2,}'):"
rg -n "\\(\\.\\+\\)\\+|\\(\\.\\+\\)\\*|\\(\\.\\+\\)\\{[0-9]" "$bots_file" || true
echo "Searching for other risky tokens like '(.+)' followed by '+' or '*' in the same literal:"
rg -n "\\)\\+" "$bots_file" || true
fi
echo
echo "Script finished."Length of output: 6413
🏁 Script executed:
#!/bin/bash
set -euo pipefail
dir="packages/shared/src/lists"
echo "1) List files in $dir"
ls -la "$dir" || true
echo
echo "2) Locate bots file by scanning exports/definitions"
bots_file=$(rg --hidden -n --no-line-number "(^|\s)(export\s+const\s+bots|const\s+bots\s*=|export\s+default\s+bots)" -S --files-with-matches | head -n1 || true)
if [ -z "$bots_file" ]; then
# fallback: any file named bots.*
bots_file=$(fd --hidden -IH "^bots\\." "$dir" || true)
fi
if [ -n "$bots_file" ]; then
echo "Found bots file: $bots_file"
echo "---- start: $bots_file (1-400) ----"
sed -n '1,400p' "$bots_file"
echo "---- end snippet ----"
else
echo "No explicit bots file found via patterns. Searching for any files exporting lists named 'bots'..."
rg --hidden -n "bots\\s*[:=]\\s*\\[" -S || true
fi
echo
if [ -n "$bots_file" ]; then
echo "3) Show lines with regex declarations in $bots_file"
rg -n "regex\\s*:\\s*['\"]" "$bots_file" || true
echo
echo "4) Quick heuristics for risky regex constructs in $bots_file"
echo "- Backreferences (\\1, \\2, ...):"
rg -n "\\\\[0-9]" "$bots_file" || true
echo "- Nested/repeated quantifiers (e.g. '(.+)+', '(.*)+', '(.+)*'):"
rg -n "\\(\\.\\+\\)\\+|\\(\\.\\*\\)\\+|\\(\\.\\+\\)\\*|\\(\\.\\*\\)\\*|\\(\\.\\+\\)\\{[0-9]" "$bots_file" || true
echo "- Any ')+' occurrences (simple indicator of repeating groups):"
rg -n "\\)\\+" "$bots_file" || true
echo "- Very long regex literals (>200 chars):"
awk '{
if (match($0,/regex[[:space:]]*:[[:space:]]*["'\'']/)) {
line=$0
# accumulate continued lines until closing quote found
q=substr($0, RSTART+RLENGTH)
if (gsub(/["'\'']/,"",q)==0) {
# attempt simple length check on the whole line
if (length(line) > 200) print NR ":" line
} else {
if (length(line) > 200) print NR ":" line
}
}
}' "$bots_file" || true
fi
echo
echo "Script finished."Length of output: 38333
Action: low immediate ReDoS risk — recommend adding a CI safe-regex scan (I can draft it)
Short summary:
- I inspected apps/basket/src/utils/user-agent.ts and packages/shared/src/lists/bots.ts (device-detector-derived).
- Heuristic checks for backreferences, nested/repeated quantifiers, ')+' patterns and very long regex literals returned no obvious ReDoS patterns.
- Because these patterns are internal, the immediate risk is low — but they run against untrusted UA strings, so defensive CI/runtime checks are still worthwhile.
Locations:
- apps/basket/src/utils/user-agent.ts — compiledBotRegexes: new RegExp(bot.regex, 'i') (around line ~80).
- packages/shared/src/lists/bots.ts — source list of regex patterns.
Recommended (minimal, prioritized):
- Add a build-time safe-regex scan (e.g., safe-regex / safe-regex-cli or custom checker) on packages/shared/src/lists/bots.ts to surface risky patterns in CI.
- Optional runtime guards: keep UA length checks (already present), consider a fast substring pre-check (e.g., require "bot"/known token) before running full regexes as defence-in-depth.
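The build-time scan proposed above could start from something like this; the two heuristics are a crude stand-in for a real analyzer (safe-regex, recheck), and the CI wiring in the trailing comment is illustrative:

```typescript
// Crude build-time heuristics: flag regex sources with a quantified group that
// is itself quantified (e.g. `(.+)+`) or with a backreference (e.g. `\1`).
// A real check would delegate to a dedicated analyzer (safe-regex, recheck).
const RISKY_SHAPES = [
	/\((?:[^()]|\\.)*[+*]\)[+*{]/, // nested quantifier like (.+)+ or (\w*)*
	/\\[1-9]/, // backreference like \1
];

export function findRiskyPatterns(patterns: string[]): string[] {
	return patterns.filter((p) => RISKY_SHAPES.some((shape) => shape.test(p)));
}

// CI wiring sketch:
//   const risky = findRiskyPatterns(bots.map((b) => b.regex));
//   if (risky.length > 0) {
//     console.error('Potentially catastrophic bot regexes:', risky);
//     process.exit(1);
//   }
```

Run against packages/shared/src/lists/bots.ts in CI, this would surface risky patterns at merge time instead of at request time.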
Want me to draft the build-time script that scans packages/shared/src/lists/bots.ts and emits a CI warning/fail for risky patterns?
🤖 Prompt for AI Agents
In apps/basket/src/utils/user-agent.ts around lines 80–86 the code compiles
external bot regexes with new RegExp(bot.regex, 'i'); add a CI build-time
safe-regex scan against packages/shared/src/lists/bots.ts (use safe-regex or
safe-regex-cli or a small custom checker) that fails or warns on patterns
flagged as ReDoS-risky; additionally add a defensive runtime guard: keep the
existing UA length check, wrap regex compilation/tests in try/catch and
optionally add a cheap substring token pre-check (e.g., look for "bot" or known
tokens) before running the full regex tests to reduce exposure.
| CREATE TYPE "public"."ab_test_status" AS ENUM('draft', 'running', 'paused', 'completed');--> statement-breakpoint | ||
| CREATE TYPE "public"."ab_variant_type" AS ENUM('visual', 'redirect', 'code');--> statement-breakpoint | ||
| CREATE TYPE "public"."api_key_type" AS ENUM('user', 'sdk', 'automation');--> statement-breakpoint | ||
| CREATE TYPE "public"."api_resource_type" AS ENUM('global', 'website', 'ab_experiment', 'feature_flag');--> statement-breakpoint | ||
| CREATE TYPE "public"."api_scope" AS ENUM('read:data', 'write:data', 'read:experiments', 'track:events', 'admin:apikeys');--> statement-breakpoint | ||
| CREATE TYPE "public"."FunnelGoalType" AS ENUM('COMPLETION', 'STEP_CONVERSION', 'TIME_TO_CONVERT');--> statement-breakpoint | ||
| CREATE TYPE "public"."FunnelStepType" AS ENUM('PAGE_VIEW', 'EVENT', 'CUSTOM');--> statement-breakpoint | ||
| CREATE TYPE "public"."MemberRole" AS ENUM('owner', 'admin', 'member', 'viewer');--> statement-breakpoint | ||
| CREATE TYPE "public"."OrganizationRole" AS ENUM('admin', 'owner', 'member', 'viewer');--> statement-breakpoint | ||
| CREATE TYPE "public"."Role" AS ENUM('ADMIN', 'USER', 'EARLY_ADOPTER', 'INVESTOR', 'BETA_TESTER', 'GUEST');--> statement-breakpoint | ||
| CREATE TYPE "public"."UserStatus" AS ENUM('ACTIVE', 'SUSPENDED', 'INACTIVE');--> statement-breakpoint | ||
| CREATE TYPE "public"."VerificationStatus" AS ENUM('PENDING', 'VERIFIED', 'FAILED');--> statement-breakpoint | ||
| CREATE TYPE "public"."WebsiteStatus" AS ENUM('ACTIVE', 'HEALTHY', 'UNHEALTHY', 'INACTIVE', 'PENDING');--> statement-breakpoint | ||
CREATE TABLE "ab_experiments" (
    "id" text PRIMARY KEY NOT NULL,
    "websiteId" text NOT NULL,
    "name" text NOT NULL,
    "description" text,
    "status" "ab_test_status" DEFAULT 'draft' NOT NULL,
    "trafficAllocation" integer DEFAULT 100 NOT NULL,
    "startDate" timestamp(3),
    "endDate" timestamp(3),
    "primaryGoal" text,
    "createdBy" text NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "deletedAt" timestamp(3)
);
--> statement-breakpoint
CREATE TABLE "ab_goals" (
    "id" text PRIMARY KEY NOT NULL,
    "experimentId" text NOT NULL,
    "name" text NOT NULL,
    "type" text NOT NULL,
    "target" text NOT NULL,
    "description" text,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "ab_variants" (
    "id" text PRIMARY KEY NOT NULL,
    "experimentId" text NOT NULL,
    "name" text NOT NULL,
    "type" "ab_variant_type" DEFAULT 'visual' NOT NULL,
    "content" jsonb NOT NULL,
    "trafficWeight" integer DEFAULT 50 NOT NULL,
    "isControl" boolean DEFAULT false NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "account" (
    "id" text PRIMARY KEY NOT NULL,
    "account_id" text NOT NULL,
    "provider_id" text NOT NULL,
    "user_id" text NOT NULL,
    "access_token" text,
    "refresh_token" text,
    "id_token" text,
    "access_token_expires_at" timestamp,
    "refresh_token_expires_at" timestamp,
    "scope" text,
    "password" text,
    "created_at" timestamp NOT NULL,
    "updated_at" timestamp NOT NULL
);
--> statement-breakpoint
CREATE TABLE "apikey" (
    "id" text PRIMARY KEY NOT NULL,
    "name" text NOT NULL,
    "prefix" text NOT NULL,
    "start" text NOT NULL,
    "key" text NOT NULL,
    "key_hash" text,
    "user_id" text,
    "organization_id" text,
    "type" "api_key_type" DEFAULT 'user' NOT NULL,
    "scopes" "api_scope"[] DEFAULT '{}' NOT NULL,
    "enabled" boolean DEFAULT true NOT NULL,
    "revoked_at" timestamp,
    "rate_limit_enabled" boolean DEFAULT true NOT NULL,
    "rate_limit_time_window" integer,
    "rate_limit_max" integer,
    "request_count" integer DEFAULT 0 NOT NULL,
    "remaining" integer,
    "last_request" timestamp,
    "last_refill_at" timestamp,
    "refill_interval" integer,
    "refill_amount" integer,
    "expires_at" timestamp,
    "metadata" jsonb DEFAULT '{}'::jsonb,
    "created_at" timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updated_at" timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "apikey_access" (
    "id" text PRIMARY KEY NOT NULL,
    "apikey_id" text NOT NULL,
    "resource_type" "api_resource_type" DEFAULT 'global' NOT NULL,
    "resource_id" text,
    "scopes" "api_scope"[] DEFAULT '{}' NOT NULL,
    "created_at" timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updated_at" timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "funnel_definitions" (
    "id" text PRIMARY KEY NOT NULL,
    "websiteId" text NOT NULL,
    "name" text NOT NULL,
    "description" text,
    "steps" jsonb NOT NULL,
    "filters" jsonb,
    "isActive" boolean DEFAULT true NOT NULL,
    "createdBy" text NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "deletedAt" timestamp(3)
);
--> statement-breakpoint
CREATE TABLE "funnel_goals" (
    "id" text PRIMARY KEY NOT NULL,
    "funnelId" text NOT NULL,
    "goalType" "FunnelGoalType" NOT NULL,
    "targetValue" text,
    "description" text,
    "isActive" boolean DEFAULT true NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "goals" (
    "id" text PRIMARY KEY NOT NULL,
    "websiteId" text NOT NULL,
    "type" text NOT NULL,
    "target" text NOT NULL,
    "name" text NOT NULL,
    "description" text,
    "filters" jsonb,
    "isActive" boolean DEFAULT true NOT NULL,
    "createdBy" text NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "deletedAt" timestamp(3)
);
--> statement-breakpoint
CREATE TABLE "invitation" (
    "id" text PRIMARY KEY NOT NULL,
    "organization_id" text NOT NULL,
    "email" text NOT NULL,
    "role" text DEFAULT 'member',
    "team_id" text,
    "status" text DEFAULT 'pending' NOT NULL,
    "expires_at" timestamp NOT NULL,
    "inviter_id" text NOT NULL
);
--> statement-breakpoint
CREATE TABLE "member" (
    "id" text PRIMARY KEY NOT NULL,
    "organization_id" text NOT NULL,
    "user_id" text NOT NULL,
    "role" text DEFAULT 'member' NOT NULL,
    "team_id" text,
    "created_at" timestamp NOT NULL
);
--> statement-breakpoint
CREATE TABLE "organization" (
    "id" text PRIMARY KEY NOT NULL,
    "name" text NOT NULL,
    "slug" text,
    "logo" text,
    "created_at" timestamp NOT NULL,
    "metadata" text,
    CONSTRAINT "organizations_slug_unique" UNIQUE("slug")
);
--> statement-breakpoint
CREATE TABLE "session" (
    "id" text PRIMARY KEY NOT NULL,
    "expiresAt" timestamp(3) NOT NULL,
    "token" text NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) NOT NULL,
    "ipAddress" text,
    "userAgent" text,
    "userId" text,
    "active_organization_id" text
);
--> statement-breakpoint
CREATE TABLE "team" (
    "id" text PRIMARY KEY NOT NULL,
    "name" text NOT NULL,
    "organization_id" text NOT NULL,
    "created_at" timestamp NOT NULL,
    "updated_at" timestamp
);
--> statement-breakpoint
CREATE TABLE "two_factor" (
    "id" text PRIMARY KEY NOT NULL,
    "secret" text NOT NULL,
    "backup_codes" text NOT NULL,
    "user_id" text NOT NULL
);
--> statement-breakpoint
CREATE TABLE "user" (
    "id" text PRIMARY KEY NOT NULL,
    "name" text NOT NULL,
    "email" text NOT NULL,
    "email_verified" boolean NOT NULL,
    "image" text,
    "firstName" text,
    "lastName" text,
    "status" "UserStatus" DEFAULT 'ACTIVE' NOT NULL,
    "created_at" timestamp NOT NULL,
    "updated_at" timestamp NOT NULL,
    "deletedAt" timestamp(3),
    "role" "Role" DEFAULT 'USER' NOT NULL,
    "two_factor_enabled" boolean,
    CONSTRAINT "users_email_unique" UNIQUE("email")
);
--> statement-breakpoint
CREATE TABLE "user_preferences" (
    "id" text PRIMARY KEY NOT NULL,
    "userId" text NOT NULL,
    "timezone" text DEFAULT 'auto' NOT NULL,
    "dateFormat" text DEFAULT 'MMM D, YYYY' NOT NULL,
    "timeFormat" text DEFAULT 'h:mm a' NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) NOT NULL
);
--> statement-breakpoint
CREATE TABLE "user_stripe_config" (
    "id" text PRIMARY KEY NOT NULL,
    "user_id" text NOT NULL,
    "webhook_token" text NOT NULL,
    "stripe_secret_key" text NOT NULL,
    "stripe_publishable_key" text,
    "webhook_secret" text NOT NULL,
    "is_live_mode" boolean DEFAULT false NOT NULL,
    "is_active" boolean DEFAULT true NOT NULL,
    "last_webhook_at" timestamp,
    "webhook_failure_count" integer DEFAULT 0 NOT NULL,
    "created_at" timestamp NOT NULL,
    "updated_at" timestamp NOT NULL
);
--> statement-breakpoint
CREATE TABLE "verification" (
    "id" text PRIMARY KEY NOT NULL,
    "identifier" text NOT NULL,
    "value" text NOT NULL,
    "expires_at" timestamp NOT NULL,
    "created_at" timestamp,
    "updated_at" timestamp
);
--> statement-breakpoint
CREATE TABLE "websites" (
    "id" text PRIMARY KEY NOT NULL,
    "domain" text NOT NULL,
    "name" text,
    "status" "WebsiteStatus" DEFAULT 'ACTIVE' NOT NULL,
    "userId" text,
    "isPublic" boolean DEFAULT false NOT NULL,
    "createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
    "deletedAt" timestamp(3),
    "organization_id" text
);
--> statement-breakpoint
ALTER TABLE "ab_experiments" ADD CONSTRAINT "ab_experiments_websiteId_fkey" FOREIGN KEY ("websiteId") REFERENCES "public"."websites"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "ab_experiments" ADD CONSTRAINT "ab_experiments_createdBy_fkey" FOREIGN KEY ("createdBy") REFERENCES "public"."user"("id") ON DELETE restrict ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "ab_goals" ADD CONSTRAINT "ab_goals_experimentId_fkey" FOREIGN KEY ("experimentId") REFERENCES "public"."ab_experiments"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "ab_variants" ADD CONSTRAINT "ab_variants_experimentId_fkey" FOREIGN KEY ("experimentId") REFERENCES "public"."ab_experiments"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "account" ADD CONSTRAINT "account_user_id_user_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "apikey" ADD CONSTRAINT "apikey_user_id_user_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "apikey" ADD CONSTRAINT "apikey_organization_id_organization_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organization"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "apikey_access" ADD CONSTRAINT "apikey_access_apikey_id_fkey" FOREIGN KEY ("apikey_id") REFERENCES "public"."apikey"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "funnel_definitions" ADD CONSTRAINT "funnel_definitions_websiteId_fkey" FOREIGN KEY ("websiteId") REFERENCES "public"."websites"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "funnel_definitions" ADD CONSTRAINT "funnel_definitions_createdBy_fkey" FOREIGN KEY ("createdBy") REFERENCES "public"."user"("id") ON DELETE restrict ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "funnel_goals" ADD CONSTRAINT "funnel_goals_funnelId_fkey" FOREIGN KEY ("funnelId") REFERENCES "public"."funnel_definitions"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "goals" ADD CONSTRAINT "goals_websiteId_fkey" FOREIGN KEY ("websiteId") REFERENCES "public"."websites"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "goals" ADD CONSTRAINT "goals_createdBy_fkey" FOREIGN KEY ("createdBy") REFERENCES "public"."user"("id") ON DELETE restrict ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "invitation" ADD CONSTRAINT "invitation_organization_id_organization_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organization"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "invitation" ADD CONSTRAINT "invitation_inviter_id_user_id_fk" FOREIGN KEY ("inviter_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "member" ADD CONSTRAINT "member_organization_id_organization_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organization"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "member" ADD CONSTRAINT "member_user_id_user_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "session" ADD CONSTRAINT "session_userId_fkey" FOREIGN KEY ("userId") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "team" ADD CONSTRAINT "team_organization_id_organization_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organization"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "two_factor" ADD CONSTRAINT "two_factor_user_id_user_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_preferences" ADD CONSTRAINT "user_preferences_userId_fkey" FOREIGN KEY ("userId") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "user_stripe_config" ADD CONSTRAINT "user_stripe_config_userId_fkey" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "websites" ADD CONSTRAINT "websites_userId_fkey" FOREIGN KEY ("userId") REFERENCES "public"."user"("id") ON DELETE set null ON UPDATE cascade;--> statement-breakpoint
ALTER TABLE "websites" ADD CONSTRAINT "websites_organization_id_organization_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organization"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "ab_experiments_websiteId_idx" ON "ab_experiments" USING btree ("websiteId");--> statement-breakpoint
CREATE INDEX "ab_experiments_createdBy_idx" ON "ab_experiments" USING btree ("createdBy");--> statement-breakpoint
CREATE INDEX "ab_experiments_status_idx" ON "ab_experiments" USING btree ("status");--> statement-breakpoint
CREATE INDEX "ab_goals_experimentId_idx" ON "ab_goals" USING btree ("experimentId");--> statement-breakpoint
CREATE INDEX "ab_variants_experimentId_idx" ON "ab_variants" USING btree ("experimentId");--> statement-breakpoint
CREATE INDEX "accounts_userId_idx" ON "account" USING btree ("user_id" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "accounts_provider_account_unique" ON "account" USING btree ("provider_id" text_ops,"account_id" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "apikey_key_unique" ON "apikey" USING btree ("key" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_user_id_idx" ON "apikey" USING btree ("user_id" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_organization_id_idx" ON "apikey" USING btree ("organization_id" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_prefix_idx" ON "apikey" USING btree ("prefix" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_key_hash_idx" ON "apikey" USING btree ("key_hash" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_enabled_idx" ON "apikey" USING btree ("enabled");--> statement-breakpoint
CREATE INDEX "apikey_access_apikey_id_idx" ON "apikey_access" USING btree ("apikey_id" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_access_resource_idx" ON "apikey_access" USING btree ("resource_type","resource_id" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "apikey_access_unique" ON "apikey_access" USING btree ("apikey_id" text_ops,"resource_type","resource_id" text_ops);--> statement-breakpoint
CREATE INDEX "funnel_definitions_createdBy_idx" ON "funnel_definitions" USING btree ("createdBy" text_ops);--> statement-breakpoint
CREATE INDEX "funnel_definitions_websiteId_idx" ON "funnel_definitions" USING btree ("websiteId" text_ops);--> statement-breakpoint
CREATE INDEX "funnel_goals_funnelId_idx" ON "funnel_goals" USING btree ("funnelId" text_ops);--> statement-breakpoint
CREATE INDEX "goals_websiteId_idx" ON "goals" USING btree ("websiteId" text_ops);--> statement-breakpoint
CREATE INDEX "goals_createdBy_idx" ON "goals" USING btree ("createdBy" text_ops);--> statement-breakpoint
CREATE INDEX "goals_websiteId_deletedAt_createdAt_idx" ON "goals" USING btree ("websiteId" text_ops,"deletedAt","createdAt" DESC NULLS LAST);--> statement-breakpoint
CREATE INDEX "goals_deletedAt_idx" ON "goals" USING btree ("deletedAt");--> statement-breakpoint
CREATE INDEX "invitations_email_idx" ON "invitation" USING btree ("email" text_ops);--> statement-breakpoint
CREATE INDEX "invitations_organizationId_idx" ON "invitation" USING btree ("organization_id" text_ops);--> statement-breakpoint
CREATE INDEX "members_userId_idx" ON "member" USING btree ("user_id" text_ops);--> statement-breakpoint
CREATE INDEX "members_organizationId_idx" ON "member" USING btree ("organization_id" text_ops);--> statement-breakpoint
CREATE INDEX "organizations_slug_idx" ON "organization" USING btree ("slug" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "sessions_token_key" ON "session" USING btree ("token" text_ops);--> statement-breakpoint
CREATE INDEX "sessions_userId_idx" ON "session" USING btree ("userId" text_ops);--> statement-breakpoint
CREATE INDEX "sessions_expiresAt_idx" ON "session" USING btree ("expiresAt");--> statement-breakpoint
CREATE INDEX "twoFactor_secret_idx" ON "two_factor" USING btree ("secret" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "user_preferences_userId_key" ON "user_preferences" USING btree ("userId" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "user_stripe_config_userId_key" ON "user_stripe_config" USING btree ("user_id" text_ops);--> statement-breakpoint
CREATE UNIQUE INDEX "user_stripe_config_webhookToken_key" ON "user_stripe_config" USING btree ("webhook_token" text_ops);--> statement-breakpoint
CREATE INDEX "verifications_identifier_idx" ON "verification" USING btree ("identifier" text_ops);--> statement-breakpoint
CREATE INDEX "verifications_expiresAt_idx" ON "verification" USING btree ("expires_at");--> statement-breakpoint
CREATE UNIQUE INDEX "websites_user_domain_unique" ON "websites" USING btree ("userId","domain") WHERE "websites"."organization_id" is null;--> statement-breakpoint
CREATE UNIQUE INDEX "websites_org_domain_unique" ON "websites" USING btree ("organization_id","domain") WHERE "websites"."organization_id" is not null;--> statement-breakpoint
CREATE INDEX "websites_userId_idx" ON "websites" USING btree ("userId" text_ops);
🧹 Nitpick (assertive)
🛠️ Refactor suggestion
Enable RLS for multi-tenant tables.
RLS is disabled everywhere. For a multi-tenant app, enable RLS and add policies on sensitive tables (websites, ab_experiments, goals, session, apikey, etc.) to prevent cross-tenant data leakage at the DB layer.
Example (outside this file’s current ranges; apply after tables are created):
-- Example for websites
ALTER TABLE "websites" ENABLE ROW LEVEL SECURITY;
CREATE POLICY websites_tenant_isolation
ON "websites"
USING (
-- allow select if userId matches or organization_id is in actor's org list
-- adapt this predicate to your authn/authz approach
current_setting('app.current_user_id', true) IS NOT NULL
AND (
"userId" = current_setting('app.current_user_id')
OR "organization_id" = current_setting('app.current_org_id')
)
);
-- Repeat with appropriate predicates for other tables.🤖 Prompt for AI Agents
In packages/db/src/drizzle/0000_aberrant_felicia_hardy.sql around lines 1 to 331, RLS is not enabled and no row-level policies exist for multi-tenant sensitive tables; enable RLS for each multi-tenant table (websites, ab_experiments, goals, session, apikey, etc.) immediately after their CREATE TABLE statements by issuing ALTER TABLE ... ENABLE ROW LEVEL SECURITY and add CREATE POLICY statements that use secure predicates (e.g. current_setting('app.current_user_id', true) and current_setting('app.current_org_id', true)) to restrict access to rows belonging to the current user or organization, and ensure SELECT/INSERT/UPDATE/DELETE policies cover required operations and consider adding a restrictive fallback USING/WITH CHECK where appropriate; place these ALTER TABLE and CREATE POLICY statements after table creation in this migration or a follow-up migration so the DB enforces tenant isolation at the row level.
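The policies sketched above read tenant identity from session settings, so the application has to supply those settings on each connection. A minimal sketch of the request-time side, assuming the `app.current_user_id` / `app.current_org_id` convention from the example (the identifiers are illustrative):

```sql
-- Illustrative: scope identity to one transaction so pooled connections
-- do not leak it between requests. SET LOCAL resets on COMMIT/ROLLBACK.
BEGIN;
SET LOCAL app.current_user_id = 'user_123';
SET LOCAL app.current_org_id = 'org_456';
-- With RLS enabled on "websites", this now returns only the caller's rows.
SELECT "id", "domain" FROM "websites";
COMMIT;
```

Note that table owners and superusers bypass RLS unless `FORCE ROW LEVEL SECURITY` is set, so the application role should not own the tables.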
🧹 Nitpick (assertive)
Column naming style is mixed (camelCase and snake_case).
Consider adopting a single convention (prefer snake_case in Postgres) for consistency and easier tooling.
🤖 Prompt for AI Agents
In packages/db/src/drizzle/0000_aberrant_felicia_hardy.sql around lines 14 to 266, many column names use mixed casing (camelCase) instead of snake_case; normalize all column names to snake_case (e.g., websiteId -> website_id, createdAt -> created_at, updatedAt -> updated_at, deletedAt -> deleted_at, isActive -> is_active, etc.) across every CREATE TABLE statement, update any column-level defaults or types that reference those names, ensure UNIQUE/CONSTRAINT names, foreign keys and indexes (if present elsewhere) are updated to use the new snake_case names, and add a concise migration note/update for consumers of this schema so downstream code and queries are updated accordingly.
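If the project does standardize on snake_case, the rename itself is mechanical in a follow-up migration; a sketch for one table (column and index names taken from this file, the new names are the proposed ones):

```sql
-- Illustrative follow-up migration: normalize camelCase columns on "websites".
ALTER TABLE "websites" RENAME COLUMN "userId" TO "user_id";--> statement-breakpoint
ALTER TABLE "websites" RENAME COLUMN "isPublic" TO "is_public";--> statement-breakpoint
ALTER TABLE "websites" RENAME COLUMN "createdAt" TO "created_at";--> statement-breakpoint
ALTER TABLE "websites" RENAME COLUMN "updatedAt" TO "updated_at";--> statement-breakpoint
ALTER TABLE "websites" RENAME COLUMN "deletedAt" TO "deleted_at";--> statement-breakpoint
-- Indexes track the renamed column automatically, but their *names* do not:
ALTER INDEX "websites_userId_idx" RENAME TO "websites_user_id_idx";
```

The Drizzle schema definitions would need to change in the same commit, since the generated SQL must match the TypeScript column mappings.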
CREATE TABLE "ab_experiments" (
	"id" text PRIMARY KEY NOT NULL,
	"websiteId" text NOT NULL,
	"name" text NOT NULL,
	"description" text,
	"status" "ab_test_status" DEFAULT 'draft' NOT NULL,
	"trafficAllocation" integer DEFAULT 100 NOT NULL,
	"startDate" timestamp(3),
	"endDate" timestamp(3),
	"primaryGoal" text,
	"createdBy" text NOT NULL,
	"createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
	"updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
	"deletedAt" timestamp(3)
);
--> statement-breakpoint
CREATE TABLE "ab_goals" (
	"id" text PRIMARY KEY NOT NULL,
	"experimentId" text NOT NULL,
	"name" text NOT NULL,
	"type" text NOT NULL,
	"target" text NOT NULL,
	"description" text,
	"createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
	"updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "ab_variants" (
	"id" text PRIMARY KEY NOT NULL,
	"experimentId" text NOT NULL,
	"name" text NOT NULL,
	"type" "ab_variant_type" DEFAULT 'visual' NOT NULL,
	"content" jsonb NOT NULL,
	"trafficWeight" integer DEFAULT 50 NOT NULL,
	"isControl" boolean DEFAULT false NOT NULL,
	"createdAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL,
	"updatedAt" timestamp(3) DEFAULT CURRENT_TIMESTAMP NOT NULL
);
--> statement-breakpoint
CREATE TABLE "account" (
	"id" text PRIMARY KEY NOT NULL,
	"account_id" text NOT NULL,
	"provider_id" text NOT NULL,
	"user_id" text NOT NULL,
	"access_token" text,
	"refresh_token" text,
	"id_token" text,
	"access_token_expires_at" timestamp,
	"refresh_token_expires_at" timestamp,
	"scope" text,
	"password" text,
	"created_at" timestamp NOT NULL,
	"updated_at" timestamp NOT NULL
);
🧹 Nitpick (assertive)
Timestamp types and updatedAt semantics are inconsistent.
Mixing timestamp(3) with plain timestamp complicates type handling; prefer timestamptz (stored as UTC) across the board. Many updatedAt columns also lack automatic update behavior: either add triggers or ensure the application layer sets them on every write.
Also applies to: 107-145, 167-187, 197-219, 221-229, 231-244, 255-266
CREATE TABLE "account" (
	"id" text PRIMARY KEY NOT NULL,
	"account_id" text NOT NULL,
	"provider_id" text NOT NULL,
	"user_id" text NOT NULL,
	"access_token" text,
	"refresh_token" text,
	"id_token" text,
	"access_token_expires_at" timestamp,
	"refresh_token_expires_at" timestamp,
	"scope" text,
	"password" text,
	"created_at" timestamp NOT NULL,
	"updated_at" timestamp NOT NULL
);
🧹 Nitpick (assertive)
Sensitive OAuth tokens and user secrets stored in plaintext.
Fields like access_token, refresh_token, id_token, two_factor.secret should be encrypted or hashed. At minimum, encrypt at rest using app-level encryption with KMS-managed keys.
Also applies to: 177-195, 205-219, 221-229, 246-253
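A minimal sketch of the suggested application-level encryption, assuming Node's built-in crypto and a 32-byte data key already available to the process (in practice the key would be unwrapped from a KMS; the helper names are illustrative, not the project's API):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Hypothetical 32-byte data key; real deployments would unwrap this from a KMS.
const KEY = randomBytes(32);

export function encryptToken(plaintext: string): string {
  const iv = randomBytes(12); // 96-bit nonce, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", KEY, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  // nonce + auth tag + ciphertext are all safe to persist together
  return Buffer.concat([iv, tag, ciphertext]).toString("base64");
}

export function decryptToken(stored: string): string {
  const buf = Buffer.from(stored, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ciphertext = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", KEY, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```

Only the base64 blob (nonce + auth tag + ciphertext) would then be persisted in columns like access_token, keeping the schema unchanged.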
CREATE TABLE "user_stripe_config" (
	"id" text PRIMARY KEY NOT NULL,
	"user_id" text NOT NULL,
	"webhook_token" text NOT NULL,
	"stripe_secret_key" text NOT NULL,
	"stripe_publishable_key" text,
	"webhook_secret" text NOT NULL,
	"is_live_mode" boolean DEFAULT false NOT NULL,
	"is_active" boolean DEFAULT true NOT NULL,
	"last_webhook_at" timestamp,
	"webhook_failure_count" integer DEFAULT 0 NOT NULL,
	"created_at" timestamp NOT NULL,
	"updated_at" timestamp NOT NULL
);
🧹 Nitpick (assertive)
🛠️ Refactor suggestion
Stripe secrets and webhook secrets should be encrypted or kept out of the DB.
Storing "stripe_secret_key" and "webhook_secret" unencrypted is high risk. Consider:
- Application-level encryption (libsodium) with KMS-managed keys, store only ciphertext + key id.
- Or move these to a secret manager and store references.
Would you like a follow-up migration patch that adds encrypted_* columns and a helper to read/write via KMS?
CREATE TABLE "verification" (
	"id" text PRIMARY KEY NOT NULL,
	"identifier" text NOT NULL,
	"value" text NOT NULL,
	"expires_at" timestamp NOT NULL,
	"created_at" timestamp,
	"updated_at" timestamp
);
Verification tokens should be hashed at rest.
Do not store verification values (e.g., magic links, OTPs) in plaintext.
CREATE TABLE "verification" (
"id" text PRIMARY KEY NOT NULL,
"identifier" text NOT NULL,
- "value" text NOT NULL,
+ "token_hash" text NOT NULL,
"expires_at" timestamp NOT NULL,
"created_at" timestamp,
"updated_at" timestamp
);
@@
-CREATE INDEX "verifications_identifier_idx" ON "verification" USING btree ("identifier" text_ops);--> statement-breakpoint
+CREATE INDEX "verifications_identifier_idx" ON "verification" USING btree ("identifier" text_ops);--> statement-breakpoint
-- no index needed on the original value since we store only token_hash

Also applies to: 2317-2348
🤖 Prompt for AI Agents
In packages/db/src/drizzle/0000_aberrant_felicia_hardy.sql around lines 246 to
253, the verification table stores token "value" in plaintext; change the schema
and app flow to store only a hashed token instead: rename or replace the "value"
column with something like "hashed_value" (sufficient length/text), add an
optional "salt" or "algorithm" column if using per-token salts or to record the
hash type, and remove/stop using plaintext storage. Update all code paths that
create verification tokens to hash the token server-side using a secure
primitive (e.g., HMAC-SHA256 with an app secret, or bcrypt/Argon2 with a
per-token salt) and persist the hash and salt/algorithm; update lookup logic to
hash incoming tokens the same way and compare hashes. For migration, do not
attempt to reverse hashes—either invalidate existing plaintext tokens (delete
them) or force re-issuance; update any tests and docs accordingly.
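The hash-on-issue / hash-on-redeem flow the prompt describes can be sketched as follows; an unsalted SHA-256 is assumed to be acceptable because the tokens are high-entropy random values (low-entropy OTPs would instead need HMAC with a server secret or a slow hash), and the function names are hypothetical:

```typescript
import { createHash, randomBytes, timingSafeEqual } from "node:crypto";

// Issue a verification token: send `token` to the user, persist only `tokenHash`.
export function issueToken(): { token: string; tokenHash: string } {
  const token = randomBytes(32).toString("base64url");
  const tokenHash = createHash("sha256").update(token).digest("hex");
  return { token, tokenHash };
}

// On redemption, hash the presented token and compare hashes in constant time.
export function verifyToken(presented: string, storedHash: string): boolean {
  const presentedHash = createHash("sha256").update(presented).digest("hex");
  const a = Buffer.from(presentedHash, "hex");
  const b = Buffer.from(storedHash, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Because only the hash is persisted, a leaked verification table cannot be replayed, and existing plaintext rows are simply invalidated at migration time.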
CREATE UNIQUE INDEX "apikey_key_unique" ON "apikey" USING btree ("key" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_user_id_idx" ON "apikey" USING btree ("user_id" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_organization_id_idx" ON "apikey" USING btree ("organization_id" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_prefix_idx" ON "apikey" USING btree ("prefix" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_key_hash_idx" ON "apikey" USING btree ("key_hash" text_ops);--> statement-breakpoint
CREATE INDEX "apikey_enabled_idx" ON "apikey" USING btree ("enabled");--> statement-breakpoint
Make key_hash unique; remove uniqueness on plaintext key.
Move uniqueness to key_hash, not key. Drop the unique index on key and add a unique index on key_hash.
-CREATE UNIQUE INDEX "apikey_key_unique" ON "apikey" USING btree ("key" text_ops);--> statement-breakpoint
+CREATE UNIQUE INDEX "apikey_key_hash_unique" ON "apikey" USING btree ("key_hash" text_ops);--> statement-breakpoint
@@
-CREATE INDEX "apikey_key_hash_idx" ON "apikey" USING btree ("key_hash" text_ops);--> statement-breakpoint
+-- replaced by unique index above
🤖 Prompt for AI Agents
In packages/db/src/drizzle/0000_aberrant_felicia_hardy.sql around lines 299-304,
the migration currently creates a UNIQUE index on the plaintext "key" but the
review requires uniqueness enforced on "key_hash" instead; remove or change the
CREATE UNIQUE INDEX "apikey_key_unique" on "key" so it is non-unique (or drop
it), and add a CREATE UNIQUE INDEX "apikey_key_hash_unique" ON "apikey" USING
btree ("key_hash") (while keeping the other non-unique indexes intact); ensure
index names are updated and duplicate/conflicting index statements are removed
so the schema enforces uniqueness on key_hash only.
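Once uniqueness lives on key_hash, lookups hash the presented key and query by the hash rather than by the plaintext column. A sketch with an in-memory stand-in for the apikey table (the row shape, key format, and function names are illustrative):

```typescript
import { createHash } from "node:crypto";

// Hypothetical shape of rows in the "apikey" table.
type ApiKeyRow = { keyHash: string; enabled: boolean };

export function hashApiKey(plaintextKey: string): string {
  // API keys are high-entropy, so an unsalted SHA-256 is a common choice;
  // the resulting hash is what the unique index covers.
  return createHash("sha256").update(plaintextKey).digest("hex");
}

export function findApiKey(
  rows: ApiKeyRow[],
  presentedKey: string
): ApiKeyRow | undefined {
  const keyHash = hashApiKey(presentedKey);
  // Stands in for: SELECT * FROM apikey WHERE key_hash = $1 AND enabled
  return rows.find((r) => r.keyHash === keyHash && r.enabled);
}
```

The unique index on key_hash then does double duty: it enforces uniqueness and serves this exact-match lookup.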
CREATE UNIQUE INDEX "websites_user_domain_unique" ON "websites" USING btree ("userId","domain") WHERE "websites"."organization_id" is null;--> statement-breakpoint
CREATE UNIQUE INDEX "websites_org_domain_unique" ON "websites" USING btree ("organization_id","domain") WHERE "websites"."organization_id" is not null;--> statement-breakpoint
CREATE INDEX "websites_userId_idx" ON "websites" USING btree ("userId" text_ops);
🛠️ Refactor suggestion
Make domain uniqueness case-insensitive.
Domains are case-insensitive. Enforce uniqueness on lower(domain) to prevent duplicates with varying case.
-CREATE UNIQUE INDEX "websites_user_domain_unique" ON "websites" USING btree ("userId","domain") WHERE "websites"."organization_id" is null;--> statement-breakpoint
-CREATE UNIQUE INDEX "websites_org_domain_unique" ON "websites" USING btree ("organization_id","domain") WHERE "websites"."organization_id" is not null;--> statement-breakpoint
+CREATE UNIQUE INDEX "websites_user_domain_unique" ON "websites" USING btree ("userId", lower("domain")) WHERE "websites"."organization_id" is null;--> statement-breakpoint
+CREATE UNIQUE INDEX "websites_org_domain_unique" ON "websites" USING btree ("organization_id", lower("domain")) WHERE "websites"."organization_id" is not null;--> statement-breakpoint
🤖 Prompt for AI Agents
packages/db/src/drizzle/0000_aberrant_felicia_hardy.sql lines 329-331: the
unique indexes on domain are case-sensitive; update both unique indexes to use a
lower(domain) expression so uniqueness is enforced case-insensitively. Replace
the two CREATE UNIQUE INDEX lines with expression indexes like CREATE UNIQUE
INDEX "websites_user_domain_unique" ON "websites" USING btree ("userId",
lower("domain")) WHERE "websites"."organization_id" is null; and CREATE UNIQUE
INDEX "websites_org_domain_unique" ON "websites" USING btree ("organization_id",
lower("domain")) WHERE "websites"."organization_id" is not null; leave the
userId index as-is or add a matching index on lower(domain) if needed for query
performance.
 "exports": {
-   ".": "./src/index.ts"
+   ".": "./src/index.ts",
+   "./lists": "./src/lists/index.ts",
+   "./lists/*": "./src/lists/*",
+   "./utils": "./src/utils/index.ts",
+   "./utils/*": "./src/utils/*",
+   "./types": "./src/types/index.ts",
+   "./types/*": "./src/types/*",
+   "./country-codes": "./src/country-codes.ts"
🧹 Nitpick (assertive)
Optional: add typesVersions for TS subpath pattern ergonomics
If editors/tooling need improved resolution for subpath patterns, consider typesVersions. Not required if you rely on Bun-only workflows, but helpful for TS in some IDEs.
Example:
"exports": {
".": "./src/index.ts",
"./lists": "./src/lists/index.ts",
"./lists/*": "./src/lists/*",
"./utils": "./src/utils/index.ts",
"./utils/*": "./src/utils/*",
"./types": "./src/types/index.ts",
"./types/*": "./src/types/*",
"./country-codes": "./src/country-codes.ts"
},
+ "typesVersions": {
+ "*": {
+ "lists": ["src/lists/index.ts"],
+ "lists/*": ["src/lists/*"],
+ "utils": ["src/utils/index.ts"],
+ "utils/*": ["src/utils/*"],
+ "types": ["src/types/index.ts"],
+ "types/*": ["src/types/*"],
+ "country-codes": ["src/country-codes.ts"]
+ }
+  },
🤖 Prompt for AI Agents
In packages/shared/package.json around lines 7 to 15, the package exports use
subpath patterns that some TypeScript IDEs don't resolve well; add a
"typesVersions" section mapping the same subpath patterns to the corresponding
source .ts (or .d.ts) files so TS tooling can resolve imports like
"your-pkg/lists" and "your-pkg/lists/*". Add a "typesVersions": {">=4.1": { "":
["src/index.ts"], "lists": ["src/lists/index.ts"], "lists/*": ["src/lists/*"],
"utils": ["src/utils/index.ts"], "utils/*": ["src/utils/*"], "types":
["src/types/index.ts"], "types/*": ["src/types/*"], "country-codes":
["src/country-codes.ts"] }} entry in package.json (adjust paths/extensions if
you emit .d.ts) so editors can pick up the correct declarations.
Pull Request
Description
This PR addresses performance issues by optimizing module bundling and server-side processing. It refactors the @databuddy/shared package to use granular subpath exports, preventing large data lists and heavy utilities from being bundled into client-side applications unnecessarily. Additionally, the regular expressions used for bot detection are now precompiled to reduce per-request CPU overhead.

Checklist
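The regex precompilation described above amounts to moving regex construction from per-request code to module load. A sketch with an illustrative pattern list (the basket service's actual patterns may differ):

```typescript
// Illustrative bot user-agent patterns; the real list lives in the basket service.
const BOT_PATTERNS = ["googlebot", "bingbot", "headlesschrome", "crawler|spider"];

// Compile each pattern once at module load, not on every request.
const COMPILED_BOT_REGEXES: RegExp[] = BOT_PATTERNS.map((p) => new RegExp(p, "i"));

export function isBotUserAgent(userAgent: string): boolean {
  // Each request now only pays for `test`, not for RegExp construction.
  return COMPILED_BOT_REGEXES.some((re) => re.test(userAgent));
}
```

Constructing a RegExp per request is cheap individually but adds up at high event volume, which is why hoisting the compilation is a net win here.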
Summary by CodeRabbit