Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings. Use the following commands to manage reviews:
Use the checkboxes below for quick actions:
Walkthrough

Centralized bulk processing in …

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Orchestrator
    participant DB as Supabase DB
    participant State as gradebook_row_recalc_state
    participant RPC as update_gradebook_rows_batch
    participant Sentry
    Orchestrator->>DB: query gradebook_column_students (paginated)
    DB-->>Orchestrator: rows
    Orchestrator->>Orchestrator: group rows by student_id, build rowsInputScoped & updatesByStudentScoped
    Orchestrator->>State: batch upsert sorted batchUpsertDataScoped
    State-->>Orchestrator: upsert results/errors
    Orchestrator->>Orchestrator: chunk students into batches (MAX_BATCH_UPDATE_SIZE)
    loop per chunk
        Orchestrator->>RPC: call update_gradebook_rows_batch(chunk)
        RPC->>DB: perform updates/archives/clears
        DB-->>RPC: chunk result summary
        RPC-->>Orchestrator: result summary
        Orchestrator->>Sentry: capture errors (if any)
    end
    Orchestrator-->>Orchestrator: aggregate results, log metrics
```
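The "chunk students into batches" step in the diagram can be sketched as below. This is an illustration only: the value of `MAX_BATCH_UPDATE_SIZE` and the generic `chunk` helper are assumptions, not the project's actual constant or code.

```typescript
// Hypothetical sketch of the per-chunk fan-out shown in the diagram.
// MAX_BATCH_UPDATE_SIZE is an assumed value, not the project's constant.
const MAX_BATCH_UPDATE_SIZE = 500;

// Split an array of per-student updates into fixed-size chunks for the RPC loop.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```

Each chunk would then be passed to one `update_gradebook_rows_batch` call inside the loop shown above.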
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
Poem
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts`:
- Around line 994-996: There's a Prettier formatting failure due to a missing
space after the `if` keyword in the conditional that checks variable `v` in
DependencySource.ts (the block that returns { score: undefined, max_score:
undefined }); update the conditional to conform to Prettier style by inserting
the required space after `if`, then run the formatter/lint step to confirm the
pipeline passes.
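As a hedged illustration of the fix (the function wrapper and value shape here are hypothetical; only the spacing after `if` is the point):

```typescript
// Illustrative only: Prettier requires a space between `if` and `(`.
function scoreFor(v: { score: number; max_score: number } | undefined) {
  // was: if(v === undefined) { ... }  (fails `prettier --check`)
  if (v === undefined) {
    return { score: undefined, max_score: undefined };
  }
  return v;
}
```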
supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts
Actionable comments posted: 1
🧹 Nitpick comments (3)
supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts (2)
1105-1107: Logic looks correct, but minor comment clarity suggestion.

The filtering at line 1107 and return condition at line 1134 are consistent: entries with `max_score <= 0` or `score === null` are excluded from both the drop consideration and the final output. This aligns with the learning that entries with max_score = 0 cannot contribute meaningfully to weighted averages.

However, the comment "These entries should be preserved" on line 1106 could be misread as "preserved in the output." Consider clarifying to "These entries should not count toward drop_lowest selection" or similar.
Also applies to: 1129-1134
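A hedged sketch of the behavior described above (the names and the ratio-based ordering are illustrative, not the actual implementation): entries with `max_score <= 0` or `score === null` neither count toward drop_lowest selection nor appear in the result.

```typescript
type Entry = { score: number | null; max_score: number | null };

// Illustrative drop_lowest: invalid entries (max_score <= 0 or null score)
// are filtered out before selecting which scores to drop, and never
// reappear in the output.
function applyDropLowest(entries: Entry[], dropLowest: number): Entry[] {
  const valid = entries.filter((v) => (v.max_score ?? 0) > 0 && v.score !== null);
  // sort ascending by fractional score so the lowest come first
  const sorted = [...valid].sort(
    (a, b) => a.score! / a.max_score! - b.score! / b.max_score!
  );
  return sorted.slice(Math.min(dropLowest, sorted.length));
}
```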
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts` around lines 1105 - 1107, Clarify the comment wording to avoid implying these entries are removed from output: update the comment above the validEntries filter (around the declaration of validEntries) and the similar comment near the return/selection logic (around the drop_lowest handling) to state that entries with max_score <= 0 or score === null are excluded from drop_lowest consideration (i.e., "do not count toward drop_lowest selection") rather than "preserved" which can be misread as preserved in the final output; keep the code logic unchanged and only adjust the comment text referencing validEntries and the drop_lowest selection logic.
1151-1162: Dead code: warning can never trigger.

The warning at lines 1151-1162 checks for items in `ret` with `!v.max_score || v.max_score <= 0`. However, the filter condition at line 1134 already ensures all items in `ret` have `(v.max_score ?? 0) > 0`, meaning this warning will never fire.

You can safely remove this block, or update it to validate against different criteria if needed.
🧹 Proposed removal of dead code
```diff
  console.log(
    `Drop_lowest returning ${ret.length} values after dropping ${numDropped}:`,
    JSON.stringify(outputSummary)
  );
- // Warn if any returned values have invalid max_score
- const invalidMaxScore = ret.filter((v) => !v.max_score || v.max_score <= 0);
- if (invalidMaxScore.length > 0) {
-   console.warn(
-     `Drop_lowest: ${invalidMaxScore.length} returned values have invalid max_score: ${JSON.stringify(
-       invalidMaxScore.map((v) => ({
-         score: v.score,
-         max_score: v.max_score,
-         column_slug: (v as unknown as { column_slug?: string })?.column_slug
-       }))
-     )}`
-   );
- }
-
  return ret;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts` around lines 1151 - 1162, The console.warn block that checks invalidMaxScore is dead code because the earlier filter that builds ret already enforces (v.max_score ?? 0) > 0; remove the entire invalidMaxScore/console.warn block from DependencySource.ts (the code referencing invalidMaxScore, ret, and v.max_score) to avoid unreachable checks, or if you intended a different validation, replace the condition to check the intended property (e.g., validate column_slug or negative score) rather than duplicating the existing max_score check.

supabase/functions/gradebook-column-recalculate/index.ts (1)
399-624: Scoped path now mirrors bulk path structure - significant code duplication.

The non-bulk (scoped) path now replicates the bulk path's logic with consistent data fetching, grouping, batch upserts, chunked RPC updates, and logging. While this ensures behavioral consistency, there's substantial code duplication between the two branches (~225 lines).
Consider extracting shared logic into helper functions in a future refactor to reduce maintenance burden. For example:
- Upsert preparation and duplicate detection logic
- Batch update chunking and RPC invocation
- Result summarization logging
This is optional given the current PR scope focuses on resilience improvements.
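One possible shape for the duplicate-detection piece of the first helper (the row-key fields are assumed from the names used in this review, not taken from the actual schema):

```typescript
// Hypothetical duplicate-detection helper for upsert rows. The key fields
// (class_id, student_id, is_private) are assumptions based on this review.
type UpsertRow = { class_id: number; student_id: string; is_private: boolean };

function countRowKeys(rows: UpsertRow[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of rows) {
    const key = `${r.class_id}:${r.student_id}:${r.is_private}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}
```

Both the scoped and bulk branches could call a helper like this before the upsert, logging any key whose count exceeds 1.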
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@supabase/functions/gradebook-column-recalculate/index.ts` around lines 399 - 624, The scoped branch duplicates the bulk branch logic (fetch/grouping, upsert prep, duplicate detection, batchUpsertDataScoped creation, chunking of batchUpdatesScoped, calling adminSupabase.rpc("update_gradebook_rows_batch"), and result summarization), so extract shared steps into helpers to remove duplication: create functions like prepareUpsertRows(class_id, gradebook_id, is_private, rowEntries) that returns batchUpsertData and upsert keys/counts (use rowKey and upsertRowKeyCountsScoped logic), chunkBatchUpdates(batchUpdates, MAX_BATCH_UPDATE_SIZE) to produce chunks, callUpdateGradebookRowsChunk(adminSupabase, gbScope, chunk, workerId) to invoke the RPC and return parsed results, and summarizeBatchResults(results, workerId, chunkIndex) for logging; then replace the scoped and bulk inline blocks with calls to these helpers and keep unique per-branch data fetching (scopedGcs grouping and versionsByStudentScoped) intact.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@supabase/functions/gradebook-column-recalculate/index.ts`:
- Around line 399-400: Prettier is flagging formatting issues in the file (an
unmatched or misformatted "} else {" block); fix by running the formatter (npx
prettier --write) on the file or manually correct spacing/indentation around the
else block and any nearby statements (ensure consistent semicolons, trailing
commas, and indentation) then re-run prettier --check and commit the formatted
file.
---
Nitpick comments:
In
`@supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts`:
- Around line 1105-1107: Clarify the comment wording to avoid implying these
entries are removed from output: update the comment above the validEntries
filter (around the declaration of validEntries) and the similar comment near the
return/selection logic (around the drop_lowest handling) to state that entries
with max_score <= 0 or score === null are excluded from drop_lowest
consideration (i.e., "do not count toward drop_lowest selection") rather than
"preserved" which can be misread as preserved in the final output; keep the code
logic unchanged and only adjust the comment text referencing validEntries and
the drop_lowest selection logic.
- Around line 1151-1162: The console.warn block that checks invalidMaxScore is
dead code because the earlier filter that builds ret already enforces
(v.max_score ?? 0) > 0; remove the entire invalidMaxScore/console.warn block
from DependencySource.ts (the code referencing invalidMaxScore, ret, and
v.max_score) to avoid unreachable checks, or if you intended a different
validation, replace the condition to check the intended property (e.g., validate
column_slug or negative score) rather than duplicating the existing max_score
check.
In `@supabase/functions/gradebook-column-recalculate/index.ts`:
- Around line 399-624: The scoped branch duplicates the bulk branch logic
(fetch/grouping, upsert prep, duplicate detection, batchUpsertDataScoped
creation, chunking of batchUpdatesScoped, calling
adminSupabase.rpc("update_gradebook_rows_batch"), and result summarization), so
extract shared steps into helpers to remove duplication: create functions like
prepareUpsertRows(class_id, gradebook_id, is_private, rowEntries) that returns
batchUpsertData and upsert keys/counts (use rowKey and upsertRowKeyCountsScoped
logic), chunkBatchUpdates(batchUpdates, MAX_BATCH_UPDATE_SIZE) to produce
chunks, callUpdateGradebookRowsChunk(adminSupabase, gbScope, chunk, workerId) to
invoke the RPC and return parsed results, and summarizeBatchResults(results,
workerId, chunkIndex) for logging; then replace the scoped and bulk inline
blocks with calls to these helpers and keep unique per-branch data fetching
(scopedGcs grouping and versionsByStudentScoped) intact.
ℹ️ Review info
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts
supabase/functions/gradebook-column-recalculate/index.ts
The latest updates on your projects. Learn more about Argos notifications ↗︎

Awaiting the start of a new Argos build…
Actionable comments posted: 1
🧹 Nitpick comments (1)
supabase/functions/gradebook-column-recalculate/index.ts (1)
450-617: Extract duplicated scoped/bulk batch-update pipeline into one helper

Lines 450-617 duplicate the same upsert/chunked-RPC/result-parsing flow already present in the bulk branch. This is likely to drift and cause branch-specific bugs over time.
♻️ Refactor direction
```diff
+async function applyBatchUpdates(
+  adminSupabase: ReturnType<typeof createClient<Database>>,
+  gbScope: Sentry.Scope,
+  workerId: string,
+  gradebook_id: number,
+  rowEntries: RowEntry[],
+  updatesByStudent: Map<string, unknown[]>,
+  versionsByStudent: Map<string, number>,
+  classId: number,
+  is_private: boolean,
+  mode: "bulk" | "scoped"
+) {
+  // shared: upsert recalc state, chunk updates, call RPC, log summaries
+}
 ...
-  // duplicated scoped upsert + chunk RPC flow
+  await applyBatchUpdates(
+    adminSupabase,
+    gbScope,
+    workerId,
+    gradebook_id,
+    rowEntries,
+    updatesByStudentScoped,
+    versionsByStudentScoped,
+    classId,
+    is_private,
+    "scoped"
+  );
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@supabase/functions/gradebook-column-recalculate/index.ts` around lines 450 - 617, This code duplicates the upsert + chunked-RPC batch-update flow (upsertRowKeysScoped, batchUpsertDataScoped, entriesByStudentScoped, batchUpdatesScoped, chunksScoped and the loop calling update_gradebook_rows_batch) and should be extracted into a single reusable helper; create a shared function (e.g., processGradebookBatch or processRowRecalcBatch) that accepts classId, gradebook_id, is_private, rowEntries, workerId, adminSupabase and gbScope, move the logic that computes upsert keys/counts, performs the upsert (batchUpsertDataScoped), groups entries by student, chunks batchUpdatesScoped, calls the RPC update_gradebook_rows_batch, and parses/logs results, then replace this duplicated block with a single call to that helper from both scoped and bulk branches so the flow is maintained in one place and won’t drift.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@supabase/functions/gradebook-column-recalculate/index.ts`:
- Around line 496-506: If upsertErrorScoped is truthy, stop further processing
instead of falling through to the subsequent batch RPC execution: after logging
and Sentry.captureException( upsertErrorScoped, gbScope ) (see variables
upsertErrorScoped, workerId, gradebook_id, rowEntries, gbScope) return or throw
to abort the handler so the recalculation state is not left unset and downstream
batch RPC/archive logic is not executed.
---
Nitpick comments:
In `@supabase/functions/gradebook-column-recalculate/index.ts`:
- Around line 450-617: This code duplicates the upsert + chunked-RPC
batch-update flow (upsertRowKeysScoped, batchUpsertDataScoped,
entriesByStudentScoped, batchUpdatesScoped, chunksScoped and the loop calling
update_gradebook_rows_batch) and should be extracted into a single reusable
helper; create a shared function (e.g., processGradebookBatch or
processRowRecalcBatch) that accepts classId, gradebook_id, is_private,
rowEntries, workerId, adminSupabase and gbScope, move the logic that computes
upsert keys/counts, performs the upsert (batchUpsertDataScoped), groups entries
by student, chunks batchUpdatesScoped, calls the RPC
update_gradebook_rows_batch, and parses/logs results, then replace this
duplicated block with a single call to that helper from both scoped and bulk
branches so the flow is maintained in one place and won’t drift.
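The early abort suggested in the inline comment above (stop when `upsertErrorScoped` is truthy) might look like this sketch; the function shape and logging are assumptions, and the real handler would also call `Sentry.captureException(upsertError, gbScope)`:

```typescript
// Hypothetical early-abort check: if the recalc-state upsert failed, the
// caller must stop before invoking the batch RPC / archive logic.
function shouldContinueAfterUpsert(
  upsertError: Error | null,
  workerId: string
): boolean {
  if (upsertError) {
    console.error(`[worker ${workerId}] recalc-state upsert failed:`, upsertError.message);
    // In the real handler: Sentry.captureException(upsertError, gbScope);
    return false; // abort so recalculation state is not left inconsistent
  }
  return true;
}
```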
Summary by CodeRabbit
Bug Fixes
Refactor