Walkthrough

Centralizes E2E OIDC token validation in the shared GitHub wrapper and moves many Supabase imports from esm.sh to jsr.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Caller as Caller (function / test)
    participant GitHubWrapper as GitHubWrapper
    participant JWKS as GitHub JWKS / OIDC
    Caller->>GitHubWrapper: validateOIDCTokenOrAllowE2E(token)
    GitHubWrapper->>GitHubWrapper: decode token header & payload
    GitHubWrapper->>GitHubWrapper: repo startsWith END_TO_END_REPO_PREFIX?
    alt E2E token (prefix match)
        GitHubWrapper->>GitHubWrapper: E2E enabled & header.kid == END_TO_END_SECRET?
        GitHubWrapper-->>Caller: return decoded payload
    else Standard token
        GitHubWrapper->>JWKS: fetch JWKS & validate signature/audience
        JWKS-->>GitHubWrapper: validation result
        GitHubWrapper-->>Caller: return decoded payload
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 5
🤖 Fix all issues with AI agents
In `@supabase/functions/_shared/GitHubWrapper.ts`:
- Around line 625-647: The current END_TO_END_SECRET uses a guessable default
which allows bypassing validation; change validateOIDCTokenOrAllowE2E to require
an explicit secret (or explicit opt‑in) instead of defaulting to "not-a-secret":
remove the fallback in END_TO_END_SECRET and read the env var strictly, and in
validateOIDCTokenOrAllowE2E (referencing END_TO_END_REPO_PREFIX and header.kid)
fail closed by throwing a SecurityError if the secret env var is missing/empty
(or if an explicit E2E_ENABLE flag is not set) before accepting any E2E token.
Ensure the function logs or throws a clear error when the secret/opt-in is not
present so E2E bypass cannot be used in production.
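The fail-closed behavior described above can be sketched in isolation. Note this is an illustrative sketch, not the repo's actual `GitHubWrapper` code: `requireE2ESecret` is a hypothetical helper, and the local `SecurityError` only mirrors the error class the wrapper imports from `HandlerUtils.ts`.

```typescript
// Illustrative sketch of fail-closed secret handling. `SecurityError` stands in
// for the repo's HandlerUtils error class; `requireE2ESecret` is hypothetical.
class SecurityError extends Error {}

function requireE2ESecret(env: Record<string, string | undefined>): string {
  const secret = env["END_TO_END_SECRET"];
  if (!secret) {
    // Fail closed: with no explicit secret, the E2E bypass must be unusable.
    throw new SecurityError("END_TO_END_SECRET is not set; refusing E2E token bypass");
  }
  return secret;
}
```

The key property is that there is no default value: a missing or empty env var throws before any E2E token can be accepted.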
In `@supabase/functions/assignment-create-all-repos/index.ts`:
- Line 1: The import for the Supabase client currently uses a floating major
constraint `jsr:@supabase/supabase-js@2`; change it to a specific pinned version
(e.g., `jsr:@supabase/supabase-js@2.x.y`) or reference the project's import map
entry so it matches the project's versioning strategy (align with other pins
like `^2.4.5`); update the import statement at the top of the file (the
`createClient` import) to use the chosen exact version or import map key.
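The two options described above might look like the following sketch; `2.x.y` is a placeholder for whatever exact version the project standardizes on, and the import-map key is illustrative:

```typescript
// Option 1: pin an exact version directly in the specifier.
// "2.x.y" is a placeholder; substitute the project's chosen version.
import { createClient } from "jsr:@supabase/supabase-js@2.x.y";

// Option 2: reference an import-map entry (defined in deno.json) so every
// function resolves the same pinned version from one place:
// import { createClient } from "@supabase/supabase-js";
```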
In `@supabase/functions/scripts/CheckBranchProtection.ts`:
- Around line 344-355: The comment says "max 50/min" but the Bottleneck instance
named rateLimiter is configured with reservoir and reservoirRefreshAmount set to
100; update the Bottleneck configuration in CheckBranchProtection.ts (the
rateLimiter creation) to match the comment by changing reservoir and
reservoirRefreshAmount to 50 (or alternatively update the comment to reflect 100
if that is intended) and keep maxConcurrent as-is.
In `@tests/e2e/TestingUtils.ts`:
- Around line 1638-1644: Remove the leftover debug statement that prints stack
traces during tests by deleting the console.trace(data) call inside the
error-handling block that checks if ("error" in data) (the branch that inspects
data.error and throws a new Error using data.error.details); leave the
surrounding logic and throws intact so the function still throws the appropriate
Error when data.error is present.
- Around line 1538-1585: Remove the duplicated GradeResponse type declaration
and instead import the existing GradeResponse type from
supabase/functions/_shared/FunctionTypes.d.ts; keep the local
GradingScriptResult declaration as-is if the expanded feedback shape is
intentionally different, otherwise refactor GradingScriptResult to reference the
shared AutograderFeedback type from FunctionTypes.d.ts (use the type name
AutograderFeedback and GradeResponse to locate the shared definitions and update
the imports at the top of tests/e2e/TestingUtils.ts accordingly).
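If the shared types are reused as suggested, the imports at the top of the test utilities might look like this sketch (the relative path is assumed, not verified against the repo layout):

```typescript
// Hypothetical consolidated type imports for tests/e2e/TestingUtils.ts.
import type { AutograderFeedback, GradeResponse } from "../../supabase/functions/_shared/FunctionTypes.d.ts";
```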
♻️ Duplicate comments (9)
supabase/functions/scripts/PushChangesToRepoFromHandout.ts (1)
13-13: Same jsr import verification as above. Please apply the same runtime compatibility check for `jsr:@supabase/supabase-js@2` in this script's execution environment.

supabase/functions/assignment-create-handout-repo/index.ts (1)
1-1: Same jsr import verification as above. Please apply the same runtime compatibility check for `jsr:@supabase/supabase-js@2` here as well.

supabase/functions/enrollments-sync-canvas/index.ts (1)
6-6: Same jsr import verification as above. Please apply the same runtime compatibility check for `jsr:@supabase/supabase-js@2` here as well.

supabase/functions/scripts/ClearCaches.ts (1)
2-2: Same jsr import verification as above. Please apply the same runtime compatibility check for `jsr:@supabase/supabase-js@2` here as well.

supabase/functions/gradebook-column-recalculate/index.ts (1)
2-2: Duplicate: jsr Supabase import verification. Same verification as earlier regarding jsr resolver support and version pinning (jsr `@supabase/supabase-js` versioning policy and Deno/Supabase Edge Functions support for `jsr:` specifiers).

supabase/functions/gradebook-column-recalculate/GradebookProcessor.ts (1)
1-1: Duplicate: jsr Supabase import verification. Same verification as earlier regarding jsr resolver support and version pinning (jsr `@supabase/supabase-js` versioning policy and Deno/Supabase Edge Functions support for `jsr:` specifiers).

supabase/functions/_shared/ChimeWrapper.ts (1)
5-7: Duplicate: jsr Supabase import verification. Same verification as earlier regarding jsr resolver support and version pinning (jsr `@supabase/supabase-js` versioning policy and Deno/Supabase Edge Functions support for `jsr:` specifiers).

supabase/functions/gradebook-column-recalculate/BatchProcessor.ts (1)
1-1: Duplicate: jsr Supabase import verification. Same verification as earlier regarding jsr resolver support and version pinning (jsr `@supabase/supabase-js` versioning policy and Deno/Supabase Edge Functions support for `jsr:` specifiers).

supabase/functions/metrics/index.ts (1)
2-2: Duplicate: jsr Supabase import verification. Same verification as earlier regarding jsr resolver support and version pinning (jsr `@supabase/supabase-js` versioning policy and Deno/Supabase Edge Functions support for `jsr:` specifiers).
🧹 Nitpick comments (7)
supabase/functions/discord-async-worker/index.ts (1)
2-2: Consider aligning the PostgREST `Json` type import with jsr to avoid mixed registries. If the jsr package (or a local alias) can provide the same type, it'll reduce dependency variability.

supabase/functions/github-repo-configure-webhook/index.ts (1)
4-4: Optional: align the remaining Supabase-related type import with jsr for consistency. Since you've moved the client to jsr, consider migrating the `Json` type import too if an equivalent jsr path exists.

supabase/functions/scripts/CheckBranchProtection.ts (4)
35-36: Inconsistent import sources with the JSR migration goal. The PR objective is to migrate Supabase imports to JSR, but `RequestError` and `Bottleneck` are still imported from esm.sh. While these packages may not be available on JSR, it's worth noting this inconsistency.

222-223: Magic delay for ruleset propagation. The 1-second delay is a reasonable approach for allowing the ruleset to propagate, but it could be flaky in high-latency environments. Consider making it configurable or documenting why this specific value was chosen.

385-401: Direct mutation of status object. The code mutates the `status` object from `indexedResults` directly. While this works for this CLI script, it's a side effect that could be confusing. Consider creating a new object or being more explicit about the update.

483-486: Consider awaiting the main function call. The `main()` function is async but isn't awaited at the entry point. While Deno handles unhandled promise rejections, explicitly awaiting ensures proper error propagation and process exit codes.

Suggested fix

```diff
 // Run the main function
 if (import.meta.main) {
-  main();
+  await main();
 }
```

supabase/functions/autograder-create-submission/index.ts (1)
30-30: Import `Json` from the local `SupabaseTypes.d.ts` instead of esm.sh. The `Json` type is already exported from `SupabaseTypes.d.ts` (line 1) and is used consistently in other functions like `autograder-submit-feedback` and `_shared/FunctionTypes.d.ts`. Update this import to align with the migration pattern and remove the esm.sh dependency.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql`:
- Around line 139-167: Before processing p_rules, add a guard that checks if
jsonb_array_length(p_rules) = 0 and rule_logic = 'and' and handle it explicitly
(either RAISE EXCEPTION with a clear message or short-circuit to skip inserts
and auto-populate). Locate the proc/section that uses p_rules and
v_all_rules_match (the block that deletes from error_pin_rules, loops over
p_rules, and later deletes from error_pin_submission_matches) and insert the
check there so you do not proceed to insert zero rules or compute matches when
no rules were supplied; ensure any early exit prevents the auto-populate step
from running and documents the failure reason.
- Around line 88-117: The UPDATE can null-out or error when the JSON payload
omits fields; before UPDATE (inside the branch handling existing pins) SELECT
the current error_pins row into local variables (e.g.,
v_existing_discussion_thread_id, v_existing_assignment_id,
v_existing_class_id_already_set) and then use COALESCE between the parsed
p_error_pin values and those existing variables in the UPDATE statement (keep
conversions: ::bigint/::boolean and preserve rule_logic default). Ensure you
still enforce authorizeforclassgrader(v_existing_class_id) and derive
v_class_id/v_assignment_id as before, but fall back to the selected existing
assignment_id/class_id/discussion_thread_id when the JSON key is missing.
Actionable comments posted: 4
Caution: Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
supabase/functions/discord-discussion-stats-update/index.ts (1)
330-348: ⚠️ Potential issue | 🔴 Critical

Auth check: the `x-supabase-webhook-source` header is user-spoofable and should not be used for security decisions.

The condition on line 342 allows bypass of the secret check if the `x-supabase-webhook-source` header equals `"discord-discussion-stats-update"`. External callers can trivially set this header; Supabase edge functions do not strip custom headers. Supabase's own documentation explicitly discourages relying on `x-supabase-*` headers for authentication.

Remove the `webhookSource` check from the auth condition (line 342). Rely solely on the `x-edge-function-secret` validation, or implement a dedicated secret header like `x-webhook-secret` as recommended by Supabase's security guidance.
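A minimal sketch of the recommended check, relying only on a shared-secret header and deliberately ignoring spoofable `x-supabase-*` headers; the function shape and `Map`-based headers are illustrative, not the function's actual code:

```typescript
// Authorize solely on a shared-secret header. Any x-supabase-* header is
// attacker-controllable, so it never participates in the decision.
function isAuthorized(headers: Map<string, string>, expectedSecret: string): boolean {
  const provided = headers.get("x-edge-function-secret");
  return provided !== undefined && provided === expectedSecret;
}
```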
🤖 Fix all issues with AI agents
In `@supabase/functions/discord-discussion-stats-update/index.ts`:
- Line 5: The file imports Json from an external esm.sh module redundantly;
update the import that brings in Database from "../_shared/SupabaseTypes.d.ts"
to also import Json (i.e., import type { Database, Json } from
"../_shared/SupabaseTypes.d.ts") and remove the separate esm.sh import
statement; ensure you only reference the local Json type and delete the import
line that mentions esm.sh to keep imports consistent (affects the Json symbol
and the existing Database import).
In `@supabase/migrations/20260130180853_submission-index-clean.sql`:
- Around line 1-5: Change each DROP INDEX statement to use DROP INDEX IF EXISTS
for idempotency (apply to idx_submissions_assignment_id,
submissions_repository_idx, idx_submissions_assignment_id_is_active,
idx_submissions_assignment_group_assignment_active,
idx_submissions_class_profile_assignment_active_covering) and add a brief SQL
comment above the statements explaining that these indexes were originally
created to optimize the assignment_overview view but became obsolete after the
correlated-subqueries refactor (see migration
20250819000010_fix_assignment_overview_correlated_subqueries.sql), so the drops
are safe and should be performed without failing on missing indexes.
In `@supabase/migrations/20260207011716_create-discord-async-worker-dlq.sql`:
- Around line 10-16: The DO $$ block currently swallows all exceptions around
the pgmq.create('discord_async_calls_dlq') call; change the handler to catch
only the specific duplicate-object/duplicate-table SQLSTATE and re-raise any
other errors so unrelated failures aren’t hidden. Replace the generic "exception
when others then" with a targeted handler such as "exception when SQLSTATE
'42710' then" (or '42P07' if appropriate for duplicate table) and ensure any
other exceptions are not suppressed (i.e., re-raise them) while keeping the
comment about "queue likely exists".
In `@supabase/migrations/20260207123636_discord-stats-change-tracking.sql`:
- Around line 9-11: Update the COMMENT ON COLUMN for
public.discord_messages.last_synced_stats to reflect the actual JSON keys stored
(likes_count, children_count, is_question, has_answer) instead of the incorrect
tuple (likes_count, children_count, answer); locate the COMMENT ON COLUMN
statement that sets the description for last_synced_stats and change the text to
mention the four keys and that it is used by discord-discussion-stats-update to
skip no-op updates.
🧹 Nitpick comments (3)
supabase/functions/_shared/DiscordWrapper.ts (1)
145-173: Solid timeout implementation: one small redundancy to tidy up.

The `clearTimeout(timer)` on line 161 inside the `catch` block is redundant because the `finally` block on line 172 already unconditionally clears it. Having it in both places won't cause bugs, but it's unnecessary noise that might confuse a future reader into thinking the `finally` cleanup is insufficient.

🧹 Remove the redundant clearTimeout in catch

```diff
   } catch (fetchError) {
-    clearTimeout(timer);
     // Convert AbortError into a descriptive timeout error
     if (fetchError instanceof DOMException && fetchError.name === "AbortError") {
```

supabase/functions/metrics/index.ts (1)
124-131: Pre-existing: constant-time comparison isn't actually constant-time.

Not introduced by this PR, but worth noting since you're in the neighborhood: `Array.prototype.every()` short-circuits on the first `false`, so line 128 isn't truly constant-time. Additionally, `new Uint8Array(providedHash)` is re-allocated on every iteration. And the `catch` fallback on line 130 silently degrades to a direct `===` comparison, which is timing-attackable.

If you ever come back to harden this, `crypto.subtle.timingSafeEqual` (or a fixed-length XOR accumulator over the hash bytes) would be the way to go.

♻️ Suggested constant-time comparison

```diff
-  const expectedHash = await crypto.subtle.digest("SHA-256", expectedBytes);
-  const providedHash = await crypto.subtle.digest("SHA-256", providedBytes);
-  return new Uint8Array(expectedHash).every((byte, i) => byte === new Uint8Array(providedHash)[i]);
-} catch {
-  return providedToken === metricsToken;
-}
+  const expectedHash = new Uint8Array(await crypto.subtle.digest("SHA-256", expectedBytes));
+  const providedHash = new Uint8Array(await crypto.subtle.digest("SHA-256", providedBytes));
+  let mismatch = 0;
+  for (let i = 0; i < expectedHash.length; i++) {
+    mismatch |= expectedHash[i] ^ providedHash[i];
+  }
+  return mismatch === 0;
+} catch {
+  // Fallback: still not constant-time, but better than nothing
+  return providedToken === metricsToken;
+}
```

supabase/functions/discord-discussion-stats-update/index.ts (1)
260-267: Stats-skipping logic is sound, but the `processed` count becomes misleading.

When a thread or topic isn't found (lines 247-258), execution hits `continue` without incrementing `skipped` or `errors`; only `processed` was incremented. Combined with the new `skipped` counter, the invariant `processed == updated + skipped + errors` doesn't hold for those "not found" cases.

This is pre-existing, but now that you're reporting `skipped` it becomes more visible. Consider either incrementing `skipped` (or a dedicated counter) for the "not found" cases, or documenting that `processed` can exceed the sum.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@supabase/functions/_shared/GitHubWrapper.ts`:
- Around line 662-663: Prettier formatting is failing; run the project's
Prettier formatter on the file containing the "const header = decoded[0] as {
... }" declaration (and any other changed files) by executing something like
"prettier --write" (or your repo's npm/yarn format script) and commit the
resulting changes so the CI formatting check passes; ensure you use the repo's
Prettier config so the "GitHubWrapper.ts" header/decoded block is reformatted
consistently.
In `@supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql`:
- Around line 215-219: The guard that checks for zero rules uses
jsonb_array_length(p_rules) which returns NULL for NULL p_rules and thus
bypasses the check; update the condition to treat NULL as empty by testing
p_rules IS NULL OR jsonb_array_length(p_rules) = 0 before raising the exception
(the check around p_rules and v_pin_record.rule_logic = 'and'), and verify any
code that initializes v_all_rules_match (used later) relies on this guard so it
won't remain true for NULL/empty rule sets.
- Around line 314-321: The current loop can reassign v_first_test_id after it
was nulled; declare a new boolean flag v_mixed_tests := false alongside the
other vars (e.g., near the existing declarations around line 30), set
v_mixed_tests := true when you detect a differing test (where the code currently
sets v_first_test_id := NULL), and change the matching logic so you only assign
v_first_test_id when v_mixed_tests is false (i.e., check v_mixed_tests before
the IF v_first_test_id IS NULL branch). Also ensure any later use of
v_first_test_id treats it as valid only when v_mixed_tests is false.
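The `v_mixed_tests` control flow described above can be sketched in TypeScript to make the intent concrete; the names are illustrative analogues of the PL/pgSQL variables, not code from the migration:

```typescript
// Returns the single test id shared by all rows, or null when the test ids
// are mixed (or when there are no rows). Mirrors the suggested flag logic:
// once a mismatch is seen, firstTestId is never reassigned.
function firstCommonTestId(testIds: number[]): number | null {
  let firstTestId: number | null = null;
  let mixedTests = false;
  for (const id of testIds) {
    if (mixedTests) continue; // once mixed, stay mixed
    if (firstTestId === null) {
      firstTestId = id; // only assign while not mixed
    } else if (firstTestId !== id) {
      mixedTests = true; // a differing test was detected
    }
  }
  return mixedTests ? null : firstTestId;
}
```

The flag prevents the bug the review describes: without it, a later row could repopulate the id after it had been nulled out by a mismatch.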
🧹 Nitpick comments (3)
supabase/functions/_shared/GitHubWrapper.ts (1)
9-9: Consider consolidating imports from `HandlerUtils.ts`.

There are two separate import statements from the same module: `SecurityError` on line 9 and `UserVisibleError` on line 39. These could be merged into one.

🧹 Suggested consolidation

```diff
-import { SecurityError } from "./HandlerUtils.ts";
 ...
-import { UserVisibleError } from "./HandlerUtils.ts";
+import { SecurityError, UserVisibleError } from "./HandlerUtils.ts";
```

Also applies to: 39-39
supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql (2)
39-61: Duplicate `class_id` selection into two variables is confusing.

`class_id` is selected twice (lines 48 and 51) into `v_existing_class_id` and `v_existing_class_id_already_set`, which always hold the same value. One of these variables (and its declaration at line 43) can be removed; just reuse `v_existing_class_id` on line 96.

♻️ Suggested cleanup

```diff
 DECLARE
   v_existing_class_id bigint;
   v_existing_discussion_thread_id bigint;
   v_existing_assignment_id bigint;
-  v_existing_class_id_already_set bigint;
   v_existing_rule_logic text;
   v_existing_enabled boolean;
 BEGIN
   SELECT
     class_id,
     discussion_thread_id,
     assignment_id,
-    class_id,
     rule_logic,
     enabled
   INTO
     v_existing_class_id,
     v_existing_discussion_thread_id,
     v_existing_assignment_id,
-    v_existing_class_id_already_set,
     v_existing_rule_logic,
     v_existing_enabled
   FROM error_pins
   WHERE id = (p_error_pin->>'id')::bigint;
```

And on line 96:

```diff
-  v_class_id := v_existing_class_id_already_set;
+  v_class_id := v_existing_class_id;
```
254-396: Auto-populate loop may be slow for large datasets; consider performance implications.

The triple-nested loop (submissions × rules × tests) with per-iteration calls to `evaluate_error_pin_rule` runs inside a single transaction under `SECURITY DEFINER`. For classes with many active submissions, this could hold locks and cause timeouts. If this becomes a concern, consider:
- Moving auto-population to an async/background job.
- Using set-based SQL instead of row-by-row PL/pgSQL loops.
No immediate action required — just flagging for awareness as the feature scales.