Skip to content

Add support for OpenRouter in Feedbot #622

Open
jon-bell wants to merge 2 commits into staging from feedbot-to-open-router

Conversation

@jon-bell
Contributor

@jon-bell jon-bell commented Feb 27, 2026

Summary by CodeRabbit

  • New Features

    • Added OpenRouter as a supported AI provider option for AI-powered features.
  • Behavior

    • Normalized temperature handling to respect explicit defaults.
    • Added clear error messaging when an OpenRouter API key is missing.
  • Tests

    • Expanded unit and end-to-end tests to cover OpenRouter flows.
    • Increased grader-result test coverage per submission to include additional OpenRouter-backed tests.

@coderabbitai
Contributor

coderabbitai bot commented Feb 27, 2026

Walkthrough

Adds OpenRouter as a supported LLM provider across the LLM hint flow: updates provider types, configures ChatOpenAI for OpenRouter (baseURL and API key handling), adjusts model/temperature resolution, and adds corresponding unit, e2e, and seeding changes.
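The provider-union update mentioned in the walkthrough can be sketched roughly as follows. This is an illustrative simplification, not the real code: the const-array pattern and the `isLLMProvider` guard are assumptions, and the actual `GraderResultTestExtraData` type in `DatabaseTypes.d.ts` is richer than this.

```typescript
// Illustrative sketch of the extended provider union. Deriving the union
// from a const array keeps the runtime check and the type in sync.
const LLM_PROVIDERS = ["openai", "azure", "anthropic", "openrouter"] as const;
type LLMProvider = (typeof LLM_PROVIDERS)[number];

// Type guard: narrows an arbitrary string to LLMProvider when it matches.
function isLLMProvider(value: string): value is LLMProvider {
  return (LLM_PROVIDERS as readonly string[]).includes(value);
}
```

With this shape, an invalid-provider branch (like the updated error text the walkthrough mentions) can simply be the `else` case of the guard.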

Changes

  • LLM Hint API Route (app/api/llm-hint/route.ts): Added "openrouter" to the provider union; implemented an OpenRouter branch in getChatModel with API-key resolution (global and account-scoped), baseURL https://openrouter.ai/api/v1, temperature handling via ??, and default model-name resolution for OpenRouter. Updated the invalid-provider error text.
  • Type Definitions (utils/supabase/DatabaseTypes.d.ts): Extended the GraderResultTestExtraData.llm.provider literal type to include "openrouter".
  • Database Seeding (scripts/DatabaseSeedingUtils.ts): Increased grader result tests per submission (5 → 6) by adding OpenRouter-based LLM hint test entries in initial construction and per-chunk aggregation.
  • Unit Tests (tests/unit/llm-hint.test.ts): Added OpenRouter test cases: client selection uses ChatOpenAI with baseURL, model/temperature preserved, env-var handling for OPENROUTER_API_KEY and account-scoped keys, and missing-key error expectations. Kept existing provider tests.
  • E2E Tests (tests/e2e/llm-hint.test.tsx): Added an OpenRouter e2e test path gated on OPENROUTER_API_KEY_e2e_test, created/linked OpenRouter grader_result_test entries, and added a dedicated "should work with openrouter" test. Also changed grader_result_tests query ordering for determinism.
  • Manifest / Misc ((manifest lines changed)): Manifest updated (approx. +24/-6 lines) to reflect additions.
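The route changes summarized above can be sketched in isolation. This is a hedged illustration, not the actual route code: the real branch constructs a ChatOpenAI client, whereas this sketch only builds the config object; the function name resolveOpenRouterConfig and the account-scoped key naming OPENROUTER_API_KEY_<account> are assumptions, while the 0.85 default temperature and the openai/gpt-4o-mini fallback come from the review comments on this PR.

```typescript
// Hedged sketch of the OpenRouter branch in getChatModel. The account-scoped
// env-var naming below is an assumption, not the verified convention.
interface OpenRouterOptions {
  model?: string;
  temperature?: number;
  account?: string;
}

interface OpenRouterConfig {
  model: string;
  apiKey: string;
  baseURL: string;
  temperature: number;
}

function resolveOpenRouterConfig(
  env: Record<string, string | undefined>,
  opts: OpenRouterOptions
): OpenRouterConfig {
  // Prefer an account-scoped key over the global OPENROUTER_API_KEY.
  const keyEnvName = opts.account
    ? `OPENROUTER_API_KEY_${opts.account}`
    : "OPENROUTER_API_KEY";
  const apiKey = env[keyEnvName];
  if (!apiKey) {
    // Clear error message when the key is missing, per the PR summary.
    throw new Error(`Missing OpenRouter API key: set ${keyEnvName}`);
  }
  return {
    // Default model per the review comment on this PR.
    model: opts.model ?? "openai/gpt-4o-mini",
    apiKey,
    baseURL: "https://openrouter.ai/api/v1",
    // ?? (not ||) so an explicit temperature of 0 is preserved.
    temperature: opts.temperature ?? 0.85,
  };
}
```

In the real route the resulting values would be passed to the ChatOpenAI constructor via its `configuration.baseURL` option, as the review diff further down shows.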
```mermaid
sequenceDiagram
    rect rgba(60,130,200,0.5)
    Client->>API_Route: POST /api/llm-hint (provider: "openrouter", model, temp)
    end
    rect rgba(120,200,80,0.5)
    API_Route->>GetChatModel: resolve provider & api key (env or account)
    GetChatModel->>Env: read OPENROUTER_API_KEY / account-scoped key
    GetChatModel-->>API_Route: ChatOpenAI client configured (baseURL=https://openrouter.ai/api/v1)
    end
    rect rgba(200,90,140,0.5)
    API_Route->>OpenRouter_API: send request via ChatOpenAI client
    OpenRouter_API-->>API_Route: LLM response
    API_Route-->>Client: LLM hint response
    end
```

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~22 minutes

Possibly related PRs

Poem

A new route hums toward a distant shore,
Keys and models clasped behind the door,
Tests awake to greet the change,
Types extended, seeds arranged,
OpenRouter sails — a quiet encore. 🚢

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)

  • Description Check: ✅ Passed. Check skipped - CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately summarizes the main objective of the pull request: adding OpenRouter as a new provider to the Feedbot system across multiple files and test suites.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
tests/unit/llm-hint.test.ts (1)

717-781: Add one test for OpenRouter default model fallback.

Please add a case where provider: "openrouter" and model is omitted, then assert model: "openai/gpt-4o-mini" is used. That protects the new defaulting logic from regressions.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/unit/llm-hint.test.ts` around lines 717 - 781, Add a new test inside
the "OpenRouter Provider" describe block that sets
mockTestResult.extra_data.llm!.provider = "openrouter" but leaves model
undefined, then POST to the API via the existing POST(request) helper and assert
response.status === 200 and that mockChatOpenAI was called with an
objectContaining({ model: "openai/gpt-4o-mini", apiKey:
process.env.OPENROUTER_API_KEY (or account-specific env if you set
mockTestResult.extra_data.llm!.account), configuration: { baseURL:
"https://openrouter.ai/api/v1" } }); use the same pattern as the other
OpenRouter tests and refer to POST, mockTestResult, and mockChatOpenAI so the
test verifies the default model fallback.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@app/api/llm-hint/route.ts`:
- Around line 188-200: The temperature fallback currently uses the || operator
which treats 0 as falsy and overrides explicit 0; update each provider branch
("openai", "azure", "anthropic", "openrouter") to use the nullish coalescing
operator (??) for the temperature assignment (e.g., replace temperature || 0.85
with temperature ?? 0.85) in the constructors (e.g., ChatOpenAI / AzureOpenAI /
ChatAnthropic or whichever classes are used in those branches) so an explicit 0
is preserved.
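The || vs. ?? distinction the inline comment describes can be demonstrated in isolation. The helper names below are illustrative only; the 0.85 default matches the value shown in the review diff.

```typescript
// Why the review asks for ?? instead of ||: with ||, an explicit
// temperature of 0 is falsy and silently becomes the 0.85 default.
function temperatureWithOr(temperature?: number): number {
  return temperature || 0.85; // buggy: 0 -> 0.85
}

function temperatureWithNullish(temperature?: number): number {
  return temperature ?? 0.85; // correct: only null/undefined fall back
}
```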


ℹ️ Review info

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 14a4890 and 555ed79.

📒 Files selected for processing (5)
  • app/api/llm-hint/route.ts
  • scripts/DatabaseSeedingUtils.ts
  • tests/e2e/llm-hint.test.tsx
  • tests/unit/llm-hint.test.ts
  • utils/supabase/DatabaseTypes.d.ts

@argos-ci

argos-ci bot commented Feb 27, 2026

The latest updates on your projects. Learn more about Argos notifications ↗︎

Awaiting the start of a new Argos build…

coderabbitai[bot]
coderabbitai bot previously approved these changes Feb 28, 2026

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
app/api/llm-hint/route.ts (1)

188-200: OpenRouter integration is correct — optional attribution headers can enhance tracking.

The implementation correctly uses ChatOpenAI with baseURL override, which is the standard approach for OpenAI-compatible APIs. API key handling and error messaging match established patterns.

OpenRouter supports optional HTTP-Referer and X-Title headers for attribution and analytics. These are entirely optional and won't affect functionality, but can improve tracking in the OpenRouter dashboard if desired.

🔧 Optional: Add attribution headers

```diff
     return new ChatOpenAI({
       model,
       apiKey: process.env[key_env_name],
-      configuration: { baseURL: "https://openrouter.ai/api/v1" },
+      configuration: {
+        baseURL: "https://openrouter.ai/api/v1",
+        defaultHeaders: {
+          "HTTP-Referer": process.env.OPENROUTER_REFERER || "https://pawtograder.com",
+          "X-Title": "Pawtograder Feedbot"
+        }
+      },
       temperature: temperature ?? 0.85,
       maxTokens: maxTokens,
       maxRetries: maxRetries || 2
     });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/llm-hint/route.ts` around lines 188 - 200, Add optional attribution
headers to the OpenRouter client construction: when provider === "openrouter"
and creating the ChatOpenAI instance, extend the configuration object passed to
ChatOpenAI (currently configuration: { baseURL: "https://openrouter.ai/api/v1"
}) to include a headers map with optional "HTTP-Referer" and "X-Title" values
(e.g., taken from environment variables or request context); update key_env_name
usage remains the same and ensure these headers are only added when present so
behavior is unchanged if they are not set.

ℹ️ Review info

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 555ed79 and 0fc1f3a.

📒 Files selected for processing (1)
  • app/api/llm-hint/route.ts

