
fix: Gemini API key invalid due to Ollama Cloud key overwrite#2288

Merged
naorpeled merged 3 commits into qodo-ai:main from shine911:fix-gemini-api-key-invalid
Mar 31, 2026

Conversation

@shine911
Contributor

@shine911 shine911 commented Mar 25, 2026

Log

WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:420 - Error during LLM inference: litellm.AuthenticationError: GeminiException - {
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "API_KEY_INVALID",
        "domain": "googleapis.com",
        "metadata": {
          "service": "generativelanguage.googleapis.com"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.LocalizedMessage",
        "locale": "en-US",
        "message": "API key not valid. Please pass a valid API key."
      }
    ]
  }
}

Solution: Fixes Gemini key overwrite when an Ollama Cloud API key is not present.
Reference: #2287, #2292

@qodo-free-for-open-source-projects
Contributor

Review Summary by Qodo

Fix API key conflicts between Ollama Cloud and other providers

🐞 Bug fix


Walkthroughs

Description
• Prevent Ollama Cloud API key from overwriting other providers' keys
• Only set api_key in kwargs when Ollama Cloud is explicitly configured
• Fixes Gemini API authentication failures caused by key conflicts
Diagram
flowchart LR
  A["LiteLLM AI Handler"] --> B["Check for Ollama Cloud API Key"]
  B -->|Key exists| C["Set api_key in kwargs"]
  B -->|Key missing| D["Skip api_key assignment"]
  C --> E["Prevent key overwrite for other providers"]
  D --> E
  E --> F["Successful API calls for all providers"]

File Changes

1. pr_agent/algo/ai_handlers/litellm_ai_handler.py 🐞 Bug fix +5/-3

Conditionally set API key only for Ollama Cloud

• Replaced unconditional api_key assignment with conditional logic
• Added check for OLLAMA.API_KEY setting before setting kwargs["api_key"]
• Only applies litellm.api_key when Ollama Cloud is configured
• Prevents unintended API key overwrites for non-Ollama providers like Gemini
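The conditional logic described in these bullets can be sketched as follows. This is an illustrative stand-in, not the merged diff: `resolve_api_key`, the plain-dict settings, and the `global_api_key` parameter (standing in for `litellm.api_key`) are all assumed names for the sketch.

```python
def resolve_api_key(model: str, settings: dict, global_api_key):
    """Return an api_key to inject into the request kwargs, or None to skip.

    The key is only injected for Ollama Cloud requests (model prefixed with
    "ollama") and only when OLLAMA.API_KEY is actually configured; every
    other provider keeps its own authentication path untouched.
    """
    if model.startswith("ollama") and settings.get("OLLAMA.API_KEY"):
        return global_api_key
    return None  # leave kwargs alone so Gemini et al. are not overwritten

# Gemini request with an Ollama key configured: no injection, no overwrite
assert resolve_api_key("gemini/gemini-pro", {"OLLAMA.API_KEY": "ok-123"}, "ok-123") is None
# Ollama Cloud request: the configured key is passed through
assert resolve_api_key("ollama/llama3", {"OLLAMA.API_KEY": "ok-123"}, "ok-123") == "ok-123"
```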


@qodo-free-for-open-source-projects
Contributor

qodo-free-for-open-source-projects bot commented Mar 25, 2026

Code Review by Qodo

🐞 Bugs (0) 📘 Rule violations (1) 📎 Requirement gaps (0)



Action required

1. Ollama key can be wrong 🐞 Bug ≡ Correctness
Description
For Ollama Cloud requests, chat_completion() sets kwargs['api_key'] from the process-global
litellm.api_key, but that global is overwritten during handler initialization by other provider
configs (e.g., OpenRouter/Azure). In a multi-provider config, Ollama calls can authenticate with the
wrong key and fail.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R409-411]

+            # Support for Ollama Cloud
+            if model.startswith('ollama') and get_settings().get("OLLAMA.API_KEY", None):
+                kwargs["api_key"] = litellm.api_key
Evidence
The new conditional path for Ollama Cloud pulls the API key from litellm.api_key. But
LiteLLMAIHandler.__init__() assigns litellm.api_key multiple times for different providers,
including assigning the Ollama key and then later overwriting it for OpenRouter (and also Azure AD),
so litellm.api_key is not a stable source of the Ollama key even when OLLAMA.API_KEY is
configured.

pr_agent/algo/ai_handlers/litellm_ai_handler.py[405-414]
pr_agent/algo/ai_handlers/litellm_ai_handler.py[82-87]
pr_agent/algo/ai_handlers/litellm_ai_handler.py[115-133]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`chat_completion()` sets `kwargs["api_key"]` for Ollama Cloud using `litellm.api_key`, but `litellm.api_key` is a process-global that `LiteLLMAIHandler.__init__()` overwrites for multiple providers (e.g., OpenRouter, Azure AD). This makes Ollama Cloud calls use the wrong API key in multi-provider configurations.
### Issue Context
The PR’s goal is to prevent provider API keys (e.g., Ollama Cloud) from contaminating Gemini requests. The new conditional correctly limits when an explicit `api_key` is passed, but the Ollama path still relies on a mutable global (`litellm.api_key`) rather than the Ollama-specific setting.
### Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[408-412]
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[82-87]
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[128-133]
### What to change
- In the Ollama Cloud block, set `kwargs["api_key"]` from the Ollama configuration directly (e.g., `get_settings().ollama.api_key` or `get_settings().get("OLLAMA.API_KEY")`) instead of `litellm.api_key`.
- (Optional hardening) Consider avoiding assigning provider-specific credentials into the shared `litellm.api_key` global in `__init__`, or store per-provider keys in handler instance fields and select based on `model`.
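A minimal sketch of the change suggested above: reading the key from the Ollama configuration directly instead of the mutable `litellm.api_key` global. The plain-dict settings and the function name are stand-ins for the real `get_settings()` object, not the repository's actual code.

```python
def ollama_api_key(model: str, settings: dict):
    """Pull the Ollama key straight from configuration, so later writes to a
    shared process-global by other providers (OpenRouter, Azure AD, ...) cannot
    leak into Ollama requests."""
    if not model.startswith("ollama"):
        return None
    return settings.get("OLLAMA.API_KEY")

settings = {"OLLAMA.API_KEY": "ollama-key", "OPENROUTER.KEY": "or-key"}
# Ollama request gets the Ollama-specific key, regardless of global state
assert ollama_api_key("ollama/llama3", settings) == "ollama-key"
# Non-Ollama request gets nothing injected
assert ollama_api_key("gemini/gemini-pro", settings) is None
```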

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools


2. api_key set for all models 📘 Rule violation ≡ Correctness
Description
The new logic sets kwargs["api_key"] solely based on presence of OLLAMA.API_KEY, regardless of
which model is being called. This can override provider-specific authentication (e.g., Gemini) and
cause invalid-key failures due to request-parameter collisions.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R409-411]

+            # Specific only for ollma cloud api key
+            if get_settings().get("OLLAMA.API_KEY", None):
+                kwargs["api_key"] = litellm.api_key
Evidence
PR Compliance ID 17 requires preventing critical-parameter override collisions in external API
integrations. The added code sets the critical api_key request parameter whenever OLLAMA.API_KEY
exists, without restricting it to Ollama requests, which can override the correct auth mechanism for
other providers.

pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-411]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`kwargs["api_key"]` is being set whenever `OLLAMA.API_KEY` exists, regardless of the target `model`. This can collide with/override other providers' authentication and break calls (e.g., Gemini).
## Issue Context
`api_key` is a critical request field for LiteLLM provider routing/auth; it should only be injected for the provider(s) that require it for the current request, and should not override existing auth inputs.
## Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-411]
## What to change
- Restrict setting `kwargs["api_key"]` to Ollama requests only (e.g., based on `model` prefix like `ollama/` and/or `self.api_base`).
- Add a collision guard: if `"api_key"` already exists in `kwargs`, raise a clear error rather than overwriting.
- Consider provider-specific key handling rather than reusing the global `litellm.api_key` for non-Ollama models.
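The model-prefix restriction and collision guard proposed in these bullets could look like the following sketch; the function name and dict-based settings are illustrative assumptions, not the repository's API.

```python
def inject_ollama_key(kwargs: dict, model: str, settings: dict) -> dict:
    """Inject api_key for Ollama requests only, and refuse to overwrite a key
    that is already present, surfacing the collision instead of silently
    clobbering another provider's authentication."""
    key = settings.get("OLLAMA.API_KEY")
    if model.startswith("ollama") and key:
        if "api_key" in kwargs:
            raise ValueError("api_key already set; refusing to overwrite")
        kwargs["api_key"] = key
    return kwargs

# Non-Ollama model: kwargs stay untouched even though an Ollama key exists
assert "api_key" not in inject_ollama_key({}, "gemini/gemini-pro", {"OLLAMA.API_KEY": "k"})
# Ollama model: the key is injected
assert inject_ollama_key({}, "ollama/llama3", {"OLLAMA.API_KEY": "k"})["api_key"] == "k"
```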




Remediation recommended

3. Single quotes in startswith 📘 Rule violation ⚙ Maintainability
Description
New code uses single quotes in model.startswith('ollama'), which conflicts with the repo’s stated
Ruff formatting preference for double quotes. This can cause style/lint failures and introduces
inconsistent formatting.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R409-411]

+            # Support for Ollama Cloud
+            if model.startswith('ollama') and get_settings().get("OLLAMA.API_KEY", None):
+                kwargs["api_key"] = litellm.api_key
Evidence
PR Compliance ID 12 requires adhering to Ruff formatting conventions, including preferring double
quotes. The added condition uses single quotes in the modified lines.

AGENTS.md
pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-411]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
The newly added Ollama check uses single quotes (and includes whitespace-only blank lines in the diff), which conflicts with the repository’s Ruff formatting preference for double quotes and clean formatting.
## Issue Context
The compliance checklist requires Ruff/isort style, including preferring double quotes for strings.
## Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-411]




Advisory comments

4. Misspelled provider name 🐞 Bug ⚙ Maintainability
Description
A new comment misspells “ollama” as “ollma”, reducing clarity when reasoning about provider-specific
behavior.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R409-410]

+            # Specific only for ollma cloud api key
+            if get_settings().get("OLLAMA.API_KEY", None):
Evidence
The repo consistently uses “ollama” (settings section [ollama], model prefix ollama/...), but
the new comment says “ollma”.

pr_agent/settings/.secrets_template.toml[52-54]
pr_agent/algo/init.py[215-216]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
The comment in `chat_completion()` says “ollma” instead of “ollama”.
### Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-410]
- Change “ollma” to “ollama”.




@shine911 shine911 changed the title Fix gemini api key invalid cause by ollama cloud key overwrite fix: gemini api key invalid cause by ollama cloud key overwrite Mar 25, 2026
@shine911 shine911 changed the title fix: gemini api key invalid cause by ollama cloud key overwrite fix: gemini api key invalid caused by ollama cloud key overwrite Mar 26, 2026
@shine911 shine911 changed the title fix: gemini api key invalid caused by ollama cloud key overwrite fix: Gemini API key invalid due to Ollama Cloud key overwrite Mar 26, 2026

@JiwaniZakir JiwaniZakir left a comment


The condition in chat_completion checks get_settings().get("OLLAMA.API_KEY", None) to decide whether to set kwargs["api_key"] = litellm.api_key, but this logic has a subtle flaw: if a user has both an Ollama API key configured and is currently running a non-Ollama model, the Ollama key will still be injected into kwargs["api_key"] — which is precisely the overwrite bug this PR aims to fix, just under different circumstances. A more robust guard would check whether the active model string corresponds to an Ollama model (e.g. model.startswith("ollama")) rather than whether any Ollama key exists in settings.

Additionally, the original unconditional assignment kwargs["api_key"] = litellm.api_key presumably served a purpose for providers other than Ollama and Gemini — removing it entirely for all non-Ollama paths could silently break API key injection for other litellm-managed providers that rely on that field being set in kwargs. It's worth verifying that those providers aren't regressed by this change.

Minor nit: the comment has a typo ("ollma" instead of "Ollama") and reads awkwardly ("Specific only for…") — worth cleaning up before merge.

@qodo-free-for-open-source-projects
Contributor

qodo-free-for-open-source-projects bot commented Mar 26, 2026

Persistent review updated to latest commit 554c2a1

@shine911
Contributor Author

The line kwargs["api_key"] = litellm.api_key was introduced in a recent commit related to the Ollama Cloud key. The author mentioned that Ollama Cloud would not work without this assignment.

However, applying this change globally unintentionally breaks the existing architecture. It forces the api_key to be overwritten for all requests, including those targeting non-Ollama models. As a result, other providers no longer receive their correct API keys, which prevents API calls to those models from working properly.

I have also fixed a typo in the comment and added a condition to ensure this logic only applies when the model is an Ollama model.

@rynomster
Contributor

rynomster commented Mar 27, 2026

This would be great; a lot of our reviews that use Google are broken right now :)

@rynomster
Contributor

Workaround hack:

    - export OLLAMA__API_KEY=$MY_KEY

@shine911 shine911 requested a review from JiwaniZakir March 28, 2026 09:37

@JiwaniZakir JiwaniZakir left a comment


The fix addresses the right symptom but introduces a subtle inconsistency: the condition checks get_settings().get("OLLAMA.API_KEY", None) to decide whether to proceed, but then assigns litellm.api_key rather than the actual OLLAMA.API_KEY setting value. If litellm.api_key and the Ollama settings key can diverge, this could still result in a wrong key being used.

More critically, the original unconditional kwargs["api_key"] = litellm.api_key line served all models, not just Ollama. By scoping it exclusively to Ollama, any other provider that previously relied on this assignment to pass its API key into kwargs will silently stop doing so. It's worth verifying that Gemini, OpenAI-compatible, and other non-Ollama handlers set kwargs["api_key"] through a separate code path before this block.

The model.startswith('ollama') check could also be fragile — litellm uses prefixes like ollama/ and ollama_chat/, so a model string like ollama_chat/llama3 would match, but it's worth confirming all Ollama-variant prefixes used across the codebase are covered by this single check. A dedicated helper or a config-driven prefix list would be more maintainable than an inline startswith.
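A config-driven prefix check like the one suggested here could be sketched as follows. The prefix tuple only covers the two prefixes named in this comment (`ollama/` and `ollama_chat/`); litellm's actual routing may recognize more, so treat the list as an assumption to be verified against the codebase.

```python
# Prefixes assumed from this review comment; extend from configuration if
# other Ollama variants are in use.
OLLAMA_PREFIXES = ("ollama/", "ollama_chat/")

def is_ollama_model(model: str) -> bool:
    """Return True if the model string routes to an Ollama variant.

    str.startswith accepts a tuple, so one call covers every known prefix."""
    return model.startswith(OLLAMA_PREFIXES)

assert is_ollama_model("ollama/llama3")
assert is_ollama_model("ollama_chat/llama3")
assert not is_ollama_model("gemini/gemini-pro")
```

A dedicated helper like this also gives the codebase a single place to update when a new Ollama prefix appears, instead of scattered inline `startswith` calls.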

@shine911
Contributor Author

shine911 commented Mar 30, 2026

@JiwaniZakir Have you had a chance to review this merge request? It’s currently causing quite a bit of disruption just because of a single line that overwrites the API key.

This change seems to be intended only for supporting the Ollama Cloud API key. Given that, I think it would be sufficient to scope the behavior by checking model.startswith(...) as you suggested, and only applying the override when OLLAMA.API_KEY is explicitly set. That should preserve the original intent of the author without impacting other providers.

To be honest, this line probably shouldn’t have been introduced in the first place, as it creates unintended side effects across the entire request flow. Changes that impact the whole execution flow like this should be carefully reviewed at the merge request stage upfront.

P.S.: OLLAMA.API_KEY and litellm.api_key should never diverge as long as OLLAMA.API_KEY is set. Please take a look at merge request #2278.

@shine911 shine911 requested a review from JiwaniZakir March 30, 2026 07:22
@benyaminbeyzaie

For anyone facing this issue, you may want to temporarily set OPENAI.KEY to your provider's API key until a fix is pushed in a release.

@JiwaniZakir

The test issue @yanukadeneth99 raised is valid — test_anthropic_key_not_shadowed_by_dummy_key needs to mock or unset OPENAI_API_KEY before asserting, otherwise it's environment-dependent and will produce false failures in CI. Regarding PR #2278, the root fix here is straightforward: the litellm.api_key assignment for Ollama Cloud should be gated behind a check that a valid Ollama key is actually present, so it doesn't clobber keys set by other providers like Gemini or Anthropic. I'll get this merged once the test is hardened.

@naorpeled
Collaborator

Thanks @shine911 !

@ofir-frd let's release this ASAP when you have time 🙏

@naorpeled naorpeled merged commit d9e4c62 into qodo-ai:main Mar 31, 2026
2 checks passed
