fix: Gemini API key invalid due to Ollama Cloud key overwrite (#2288)
Conversation
- It caused the Gemini model to fail when calling the API. Refs: qodo-ai#2278
Review Summary by Qodo

Fix API key conflicts between Ollama Cloud and other providers
Walkthrough

Description:
• Prevent Ollama Cloud API key from overwriting other providers' keys
• Only set api_key in kwargs when Ollama Cloud is explicitly configured
• Fixes Gemini API authentication failures caused by key conflicts

Diagram:

```mermaid
flowchart LR
    A["LiteLLM AI Handler"] --> B["Check for Ollama Cloud API Key"]
    B -->|Key exists| C["Set api_key in kwargs"]
    B -->|Key missing| D["Skip api_key assignment"]
    C --> E["Prevent key overwrite for other providers"]
    D --> E
    E --> F["Successful API calls for all providers"]
```
File Changes
1. pr_agent/algo/ai_handlers/litellm_ai_handler.py
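The fix described in the walkthrough can be sketched roughly as follows. This is a hypothetical simplification for illustration only: the function name `build_kwargs` and its signature are not from the PR, whose actual logic lives inside the handler in `litellm_ai_handler.py`.

```python
# Minimal sketch of the conditional api_key injection (hypothetical
# simplification of the logic in LiteLLMAIHandler.chat_completion).
def build_kwargs(model: str, ollama_api_key, litellm_api_key) -> dict:
    kwargs = {"model": model}
    # Only inject the key when an Ollama Cloud key is explicitly configured,
    # so other providers (e.g. Gemini) keep their own authentication path
    # instead of having their key overwritten.
    if ollama_api_key:
        kwargs["api_key"] = litellm_api_key
    return kwargs
```

With this guard, a Gemini request built without an Ollama key configured never receives a stray `api_key` entry.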
Code Review by Qodo
JiwaniZakir left a comment
The condition in chat_completion checks get_settings().get("OLLAMA.API_KEY", None) to decide whether to set kwargs["api_key"] = litellm.api_key, but this logic has a subtle flaw: if a user has both an Ollama API key configured and is currently running a non-Ollama model, the Ollama key will still be injected into kwargs["api_key"] — which is precisely the overwrite bug this PR aims to fix, just under different circumstances. A more robust guard would check whether the active model string corresponds to an Ollama model (e.g. model.startswith("ollama")) rather than whether any Ollama key exists in settings.
Additionally, the original unconditional assignment kwargs["api_key"] = litellm.api_key presumably served a purpose for providers other than Ollama and Gemini — removing it entirely for all non-Ollama paths could silently break API key injection for other litellm-managed providers that rely on that field being set in kwargs. It's worth verifying that those providers aren't regressed by this change.
Minor nit: the comment has a typo ("ollma" instead of "Ollama") and reads awkwardly ("Specific only for…") — worth cleaning up before merge.
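The reviewer's suggested model-based guard could look like the sketch below. The helper name and signature are hypothetical, not from the PR; the point is that the decision keys off the active model string rather than the mere presence of an Ollama key in settings.

```python
# Sketch of the reviewer's suggestion: guard on the active model, so a
# configured Ollama key never leaks into non-Ollama requests. Helper name
# and signature are hypothetical.
def should_inject_ollama_key(model: str, ollama_api_key) -> bool:
    return bool(ollama_api_key) and model.startswith("ollama")
```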
- fix typo in comment
Persistent review updated to latest commit 554c2a1
The line kwargs["api_key"] = litellm.api_key was introduced in a recent commit related to the Ollama Cloud key. The author mentioned that Ollama Cloud would not work without this assignment. However, applying this change globally unintentionally breaks the existing architecture: it forces the api_key to be overwritten for all requests, including those targeting non-Ollama models. As a result, other providers no longer receive their correct API keys, which prevents API calls to those models from working properly. I have also fixed a typo in the comment and added a condition to ensure this logic only applies when the model is an Ollama model.
This would be great; a lot of our reviews that use Google are broken now :)
workaround hack:
JiwaniZakir left a comment
The fix addresses the right symptom but introduces a subtle inconsistency: the condition checks get_settings().get("OLLAMA.API_KEY", None) to decide whether to proceed, but then assigns litellm.api_key rather than the actual OLLAMA.API_KEY setting value. If litellm.api_key and the Ollama settings key can diverge, this could still result in a wrong key being used.
More critically, the original unconditional kwargs["api_key"] = litellm.api_key line served all models, not just Ollama. By scoping it exclusively to Ollama, any other provider that previously relied on this assignment to pass its API key into kwargs will silently stop doing so. It's worth verifying that Gemini, OpenAI-compatible, and other non-Ollama handlers set kwargs["api_key"] through a separate code path before this block.
The model.startswith('ollama') check could also be fragile — litellm uses prefixes like ollama/ and ollama_chat/, so a model string like ollama_chat/llama3 would match, but it's worth confirming all Ollama-variant prefixes used across the codebase are covered by this single check. A dedicated helper or a config-driven prefix list would be more maintainable than an inline startswith.
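The dedicated-helper idea from the review could be sketched like this. The prefix values are assumptions based on litellm's "ollama/" and "ollama_chat/" conventions mentioned above; a real implementation might load the list from configuration rather than hardcoding it.

```python
# Sketch of the prefix-list suggestion from the review. Prefixes are
# assumptions from litellm's "ollama/" and "ollama_chat/" conventions.
OLLAMA_MODEL_PREFIXES = ("ollama/", "ollama_chat/")

def is_ollama_model(model: str) -> bool:
    # str.startswith accepts a tuple, so one call covers every prefix.
    return model.startswith(OLLAMA_MODEL_PREFIXES)
```

Centralizing the prefixes in one place also means a future variant (if litellm adds one) needs a single-line change rather than an audit of inline checks.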
@JiwaniZakir Have you had a chance to review this merge request? It's currently causing quite a bit of disruption just because of a single line that overwrites the API key. This change seems to be intended only for supporting the Ollama Cloud API key. Given that, I think it would be sufficient to scope the behavior by checking

To be honest, this line probably shouldn't have been introduced in the first place, as it creates unintended side effects across the entire request flow. Changes that impact the whole execution flow like this should be carefully reviewed upfront, at the merge request stage.

P/S:
For anyone facing the issue, you may want to temporarily set OPENAI.KEY to your provider's API key until a fix is pushed in a release.
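For reference, that setting presumably corresponds to the `[openai]` section of pr-agent's `.secrets.toml`; the file name and section layout here are assumed from pr-agent's usual configuration conventions, so adapt to however you supply settings (e.g. environment variables).

```toml
# .secrets.toml: temporary workaround (assumed layout, adjust to your setup)
[openai]
key = "YOUR_PROVIDER_API_KEY"  # your provider's key, until a release ships the fix
```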
The test issue @yanukadeneth99 raised is valid —
Solution: Fixes Gemini key overwrite when an Ollama Cloud API key is not present.
Reference: #2287, #2292