Replies: 4 comments 2 replies
-
The same happens to me.
-
This could be fixed by this patch:

```lua
gemini_2_5_flash = function()
  return require("codecompanion.adapters.http").extend("gemini", {
    schema = {
      model = {
        default = "gemini-2.5-flash",
      },
      temperature = {
        default = 0.5,
      },
    },
    handlers = { -- This is the patch
      form_messages = function(self, messages)
        local gemini = require("codecompanion.adapters.http.gemini")
        local raw_messages = gemini.handlers.form_messages(self, messages)

        -- Collect and merge system messages
        local system_content = {}
        local other_messages = {}
        for _, msg in ipairs(raw_messages.messages) do
          if msg.role == "system" then
            table.insert(system_content, msg.content)
          else
            table.insert(other_messages, msg)
          end
        end

        -- If we found system messages, merge them and prepend
        if #system_content > 0 then
          local merged_system_message = {
            role = "system",
            content = table.concat(system_content, "\n\n"),
          }
          table.insert(other_messages, 1, merged_system_message)
        end

        raw_messages.messages = other_messages
        return raw_messages
      end,
    },
  })
end,
```

This happens because Gemini doesn't support multiple system prompts: it only reads the latest one. @olimorris shall we apply this patch to https://github.com/olimorris/codecompanion.nvim/blob/main/lua/codecompanion/adapters/http/gemini.lua#L59 ?
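For anyone who wants to try this locally before anything is merged, the factory above would be registered through the plugin's setup and then selected as the chat adapter. A rough sketch follows; the nesting of the `adapters` table under `http` is an assumption inferred from the `codecompanion.adapters.http` module path used in the patch, so verify it against the docs for your installed version:

```lua
-- Sketch only: the adapters.http / strategies.chat table shapes are assumptions,
-- not verified against the current codecompanion.nvim release.
require("codecompanion").setup({
  adapters = {
    http = {
      gemini_2_5_flash = function()
        -- In practice, use the full patched factory from the comment above;
        -- this minimal version only pins the model name.
        return require("codecompanion.adapters.http").extend("gemini", {
          schema = { model = { default = "gemini-2.5-flash" } },
        })
      end,
    },
  },
  strategies = {
    chat = { adapter = "gemini_2_5_flash" },
  },
})
```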
-
All, can someone actually prove this? The link points to an unconfirmed post by a user on the Google forums. I'd be surprised if Google adopted the OpenAI format for their endpoints and didn't account for this. Also, we already have a utility function to merge system prompts together.
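For reference, the merging itself is only a few lines of plain Lua either way. Below is a standalone, hypothetical helper (an illustration of the idea, not the utility referenced above) that can be run with any Lua interpreter to see the collapse behaviour:

```lua
-- Hypothetical helper for illustration only; not codecompanion's own utility.
-- Collapses all system messages in a transcript into a single leading one.
local function merge_system_messages(messages)
  local system_parts, rest = {}, {}
  for _, msg in ipairs(messages) do
    if msg.role == "system" then
      table.insert(system_parts, msg.content)
    else
      table.insert(rest, msg)
    end
  end
  if #system_parts > 0 then
    table.insert(rest, 1, {
      role = "system",
      content = table.concat(system_parts, "\n\n"),
    })
  end
  return rest
end

-- Quick check: two system prompts in, one merged system prompt out.
local merged = merge_system_messages({
  { role = "system", content = "You are a coding assistant." },
  { role = "system", content = "You must reply in German." },
  { role = "user", content = "Explain closures." },
})
for _, msg in ipairs(merged) do
  print(msg.role .. ": " .. msg.content)
end
```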
-
Fixed in #2530
-
The language configuration is not being respected by the models when using the Gemini adapter. This issue is specific to the adapter, as the configuration functions as expected when using Gemini 2.5 Pro through Copilot. An inspection of the debug window confirms that both the language and system prompt options are correctly applied, suggesting the issue occurs downstream.
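For anyone trying to reproduce this, a minimal setup along the following lines should show the behaviour. The `opts.language` and `strategies.chat.adapter` option names are recalled from the plugin's configuration docs and should be treated as assumptions here rather than verified against the current release:

```lua
-- Repro sketch; option names are assumptions, check them against the docs.
require("codecompanion").setup({
  opts = {
    language = "German", -- responses should come back in German
  },
  strategies = {
    chat = { adapter = "gemini" }, -- reportedly ignores the language setting
  },
})
```

With the same `opts.language` value, switching the chat adapter to Copilot and selecting Gemini 2.5 Pro there is reported above to respect the language setting, which points at the Gemini adapter rather than the configuration itself.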