Request to Set gpt-4.1 as Default Model for Title Generation in Copilot Adapter #34
Replies: 7 comments
-
As far as I know, setting a cheaper model per adapter is difficult. But I completely understand the issue here. For now I made sure to add a warning note in the README, and I will also mention this in the discussion at codecompanion. Thanks for raising this.
-
Thank you for your thorough responses and for updating the README. It's completely understandable and much appreciated. Thanks again!
-
Hello! Thank you for your work on the plugin, and forgive me for resurrecting this thread. I have been trying for more than a few hours to set the `generation_opts` for both summary and title to the following:
But I keep getting the same error during title and summary generation.
How can I turn off thinking if I'd like to use a cheaper model for the summaries? With kind regards,
-
Yes, you can use a cheaper model. You can just set a string for the adapter:

```lua
enabled = true,
title_generation_opts = {
  adapter = "anthropic",
  model = "claude-3-5-haiku-latest",
},
```

Since claude-3-5-haiku doesn't support reasoning, as declared here, the reasoning should be automatically disabled. Please let me know if that doesn't work. Thanks
-
Whoa. @ravitemer That was a rapid response. I tried in many different ways and I still get the same error message.
I'm not sure if my modification of the adapter could be the reason for the error, as this is the way I am passing the token through.
Edit: I also tried using the exact model key from the code you referred to: 'claude-3-5-haiku-20241022'. Same issue.
-
@carbonbasedman Can you try adding this schema field as well, along with the env, to explicitly override the extended_thinking property?

```lua
schema = {
  extended_thinking = {
    default = false,
  },
},
```

I think this should fix it!
-
Yes! It did! However, I also had to adjust the 'max_tokens' parameter, in case anyone needs to apply the same fix.
Thank you a lot! Especially for the speed of your support! Cheers!
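Putting the pieces from this thread together, a combined override might look like the sketch below. This is an assumption-heavy reconstruction: the thread confirms that `title_generation_opts` takes `adapter` and `model`, and that a `schema` override for `extended_thinking` plus a `max_tokens` adjustment were needed, but it does not show where the `schema` table sits or which `max_tokens` value was used, so verify the exact shape against the plugin's README.

```lua
-- Sketch only: option nesting and the max_tokens value are assumptions,
-- not confirmed by this thread.
title_generation_opts = {
  adapter = "anthropic",
  model = "claude-3-5-haiku-latest",
  -- ASSUMPTION: schema overrides are accepted alongside the adapter options.
  schema = {
    extended_thinking = {
      default = false, -- disable thinking, as suggested above
    },
    max_tokens = {
      default = 1024, -- ASSUMPTION: the thread says max_tokens needed adjusting, value not given
    },
  },
},
```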
-
Hello,
After the introduction of premium requests in Copilot, I've noticed that the default title generation consumes premium requests when using claude-sonnet-4, because it uses whichever model is set in Copilot. I believe title generation doesn't need such a powerful LLM, so I kindly request that, if possible, the Copilot adapter default to gpt-4.1, which is the default model in Copilot and does not consume premium requests.
I understand this can be changed in the configuration, but I feel it would be beneficial for this to be the default, especially since many users may not be fully aware of how premium requests are being consumed.
Thank you for considering this!
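For anyone wanting the workaround the post alludes to while the default is unchanged, a per-adapter model can be pinned for title generation using the same `title_generation_opts` shape shown later in this thread. A minimal sketch, assuming the Copilot adapter accepts a model string the same way the anthropic example does:

```lua
-- Sketch based on the title_generation_opts shape used elsewhere in this
-- thread; verify option names against the plugin's README.
title_generation_opts = {
  adapter = "copilot",
  model = "gpt-4.1", -- Copilot's default model; does not consume premium requests
},
```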