Expose Context Window Size for GitHub Copilot Models #2467
Closed
PhilippOesch started this conversation in Ideas
Replies: 1 comment
Hey @PhilippOesch. Yep, great idea. This was on my radar after I made
Hello everyone,
This has quietly but surely become my go-to plugin for LLM pair programming. Thanks to the maintainer for the amazing work.
Recently, I've done some research and read about other people's experiences with LLMs. A recurring theme, which I can confirm from my own experience, is that result quality tends to drop as more of the LLM's context window is filled.
Fortunately, the API of the adapter I use (Copilot) already provides this information.
I looked into the current implementation and noticed that the Copilot adapter already reads other token limits when retrieving the available models from the API. In practice, exposing the context window limit as well would only require a one-line addition in lua/codecompanion/adapters/http/copilot/get_models.lua.
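For illustration, a minimal sketch of what that addition could look like. The loop structure and the field names under capabilities.limits (in particular max_context_window_tokens) are my assumptions about the Copilot /models payload and the get_models.lua internals, not a verbatim patch:

```lua
-- Hypothetical sketch, not the actual get_models.lua code: parse the
-- /models response and keep the per-model token limits. Field names under
-- `capabilities.limits` are assumptions about the Copilot payload.
local function parse_models(response)
  local models = {}
  for _, model in ipairs(response.data or {}) do
    local limits = (model.capabilities and model.capabilities.limits) or {}
    models[model.id] = {
      -- a limit the adapter reportedly already reads today:
      max_output_tokens = limits.max_output_tokens,
      -- the proposed one-line addition: expose the context window size
      max_context_window_tokens = limits.max_context_window_tokens,
    }
  end
  return models
end
```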
This would already be enough to, for example, display how much of the context window is currently being used; a rough sketch of such a display follows below.
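Purely as a sketch of what a consumer of that value could do, assuming a running token count is available; the names used_tokens and max_context_window_tokens below are placeholders, not existing plugin API:

```lua
-- Hypothetical usage display, assuming `used_tokens` is the running token
-- count for the chat and `max_context_window_tokens` was exposed as above.
local function context_usage_label(used_tokens, max_context_window_tokens)
  if not max_context_window_tokens or max_context_window_tokens == 0 then
    -- no limit known for this model: fall back to a plain count
    return tostring(used_tokens) .. " tokens"
  end
  local pct = math.floor((used_tokens / max_context_window_tokens) * 100 + 0.5)
  return string.format("%d/%d tokens (%d%%)",
    used_tokens, max_context_window_tokens, pct)
end

-- e.g. context_usage_label(12800, 128000) --> "12800/128000 tokens (10%)"
```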
I would, of course, be happy to open a PR with this one-liner.
It’s a small and non-intrusive change that would make me, and maybe some others, very happy.