
Conversation

@RaySinner
Contributor

@RaySinner RaySinner commented Jan 5, 2025

Thanks to this provider, Roo-Cline gains free, unlimited use of the models provided by GitHub Copilot, namely:

[screenshot: list of models available through GitHub Copilot]

Attention: The provider is only available in the original version of VS Code, because Copilot is proprietary functionality.

To use the provider, you need a Pro subscription to Copilot, or you can create a new Microsoft account every 30 days and activate a 30-day free trial on it. But believe me, free unlimited Claude 3.5 Sonnet is worth it.

Description

This PR adds support for VS Code Language Model provider integration, allowing users to leverage VS Code's built-in language model capabilities directly within Cline. The integration includes a complete provider implementation, transformation logic for message formatting, and UI components for configuration.

Key features:

  • Add vscode-lm provider implementation with TypeScript type definitions
  • Add provider transformation logic for message format compatibility
  • Update API options UI to support vscode-lm configuration
  • Add comprehensive provider documentation
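
For reviewers unfamiliar with the API surface this wraps: the VS Code LM API boils down to selecting a chat model and streaming a response. A minimal sketch (the helper name and model family below are illustrative assumptions, not the PR's actual code):

```typescript
import * as vscode from "vscode";

// Hypothetical helper, not the PR's actual code; the model family is an
// assumption for illustration.
async function askCopilot(prompt: string, token: vscode.CancellationToken): Promise<string> {
	// Pick a chat model exposed through the VS Code LM API (requires the
	// Copilot Chat extension to be installed and signed in).
	const [model] = await vscode.lm.selectChatModels({ vendor: "copilot", family: "claude-3.5-sonnet" });
	if (!model) {
		throw new Error("No matching language model available");
	}

	const messages = [vscode.LanguageModelChatMessage.User(prompt)];
	const response = await model.sendRequest(messages, {}, token);

	// The reply arrives as a stream of text fragments; concatenate them.
	let text = "";
	for await (const fragment of response.text) {
		text += fragment;
	}
	return text;
}
```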

Type of change

  • New feature

How Has This Been Tested?

  • Tested provider initialization and configuration
  • Verified message transformation logic with different input formats
  • Tested UI integration and settings persistence
  • Verified documentation accuracy and completeness
  • All existing tests pass without regression

Additional context

This provider integration enables seamless use of VS Code's language model capabilities, enhancing the extension's functionality while maintaining compatibility with existing provider interfaces. The implementation follows VS Code's extension API best practices and includes proper TypeScript definitions for improved development experience.

Reviewers

@original-fork-owner


Important

This PR integrates VS Code's Language Model API into the extension, adding a new provider for GitHub Copilot models and updating UI and configuration components to support this integration.

  • Behavior:
    • Adds VsCodeLmHandler to integrate VS Code Language Model API in src/api/providers/vscode-lm.ts.
    • Supports model selection and configuration via ApiOptions.tsx.
    • Handles model changes and resets client in VsCodeLmHandler.
  • UI:
    • Updates ApiOptions.tsx to include VS Code LM API as a provider option.
    • Adds dropdown for selecting language models in ApiOptions.tsx.
  • Configuration:
    • Adds vsCodeLmModelSelector to package.json and ClineProvider.ts for model configuration.
  • Utilities:
    • Adds stringifyVsCodeLmModelSelector in vsCodeSelectorUtils.ts for model selector string conversion.
  • Misc:
    • Updates ExtensionMessage.ts and WebviewMessage.ts to handle VS Code LM API messages.
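
As an aside, the stringifyVsCodeLmModelSelector utility listed above plausibly looks something like the following; this is an assumed implementation for illustration, not the PR's code:

```typescript
import type { LanguageModelChatSelector } from "vscode";

// Assumed implementation, not the PR's code: joins the selector's defined
// parts into one stable string, e.g.
// { vendor: "copilot", family: "claude-3.5-sonnet" } -> "copilot/claude-3.5-sonnet".
export function stringifyVsCodeLmModelSelector(selector: LanguageModelChatSelector): string {
	return [selector.vendor, selector.family, selector.version, selector.id]
		.filter((part): part is string => typeof part === "string")
		.join("/");
}
```

Flattening the selector to a single string makes it easy to persist in settings and display in the model dropdown.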

This description was created by Ellipsis for cc162c1b906cd6735069685112a771b86368ca51. It will automatically update as commits are pushed.

@changeset-bot

changeset-bot bot commented Jan 5, 2025

⚠️ No Changeset found

Latest commit: d5fd2bb

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

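
For reference, a changeset is just a small markdown file under .changeset/. A minimal example might look like the following (the package name and bump type here are assumptions, not taken from this repo):

```md
---
"roo-cline": minor
---

Add VS Code Language Model (Copilot) provider
```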

@mrubens
Collaborator

mrubens commented Jan 5, 2025

Awesome! I’ll take a look.

@RaySinner
Contributor Author

Awesome! I’ll take a look.

It looks like all hope rests on you, bro! If you can look past the crooked syntax and the lack of unit tests, the main thing to understand about this provider is the free, unlimited use of OpenAI 4o, o1, and Claude 3.5 Sonnet. I've been using this provider for two weeks now and I'm very pleased! I think this is the most important expansion of capabilities since the Roo branch of Cline. All that remains is to write tests, clean up the code, and probably add a check for the presence of the VS Code LM API in the platform so that the provider does not appear in open-source IDEs. Don't let the idea die! :)

@RaySinner
Contributor Author

I cleaned up the code a little and updated the branch to the current version.

@Bigsy

Bigsy commented Jan 8, 2025

Not sure if this is based on this Cline PR, cline/cline#970, but I've been using a build of Cline with that code in it for a few days and it's been working amazingly well. I really hope this gets some attention; I think it would be a massive game changer for this fork/project.

@mrubens
Collaborator

mrubens commented Jan 8, 2025

Not sure if this is based on this Cline PR, cline#970, but I've been using a build of Cline with that code in it for a few days and it's been working amazingly well. I really hope this gets some attention; I think it would be a massive game changer for this fork/project.

Thanks! I've had a few things ahead of this but am very excited to review.

@mrubens
Collaborator

mrubens commented Jan 8, 2025

What's the best way for me to test this out? I haven't used the VS Code LM API before.

@julesmons

julesmons commented Jan 8, 2025

Hello RooCline!

While reviewing this PR, please read the findings in the original PR this code is indeed based upon.

The original code of this provider did work, but it has MAJOR inefficiencies that waste a lot of resources.
You might think a few hundred wasted tokens and/or extra milliseconds won't be a big deal, but this snowballs and can sometimes crash Cline itself or completely brick Copilot with a rate limit (far earlier than it should).
The rate limit will also lock you out of regular Copilot chat, which is inconvenient.

I want to clarify that I absolutely do not want to 'claim credit' and would love for this amazing feature to be rolled out; but this provider, in its current state, has a lot of critical issues (as I've described in the original PR).

Short version:

  • Usage of Copilot to back Cline is not unlimited.
  • The getModel info method will never return correct info (the this.client workaround as I implemented it does not work correctly).
  • The countTokens method the VS Code LM API provides is VERY performance-hungry. I tried implementing a cache, but it does not work correctly because of the provider's life-cycle.
  • Currently Cline API providers have no "dispose" functionality. This should also be implemented in Cline core for cancellation to work properly.
  • Cline core code detects "claude" in the model id and forces the context window to 200K, which Copilot can't handle, freezing the task process entirely. This also breaks the sliding-context-window logic, causing the provider to submit the entire message history to Copilot on every request. This can trigger the rate limit after just a few messages (depending on history).
  • Model info cannot be hardcoded, because the VS Code LM API is not just Copilot. Copilot is one driver you can use (vendor/family system), so hardcoding magic numbers will break other vendors.

To address these issues:

  • getModel should become async (a breaking change, since all other providers need to adapt).
  • A token count should be stored in the message history so that no performance is wasted; this way messages only have to be tokenized once (also a breaking change) — see the sketch after this list.
  • Some minor refactoring of Cline core to accommodate the above (no clue what has already been done in RooCline).
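
A minimal sketch of the token-count point, assuming the VS Code LM API types (this is illustrative, not code from either PR):

```typescript
import * as vscode from "vscode";

// Illustrative sketch, not code from either PR: store each message's token
// count alongside it so the expensive countTokens call runs only once per
// message instead of on every request.
interface CountedMessage {
	message: vscode.LanguageModelChatMessage;
	tokenCount?: number; // filled in lazily on first use, then reused
}

async function totalTokens(
	model: vscode.LanguageModelChat,
	history: CountedMessage[],
	token: vscode.CancellationToken,
): Promise<number> {
	let total = 0;
	for (const entry of history) {
		if (entry.tokenCount === undefined) {
			entry.tokenCount = await model.countTokens(entry.message, token);
		}
		total += entry.tokenCount;
	}
	return total;
}
```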

As you can see, this provider requires a slew of breaking changes to function optimally.
This is why the original PR has hit a crossroads and I've decided to fork and create Recline.

You're welcome to go and have a look at how I've implemented the fixes for the problems described, and translate them to RooCline-compatible code. However, I must warn you: I've already deviated from the Cline code quite a lot and have no idea whether any of the changes I've made are stable enough to actually be released.

I cannot stress this enough: I don't think the original provider code is production-ready just yet.
Make sure you address each of these issues properly before merging!

Cheers!

@julesmons

julesmons commented Jan 8, 2025

What's the best way for me to test this out? I haven't used the VS Code LM API before

You'll need to install the Copilot and Copilot Chat extensions for VS Code.
These extensions use the built-in LM API to allow other extensions to send requests to the LLM.

Sidenote: You can use it either with a subscription or with the new GitHub Copilot Free.
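
A quick way to verify the setup, assuming you can run code in an extension host (the function below is a hypothetical smoke test, not part of this PR):

```typescript
import * as vscode from "vscode";

// Hypothetical smoke test: list the models Copilot exposes through the LM API.
export async function listCopilotModels(): Promise<void> {
	const models = await vscode.lm.selectChatModels({ vendor: "copilot" });
	for (const model of models) {
		console.log(`${model.vendor}/${model.family} (id: ${model.id}, max input: ${model.maxInputTokens})`);
	}
}
```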

@mrubens
Collaborator

mrubens commented Jan 8, 2025

Thank you @julesmons, I'll take a look at your repo!

I did get the list of models showing up with the Copilot account but now I see this error when I try to start a task - is there a step I'm missing?

Could not resolve authentication method. Expected either apiKey or authToken to be set. Or for one of the "X-Api-Key" or "Authorization" headers to be explicitly omitted

@julesmons
julesmons commented Jan 8, 2025

Thank you @julesmons, I'll take a look at your repo!

I did get the list of models showing up with the Copilot account but now I see this error when I try to start a task - is there a step I'm missing?

Could not resolve authentication method. Expected either apiKey or authToken to be set. Or for one of the "X-Api-Key" or "Authorization" headers to be explicitly omitted

Sadly, I don't know what @RaySinner has changed, so I have no clue why this happens.
The original provider did not throw this error.

@mrubens
Collaborator

mrubens commented Jan 8, 2025

Thank you @julesmons, I'll take a look at your repo!
I did get the list of models showing up with the Copilot account but now I see this error when I try to start a task - is there a step I'm missing?

Could not resolve authentication method. Expected either apiKey or authToken to be set. Or for one of the "X-Api-Key" or "Authorization" headers to be explicitly omitted

Sadly, I don't know what @RaySinner has changed, so I have no clue why this happens. The original provider did not throw this error.

Er, sorry to bug you - it's something I broke when trying to clean up whitespace on this branch. As you all were...

@RaySinner
Contributor Author

I didn't know it had so many critical errors, and I haven't run into any restrictions with it yet. I like how it works. But he's right, it's based on his code, so this PR can be closed; you'll figure out the rest yourselves.

Although I'm sure he's just being greedy and you won't reach an agreement, and Roo-Cline will simply be left without such a wonderful provider.

@RaySinner RaySinner closed this Jan 9, 2025
@julesmons

julesmons commented Jan 9, 2025

Greedy? Agreement?? That's absolutely not the point of the message I left...
If that were true, I wouldn't have committed to an open-source repository.

The point was to make sure stable code got merged,
making sure the provider doesn't freeze or crash and doesn't needlessly overrun GitHub Copilot's servers with wasteful requests in the process (as I can imagine a lot of people would be excited to use this provider).

I was merely listing known issues so that not only can the RooCline dev(s) make informed decisions about their codebase, but you also have pointers for improving the existing code (as I've directed my attention elsewhere for the time being).

If any developers are looking to pick this up, feel free to message me for additional information.

@mrubens
Collaborator

mrubens commented Jan 9, 2025

I really appreciate both of you moving this forward. I'll take a closer look and see if we can get this out in a responsible way. Thanks so much!

@Katsumata-ai

Hi, vscode-lm:copilot/claude-3.7-sonnet doesn't work in Cline; when I use it, it shows the following message: Request Failed: 400 {"error":{"message":"Model is not supported for this request.","param":"model","code":"model_not_supported","type":"invalid_request_error"}}. I should point out that I have the pro version of GitHub Copilot and that vscode-lm:copilot/claude-3.5-sonnet works fine in Cline. Does anyone have a solution?
