Conversation

@roomote (bot, Contributor) commented on Jul 23, 2025

This PR fixes issue #6112, where the token count and context length progress bar don't update properly when using VS Code LM API as the API Provider.

The VS Code LM provider was only yielding token usage information once at the end of the stream, which prevented the UI from updating the context window progress bar during streaming.

Changes (sketched in code below):

  • Added initial usage yield with input tokens at stream start
  • Added periodic token updates during streaming (every 500 chars)
  • Included cache token fields for consistency with other providers

Fixes #6112
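
A minimal sketch of the streaming flow described above, not the actual VsCodeLmHandler code: the ApiStreamChunk shape, the streamWithUsage name, and the countTokens callback are assumptions for illustration.

```typescript
type ApiStreamChunk =
	| { type: "text"; text: string }
	| {
			type: "usage"
			inputTokens: number
			outputTokens: number
			cacheWriteTokens: number
			cacheReadTokens: number
	  }

async function* streamWithUsage(
	response: AsyncIterable<string>,
	inputTokens: number,
	countTokens: (text: string) => number,
): AsyncGenerator<ApiStreamChunk> {
	// Initial usage yield so the context window bar can render immediately.
	yield { type: "usage", inputTokens, outputTokens: 0, cacheWriteTokens: 0, cacheReadTokens: 0 }

	let accumulated = ""
	let lastReportedLength = 0

	for await (const text of response) {
		accumulated += text
		yield { type: "text", text }

		// Periodic usage updates, roughly every 500 characters of output.
		if (accumulated.length - lastReportedLength >= 500) {
			lastReportedLength = accumulated.length
			yield {
				type: "usage",
				inputTokens,
				outputTokens: countTokens(accumulated),
				cacheWriteTokens: 0,
				cacheReadTokens: 0,
			}
		}
	}

	// Final usage report at stream end; cache fields are 0 for consistency.
	yield {
		type: "usage",
		inputTokens,
		outputTokens: countTokens(accumulated),
		cacheWriteTokens: 0,
		cacheReadTokens: 0,
	}
}
```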


Important

Improves token usage reporting in VsCodeLmHandler by adding initial and periodic updates during streaming, addressing issue #6112.

  • Behavior:
    • VsCodeLmHandler in vscode-lm.ts now yields initial token usage at stream start and periodic updates every 500 characters.
    • Final token usage is reported at stream end, with cache token fields set to 0.
  • Tests:
    • Updated vscode-lm.spec.ts to expect initial, periodic, and final token usage chunks.
    • Added test for getApiProtocol in provider-settings.test.ts to return 'openai' for the vscode-lm provider (see the sketch below).

This description was created by Ellipsis for 048fcf7.
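
A minimal sketch of the provider-settings test mentioned above, assuming a vitest setup; the import path and the getApiProtocol signature are guesses, not taken from the repository.

```typescript
import { describe, it, expect } from "vitest"
import { getApiProtocol } from "../provider-settings"

describe("getApiProtocol", () => {
	it("returns 'openai' for the vscode-lm provider", () => {
		expect(getApiProtocol("vscode-lm")).toBe("openai")
	})
})
```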

- Add initial usage yield with input tokens at stream start
- Yield periodic token updates during streaming (every 500 chars)
- Include cache token fields (set to 0) for consistency with other providers
- This ensures the context window progress bar updates properly during streaming
@roomote (bot) requested review from cte, jr, and mrubens as code owners on July 23, 2025 13:31
@dosubot (bot) added the size:M (This PR changes 30-99 lines, ignoring generated files) and bug (Something isn't working) labels on Jul 23, 2025
@hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels) label on Jul 23, 2025
The PR added an initial usage yield at the start of the stream, which
causes tests to receive 3 chunks instead of 2. Updated tests to:
- Expect 3 chunks (initial usage + text + final usage), as sketched below
- Handle the new chunk ordering correctly
- Fix error handling test to account for initial usage before error
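
A hedged sketch of the updated expectation, reusing the hypothetical streamWithUsage generator and ApiStreamChunk type from the sketch above and assuming a vitest setup; the real assertions live in vscode-lm.spec.ts.

```typescript
import { it, expect } from "vitest"

it("yields initial usage, then text, then final usage", async () => {
	async function* fakeResponse() {
		yield "Hello"
	}

	const chunks: ApiStreamChunk[] = []
	for await (const chunk of streamWithUsage(fakeResponse(), 10, (t) => t.length)) {
		chunks.push(chunk)
	}

	// Three chunks instead of two: the initial usage yield comes first.
	expect(chunks).toHaveLength(3)
	expect(chunks[0].type).toBe("usage")
	expect(chunks[1]).toEqual({ type: "text", text: "Hello" })
	expect(chunks[2].type).toBe("usage")
})
```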
@daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap on Jul 24, 2025
@hannesrudolph added the PR - Needs Preliminary Review label and removed the Issue/PR - Triage label on Jul 24, 2025
@daniel-lxs (Member) commented

This just returns 0 tokens because VSCode LM API doesn't return token info. We should probably use tiktoken instead.
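
A rough sketch of the suggested alternative: approximate counts locally with the js-tiktoken port of tiktoken, since the VS Code LM API reports no usage. The o200k_base encoding is an assumption, not something the thread specifies.

```typescript
import { Tiktoken } from "js-tiktoken/lite"
import o200k_base from "js-tiktoken/ranks/o200k_base"

const encoder = new Tiktoken(o200k_base)

// Could back the countTokens callback in the sketch above.
function countTokens(text: string): number {
	return encoder.encode(text).length
}

console.log(countTokens("Hello, world!"))
```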

@daniel-lxs closed this on Jul 28, 2025
@github-project-automation (bot) moved this from New to Done in Roo Code Roadmap on Jul 28, 2025
@github-project-automation (bot) moved this from PR [Needs Prelim Review] to Done in Roo Code Roadmap on Jul 28, 2025