
Conversation


@roomote roomote bot commented Jul 15, 2025

Fixes #5726

Problem

The context window display showed only 200k tokens for Claude 4 Sonnet instead of its actual 1 million token capacity. This affected both the UI and the context condensing functionality.

Solution

Updated Claude 4 model definitions across all providers to reflect the correct 1M token context window:

  • Anthropic provider: Updated claude-sonnet-4-20250514 and claude-opus-4-20250514 from 200k to 1M tokens (see the sketch after this list)
  • Bedrock provider: Updated anthropic.claude-sonnet-4-20250514-v1:0 and anthropic.claude-opus-4-20250514-v1:0 from 200k to 1M tokens
  • Vertex AI provider: Updated claude-sonnet-4@20250514 and claude-opus-4@20250514 from 200k to 1M tokens
  • OpenRouter provider: Updated default model info from 200k to 1M tokens
  • LiteLLM provider: Added logic to detect Claude 4 models and default to 1M tokens instead of 200k
  • Tests: Updated test expectations to reflect the correct 1M token context window
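
For orientation, the provider-side change is essentially a one-field bump in each model definition. The sketch below is illustrative only: the field names and the non-context-window values are assumptions based on a typical model-info shape, not code copied from anthropic.ts.

```typescript
// Illustrative sketch (field names and non-context-window values are assumptions).
// The fix itself is the contextWindow bump from 200k to 1M tokens.
export const anthropicModels = {
  "claude-sonnet-4-20250514": {
    contextWindow: 1_000_000, // was 200_000
    supportsImages: true,
    supportsPromptCache: true,
  },
  "claude-opus-4-20250514": {
    contextWindow: 1_000_000, // was 200_000
    supportsImages: true,
    supportsPromptCache: true,
  },
} as const
```

The Bedrock, Vertex AI, and OpenRouter definitions get the equivalent one-line change for their respective model IDs.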

Files Changed

  • packages/types/src/providers/anthropic.ts
  • packages/types/src/providers/bedrock.ts
  • packages/types/src/providers/vertex.ts
  • packages/types/src/providers/openrouter.ts
  • src/api/providers/fetchers/litellm.ts
  • src/core/webview/__tests__/ClineProvider.spec.ts

Testing

  • All existing tests pass
  • Updated test expectations to verify 1M token context window
  • Linting and type checking pass

Important

Update Claude 4 Sonnet models to reflect 1M token context window across multiple providers and update tests accordingly.

  • Behavior:
    • Update context window for Claude 4 Sonnet models to 1M tokens in anthropic.ts, bedrock.ts, vertex.ts, openrouter.ts, and litellm.ts.
    • Update test expectations in ClineProvider.spec.ts to reflect 1M token context window.
  • Providers:
    • Anthropic: Update claude-sonnet-4-20250514 and claude-opus-4-20250514.
    • Bedrock: Update anthropic.claude-sonnet-4-20250514-v1:0 and anthropic.claude-opus-4-20250514-v1:0.
    • Vertex AI: Update claude-sonnet-4@20250514 and claude-opus-4@20250514.
    • OpenRouter: Update default model info.
    • LiteLLM: Add logic to detect Claude 4 models and set context window to 1M tokens (see the sketch below).
  • Testing:
    • All existing tests pass.
    • Update test expectations to verify 1M token context window.
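
A minimal sketch of the LiteLLM fallback described above, assuming the fetcher knows the model ID and an optionally reported context window; the helper name and regex are hypothetical, not the actual litellm.ts code.

```typescript
// Hypothetical helper showing the intent of the change: when LiteLLM does not report
// a context window for a Claude 4 model, default to 1M tokens rather than 200k.
const CLAUDE_4_CONTEXT_WINDOW = 1_000_000
const DEFAULT_CONTEXT_WINDOW = 200_000

function resolveContextWindow(modelId: string, reportedWindow?: number): number {
  // Prefer whatever LiteLLM reports, when it reports anything usable.
  if (reportedWindow && reportedWindow > 0) {
    return reportedWindow
  }
  // Otherwise fall back: Claude 4 models get the larger default.
  const isClaude4 = /claude-(sonnet|opus)-4/i.test(modelId)
  return isClaude4 ? CLAUDE_4_CONTEXT_WINDOW : DEFAULT_CONTEXT_WINDOW
}

// resolveContextWindow("anthropic/claude-sonnet-4-20250514") -> 1_000_000
// resolveContextWindow("gpt-4o")                             -> 200_000
```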

This description was created by Ellipsis for 3a2278a.

- Update claude-sonnet-4-20250514 and claude-opus-4-20250514 from 200k to 1M tokens across all providers
- Fix Anthropic provider model definitions
- Fix Bedrock provider model definitions
- Fix Vertex AI provider model definitions
- Fix OpenRouter default model info
- Update LiteLLM fallback logic to default to 1M tokens for Claude 4 models
- Update test expectations to reflect the correct 1M token context window (sketched below)

Fixes #5726
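
For reference, the kind of assertion the spec update implies might look like the following; the import path, test runner, and structure are assumptions for illustration, not the literal ClineProvider.spec.ts code.

```typescript
// Sketch only: illustrates the shape of the updated expectation.
import { describe, expect, it } from "vitest"
import { anthropicModels } from "@roo-code/types" // import path is an assumption

describe("Claude 4 context window", () => {
  it("reports 1M tokens for claude-sonnet-4-20250514", () => {
    expect(anthropicModels["claude-sonnet-4-20250514"].contextWindow).toBe(1_000_000)
  })
})
```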
@roomote roomote bot requested review from cte, jr and mrubens as code owners July 15, 2025 08:45
@dosubot dosubot bot added the size:S (This PR changes 10-29 lines, ignoring generated files.) and bug (Something isn't working) labels Jul 15, 2025
@hannesrudolph hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.) label Jul 15, 2025
@mrubens mrubens closed this Jul 15, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Jul 15, 2025
@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Jul 15, 2025
