Conversation

@daniel-lxs

Description

Fixes #5075

This PR fixes the issue where LM Studio models reported a context length of "1" instead of their actual values. The root cause was that LmStudioHandler.getModel() returned static defaults instead of the dynamically fetched model information.

Changes Made

  • Integrated LmStudioHandler with centralized model cache: Updated LmStudioHandler.fetchModel() to use getModels({ provider: "lmstudio" }) from the model cache instead of calling getLMStudioModels() directly (see the sketch after this list)
  • Simplified fetching pattern: Removed the custom getModelWithFetch() method and updated createMessage() and completePrompt() to use await this.fetchModel() directly
  • Consistency with other providers: Now follows the same pattern as RequestyHandler and other providers that use the centralized model cache
  • Updated test suite: Modified tests to mock getModels instead of getLMStudioModels and updated expectations to match the new implementation
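For illustration, the new fetching path looks roughly like the sketch below. The cache signature and ModelInfo shape are simplified assumptions based on this description, not the project's exact types.

```typescript
// Minimal sketch, assuming a simplified cache API and model-info shape.
interface ModelInfo {
    contextWindow: number
    maxTokens?: number
}

// Stand-in for the centralized model cache shared with providers like Requesty.
declare function getModels(options: { provider: string }): Promise<Record<string, ModelInfo>>

class LmStudioHandler {
    private models: Record<string, ModelInfo> = {}

    // Refresh the handler's model table from the shared cache instead of
    // calling getLMStudioModels() directly.
    async fetchModel(): Promise<Record<string, ModelInfo>> {
        this.models = await getModels({ provider: "lmstudio" })
        return this.models
    }

    // Call sites now await the fetch inline; no separate getModelWithFetch().
    async completePrompt(_prompt: string): Promise<string> {
        await this.fetchModel()
        // ...build the request using the freshly cached model info...
        return ""
    }
}
```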

Testing

  • All existing tests pass
  • Added tests for new functionality
  • Manual testing completed:
    • LM Studio handler tests: 12/12 passing
    • LM Studio fetcher tests: 9/9 passing
    • All provider tests: 337/338 passing (1 skipped)
    • Linting and type checking pass

Verification of Acceptance Criteria

  • LM Studio models now show correct context length instead of "1"
  • Integration with centralized model cache ensures consistent model data
  • No breaking changes to existing functionality
  • Follows established patterns used by other providers

Checklist

  • Code follows project style guidelines
  • Self-review completed
  • Comments added for complex logic
  • No breaking changes
  • Tests updated and passing
  • Linting and type checking pass

Files Changed

  • src/api/providers/lm-studio.ts - Updated to use centralized model cache
  • src/api/providers/__tests__/lmstudio.spec.ts - Updated tests to match new implementation (a sketch of the new mocking shape follows below)
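A hypothetical shape for the updated spec is sketched below; the mocked module path, the handler constructor options, and the getModel() return shape are illustrative assumptions rather than the exact code.

```typescript
// Sketch of mocking the model cache (not the LM Studio fetcher) in vitest.
import { vi, it, expect } from "vitest"
import { LmStudioHandler } from "../lm-studio"

// The mocked path and model entry are assumptions for illustration.
vi.mock("../fetchers/modelCache", () => ({
    getModels: vi.fn().mockResolvedValue({
        "llama-3-8b": { contextWindow: 8192, maxTokens: 8192 },
    }),
}))

it("reports the cached context window instead of the default", async () => {
    const handler = new LmStudioHandler({ lmStudioModelId: "llama-3-8b" })
    await handler.fetchModel()
    expect(handler.getModel().info.contextWindow).toBe(8192)
})
```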

How it works

Before: getModel() returned the static openAiModelInfoSaneDefaults (context length 128,000, maxTokens -1), regardless of which model was actually loaded in LM Studio

After: fetchModel() retrieves actual model information from LM Studio via the model cache, including the correct context length from the model's metadata

This ensures that users will see the actual context length of their LM Studio models instead of the generic default value.
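Pictured as a before/after sketch (a simplification under assumed names; openAiModelInfoSaneDefaults stands in for the shared default object mentioned above):

```typescript
// Illustrative contrast only; option and field names are assumed.
const openAiModelInfoSaneDefaults = { contextWindow: 128_000, maxTokens: -1 }

type ModelTable = Record<string, { contextWindow: number; maxTokens: number }>

// Before: static defaults regardless of which model LM Studio has loaded.
function getModelBefore(modelId: string) {
    return { id: modelId, info: openAiModelInfoSaneDefaults }
}

// After: prefer the entry cached by fetchModel(); fall back to the defaults
// only when the cache has no record for this model.
function getModelAfter(modelId: string, models: ModelTable) {
    return { id: modelId, info: models[modelId] ?? openAiModelInfoSaneDefaults }
}
```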

Commit summary

  • Add fetchModel() method to LmStudioHandler to dynamically fetch model info via the centralized cache
  • Update getModel() to return the actual context length instead of the default value of 1
  • Maintain backward compatibility with a graceful fallback to defaults when no cached entry is available
  • Add comprehensive tests for the new functionality with proper error handling

@delve-auditor

delve-auditor bot commented Jun 24, 2025

No security or compliance issues detected. Reviewed everything up to 258e82a.

Security Overview

  • 🔎 Scanned files: 2 changed file(s)

Detected Code Changes

  Bug Fix
    • lmstudio.spec.ts: Update tests to use model cache mocking
    • lm-studio.ts: Update model fetching to use centralized cache; improve model ID matching logic

  Refactor
    • Workflow.xml: Remove redundant PR check monitoring steps
    • common_patterns.xml: Remove PR check watching pattern
    • tool_usage.xml: Remove PR check watching tool priority

Reply to this PR with @delve-auditor followed by a description of what change you want and we'll auto-submit a change to this PR to implement it.

@daniel-lxs daniel-lxs moved this from Triage to PR [Draft / In Progress] in Roo Code Roadmap Jun 24, 2025
@Angular-Angel

This fix does not solve the problem for me at the current time.

@github-project-automation github-project-automation bot moved this from PR [Draft / In Progress] to Done in Roo Code Roadmap Jul 7, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Jul 7, 2025

Development

Successfully merging this pull request may close these issues.

Roo does not correctly detect context length for LM Studio models
