Conversation

@roomote
Contributor

@roomote roomote bot commented Aug 11, 2025

Summary

This PR fixes an issue where LM Studio models appeared twice in the Provider Configuration Profile: once under the model path and once under the model ID.

Problem

When a model was both downloaded and loaded in LM Studio, it was added to the models list twice, under different keys:

  • Downloaded models used model.path as the key
  • Loaded models used model.modelKey as the key

This caused the same model to appear as duplicate entries in the UI.
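
For illustration, here is a minimal TypeScript sketch of the old behavior. The model shape and the example path/modelKey values are hypothetical simplifications; the real LM Studio SDK objects and the actual fetcher code are richer.

```typescript
// Hypothetical, simplified shape; the real LM Studio SDK exposes richer model info.
interface RawLMStudioModel {
	path: string // e.g. "publisher/some-model"
	modelKey: string // e.g. "some-model"
}

const downloadedModels: RawLMStudioModel[] = [{ path: "publisher/some-model", modelKey: "some-model" }]
const loadedModels: RawLMStudioModel[] = [{ path: "publisher/some-model", modelKey: "some-model" }]

const models: Record<string, RawLMStudioModel> = {}

// Downloaded models were keyed by path...
for (const m of downloadedModels) {
	models[m.path] = m
}
// ...while loaded models were keyed by modelKey, so the same model got a second entry.
for (const m of loadedModels) {
	models[m.modelKey] = m
}

console.log(Object.keys(models)) // ["publisher/some-model", "some-model"] -> shown twice in the UI
```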

Solution

Use model.path consistently as the key for both downloaded and loaded models. This ensures each model appears only once in the configuration profile, regardless of whether it's just downloaded or also loaded.
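
A minimal sketch of the fix under the same assumptions (the buildModelMap helper and sample values are hypothetical; the real change lives in getLMStudioModels()): keying both passes by path means the loaded pass simply overwrites the entry created by the downloaded pass.

```typescript
// Hypothetical, simplified shape; the real code parses SDK objects via parseLMStudioModel().
type RawLMStudioModel = { path: string; modelKey: string }

function buildModelMap(downloaded: RawLMStudioModel[], loaded: RawLMStudioModel[]): Record<string, RawLMStudioModel> {
	const models: Record<string, RawLMStudioModel> = {}
	const modelsWithLoadedDetails = new Set<string>()

	for (const m of downloaded) {
		models[m.path] = m
	}
	for (const m of loaded) {
		// Use path as the key to avoid duplicates when a model is both downloaded and loaded.
		models[m.path] = m
		modelsWithLoadedDetails.add(m.path)
	}
	return models
}

// A model that is both downloaded and loaded now yields a single entry.
const sample = { path: "publisher/some-model", modelKey: "some-model" }
console.log(Object.keys(buildModelMap([sample], [sample]))) // ["publisher/some-model"]
```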

Changes

  • Modified getLMStudioModels() in src/api/providers/fetchers/lmstudio.ts to use lmstudioModel.path instead of lmstudioModel.modelKey for loaded models
  • Updated the corresponding test in src/api/providers/fetchers/__tests__/lmstudio.test.ts to expect the path as the key

Testing

  • All existing tests pass
  • The change maintains backward compatibility as the path is what users see and select in the UI

Fixes #6954


Important

Fixes duplicate LM Studio models in Provider Configuration by using model.path as the key for both downloaded and loaded models.

  • Behavior:
    • Keys both downloaded and loaded models by model.path in getLMStudioModels() in lmstudio.ts, so each model appears only once in the Provider Configuration Profile.
  • Testing:
    • Updates the test in lmstudio.test.ts to expect model.path as the key.
    • All existing tests pass, preserving backward compatibility.

This description was created by Ellipsis for 21c34f1.

- Use consistent model.path as key for both downloaded and loaded models
- Fixes issue where models appeared twice with different identifiers
- Update tests to reflect the new consistent key usage

Fixes #6954
@roomote roomote bot requested review from cte, jr and mrubens as code owners August 11, 2025 21:09
@dosubot dosubot bot added the size:XS (This PR changes 0-9 lines, ignoring generated files) and bug (Something isn't working) labels Aug 11, 2025
Contributor Author

@roomote roomote bot left a comment

Reviewing my own code because apparently I trust no one, not even myself.

```diff
 for (const lmstudioModel of loadedModels) {
-    models[lmstudioModel.modelKey] = parseLMStudioModel(lmstudioModel)
-    modelsWithLoadedDetails.add(lmstudioModel.modelKey)
+    // Use path as the key to avoid duplicates when a model is both downloaded and loaded
```

Contributor Author

Good fix! The comment clearly explains why we're using path as the key. This should resolve the duplicate model issue reported in #6954.

```diff
-    modelsWithLoadedDetails.add(lmstudioModel.modelKey)
+    // Use path as the key to avoid duplicates when a model is both downloaded and loaded
+    models[lmstudioModel.path] = parseLMStudioModel(lmstudioModel)
+    modelsWithLoadedDetails.add(lmstudioModel.path)
```

Contributor Author

Nice consistency - updating the Set to track paths instead of modelKeys aligns with the new keying strategy.


```diff
     const expectedParsedModel = parseLMStudioModel(mockRawModel)
-    expect(result).toEqual({ [mockRawModel.modelKey]: expectedParsedModel })
+    expect(result).toEqual({ [mockRawModel.path]: expectedParsedModel })
```

Contributor Author

Test correctly updated to expect the path as the key instead of modelKey. Good to see the test coverage maintained.

@hannesrudolph hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels) label Aug 12, 2025
@pwilkin
Contributor

pwilkin commented Aug 12, 2025

LGTM.

@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Aug 13, 2025
@hannesrudolph hannesrudolph added the PR - Needs Preliminary Review label and removed the Issue/PR - Triage label Aug 13, 2025
@daniel-lxs
Member

Closing this PR to re-scope the solution. The better approach is to only show loaded models from LM Studio, which completely eliminates duplicates and ensures only usable models are displayed.

@daniel-lxs daniel-lxs closed this Aug 16, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Aug 16, 2025
@github-project-automation github-project-automation bot moved this from PR [Needs Prelim Review] to Done in Roo Code Roadmap Aug 16, 2025
@pwilkin
Contributor

pwilkin commented Aug 16, 2025

@daniel-lxs Just to chime in: only showing loaded models would IMO be a really bad idea. Since LM Studio has JIT model loading, in many cases "showing only loaded models" would amount to showing no models at all. The way most people use LM Studio (me included) is to keep a repository of available models and load them on demand as needed. I think the fix is actually quite decent, since the problem was that loaded and available models were shown under different keys.

Labels

bug (Something isn't working) · PR - Needs Preliminary Review · size:XS (This PR changes 0-9 lines, ignoring generated files)

Projects

Archived in project

Development

Successfully merging this pull request may close these issues.

LM Studio: Loaded models appear twice in Provider Configuration Profile

5 participants