
[FR] Support for LM Studio #3

@huyz

Description


Besides Ollama, I run LM Studio, which exposes an OpenAI-compatible endpoint at http://localhost:1234/v1/chat/completions

This allows me to run MLX models.

Can you support this URL?

An example of something that supports it: Brave's Leo AI
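For reference, a minimal sketch of what talking to that endpoint looks like, since LM Studio's local server follows the OpenAI chat-completions request shape. The model name `local-model` and the helper names here are illustrative assumptions, not part of any existing codebase:

```python
import json
import urllib.request

# Endpoint from the issue; LM Studio serves an OpenAI-compatible API here.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"


def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completions payload.

    The model name is a placeholder; LM Studio uses whatever model
    is currently loaded in its local server.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def chat(prompt: str) -> str:
    """Send a prompt to the local LM Studio server and return the reply."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Because the request/response shape matches OpenAI's, supporting LM Studio is mostly a matter of making the base URL configurable.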

Metadata

Labels: enhancement (New feature or request)
