Positron Assistant now has preview support for Custom Providers (OpenAI compatible) #9988
jthomasmock
announced in
Announcements
Hi, I'm having problems configuring models provided by Azure AI Foundry. In the configuration panel I set the API key and the base URL (the same ones I use with ellmer), but it fails with the error: "Failed to add language model Custom Provider: Could not fetch models Resource Not Found". Am I doing something wrong, or is there simply no support for Azure Foundry right now?
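One possible cause worth checking (an assumption, not confirmed in this thread): OpenAI-compatible clients typically discover models by appending `/models` to the configured base URL, so a "Resource Not Found" error can simply mean the base URL is missing or adding a path segment (for example, a `/v1` suffix) relative to what the server actually serves. A minimal sketch of how such a client would build the model-listing URL:

```python
def models_url(base_url: str) -> str:
    """Return the model-listing URL an OpenAI-compatible client would probe,
    by appending /models to the configured base URL (trailing slashes stripped)."""
    return base_url.rstrip("/") + "/models"

# A base URL that already ends in a versioned path resolves to that path + /models:
print(models_url("https://example-resource.services.ai.azure.com/openai/v1/"))
# whereas a bare host would probe a path the server may not serve:
print(models_url("https://example-resource.services.ai.azure.com"))
```

The hostnames above are placeholders; comparing the URL your provider's documentation gives for listing models against what the client constructs from your base URL is a quick way to narrow down this class of error.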
As of Positron 2025.10.0, we have added initial preview support for Custom Providers for Positron Assistant! This will also soon be available in preview for Posit Workbench: Positron Pro in the next Q4-2025 release.
Custom Providers let you use arbitrary AI model providers that adhere to the OpenAI API specification. This includes tools such as LiteLLM, vLLM, OpenRouter, Portkey.ai, and other centralized routing, proxying, or LLM hosting frameworks that sit in front of direct model provider access.
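In practice, "adheres to the OpenAI API specification" means the provider accepts OpenAI-style requests such as `POST {base_url}/chat/completions` with a JSON body containing `model` and `messages`. A minimal sketch (the base URL and model name below are placeholders for whatever your router or proxy exposes):

```python
import json

# Placeholder endpoint, e.g. a local LiteLLM or vLLM deployment (assumption):
base_url = "http://localhost:4000/v1"

# The OpenAI-style request body every compatible provider must accept:
payload = {
    "model": "my-routed-model",  # the name your proxy routes on
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

# An OpenAI-compatible client POSTs this JSON to the chat completions endpoint:
print(f"POST {base_url}/chat/completions")
print(json.dumps(payload, indent=2))
```

If a routing framework can serve this request shape and return an OpenAI-style response, it is a candidate for use as a Custom Provider.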
Note
In the future, we intend to add guardrail checks for the core capabilities needed for a good Positron Assistant experience, which at a minimum means tool-calling support. In future iterations of Custom Provider support, we intend to at least warn users when a model does not meet the requirements for the expected user experience.
Caution
While Databot can also use models via a Custom Provider, we currently only suggest using Anthropic models with Databot per our testing during its Research Preview.
Why Custom Providers?
Data Science teams
Custom Providers may be interesting for teams that require:
Individual Data Scientists
Individuals may be interested in using Custom Providers to access "free" foundation models that run in-session via tools like Ollama, which often means sharing compute with their data science environment. Our current recommendation, based on model quality and Positron Assistant's need for tool calling and agentic workflows, is that models should run in a separate computational environment.
This recommendation is based on:
Enable Custom Providers
To make use of Custom Providers, Positron Assistant must first be enabled as outlined in the Positron Assistant Getting Started Guide. Custom Providers can then be enabled via the `positron.assistant.enabledProviders` setting in your settings.json.

While Custom Providers could be used to access OpenAI endpoints, you should instead use the `"openai"` option within the same setting. The direct OpenAI option will change to `"openai-api"` in the 2025.11.0 release to avoid naming conflicts we have recently observed with other tools and model providers.

Once the Custom Provider option is enabled, you can sign in to your custom provider.
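As a rough sketch, enabling providers in settings.json might look like the following. Note that the exact value format and the identifier for the Custom Provider option are assumptions here; check the Positron Assistant documentation or the setting's autocomplete for the authoritative values:

```jsonc
// settings.json (sketch -- provider identifiers below are assumptions)
{
  "positron.assistant.enabledProviders": [
    "anthropic",
    "custom"
  ]
}
```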
Sign-in to Custom Provider
We want your feedback!
We've heard the interest for Custom Providers in #8592 and #8326 and are happy to provide this initial support.
Please give it a try, and leave any ideas or feedback in this discussion or open an issue if you run into any problems!