Summary
When using `OPENAI_API_BASE` to point Letta at an OpenAI-compatible proxy (e.g., the GitHub Copilot API or LiteLLM), Letta automatically registers the proxy's models under the `openai-proxy/` provider prefix. However, the agent creation API only accepts `letta/` or `openai/` as valid provider prefixes, making it impossible to create agents with proxy-registered models.
Environment
- Letta Version: Latest (tested with Docker setup)
- Python Version: 3.11+
- Deployment: Docker Compose with `OPENAI_API_BASE` set
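For reference, a minimal Compose fragment for this setup might look like the following (the service name and image tag are assumptions; the relevant part is the `OPENAI_API_BASE` environment variable pointing at the proxy):

```yaml
# Illustrative docker-compose fragment; service name and image are assumptions.
services:
  letta:
    image: letta/letta:latest
    environment:
      - OPENAI_API_BASE=http://localhost:8080/v1  # OpenAI-compatible proxy
    ports:
      - "8283:8283"
```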
Steps to Reproduce
1. Set `OPENAI_API_BASE` to an OpenAI-compatible proxy:

   ```shell
   export OPENAI_API_BASE=http://localhost:8080/v1  # e.g., LiteLLM or Copilot API
   ```

2. Start the Letta server; models are automatically registered with the `openai-proxy/` prefix:

   ```
   Available models: openai-proxy/gpt-4o, openai-proxy/gpt-4-turbo, ...
   ```

3. Try to create an agent using a proxy model:

   ```python
   from letta import Letta

   client = Letta(base_url="http://localhost:8283")
   agent = client.agents.create(
       name="test-agent",
       model="openai-proxy/gpt-4o",  # This fails
       embedding="letta/letta-free",
   )
   ```

4. Error: model validation fails because `openai-proxy/` is not an accepted provider prefix.
Expected Behavior
Agents should be creatable with any model that Letta has successfully registered, regardless of the provider prefix used during auto-discovery.
Current Behavior
Agent creation fails with a validation error. The only workaround is to use `letta/letta-free`, which routes through Letta's hosted service instead of the configured proxy.
Workaround Attempted
Tried normalizing the model handle from `openai-proxy/gpt-4o` to `openai/gpt-4o`, but this fails because the model must exist in Letta's registry under that exact name.
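The attempted workaround can be sketched as follows (`normalize_handle` is a hypothetical client-side helper, not part of Letta's SDK):

```python
def normalize_handle(handle: str) -> str:
    """Rewrite 'openai-proxy/<model>' to 'openai/<model>' so the handle
    passes prefix validation. This satisfies the prefix check, but the
    renamed handle no longer matches any entry in Letta's model registry,
    so agent creation still fails."""
    provider, _, model = handle.partition("/")
    if provider == "openai-proxy":
        return f"openai/{model}"
    return handle

print(normalize_handle("openai-proxy/gpt-4o"))  # openai/gpt-4o
```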
Related
- Issue #2817 ("Not able to create an agent with Openai proxy - Embedding model is not supported by provider", closed as stale) mentions similar challenges with model registration
- This affects users who want to use Letta with corporate OpenAI proxies or alternative API providers
Suggested Fix
Option A: Accept any model name present in Letta's registry during agent creation, not just those with specific provider prefixes.
Option B: Allow users to configure a provider name alias (e.g., map `openai-proxy` → `openai` for validation purposes).
Option C: Provide a way to register models under the `openai/` prefix even when a custom `OPENAI_API_BASE` is set.
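To make Option B concrete, here is a rough sketch of what alias-aware prefix validation could look like (all names are illustrative assumptions, not Letta's actual internals):

```python
# Hypothetical alias map, e.g. loaded from server configuration.
PROVIDER_ALIASES = {"openai-proxy": "openai"}
# Prefixes the agent creation API currently accepts.
ACCEPTED_PREFIXES = {"letta", "openai"}

def is_valid_provider(handle: str) -> bool:
    """Validate the provider prefix of a model handle, resolving any
    configured alias before checking against the accepted set."""
    provider = handle.split("/", 1)[0]
    provider = PROVIDER_ALIASES.get(provider, provider)
    return provider in ACCEPTED_PREFIXES

print(is_valid_provider("openai-proxy/gpt-4o"))  # True under the alias map
```

This keeps the existing strict validation for unknown prefixes while letting proxy deployments opt in explicitly.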
Impact
This limitation prevents Letta adoption in enterprise environments where:
- Direct OpenAI API access is blocked
- Traffic must route through corporate proxies
- Alternative OpenAI-compatible providers are used (Azure, LiteLLM, etc.)
Happy to provide more details or test patches if needed!