OpenAI Proxy (OPENAI_API_BASE) model registration breaks agent creation #3133

@kikokikok

Summary

When using OPENAI_API_BASE to point to an OpenAI-compatible proxy (e.g., GitHub Copilot API, LiteLLM), Letta automatically registers models with the openai-proxy/ provider prefix. However, the agent creation API only accepts letta/ or openai/ as valid provider prefixes, making it impossible to create agents with proxy-registered models.

Environment

  • Letta Version: Latest (tested with Docker setup)
  • Python Version: 3.11+
  • Deployment: Docker Compose with OPENAI_API_BASE set

Steps to Reproduce

  1. Set OPENAI_API_BASE to an OpenAI-compatible proxy:

    export OPENAI_API_BASE=http://localhost:8080/v1  # e.g., LiteLLM or Copilot API
  2. Start the Letta server; models are automatically registered with the openai-proxy/ prefix:

    Available models: openai-proxy/gpt-4o, openai-proxy/gpt-4-turbo, ...
    
  3. Try to create an agent using a proxy model:

    from letta_client import Letta  # SDK package is letta-client
    
    client = Letta(base_url="http://localhost:8283")
    agent = client.agents.create(
        name="test-agent",
        model="openai-proxy/gpt-4o",  # This fails
        embedding="letta/letta-free"
    )
  4. Error: Model validation fails because openai-proxy/ is not an accepted provider prefix.
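The rejection looks like a provider-prefix allow-list applied during agent creation. A minimal sketch of that kind of check (function and constant names are hypothetical, not Letta's actual code) shows why every openai-proxy/ handle is turned away regardless of whether the model is registered:

```python
# Hypothetical sketch of an allow-list check like the one agent creation
# appears to perform; names and structure are assumptions, not Letta code.
ACCEPTED_PREFIXES = {"letta", "openai"}

def validate_model_handle(handle: str) -> str:
    """Reject any model handle whose provider prefix is off the allow-list."""
    provider, _, model = handle.partition("/")
    if provider not in ACCEPTED_PREFIXES:
        raise ValueError(
            f"provider '{provider}' is not accepted; "
            f"expected one of {sorted(ACCEPTED_PREFIXES)}"
        )
    return handle

validate_model_handle("openai/gpt-4o")  # accepted
try:
    validate_model_handle("openai-proxy/gpt-4o")
except ValueError as err:
    print(err)
```

If this is roughly what the server does, the allow-list is checked before (or instead of) the registry lookup, which is why registration alone doesn't help.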

Expected Behavior

Agents should be creatable with any model that Letta has successfully registered, regardless of the provider prefix used during auto-discovery.

Current Behavior

Agent creation fails with a validation error. The only workaround is to use letta/letta-free which routes through Letta's hosted service instead of the configured proxy.

Workaround Attempted

Tried to normalize the model name from openai-proxy/gpt-4o to openai/gpt-4o, but this fails because the model must exist in Letta's registry under that exact name.
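For reference, the attempted normalization looked roughly like this sketch. The rename satisfies the prefix check, but the subsequent registry lookup defeats it, since only openai-proxy/gpt-4o was ever registered:

```python
# Sketch of the attempted workaround: strip the proxy prefix before creating
# the agent. This passes prefix validation but then fails the registry lookup,
# because the model only exists under its original "openai-proxy/" name.
def normalize_handle(handle: str) -> str:
    provider, _, model = handle.partition("/")
    if provider == "openai-proxy":
        return f"openai/{model}"
    return handle

print(normalize_handle("openai-proxy/gpt-4o"))  # openai/gpt-4o
```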

Suggested Fix

Option A: Accept any registered model name in agent creation, not just those with specific provider prefixes.

Option B: Allow users to configure a provider-name alias (e.g., map openai-proxy to openai for validation purposes).

Option C: Provide a way to register models under the openai/ prefix even when using a custom OPENAI_API_BASE.
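Option B could be as small as an alias table consulted before the prefix check. A hedged sketch, where the LETTA_PROVIDER_ALIASES environment variable and the function name are illustrative proposals, not an existing Letta setting:

```python
# Hypothetical sketch of Option B: resolve a user-configured provider alias
# before validating the handle. LETTA_PROVIDER_ALIASES is a proposed setting,
# not one Letta currently reads.
import json
import os

def resolve_provider_alias(handle: str) -> str:
    """Rewrite the provider prefix according to a user-supplied alias map."""
    aliases = json.loads(os.environ.get("LETTA_PROVIDER_ALIASES", "{}"))
    provider, _, model = handle.partition("/")
    return f"{aliases.get(provider, provider)}/{model}"

os.environ["LETTA_PROVIDER_ALIASES"] = '{"openai-proxy": "openai"}'
print(resolve_provider_alias("openai-proxy/gpt-4o"))  # openai/gpt-4o
```

The alias would only affect validation; routing would still use the registered openai-proxy/ entry, so traffic keeps flowing through the configured proxy.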

Impact

This limitation prevents Letta adoption in enterprise environments where:

  • Direct OpenAI API access is blocked
  • Traffic must route through corporate proxies
  • Alternative OpenAI-compatible providers are used (Azure, LiteLLM, etc.)

Happy to provide more details or test patches if needed!
