Connection failed: error sending request for url (https://ai.megallm.io/v1/chat/completions?) #6815

@Ximingwang-09

Description

When I use Codex with my MegaLLM API key, I get a "Connection failed" error. However, I can send a request successfully with the OpenAI client:


from openai import OpenAI

client = OpenAI(
    api_key="sk-mega-xxx",
    base_url="https://ai.megallm.io/v1",
)

response = client.chat.completions.create(
    model="gpt-5",
    messages=[
        {
            "role": "user",
            "content": "Hello, Who are you?"
        }
    ],
    max_tokens=150,
    temperature=0.7
)
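For comparison, the call above is equivalent to the following raw HTTP request (a minimal sketch built from the snippet; the request is only constructed here, not sent, and the key is a placeholder):

```python
import json
import urllib.request

# Build (but do not send) the chat-completions request the OpenAI client
# issues for the call above. The Authorization key is a placeholder.
req = urllib.request.Request(
    url="https://ai.megallm.io/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-5",
        "messages": [{"role": "user", "content": "Hello, Who are you?"}],
        "max_tokens": 150,
        "temperature": 0.7,
    }).encode(),
    headers={
        "Authorization": "Bearer sk-mega-xxx",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
print(req.get_method())
```

Note that the URL in the error message ends with a trailing "?", which suggests an empty query string is being appended to the same endpoint.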

And this is my config.toml:

model = "gpt-5.1"
model_reasoning_effort = "high"
model_provider = "megallm"

[model_providers.packycode]
name = "packycode"
base_url = "https://ai.megallm.io/v1"
wire_api = "chat"

[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = { }


Labels

    CLI (Issues related to the Codex CLI)
    bug (Something isn't working)
    custom-model (Issues related to custom model providers, including local models)
