Closed
Labels: CLI (Issues related to the Codex CLI), bug (Something isn't working), custom-model (Issues related to custom model providers, including local models)
Description
When I use Codex with my MegaLLM API key, I get a Connection failed error. But I can send a request successfully with the OpenAI Python client:
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-mega-xxx",
    base_url="https://ai.megallm.io/v1",
)
response = client.chat.completions.create(
    model="gpt-5",
    messages=[
        {"role": "user", "content": "Hello, Who are you?"}
    ],
    max_tokens=150,
    temperature=0.7,
)
```
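Since the SDK call succeeds, it may help to reproduce the same wire request without the SDK to pin down what a client must send to this endpoint. A minimal standard-library sketch (the URL, model, and payload mirror the snippet above; the key is the placeholder from this report, and actually sending the request needs network access, so that part is commented out):

```python
import json
import urllib.request

# Same endpoint the OpenAI client targets with base_url="https://ai.megallm.io/v1"
url = "https://ai.megallm.io/v1/chat/completions"

# Payload equivalent to the client.chat.completions.create(...) call above
payload = {
    "model": "gpt-5",
    "messages": [{"role": "user", "content": "Hello, Who are you?"}],
    "max_tokens": 150,
    "temperature": 0.7,
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-mega-xxx",  # placeholder key from the report
    },
    method="POST",
)

# Sending requires network access; uncomment to actually hit the endpoint:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.get_method(), req.full_url)
```

If this raw POST works while Codex still reports Connection failed, the problem is likely in how Codex talks to the provider rather than in the endpoint itself.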
And this is my config.toml:
```toml
model = "gpt-5.1"
model_reasoning_effort = "high"
model_provider = "megallm"

[model_providers.packycode]
name = "packycode"
base_url = "https://ai.megallm.io/v1"
wire_api = "chat"

[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = { }
```
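Worth noting: the provider actually selected by `model_provider = "megallm"` has no `wire_api` setting, while the unused `packycode` entry does set `wire_api = "chat"`. If the endpoint only speaks the Chat Completions protocol, the selected provider entry would presumably need that field as well. A hedged sketch of the adjusted entry (whether the missing `wire_api` is the actual cause of the Connection failed error is unconfirmed):

```toml
[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = { }
wire_api = "chat"  # assumption: select the Chat Completions wire protocol explicitly
```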