66 changes: 61 additions & 5 deletions registry/coder-labs/modules/codex/README.md
@@ -3,7 +3,7 @@ display_name: Codex CLI
icon: ../../../../.icons/openai.svg
description: Run Codex CLI in your workspace with AgentAPI integration
verified: true
tags: [agent, codex, ai, openai, tasks]
tags: [agent, codex, ai, openai, tasks, aibridge]
---

# Codex CLI
@@ -13,7 +13,7 @@ Run Codex CLI in your workspace to access OpenAI's models through the Codex inte
```tf
module "codex" {
source = "registry.coder.com/coder-labs/codex/coder"
version = "4.0.0"
version = "4.1.0"
agent_id = coder_agent.example.id
openai_api_key = var.openai_api_key
workdir = "/home/coder/project"
@@ -32,7 +32,7 @@ module "codex" {
module "codex" {
count = data.coder_workspace.me.start_count
source = "registry.coder.com/coder-labs/codex/coder"
version = "4.0.0"
version = "4.1.0"
agent_id = coder_agent.example.id
openai_api_key = "..."
workdir = "/home/coder/project"
@@ -52,7 +52,7 @@ data "coder_task" "me" {}

module "codex" {
source = "registry.coder.com/coder-labs/codex/coder"
version = "4.0.0"
version = "4.1.0"
agent_id = coder_agent.example.id
openai_api_key = "..."
ai_prompt = data.coder_task.me.prompt
@@ -99,7 +99,7 @@ For custom Codex configuration, use `base_config_toml` and/or `additional_mcp_se
```tf
module "codex" {
source = "registry.coder.com/coder-labs/codex/coder"
version = "4.0.0"
version = "4.1.0"
# ... other variables ...

# Override default configuration
@@ -122,6 +122,61 @@ module "codex" {
> [!NOTE]
> If no custom configuration is provided, the module uses secure defaults. The Coder MCP server is always included automatically. For containerized workspaces (Docker/Kubernetes), you may need `sandbox_mode = "danger-full-access"` to avoid permission issues. For advanced options, see [Codex config docs](https://github.com/openai/codex/blob/main/codex-rs/config.md).
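
For containerized workspaces, the override mentioned above can be sketched as follows (an illustrative example, not a drop-in: the `base_config_toml` contents and the choice of `sandbox_mode` are assumptions to adapt to your environment):

```tf
module "codex" {
  source   = "registry.coder.com/coder-labs/codex/coder"
  version  = "4.1.0"
  agent_id = coder_agent.example.id
  workdir  = "/home/coder/project"

  # Assumed example: the workspace is already isolated by its container,
  # so Codex's own sandbox is relaxed to avoid permission errors.
  base_config_toml = <<-EOT
    sandbox_mode    = "danger-full-access"
    approval_policy = "never"
  EOT
}
```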

### AI Bridge Configuration

[AI Bridge](https://coder.com/docs/ai-coder/ai-bridge) is a centralized AI gateway that securely intermediates between users’ coding tools and AI providers, managing authentication, auditing, and usage attribution.

To enable the AI Bridge integration, first [set up AI Bridge](https://coder.com/docs/ai-coder/ai-bridge/setup), then set `enable_aibridge` to `true`.

#### Usage with tasks and AI Bridge

```tf
resource "coder_ai_task" "task" {
  count  = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
  app_id = module.codex[count.index].task_app_id
}

data "coder_task" "me" {}

module "codex" {
  count           = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
  source          = "registry.coder.com/coder-labs/codex/coder"
  version         = "4.1.0"
  agent_id        = coder_agent.example.id
  ai_prompt       = data.coder_task.me.prompt
  workdir         = "/home/coder/project"
  enable_aibridge = true
}
```

#### Standalone usage with AI Bridge

```tf
module "codex" {
source = "registry.coder.com/coder-labs/codex/coder"
version = "4.1.0"
agent_id = coder_agent.example.id
workdir = "/home/coder/project"
enable_aibridge = true
}
```

This adds a new `model_provider` and a profile to the Codex configuration:

```toml
[model_providers.aibridge]
name = "AI Bridge"
base_url = "https://coder.example.com/api/v2/aibridge/openai/v1" # your Coder deployment's access URL
env_key = "CODER_AIBRIDGE_SESSION_TOKEN"
wire_api = "responses"

[profiles.aibridge]
model_provider = "aibridge"
model = "<model>" # as configured in the module input
model_reasoning_effort = "<model_reasoning_effort>" # as configured in the module input
```

Codex then runs with `--profile aibridge`.
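
The module delivers this TOML to the workspace base64-encoded, and the install script decodes and appends it to `~/.codex/config.toml`. A simplified, hypothetical sketch of that round trip (variable names are illustrative, not the scripts' actual ones):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Illustrative stand-in for the rendered aibridge TOML from main.tf
AIBRIDGE_CONFIG='[profiles.aibridge]
model_provider = "aibridge"'

# main.tf base64-encodes the fragment so it survives shell quoting ...
ENCODED=$(printf '%s' "$AIBRIDGE_CONFIG" | base64)

# ... and install.sh decodes it and appends it to the Codex config file.
CONFIG_PATH=$(mktemp)
printf '\n# AI Bridge Configuration\n' >> "$CONFIG_PATH"
printf '%s' "$ENCODED" | base64 -d >> "$CONFIG_PATH"

grep -c 'model_provider = "aibridge"' "$CONFIG_PATH"  # prints 1
```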

## Troubleshooting

- Check installation and startup logs in `~/.codex-module/`
@@ -137,3 +192,4 @@ module "codex" {
- [Codex CLI Documentation](https://github.com/openai/codex)
- [AgentAPI Documentation](https://github.com/coder/agentapi)
- [Coder AI Agents Guide](https://coder.com/docs/tutorials/ai-agents)
- [AI Bridge](https://coder.com/docs/ai-coder/ai-bridge)
36 changes: 32 additions & 4 deletions registry/coder-labs/modules/codex/main.test.ts
@@ -113,7 +113,7 @@ describe("codex", async () => {
sandbox_mode = "danger-full-access"
approval_policy = "never"
preferred_auth_method = "apikey"

[custom_section]
new_feature = true
`.trim();
@@ -189,7 +189,7 @@ describe("codex", async () => {
args = ["-y", "@modelcontextprotocol/server-github"]
type = "stdio"
description = "GitHub integration"

[mcp_servers.FileSystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
@@ -215,7 +215,7 @@ describe("codex", async () => {
approval_policy = "untrusted"
preferred_auth_method = "chatgpt"
custom_setting = "test-value"

[advanced_settings]
timeout = 30000
debug = true
@@ -228,7 +228,7 @@ describe("codex", async () => {
args = ["--serve", "--port", "8080"]
type = "stdio"
description = "Custom development tool"

[mcp_servers.DatabaseMCP]
command = "python"
args = ["-m", "database_mcp_server"]
@@ -454,4 +454,32 @@ describe("codex", async () => {
);
expect(startLog.stdout).not.toContain("test prompt");
});

test("codex-with-aibridge", async () => {
const { id } = await setup({
moduleVariables: {
enable_aibridge: "true",
model_reasoning_effort: "none",
},
});

await execModuleScript(id);

const startLog = await readFileContainer(
id,
"/home/coder/.codex-module/agentapi-start.log",
);

const configToml = await readFileContainer(
id,
"/home/coder/.codex/config.toml",
);
expect(startLog).toContain("AI Bridge is enabled, using profile aibridge");
expect(startLog).toContain(
"Starting Codex with arguments: --profile aibridge",
);
expect(configToml).toContain(
"[profiles.aibridge]\n" + 'model_provider = "aibridge"',
);
});
});
47 changes: 45 additions & 2 deletions registry/coder-labs/modules/codex/main.tf
@@ -71,6 +71,27 @@ variable "cli_app_display_name" {
default = "Codex CLI"
}

variable "enable_aibridge" {
type = bool
description = "Use AI Bridge for Codex. https://coder.com/docs/ai-coder/ai-bridge"
default = false

validation {
condition = !(var.enable_aibridge && length(var.openai_api_key) > 0)
error_message = "openai_api_key cannot be provided when enable_aibridge is true. AI Bridge automatically authenticates the client using their Coder credentials."
}
}

variable "model_reasoning_effort" {
type = string
description = "The reasoning effort for the AI Bridge model. One of: none, low, medium, high. https://platform.openai.com/docs/guides/latest-model#lower-reasoning-effort"
default = "medium"
validation {
condition = contains(["none", "low", "medium", "high"], var.model_reasoning_effort)
error_message = "model_reasoning_effort must be one of: none, low, medium, high."
}
}

variable "install_codex" {
type = bool
description = "Whether to install Codex."
@@ -115,8 +136,8 @@ variable "agentapi_version" {

variable "codex_model" {
type = string
description = "The model for Codex to use. Defaults to gpt-5.1-codex-max."
default = ""
description = "The model for Codex to use. Defaults to gpt-5.2-codex."
default = "gpt-5.2-codex"
}

variable "pre_install_script" {
@@ -155,12 +176,31 @@ resource "coder_env" "openai_api_key" {
value = var.openai_api_key
}

resource "coder_env" "coder_aibridge_session_token" {
count = var.enable_aibridge ? 1 : 0
agent_id = var.agent_id
name = "CODER_AIBRIDGE_SESSION_TOKEN"
value = data.coder_workspace_owner.me.session_token
}

locals {
workdir = trimsuffix(var.workdir, "/")
app_slug = "codex"
install_script = file("${path.module}/scripts/install.sh")
start_script = file("${path.module}/scripts/start.sh")
module_dir_name = ".codex-module"
aibridge_config = <<-EOF
[model_providers.aibridge]
name = "AI Bridge"
base_url = "${data.coder_workspace.me.access_url}/api/v2/aibridge/openai/v1"
env_key = "CODER_AIBRIDGE_SESSION_TOKEN"
wire_api = "responses"

[profiles.aibridge]
model_provider = "aibridge"
model = "${var.codex_model}"
model_reasoning_effort = "${var.model_reasoning_effort}"
EOF
}

module "agentapi" {
@@ -196,6 +236,7 @@ module "agentapi" {
ARG_CODEX_START_DIRECTORY='${local.workdir}' \
ARG_CODEX_TASK_PROMPT='${base64encode(var.ai_prompt)}' \
ARG_CONTINUE='${var.continue}' \
ARG_ENABLE_AIBRIDGE='${var.enable_aibridge}' \
/tmp/start.sh
EOT

@@ -211,6 +252,8 @@ module "agentapi" {
ARG_INSTALL='${var.install_codex}' \
ARG_CODEX_VERSION='${var.codex_version}' \
ARG_BASE_CONFIG_TOML='${base64encode(var.base_config_toml)}' \
ARG_ENABLE_AIBRIDGE='${var.enable_aibridge}' \
ARG_AIBRIDGE_CONFIG='${base64encode(var.enable_aibridge ? local.aibridge_config : "")}' \
ARG_ADDITIONAL_MCP_SERVERS='${base64encode(var.additional_mcp_servers)}' \
ARG_CODER_MCP_APP_STATUS_SLUG='${local.app_slug}' \
ARG_CODEX_START_DIRECTORY='${local.workdir}' \
22 changes: 21 additions & 1 deletion registry/coder-labs/modules/codex/scripts/install.sh
@@ -13,6 +13,8 @@ set -o nounset
ARG_BASE_CONFIG_TOML=$(echo -n "$ARG_BASE_CONFIG_TOML" | base64 -d)
ARG_ADDITIONAL_MCP_SERVERS=$(echo -n "$ARG_ADDITIONAL_MCP_SERVERS" | base64 -d)
ARG_CODEX_INSTRUCTION_PROMPT=$(echo -n "$ARG_CODEX_INSTRUCTION_PROMPT" | base64 -d)
ARG_ENABLE_AIBRIDGE=${ARG_ENABLE_AIBRIDGE:-false}
ARG_AIBRIDGE_CONFIG=$(echo -n "$ARG_AIBRIDGE_CONFIG" | base64 -d)

echo "=== Codex Module Configuration ==="
printf "Install Codex: %s\n" "$ARG_INSTALL"
@@ -24,6 +26,7 @@ printf "Has Additional MCP: %s\n" "$([ -n "$ARG_ADDITIONAL_MCP_SERVERS" ] && ech
printf "Has System Prompt: %s\n" "$([ -n "$ARG_CODEX_INSTRUCTION_PROMPT" ] && echo "Yes" || echo "No")"
printf "OpenAI API Key: %s\n" "$([ -n "$ARG_OPENAI_API_KEY" ] && echo "Provided" || echo "Not provided")"
printf "Report Tasks: %s\n" "$ARG_REPORT_TASKS"
printf "Enable Coder AI Bridge: %s\n" "$ARG_ENABLE_AIBRIDGE"
echo "======================================"

set +o nounset
@@ -127,6 +130,15 @@ EOF
fi
}

append_aibridge_config_section() {
local config_path="$1"

if [ -n "$ARG_AIBRIDGE_CONFIG" ]; then
printf "Adding AI Bridge configuration\n"
echo -e "\n# AI Bridge Configuration\n$ARG_AIBRIDGE_CONFIG" >> "$config_path"
fi
}

function populate_config_toml() {
CONFIG_PATH="$HOME/.codex/config.toml"
mkdir -p "$(dirname "$CONFIG_PATH")"
@@ -140,6 +152,11 @@ function populate_config_toml() {
fi

append_mcp_servers_section "$CONFIG_PATH"

if [ "$ARG_ENABLE_AIBRIDGE" = "true" ]; then
printf "AI Bridge is enabled\n"
append_aibridge_config_section "$CONFIG_PATH"
fi
}

function add_instruction_prompt_if_exists() {
@@ -185,4 +202,7 @@ install_codex
codex --version
populate_config_toml
add_instruction_prompt_if_exists
add_auth_json

if [ "$ARG_ENABLE_AIBRIDGE" = "false" ]; then
add_auth_json
fi
7 changes: 6 additions & 1 deletion registry/coder-labs/modules/codex/scripts/start.sh
@@ -18,6 +18,7 @@ printf "Version: %s\n" "$(codex --version)"
set -o nounset
ARG_CODEX_TASK_PROMPT=$(echo -n "$ARG_CODEX_TASK_PROMPT" | base64 -d)
ARG_CONTINUE=${ARG_CONTINUE:-true}
ARG_ENABLE_AIBRIDGE=${ARG_ENABLE_AIBRIDGE:-false}

echo "=== Codex Launch Configuration ==="
printf "OpenAI API Key: %s\n" "$([ -n "$ARG_OPENAI_API_KEY" ] && echo "Provided" || echo "Not provided")"
@@ -26,6 +27,7 @@ printf "Start Directory: %s\n" "$ARG_CODEX_START_DIRECTORY"
printf "Has Task Prompt: %s\n" "$([ -n "$ARG_CODEX_TASK_PROMPT" ] && echo "Yes" || echo "No")"
printf "Report Tasks: %s\n" "$ARG_REPORT_TASKS"
printf "Continue Sessions: %s\n" "$ARG_CONTINUE"
printf "Enable Coder AI Bridge: %s\n" "$ARG_ENABLE_AIBRIDGE"
echo "======================================"
set +o nounset

@@ -153,7 +155,10 @@ setup_workdir() {
build_codex_args() {
CODEX_ARGS=()

if [ -n "$ARG_CODEX_MODEL" ]; then
if [ "$ARG_ENABLE_AIBRIDGE" = "true" ]; then
printf "AI Bridge is enabled, using profile aibridge\n"
CODEX_ARGS+=("--profile" "aibridge")
elif [ -n "$ARG_CODEX_MODEL" ]; then
CODEX_ARGS+=("--model" "$ARG_CODEX_MODEL")
fi
