
Conversation

Collaborator

@35C4n0r 35C4n0r commented Jan 12, 2026

Description

  • Add support for AI Bridge

Type of Change

  • New module
  • New template
  • Bug fix
  • Feature/enhancement
  • Documentation
  • Other

Module Information

Path: registry/coder-labs/modules/codex
New version: v4.1.0
Breaking change: [ ] Yes [x] No

Testing & Validation

  • Tests pass (bun test)
  • Code formatted (bun fmt)
  • Changes tested locally

Related Issues

Closes: #650

@35C4n0r 35C4n0r changed the title feat(coder-labs/modules/codex) feat(coder-labs/modules/codex): add support for aibridge Jan 12, 2026
@35C4n0r 35C4n0r marked this pull request as draft January 12, 2026 16:06
@35C4n0r 35C4n0r marked this pull request as ready for review January 12, 2026 16:11
@35C4n0r 35C4n0r self-assigned this Jan 12, 2026
@matifali matifali requested a review from pawbana January 15, 2026 08:05
```diff
-description = "The model for Codex to use. Defaults to gpt-5.1-codex-max."
-default     = ""
+description = "The model for Codex to use. Defaults to gpt-5.2-codex."
+default     = "gpt-5.2-codex"
```
Member

Looks like it is not yet available for the API billing we use, so maybe use the next best, gpt-5.1-codex?

Collaborator Author

[screenshot]

Collaborator Author

@35C4n0r 35C4n0r Jan 15, 2026

@matifali @DevelopmentCats IMO, we should leave the default empty and not mention this either:

> Defaults to gpt-5.1-codex-max.

And make passing the model mandatory in the case of enable_aibridge = true.
LMK your thoughts.
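A minimal sketch of what that guard could look like. This assumes Terraform >= 1.9 (cross-variable references in validation blocks) and uses illustrative variable names, not the module's actual inputs:

```tf
variable "enable_aibridge" {
  type        = bool
  default     = false
  description = "Route Codex traffic through AI Bridge."
}

variable "model" {
  type        = string
  default     = ""
  description = "Model for Codex; required when AI Bridge is enabled."

  # Cross-variable references in validation require Terraform >= 1.9.
  validation {
    condition     = !var.enable_aibridge || var.model != ""
    error_message = "model must be set explicitly when enable_aibridge = true."
  }
}
```

With this shape, `terraform plan` fails early with a clear message instead of Codex falling back to a model name that may not exist behind the bridge.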

Contributor

One issue I notice with using these Codex models with AI Bridge is that the OpenAI Codex models only support the v1/responses endpoint, not v1/chat/completions.

From what I can see, the current release of aibridge logs only v1/chat/completions and not v1/responses, but there has been work on this and it should land soon.

coder/aibridge#84
coder/aibridge#87

Member

This is fine; it will be supported by the next release. We will release this once we bring in the interception functionality.



pawbana commented Jan 15, 2026

I'm not familiar enough with modules/templates to confidently review this, but from my shallow understanding it looks good; the Codex config changes look good.

@DevelopmentCats
Contributor

I will go ahead and give this a test today, but everything looks good after looking through it again.

```toml
[model_providers.aibridge]
name = "AI Bridge"
base_url = "https://dev.coder.com/api/v2/aibridge/openai/v1"
```
Contributor

Suggested change:

```diff
-base_url = "https://dev.coder.com/api/v2/aibridge/openai/v1"
+base_url = "https://coder.example.com/api/v2/aibridge/openai/v1"
```

I would use an example URL here since we don't need to expose our AI Bridge endpoint, and this uses the deployment URL from whichever Coder deployment it's running on, if I'm not mistaken.
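One way the docs example could avoid hard-coding a deployment hostname at all is to derive the endpoint from the workspace's own deployment. A sketch, assuming the coder provider exposes `access_url` on the `coder_workspace` data source:

```tf
data "coder_workspace" "me" {}

locals {
  # Build the AI Bridge OpenAI-compatible endpoint from the deployment's
  # own access URL rather than a fixed hostname.
  aibridge_base_url = "${data.coder_workspace.me.access_url}/api/v2/aibridge/openai/v1"
}
```

Either way, the README should show a placeholder like coder.example.com rather than a real internal endpoint.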

Comment on lines +141 to +142
```tf
module "codex" {
  source = "registry.coder.com/coder-labs/codex/coder"
```
Contributor

Suggested change:

```diff
 module "codex" {
+  count  = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
   source = "registry.coder.com/coder-labs/codex/coder"
```

The count needs to be included here since this is a reference for tasks

```tf
resource "coder_ai_task" "task" {
  count  = data.coder_workspace.me.start_count
  app_id = module.codex.task_app_id
```
Contributor

Suggested change:

```diff
-  app_id = module.codex.task_app_id
+  app_id = module.codex[count.index].task_app_id
```

This needs to be changed here as well


```tf
resource "coder_ai_task" "task" {
  count = data.coder_workspace.me.start_count
```
Contributor

Suggested change:

```diff
-  count = data.coder_workspace.me.start_count
+  count = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
```

This needs to be changed as well
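Putting the suggestions in this review thread together, the Tasks reference example could look like the following sketch. The gating expression and `[count.index]` indexing come from the comments above; other module inputs are elided:

```tf
module "codex" {
  # Only instantiate the module when the workspace runs as a Task.
  count  = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
  source = "registry.coder.com/coder-labs/codex/coder"
  # ... other module inputs ...
}

resource "coder_ai_task" "task" {
  # Mirror the module's count so the indices line up one-to-one.
  count  = data.coder_task.me.enabled ? data.coder_workspace.me.start_count : 0
  app_id = module.codex[count.index].task_app_id
}
```

Because both `count` expressions are identical, `module.codex[count.index]` is always a valid reference whenever the task resource exists.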


Development

Successfully merging this pull request may close these issues.

Add AI Bridge Integration to Codex Module
