
Conversation

dhofheinz

Summary

Resolves the confusion reported in #4728 about whether GPT-5-Codex has a 272k or 400k context window.

After analyzing the codebase (codex-rs/core/src/openai_model_info.rs), I confirmed that:

  • The model_context_window value represents input tokens (272,000 for GPT-5-Codex)
  • The model_max_output_tokens value represents output tokens (128,000)
  • The total token budget is 400,000 tokens (272k input + 128k output)
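In docs/config.md terms, the distinction above could be illustrated with a snippet like the following. This is a sketch, not a required configuration: the option names come from this PR, the values are GPT-5-Codex's built-in defaults, and users typically do not need to set them explicitly.

```toml
# model_context_window counts INPUT tokens only --
# this is the 272k figure that /status reports.
model_context_window = 272000

# model_max_output_tokens is a SEPARATE budget for output tokens;
# it is not carved out of the 272k input window.
model_max_output_tokens = 128000

# Total token budget: 272000 + 128000 = 400000,
# which matches the 400k figure in the platform docs.
```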

Changes

  • Updated docs/config.md to clarify that model_context_window refers to input tokens
  • Added an explanatory note showing the breakdown for GPT-5-Codex (272k + 128k = 400k)
  • Enhanced the model_max_output_tokens description to explain that output tokens are budgeted separately from input
  • Added a new FAQ entry explaining why /status shows 272k while platform docs say 400k
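The arithmetic behind the new FAQ entry can be sanity-checked in a few lines. Variable names here simply mirror the config options discussed above; this is an illustrative sketch, not code from the repository:

```python
# The value /status reports: input tokens only.
model_context_window = 272_000

# Reserved separately for the model's output.
model_max_output_tokens = 128_000

# The figure quoted in the platform docs is the sum of the two.
total_token_budget = model_context_window + model_max_output_tokens
print(total_token_budget)  # 400000
```

So both numbers are correct: 272k is the input window shown by /status, and 400k is the combined input + output budget from the platform docs.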

Testing

Documentation-only changes; no code was modified.

Fixes #4728


github-actions bot commented Oct 7, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@dhofheinz
Author

I have read the CLA Document and I hereby sign the CLA

Successfully merging this pull request may close #4728: Confused: 272k context window or 400k?