Replies: 1 comment
- In the introductory post for Codex, a context window of 192k tokens is mentioned for testing (source).
- For example, ChatGPT Plus is normally limited to a context window of 32k tokens. GPT-5 natively supports a 400k-token context window, though, which is especially useful while coding. So which limit applies for Codex on a ChatGPT Plus or Pro plan, compared to access via an API key?
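
  In case it helps to sanity-check how close a given prompt gets to each of those limits, here is a rough Python sketch using tiktoken. The choice of the `o200k_base` encoding and the example file path are assumptions on my part, and the actual limit Codex enforces per plan is exactly what this question is asking about.

  ```python
  # Rough sketch: compare a prompt's token count against candidate context limits.
  # Assumption: tiktoken's "o200k_base" encoding approximates the model's tokenizer;
  # the real limit enforced by Codex on a given plan may differ.
  import tiktoken

  CONTEXT_LIMITS = {
      "ChatGPT Plus (32k)": 32_000,
      "Codex intro post (192k)": 192_000,
      "GPT-5 native (400k)": 400_000,
  }

  def check_prompt(prompt: str) -> None:
      enc = tiktoken.get_encoding("o200k_base")
      n_tokens = len(enc.encode(prompt))
      print(f"Prompt is ~{n_tokens} tokens")
      for name, limit in CONTEXT_LIMITS.items():
          status = "fits within" if n_tokens <= limit else "exceeds"
          print(f"  {status} {name}")

  if __name__ == "__main__":
      # "repo_context.txt" is a hypothetical dump of the code you would feed to Codex.
      with open("repo_context.txt") as f:
          check_prompt(f.read())
  ```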