Replies: 4 comments 1 reply
-
In the introductory post for Codex, a context window of 192k tokens is mentioned for testing (source).
-
I am using Codex CLI and seeing this: 67,801 tokens used, 92% context left. Based on that (67,801 / 0.08), the window would be 847,512 tokens.
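A quick sketch of the arithmetic behind that estimate, assuming the displayed percentage is simply used/total:

```python
# Back-of-the-envelope estimate of the context window implied by the CLI
# readout, assuming "92% context left" means the 67,801 used tokens make up
# the remaining 8% of the window.
used_tokens = 67_801
percent_left = 92
percent_used = 100 - percent_left

implied_window = used_tokens * 100 / percent_used
print(f"Implied window: {implied_window:,.0f} tokens")  # Implied window: 847,512 tokens
```

The displayed percentage is almost certainly rounded, so the true window is probably a rounder number in that neighbourhood rather than exactly 847,512.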
-
Codex has a 272k-token context, but the usage shown is a misleading indicator, because the input figure in /status displays the same number. That isn't the actual input size, though: it's the count of used tokens, excluding the cached ones. So it's more of a cost metric than a true measure of context size.
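A minimal sketch, with entirely made-up numbers, of the distinction being drawn here between what such a usage counter tracks and what actually fills the window:

```python
# Hypothetical numbers only, to illustrate the point above: a usage counter
# that excludes cached tokens can look small even when the 272k window is
# mostly full.
CONTEXT_WINDOW = 272_000

cached_input = 150_000   # prompt tokens reused from the cache
fresh_input = 40_000     # newly processed (non-cached) input tokens

usage_counter = fresh_input                 # cost-like figure, excludes cached tokens
context_used = cached_input + fresh_input   # what actually occupies the window

print(f"usage counter: {usage_counter:,} tokens")
print(f"context used:  {context_used:,} / {CONTEXT_WINDOW:,} tokens "
      f"({context_used / CONTEXT_WINDOW:.0%})")
```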
-
Feels odd if it's 272k, given acli rovodev has GPT-4 with a 400k context. The only thing is it's labeled "gpt5", and god knows which GPT-5 that is.
-
For example, ChatGPT Plus is normally limited to a 32k-token context window, while GPT-5 natively supports 400k tokens, which is especially useful when coding. So which limit applies for Codex on a ChatGPT Plus or Pro plan (compared to using an API key)?