Allow using OpenAI Pro and Plus plans as an API provider #6990
tholum
started this conversation in
Feature Requests
Replies: 2 comments
-
As far as I understand, this is against the TOS of OpenAI and Codex. It is similar to Gemini CLI.
-
Yup! Waiting for this feature too, really need it.
-
Codex is open source, and it supports using your Pro and Plus plans as the API provider; this should help keep API pricing down, especially now that ChatGPT models are usable for coding.
Current plan (yes, AI-generated)
Proposal: Add “Sign in with ChatGPT (Plus/Pro)” authentication for the OpenAI provider
This document outlines a compliant, testable plan to let users authenticate Roo Code with their ChatGPT Plus/Pro account instead of manually pasting an API key. It follows project policies in `CONTRIBUTING.md`, `CODE_OF_CONDUCT.md`, and `SECURITY.md`, and integrates cleanly with the existing OpenAI provider implementation described in `README.md`.

Why
Scope (MVP)
- New provider auth mode: `"apiKey" | "chatgpt"`.
- OAuth sign-in with a local callback at `http://127.0.0.1:<port>/auth/callback`; exchange the code for tokens, then perform a token-exchange to retrieve an "openai-api-key".
- Store tokens in `SecretStorage`; wire the OpenAI provider to read from SecretStorage when `authMode === "chatgpt"`.
- Out of scope (future): enterprise SSO variants; multi-account switching UI.
High-level design
UX additions
- Roo: Sign in with ChatGPT (OpenAI)
- Roo: Sign out ChatGPT (OpenAI)
- Roo: Import OpenAI credentials from Codex CLI
- Roo: Paste Codex auth.json
Data model (SecretStorage keys)
- `roo.openai.chatgpt.apiKey`: exchanged OpenAI API key (Bearer token)
- `roo.openai.chatgpt.idToken`: OAuth ID token (JWT)
- `roo.openai.chatgpt.refreshToken`: OAuth refresh token
- `roo.openai.chatgpt.lastRefreshIso`: ISO timestamp

Configuration
- `openAi.authMode`: `"apiKey" | "chatgpt"` (default remains `"apiKey"`)

Request path
Reuse the `src/api/providers/openai*.ts` request logic. When `authMode === "chatgpt"`, we read the API key from SecretStorage and pass it as usual.

API endpoints used (explicit)
- Responses API (e.g. `codex-mini-latest`): `POST https://api.openai.com/v1/responses` with `Authorization: Bearer <apiKey>`, `Content-Type: application/json`, `Accept: text/event-stream` (for streaming)
- Chat Completions: `POST https://api.openai.com/v1/chat/completions` with `Authorization: Bearer <apiKey>`, `Content-Type: application/json` (SDK handles streaming headers)
- Base URLs: `openAiNativeBaseUrl` (default `https://api.openai.com`) for the `openai-native` handler; `openAiBaseUrl` (default `https://api.openai.com/v1`) for the `openai` handler.
- Optional headers: `OpenAI-Organization`, `OpenAI-Project`.
- Note on `https://chatgpt.com/backend-api/codex`: the "drop-in" behavior is limited to the authentication flow and token-exchange semantics.

OAuth + token-exchange flow (mirrors Codex CLI)
Start local server
- Bind `127.0.0.1` on port 1455 by default; if occupied, pick a random free port.
- Generate `state` and a PKCE `code_verifier`/`code_challenge` (S256).
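Under the assumption of standard RFC 7636 PKCE (S256), the verifier/challenge step above might be sketched in Node as follows; `generatePkce` and `challengeFor` are illustrative names, not existing Roo Code functions:

```typescript
import { randomBytes, createHash } from "node:crypto";

// S256 code_challenge for a given code_verifier (RFC 7636).
function challengeFor(verifier: string): string {
  return createHash("sha256").update(verifier).digest("base64url");
}

// Generate a PKCE code_verifier/code_challenge pair plus a random
// `state` value for CSRF protection of the callback.
function generatePkce(): { verifier: string; challenge: string; state: string } {
  const verifier = randomBytes(32).toString("base64url"); // 43-char base64url string
  return {
    verifier,
    challenge: challengeFor(verifier),
    state: randomBytes(16).toString("base64url"),
  };
}
```

The verifier and `state` would live only for the duration of the sign-in flow; only the resulting tokens are persisted.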
Open browser to authorization URL
`https://auth.openai.com/oauth/authorize` with:
- `response_type=code`
- `client_id=app_EMoamEEZ73f0CkXaXp7hrann`
- `redirect_uri=http://localhost:1455/auth/callback`
- `scope=openid profile email offline_access`
- `code_challenge` + `code_challenge_method=S256`
- `id_token_add_organizations=true`
- `codex_cli_simplified_flow=true`
- `state`
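Assembling that authorization URL is mechanical; the sketch below uses exactly the parameters listed above (`buildAuthorizeUrl` is an illustrative name):

```typescript
// Build the Codex-compatible authorization URL. `challenge` and `state`
// come from the PKCE step; `port` is the local callback port.
function buildAuthorizeUrl(challenge: string, state: string, port = 1455): string {
  const url = new URL("https://auth.openai.com/oauth/authorize");
  const params: Record<string, string> = {
    response_type: "code",
    client_id: "app_EMoamEEZ73f0CkXaXp7hrann",
    redirect_uri: `http://localhost:${port}/auth/callback`,
    scope: "openid profile email offline_access",
    code_challenge: challenge,
    code_challenge_method: "S256",
    id_token_add_organizations: "true",
    codex_cli_simplified_flow: "true",
    state,
  };
  for (const [k, v] of Object.entries(params)) url.searchParams.set(k, v);
  return url.toString();
}
```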
Handle callback on /auth/callback
- Validate `state`.
- `POST https://auth.openai.com/oauth/token` with `grant_type=authorization_code`, `code`, `redirect_uri`, `client_id`, `code_verifier`.
- Store `id_token`, `access_token`, `refresh_token` in SecretStorage.

Token exchange → API key
If the `id_token` claims include organization/project (or allowed for personal), request an API key via token-exchange:
- `grant_type=urn:ietf:params:oauth:grant-type:token-exchange`
- `requested_token=openai-api-key`
- `subject_token=<id_token>`
- `subject_token_type=urn:ietf:params:oauth:token-type:id_token`
- `client_id=<RooClientId>`
Store the result as `roo.openai.chatgpt.apiKey` in SecretStorage.
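Assuming a standard form-encoded OAuth token endpoint, the two request bodies (code exchange, then the RFC 8693-style token-exchange) might be built as below; function names are illustrative and this has not been validated against the live endpoint:

```typescript
const CLIENT_ID = "app_EMoamEEZ73f0CkXaXp7hrann";

// Body for exchanging the authorization code for id/access/refresh tokens.
function codeExchangeBody(code: string, verifier: string, port = 1455): URLSearchParams {
  return new URLSearchParams({
    grant_type: "authorization_code",
    code,
    redirect_uri: `http://localhost:${port}/auth/callback`,
    client_id: CLIENT_ID,
    code_verifier: verifier,
  });
}

// Body for the token-exchange that yields the "openai-api-key".
function apiKeyExchangeBody(idToken: string): URLSearchParams {
  return new URLSearchParams({
    grant_type: "urn:ietf:params:oauth:grant-type:token-exchange",
    requested_token: "openai-api-key",
    subject_token: idToken,
    subject_token_type: "urn:ietf:params:oauth:token-type:id_token",
    client_id: CLIENT_ID,
  });
}
```

Both would be POSTed to `https://auth.openai.com/oauth/token` with `Content-Type: application/x-www-form-urlencoded`, e.g. `fetch(tokenUrl, { method: "POST", body: codeExchangeBody(code, verifier) })`.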
Optional: complimentary credit redemption (best-effort)
Call `https://api.openai.com/v1/billing/redeem_credits` with the `id_token` when Plus/Pro and eligible. Errors are logged as warnings only.

Finish
Refresh policy
On use, check whether the `id_token` is expiring (or older than ~28 days). If so, refresh via `POST /oauth/token` with `grant_type=refresh_token`, rotate tokens, and optionally re-run token-exchange to rotate the API key if required. Update `lastRefreshIso`.
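The staleness check could be a small pure function over the JWT `exp` claim and the stored `lastRefreshIso`; a sketch, with `shouldRefresh` and the 5-minute early-refresh skew as illustrative choices:

```typescript
// Decide whether tokens should be refreshed: the id_token is near expiry,
// unreadable, or the last refresh happened more than ~28 days ago.
function shouldRefresh(idToken: string, lastRefreshIso: string, now: Date = new Date()): boolean {
  const TWENTY_EIGHT_DAYS_MS = 28 * 24 * 60 * 60 * 1000;
  const SKEW_MS = 5 * 60 * 1000; // refresh 5 minutes before actual expiry

  // Decode the JWT payload (second dot-separated segment, base64url JSON).
  const payloadB64 = idToken.split(".")[1] ?? "";
  let exp = 0;
  try {
    exp = JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8")).exp ?? 0;
  } catch {
    return true; // unreadable token: force a refresh
  }
  const expiring = exp * 1000 - SKEW_MS <= now.getTime();
  const stale = now.getTime() - Date.parse(lastRefreshIso) > TWENTY_EIGHT_DAYS_MS;
  return expiring || stale;
}
```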
Headless/remote support
Use `ssh -L 1455:localhost:1455 <host>`, or copy/paste the printed URL to a local browser.

Implementation plan (issue-first, small PRs)
Provider wiring (small PR)
- Add the `openAi.authMode` config and read `SecretStorage` for `chatgpt` mode.
- No behavior change in `apiKey` mode.
UI actions (small PR)
OAuth helper (medium PR)
- Local callback server; open the browser via `env.openExternal`.

Token-exchange + storage (medium PR)
- Perform the token-exchange and store `roo.openai.chatgpt.apiKey`.

Refresh + best-effort credit redemption (small PR)
Import from Codex CLI (small PR)
- If `~/.codex/auth.json` exists, parse it and import `OPENAI_API_KEY` and tokens into SecretStorage (user confirmation required).
- Alternatively, let the user paste the contents of `auth.json`. Parse in-memory and discard the raw text after success.
- Require either `OPENAI_API_KEY` or `tokens.access_token`, and a `tokens.id_token`.
- Mapping:
  - `OPENAI_API_KEY` → `roo.openai.chatgpt.apiKey` (if present)
  - `tokens.id_token` → `roo.openai.chatgpt.idToken`
  - `tokens.refresh_token` → `roo.openai.chatgpt.refreshToken`
  - `last_refresh` → `roo.openai.chatgpt.lastRefreshIso` (if present)
- Set `openAi.authMode` to `"chatgpt"` and update status.

Tests & docs (small PR)
Rollout
Security & privacy
- Store secrets in `vscode.SecretStorage`; never log raw tokens or keys.
- Validate `state`; use PKCE S256.
- Follow `SECURITY.md` for responsible handling and disclosure.

Community/process compliance
- Follow `CONTRIBUTING.md`.
- Follow `CODE_OF_CONDUCT.md`.

Compatibility constraints (drop-in replacement for Codex CLI)
- Use the same OAuth client ID: `app_EMoamEEZ73f0CkXaXp7hrann`.
- Use port `1455` and the redirect URI `http://localhost:1455/auth/callback`.
- Send `id_token_add_organizations=true` and `codex_cli_simplified_flow=true`.
- Request `openai-api-key` via token-exchange.

Risks and mitigations