Commit 515b633
feat: add support for login with ChatGPT (openai#1212)
This does not implement the full Login with ChatGPT experience, but it should unblock people.

**What works**

* The `codex` multitool now has a `login` subcommand, so you can run `codex login`, which should write `CODEX_HOME/auth.json` if you complete the flow successfully. The TUI will now read the `OPENAI_API_KEY` from `auth.json`.
* The TUI should refresh the token if it has expired and the necessary information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex login` if both (1) your model provider expects to use `OPENAI_API_KEY` as its env var, and (2) `OPENAI_API_KEY` is not set.

**What does not work**

* The `LoginScreen` does not support the login flow from within the TUI. Instead, it tells you to quit, run `codex login`, and then run `codex` again.
* `codex exec` does not yet read from `auth.json`, nor does it direct the user to go through the login flow if `OPENAI_API_KEY` cannot be found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not been ported from TypeScript to `login_with_chatgpt.py` yet: https://github.com/openai/codex/blob/a67a67f3258fc21e147b6786a143fe3e15e6d5ba/codex-cli/src/utils/get-api-key.tsx#L84-L89

**Implementation**

Currently, the OAuth flow requires running a local webserver on `127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost of a webserver dependency in the Rust CLI just to support login, so instead we implement this logic in Python, as Python has an `http.server` module in its standard library. Specifically, we bundle the contents of a single Python file as a string in the Rust CLI and then use it to spawn a subprocess as `python3 -c {{SOURCE_FOR_PYTHON_SERVER}}`. As such, the most significant files in this PR are:

```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```

Now that the CLI may load `OPENAI_API_KEY` from the environment _or_ `CODEX_HOME/auth.json`, we need a new abstraction for reading/writing this variable, so we introduce:

```
codex-rs/core/src/openai_api_key.rs
```

Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024, so we use a `LazyLock<RwLock<Option<String>>>` to store `OPENAI_API_KEY` so it is read in a thread-safe manner.

Ultimately, it should be possible to go through the entire login flow from the TUI. This PR introduces a placeholder `LoginScreen` UI for now, though the new `codex login` subcommand introduced in this PR should be a viable workaround until the UI is ready.

**Testing**

Because the login flow is currently implemented in a standalone Python file, you can test it without building any Rust code as follows:

```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```

For reference:

* the original TypeScript implementation was introduced in openai#963
* support for redeeming credits was later added in openai#974
1 parent a67a67f commit 515b633

File tree

18 files changed: +1051 -25 lines changed

codex-cli/src/utils/get-api-key.tsx

Lines changed: 2 additions & 0 deletions
```diff
@@ -382,6 +382,8 @@ async function handleCallback(
 
   const exchanged = (await exchangeRes.json()) as {
     access_token: string;
+    // NOTE(mbolin): I did not see the "key" property set in practice. Note
+    // this property is not read by the code.
     key: string;
   };
```

codex-rs/Cargo.lock

Lines changed: 14 additions & 0 deletions
Some generated files are not rendered by default.

codex-rs/Cargo.toml

Lines changed: 1 addition & 0 deletions
```diff
@@ -9,6 +9,7 @@ members = [
     "exec",
     "execpolicy",
     "linux-sandbox",
+    "login",
     "mcp-client",
     "mcp-server",
     "mcp-types",
```

codex-rs/cli/Cargo.toml

Lines changed: 1 addition & 0 deletions
```diff
@@ -20,6 +20,7 @@ clap = { version = "4", features = ["derive"] }
 codex-core = { path = "../core" }
 codex-common = { path = "../common", features = ["cli"] }
 codex-exec = { path = "../exec" }
+codex-login = { path = "../login" }
 codex-linux-sandbox = { path = "../linux-sandbox" }
 codex-mcp-server = { path = "../mcp-server" }
 codex-tui = { path = "../tui" }
```

codex-rs/cli/src/lib.rs

Lines changed: 1 addition & 0 deletions
```diff
@@ -1,5 +1,6 @@
 pub mod debug_sandbox;
 mod exit_status;
+pub mod login;
 pub mod proto;
 
 use clap::Parser;
```

codex-rs/cli/src/login.rs

Lines changed: 35 additions & 0 deletions
```diff
@@ -0,0 +1,35 @@
+use codex_common::CliConfigOverrides;
+use codex_core::config::Config;
+use codex_core::config::ConfigOverrides;
+use codex_login::login_with_chatgpt;
+
+pub async fn run_login_with_chatgpt(cli_config_overrides: CliConfigOverrides) -> ! {
+    let cli_overrides = match cli_config_overrides.parse_overrides() {
+        Ok(v) => v,
+        Err(e) => {
+            eprintln!("Error parsing -c overrides: {e}");
+            std::process::exit(1);
+        }
+    };
+
+    let config_overrides = ConfigOverrides::default();
+    let config = match Config::load_with_cli_overrides(cli_overrides, config_overrides) {
+        Ok(config) => config,
+        Err(e) => {
+            eprintln!("Error loading configuration: {e}");
+            std::process::exit(1);
+        }
+    };
+
+    let capture_output = false;
+    match login_with_chatgpt(&config.codex_home, capture_output).await {
+        Ok(_) => {
+            eprintln!("Successfully logged in");
+            std::process::exit(0);
+        }
+        Err(e) => {
+            eprintln!("Error logging in: {e}");
+            std::process::exit(1);
+        }
+    }
+}
```
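The `login_with_chatgpt` function invoked above lives in `codex-rs/login/src/lib.rs`, which is not rendered in this excerpt. Per the commit message, it bundles `login_with_chatgpt.py` as a string and runs it via `python3 -c`. A minimal sketch of that approach, with assumed names and error handling (the real entry point is async and also honors the `capture_output` flag):

```rust
use std::path::Path;
use std::process::Command;

// Bundle the Python helper into the binary as a string literal at compile time.
const SOURCE_FOR_PYTHON_SERVER: &str = include_str!("login_with_chatgpt.py");

// Sketch only: the real `login_with_chatgpt` is async and can capture the
// subprocess output; this blocking variant is illustrative.
pub fn login_with_chatgpt_blocking(codex_home: &Path) -> std::io::Result<()> {
    // Run the bundled script as `python3 -c <source>`. The script serves the
    // OAuth callback on 127.0.0.1:1455 and writes CODEX_HOME/auth.json on success.
    let status = Command::new("python3")
        .arg("-c")
        .arg(SOURCE_FOR_PYTHON_SERVER)
        .env("CODEX_HOME", codex_home)
        .status()?;
    if status.success() {
        Ok(())
    } else {
        Err(std::io::Error::other(format!(
            "login_with_chatgpt.py exited with {status}"
        )))
    }
}
```

Spawning `python3` keeps the Rust binary free of an HTTP-server dependency, at the cost of requiring a Python 3 interpreter on the user's `PATH` at login time.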

codex-rs/cli/src/main.rs

Lines changed: 12 additions & 1 deletion
```diff
@@ -1,6 +1,7 @@
 use clap::Parser;
 use codex_cli::LandlockCommand;
 use codex_cli::SeatbeltCommand;
+use codex_cli::login::run_login_with_chatgpt;
 use codex_cli::proto;
 use codex_common::CliConfigOverrides;
 use codex_exec::Cli as ExecCli;
@@ -36,6 +37,9 @@ enum Subcommand {
     #[clap(visible_alias = "e")]
     Exec(ExecCli),
 
+    /// Login with ChatGPT.
+    Login(LoginCommand),
+
     /// Experimental: run Codex as an MCP server.
     Mcp,
 
@@ -63,7 +67,10 @@ enum DebugCommand {
 }
 
 #[derive(Debug, Parser)]
-struct ReplProto {}
+struct LoginCommand {
+    #[clap(skip)]
+    config_overrides: CliConfigOverrides,
+}
 
 fn main() -> anyhow::Result<()> {
     codex_linux_sandbox::run_with_sandbox(|codex_linux_sandbox_exe| async move {
@@ -88,6 +95,10 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
         Some(Subcommand::Mcp) => {
             codex_mcp_server::run_main(codex_linux_sandbox_exe).await?;
         }
+        Some(Subcommand::Login(mut login_cli)) => {
+            prepend_config_flags(&mut login_cli.config_overrides, cli.config_overrides);
+            run_login_with_chatgpt(login_cli.config_overrides).await;
+        }
         Some(Subcommand::Proto(mut proto_cli)) => {
             prepend_config_flags(&mut proto_cli.config_overrides, cli.config_overrides);
             proto::run_main(proto_cli).await?;
```

codex-rs/core/Cargo.toml

Lines changed: 1 addition & 0 deletions
```diff
@@ -16,6 +16,7 @@ async-channel = "2.3.1"
 base64 = "0.21"
 bytes = "1.10.1"
 codex-apply-patch = { path = "../apply-patch" }
+codex-login = { path = "../login" }
 codex-mcp-client = { path = "../mcp-client" }
 dirs = "6"
 env-flags = "0.1.1"
```

codex-rs/core/src/lib.rs

Lines changed: 1 addition & 0 deletions
```diff
@@ -27,6 +27,7 @@ mod model_provider_info;
 pub use model_provider_info::ModelProviderInfo;
 pub use model_provider_info::WireApi;
 mod models;
+pub mod openai_api_key;
 mod openai_tools;
 mod project_doc;
 pub mod protocol;
```

codex-rs/core/src/model_provider_info.rs

Lines changed: 21 additions & 13 deletions
```diff
@@ -11,6 +11,7 @@ use std::collections::HashMap;
 use std::env::VarError;
 
 use crate::error::EnvVarError;
+use crate::openai_api_key::get_openai_api_key;
 
 /// Wire protocol that the provider speaks. Most third-party services only
 /// implement the classic OpenAI Chat Completions JSON schema, whereas OpenAI
@@ -52,20 +53,27 @@ impl ModelProviderInfo {
     /// cannot be found, returns an error.
     pub fn api_key(&self) -> crate::error::Result<Option<String>> {
         match &self.env_key {
-            Some(env_key) => std::env::var(env_key)
-                .and_then(|v| {
-                    if v.trim().is_empty() {
-                        Err(VarError::NotPresent)
-                    } else {
-                        Ok(Some(v))
-                    }
-                })
-                .map_err(|_| {
-                    crate::error::CodexErr::EnvVar(EnvVarError {
-                        var: env_key.clone(),
-                        instructions: self.env_key_instructions.clone(),
+            Some(env_key) => {
+                let env_value = if env_key == crate::openai_api_key::OPENAI_API_KEY_ENV_VAR {
+                    get_openai_api_key().map_or_else(|| Err(VarError::NotPresent), Ok)
+                } else {
+                    std::env::var(env_key)
+                };
+                env_value
+                    .and_then(|v| {
+                        if v.trim().is_empty() {
+                            Err(VarError::NotPresent)
+                        } else {
+                            Ok(Some(v))
+                        }
                     })
-                }),
+                    .map_err(|_| {
+                        crate::error::CodexErr::EnvVar(EnvVarError {
+                            var: env_key.clone(),
+                            instructions: self.env_key_instructions.clone(),
+                        })
+                    })
+            }
             None => Ok(None),
         }
     }
```
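The code above references `get_openai_api_key()` and `OPENAI_API_KEY_ENV_VAR` from the new `codex-rs/core/src/openai_api_key.rs` module, whose diff is not rendered in this excerpt. Based on the commit message's description of the `LazyLock<RwLock<Option<String>>>` approach, here is a sketch of what that module plausibly contains (`set_openai_api_key` is an assumed name, not confirmed by the diffs above):

```rust
use std::sync::LazyLock;
use std::sync::RwLock;

pub const OPENAI_API_KEY_ENV_VAR: &str = "OPENAI_API_KEY";

// Seeded from the process environment exactly once. Because
// `std::env::set_var()` is `unsafe` in Rust 2024, later updates (e.g. a key
// loaded or refreshed from CODEX_HOME/auth.json) go through the RwLock
// instead of mutating process-global environment state.
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> = LazyLock::new(|| {
    RwLock::new(
        std::env::var(OPENAI_API_KEY_ENV_VAR)
            .ok()
            .filter(|v| !v.trim().is_empty()),
    )
});

pub fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().expect("poisoned lock").clone()
}

// Assumed setter: something like this would let auth.json loading and token
// refresh install a key in a thread-safe way.
pub fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().expect("poisoned lock") = Some(value);
}
```

This keeps reads thread-safe and lets the TUI inject a refreshed token without touching the process environment.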
