@@ -509,9 +509,12 @@ When `smartExtraction` is enabled (default: `true`), the plugin uses an LLM to i
 | Field | Type | Default | Description |
 | ----- | ---- | ------- | ----------- |
 | `smartExtraction` | boolean | `true` | Enable/disable LLM-powered 6-category extraction |
+| `llm.auth` | string | `api-key` | `api-key` uses `llm.apiKey` / `embedding.apiKey`; `oauth` uses a project-scoped OAuth token file |
 | `llm.apiKey` | string | *(falls back to `embedding.apiKey`)* | API key for the LLM provider |
 | `llm.model` | string | `openai/gpt-oss-120b` | LLM model name |
 | `llm.baseURL` | string | *(falls back to `embedding.baseURL`)* | LLM API endpoint |
+| `llm.oauthProvider` | string | `openai-codex` | OAuth provider id used when `llm.auth` is `oauth` |
+| `llm.oauthPath` | string | `.memory-lancedb-pro/oauth.json` | Project-scoped OAuth token file used when `llm.auth` is `oauth` |
 | `extractMinMessages` | number | `2` | Minimum messages before extraction triggers |
 | `extractMaxChars` | number | `8000` | Maximum characters sent to the LLM |

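The fallback rules in the table above (`llm.apiKey` and `llm.baseURL` falling back to the `embedding.*` values) can be sketched as follows. This is a hypothetical helper for illustration only, not the plugin's actual internals:

```python
def resolve_llm_settings(config: dict) -> dict:
    """Sketch of the fallback rules from the config table above.

    Hypothetical helper: llm.apiKey / llm.baseURL fall back to the
    embedding.* values, and llm.model defaults to openai/gpt-oss-120b.
    """
    llm = config.get("llm", {})
    embedding = config.get("embedding", {})
    return {
        "apiKey": llm.get("apiKey") or embedding.get("apiKey"),
        "baseURL": llm.get("baseURL") or embedding.get("baseURL"),
        "model": llm.get("model", "openai/gpt-oss-120b"),
    }


# With only embedding credentials set, the LLM inherits them:
cfg = {"embedding": {"apiKey": "sk-embed", "baseURL": "https://api.example.com/v1"}}
print(resolve_llm_settings(cfg))
```

An explicit `llm.apiKey` always wins over the embedding fallback.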
@@ -534,6 +537,21 @@ Full config (separate LLM endpoint):
 }
 ```

+OAuth `llm` config (use the existing Codex / ChatGPT login cache for LLM calls):
+
+```json
+{
+  "llm": { "auth": "oauth", "oauthProvider": "openai-codex", "model": "gpt-5.4", "oauthPath": ".memory-lancedb-pro/oauth.json" }
+}
+```
+
+Notes for `llm.auth: "oauth"`:
+
+- `llm.oauthProvider` currently supports only `openai-codex`.
+- OAuth tokens are project-scoped by default and live in `.memory-lancedb-pro/oauth.json`.
+- Set `llm.oauthPath` to store that file somewhere else inside the project.
+- In `oauth` mode, leave `llm.baseURL` unset unless you intentionally target a custom ChatGPT/Codex-compatible backend.
+- Project scoping keeps token rotation and revocation local to the project instead of sharing `~/.codex/auth.json` across unrelated workspaces.
+
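The project-scoped path behavior described above can be sketched as: a relative `llm.oauthPath` resolves inside the project root, while an absolute path is used as-is. This is a hypothetical illustration, not the plugin's actual resolution code:

```python
from pathlib import Path


def resolve_oauth_path(project_root: str,
                       oauth_path: str = ".memory-lancedb-pro/oauth.json") -> Path:
    """Hypothetical sketch: resolve llm.oauthPath against the project root.

    Relative paths stay inside the project, so each workspace keeps its
    own token file rather than sharing a global one like ~/.codex/auth.json.
    """
    p = Path(oauth_path)
    return p if p.is_absolute() else Path(project_root) / p


print(resolve_oauth_path("/home/user/myproject"))
```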
 Disable: `{ "smartExtraction": false }`

 </details>
@@ -748,6 +766,9 @@ The **agent workspace** is the agent's working directory (default: `~/.openclaw/
 openclaw memory-pro list [--scope global] [--category fact] [--limit 20] [--json]
 openclaw memory-pro search "query" [--scope global] [--limit 10] [--json]
 openclaw memory-pro stats [--scope global] [--json]
+openclaw memory-pro auth login [--provider openai-codex] [--model gpt-5.4] [--oauth-path /abs/path/oauth.json]
+openclaw memory-pro auth status
+openclaw memory-pro auth logout
 openclaw memory-pro delete <id>
 openclaw memory-pro delete-bulk --scope global [--before 2025-01-01] [--dry-run]
 openclaw memory-pro export [--scope global] [--output memories.json]
@@ -757,6 +778,13 @@ openclaw memory-pro upgrade [--dry-run] [--batch-size 10] [--no-llm] [--limit N]
 openclaw memory-pro migrate check [--source /path]
 openclaw memory-pro migrate run [--source /path] [--dry-run] [--skip-existing]
 openclaw memory-pro migrate verify [--source /path]
 ```
+
+OAuth login flow:
+
+1. Run `openclaw memory-pro auth login`
+2. If `--provider` is omitted in an interactive terminal, the CLI shows an OAuth provider picker before opening the browser
+3. The command prints an authorization URL and opens your browser unless `--no-browser` is set
+4. After the callback succeeds, the command saves a project OAuth file and replaces the plugin `llm` config with OAuth settings (`auth`, `oauthProvider`, `model`, `oauthPath`)
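The final step of the login flow replaces the plugin's `llm` config with OAuth settings. A minimal sketch of that merge, using hypothetical names (the CLI's actual implementation may differ):

```python
def apply_oauth_login(plugin_config: dict,
                      provider: str = "openai-codex",
                      model: str = "gpt-5.4",
                      oauth_path: str = ".memory-lancedb-pro/oauth.json") -> dict:
    """Hypothetical sketch of the config rewrite done by `auth login`.

    Drops the api-key credential and writes the four OAuth fields
    (auth, oauthProvider, model, oauthPath) into the `llm` section.
    """
    llm = dict(plugin_config.get("llm", {}))
    llm.pop("apiKey", None)  # OAuth mode does not use the API key
    llm.update({
        "auth": "oauth",
        "oauthProvider": provider,
        "model": model,
        "oauthPath": oauth_path,
    })
    return {**plugin_config, "llm": llm}


cfg = {"llm": {"auth": "api-key", "apiKey": "sk-old", "model": "openai/gpt-oss-120b"}}
print(apply_oauth_login(cfg)["llm"])
```

The merge returns a new config rather than mutating the input, so a failed login can leave the on-disk config untouched.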

 ---