feat(skills): add codex-session-reader skill #30
Co-authored-by: OpenAI Codex <codex@openai.com>
Important: Review skipped. This PR was authored by the user configured for CodeRabbit reviews; CodeRabbit does not review PRs authored by this user. It's recommended to use a dedicated user account to post CodeRabbit review feedback.
📝 Walkthrough

Adds a new read-only "codex-session-reader" skill: documentation (README and SKILL.md) and a Python CLI that uses a synchronous JSON-RPC client to fetch and render a single Codex thread via a codex app-server, with turn slicing, validation, and Typer/Rich CLI entrypoints.
Sequence Diagram

```mermaid
sequenceDiagram
    participant User as CLI User
    participant CLI as codex_session_reader.py
    participant Client as CodexAppServerClient
    participant Server as Codex App Server
    User->>CLI: invoke read_thread_command(thread_id, opts)
    CLI->>CLI: validate_thread_id()
    CLI->>Client: start codex app-server / __enter__()
    Client->>Server: initialize (JSON-RPC)
    Server-->>Client: initialize response
    CLI->>Client: request thread/read (JSON-RPC)
    Server-->>Client: thread/read response
    Client-->>CLI: ThreadReadResponse
    CLI->>CLI: select_turns_by_expr()
    CLI->>CLI: render_thread_markdown() or serialize JSON
    CLI->>User: emit_output (stdout)
    CLI->>Client: __exit__() / cleanup
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 3 passed
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 4
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: bbd0c2a2-0afa-45dd-bc0e-7c35c95cb537
📒 Files selected for processing (4)
- README.md
- skills/codex-session-reader/README.md
- skills/codex-session-reader/SKILL.md
- skills/codex-session-reader/scripts/codex_session_reader.py
Actionable comments posted: 4
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 6e250414-5853-422f-bcd5-06e10809aa92
📒 Files selected for processing (1)
skills/codex-session-reader/scripts/codex_session_reader.py
Actionable comments posted: 1
🧹 Nitpick comments (1)
skills/codex-session-reader/scripts/codex_session_reader.py (1)
735-777: Avoid duplicate turn-slice work between JSON and Markdown paths.
`selected_turns` and `effective_result` are computed even when `--format markdown`, then slicing is done again in `render_thread_markdown`. Consider moving the pre-slice/model-copy logic inside the JSON branch only.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 3e167a0b-3882-4d3b-993a-e4752b640c45
📒 Files selected for processing (1)
skills/codex-session-reader/scripts/codex_session_reader.py
Actionable comments posted: 1
🧹 Nitpick comments (1)
skills/codex-session-reader/scripts/codex_session_reader.py (1)
256-268: Reset per-session stream buffers on `__enter__` for safe client reuse.

Instance-scoped `_stdout_queue`/`_stderr_lines` persist across runs. A stale EOF sentinel from a previous session can poison the next request if the same client object is reused.

Proposed fix:

```diff
         self._process = process
+        self._stderr_lines.clear()
+        self._stdout_queue = Queue()
         self._stdout_thread = threading.Thread(
             target=self._drain_stdout,
             name="codex-session-reader-stdout",
             daemon=True,
         )
```
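The failure mode can be reproduced in isolation. In this sketch, `FakeClient` is a hypothetical stand-in that models only the queue lifecycle, not the real `CodexAppServerClient`: a sentinel left by one session would be the first item the next session dequeues unless `__enter__` rebuilds the queue.

```python
from queue import Queue

EOF = object()  # sentinel a reader thread enqueues when the pipe closes

class FakeClient:
    """Minimal stand-in for the real client; models only the queue lifecycle."""

    def __init__(self) -> None:
        self._stdout_queue: Queue = Queue()

    def __enter__(self) -> "FakeClient":
        # The proposed fix: rebuild the queue so no stale EOF sentinel
        # from a previous session survives into this one.
        self._stdout_queue = Queue()
        return self

    def __exit__(self, *exc: object) -> None:
        # Simulate the reader thread signalling EOF at teardown.
        self._stdout_queue.put(EOF)

client = FakeClient()
with client:
    pass
# Teardown left an EOF sentinel behind...
assert not client._stdout_queue.empty()
with client:
    # ...but re-entering resets the buffer, so reuse is safe.
    assert client._stdout_queue.empty()
```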
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: f572e2fc-bd14-47a3-9b6d-7198041330ea
📒 Files selected for processing (1)
skills/codex-session-reader/scripts/codex_session_reader.py
Actionable comments posted: 1
🧹 Nitpick comments (2)
skills/codex-session-reader/scripts/codex_session_reader.py (2)
278-292: Close `stdout` and `stderr` during teardown as well.

`__exit__()` only closes `stdin`. Because `self._process` stays referenced on the client, the parent-side `stdout`/`stderr` pipe objects can remain open until GC, which will accumulate file descriptors in long-lived processes or repeated tests. Close both streams and clear `self._process` after teardown.

Proposed fix:

```diff
     def __exit__(self, exc_type: object, exc: object, tb: object) -> None:
         """关闭 app-server 子进程。"""
         if self._process is None:
             return
-        if self._process.stdin:
-            self._process.stdin.close()
-        if self._process.poll() is None:
-            self._process.terminate()
-            try:
-                self._process.wait(timeout=3)
-            except subprocess.TimeoutExpired:
-                self._process.kill()
-                self._process.wait(timeout=3)
+        process = self._process
+        try:
+            if process.stdin:
+                process.stdin.close()
+            if process.poll() is None:
+                process.terminate()
+                try:
+                    process.wait(timeout=3)
+                except subprocess.TimeoutExpired:
+                    process.kill()
+                    process.wait(timeout=3)
+        finally:
+            if process.stdout:
+                process.stdout.close()
+            if process.stderr:
+                process.stderr.close()
+            self._process = None
```
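As a standalone, runnable version of the same teardown pattern (assuming a plain `subprocess.Popen` child; `close_pipes` is a hypothetical helper, not the skill's API):

```python
import subprocess
import sys

def close_pipes(process: subprocess.Popen) -> None:
    """Terminate the child and close every parent-side pipe deterministically."""
    try:
        if process.stdin:
            process.stdin.close()
        if process.poll() is None:
            process.terminate()
            try:
                process.wait(timeout=3)
            except subprocess.TimeoutExpired:
                process.kill()
                process.wait(timeout=3)
    finally:
        # Without these, the parent-side file descriptors linger until GC.
        if process.stdout:
            process.stdout.close()
        if process.stderr:
            process.stderr.close()

proc = subprocess.Popen(
    [sys.executable, "-c", "pass"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
)
close_pipes(proc)
assert proc.stdout.closed and proc.stderr.closed
```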
411-417: Make `notify()` serialize models the same way as `request()`.

`request()` accepts `AppModel` and dumps it before encoding, but `notify()` forwards `params` verbatim. Passing a model here will fail later with a raw `TypeError` from `json.dumps`, so the public client API is currently inconsistent. Either mirror the `model_dump()` branch or narrow the signature to `dict[str, Any] | None`.

Proposed fix:

```diff
-    def notify(self, method: str, params: Any | None) -> None:
+    def notify(
+        self, method: str, params: AppModel | dict[str, Any] | None
+    ) -> None:
         """发送 JSON-RPC 通知。"""
         payload: dict[str, Any] = {"method": method}
         if params is not None:
-            payload["params"] = params
+            payload["params"] = (
+                params.model_dump(by_alias=True, exclude_none=True)
+                if isinstance(params, AppModel)
+                else params
+            )
         self._send_json(payload)
```
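The dispatch the review asks for can be sketched independently. Here `FakeModel` is a hypothetical stand-in for the pydantic-style `AppModel`, and `encode_params` is illustrative rather than the client's real method:

```python
import json
from typing import Any

class FakeModel:
    """Stand-in for a pydantic-style AppModel with a model_dump() method."""

    def __init__(self, thread_id: str) -> None:
        self.thread_id = thread_id

    def model_dump(self, by_alias: bool = True, exclude_none: bool = True) -> dict:
        return {"threadId": self.thread_id}

def encode_params(params: Any) -> Any:
    # Mirror request(): dump models, pass plain dicts through unchanged.
    if hasattr(params, "model_dump"):
        return params.model_dump(by_alias=True, exclude_none=True)
    return params

assert encode_params(FakeModel("t1")) == {"threadId": "t1"}
assert encode_params({"cursor": None}) == {"cursor": None}
json.dumps(encode_params(FakeModel("t1")))  # now JSON-safe for both shapes
```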
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: ab827f5b-8d90-403b-bbdc-96911f5ede28
📒 Files selected for processing (1)
skills/codex-session-reader/scripts/codex_session_reader.py
Why

Reading data through the official `codex app-server` interface avoids coupling to the underlying local storage format and lowers future maintenance cost.

What

- Add the `codex-session-reader` skill, which provides a `read` subcommand that reads a single Codex thread via `codex app-server`.
- Add `--preview-only` and a 0-based, Python-slice-style `--turns` option for controlling the output range.
- Update the skills table in `README.md`, adding `codex-session-reader` and the previously missing `coderabbit-cli` entry.

Testing

- `uvx ruff check skills/codex-session-reader/scripts/codex_session_reader.py`
- `./skills/codex-session-reader/scripts/codex_session_reader.py --help`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read --help`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --preview-only`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --turns :5`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --turns -5:`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --turns 10:-1`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --turns 13`
- `./skills/codex-session-reader/scripts/codex_session_reader.py read <thread-id> --turns 1:10:2` (confirms it errors)
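The `--turns` semantics exercised above (0-based, Python-slice-like, step rejected) can be sketched as a small parser. `parse_turns` is a hypothetical helper; the script's actual implementation may differ.

```python
def parse_turns(expr: str) -> slice:
    """Parse '--turns' expressions such as ':5', '-5:', '10:-1', or '13'.

    Indices are 0-based; a step component (e.g. '1:10:2') is rejected,
    matching the error confirmed in testing.
    """
    parts = expr.split(":")
    if len(parts) > 2:
        raise ValueError(f"step is not supported in --turns: {expr!r}")
    if len(parts) == 1:
        idx = int(parts[0])  # a bare index selects a single turn
        return slice(idx, idx + 1 if idx != -1 else None)
    start, stop = (int(p) if p else None for p in parts)
    return slice(start, stop)

turns = list(range(20))
assert turns[parse_turns(":5")] == [0, 1, 2, 3, 4]
assert turns[parse_turns("-5:")] == [15, 16, 17, 18, 19]
assert turns[parse_turns("13")] == [13]
```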