Commit c6fb72b
feat: improve codex chat completions format conversion and response handling
This commit enhances the Codex API implementation with several key improvements:
## Response Format Standardization
- Made `usage` field optional in `OpenAIChatCompletionResponse` to match OpenAI API spec
- Improved type safety by returning `OpenAIUsage` objects instead of raw dictionaries
- Added better null checking for response data to prevent crashes on malformed responses
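The bullets above could look roughly like the following sketch. The field and function names here are illustrative assumptions, not the project's actual identifiers; the point is the typed, optional `usage` and the defensive parsing:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OpenAIUsage:
    prompt_tokens: int = 0
    completion_tokens: int = 0
    total_tokens: int = 0

def parse_usage(payload: Optional[dict]) -> Optional[OpenAIUsage]:
    """Return a typed OpenAIUsage instead of a raw dict.

    `usage` is optional in the OpenAI spec, so a missing or
    malformed block yields None rather than a crash.
    """
    if not isinstance(payload, dict):
        return None
    raw = payload.get("usage")
    if not isinstance(raw, dict):
        return None
    return OpenAIUsage(
        prompt_tokens=int(raw.get("prompt_tokens") or 0),
        completion_tokens=int(raw.get("completion_tokens") or 0),
        total_tokens=int(raw.get("total_tokens") or 0),
    )
```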
## Reasoning Content Processing
- Simplified reasoning block handling to follow official OpenAI Response API format
- Removed complex signature tracking and custom XML wrapping
- Fixed reasoning content to stream properly without custom modifications
- Improved chunk processing with cleaner state management
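A minimal sketch of the simplified passthrough: reasoning deltas are yielded as-is rather than wrapped in custom XML or paired with tracked signatures. The event type string is an assumption about the Response API streaming event shape, not confirmed from this commit:

```python
from typing import Iterable, Iterator

def stream_reasoning_deltas(events: Iterable[dict]) -> Iterator[str]:
    """Pass reasoning text through unchanged instead of wrapping it
    in custom XML tags or tracking signatures.

    The "response.reasoning_text.delta" event name is an assumed
    Response API shape; adjust to the actual upstream events.
    """
    for event in events:
        if event.get("type") == "response.reasoning_text.delta":
            delta = event.get("delta")
            if isinstance(delta, str) and delta:
                yield delta
```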
## Streaming Response Improvements
- Enhanced streaming response detection with better header checking
- Fixed usage data serialization in streaming chunks using proper model dumping
- Improved error handling for edge cases in streaming responses
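One plausible reading of these two fixes, sketched with pydantic (which the "model dumping" wording suggests the project uses); the helper names are hypothetical:

```python
import json
from typing import Optional
from pydantic import BaseModel

class OpenAIUsage(BaseModel):
    prompt_tokens: int = 0
    completion_tokens: int = 0
    total_tokens: int = 0

def is_streaming_response(headers: dict) -> bool:
    """Detect a streaming upstream response from its Content-Type
    header (case-insensitively) rather than guessing from the body."""
    for key, value in headers.items():
        if key.lower() == "content-type":
            return "text/event-stream" in value.lower()
    return False

def serialize_usage_chunk(usage: Optional[OpenAIUsage]) -> str:
    """Serialize usage via model_dump() so the SSE chunk carries
    JSON-safe values instead of a repr of the model object."""
    payload = {"usage": usage.model_dump() if usage else None}
    return f"data: {json.dumps(payload)}\n\n"
```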
## Documentation Updates
- Updated README with comprehensive Codex API documentation
- Added clear limitations section for chat completions endpoint
- Included working model examples (gpt-5) and configuration samples
- Improved formatting consistency throughout documentation
## Model and Proxy Service Updates
- Enhanced model mapping validation and error handling
- Improved proxy service resilience for various response formats
- Better handling of OpenAI parameter restrictions in Codex mode
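These three bullets might reduce to something like the sketch below. The mapping table and the unsupported-parameter list are placeholders, not the project's real values:

```python
# Hypothetical model mapping; the real project's table may differ.
CODEX_MODEL_MAP = {"gpt-5": "gpt-5"}

# Assumed set of OpenAI parameters that Codex mode rejects.
UNSUPPORTED_CODEX_PARAMS = {"temperature", "top_p", "logprobs"}

def map_model(requested: str) -> str:
    """Validate the requested model against the Codex mapping and
    fail with a clear error instead of forwarding unknown names."""
    try:
        return CODEX_MODEL_MAP[requested]
    except KeyError:
        supported = ", ".join(sorted(CODEX_MODEL_MAP))
        raise ValueError(
            f"unknown model {requested!r}; supported: {supported}"
        ) from None

def strip_unsupported_params(params: dict) -> dict:
    """Drop parameters Codex mode rejects rather than forwarding
    them and triggering an upstream 400."""
    return {k: v for k, v in params.items() if k not in UNSUPPORTED_CODEX_PARAMS}
```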
These changes provide a more reliable and specification-compliant implementation of the Codex chat completions endpoint while maintaining backward compatibility.

1 parent cbe4438
File tree (7 files changed, +492 −188 lines):
- ccproxy
  - adapters/openai
  - api/routes
  - services
  - utils
- docs/user-guide