Add return_token_ids_alongside parameter to OpenAI API endpoints
- Add optional return_token_ids_alongside parameter to ChatCompletionRequest and CompletionRequest
- Include token_ids and prompt_token_ids fields in response models when requested
- Implement conditional logic in serving endpoints to return token IDs alongside generated text
- Useful for debugging and agent scenarios where token-level tracing is needed (see the sketch below)
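A minimal sketch of what this could look like on the protocol and serving side, assuming Pydantic request/response models as used by the OpenAI-compatible server. Only the names `return_token_ids_alongside`, `token_ids`, `prompt_token_ids`, and `ChatCompletionRequest` come from this PR's description; the surrounding model structure, defaults, and the `attach_token_ids` helper are illustrative assumptions, not the actual diff.

```python
# Hedged sketch: field names from the PR description; everything else assumed.
from typing import Optional

from pydantic import BaseModel, Field


class ChatCompletionRequest(BaseModel):
    # ... existing request fields (model, messages, sampling params) omitted ...
    # Opt-in flag: when True, token IDs are returned alongside the generated text.
    return_token_ids_alongside: bool = Field(default=False)


class ChatCompletionResponseChoice(BaseModel):
    index: int
    # ... existing message / finish_reason fields omitted ...
    # Populated only when the request set return_token_ids_alongside=True.
    token_ids: Optional[list[int]] = None


class ChatCompletionResponse(BaseModel):
    # ... existing id / model / usage fields omitted ...
    choices: list[ChatCompletionResponseChoice] = Field(default_factory=list)
    # Token IDs of the tokenized prompt, gated on the same request flag.
    prompt_token_ids: Optional[list[int]] = None


def attach_token_ids(
    request: ChatCompletionRequest,
    response: ChatCompletionResponse,
    prompt_token_ids: list[int],
    output_token_ids: list[list[int]],
) -> ChatCompletionResponse:
    """Hypothetical helper showing the conditional gating in the serving layer."""
    if not request.return_token_ids_alongside:
        return response
    response.prompt_token_ids = prompt_token_ids
    for choice, ids in zip(response.choices, output_token_ids):
        choice.token_ids = ids
    return response
```

Since this is not a standard OpenAI parameter, a client using the official OpenAI Python SDK would presumably pass it through `extra_body`, e.g. `extra_body={"return_token_ids_alongside": True}`.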
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <[email protected]>