Conversation

@wangxiaolong100 wangxiaolong100 commented Sep 2, 2025

Related GitHub Issue

Closes: #6936

Description

Newer versions of the OpenAI API use max_completion_tokens in place of the older max_tokens parameter. However, other OpenAI-compatible providers such as Qwen, Moonshot, and DeepSeek still expect the old max_tokens. I've therefore modified the request to include both parameters at once. Additionally, according to the OpenAI API documentation, the O-series models are incompatible with max_tokens, so I've added a condition that sends only max_completion_tokens when the model is an O-series model.
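A minimal sketch of the idea in TypeScript. The function and parameter names here (addMaxTokensIfNeeded, isOSeriesModel, the "o1/o3/o4" prefix check) are illustrative assumptions, not the exact code in openai.ts:

```typescript
// Hypothetical request shape; real code builds the OpenAI SDK params object.
interface CompletionParams {
	model: string
	max_tokens?: number
	max_completion_tokens?: number
}

// Assumption: O-series models are identified by an o1/o3/o4 model-id prefix.
function isOSeriesModel(modelId: string): boolean {
	return /^o[134]/.test(modelId)
}

function addMaxTokensIfNeeded(params: CompletionParams, maxTokens: number): CompletionParams {
	if (isOSeriesModel(params.model)) {
		// O-series models reject max_tokens, so send only max_completion_tokens.
		return { ...params, max_completion_tokens: maxTokens }
	}
	// Other OpenAI-compatible providers (Qwen, Moonshot, DeepSeek) still read
	// max_tokens, so include both fields for broad compatibility.
	return { ...params, max_tokens: maxTokens, max_completion_tokens: maxTokens }
}
```

For example, an o1 model would receive only max_completion_tokens, while a Moonshot model such as Kimi would receive both fields.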

Test Procedure

Using MoonPalace, you can see that both parameters are included in the request body: "max_tokens": -1, "max_completion_tokens": -1.

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes (if applicable).
  • Documentation Impact: I have considered if my changes require documentation updates (see "Documentation Updates" section below).
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Get in Touch

Discord username: haha.w


Important

Fix handling of max_tokens and max_completion_tokens for different model providers in openai.ts, ensuring compatibility with O-series models.

  • Behavior:
    • Modify addMaxTokensIfNeeded() in openai.ts to include both max_tokens and max_completion_tokens for compatibility with different model providers.
    • Add condition to use only max_completion_tokens for O-series models using isO3FamilyModel().
  • Functions:
    • Introduce isO3FamilyModel() to check if a model is part of the O-series in openai.ts.
  • Misc:
    • Update comments in openai.ts to reflect changes in token handling logic.

This description was created by Ellipsis for 3f1b4cb.

@dosubot dosubot bot added size:S This PR changes 10-29 lines, ignoring generated files. bug Something isn't working labels Sep 2, 2025
@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Sep 2, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Sep 3, 2025
@hannesrudolph hannesrudolph added PR - Needs Preliminary Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Sep 3, 2025
@wangxiaolong100 (Contributor, Author)

I've now implemented this fix specifically in the Moonshot provider instead, so I'm closing this PR, since the change here could have a broader impact.

@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Sep 5, 2025
@github-project-automation github-project-automation bot moved this from PR [Needs Prelim Review] to Done in Roo Code Roadmap Sep 5, 2025
Development

Successfully merging this pull request may close these issues.

Output is truncated when using Kimi K2
