
Conversation


@roomote roomote bot commented Nov 6, 2025

Description

This PR addresses Issue #9082 by migrating the Minimax provider from OpenAI SDK to Anthropic SDK, as requested. According to Minimax documentation, their models achieve higher accuracy when integrated with the Anthropic SDK.

Changes

  • 🔄 Replaced OpenAI SDK with Anthropic SDK for the Minimax provider implementation
  • Added support for Anthropic-specific features:
    • Message streaming with proper chunk handling
    • Prompt caching when available (for supported models)
    • Reasoning/thinking block support
    • Cache read/write token tracking
  • 🧪 Updated all tests to mock Anthropic SDK instead of OpenAI
  • 💰 Implemented cost calculation based on Minimax pricing
  • 🔧 Maintained backward compatibility with existing API key and base URL configurations

Technical Details

The new implementation:

  • Extends BaseProvider and implements the common provider interface
  • Uses the Anthropic SDK's message creation API with streaming support
  • Handles both international and China endpoints
  • Includes fallback to tiktoken for token counting if API doesn't support it
  • Supports both MiniMax-M2 and MiniMax-M2-Stable models with their full 192k context window
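The tiktoken fallback mentioned above can be sketched as the following pattern. This is an illustrative sketch only: countTokensViaApi stands in for the SDK's token-counting call, and the characters-per-token heuristic stands in for the real tiktoken encoder; neither name is from the actual minimax.ts implementation.

```typescript
// Hypothetical sketch of API-first token counting with a local fallback.
// `countTokensViaApi` is a stand-in for the provider's count-tokens endpoint;
// the chars/4 heuristic is a stand-in for a real tiktoken-based count.
async function countTokensWithFallback(
	text: string,
	countTokensViaApi: (text: string) => Promise<number>,
): Promise<number> {
	try {
		// Preferred path: ask the API for an exact count.
		return await countTokensViaApi(text)
	} catch {
		// Fallback: rough heuristic (~4 characters per token for English text).
		return Math.ceil(text.length / 4)
	}
}
```

The try/catch keeps the handler usable even when the endpoint rejects the count-tokens request, at the cost of an approximate number.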

Testing

  • ✅ All existing tests have been updated and are passing
  • ✅ Test coverage includes:
    • API key validation
    • Base URL configuration (international/China)
    • Model selection and configuration
    • Message streaming with text and reasoning blocks
    • Usage tracking and token counting
    • Prompt caching parameters
    • Temperature configuration

References

Review Notes

The implementation follows the same patterns as the existing Anthropic provider in the codebase, ensuring consistency and maintainability.


Important

Migrates Minimax provider to Anthropic SDK, adding support for Anthropic-specific features and updating tests.

  • Behavior:
    • Migrates Minimax provider from OpenAI SDK to Anthropic SDK in minimax.ts.
    • Adds Anthropic-specific features: message streaming, prompt caching, reasoning blocks, and token tracking.
    • Implements cost calculation based on Minimax pricing.
    • Maintains backward compatibility with existing API key and base URL configurations.
  • Testing:
    • Updates tests in minimax.spec.ts to mock Anthropic SDK.
    • Tests cover API key validation, base URL configuration, model selection, message streaming, usage tracking, and token counting.
  • Functions:
    • MiniMaxHandler class in minimax.ts now uses Anthropic SDK for message creation and token counting.
    • createMessage handles prompt caching and reasoning blocks.
    • completePrompt and countTokens methods updated for Anthropic SDK.
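The cost calculation referenced above can be illustrated with the per-million-token prices quoted in the review comment below ($0.3 input, $1.2 output, $0.375 cache writes, $0.03 cache reads). This is a standalone sketch of the arithmetic, not the codebase's calculateApiCostAnthropic(); the TokenPricing shape and computeCost name are hypothetical.

```typescript
// Hypothetical pricing shape; field names mirror the review snippet's
// info.inputPrice / info.outputPrice / info.cacheWritesPrice / info.cacheReadsPrice.
interface TokenPricing {
	inputPrice: number // USD per million input tokens
	outputPrice: number // USD per million output tokens
	cacheWritesPrice: number // USD per million cache-write tokens
	cacheReadsPrice: number // USD per million cache-read tokens
}

// Sums the four token buckets, each converted from a per-million price.
function computeCost(
	pricing: TokenPricing,
	inputTokens: number,
	outputTokens: number,
	cacheWriteTokens = 0,
	cacheReadTokens = 0,
): number {
	const perMillion = (tokens: number, price: number) => (tokens / 1_000_000) * price
	return (
		perMillion(inputTokens, pricing.inputPrice) +
		perMillion(outputTokens, pricing.outputPrice) +
		perMillion(cacheWriteTokens, pricing.cacheWritesPrice) +
		perMillion(cacheReadTokens, pricing.cacheReadsPrice)
	)
}
```

For example, one million input plus one million output tokens at the quoted prices comes to about $1.50.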

This description was created by Ellipsis for a8829f4.

- Replaced OpenAI SDK with Anthropic SDK for Minimax provider
- Updated MiniMaxHandler to extend BaseProvider and implement Anthropic message streaming
- Added support for prompt caching when available
- Updated tests to mock Anthropic SDK instead of OpenAI
- Maintained backward compatibility with existing API

Addresses #9082
@roomote roomote bot requested review from cte, jr and mrubens as code owners November 6, 2025 21:06
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. enhancement New feature or request labels Nov 6, 2025

roomote bot commented Nov 6, 2025


Review complete. Found 2 code quality improvements to align with established patterns.

  • Use centralized calculateApiCostAnthropic() function instead of duplicating cost calculation logic
  • Pass defaultTemperature to getModelParams() instead of overriding after spread


Comment on lines +184 to +199
if (inputTokens > 0 || outputTokens > 0 || cacheWriteTokens > 0 || cacheReadTokens > 0) {
	// MiniMax pricing (per million tokens):
	// Input: $0.3, Output: $1.2, Cache writes: $0.375, Cache reads: $0.03
	const inputCost = (inputTokens / 1_000_000) * (info.inputPrice || 0)
	const outputCost = (outputTokens / 1_000_000) * (info.outputPrice || 0)
	const cacheWriteCost = (cacheWriteTokens / 1_000_000) * (info.cacheWritesPrice || 0)
	const cacheReadCost = (cacheReadTokens / 1_000_000) * (info.cacheReadsPrice || 0)
	const totalCost = inputCost + outputCost + cacheWriteCost + cacheReadCost

	yield {
		type: "usage",
		inputTokens: 0,
		outputTokens: 0,
		totalCost,
	}
}

This duplicates cost calculation logic that exists in calculateApiCostAnthropic(). The Anthropic provider uses this centralized function to ensure consistency across providers. This manual calculation makes the code harder to maintain if pricing logic changes.

Suggested change (before → after):

if (inputTokens > 0 || outputTokens > 0 || cacheWriteTokens > 0 || cacheReadTokens > 0) {
	// MiniMax pricing (per million tokens):
	// Input: $0.3, Output: $1.2, Cache writes: $0.375, Cache reads: $0.03
	const inputCost = (inputTokens / 1_000_000) * (info.inputPrice || 0)
	const outputCost = (outputTokens / 1_000_000) * (info.outputPrice || 0)
	const cacheWriteCost = (cacheWriteTokens / 1_000_000) * (info.cacheWritesPrice || 0)
	const cacheReadCost = (cacheReadTokens / 1_000_000) * (info.cacheReadsPrice || 0)
	const totalCost = inputCost + outputCost + cacheWriteCost + cacheReadCost
	yield {
		type: "usage",
		inputTokens: 0,
		outputTokens: 0,
		totalCost,
	}
}

if (inputTokens > 0 || outputTokens > 0 || cacheWriteTokens > 0 || cacheReadTokens > 0) {
	const { totalCost } = calculateApiCostAnthropic(
		this.getModel().info,
		inputTokens,
		outputTokens,
		cacheWriteTokens,
		cacheReadTokens,
	)
	yield {
		type: "usage",
		inputTokens: 0,
		outputTokens: 0,
		totalCost,
	}
}



const message = await this.client.messages.create({
	model,
	max_tokens: 16384,

In completePrompt, max_tokens is hardcoded to 16384. Consider using the model’s maxTokens from getModel() for consistency (or add a comment if intentional).
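The reviewer's suggestion could look roughly like the following. It is a minimal sketch: resolveMaxTokens is a hypothetical helper, and modelMaxTokens stands in for whatever getModel() exposes in the actual handler.

```typescript
// Hypothetical sketch: prefer the model's configured max output tokens,
// keeping the previously hardcoded 16384 only as a last-resort fallback.
function resolveMaxTokens(modelMaxTokens: number | undefined, fallback = 16384): number {
	return modelMaxTokens ?? fallback
}
```

The nullish coalescing keeps behavior unchanged for models without a configured limit while respecting the model metadata when it is present.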

Comment on lines +207 to +220
const params = getModelParams({
	format: "anthropic",
	modelId: id,
	model: info,
	settings: this.options,
})

return {
	id,
	info,
	...params,
	temperature: this.options.modelTemperature ?? MINIMAX_DEFAULT_TEMPERATURE,
}
}

Temperature should be handled by passing defaultTemperature to getModelParams() rather than overriding after spreading params. This bypasses the centralized parameter handling logic and is inconsistent with other providers like AnthropicHandler.

Suggested change (before → after):

const params = getModelParams({
	format: "anthropic",
	modelId: id,
	model: info,
	settings: this.options,
})
return {
	id,
	info,
	...params,
	temperature: this.options.modelTemperature ?? MINIMAX_DEFAULT_TEMPERATURE,
}
}

const params = getModelParams({
	format: "anthropic",
	modelId: id,
	model: info,
	settings: this.options,
	defaultTemperature: MINIMAX_DEFAULT_TEMPERATURE,
})
return {
	id,
	info,
	...params,
}


@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Nov 6, 2025
@daniel-lxs
Member

Implementation is OpenAI compatible not Anthropic SDK

@daniel-lxs daniel-lxs closed this Nov 6, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Nov 6, 2025
@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Nov 6, 2025
@kavehsfv

kavehsfv commented Nov 6, 2025

Implementation is OpenAI compatible not Anthropic SDK

Hi @daniel-lxs
As mentioned on their website, MiniMax M2 is compatible with the Anthropic SDK. Link


Labels

enhancement New feature or request Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. size:L This PR changes 100-499 lines, ignoring generated files.

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

[ENHANCEMENT] Request: Implement Minimax Provider in Anthropic SDK

5 participants