
feat: add MiniMax chat model support #1320

Closed
octo-patch wants to merge 1 commit into agentscope-ai:main from octo-patch:feat/add-minimax-model

Conversation

@octo-patch

Summary

Adds MiniMaxChatModel as a dedicated model class for MiniMax's LLM API, enabling first-class support for MiniMax models in AgentScope.

Changes

  • src/agentscope/model/_minimax_model.py: New model class extending OpenAIChatModel with MiniMax-specific defaults (base URL, API key env var, model names)
  • src/agentscope/model/__init__.py: Added MiniMaxChatModel to module exports

Usage

from agentscope.model import MiniMaxChatModel

model = MiniMaxChatModel(
    model_name="MiniMax-M2.5",
    api_key="your-api-key",  # or set MINIMAX_API_KEY env var
)

Key Details

  • Extends OpenAIChatModel, since the MiniMax API is OpenAI-compatible
  • Default base URL: https://api.minimax.io/v1
  • Default model: MiniMax-M2.5 (204K context window)
  • Also supports MiniMax-M2.5-highspeed for speed-optimized inference
  • Reads API key from MINIMAX_API_KEY environment variable
  • Temperature note: MiniMax does not accept exactly 0; use small positive values
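
The defaults listed above can be illustrated with a minimal standalone sketch. This is not the actual AgentScope implementation (the real class extends OpenAIChatModel); the class and attribute names here are hypothetical, and the sketch only shows the pattern: pinned base URL, env-var API key fallback, and clamping temperature away from exactly 0.

```python
import os

# Hypothetical sketch of the MiniMax-specific defaults described above.
# Not the real AgentScope class -- that one subclasses OpenAIChatModel.
class MiniMaxDefaults:
    BASE_URL = "https://api.minimax.io/v1"
    API_KEY_ENV = "MINIMAX_API_KEY"
    DEFAULT_MODEL = "MiniMax-M2.5"

    def __init__(self, model_name=None, api_key=None, temperature=1.0):
        self.model_name = model_name or self.DEFAULT_MODEL
        # Fall back to the environment variable when no key is passed in.
        self.api_key = api_key or os.environ.get(self.API_KEY_ENV)
        if self.api_key is None:
            raise ValueError("Set MINIMAX_API_KEY or pass api_key")
        # MiniMax rejects temperature == 0, so nudge it to a small
        # positive value instead of failing the request.
        self.temperature = max(temperature, 1e-6)
```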

Test Plan

  • Import MiniMaxChatModel from agentscope.model
  • Verify default base URL is set to MiniMax endpoint
  • Test chat completion with MiniMax-M2.5
  • Test streaming mode
  • Verify MINIMAX_API_KEY env var is read when api_key not provided
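
The last test-plan item (env-var fallback) can be exercised without network access. The helper below is a hypothetical standalone stand-in for the model's key resolution, not AgentScope code; it just checks the intended precedence: an explicit api_key argument wins, and the environment variable is used only as a fallback.

```python
import os

# Hypothetical helper mirroring the key-resolution behavior under test:
# prefer an explicitly passed key, else fall back to the env var.
def resolve_api_key(api_key=None, env_var="MINIMAX_API_KEY"):
    return api_key if api_key is not None else os.environ.get(env_var)

os.environ["MINIMAX_API_KEY"] = "env-key"
assert resolve_api_key("explicit-key") == "explicit-key"  # argument wins
assert resolve_api_key() == "env-key"                     # env fallback
del os.environ["MINIMAX_API_KEY"]
assert resolve_api_key() is None                          # nothing set
```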

Add MiniMaxChatModel as a dedicated model class for MiniMax's
OpenAI-compatible API, supporting MiniMax-M2.5 and
MiniMax-M2.5-highspeed models.

- New _minimax_model.py extending OpenAIChatModel with MiniMax defaults
- Auto-configures base_url to https://api.minimax.io/v1
- Reads API key from MINIMAX_API_KEY environment variable
- Registered in model __init__.py for direct import
@cla-assistant

cla-assistant bot commented Mar 12, 2026

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@qbc2016
Member

qbc2016 commented Mar 17, 2026

Thanks for this PR!

There's already a similar PR (#1302) that implements this feature. Since theirs was opened first, we'll proceed with reviewing that one.

If you have ideas for improvements or a different implementation approach, feel free to share your thoughts on PR #1302 or the related discussion. We value collaboration and would love to hear your perspective!

Thanks again for contributing!

To keep the discussion and review in one place, I’m going to close this one.

@qbc2016 qbc2016 closed this Mar 17, 2026