feat: add MiniMax as first-class LLM provider #210

octo-patch wants to merge 1 commit into camel-ai:main from
Conversation
Add MiniMax model support via CAMEL's OpenAI-compatible model backend, enabling OASIS agents to use the MiniMax-M2.7 and MiniMax-M2.7-highspeed models (1M context window) through the OpenAI-compatible API at https://api.minimax.io/v1.

Changes:
- `oasis/minimax.py`: `create_minimax_model()` helper with temperature clamping (MiniMax requires temperature in (0.0, 1.0]), model validation, and `MINIMAX_API_KEY` env var auto-detection
- `examples/reddit_simulation_minimax.py`: complete Reddit simulation example using MiniMax-M2.7
- `test/agent/test_minimax_provider.py`: 20 unit tests + 3 integration tests covering validation, temperature clamping, factory invocation, imports, and SocialAgent compatibility
- `README.md`: MiniMax provider documentation with usage instructions
- `oasis/__init__.py`: export `create_minimax_model` at package level
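The temperature-clamping behavior described above could be sketched roughly as follows. This is an illustrative stand-in, not the PR's actual code: the function name and the epsilon lower bound are assumptions, based only on the stated constraint that MiniMax requires temperature in (0.0, 1.0].

```python
def clamp_temperature(temperature: float) -> float:
    """Clamp a sampling temperature into MiniMax's accepted range (0.0, 1.0].

    Values at or below 0.0 are raised to a small positive epsilon
    (illustrative choice; the PR may pick a different value), and values
    above 1.0 are capped at 1.0.
    """
    epsilon = 0.01  # assumed lower bound, not taken from the PR
    if temperature <= 0.0:
        return epsilon
    return min(temperature, 1.0)
```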
echo-yiyiyi left a comment
Thanks @octo-patch very much for your PR! Left some minor comments.
```python
def create_minimax_model(
```
From my perspective, CAMEL already supports different models, and every model created by `ModelFactory` can be used in OASIS, so an additional function for creating one specific model isn't really necessary. It would be enough to provide such a script in the examples, i.e. to move this part into the example script.
What do you think?
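For reference, creating an OpenAI-compatible model directly with CAMEL's `ModelFactory`, as the reviewer suggests, might look roughly like the sketch below. The endpoint URL and `MINIMAX_API_KEY` env var come from the PR description; the helper name and the exact keyword arguments are assumptions, so check CAMEL's current `ModelFactory.create` signature before relying on this.

```python
import os


def build_minimax_kwargs(model_name: str = "MiniMax-M2.7") -> dict:
    """Assemble keyword arguments for an OpenAI-compatible MiniMax model.

    The URL and env var name are taken from the PR description;
    everything else here is illustrative.
    """
    return {
        "model_type": model_name,
        "url": "https://api.minimax.io/v1",
        "api_key": os.environ.get("MINIMAX_API_KEY"),
    }


def create_minimax_via_factory(model_name: str = "MiniMax-M2.7"):
    # Requires the `camel-ai` package; imported lazily so the
    # kwargs helper above stays usable without it.
    from camel.models import ModelFactory
    from camel.types import ModelPlatformType

    return ModelFactory.create(
        model_platform=ModelPlatformType.OPENAI_COMPATIBLE_MODEL,
        **build_minimax_kwargs(model_name),
    )
```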
```python
from camel.models import BaseModelBackend
from camel.types import ModelPlatformType

from oasis.minimax import (
```
And the tests could then be changed to import from the example instead.
### Using MiniMax as the LLM Provider

OASIS supports [MiniMax](https://www.minimaxi.com/) models via the built-in `create_minimax_model()` helper. MiniMax offers an OpenAI-compatible API with models such as **MiniMax-M2.7** (1M context window) and **MiniMax-M2.7-highspeed** (a faster variant).

1. Set your MiniMax API key:

   ```bash
   export MINIMAX_API_KEY=<your MiniMax API key>
   ```

2. Use `create_minimax_model()` in your simulation:

   ```python
   from oasis.minimax import create_minimax_model

   minimax_model = create_minimax_model("MiniMax-M2.7")
   # Use minimax_model the same way as any other CAMEL model backend
   ```

See [`examples/reddit_simulation_minimax.py`](examples/reddit_simulation_minimax.py) for a complete simulation example.
We plan to include only the most representative documentation in the README.md. Could you please move this part to our full documentation instead, e.g. by adding it to https://github.com/camel-ai/oasis/blob/main/docs/key_modules/models.mdx?
Summary

Changes

Test plan

Run tests:

Unit tests (no API key needed):

```bash
pytest test/agent/test_minimax_provider.py -k "not Integration"
```

Integration tests:

```bash
export MINIMAX_API_KEY=your_key
pytest test/agent/test_minimax_provider.py::TestMiniMaxIntegration
```
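The split between keyless unit tests and keyed integration tests above is commonly implemented with a `pytest.mark.skipif` guard. A minimal sketch of that pattern follows; the class name matches the test plan, but its body and the decorator placement are assumptions about how the PR's test file is organized.

```python
import os

import pytest

# Skip the whole integration class when no key is configured, so both
# `pytest -k "not Integration"` and keyless CI runs stay green.
requires_api_key = pytest.mark.skipif(
    "MINIMAX_API_KEY" not in os.environ,
    reason="MINIMAX_API_KEY not set",
)


@requires_api_key
class TestMiniMaxIntegration:
    def test_smoke(self):
        # hypothetical: a real test would call the MiniMax endpoint here
        assert os.environ["MINIMAX_API_KEY"]
```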