feat: add MiniMax as first-class LLM provider #210

Open

octo-patch wants to merge 1 commit into camel-ai:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax as a first-class LLM provider for OASIS social simulations, using CAMEL's OPENAI_COMPATIBLE_MODEL backend with the OpenAI-compatible API at https://api.minimax.io/v1
  • Support MiniMax-M2.7 (flagship, 1M context) and MiniMax-M2.7-highspeed (faster variant) models
  • Add a create_minimax_model() helper that auto-detects the MINIMAX_API_KEY environment variable and clamps temperature to MiniMax's supported (0.0, 1.0] range
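The helper's core behavior described above (model validation, env-var auto-detection, and temperature clamping) can be sketched in plain Python. This is a minimal sketch, not the PR's actual code: the function and constant names below are assumptions, and the real helper additionally wraps CAMEL's `ModelFactory` to build the model backend.

```python
import os
from typing import Optional

# Model IDs named in this PR; treated here as an illustrative registry.
SUPPORTED_MODELS = {"MiniMax-M2.7", "MiniMax-M2.7-highspeed"}


def clamp_temperature(temperature: float) -> float:
    """Clamp temperature into MiniMax's supported (0.0, 1.0] range.

    The lower bound is exclusive, so non-positive values are bumped
    to a small positive epsilon rather than 0.0.
    """
    return min(max(temperature, 1e-6), 1.0)


def resolve_api_key(api_key: Optional[str] = None) -> str:
    """Prefer an explicit key; otherwise auto-detect MINIMAX_API_KEY."""
    key = api_key or os.environ.get("MINIMAX_API_KEY")
    if not key:
        raise ValueError("No API key: pass api_key or set MINIMAX_API_KEY.")
    return key


def validate_model(model_name: str) -> str:
    """Reject model IDs outside the illustrative registry above."""
    if model_name not in SUPPORTED_MODELS:
        raise ValueError(f"Unsupported MiniMax model: {model_name!r}")
    return model_name
```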

Changes

| File | Description |
| --- | --- |
| `oasis/minimax.py` | `create_minimax_model()` helper, model registry, temperature clamping |
| `oasis/__init__.py` | Export `create_minimax_model` at package level |
| `examples/reddit_simulation_minimax.py` | Complete Reddit simulation example using MiniMax-M2.7 |
| `test/agent/test_minimax_provider.py` | 20 unit tests + 3 integration tests |
| `README.md` | MiniMax provider documentation with usage instructions |

Test plan

  • 20 unit tests covering model validation, temperature clamping, factory invocation, imports, and SocialAgent compatibility
  • 3 integration tests (require MINIMAX_API_KEY) verifying real model creation and agent sign-up
  • All tests pass locally

Run tests:

Unit tests (no API key needed):

```bash
pytest test/agent/test_minimax_provider.py -k "not Integration"
```

Integration tests:

```bash
export MINIMAX_API_KEY=your_key
pytest test/agent/test_minimax_provider.py::TestMiniMaxIntegration
```
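Since the integration tests require `MINIMAX_API_KEY`, they are presumably skipped when the variable is absent. A hedged sketch of such a gate follows; the names here are assumptions, not taken from the PR's test file:

```python
import os


def minimax_key_available() -> bool:
    """True when MINIMAX_API_KEY is set to a non-empty value."""
    return bool(os.environ.get("MINIMAX_API_KEY"))


# In a pytest module this check would typically back a skip marker, e.g.:
#   requires_key = pytest.mark.skipif(
#       not minimax_key_available(),
#       reason="MINIMAX_API_KEY not set",
#   )
```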

Add MiniMax model support via CAMEL's OpenAI-compatible model backend,
enabling OASIS agents to use MiniMax-M2.7 and MiniMax-M2.7-highspeed
models (1M context window) through the OpenAI-compatible API at
https://api.minimax.io/v1.

Changes:
- oasis/minimax.py: create_minimax_model() helper with temperature
  clamping (MiniMax requires temp in (0.0, 1.0]), model validation,
  and MINIMAX_API_KEY env var auto-detection
- examples/reddit_simulation_minimax.py: Complete Reddit simulation
  example using MiniMax-M2.7
- test/agent/test_minimax_provider.py: 20 unit tests + 3 integration
  tests covering validation, temperature clamping, factory invocation,
  imports, and SocialAgent compatibility
- README.md: MiniMax provider documentation with usage instructions
- oasis/__init__.py: Export create_minimax_model at package level
@echo-yiyiyi (Member) left a comment


Thanks @octo-patch very much for your PR! Left some minor comments.

```python
}


def create_minimax_model(
```

From my perspective, CAMEL already supports different models, and every model created by ModelFactory can be used in OASIS, so it is not really necessary to provide an additional function that creates one specific model. It is necessary, though, to show this setup in the example, that is, to include this part in the example script.

What do you think?

```python
from camel.models import BaseModelBackend
from camel.types import ModelPlatformType

from oasis.minimax import (
```
@echo-yiyiyi (Member) commented on Apr 1, 2026:

And the test can be changed to import from the example.

Comment on lines +232 to +252
### Using MiniMax as the LLM Provider

OASIS supports [MiniMax](https://www.minimaxi.com/) models via the built-in `create_minimax_model()` helper. MiniMax offers an OpenAI-compatible API with models such as **MiniMax-M2.7** (1M context window) and **MiniMax-M2.7-highspeed** (faster variant).

1. Set your MiniMax API key:

```bash
export MINIMAX_API_KEY=<your MiniMax API key>
```

2. Use `create_minimax_model()` in your simulation:

```python
from oasis.minimax import create_minimax_model

minimax_model = create_minimax_model("MiniMax-M2.7")
# Use minimax_model the same way as any other CAMEL model backend
```

See [`examples/reddit_simulation_minimax.py`](examples/reddit_simulation_minimax.py) for a complete simulation example.


We plan to include only the most representative documentation in the README.md. Could you please move this part to our full documentation, for example by adding this content to https://github.com/camel-ai/oasis/blob/main/docs/key_modules/models.mdx?
