
feat: add MiniMax LLM provider support #883

Open
octo-patch wants to merge 1 commit into MODSetter:dev from octo-patch:feat/complete-minimax-provider-support

Conversation


@octo-patch octo-patch commented Mar 14, 2026

Summary

This PR adds complete MiniMax LLM provider support to SurfSense, enabling users to use MiniMax's M2.5 series models (with a 204K context window) through the existing LiteLLM integration.

Changes

Backend:

  • Add MINIMAX to LiteLLMProvider enum
  • Add MiniMax-specific model configuration (MiniMax-M2.5, MiniMax-M2.5-highspeed)
  • Database migration to support the new provider
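The enum change can be pictured with a minimal sketch. The real enum lives in `surfsense_backend/app/db.py`; the surrounding member names here are assumptions for illustration only.

```python
# Hedged sketch of the LiteLLMProvider enum addition; only MINIMAX is
# confirmed by this PR, the other members are illustrative placeholders.
import enum

class LiteLLMProvider(str, enum.Enum):
    OPENAI = "OPENAI"        # pre-existing (assumed)
    ANTHROPIC = "ANTHROPIC"  # pre-existing (assumed)
    MINIMAX = "MINIMAX"      # new value added by this PR

print(LiteLLMProvider.MINIMAX.value)  # -> MINIMAX
```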

Frontend:

  • Add MiniMax provider option to the model selection UI
  • Add MiniMax branding/icon assets

Re-opened from #881, now targeting dev branch as requested.

High-level PR Summary

This PR adds MiniMax LLM provider support to SurfSense, enabling integration with MiniMax's M2.5 series models, which offer a 204K context window. The implementation follows the existing LiteLLM integration pattern: it adds the MINIMAX provider to the backend enum, creates a database migration, configures the MiniMax-specific models (MiniMax-M2.5 and MiniMax-M2.5-highspeed) with OpenAI-compatible routing, and updates the frontend UI with provider options, model selections, and branding assets. The Chinese-language documentation is also updated with comprehensive MiniMax configuration instructions, including API key setup, available models, and usage recommendations.

⏱️ Estimated Review Time: 15-30 minutes

💡 Review Order Suggestion
| Order | File Path |
|-------|-----------|
| 1 | surfsense_backend/app/db.py |
| 2 | surfsense_backend/alembic/versions/106_add_minimax_to_litellmprovider_enum.py |
| 3 | surfsense_web/contracts/types/new-llm-config.types.ts |
| 4 | surfsense_web/contracts/enums/llm-providers.ts |
| 5 | surfsense_web/contracts/enums/llm-models.ts |
| 6 | surfsense_backend/app/agents/new_chat/llm_config.py |
| 7 | surfsense_backend/app/services/llm_router_service.py |
| 8 | surfsense_backend/app/services/llm_service.py |
| 9 | surfsense_web/components/icons/providers/minimax.svg |
| 10 | surfsense_web/components/icons/providers/index.ts |
| 11 | surfsense_web/lib/provider-icons.tsx |
| 12 | surfsense_backend/app/config/global_llm_config.example.yaml |
| 13 | docs/chinese-llm-setup.md |



Add full MiniMax provider support across the entire stack:

Backend:
- Add MINIMAX to LiteLLMProvider enum in db.py
- Add MINIMAX mapping to all provider_map dicts in llm_service.py,
  llm_router_service.py, and llm_config.py
- Add Alembic migration (rev 106) for PostgreSQL enum
- Add MiniMax M2.5 example in global_llm_config.example.yaml
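Because PostgreSQL enum types are fixed at creation time, adding a value requires an explicit `ALTER TYPE` statement in the migration. A hedged sketch of what migration rev 106 likely executes; the type name `litellmprovider` is an assumption inferred from the migration filename, not verified against the actual file.

```python
# Sketch of the SQL an Alembic migration would run to extend a PostgreSQL
# enum. The type name is an assumption; only the new value MINIMAX is
# confirmed by the PR description.
ENUM_TYPE = "litellmprovider"
NEW_VALUE = "MINIMAX"

def upgrade_sql(enum_type: str, value: str) -> str:
    # IF NOT EXISTS keeps re-runs of the migration from failing
    return f"ALTER TYPE {enum_type} ADD VALUE IF NOT EXISTS '{value}'"

print(upgrade_sql(ENUM_TYPE, NEW_VALUE))
```

Note that PostgreSQL enum values cannot easily be removed, so a downgrade path typically either recreates the type or is a no-op.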

Frontend:
- Add MiniMax to LLM_PROVIDERS enum with apiBase
- Add MiniMax-M2.5 and MiniMax-M2.5-highspeed to LLM_MODELS
- Add MINIMAX to Zod validation schema
- Add MiniMax SVG icon and wire up in provider-icons

Docs:
- Add MiniMax setup guide in chinese-llm-setup.md

MiniMax uses an OpenAI-compatible API (https://api.minimax.io/v1)
with models supporting up to 204K context window.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
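Since MiniMax exposes an OpenAI-compatible endpoint, LiteLLM can route to it with the `openai/` model prefix plus a custom `api_base`. A sketch of the kwargs such a provider mapping might produce; the function and constant names are illustrative, not SurfSense's actual code.

```python
# Hedged sketch of OpenAI-compatible routing through LiteLLM: prefix the
# model with "openai/" and point api_base at the provider's endpoint.
MINIMAX_API_BASE = "https://api.minimax.io/v1"

def build_litellm_kwargs(model_name: str, api_key: str) -> dict:
    return {
        "model": f"openai/{model_name}",  # OpenAI-compatible routing prefix
        "api_base": MINIMAX_API_BASE,
        "api_key": api_key,
    }

kwargs = build_litellm_kwargs("MiniMax-M2.5", "sk-...")
print(kwargs["model"])  # -> openai/MiniMax-M2.5
```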

@recurseml recurseml bot left a comment


Review by RecurseML

🔍 Review performed on 49d8f41..760aa38

✨ No bugs found, your code is sparkling clean

✅ Files analyzed, no issues (13)

docs/chinese-llm-setup.md
surfsense_backend/alembic/versions/106_add_minimax_to_litellmprovider_enum.py
surfsense_backend/app/agents/new_chat/llm_config.py
surfsense_backend/app/config/global_llm_config.example.yaml
surfsense_backend/app/db.py
surfsense_backend/app/services/llm_router_service.py
surfsense_backend/app/services/llm_service.py
surfsense_web/components/icons/providers/index.ts
surfsense_web/components/icons/providers/minimax.svg
surfsense_web/contracts/enums/llm-models.ts
surfsense_web/contracts/enums/llm-providers.ts
surfsense_web/contracts/types/new-llm-config.types.ts
surfsense_web/lib/provider-icons.tsx

