2 changes: 1 addition & 1 deletion document/content/docs/self-host/config/model/meta.en.json
@@ -1,4 +1,4 @@
{
"title": "Model Configuration",
"pages": ["intro", "ai-proxy", "one-api", "siliconCloud", "ppio"]
"pages": ["intro", "ai-proxy", "one-api", "siliconCloud", "ppio", "minimax"]
}
2 changes: 1 addition & 1 deletion document/content/docs/self-host/config/model/meta.json
@@ -1,4 +1,4 @@
{
"title": "模型配置方案",
"pages": ["intro", "ai-proxy", "one-api", "siliconCloud", "ppio"]
"pages": ["intro", "ai-proxy", "one-api", "siliconCloud", "ppio", "minimax"]
}
106 changes: 106 additions & 0 deletions document/content/docs/self-host/config/model/minimax.en.mdx
@@ -0,0 +1,106 @@
---
title: Connect Models via MiniMax
description: Connect Models via MiniMax
---

import { Alert } from '@/components/docs/Alert';

[MiniMax](https://www.minimaxi.com) is an AI technology company that provides high-performance large language model API services. MiniMax's API is compatible with the OpenAI format, making it easy to integrate with FastGPT.

## 1. Get a MiniMax API Key

1. Visit [MiniMax Platform](https://platform.minimaxi.com), register and log in.
2. Go to the console and create an API Key.

## 2. Configure Models

### Option 1: Via OneAPI

Refer to the [OneAPI Integration Guide](/docs/self-host/config/model/one-api) to add a MiniMax channel in OneAPI:

- Channel type: Select **MiniMax** or **Custom Channel**
- Base URL: `https://api.minimax.io/v1`
- Enter your MiniMax API Key
- Model: `MiniMax-M2.5` or `MiniMax-M2.5-highspeed`

Once configured, enable the corresponding model in FastGPT.

### Option 2: Direct Integration (without OneAPI)

On the FastGPT model configuration page, add a custom model with the following information:

- **Model ID**: `MiniMax-M2.5`
- **Custom Request URL**: `https://api.minimax.io/v1/chat/completions`
- **Custom Request Key**: Enter your MiniMax API Key

<Alert icon="⚠️" context="warning">
MiniMax models require temperature to be in the range (0.0, 1.0]. A value of 0 is not supported. It is recommended to set the maximum temperature to 1.0.
</Alert>
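Because a temperature of 0 will be rejected, a client that forwards user-chosen sampling settings should clamp them into the supported range before sending the request. A minimal sketch of that idea (the helper names and the `0.01` floor are our own illustrative choices, not part of FastGPT or the MiniMax API):

```python
# Hedged sketch: keep temperature inside MiniMax's (0.0, 1.0] range before
# building an OpenAI-format chat request. Model name and range follow this
# guide; the floor value 0.01 is an arbitrary small positive choice.

MIN_TEMPERATURE = 0.01  # MiniMax rejects 0, so use a small positive floor

def clamp_temperature(t: float) -> float:
    """Map any requested temperature into MiniMax's supported (0.0, 1.0]."""
    return max(MIN_TEMPERATURE, min(t, 1.0))

def build_chat_payload(prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-format chat payload with a safe temperature."""
    return {
        "model": "MiniMax-M2.5",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": clamp_temperature(temperature),
    }

payload = build_chat_payload("Hello", temperature=0.0)
print(payload["temperature"])  # 0.01, never 0
```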

### Via Configuration File

If you prefer configuring models via a configuration file, refer to the following JSON:

```json
[
{
"model": "MiniMax-M2.5",
"metadata": {
"isCustom": true,
"isActive": true,
"provider": "MiniMax",
"model": "MiniMax-M2.5",
"name": "MiniMax-M2.5",
"maxContext": 204000,
"maxResponse": 16000,
"quoteMaxToken": 200000,
"maxTemperature": 1.0,
"vision": true,
"datasetProcess": true,
"usedInClassify": true,
"usedInExtractFields": true,
"usedInToolCall": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"requestUrl": "https://api.minimax.io/v1/chat/completions",
"requestAuth": "your-minimax-api-key"
}
},
{
"model": "MiniMax-M2.5-highspeed",
"metadata": {
"isCustom": true,
"isActive": true,
"provider": "MiniMax",
"model": "MiniMax-M2.5-highspeed",
"name": "MiniMax-M2.5-highspeed",
"maxContext": 204000,
"maxResponse": 16000,
"quoteMaxToken": 200000,
"maxTemperature": 1.0,
"vision": true,
"datasetProcess": true,
"usedInClassify": true,
"usedInExtractFields": true,
"usedInToolCall": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {}
}
}
]
```
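As a quick sanity check on a config like the one above, the numeric fields can be verified against the constraints this guide mentions. An illustrative sketch (the `validate` helper is our own, not a FastGPT API):

```python
import json

# Illustrative check, not a FastGPT API: verify that a model entry's limits
# are self-consistent and that maxTemperature respects MiniMax's (0.0, 1.0].
def validate(entry: dict) -> bool:
    m = entry["metadata"]
    return (
        0 < m["maxTemperature"] <= 1.0             # temperature 0 is rejected
        and m["quoteMaxToken"] <= m["maxContext"]  # quoted text must fit context
        and m["maxResponse"] < m["maxContext"]
    )

config = json.loads("""[
  {"model": "MiniMax-M2.5",
   "metadata": {"maxContext": 204000, "maxResponse": 16000,
                "quoteMaxToken": 200000, "maxTemperature": 1.0}}
]""")
print(all(validate(e) for e in config))  # True
```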

## 3. Available Models

| Model Name | Max Context | Features |
| --- | --- | --- |
| MiniMax-M2.5 | 204K tokens | High-performance general model with vision and function calling support |
| MiniMax-M2.5-highspeed | 204K tokens | High-speed version, ideal for latency-sensitive scenarios |

## 4. Testing

After configuration, click the test button on the FastGPT model configuration page to verify the model is working properly.
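The same check can also be done outside FastGPT with a direct request to the endpoint. A minimal sketch using only the standard library (the key is a placeholder; the actual send is left commented out so nothing fires accidentally):

```python
import json
import urllib.request

# Connectivity sketch, assuming the endpoint and model names from this guide.
# Replace the placeholder key with your real MiniMax API Key; the request is
# only constructed here -- uncomment the urlopen call to actually send it.

API_KEY = "your-minimax-api-key"  # placeholder
URL = "https://api.minimax.io/v1/chat/completions"

body = json.dumps({
    "model": "MiniMax-M2.5",
    "messages": [{"role": "user", "content": "ping"}],
    "temperature": 1.0,  # must stay within (0.0, 1.0]
}).encode("utf-8")

req = urllib.request.Request(
    URL,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.get_full_url())
```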
106 changes: 106 additions & 0 deletions document/content/docs/self-host/config/model/minimax.mdx
@@ -0,0 +1,106 @@
---
title: Connect Models via MiniMax
description: Connect Models via MiniMax
---

import { Alert } from '@/components/docs/Alert';

[MiniMax](https://www.minimaxi.com) is an artificial general intelligence technology company that provides high-performance large language model API services. MiniMax's API is compatible with the OpenAI format, making it easy to integrate with FastGPT.

## 1. Get a MiniMax API Key

1. Visit the [MiniMax Open Platform](https://platform.minimaxi.com), register, and log in.
2. Go to the console and create an API Key.

## 2. Configure Models

### Option 1: Via OneAPI

Refer to the [OneAPI Integration Guide](/docs/self-host/config/model/one-api) to add a MiniMax channel in OneAPI:

- Channel type: select **MiniMax** or **Custom Channel**
- Base URL: `https://api.minimax.io/v1`
- Enter your MiniMax API Key
- Model: `MiniMax-M2.5` or `MiniMax-M2.5-highspeed`

Once configured, enable the corresponding model in FastGPT.

### Option 2: Direct Integration (without OneAPI)

On the FastGPT model configuration page, add a custom model with the following information:

- **Model ID**: `MiniMax-M2.5`
- **Custom Request URL**: `https://api.minimax.io/v1/chat/completions`
- **Custom Request Key**: enter your MiniMax API Key

<Alert icon="⚠️" context="warning">
The temperature range for MiniMax models is (0.0, 1.0]; a value of 0 is not supported. It is recommended to set the maximum temperature to 1.0.
</Alert>
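Since a temperature of 0 will be rejected, a client forwarding user-chosen sampling settings should clamp them before sending the request. A minimal sketch of that idea (the helper names and the `0.01` floor are our own illustrative choices, not part of FastGPT or the MiniMax API):

```python
# Hedged sketch: keep temperature inside MiniMax's (0.0, 1.0] range before
# building an OpenAI-format chat request. Model name and range follow this
# guide; the floor value 0.01 is an arbitrary small positive choice.

MIN_TEMPERATURE = 0.01  # MiniMax rejects 0, so use a small positive floor

def clamp_temperature(t: float) -> float:
    """Map any requested temperature into MiniMax's supported (0.0, 1.0]."""
    return max(MIN_TEMPERATURE, min(t, 1.0))

def build_chat_payload(prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-format chat payload with a safe temperature."""
    return {
        "model": "MiniMax-M2.5",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": clamp_temperature(temperature),
    }

payload = build_chat_payload("Hello", temperature=0.0)
print(payload["temperature"])  # 0.01, never 0
```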

### Via Configuration File

If you prefer configuring models via a configuration file, refer to the following JSON:

```json
[
  {
    "model": "MiniMax-M2.5",
    "metadata": {
      "isCustom": true,
      "isActive": true,
      "provider": "MiniMax",
      "model": "MiniMax-M2.5",
      "name": "MiniMax-M2.5",
      "maxContext": 204000,
      "maxResponse": 16000,
      "quoteMaxToken": 200000,
      "maxTemperature": 1.0,
      "vision": true,
      "datasetProcess": true,
      "usedInClassify": true,
      "usedInExtractFields": true,
      "usedInToolCall": true,
      "toolChoice": true,
      "functionCall": true,
      "defaultSystemChatPrompt": "",
      "defaultConfig": {},
      "requestUrl": "https://api.minimax.io/v1/chat/completions",
      "requestAuth": "your-minimax-api-key"
    }
  },
  {
    "model": "MiniMax-M2.5-highspeed",
    "metadata": {
      "isCustom": true,
      "isActive": true,
      "provider": "MiniMax",
      "model": "MiniMax-M2.5-highspeed",
      "name": "MiniMax-M2.5-highspeed",
      "maxContext": 204000,
      "maxResponse": 16000,
      "quoteMaxToken": 200000,
      "maxTemperature": 1.0,
      "vision": true,
      "datasetProcess": true,
      "usedInClassify": true,
      "usedInExtractFields": true,
      "usedInToolCall": true,
      "toolChoice": true,
      "functionCall": true,
      "defaultSystemChatPrompt": "",
      "defaultConfig": {}
    }
  }
]
```
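As a quick sanity check on a config like the one above, the numeric fields can be verified against the constraints this guide mentions. An illustrative sketch (the `validate` helper is our own, not a FastGPT API):

```python
import json

# Illustrative check, not a FastGPT API: verify that a model entry's limits
# are self-consistent and that maxTemperature respects MiniMax's (0.0, 1.0].
def validate(entry: dict) -> bool:
    m = entry["metadata"]
    return (
        0 < m["maxTemperature"] <= 1.0             # temperature 0 is rejected
        and m["quoteMaxToken"] <= m["maxContext"]  # quoted text must fit context
        and m["maxResponse"] < m["maxContext"]
    )

config = json.loads("""[
  {"model": "MiniMax-M2.5",
   "metadata": {"maxContext": 204000, "maxResponse": 16000,
                "quoteMaxToken": 200000, "maxTemperature": 1.0}}
]""")
print(all(validate(e) for e in config))  # True
```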

## 3. Available Models

| Model Name | Max Context | Features |
| --- | --- | --- |
| MiniMax-M2.5 | 204K tokens | High-performance general model with vision and function calling support |
| MiniMax-M2.5-highspeed | 204K tokens | High-speed version, suited to latency-sensitive scenarios |

## 4. Testing

After configuration, click the test button on the FastGPT model configuration page to verify the model is working properly.
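The same check can also be done outside FastGPT with a direct request to the endpoint. A minimal sketch using only the standard library (the key is a placeholder; the actual send is left commented out so nothing fires accidentally):

```python
import json
import urllib.request

# Connectivity sketch, assuming the endpoint and model names from this guide.
# Replace the placeholder key with your real MiniMax API Key; the request is
# only constructed here -- uncomment the urlopen call to actually send it.

API_KEY = "your-minimax-api-key"  # placeholder
URL = "https://api.minimax.io/v1/chat/completions"

body = json.dumps({
    "model": "MiniMax-M2.5",
    "messages": [{"role": "user", "content": "ping"}],
    "temperature": 1.0,  # must stay within (0.0, 1.0]
}).encode("utf-8")

req = urllib.request.Request(
    URL,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.get_full_url())
```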