
feat(providers): add MiniMax AI provider support#1028

Open
ximiximi423 wants to merge 4 commits into ItzCrazyKns:master from ximiximi423:master

Conversation


ximiximi423 commented Mar 6, 2026

Add MiniMax as a new LLM provider, enabling users to use MiniMax models through the OpenAI-compatible API endpoint.


Summary by cubic

Added MiniMax as a new chat provider via an OpenAI-compatible endpoint. Streaming now strips `<think>` tags across chunk boundaries, flushes buffered output on stream end, and preserves tool-call updates; added vitest tests with coverage.

  • New Features

    • Provider key minimax; chat models MiniMax-M2.5 and MiniMax-M2.5-highspeed.
    • Config: API Key (MINIMAX_API_KEY) and Base URL (MINIMAX_BASE_URL, default https://api.minimax.io/v1).
    • Streams and object generation strip `<think>…</think>` content and repair JSON when needed.
    • Added vitest and @vitest/coverage-v8 with test scripts (test, test:run, test:coverage) and 18 unit tests for streaming logic.
  • Bug Fixes

    • Streaming: chunk-safe stripping, flush remaining buffer at end, discard incomplete think content.
    • Preserve toolCallChunk updates even when no visible text is emitted.
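A chunk-boundary-safe version of the stripping logic described above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation; the class and method names (`ThinkTagStripper`, `push`, `flush`) are hypothetical:

```typescript
// Hypothetical sketch of chunk-safe <think> stripping, based on the PR summary.
const OPEN = "<think>";
const CLOSE = "</think>";

class ThinkTagStripper {
  private buffer = "";        // unconsumed input; may end mid-tag
  private insideThink = false;

  /** Feed one streamed chunk; returns the text that is safe to emit. */
  push(chunk: string): string {
    this.buffer += chunk;
    let out = "";
    while (this.buffer.length > 0) {
      const tag = this.insideThink ? CLOSE : OPEN;
      const idx = this.buffer.indexOf(tag);
      if (idx !== -1) {
        // Full delimiter found: emit (or drop) what precedes it, consume it.
        if (!this.insideThink) out += this.buffer.slice(0, idx);
        this.buffer = this.buffer.slice(idx + tag.length);
        this.insideThink = !this.insideThink;
        continue;
      }
      // No full tag: hold back any suffix that could be the start of one,
      // so a delimiter split across chunks is not discarded.
      const hold = trailingPrefixLen(this.buffer, tag);
      const safe = this.buffer.slice(0, this.buffer.length - hold);
      if (!this.insideThink) out += safe;
      this.buffer = this.buffer.slice(this.buffer.length - hold);
      break;
    }
    return out;
  }

  /** Stream ended: emit the leftover tail, discarding unterminated think content. */
  flush(): string {
    const tail = this.insideThink ? "" : this.buffer;
    this.buffer = "";
    this.insideThink = false;
    return tail;
  }
}

// Length of the longest suffix of `text` that is a proper prefix of `tag`.
function trailingPrefixLen(text: string, tag: string): number {
  const max = Math.min(text.length, tag.length - 1);
  for (let k = max; k > 0; k--) {
    if (text.endsWith(tag.slice(0, k))) return k;
  }
  return 0;
}
```

The key idea is holding back only the trailing characters that could be the start of a split delimiter, and deciding at end-of-stream whether the buffered tail is real output or an unterminated `<think>` block to discard.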

Written for commit 960b426. Summary will update on new commits.


cubic-dev-ai (bot) left a comment


2 issues found across 3 files

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="src/lib/models/providers/minimax/minimaxLLM.ts">

<violation number="1" location="src/lib/models/providers/minimax/minimaxLLM.ts:75">
P2: Streaming think-tag stripping is not chunk-boundary safe: when a `</think>` is split across chunks, the partial delimiter is discarded and `insideThinkTag` stays true, suppressing all subsequent output.</violation>

<violation number="2" location="src/lib/models/providers/minimax/minimaxLLM.ts:86">
P2: streamText drops toolCallChunk updates when no visible output is emitted, so function-call deltas can be lost if a chunk contains only tool call data.</violation>
</file>
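The second violation is essentially a missing emission guard: a chunk should be forwarded whenever it carries tool-call data, even if its visible text is empty. A minimal sketch of that guard; the `StreamDelta` shape and field names are assumptions, not the file's real types:

```typescript
// Hypothetical delta shape; field names are illustrative, not the PR's types.
interface StreamDelta {
  text: string;               // visible (post-strip) text, possibly ""
  toolCallChunk?: unknown;    // partial function-call data, if any
}

// Emit a delta when there is visible text OR tool-call data, so that
// tool-call-only chunks are not silently dropped.
function toEmit(visibleText: string, toolCallChunk?: unknown): StreamDelta | null {
  if (visibleText.length === 0 && toolCallChunk === undefined) return null;
  return { text: visibleText, toolCallChunk };
}
```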

Since this is your first cubic review, here's how it works:

  • cubic automatically reviews your code and comments on bugs and improvements
  • Teach cubic by replying to its comments. cubic learns from your replies and gets better over time
  • Add one-off context when rerunning by tagging @cubic-dev-ai with guidance or docs links (including llms.txt)
  • Ask questions if you need clarification on any suggestion

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

…olCallChunk

- Fix streaming think-tag stripping to handle tags split across chunk boundaries
- Preserve toolCallChunk updates when no visible text output is emitted

Made-with: Cursor
Copy link
Contributor

cubic-dev-ai (bot) left a comment


1 issue found across 1 file (changes from recent commits).

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="src/lib/models/providers/minimax/minimaxLLM.ts">

<violation number="1" location="src/lib/models/providers/minimax/minimaxLLM.ts:89">
P2: Trailing content can be dropped because the buffered tail is never flushed when the stream ends; on `chunk.done`, the remaining buffer is discarded and the final characters are lost.</violation>
</file>
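The fix this violation asks for is a flush at end-of-stream. A minimal sketch, assuming a stripper with `push`/`flush` methods (hypothetical names) and raw chunks shaped as `{ done, text }`; the real code streams asynchronously, but a synchronous generator keeps the idea visible:

```typescript
interface RawChunk {
  done: boolean;
  text: string;
}

interface Stripper {
  push(chunk: string): string;
  flush(): string;
}

function* stripStream(chunks: Iterable<RawChunk>, stripper: Stripper): Generator<string> {
  for (const chunk of chunks) {
    if (chunk.done) {
      // Without this flush, text held back as a possible partial tag
      // would be silently dropped when the stream ends.
      const tail = stripper.flush();
      if (tail) yield tail;
      return;
    }
    const visible = stripper.push(chunk.text);
    if (visible) yield visible;
  }
}
```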

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>

xkonjin left a comment


Quick review pass:

  • Main risk area here is auth/session state and stale credential handling.
  • I didn’t see targeted regression coverage in the diff; please add (or point CI at) a focused test for the changed paths in the two index.ts files and minimaxLLM.ts.
  • Before merge, I’d smoke-test the behavior those files touch with malformed-input, retry, and rollback cases, since that’s where this class of change usually breaks.

- Fix trailing content being dropped when stream ends by flushing buffer
- Discard incomplete think tag content at stream end
- Add vitest testing framework with 18 unit tests covering:
  - Think tag stripping (single/multiple/multiline)
  - Chunk boundary handling for split tags
  - Buffer flushing on stream end
  - toolCallChunk preservation
  - Edge cases and error scenarios

Addresses review feedback from PR ItzCrazyKns#1028

Made-with: Cursor
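For reference, the vitest wiring this commit describes could look roughly like the following `package.json` fragment. Only the script names and package names come from the PR summary; the exact commands and version ranges are assumptions:

```json
{
  "scripts": {
    "test": "vitest",
    "test:run": "vitest run",
    "test:coverage": "vitest run --coverage"
  },
  "devDependencies": {
    "vitest": "^2.0.0",
    "@vitest/coverage-v8": "^2.0.0"
  }
}
```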
@ximiximi423 (Author)

Thanks for the review @xkonjin ! I have added the fix for buffer flushing on stream end and unit tests covering think-tag stripping, chunk boundary handling, and toolCallChunk preservation. Please review it when you have a moment.

