
feat: add MiniMax as first-class LLM provider #139

Open
octo-patch wants to merge 1 commit into AIDC-AI:main from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch commented Mar 21, 2026

Summary

  • Add MiniMax as a first-class LLM provider alongside OpenAI and LMStudio
  • Support MiniMax-M2.7 and MiniMax-M2.7-highspeed models (1M context window, OpenAI-compatible API)
  • Auto-detect MiniMax via MINIMAX_API_KEY / CC_MINIMAX_API_KEY environment variables
  • Return static model list for MiniMax (no /v1/models endpoint available)
  • Verify MiniMax API key via chat completions endpoint
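
A minimal sketch of the detection and env-default logic described above. The names `is_minimax_url`, `MINIMAX_MODELS`, and `apply_llm_env_defaults` come from the PR description, but the bodies here are assumptions, not the actual code in `backend/utils/globals.py`:

```python
import os
from urllib.parse import urlparse

# Model IDs taken from the PR description; any further entries are unknown.
MINIMAX_MODELS = ["MiniMax-M2.7", "MiniMax-M2.7-highspeed"]

def is_minimax_url(base_url: str) -> bool:
    """Return True if base_url points at the MiniMax API (hostname check)."""
    host = urlparse(base_url or "").hostname or ""
    return host == "api.minimax.io" or host.endswith(".minimax.io")

def apply_llm_env_defaults(config: dict) -> dict:
    """Fill a missing api_key/base_url from MINIMAX_API_KEY / CC_MINIMAX_API_KEY.

    The precedence between the two variables is an assumption here.
    """
    key = os.environ.get("MINIMAX_API_KEY") or os.environ.get("CC_MINIMAX_API_KEY")
    if key and not config.get("api_key"):
        config["api_key"] = key
        config.setdefault("base_url", "https://api.minimax.io/v1")
    return config
```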

Changes (9 files, 631 additions)

  • backend/utils/globals.py: is_minimax_url(), MINIMAX_MODELS, env var support, auto-detect in apply_llm_env_defaults()
  • backend/agent_factory.py: Default to MiniMax-M2.7 when base_url points to MiniMax API
  • backend/controller/llm_api.py: Static MiniMax model list, key verification via chat completions
  • README.md, README_CN.md: MiniMax configuration documentation (env vars + UI)
  • tests/: 27 unit tests + 5 integration tests with ComfyUI runtime stubs
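
The static model list and default-model behavior described above might look roughly like this; `list_models` and `pick_default_model` are illustrative names rather than the PR's actual identifiers, and the non-MiniMax fallbacks are placeholders:

```python
MINIMAX_MODELS = ["MiniMax-M2.7", "MiniMax-M2.7-highspeed"]

def is_minimax_url(base_url: str) -> bool:
    # Simplified stand-in for the real check in backend/utils/globals.py.
    return "minimax.io" in (base_url or "")

def list_models(base_url: str) -> list:
    """MiniMax exposes no /v1/models endpoint, so return the static list."""
    if is_minimax_url(base_url):
        return list(MINIMAX_MODELS)
    # Other providers would query GET {base_url}/models here.
    raise NotImplementedError("dynamic model listing for non-MiniMax providers")

def pick_default_model(base_url: str, requested: str = "") -> str:
    """agent_factory behavior: default to MiniMax-M2.7 for MiniMax base URLs."""
    if requested:
        return requested
    if is_minimax_url(base_url):
        return "MiniMax-M2.7"
    return "gpt-4o"  # placeholder default for other providers (assumption)
```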

Test plan

  • 27 unit tests pass (URL detection, constants, env defaults, agent factory)
  • 5 integration tests pass (real MiniMax API: M2.7, M2.7-highspeed, temp=0, key verification)
  • Manual test in ComfyUI with MiniMax API key
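
Since MiniMax has no `/v1/models` endpoint, the PR verifies keys via the chat-completions endpoint. A sketch of what that probe could look like, with stdlib-only HTTP; the payload shape and function names are assumptions, not the code in `backend/controller/llm_api.py`:

```python
import json
import urllib.error
import urllib.request

def build_verify_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a minimal chat-completions request to verify a MiniMax API key.

    max_tokens=1 keeps the probe cheap; the exact payload is an assumption.
    """
    payload = {
        "model": "MiniMax-M2.7",
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def verify_key(base_url: str, api_key: str) -> bool:
    """Send the probe; HTTP 200 means valid, 401 (or any HTTP error) means not."""
    try:
        with urllib.request.urlopen(build_verify_request(base_url, api_key),
                                    timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```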


CLAassistant commented Mar 21, 2026

CLA assistant check
All committers have signed the CLA.

Add MiniMax (api.minimax.io) as supported LLM provider alongside OpenAI and LMStudio.
MiniMax-M2.7 and MiniMax-M2.7-highspeed models (1M context window).

- backend/utils/globals.py: is_minimax_url(), MINIMAX_MODELS, env vars
- backend/agent_factory.py: MiniMax default model
- backend/controller/llm_api.py: static model list, key verification
- README.md, README_CN.md: MiniMax configuration docs
- tests/: 27 unit tests + 5 integration tests
@octo-patch force-pushed the feature/add-minimax-provider branch from 8497eaf to cb47f15 on March 21, 2026 at 10:56
@octo-patch changed the title from "feat: add MiniMax as LLM provider" to "feat: add MiniMax as first-class LLM provider" on Mar 21, 2026

Jefsky commented Apr 29, 2026

Hi @octo-patch! Just a heads-up: the base_url https://api.minimax.io/v1 was tested in a related Pixelle-Video project and couldn't fetch the model list. The working URL is https://api.minimax.chat/v1 — that one successfully returns the model list (e.g. MiniMax-M2.7). You may want to update this PR to use the correct URL. Thanks!
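
One way to check which base URL actually serves a model list is a plain GET against `{base_url}/models`, as in the OpenAI-compatible listing convention. This helper is illustrative and not part of the PR:

```python
import json
import urllib.error
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET {base_url}/models request (OpenAI-compatible model listing)."""
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/models",
        headers={"Authorization": f"Bearer {api_key}"},
        method="GET",
    )

def fetch_model_ids(base_url: str, api_key: str) -> list:
    """Return the listed model IDs, or [] if the endpoint is missing/unreachable."""
    try:
        with urllib.request.urlopen(build_models_request(base_url, api_key),
                                    timeout=10) as resp:
            data = json.load(resp)
        # OpenAI-compatible responses look like {"data": [{"id": ...}, ...]}.
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, ValueError):
        return []
```

Running `fetch_model_ids` against both candidate base URLs would show which one returns entries such as MiniMax-M2.7.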

