feat: add MiniMax as first-class LLM provider (M2.7/M2.5, 204K context) #3
Open
octo-patch wants to merge 1 commit into AmeNetwork:main from
Conversation
Introduces aser/providers.py with preset configs for MiniMax and OpenAI. MiniMax M2.7/M2.5 families are supported via an OpenAI-compatible API at https://api.minimax.io/v1, all with 204K context windows.

- aser/providers.py: MINIMAX preset, MINIMAX_MODELS dict, custom_provider()
- aser/__init__.py: export providers submodule
- examples/agent_minimax.py: end-to-end usage demo
- tests/test_providers.py: 26 unit + 3 integration tests
- .env.example: MINIMAX_API_KEY entry
- README.md / README_CN.md: provider table + MiniMax quick-start
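Since the endpoint is OpenAI-compatible, the end-to-end demo presumably reduces to a standard chat-completions call. Below is a stdlib-only sketch of that call; the function names and request shape are assumptions for illustration, not the contents of examples/agent_minimax.py:

```python
import json
import os
import urllib.request

MINIMAX_BASE_URL = "https://api.minimax.io/v1"


def build_chat_request(model, prompt, api_key):
    """Build an OpenAI-style chat-completions request for MiniMax."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{MINIMAX_BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def chat(model, prompt):
    """Send one prompt and return the assistant's reply text."""
    req = build_chat_request(model, prompt, os.environ["MINIMAX_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With MINIMAX_API_KEY set, `chat("MiniMax-M2.7", "hello")` would issue a single request against the OpenAI-compatible endpoint named in the PR.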
Summary
This PR adds MiniMax as a first-class LLM provider for Aser via a new aser/providers.py module. MiniMax offers OpenAI-compatible API access to its latest model families, all with 204K-token context windows, the largest available in this class.
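The diff itself is not shown here, so the following is only a minimal sketch of what such a preset module could look like, assuming the names from this summary (MINIMAX, MINIMAX_MODELS, custom_provider) and the alias/model pairs from the Supported models section; the real field layout may differ:

```python
# Hypothetical sketch of aser/providers.py; field names are assumptions.

MINIMAX_MODELS = {
    "standard": "MiniMax-M2.7",
    "fast": "MiniMax-M2.7-highspeed",
    "standard_v25": "MiniMax-M2.5",
    "fast_v25": "MiniMax-M2.5-highspeed",
}

MINIMAX = {
    "base_url": "https://api.minimax.io/v1",
    "api_key_env": "MINIMAX_API_KEY",
    "default_model": MINIMAX_MODELS["standard"],
    "context_window": 204_000,
}


def custom_provider(base_url, api_key_env, default_model, context_window=None):
    """Build a provider config for any other OpenAI-compatible endpoint."""
    return {
        "base_url": base_url,
        "api_key_env": api_key_env,
        "default_model": default_model,
        "context_window": context_window,
    }
```

A plain-dict preset plus a constructor helper keeps third-party endpoints on the same code path as the built-in presets.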
Changes
- aser/providers.py: MINIMAX config preset, MINIMAX_MODELS dict (MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed), OPENAI preset, custom_provider() helper
- aser/__init__.py: exports the providers submodule
- examples/agent_minimax.py: end-to-end usage demo
- tests/test_providers.py: 26 unit + 3 integration tests
- .env.example: MINIMAX_API_KEY entry with model notes
- README.md / README_CN.md: provider table + MiniMax quick-start

Quick start
```bash
export MINIMAX_API_KEY=your_api_key_here
```

Supported models
| Alias | Model |
| --- | --- |
| standard | MiniMax-M2.7 |
| fast | MiniMax-M2.7-highspeed |
| standard_v25 | MiniMax-M2.5 |
| fast_v25 | MiniMax-M2.5-highspeed |

Tests
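The test bodies are not shown in this view. As an illustration only, a self-contained check in the spirit of tests/test_providers.py (alias table copied from the Supported models section above; the actual 26 unit and 3 integration tests are not reproduced here) might look like:

```python
# Hypothetical unit-test sketch; inlines the alias table so it runs standalone.

MINIMAX_MODELS = {
    "standard": "MiniMax-M2.7",
    "fast": "MiniMax-M2.7-highspeed",
    "standard_v25": "MiniMax-M2.5",
    "fast_v25": "MiniMax-M2.5-highspeed",
}


def test_aliases_resolve():
    # Every documented alias is present and maps to a MiniMax M2.x model name.
    assert set(MINIMAX_MODELS) == {"standard", "fast", "standard_v25", "fast_v25"}
    assert all(m.startswith("MiniMax-M2") for m in MINIMAX_MODELS.values())


test_aliases_resolve()
```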