feat: add LightLLM provider support #806
Open
With this change, you can specify a LightLLM provider for Codex.
Command line:

```shell
export LIGHTLLM_API_KEY=
export LIGHTLLM_BASE_URL=http://xxx.xxx.xxx.xxx:port
node ./codex-cli/dist/cli.js --provider lightllm --model claude-3-7-sonnet --fullAuto
```

or:

```shell
codex --provider lightllm --model claude-3-7-sonnet --fullAuto
```
config.yaml

An example LightLLM configuration file (the `model_name` entry matches the `--model claude-3-7-sonnet` flag used above):

```yaml
model_list:
  - model_name: claude-3-7-sonnet
    litellm_params:
      model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
      aws_access_key_id:
      aws_secret_access_key:
      aws_region_name: us-east-1
      api_key:
```
LightLLM Deploy:

```shell
docker run -d -v /tmp/litellm/config.yaml:/app/config/config.yaml -p 9080:4000 ghcr.io/berriai/litellm:main-latest --config /app/config/config.yaml --detailed_debug
```
Test:
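One quick way to exercise the deployed proxy before pointing Codex at it is a plain chat-completions request. This is a sketch, assuming the proxy exposes an OpenAI-compatible `/v1/chat/completions` endpoint (as the LiteLLM image used above does) and that the `LIGHTLLM_*` variables exported earlier are set; the prompt is illustrative.

```python
import json
import os
import urllib.request

# Fall back to the docker-mapped port from the deploy command above.
base_url = os.environ.get("LIGHTLLM_BASE_URL", "http://localhost:9080")
api_key = os.environ.get("LIGHTLLM_API_KEY", "")

# OpenAI-compatible chat-completions payload; the model name must match
# a model_name entry in config.yaml.
payload = {
    "model": "claude-3-7-sonnet",
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}

req = urllib.request.Request(
    f"{base_url}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    },
)

# Only hit the network when the proxy address is actually configured;
# otherwise just show the request body that would be sent.
if "LIGHTLLM_BASE_URL" in os.environ:
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    print(json.dumps(payload, indent=2))
```

If the proxy is healthy, the response body contains a `choices` list whose first entry holds the assistant message, which is the same shape Codex expects from the provider.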