feat: custom LLM provider #20
Open
taylorelley wants to merge 1 commit into saolalab:main from
Adds a UI-configurable Custom (OpenAI-compatible) LLM provider so users can point an agent at any OpenAI-compatible endpoint (e.g. vLLM, LM Studio, a self-hosted gateway) by entering a Base URL, API key, and model name, with no code changes or redeploys required.

- Backend: route models with the `custom/` prefix through CustomProvider in `clawbot/core/config/schema.py`, and strip that prefix before calling the upstream endpoint in `clawbot/providers/custom_provider.py`.
- API: extend `ListModelsRequest` with `api_base` and add a `custom` branch to `/api/providers/models` in `clawforce/apis/providers.py` that lists models from the configured base URL with optional bearer auth.
- UI: register the provider in `ModelProviderSection.tsx`, add the Base URL input, and thread `api_base` through `listModels` in `lib/api.ts` and the supporting agent-detail components.
- Rebuilt static assets under `clawforce/static/`.
Checklist

- Lint passes (`make lint`)
- Tests pass (`make test`)