---
description: Custom provider for third-party APIs and local models
---
For models accessed via OpenAI-compatible APIs, such as:

- **Third-party API proxies**: Use a unified API Base to call multiple models
- **Local models**: Models deployed locally via Ollama, vLLM, LocalAI, etc.
- **Private deployments**: Self-hosted model services within your organization
<Note>
Unlike the `openai` provider, switching models under the Custom provider will not auto-switch the provider type. Your custom API address is always preserved.
</Note>
## Configuration
### Third-party API Proxy
```json
{
  "bot_type": "custom",
  "model": "deepseek-chat",
  "custom_api_key": "YOUR_API_KEY",
  "custom_api_base": "https://{your-proxy.com}/v1"
}
```
| Parameter | Description |
| --- | --- |
| `bot_type` | Must be set to `custom` |
| `model` | Model name; any model supported by your proxy service |
| `custom_api_key` | API key provided by your proxy service |
| `custom_api_base` | API base URL; must be OpenAI-compatible |
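"OpenAI-compatible" means the base URL must serve the standard chat-completions endpoint. As a rough illustration (not part of this project's code — the proxy URL below is a placeholder), this is the request shape a client derives from the configuration above:

```python
import json

# Placeholder values standing in for the configuration fields above.
custom_api_base = "https://your-proxy.example.com/v1"
custom_api_key = "YOUR_API_KEY"

# An OpenAI-compatible server exposes chat at {base}/chat/completions.
url = custom_api_base.rstrip("/") + "/chat/completions"
headers = {
    "Authorization": f"Bearer {custom_api_key}",
    "Content-Type": "application/json",
}
body = json.dumps({
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "ping"}],
})

print(url)  # https://your-proxy.example.com/v1/chat/completions
```

If a POST of this shape to your proxy returns a normal chat-completions response, the base URL should work as `custom_api_base`.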
### Local Models
Local models typically don't require an API key — just set the API base:
```json
{
  "bot_type": "custom",
  "model": "qwen3.5:27b",
  "custom_api_base": "http://localhost:11434/v1"
}
```
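A minimal sketch of how a client might handle the optional key (an assumption about typical client behavior, not this project's implementation): when `custom_api_key` is absent, the `Authorization` header is simply omitted, which local servers such as Ollama accept.

```python
# Sketch: build request headers from a keyless local-model configuration.
config = {
    "bot_type": "custom",
    "model": "qwen3.5:27b",
    "custom_api_base": "http://localhost:11434/v1",
}

headers = {"Content-Type": "application/json"}
api_key = config.get("custom_api_key")
if api_key:  # no key configured for local models, so this branch is skipped
    headers["Authorization"] = f"Bearer {api_key}"

print(headers)  # {'Content-Type': 'application/json'}
```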
Common local deployment tools and their default addresses: