Commit a6758cd (parent 96c8e8d)

docs: add Foundry Local as a BYOK provider

Add Foundry Local (Microsoft on-device inference) to the BYOK documentation:
- Supported Providers table entry
- Example configuration section with SDK bootstrap snippet
- Provider-Specific Limitations table entry
- Troubleshooting section for connection issues

1 file changed: docs/auth/byok.md (+45, −0)
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 | Azure OpenAI / Azure AI Foundry | `"azure"` | Azure-hosted models |
 | Anthropic | `"anthropic"` | Claude models |
 | Ollama | `"openai"` | Local models via OpenAI-compatible API |
+| Foundry Local | `"openai"` | On-device inference via Microsoft's OpenAI-compatible local server |
 | Other OpenAI-compatible | `"openai"` | vLLM, LiteLLM, etc. |

 ## Quick Start: Azure AI Foundry
@@ -250,6 +251,27 @@ provider: {
 }
 ```

+### Foundry Local (On-Device)
+
+[Foundry Local](https://github.com/microsoft/Foundry-Local) runs AI models on-device without requiring an Azure subscription. It exposes an OpenAI-compatible API at `http://localhost:5272/v1`.
+
+Use the [Foundry Local SDK](https://github.com/microsoft/Foundry-Local#-integrate-with-your-applications-using-the-sdk) to bootstrap the service and model, then point the Copilot SDK at the local endpoint:
+
+```typescript
+// Bootstrap: npm install foundry-local-sdk
+// import { FoundryLocalManager } from "foundry-local-sdk";
+// const manager = new FoundryLocalManager();
+// const modelInfo = await manager.init("phi-3.5-mini");
+
+provider: {
+  type: "openai",
+  baseUrl: "http://localhost:5272/v1", // Default Foundry Local endpoint
+  apiKey: manager.apiKey, // Provided by Foundry Local SDK
+}
+```
+
+> **Note:** Foundry Local must be [installed separately](https://github.com/microsoft/Foundry-Local#installing). Run `foundry model run phi-3.5-mini` to download and start a model.
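The diff above elides the glue between the Foundry Local SDK and the provider object. As a self-contained sketch of that shape (the `ProviderConfig` interface and `makeFoundryLocalProvider` helper are illustrative assumptions, not part of either SDK; in real code the key would come from `manager.apiKey`):

```typescript
// Illustrative only: ProviderConfig and makeFoundryLocalProvider are
// hypothetical helpers, not part of the Copilot or Foundry Local SDKs.
interface ProviderConfig {
  type: "openai"; // Foundry Local speaks the OpenAI-compatible protocol
  baseUrl: string;
  apiKey: string;
}

function makeFoundryLocalProvider(
  apiKey: string,
  baseUrl: string = "http://localhost:5272/v1", // default Foundry Local endpoint
): ProviderConfig {
  return { type: "openai", baseUrl, apiKey };
}

// In real code, pass manager.apiKey from the Foundry Local SDK here.
const provider = makeFoundryLocalProvider("local-dev-key");
console.log(provider.baseUrl); // → http://localhost:5272/v1
```

Centralizing the default endpoint in one helper keeps the config from drifting if a non-default Foundry Local port is used.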
+
 ### Anthropic

 ```typescript
@@ -301,6 +323,7 @@ Some Copilot features may behave differently with BYOK:
 |----------|-------------|
 | Azure AI Foundry | No Entra ID auth; must use API keys |
 | Ollama | No API key; local only; model support varies |
+| Foundry Local | Local only; requires [Foundry Local installed](https://github.com/microsoft/Foundry-Local#installing); model catalog limited to Foundry Local models; REST API is in preview |
 | OpenAI | Subject to OpenAI rate limits and quotas |

 ## Troubleshooting
@@ -364,6 +387,28 @@ curl http://localhost:11434/v1/models
 ollama serve
 ```

+### Connection Refused (Foundry Local)
+
+Ensure Foundry Local is installed and the service is running:
+
+```bash
+# Check if Foundry Local is installed
+foundry --version
+
+# List available models
+foundry model ls
+
+# Start a model (downloads if not cached)
+foundry model run phi-3.5-mini
+
+# Check the service endpoint
+curl http://localhost:5272/v1/models
+```
+
+If not installed:
+- **Windows**: `winget install Microsoft.FoundryLocal`
+- **macOS**: `brew install microsoft/foundrylocal/foundrylocal`
+
 ### Authentication Failed

 1. Verify your API key is correct and not expired
