1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -25,3 +25,4 @@ __*__/
.build

CLAUDE.md
mise.toml
1 change: 1 addition & 0 deletions src/config/navigation.yml
@@ -85,6 +85,7 @@ sidebar:
- docs/user-guide/concepts/model-providers/ollama
- docs/user-guide/concepts/model-providers/openai
- docs/user-guide/concepts/model-providers/sagemaker
- docs/user-guide/concepts/model-providers/vercel-ai-sdk
- docs/user-guide/concepts/model-providers/writer
- docs/user-guide/concepts/model-providers/custom_model_provider
- label: Streaming
137 changes: 137 additions & 0 deletions src/content/docs/user-guide/concepts/model-providers/vercel-ai-sdk.mdx
@@ -0,0 +1,137 @@
---
title: Vercel AI SDK
languages: TypeScript
integrationType: model-provider
---

The [Vercel AI SDK](https://sdk.vercel.ai/) is a TypeScript toolkit for building AI-powered applications. It defines a [Language Model Specification](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v3) that standardizes how applications interact with LLMs across providers. The Strands Agents SDK includes a `VercelModel` adapter that wraps any Language Model Specification v3 (`LanguageModelV3`) provider for use as a Strands model provider.

This means you can bring models from the entire Vercel AI SDK ecosystem - including `@ai-sdk/openai`, `@ai-sdk/anthropic`, `@ai-sdk/amazon-bedrock`, `@ai-sdk/google`, and [many more](https://sdk.vercel.ai/docs/foundations/providers-and-models) - directly into Strands agents.

## Installation

Install the Strands SDK along with the Vercel AI SDK provider package for the model you want to use:

```bash
# OpenAI
npm install @strands-agents/sdk @ai-sdk/openai

# Amazon Bedrock
npm install @strands-agents/sdk @ai-sdk/amazon-bedrock

# Anthropic
npm install @strands-agents/sdk @ai-sdk/anthropic

# Google Generative AI
npm install @strands-agents/sdk @ai-sdk/google
```

The `@ai-sdk/provider` package (which defines the `LanguageModelV3` interface) is listed as an optional peer dependency of `@strands-agents/sdk` and will be installed automatically with any `@ai-sdk/*` provider.

## Usage

Create a `LanguageModelV3` instance from any Vercel provider and wrap it with `VercelModel`:

### OpenAI

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:basic_usage_openai"
```

### Amazon Bedrock

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:basic_usage_bedrock"
```

### Anthropic

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:basic_usage_anthropic"
```

### Google Generative AI

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:basic_usage_google"
```

## Configuration

The second argument to `VercelModel` accepts configuration options. These include all [LanguageModelV3CallOptions](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v3) settings (temperature, topP, topK, penalties, stop sequences, seed, etc.) plus the base Strands model config fields.

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:config_example"
```

| Parameter | Description | Example |
|-----------|-------------|---------|
| `modelId` | Override the model ID (defaults to the provider's model ID) | `'gpt-4o'` |
| `maxTokens` | Maximum tokens to generate | `1000` |
| `temperature` | Controls randomness | `0.7` |
| `topP` | Nucleus sampling | `0.9` |
| `topK` | Top-k sampling | `40` |
| `presencePenalty` | Encourages new topics | `0.5` |
| `frequencyPenalty` | Reduces repetition | `0.5` |
| `stopSequences` | Custom stop sequences | `['END']` |
| `seed` | Deterministic generation | `42` |

When new fields are added to the Language Model Specification, they become available in the config automatically.

## Streaming

The adapter supports streaming text, reasoning content, and tool use:

```typescript
--8<-- "user-guide/concepts/model-providers/vercel-ai-sdk.ts:streaming"
```

## Supported features

The `VercelModel` adapter handles:

- Streaming text, reasoning, and tool use (both incremental and complete tool call events)
- Message formatting: text, images, documents, video, tool use/results, and reasoning blocks
- Tool specification and tool choice mapping
- Usage and token tracking including cache read/write tokens
- Error classification: maps provider errors to `ModelThrottledError`, `ContextWindowOverflowError`, and `ModelError`
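
The error classification above lends itself to a simple retry policy. Below is a hedged sketch of a generic helper; `withRetry` and `isRetryable` are illustrative names, not part of the SDK, and the backoff numbers are arbitrary:

```typescript
// Generic retry-with-backoff helper. Pass a predicate that decides
// which errors are retryable (e.g. `err instanceof ModelThrottledError`).
async function withRetry<T>(
  fn: () => Promise<T>,
  isRetryable: (err: unknown) => boolean,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastErr: unknown
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastErr = err
      // Give up on non-retryable errors or when attempts are exhausted.
      if (!isRetryable(err) || attempt === maxAttempts) throw err
      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)))
    }
  }
  throw lastErr
}
```

You could then pass `() => agent.invoke('...')` as `fn` and check for `ModelThrottledError` in the predicate, while letting `ContextWindowOverflowError` fall through to history-trimming logic instead.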

## Compatible providers

Any package that implements the `LanguageModelV3` interface works with `VercelModel`. This includes all [official Vercel AI SDK providers](https://sdk.vercel.ai/docs/foundations/providers-and-models) and community providers:

| Provider | Package |
|----------|---------|
| OpenAI | `@ai-sdk/openai` |
| Amazon Bedrock | `@ai-sdk/amazon-bedrock` |
| Anthropic | `@ai-sdk/anthropic` |
| Google Generative AI | `@ai-sdk/google` |
| Google Vertex | `@ai-sdk/google-vertex` |
| Azure OpenAI | `@ai-sdk/azure` |
| Mistral | `@ai-sdk/mistral` |
| Cohere | `@ai-sdk/cohere` |
| xAI Grok | `@ai-sdk/xai` |
| DeepSeek | `@ai-sdk/deepseek` |
| Groq | `@ai-sdk/groq` |

See the [Vercel AI SDK providers page](https://sdk.vercel.ai/docs/foundations/providers-and-models) for the full list.

## Troubleshooting

### Missing peer dependency

If you see warnings about `@ai-sdk/provider`, install it explicitly:

```bash
npm install @ai-sdk/provider
```

### Authentication errors

Authentication is handled by the underlying Vercel provider package. Refer to the specific provider's documentation for credential setup - for example, `@ai-sdk/openai` reads `OPENAI_API_KEY` from the environment, and `@ai-sdk/amazon-bedrock` uses the standard AWS credential chain.
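
If you prefer not to rely on environment variables, most provider packages also expose a factory for passing credentials explicitly. A sketch using `createOpenAI` from `@ai-sdk/openai` (check your provider package for its equivalent; the environment variable name here is illustrative):

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { createOpenAI } from '@ai-sdk/openai'

// Build a provider instance with an explicit API key instead of
// relying on OPENAI_API_KEY being set in the environment.
const openai = createOpenAI({ apiKey: process.env.MY_OPENAI_KEY })

const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})
```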

## References

- [Vercel AI SDK](https://sdk.vercel.ai/)
- [Language Model Specification v3](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v3)
- [Vercel AI SDK Providers](https://sdk.vercel.ai/docs/foundations/providers-and-models)
105 changes: 105 additions & 0 deletions src/content/docs/user-guide/concepts/model-providers/vercel-ai-sdk.ts
@@ -0,0 +1,105 @@
/**
* TypeScript examples for Vercel AI SDK model provider documentation.
*/
// @ts-nocheck

import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { bedrock } from '@ai-sdk/amazon-bedrock'
import { openai } from '@ai-sdk/openai'
import { anthropic } from '@ai-sdk/anthropic'
import { google } from '@ai-sdk/google'

// Basic usage with OpenAI
async function basicUsageOpenAI() {
// --8<-- [start:basic_usage_openai]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
model: new VercelModel(openai('gpt-4o')),
})

const result = await agent.invoke('Hello!')
console.log(result)
// --8<-- [end:basic_usage_openai]
}

// Basic usage with Bedrock
async function basicUsageBedrock() {
// --8<-- [start:basic_usage_bedrock]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { bedrock } from '@ai-sdk/amazon-bedrock'

const agent = new Agent({
model: new VercelModel(bedrock('us.anthropic.claude-sonnet-4-20250514-v1:0')),
})

const result = await agent.invoke('Hello!')
console.log(result)
// --8<-- [end:basic_usage_bedrock]
}

// Basic usage with Anthropic
async function basicUsageAnthropic() {
// --8<-- [start:basic_usage_anthropic]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { anthropic } from '@ai-sdk/anthropic'

const agent = new Agent({
model: new VercelModel(anthropic('claude-sonnet-4-20250514')),
})

const result = await agent.invoke('Hello!')
console.log(result)
// --8<-- [end:basic_usage_anthropic]
}

// Basic usage with Google
async function basicUsageGoogle() {
// --8<-- [start:basic_usage_google]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { google } from '@ai-sdk/google'

const agent = new Agent({
model: new VercelModel(google('gemini-2.5-flash')),
})

const result = await agent.invoke('Hello!')
console.log(result)
// --8<-- [end:basic_usage_google]
}

// Configuration example
async function configExample() {
// --8<-- [start:config_example]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const model = new VercelModel(openai('gpt-4o'), {
maxTokens: 1000,
temperature: 0.7,
topP: 0.9,
})

const agent = new Agent({ model })
const result = await agent.invoke('Write a short poem')
console.log(result)
// --8<-- [end:config_example]
}

// Streaming example
async function streamingExample() {
// --8<-- [start:streaming]
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
model: new VercelModel(openai('gpt-4o')),
})

for await (const event of agent.stream('Tell me a story')) {
if (event.type === 'modelContentBlockDeltaEvent' && event.delta.type === 'textDelta') {
process.stdout.write(event.delta.text)
}
}
// --8<-- [end:streaming]
}