From 9a8c9df2ed1fcb5b394944cb7273c845cf677b41 Mon Sep 17 00:00:00 2001
From: notgitika
Date: Wed, 4 Mar 2026 16:48:00 -0500
Subject: [PATCH] docs: add page for OpenAI Responses API model provider

---
 .../concepts/model-providers/index.md         |   6 +-
 .../model-providers/openai-responses.md       | 122 ++++++++++++++++++
 .../concepts/model-providers/openai.md        |   8 +-
 mkdocs.yml                                    |   3 +-
 4 files changed, 134 insertions(+), 5 deletions(-)
 create mode 100644 docs/user-guide/concepts/model-providers/openai-responses.md

diff --git a/docs/user-guide/concepts/model-providers/index.md b/docs/user-guide/concepts/model-providers/index.md
index 774797f5b..adfaa461e 100644
--- a/docs/user-guide/concepts/model-providers/index.md
+++ b/docs/user-guide/concepts/model-providers/index.md
@@ -13,7 +13,8 @@ The following table shows all model providers supported by Strands Agents SDK an
 | [Custom Providers](custom_model_provider.md) | ✅ | ✅ |
 | [Amazon Bedrock](amazon-bedrock.md) | ✅ | ✅ |
 | [Amazon Nova](amazon-nova.md) | ✅ | ❌ |
-| [OpenAI](openai.md) | ✅ | ✅ |
+| [OpenAI (Chat Completions)](openai.md) | ✅ | ✅ |
+| [OpenAI (Responses API)](openai-responses.md) | ✅ | ❌ |
 | [Anthropic](anthropic.md) | ✅ | ❌ |
 | [Gemini](gemini.md) | ✅ | ✅ |
 | [LiteLLM](litellm.md) | ✅ | ❌ |
@@ -98,7 +99,8 @@ Each provider follows a similar pattern for initialization and usage.
Models are

 ### Explore Model Providers

 - **[Amazon Bedrock](amazon-bedrock.md)** - Default provider with wide model selection, enterprise features, and full Python/TypeScript support
-- **[OpenAI](openai.md)** - GPT models with streaming support
+- **[OpenAI (Chat Completions)](openai.md)** - GPT models via the Chat Completions API
+- **[OpenAI (Responses API)](openai-responses.md)** - GPT models via the newer Responses API
 - **[Gemini](gemini.md)** - Google's Gemini models with tool calling support
 - **[Custom Providers](custom_model_provider.md)** - Build your own model integration
 - **[Anthropic](anthropic.md)** - Direct Claude API access (Python only)
diff --git a/docs/user-guide/concepts/model-providers/openai-responses.md b/docs/user-guide/concepts/model-providers/openai-responses.md
new file mode 100644
index 000000000..0825b8bd7
--- /dev/null
+++ b/docs/user-guide/concepts/model-providers/openai-responses.md
@@ -0,0 +1,122 @@
+# OpenAI Responses API
+
+!!! info "Language Support"
+    This provider is only supported in Python.
+
+The [OpenAI Responses API](https://platform.openai.com/docs/api-reference/responses/create) is OpenAI's latest API for interacting with their models. The Strands Agents SDK provides a dedicated `OpenAIResponsesModel` provider that uses this API, supporting streaming, tool calling, and structured output.
+
+!!! note "Looking for the Chat Completions API?"
+    If you want to use OpenAI's Chat Completions API instead, see the [OpenAI (Chat Completions)](openai.md) provider.
+
+## Installation
+
+OpenAI is configured as an optional dependency in Strands Agents. The Responses API provider requires the OpenAI Python SDK **v2.0.0 or later**.
To install, run:
+
+```bash
+pip install 'strands-agents[openai]' strands-agents-tools
+```
+
+## Usage
+
+After installing dependencies, you can import and initialize the OpenAI Responses API provider as follows:
+
+```python
+from strands import Agent
+from strands.models.openai_responses import OpenAIResponsesModel
+from strands_tools import calculator
+
+model = OpenAIResponsesModel(
+    client_args={
+        "api_key": "<KEY>",
+    },
+    model_id="gpt-4o",
+    params={
+        "max_output_tokens": 1000,
+        "temperature": 0.7,
+    }
+)
+
+agent = Agent(model=model, tools=[calculator])
+response = agent("What is 2+2?")
+print(response)
+```
+
+To connect to a custom OpenAI-compatible server:
+
+```python
+model = OpenAIResponsesModel(
+    client_args={
+        "api_key": "<KEY>",
+        "base_url": "<BASE_URL>",
+    },
+    ...
+)
+```
+
+## Configuration
+
+### Client Configuration
+
+The `client_args` configure the underlying `AsyncOpenAI` client. For a complete list of available arguments, refer to the OpenAI [source](https://github.com/openai/openai-python).
+
+### Model Configuration
+
+The model configuration sets parameters for inference:
+
+| Parameter | Description | Example | Options |
+|-----------|-------------|---------|---------|
+| `model_id` | ID of a model to use | `gpt-4o` | [reference](https://platform.openai.com/docs/models) |
+| `params` | Model-specific parameters | `{"max_output_tokens": 1000, "temperature": 0.7}` | [reference](https://platform.openai.com/docs/api-reference/responses/create) |
+
+## Troubleshooting
+
+**Module Not Found**
+
+If you encounter the error `ModuleNotFoundError: No module named 'openai'`, this means you haven't installed the `openai` dependency in your environment. To fix, run `pip install 'strands-agents[openai]'`.
+
+**OpenAI SDK Version**
+
+The Responses API provider requires the OpenAI Python SDK v2.0.0 or later. If you encounter an `ImportError` about the SDK version, upgrade with `pip install --upgrade openai`.
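Before debugging further, it can help to confirm which `openai` package version is actually installed. The snippet below is a minimal sketch using only the Python standard library (`importlib.metadata`); the `meets_minimum` helper name is illustrative and not part of Strands or the OpenAI SDK:

```python
from importlib.metadata import PackageNotFoundError, version

def meets_minimum(installed: str, minimum_major: int = 2) -> bool:
    """Return True when the major version component meets the minimum."""
    # "2.1.0" -> 2; a pre-release string like "2.0.0rc1" still yields "2".
    return int(installed.split(".")[0]) >= minimum_major

try:
    installed = version("openai")  # raises PackageNotFoundError if openai is absent
except PackageNotFoundError:
    print("openai is not installed; run: pip install 'strands-agents[openai]'")
else:
    if meets_minimum(installed):
        print(f"openai {installed} satisfies the v2.0.0+ requirement")
    else:
        print(f"openai {installed} is too old; run: pip install --upgrade openai")
```

Running this in the same environment as your agent distinguishes a missing install from an outdated one.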
+
+## Advanced Features
+
+### Structured Output
+
+The OpenAI Responses API provider supports structured output through the `responses.parse(...)` endpoint. When you use `Agent.structured_output()`, the Strands SDK uses this to return typed results conforming to your Pydantic model.
+
+```python
+from pydantic import BaseModel, Field
+from strands import Agent
+from strands.models.openai_responses import OpenAIResponsesModel
+
+class PersonInfo(BaseModel):
+    """Extract person information from text."""
+    name: str = Field(description="Full name of the person")
+    age: int = Field(description="Age in years")
+    occupation: str = Field(description="Job or profession")
+
+model = OpenAIResponsesModel(
+    client_args={"api_key": "<KEY>"},
+    model_id="gpt-4o",
+)
+
+agent = Agent(model=model)
+
+result = agent.structured_output(
+    PersonInfo,
+    "John Smith is a 30-year-old software engineer working at a tech startup."
+)
+
+print(f"Name: {result.name}")  # "John Smith"
+print(f"Age: {result.age}")  # 30
+print(f"Job: {result.occupation}")  # "software engineer"
+```
+
+### Reasoning Models
+
+The Responses API provider supports reasoning models (such as o1 and o3) that include chain-of-thought reasoning in their responses. Reasoning content is automatically captured and streamed as `reasoningContent` events.
+
+## References
+
+- [OpenAI Responses API](https://platform.openai.com/docs/api-reference/responses/create)
+- [OpenAI (Chat Completions) Provider](openai.md)
diff --git a/docs/user-guide/concepts/model-providers/openai.md b/docs/user-guide/concepts/model-providers/openai.md
index 4c4996ad7..5933b2297 100644
--- a/docs/user-guide/concepts/model-providers/openai.md
+++ b/docs/user-guide/concepts/model-providers/openai.md
@@ -1,6 +1,9 @@
-# OpenAI
+# OpenAI (Chat Completions)
 
-[OpenAI](https://platform.openai.com/docs/overview) is an AI research and deployment company that provides a suite of powerful language models.
The Strands Agents SDK implements an OpenAI provider, allowing you to run agents against any OpenAI or OpenAI-compatible model.
+[OpenAI](https://platform.openai.com/docs/overview) is an AI research and deployment company that provides a suite of powerful language models. The Strands Agents SDK implements an OpenAI provider using the [Chat Completions API](https://platform.openai.com/docs/api-reference/chat/create), allowing you to run agents against any OpenAI or OpenAI-compatible model.
+
+!!! note "Looking for the Responses API?"
+    OpenAI also offers a newer [Responses API](https://platform.openai.com/docs/api-reference/responses/create). The Strands SDK provides a dedicated provider for it; see the [OpenAI Responses API](openai-responses.md) provider.
 
 ## Installation
@@ -203,3 +206,4 @@ Users can pass their own custom OpenAI client to the OpenAIModel for Strands Agents to use.
 
 - [API](../../../api-reference/python/models/model.md)
 - [OpenAI](https://platform.openai.com/docs/overview)
+- [OpenAI Responses API Provider](openai-responses.md)
diff --git a/mkdocs.yml b/mkdocs.yml
index 0a606b468..12ef0d287 100644
@@ -117,7 +117,8 @@ nav:
 - LlamaAPI: user-guide/concepts/model-providers/llamaapi.md
 - MistralAI: user-guide/concepts/model-providers/mistral.md
 - Ollama: user-guide/concepts/model-providers/ollama.md
-- OpenAI: user-guide/concepts/model-providers/openai.md
+- OpenAI (Chat Completions): user-guide/concepts/model-providers/openai.md
+- OpenAI (Responses API): user-guide/concepts/model-providers/openai-responses.md
 - SageMaker: user-guide/concepts/model-providers/sagemaker.md
 - Writer: user-guide/concepts/model-providers/writer.md
 - Custom Providers: user-guide/concepts/model-providers/custom_model_provider.md