diff --git a/AGENTS.md b/AGENTS.md index 5eb57ca..34d71c6 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -19,7 +19,7 @@ - **Run Agent C (Textual UI)**: `uv run agent-c` - **Run Console UI**: `uv run run-console` - **Test all**: `uv run pytest` -- **Test agentc only**: `uv run pytest tests/core/ tests/middleware/ tests/adapters/` +- **Test agentc core**: `uv run pytest tests/core/ tests/middleware/ tests/adapters/` - **Type check**: `uv run mypy` - **Lint**: `uv run ruff check` - **Format**: `uv run ruff format` @@ -43,22 +43,37 @@ The project uses an event-driven, layered architecture with clear separation of Event-driven, layered architecture with: -- **`core/`**: Agnostic agentic logic (types, event loop, agent factory, tools). The `tools/` package provides filesystem operations (with combined ignore patterns), file editing with atomic writes and backups, and command execution. `skill_loader.py` discovers `SKILL.md` files from bundled skills (installed to user data directory) and project directories. - - `types.py`: Event-stream union (`AgentEvent`), `AgentSessionProtocol`, tool result dataclasses, and re-exports of patching types. +- **`core/`**: Agnostic agentic logic (types, event loop, commands). Implements shared protocols and types used by all backends. + - `types.py`: Event-stream union (`AgentEvent`), `AgentSessionProtocol`, `SessionFactoryProtocol`, tool result dataclasses, and re-exports of patching types. - `config_types.py`: `BackendConfig` and `ModelConfig` for provider/model presets. - - `command_types.py`: `CommandType`, `CommandResult`, and `CommandEffect` for command parsing/execution. + - `command_types.py`: `CommandType`, `CommandResult`, `SessionConfig`, and `CommandEffect` for command parsing/execution. - `deps.py`: `RunDeps` context for dependency-injected agent runs. - `config.py`: Centralized system constants (output caps, suffixes, default skill dirs, `DEFAULT_MODEL`, `DEFAULT_PROVIDER_DIRS`). Must be UI-agnostic. - - `loop.py`: `AgentSession` implementing the bidirectional async generator loop and mapping pydantic_ai events to `AgentEvent`. - - `factory.py`: `create_agent` factory assembling the `pydantic_ai.Agent` using the model preset configured in `providers.toml` (default preset: `local-oss` on the `ollama` backend), plus the shared toolset and skills table. - `commands.py`: Command parsing (`CommandParser`) and effect-based execution (`execute_command`). - `CommandParser` performs pure parsing without validation - Commands produce pure `CommandEffect` data containing `SessionConfig` - Session factories validate model names and apply configuration to create new sessions - `tool_parsing.py`: Robust JSON/dict argument handling for tool calls. - - `tools/`: Tool package organized by category (see Available Tools section) - `skill_loader.py`: Discovers `SKILL.md` skills from project directories (`.github/skills`, `.claude/skills`), user directory (`~/.agentc/skills`), and bundled skills (installed to platform-specific user data directory). Earlier directories take precedence. - - `provider_loader.py`: Discovers, loads, and merges `providers.toml` files from repo/user/bundled locations (priority: repo > user > bundled). Dynamically imports provider/model classes and builds instances with API keys, base URLs, and model params. 
+ - `backends/`: Backend implementations (see Backend Structure below) + - `patching/`: Structured file patching engine (see Patching Module below) + +- **`core/backends/`**: Backend-specific implementations + - **`pydantic_ai/`**: Pydantic AI backend + - `factory.py`: `create_agent` factory assembling the `pydantic_ai.Agent` using the model preset configured in `providers.toml` (default preset: `ollama-gpt-oss-120b` on the `ollama` backend), plus the shared toolset and skills table. + - `loop.py`: `AgentSession` implementing the bidirectional async generator loop and mapping pydantic_ai events to `AgentEvent`. + - `session_factory.py`: `PydanticAISessionFactory` implementing `SessionFactoryProtocol`. + - `provider_loader.py`: Discovers, loads, and merges `providers.toml` files from repo/user/bundled locations (priority: repo > user > bundled). Dynamically imports provider/model classes and builds instances with API keys, base URLs, and model params. + - `tools/`: Tool package (filesystem, editing, execution) providing filesystem operations with combined ignore patterns, file editing with atomic writes and backups, and command execution. + - **`github_copilot/`**: GitHub Copilot SDK backend + - `loop.py`: `GhAgentSession` implementing `AgentSessionProtocol` using the Copilot SDK. + - `session_factory.py`: `GhCopilotSessionFactory` implementing `SessionFactoryProtocol`. + +- **`core/patching/`**: Structured file patching engine + - `types.py`: `PatchHunk`, `PatchPlan`, `FilePatch`, and result types. + - `engine.py`: Anchor-based hunk matching and application logic. + - `transaction.py`: Transactional file modifications with rollback. + - `errors.py`: Patching-specific exception types. - **`middleware/`**: Cross-cutting concerns (e.g., debouncing) - `debouncing.py`: `DebouncingMiddleware` for text/thinking delta aggregation (default threshold: 40 characters, configurable). @@ -73,11 +88,11 @@ Event-driven, layered architecture with: - `textual_app.py`: The main Textual `App` implementation. - Receives model names list via dependency injection from composition root - Each backend's entry point discovers models using backend-specific mechanisms - - UI layer remains completely backend-agnostic + - UI layer remains completely backend-agnostic, depends on `SessionFactoryProtocol` - `widgets.py`: Reusable UI components (status bar, approval forms, etc.). - **`entrypoints/`**: Application composition roots (dependency injection and bootstrapping) - - `run_textual.py`: Pydantic AI backend launcher (entry point: `agent-c`, `run-textual`) + - `run_textual.py`: Pydantic AI backend launcher (entry points: `agent-c`, `run-textual`) - `run_textual_gh.py`: GitHub Copilot SDK backend launcher (entry point: `run-textual-gh`) - `run_console.py`: Console UI demo launcher (entry point: `run-console`) @@ -118,7 +133,7 @@ Entry points inject concrete factories into the UI layer, which uses the Protoco ## Available Tools -Agent C provides the following tools in the `core/tools/` package: +Agent C provides the following tools in the `core/backends/pydantic_ai/tools/` package: ### Filesystem Tools (`tools/filesystem.py`) @@ -194,18 +209,21 @@ When working on `agentc`, follow these rules strictly: ### Testing (Mandatory) Maintain and update the test suite in `tests/`. 
Must cover: -- `core.loop`: approval handshake, history, and tool call yielding -- `core.factory`: agent creation with model presets +- `core.backends.pydantic_ai.loop`: approval handshake, history, and tool call yielding +- `core.backends.pydantic_ai.factory`: agent creation with model presets +- `core.backends.github_copilot.loop`: GitHub Copilot SDK session integration - `core.commands`: command parsing and effect-based execution - `core.tool_parsing`: robust JSON argument handling -- `core.tools`: +- `core.backends.pydantic_ai.tools`: - `test_tools_filesystem.py`: list_files, glob_paths, search_files - `test_tools_editing.py`: read_file, create_file, edit_file, apply_hunks - `test_tools_execution.py`: run_command - `test_tool_result.py`: tool result mapping - `test_ignore_logic.py`: gitignore support integration +- `core.patching`: patching engine and transaction tests - `core.skill_loader`: skill discovery and skills table rendering -- `core.provider_loader`: provider/model loading and merging +- `core.backends.pydantic_ai.provider_loader`: provider/model loading and merging +- `core.session_factories`: Pydantic AI and GitHub Copilot session factory tests - `middleware.debouncing`: flush logic and delta aggregation - `adapters.textual`: mapping to `adapters.messages` - `adapters.console`: console event mapping and approval flow @@ -294,7 +312,7 @@ When making changes, include in your response: - **Build system**: `uv_build` - **Entry points**: - - `agentc.entrypoints.run_textual:main` (agent-c command - default Textual UI) - - `agentc.entrypoints.run_console:main` (run-console command) - - `agentc.entrypoints.run_textual:main` (run-textual command) - - `agentc.entrypoints.run_textual_gh:main_sync` (run-textual-gh command) + - `agent-c`: `agentc.entrypoints.run_textual:main` (default Textual UI with Pydantic AI backend) + - `run-textual`: `agentc.entrypoints.run_textual:main` (alias for agent-c) + - `run-textual-gh`: `agentc.entrypoints.run_textual_gh:main_sync` (Textual UI with GitHub Copilot SDK backend) + - `run-console`: `agentc.entrypoints.run_console:main` (Console UI demo) diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md index 0fd4853..a47f6bc 100644 --- a/ARCHITECTURE.md +++ b/ARCHITECTURE.md @@ -26,20 +26,26 @@ graph TD ### Layer Responsibilities -**Types Layer** (`types.py`) -- Central `AgentEvent` union (chunks, tool calls, tool results, approvals, done) -- `AgentSessionProtocol` interface +**Types Layer** (`types.py`, `config_types.py`, `command_types.py`) +- Central `AgentEvent` union (chunks, tool calls, tool results, approvals, done) in `types.py` +- `AgentSessionProtocol` and `SessionFactoryProtocol` interfaces in `types.py` +- `BackendConfig` and `ModelConfig` for backend/model presets in `config_types.py` +- `CommandType`, `CommandResult`, `SessionConfig`, `CommandEffect` in `command_types.py` - Shared dataclasses for cross-layer communication -- `CommandEffect` for effect-based command execution -- `BackendConfig` and `ModelConfig` for backend/model presets **Core Layer** (`core/`) -- `loop.py`: `AgentSession` implementing bidirectional async generator loop -- `factory.py`: Agent creation with model presets, tools, and skills - `commands.py`: Command parsing and effect-based execution -- `tools/`: Tool implementations (filesystem, editing, execution) - `skill_loader.py`: Skill discovery from bundled and project directories -- `provider_loader.py`: Dynamic provider/model loading from TOML configs +- `backends/pydantic_ai/`: Pydantic AI backend implementation + - 
`loop.py`: `AgentSession` implementing bidirectional async generator loop + - `factory.py`: Agent creation with model presets, tools, and skills + - `provider_loader.py`: Dynamic provider/model loading from TOML configs + - `session_factory.py`: `PydanticAISessionFactory` implementing `SessionFactoryProtocol` + - `tools/`: Tool implementations (filesystem, editing, execution) +- `backends/github_copilot/`: GitHub Copilot SDK backend implementation + - `loop.py`: `GhAgentSession` implementing `AgentSessionProtocol` + - `session_factory.py`: `GhCopilotSessionFactory` implementing `SessionFactoryProtocol` +- `patching/`: Structured file patching engine (anchor-based hunk matching, transactions) **Middleware Layer** (`middleware/`) - `debouncing.py`: Text/thinking delta aggregation (40 char threshold default) @@ -49,11 +55,12 @@ graph TD - `console.py`: Translates `AgentEvent` to console callbacks - Owns UI-specific message types and approval handshake coordination -**UI Layer** (`ui/`) -- `textual_app.py`: Textual TUI application -- `run_textual.py`: Textual UI entry point -- `run_console.py`: Console UI demo entry point +**UI Layer** (`ui/`, `entrypoints/`) +- `textual_app.py`: Textual TUI application (backend-agnostic) - `widgets.py`: Reusable UI components +- `entrypoints/run_textual.py`: Pydantic AI backend composition root +- `entrypoints/run_textual_gh.py`: GitHub Copilot SDK backend composition root +- `entrypoints/run_console.py`: Console UI demo entry point ## Event Flow @@ -184,7 +191,7 @@ Agent C discovers `providers.toml` files in priority order: Entries from earlier locations override those with the same name later. -### Provider Loader (`core/provider_loader.py`) +### Provider Loader (`core/backends/pydantic_ai/provider_loader.py`) - **`get_default_provider_dirs()`**: Returns discovery paths in priority order - **`load_providers(dirs)`**: Discovers and merges backend/model presets with precedence @@ -198,10 +205,9 @@ provider_cls = "pydantic_ai.providers.ollama.OllamaProvider" model_cls = "pydantic_ai.models.openai.OpenAIChatModel" base_url = "http://localhost:11434/v1" -[models.local-oss] +[models.ollama-gpt-oss-120b] backend = "ollama" model_name = "gpt-oss:120b-cloud" -params = {temperature = 0.2} ``` ## Design Principles diff --git a/README.md b/README.md index deb272d..1b7efe2 100644 --- a/README.md +++ b/README.md @@ -1,10 +1,10 @@ # Agent C -A modern code editing assistant powered by [Pydantic AI](https://ai.pydantic.dev/), featuring an event-driven architecture with skills-based prompting, multiple LLM provider support, and a rich Textual TUI. +A modern code editing assistant powered by [Pydantic AI](https://ai.pydantic.dev/) or the [GitHub Copilot SDK](https://github.com/github/copilot-sdk), featuring an event-driven architecture with skills-based prompting, multiple LLM provider support, and a rich Textual TUI. Hugely inspired by [How to Build an Agent](https://ampcode.com/how-to-build-an-agent) by Thorsten Ball of [AmpCode](https://ampcode.com/). -Agent C uses a layered, event-driven architecture - see [ARCHITECTURE.md](ARCHITECTURE.md) for detailed design and diagrams. +Agent C uses a layered, event-driven architecture with pluggable backends - see [ARCHITECTURE.md](ARCHITECTURE.md) for detailed design and diagrams. ## Features @@ -24,7 +24,7 @@ Agent C uses a layered, event-driven architecture - see [ARCHITECTURE.md](ARCHIT ## Quick Start -Agent C defaults to running with Ollama and the `gpt-oss:120b-cloud` model. 
+Agent C defaults to running with Ollama and the `ollama-gpt-oss-120b` model preset. ### 1. Install Ollama (for local inference) @@ -141,6 +141,14 @@ uv run agent-c uv run run-console ``` +### GitHub Copilot SDK Backend + +```bash +uv run run-textual-gh +``` + +Requires the Copilot CLI to be installed and available in PATH. + ### Override the Model Preset Use the `/model` command within the agent: @@ -148,7 +156,7 @@ Use the `/model` command within the agent: /model claude-sonnet ``` -Available presets (bundled): `local-oss`, `gpt-4o-mini`, `claude-sonnet`, `gemini-flash`, `hf-gpt-oss-120b`, `mistral-large` +Available presets (bundled): `ollama-gpt-oss-120b`, `ollama-gpt-oss-20b`, `ollama-kimi-k2-5`, `ollama-glm-4-7`, `gpt-4o-mini`, `claude-sonnet`, `gemini3-flash`, `hf-gpt-oss-120b`, `mistral-large` ### Run Without Installing @@ -201,8 +209,13 @@ The agent uses a smart editing strategy: └── src/ └── agentc/ # Main Implementation ├── core/ # Agnostic agent logic + │ ├── backends/ # Backend implementations + │ │ ├── pydantic_ai/ # Pydantic AI backend (factory, loop, tools) + │ │ └── github_copilot/ # GitHub Copilot SDK backend + │ └── patching/ # Structured file patching engine ├── middleware/ # Cross-cutting concerns (debouncing) ├── adapters/ # UI framework bridges + ├── entrypoints/ # Application composition roots ├── ui/ # User interfaces (Textual, Console) └── providers.toml # Provider configuration ``` diff --git a/pyproject.toml b/pyproject.toml index ee9291c..5b48b10 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -14,7 +14,7 @@ dependencies = [ license = "MIT" [dependency-groups] -dev = ["mypy>=1.18.2", "pytest>=8.4.2", "ruff>=0.14.2", "isort>=5.12.0"] +dev = ["mypy>=1.18.2", "pytest>=8.4.2", "ruff>=0.14.2", "isort>=5.12.0", "pytest-cov>=4.1.0"] [build-system] requires = ["uv_build>=0.9.6,<0.10.0"] diff --git a/tests/core/test_loop_github_copilot.py b/tests/core/test_loop_github_copilot.py new file mode 100644 index 0000000..b434cfc --- /dev/null +++ b/tests/core/test_loop_github_copilot.py @@ -0,0 +1,408 @@ +"""Tests for GitHub Copilot SDK GhAgentSession.""" + +import asyncio +from typing import Any +from unittest.mock import MagicMock, AsyncMock + +import pytest + +from agentc.core.backends.github_copilot.loop import GhAgentSession +from agentc.core.types import ( + AgentChunk, + AgentDone, + ToolCallInfo, + ToolCallResultInfo, + ToolResult, +) + + +def create_mock_session_event(event_type: str, data: dict[str, Any] | None = None) -> MagicMock: + """Create a mock SessionEvent with the given type and data. + + Args: + event_type: The SessionEventType enum value (as string). + data: Optional event data dictionary. + + Returns: + A MagicMock SessionEvent. 
+ """ + from copilot.generated.session_events import SessionEventType + + event_mock = MagicMock() + event_mock.type = getattr(SessionEventType, event_type) + + data_mock = MagicMock() + if data: + for key, value in data.items(): + setattr(data_mock, key, value) + event_mock.data = data_mock + + return event_mock + + +async def _test_gh_agent_session_run_text(): + """Test that GhAgentSession yields text chunks and completion event.""" + # Create mock CopilotSession + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + # Create GhAgentSession + session = GhAgentSession(mock_copilot_session) + + # Simulate events by directly putting them in the queue + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "Hello"}) + ) + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": " world"}) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + # Run session and collect events + gen = session.run("Hi") + + results = [] + async for event in gen: + results.append(event) + + # Verify events + assert len(results) == 3 + + # First two should be text chunks + assert isinstance(results[0], AgentChunk) + assert results[0].content == "Hello" + assert results[0].is_thought is False + + assert isinstance(results[1], AgentChunk) + assert results[1].content == " world" + assert results[1].is_thought is False + + # Last should be completion + assert isinstance(results[2], AgentDone) + assert results[2].history is None + + +def test_gh_agent_session_run_text(): + asyncio.run(_test_gh_agent_session_run_text()) + + +async def _test_gh_agent_session_run_thinking(): + """Test that GhAgentSession yields thinking chunks.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate thinking/reasoning events + await session._event_queue.put( + create_mock_session_event("ASSISTANT_REASONING_DELTA", {"delta_content": "Let me think..."}) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Analyze this") + + results = [] + async for event in gen: + results.append(event) + + assert len(results) == 2 + + # First should be thinking chunk + assert isinstance(results[0], AgentChunk) + assert results[0].content == "Let me think..." 
+ assert results[0].is_thought is True + + # Last should be completion + assert isinstance(results[1], AgentDone) + + +def test_gh_agent_session_run_thinking(): + asyncio.run(_test_gh_agent_session_run_thinking()) + + +async def _test_gh_agent_session_run_tool_call(): + """Test that GhAgentSession detects and yields tool calls.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate tool execution start + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_START", + { + "tool_name": "read_file", + "arguments": {"path": "/test/file.py"}, + "tool_call_id": "call_123", + } + ) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Read file") + + results = [] + async for event in gen: + results.append(event) + + assert len(results) == 2 + + # First should be tool call + tool_call = results[0] + assert isinstance(tool_call, ToolCallInfo) + assert tool_call.tool_name == "read_file" + assert tool_call.args == {"path": "/test/file.py"} + assert tool_call.tool_call_id == "call_123" + + # Last should be completion + assert isinstance(results[1], AgentDone) + + +def test_gh_agent_session_run_tool_call(): + asyncio.run(_test_gh_agent_session_run_tool_call()) + + +async def _test_gh_agent_session_run_tool_result(): + """Test that GhAgentSession detects and yields tool results.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate tool execution complete + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_COMPLETE", + { + "tool_call_id": "call_123", + "success": True, + "result": "File contents here", + "error": None, + } + ) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Execute tool") + + results = [] + async for event in gen: + results.append(event) + + assert len(results) == 2 + + # First should be tool result + tool_result_info = results[0] + assert isinstance(tool_result_info, ToolCallResultInfo) + assert tool_result_info.tool_call_id == "call_123" + assert isinstance(tool_result_info.result, ToolResult) + assert tool_result_info.result.success is True + assert tool_result_info.result.content == "File contents here" + assert tool_result_info.result.error is None + + # Last should be completion + assert isinstance(results[1], AgentDone) + + +def test_gh_agent_session_run_tool_result(): + asyncio.run(_test_gh_agent_session_run_tool_result()) + + +async def _test_gh_agent_session_run_tool_error(): + """Test that GhAgentSession handles tool execution errors.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate tool execution failure + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_COMPLETE", + { + "tool_call_id": "call_456", + "success": False, + "result": None, + "error": "File not found", + } + ) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Execute tool") + + results = [] + async for event in gen: + results.append(event) + + assert len(results) == 2 + + # First should be tool result with error + tool_result_info = results[0] + assert isinstance(tool_result_info, ToolCallResultInfo) + assert tool_result_info.result.success is False + assert 
tool_result_info.result.error == "File not found" + + +def test_gh_agent_session_run_tool_error(): + asyncio.run(_test_gh_agent_session_run_tool_error()) + + +async def _test_gh_agent_session_filters_internal_tools(): + """Test that GhAgentSession filters internal tools like report_intent.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate internal tool call (should be filtered) + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_START", + { + "tool_name": "report_intent", + "arguments": {"intent": "reading file"}, + "tool_call_id": "call_internal", + } + ) + ) + # Simulate regular tool call (should be yielded) + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_START", + { + "tool_name": "read_file", + "arguments": {"path": "/test.py"}, + "tool_call_id": "call_regular", + } + ) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Test") + + results = [] + async for event in gen: + results.append(event) + + # Should only have the regular tool call + completion + assert len(results) == 2 + assert isinstance(results[0], ToolCallInfo) + assert results[0].tool_name == "read_file" + assert isinstance(results[1], AgentDone) + + +def test_gh_agent_session_filters_internal_tools(): + asyncio.run(_test_gh_agent_session_filters_internal_tools()) + + +async def _test_gh_agent_session_cancellation(): + """Test that GhAgentSession respects cancellation events.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + cancellation_event = asyncio.Event() + + # Queue an event + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "Starting..."}) + ) + + # Start the generator but don't consume yet + gen = session.run("Test", cancellation_event=cancellation_event) + it = aiter(gen) + + # Get first event (cancellation not set yet) + first_event = await anext(it) + assert isinstance(first_event, AgentChunk) + assert first_event.content == "Starting..." 
+ + # Now set cancellation + cancellation_event.set() + + # Queue more events (these should not be yielded) + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "This should not appear"}) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + # Try to get next event - should stop iteration + with pytest.raises(StopAsyncIteration): + await anext(it) + + +def test_gh_agent_session_cancellation(): + asyncio.run(_test_gh_agent_session_cancellation()) + + +async def _test_gh_agent_session_mixed_events(): + """Test that GhAgentSession handles a realistic mix of events.""" + mock_copilot_session = MagicMock() + mock_copilot_session.send = AsyncMock() + + session = GhAgentSession(mock_copilot_session) + + # Simulate a realistic event sequence + await session._event_queue.put( + create_mock_session_event("ASSISTANT_REASONING_DELTA", {"delta_content": "Planning..."}) + ) + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "I'll read "}) + ) + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "the file"}) + ) + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_START", + {"tool_name": "read_file", "arguments": {}, "tool_call_id": "call_1"} + ) + ) + await session._event_queue.put( + create_mock_session_event( + "TOOL_EXECUTION_COMPLETE", + {"tool_call_id": "call_1", "success": True, "result": "content", "error": None} + ) + ) + await session._event_queue.put( + create_mock_session_event("ASSISTANT_MESSAGE_DELTA", {"delta_content": "Done!"}) + ) + await session._event_queue.put( + create_mock_session_event("SESSION_IDLE", {}) + ) + + gen = session.run("Test") + + results = [] + async for event in gen: + results.append(event) + + # Verify event sequence + assert len(results) == 7 + assert isinstance(results[0], AgentChunk) and results[0].is_thought is True + assert isinstance(results[1], AgentChunk) and results[1].is_thought is False + assert isinstance(results[2], AgentChunk) and results[2].is_thought is False + assert isinstance(results[3], ToolCallInfo) + assert isinstance(results[4], ToolCallResultInfo) + assert isinstance(results[5], AgentChunk) and results[5].is_thought is False + assert isinstance(results[6], AgentDone) + + +def test_gh_agent_session_mixed_events(): + asyncio.run(_test_gh_agent_session_mixed_events()) diff --git a/tests/core/test_session_factory_github_copilot.py b/tests/core/test_session_factory_github_copilot.py new file mode 100644 index 0000000..576712b --- /dev/null +++ b/tests/core/test_session_factory_github_copilot.py @@ -0,0 +1,286 @@ +"""Tests for GhCopilotSessionFactory.""" + +from pathlib import Path +from unittest.mock import MagicMock, AsyncMock + +import pytest + +from agentc.core.backends.github_copilot.loop import GhAgentSession +from agentc.core.backends.github_copilot.session_factory import GhCopilotSessionFactory +from agentc.core.command_types import SessionConfig + + +@pytest.mark.anyio +async def test_gh_session_factory_creates_session() -> None: + """Test that factory creates a valid GhAgentSession.""" + # Mock CopilotClient + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock() + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + # Create base config + base_config = { + "model": "gpt-4", + "skill_directories": ["/default/skills"], + "streaming": 
True, + "system_message": "You are a helpful assistant", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig( + model_name=None, + clear_history=True, + skill_dirs=None, + ) + + session = await factory.create_session(config) + + assert isinstance(session, GhAgentSession) + mock_client.create_session.assert_called_once() + + +@pytest.mark.anyio +async def test_gh_session_factory_with_model_override() -> None: + """Test that factory respects model_name override.""" + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock() + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + base_config = { + "model": "gpt-4", + "skill_directories": ["/default/skills"], + "streaming": True, + "system_message": "You are a helpful assistant", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig( + model_name="gpt-4o", + clear_history=True, + ) + + session = await factory.create_session(config) + + assert isinstance(session, GhAgentSession) + + # Verify the model override was passed to create_session + call_args = mock_client.create_session.call_args + session_config = call_args[0][0] + assert session_config["model"] == "gpt-4o" + + +@pytest.mark.anyio +async def test_gh_session_factory_with_skill_dirs_override() -> None: + """Test that factory respects skill_dirs override.""" + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock() + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + base_config = { + "model": "gpt-4", + "skill_directories": ["/default/skills"], + "streaming": True, + "system_message": "You are a helpful assistant", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + custom_skill_dirs = [Path("/custom/skills"), Path("/another/skills")] + config = SessionConfig( + model_name=None, + clear_history=True, + skill_dirs=custom_skill_dirs, + ) + + session = await factory.create_session(config) + + assert isinstance(session, GhAgentSession) + + # Verify skill directories override + call_args = mock_client.create_session.call_args + session_config = call_args[0][0] + assert session_config["skill_directories"] == [str(d) for d in custom_skill_dirs] + + +@pytest.mark.anyio +async def test_gh_session_factory_uses_base_config_defaults() -> None: + """Test that factory uses base config when no overrides provided.""" + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock() + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + base_config = { + "model": "gpt-4-turbo", + "skill_directories": ["/base/skills"], + "streaming": True, + "system_message": "Base system message", + "on_permission_request": MagicMock(), + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig( + model_name=None, + clear_history=True, + skill_dirs=None, + ) + + session = await factory.create_session(config) + + assert isinstance(session, GhAgentSession) + + # Verify base config is used + call_args = mock_client.create_session.call_args + session_config = call_args[0][0] + assert session_config["model"] == "gpt-4-turbo" + assert session_config["skill_directories"] == ["/base/skills"] + assert session_config["streaming"] is True + assert session_config["system_message"] == 
"Base system message" + assert session_config["on_permission_request"] is base_config["on_permission_request"] + + +@pytest.mark.anyio +async def test_gh_session_factory_destroys_old_session() -> None: + """Test that factory destroys old session before creating new one.""" + mock_client = MagicMock() + + # Create two different mock sessions + first_session = MagicMock() + first_session.destroy = AsyncMock() + second_session = MagicMock() + second_session.destroy = AsyncMock() + + mock_client.create_session = AsyncMock(side_effect=[first_session, second_session]) + + base_config = { + "model": "gpt-4", + "skill_directories": [], + "streaming": True, + "system_message": "Test", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig(model_name=None, clear_history=True) + + # Create first session + session1 = await factory.create_session(config) + assert isinstance(session1, GhAgentSession) + first_session.destroy.assert_not_called() + + # Create second session - should destroy first + session2 = await factory.create_session(config) + assert isinstance(session2, GhAgentSession) + first_session.destroy.assert_called_once() + second_session.destroy.assert_not_called() + + +@pytest.mark.anyio +async def test_gh_session_factory_handles_destroy_error() -> None: + """Test that factory handles errors during session destruction gracefully.""" + mock_client = MagicMock() + + first_session = MagicMock() + # Make destroy raise an exception + first_session.destroy = AsyncMock(side_effect=Exception("Destroy failed")) + second_session = MagicMock() + second_session.destroy = AsyncMock() + + mock_client.create_session = AsyncMock(side_effect=[first_session, second_session]) + + base_config = { + "model": "gpt-4", + "skill_directories": [], + "streaming": True, + "system_message": "Test", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig(model_name=None, clear_history=True) + + # Create first session + await factory.create_session(config) + + # Create second session - should handle destroy error gracefully + session2 = await factory.create_session(config) + assert isinstance(session2, GhAgentSession) + first_session.destroy.assert_called_once() + + +@pytest.mark.anyio +async def test_gh_session_factory_cleanup() -> None: + """Test that cleanup properly destroys the current session.""" + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock() + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + base_config = { + "model": "gpt-4", + "skill_directories": [], + "streaming": True, + "system_message": "Test", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig(model_name=None, clear_history=True) + + # Create a session + await factory.create_session(config) + + # Cleanup should destroy it + await factory.cleanup() + mock_copilot_session.destroy.assert_called_once() + + +@pytest.mark.anyio +async def test_gh_session_factory_cleanup_without_session() -> None: + """Test that cleanup works when no session exists.""" + mock_client = MagicMock() + + base_config = { + "model": "gpt-4", + "skill_directories": [], + "streaming": True, + "system_message": "Test", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + + # Cleanup without creating a session should not raise + await 
factory.cleanup() + + +@pytest.mark.anyio +async def test_gh_session_factory_cleanup_handles_error() -> None: + """Test that cleanup handles destruction errors gracefully.""" + mock_client = MagicMock() + mock_copilot_session = MagicMock() + mock_copilot_session.destroy = AsyncMock(side_effect=Exception("Cleanup failed")) + mock_client.create_session = AsyncMock(return_value=mock_copilot_session) + + base_config = { + "model": "gpt-4", + "skill_directories": [], + "streaming": True, + "system_message": "Test", + "on_permission_request": None, + } + + factory = GhCopilotSessionFactory(mock_client, base_config) + config = SessionConfig(model_name=None, clear_history=True) + + # Create a session + await factory.create_session(config) + + # Cleanup should handle error gracefully and not raise + await factory.cleanup() + mock_copilot_session.destroy.assert_called_once() diff --git a/uv.lock b/uv.lock index d445309..de43288 100644 --- a/uv.lock +++ b/uv.lock @@ -31,6 +31,7 @@ dev = [ { name = "isort" }, { name = "mypy" }, { name = "pytest" }, + { name = "pytest-cov" }, { name = "ruff" }, ] @@ -48,6 +49,7 @@ dev = [ { name = "isort", specifier = ">=5.12.0" }, { name = "mypy", specifier = ">=1.18.2" }, { name = "pytest", specifier = ">=8.4.2" }, + { name = "pytest-cov", specifier = ">=4.1.0" }, { name = "ruff", specifier = ">=0.14.2" }, ] @@ -151,7 +153,7 @@ wheels = [ [[package]] name = "anthropic" -version = "0.76.0" +version = "0.77.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, @@ -163,9 +165,9 @@ dependencies = [ { name = "sniffio" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/6e/be/d11abafaa15d6304826438170f7574d750218f49a106c54424a40cef4494/anthropic-0.76.0.tar.gz", hash = "sha256:e0cae6a368986d5cf6df743dfbb1b9519e6a9eee9c6c942ad8121c0b34416ffe", size = 495483, upload-time = "2026-01-13T18:41:14.908Z" } +sdist = { url = "https://files.pythonhosted.org/packages/eb/85/6cb5da3cf91de2eeea89726316e8c5c8c31e2d61ee7cb1233d7e95512c31/anthropic-0.77.0.tar.gz", hash = "sha256:ce36efeb80cb1e25430a88440dc0f9aa5c87f10d080ab70a1bdfd5c2c5fbedb4", size = 504575, upload-time = "2026-01-29T18:20:41.507Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e5/70/7b0fd9c1a738f59d3babe2b4212031c34ab7d0fda4ffef15b58a55c5bcea/anthropic-0.76.0-py3-none-any.whl", hash = "sha256:81efa3113901192af2f0fe977d3ec73fdadb1e691586306c4256cd6d5ccc331c", size = 390309, upload-time = "2026-01-13T18:41:13.483Z" }, + { url = "https://files.pythonhosted.org/packages/ac/27/9df785d3f94df9ac72f43ee9e14b8120b37d992b18f4952774ed46145022/anthropic-0.77.0-py3-none-any.whl", hash = "sha256:65cc83a3c82ce622d5c677d0d7706c77d29dc83958c6b10286e12fda6ffb2651", size = 397867, upload-time = "2026-01-29T18:20:39.481Z" }, ] [[package]] @@ -400,6 +402,67 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, ] +[[package]] +name = "coverage" +version = "7.13.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ad/49/349848445b0e53660e258acbcc9b0d014895b6739237920886672240f84b/coverage-7.13.2.tar.gz", hash = "sha256:044c6951ec37146b72a50cc81ef02217d27d4c3640efd2640311393cbbf143d3", size = 826523, upload-time = "2026-01-25T13:00:04.889Z" } 
+wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/f0/3d3eac7568ab6096ff23791a526b0048a1ff3f49d0e236b2af6fb6558e88/coverage-7.13.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ed75de7d1217cf3b99365d110975f83af0528c849ef5180a12fd91b5064df9d6", size = 219168, upload-time = "2026-01-25T12:58:23.376Z" }, + { url = "https://files.pythonhosted.org/packages/a3/a6/f8b5cfeddbab95fdef4dcd682d82e5dcff7a112ced57a959f89537ee9995/coverage-7.13.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:97e596de8fa9bada4d88fde64a3f4d37f1b6131e4faa32bad7808abc79887ddc", size = 219537, upload-time = "2026-01-25T12:58:24.932Z" }, + { url = "https://files.pythonhosted.org/packages/7b/e6/8d8e6e0c516c838229d1e41cadcec91745f4b1031d4db17ce0043a0423b4/coverage-7.13.2-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:68c86173562ed4413345410c9480a8d64864ac5e54a5cda236748031e094229f", size = 250528, upload-time = "2026-01-25T12:58:26.567Z" }, + { url = "https://files.pythonhosted.org/packages/8e/78/befa6640f74092b86961f957f26504c8fba3d7da57cc2ab7407391870495/coverage-7.13.2-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7be4d613638d678b2b3773b8f687537b284d7074695a43fe2fbbfc0e31ceaed1", size = 253132, upload-time = "2026-01-25T12:58:28.251Z" }, + { url = "https://files.pythonhosted.org/packages/9d/10/1630db1edd8ce675124a2ee0f7becc603d2bb7b345c2387b4b95c6907094/coverage-7.13.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d7f63ce526a96acd0e16c4af8b50b64334239550402fb1607ce6a584a6d62ce9", size = 254374, upload-time = "2026-01-25T12:58:30.294Z" }, + { url = "https://files.pythonhosted.org/packages/ed/1d/0d9381647b1e8e6d310ac4140be9c428a0277330991e0c35bdd751e338a4/coverage-7.13.2-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:406821f37f864f968e29ac14c3fccae0fec9fdeba48327f0341decf4daf92d7c", size = 250762, upload-time = "2026-01-25T12:58:32.036Z" }, + { url = "https://files.pythonhosted.org/packages/43/e4/5636dfc9a7c871ee8776af83ee33b4c26bc508ad6cee1e89b6419a366582/coverage-7.13.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ee68e5a4e3e5443623406b905db447dceddffee0dceb39f4e0cd9ec2a35004b5", size = 252502, upload-time = "2026-01-25T12:58:33.961Z" }, + { url = "https://files.pythonhosted.org/packages/02/2a/7ff2884d79d420cbb2d12fed6fff727b6d0ef27253140d3cdbbd03187ee0/coverage-7.13.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2ee0e58cca0c17dd9c6c1cdde02bb705c7b3fbfa5f3b0b5afeda20d4ebff8ef4", size = 250463, upload-time = "2026-01-25T12:58:35.529Z" }, + { url = "https://files.pythonhosted.org/packages/91/c0/ba51087db645b6c7261570400fc62c89a16278763f36ba618dc8657a187b/coverage-7.13.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:6e5bbb5018bf76a56aabdb64246b5288d5ae1b7d0dd4d0534fe86df2c2992d1c", size = 250288, upload-time = "2026-01-25T12:58:37.226Z" }, + { url = "https://files.pythonhosted.org/packages/03/07/44e6f428551c4d9faf63ebcefe49b30e5c89d1be96f6a3abd86a52da9d15/coverage-7.13.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a55516c68ef3e08e134e818d5e308ffa6b1337cc8b092b69b24287bf07d38e31", size = 252063, upload-time = "2026-01-25T12:58:38.821Z" }, + { url = "https://files.pythonhosted.org/packages/c2/67/35b730ad7e1859dd57e834d1bc06080d22d2f87457d53f692fce3f24a5a9/coverage-7.13.2-cp313-cp313-win32.whl", hash = "sha256:5b20211c47a8abf4abc3319d8ce2464864fa9f30c5fcaf958a3eed92f4f1fef8", size = 221716, 
upload-time = "2026-01-25T12:58:40.484Z" }, + { url = "https://files.pythonhosted.org/packages/0d/82/e5fcf5a97c72f45fc14829237a6550bf49d0ab882ac90e04b12a69db76b4/coverage-7.13.2-cp313-cp313-win_amd64.whl", hash = "sha256:14f500232e521201cf031549fb1ebdfc0a40f401cf519157f76c397e586c3beb", size = 222522, upload-time = "2026-01-25T12:58:43.247Z" }, + { url = "https://files.pythonhosted.org/packages/b1/f1/25d7b2f946d239dd2d6644ca2cc060d24f97551e2af13b6c24c722ae5f97/coverage-7.13.2-cp313-cp313-win_arm64.whl", hash = "sha256:9779310cb5a9778a60c899f075a8514c89fa6d10131445c2207fc893e0b14557", size = 221145, upload-time = "2026-01-25T12:58:45Z" }, + { url = "https://files.pythonhosted.org/packages/9e/f7/080376c029c8f76fadfe43911d0daffa0cbdc9f9418a0eead70c56fb7f4b/coverage-7.13.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:e64fa5a1e41ce5df6b547cbc3d3699381c9e2c2c369c67837e716ed0f549d48e", size = 219861, upload-time = "2026-01-25T12:58:46.586Z" }, + { url = "https://files.pythonhosted.org/packages/42/11/0b5e315af5ab35f4c4a70e64d3314e4eec25eefc6dec13be3a7d5ffe8ac5/coverage-7.13.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:b01899e82a04085b6561eb233fd688474f57455e8ad35cd82286463ba06332b7", size = 220207, upload-time = "2026-01-25T12:58:48.277Z" }, + { url = "https://files.pythonhosted.org/packages/b2/0c/0874d0318fb1062117acbef06a09cf8b63f3060c22265adaad24b36306b7/coverage-7.13.2-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:838943bea48be0e2768b0cf7819544cdedc1bbb2f28427eabb6eb8c9eb2285d3", size = 261504, upload-time = "2026-01-25T12:58:49.904Z" }, + { url = "https://files.pythonhosted.org/packages/83/5e/1cd72c22ecb30751e43a72f40ba50fcef1b7e93e3ea823bd9feda8e51f9a/coverage-7.13.2-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:93d1d25ec2b27e90bcfef7012992d1f5121b51161b8bffcda756a816cf13c2c3", size = 263582, upload-time = "2026-01-25T12:58:51.582Z" }, + { url = "https://files.pythonhosted.org/packages/9b/da/8acf356707c7a42df4d0657020308e23e5a07397e81492640c186268497c/coverage-7.13.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:93b57142f9621b0d12349c43fc7741fe578e4bc914c1e5a54142856cfc0bf421", size = 266008, upload-time = "2026-01-25T12:58:53.234Z" }, + { url = "https://files.pythonhosted.org/packages/41/41/ea1730af99960309423c6ea8d6a4f1fa5564b2d97bd1d29dda4b42611f04/coverage-7.13.2-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f06799ae1bdfff7ccb8665d75f8291c69110ba9585253de254688aa8a1ccc6c5", size = 260762, upload-time = "2026-01-25T12:58:55.372Z" }, + { url = "https://files.pythonhosted.org/packages/22/fa/02884d2080ba71db64fdc127b311db60e01fe6ba797d9c8363725e39f4d5/coverage-7.13.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:7f9405ab4f81d490811b1d91c7a20361135a2df4c170e7f0b747a794da5b7f23", size = 263571, upload-time = "2026-01-25T12:58:57.52Z" }, + { url = "https://files.pythonhosted.org/packages/d2/6b/4083aaaeba9b3112f55ac57c2ce7001dc4d8fa3fcc228a39f09cc84ede27/coverage-7.13.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:f9ab1d5b86f8fbc97a5b3cd6280a3fd85fef3b028689d8a2c00918f0d82c728c", size = 261200, upload-time = "2026-01-25T12:58:59.255Z" }, + { url = "https://files.pythonhosted.org/packages/e9/d2/aea92fa36d61955e8c416ede9cf9bf142aa196f3aea214bb67f85235a050/coverage-7.13.2-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = 
"sha256:f674f59712d67e841525b99e5e2b595250e39b529c3bda14764e4f625a3fa01f", size = 260095, upload-time = "2026-01-25T12:59:01.066Z" }, + { url = "https://files.pythonhosted.org/packages/0d/ae/04ffe96a80f107ea21b22b2367175c621da920063260a1c22f9452fd7866/coverage-7.13.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c6cadac7b8ace1ba9144feb1ae3cb787a6065ba6d23ffc59a934b16406c26573", size = 262284, upload-time = "2026-01-25T12:59:02.802Z" }, + { url = "https://files.pythonhosted.org/packages/1c/7a/6f354dcd7dfc41297791d6fb4e0d618acb55810bde2c1fd14b3939e05c2b/coverage-7.13.2-cp313-cp313t-win32.whl", hash = "sha256:14ae4146465f8e6e6253eba0cccd57423e598a4cb925958b240c805300918343", size = 222389, upload-time = "2026-01-25T12:59:04.563Z" }, + { url = "https://files.pythonhosted.org/packages/8d/d5/080ad292a4a3d3daf411574be0a1f56d6dee2c4fdf6b005342be9fac807f/coverage-7.13.2-cp313-cp313t-win_amd64.whl", hash = "sha256:9074896edd705a05769e3de0eac0a8388484b503b68863dd06d5e473f874fd47", size = 223450, upload-time = "2026-01-25T12:59:06.677Z" }, + { url = "https://files.pythonhosted.org/packages/88/96/df576fbacc522e9fb8d1c4b7a7fc62eb734be56e2cba1d88d2eabe08ea3f/coverage-7.13.2-cp313-cp313t-win_arm64.whl", hash = "sha256:69e526e14f3f854eda573d3cf40cffd29a1a91c684743d904c33dbdcd0e0f3e7", size = 221707, upload-time = "2026-01-25T12:59:08.363Z" }, + { url = "https://files.pythonhosted.org/packages/55/53/1da9e51a0775634b04fcc11eb25c002fc58ee4f92ce2e8512f94ac5fc5bf/coverage-7.13.2-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:387a825f43d680e7310e6f325b2167dd093bc8ffd933b83e9aa0983cf6e0a2ef", size = 219213, upload-time = "2026-01-25T12:59:11.909Z" }, + { url = "https://files.pythonhosted.org/packages/46/35/b3caac3ebbd10230fea5a33012b27d19e999a17c9285c4228b4b2e35b7da/coverage-7.13.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:f0d7fea9d8e5d778cd5a9e8fc38308ad688f02040e883cdc13311ef2748cb40f", size = 219549, upload-time = "2026-01-25T12:59:13.638Z" }, + { url = "https://files.pythonhosted.org/packages/76/9c/e1cf7def1bdc72c1907e60703983a588f9558434a2ff94615747bd73c192/coverage-7.13.2-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:e080afb413be106c95c4ee96b4fffdc9e2fa56a8bbf90b5c0918e5c4449412f5", size = 250586, upload-time = "2026-01-25T12:59:15.808Z" }, + { url = "https://files.pythonhosted.org/packages/ba/49/f54ec02ed12be66c8d8897270505759e057b0c68564a65c429ccdd1f139e/coverage-7.13.2-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a7fc042ba3c7ce25b8a9f097eb0f32a5ce1ccdb639d9eec114e26def98e1f8a4", size = 253093, upload-time = "2026-01-25T12:59:17.491Z" }, + { url = "https://files.pythonhosted.org/packages/fb/5e/aaf86be3e181d907e23c0f61fccaeb38de8e6f6b47aed92bf57d8fc9c034/coverage-7.13.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d0ba505e021557f7f8173ee8cd6b926373d8653e5ff7581ae2efce1b11ef4c27", size = 254446, upload-time = "2026-01-25T12:59:19.752Z" }, + { url = "https://files.pythonhosted.org/packages/28/c8/a5fa01460e2d75b0c853b392080d6829d3ca8b5ab31e158fa0501bc7c708/coverage-7.13.2-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:7de326f80e3451bd5cc7239ab46c73ddb658fe0b7649476bc7413572d36cd548", size = 250615, upload-time = "2026-01-25T12:59:21.928Z" }, + { url = 
"https://files.pythonhosted.org/packages/86/0b/6d56315a55f7062bb66410732c24879ccb2ec527ab6630246de5fe45a1df/coverage-7.13.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:abaea04f1e7e34841d4a7b343904a3f59481f62f9df39e2cd399d69a187a9660", size = 252452, upload-time = "2026-01-25T12:59:23.592Z" }, + { url = "https://files.pythonhosted.org/packages/30/19/9bc550363ebc6b0ea121977ee44d05ecd1e8bf79018b8444f1028701c563/coverage-7.13.2-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:9f93959ee0c604bccd8e0697be21de0887b1f73efcc3aa73a3ec0fd13feace92", size = 250418, upload-time = "2026-01-25T12:59:25.392Z" }, + { url = "https://files.pythonhosted.org/packages/1f/53/580530a31ca2f0cc6f07a8f2ab5460785b02bb11bdf815d4c4d37a4c5169/coverage-7.13.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:13fe81ead04e34e105bf1b3c9f9cdf32ce31736ee5d90a8d2de02b9d3e1bcb82", size = 250231, upload-time = "2026-01-25T12:59:27.888Z" }, + { url = "https://files.pythonhosted.org/packages/e2/42/dd9093f919dc3088cb472893651884bd675e3df3d38a43f9053656dca9a2/coverage-7.13.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d6d16b0f71120e365741bca2cb473ca6fe38930bc5431c5e850ba949f708f892", size = 251888, upload-time = "2026-01-25T12:59:29.636Z" }, + { url = "https://files.pythonhosted.org/packages/fa/a6/0af4053e6e819774626e133c3d6f70fae4d44884bfc4b126cb647baee8d3/coverage-7.13.2-cp314-cp314-win32.whl", hash = "sha256:9b2f4714bb7d99ba3790ee095b3b4ac94767e1347fe424278a0b10acb3ff04fe", size = 221968, upload-time = "2026-01-25T12:59:31.424Z" }, + { url = "https://files.pythonhosted.org/packages/c4/cc/5aff1e1f80d55862442855517bb8ad8ad3a68639441ff6287dde6a58558b/coverage-7.13.2-cp314-cp314-win_amd64.whl", hash = "sha256:e4121a90823a063d717a96e0a0529c727fb31ea889369a0ee3ec00ed99bf6859", size = 222783, upload-time = "2026-01-25T12:59:33.118Z" }, + { url = "https://files.pythonhosted.org/packages/de/20/09abafb24f84b3292cc658728803416c15b79f9ee5e68d25238a895b07d9/coverage-7.13.2-cp314-cp314-win_arm64.whl", hash = "sha256:6873f0271b4a15a33e7590f338d823f6f66f91ed147a03938d7ce26efd04eee6", size = 221348, upload-time = "2026-01-25T12:59:34.939Z" }, + { url = "https://files.pythonhosted.org/packages/b6/60/a3820c7232db63be060e4019017cd3426751c2699dab3c62819cdbcea387/coverage-7.13.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:f61d349f5b7cd95c34017f1927ee379bfbe9884300d74e07cf630ccf7a610c1b", size = 219950, upload-time = "2026-01-25T12:59:36.624Z" }, + { url = "https://files.pythonhosted.org/packages/fd/37/e4ef5975fdeb86b1e56db9a82f41b032e3d93a840ebaf4064f39e770d5c5/coverage-7.13.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:a43d34ce714f4ca674c0d90beb760eb05aad906f2c47580ccee9da8fe8bfb417", size = 220209, upload-time = "2026-01-25T12:59:38.339Z" }, + { url = "https://files.pythonhosted.org/packages/54/df/d40e091d00c51adca1e251d3b60a8b464112efa3004949e96a74d7c19a64/coverage-7.13.2-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:bff1b04cb9d4900ce5c56c4942f047dc7efe57e2608cb7c3c8936e9970ccdbee", size = 261576, upload-time = "2026-01-25T12:59:40.446Z" }, + { url = "https://files.pythonhosted.org/packages/c5/44/5259c4bed54e3392e5c176121af9f71919d96dde853386e7730e705f3520/coverage-7.13.2-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6ae99e4560963ad8e163e819e5d77d413d331fd00566c1e0856aa252303552c1", size = 263704, upload-time = "2026-01-25T12:59:42.346Z" }, + { url = 
"https://files.pythonhosted.org/packages/16/bd/ae9f005827abcbe2c70157459ae86053971c9fa14617b63903abbdce26d9/coverage-7.13.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e79a8c7d461820257d9aa43716c4efc55366d7b292e46b5b37165be1d377405d", size = 266109, upload-time = "2026-01-25T12:59:44.073Z" }, + { url = "https://files.pythonhosted.org/packages/a2/c0/8e279c1c0f5b1eaa3ad9b0fb7a5637fc0379ea7d85a781c0fe0bb3cfc2ab/coverage-7.13.2-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:060ee84f6a769d40c492711911a76811b4befb6fba50abb450371abb720f5bd6", size = 260686, upload-time = "2026-01-25T12:59:45.804Z" }, + { url = "https://files.pythonhosted.org/packages/b2/47/3a8112627e9d863e7cddd72894171c929e94491a597811725befdcd76bce/coverage-7.13.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3bca209d001fd03ea2d978f8a4985093240a355c93078aee3f799852c23f561a", size = 263568, upload-time = "2026-01-25T12:59:47.929Z" }, + { url = "https://files.pythonhosted.org/packages/92/bc/7ea367d84afa3120afc3ce6de294fd2dcd33b51e2e7fbe4bbfd200f2cb8c/coverage-7.13.2-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:6b8092aa38d72f091db61ef83cb66076f18f02da3e1a75039a4f218629600e04", size = 261174, upload-time = "2026-01-25T12:59:49.717Z" }, + { url = "https://files.pythonhosted.org/packages/33/b7/f1092dcecb6637e31cc2db099581ee5c61a17647849bae6b8261a2b78430/coverage-7.13.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:4a3158dc2dcce5200d91ec28cd315c999eebff355437d2765840555d765a6e5f", size = 260017, upload-time = "2026-01-25T12:59:51.463Z" }, + { url = "https://files.pythonhosted.org/packages/2b/cd/f3d07d4b95fbe1a2ef0958c15da614f7e4f557720132de34d2dc3aa7e911/coverage-7.13.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3973f353b2d70bd9796cc12f532a05945232ccae966456c8ed7034cb96bbfd6f", size = 262337, upload-time = "2026-01-25T12:59:53.407Z" }, + { url = "https://files.pythonhosted.org/packages/e0/db/b0d5b2873a07cb1e06a55d998697c0a5a540dcefbf353774c99eb3874513/coverage-7.13.2-cp314-cp314t-win32.whl", hash = "sha256:79f6506a678a59d4ded048dc72f1859ebede8ec2b9a2d509ebe161f01c2879d3", size = 222749, upload-time = "2026-01-25T12:59:56.316Z" }, + { url = "https://files.pythonhosted.org/packages/e5/2f/838a5394c082ac57d85f57f6aba53093b30d9089781df72412126505716f/coverage-7.13.2-cp314-cp314t-win_amd64.whl", hash = "sha256:196bfeabdccc5a020a57d5a368c681e3a6ceb0447d153aeccc1ab4d70a5032ba", size = 223857, upload-time = "2026-01-25T12:59:58.201Z" }, + { url = "https://files.pythonhosted.org/packages/44/d4/b608243e76ead3a4298824b50922b89ef793e50069ce30316a65c1b4d7ef/coverage-7.13.2-cp314-cp314t-win_arm64.whl", hash = "sha256:69269ab58783e090bfbf5b916ab3d188126e22d6070bbfc93098fdd474ef937c", size = 221881, upload-time = "2026-01-25T13:00:00.449Z" }, + { url = "https://files.pythonhosted.org/packages/d2/db/d291e30fdf7ea617a335531e72294e0c723356d7fdde8fba00610a76bda9/coverage-7.13.2-py3-none-any.whl", hash = "sha256:40ce1ea1e25125556d8e76bd0b61500839a07944cc287ac21d5626f3e620cad5", size = 210943, upload-time = "2026-01-25T13:00:02.388Z" }, +] + [[package]] name = "cryptography" version = "46.0.4" @@ -1940,19 +2003,19 @@ email = [ [[package]] name = "pydantic-ai" -version = "1.48.0" +version = "1.49.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pydantic-ai-slim", extra = ["ag-ui", "anthropic", "bedrock", "cli", "cohere", "evals", "fastmcp", "google", "groq", "huggingface", "logfire", "mcp", 
"mistral", "openai", "retries", "temporal", "ui", "vertexai", "xai"] }, ] -sdist = { url = "https://files.pythonhosted.org/packages/37/30/913d8d3f5271289c1753d0970751143c47acf762d00b11756b25e3e5db34/pydantic_ai-1.48.0.tar.gz", hash = "sha256:d739d7a56125f58a9a8dfbdbb737cb082f9304802f9886ada4195ff76883b15c", size = 11795, upload-time = "2026-01-28T00:09:30.511Z" } +sdist = { url = "https://files.pythonhosted.org/packages/c6/33/9575f50d72342962c712c35cd16a2b14f6d5367227b58d8cebdd7556c284/pydantic_ai-1.49.0.tar.gz", hash = "sha256:a40ec23768d79ca47afa8d8956b17e8b90385a29337d9eecdade8a236bba198d", size = 11795, upload-time = "2026-01-29T18:55:43.39Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/32/ce/dd0dbaa03f2166e895ef4af74f46780b68c96f17538a0bc28edff2196748/pydantic_ai-1.48.0-py3-none-any.whl", hash = "sha256:da7eacfee0b9534f03ff50d96a66406f7030df59e2626b0100618c0d208b2503", size = 7220, upload-time = "2026-01-28T00:09:20.729Z" }, + { url = "https://files.pythonhosted.org/packages/43/ff/9d6e1526eeb03217dc8e3f367e310759ad06b4c022fcf7e53e6a1b351892/pydantic_ai-1.49.0-py3-none-any.whl", hash = "sha256:fd97653a16ef241ec11197d3c07f8fceb534340845124a9812bb358fd138fb6d", size = 7220, upload-time = "2026-01-29T18:55:34.446Z" }, ] [[package]] name = "pydantic-ai-slim" -version = "1.48.0" +version = "1.49.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "genai-prices" }, @@ -1963,9 +2026,9 @@ dependencies = [ { name = "pydantic-graph" }, { name = "typing-inspection" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/83/e1/02d712bb1e93ccd4cf08e154811914da406320cf2ce041941992b642f675/pydantic_ai_slim-1.48.0.tar.gz", hash = "sha256:b386dc9fb6f751aac12599b0efdd3f82ca0c9ec9ae67323bc343d1b4f5f1e7eb", size = 396949, upload-time = "2026-01-28T00:09:33.073Z" } +sdist = { url = "https://files.pythonhosted.org/packages/fa/63/d30dac3ebff8bdc17fdbd59b38e0f9d5d3cd3dd1b16bafc3d816323577ce/pydantic_ai_slim-1.49.0.tar.gz", hash = "sha256:83e2406f14be2f15e8722bcabe93c9c9296628e0431543a35ca0a6a75aec827c", size = 402318, upload-time = "2026-01-29T18:55:45.433Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/24/77/42872ccfd21bb8fc701fbe5715ec476de521200284527d04981d32e88389/pydantic_ai_slim-1.48.0-py3-none-any.whl", hash = "sha256:23aa473115f7402876cff448d87ae411dace9c78457640e395f9878882ec2f00", size = 519241, upload-time = "2026-01-28T00:09:24.215Z" }, + { url = "https://files.pythonhosted.org/packages/70/1b/7af71c6ce23866183e07dc4a23571156444e13ce4197d3fe763300baed02/pydantic_ai_slim-1.49.0-py3-none-any.whl", hash = "sha256:44fed1d7d67361bb7f2e582500409b0e4e11536db6f919ed5b3de9f50bc78faa", size = 526438, upload-time = "2026-01-29T18:55:37.49Z" }, ] [package.optional-dependencies] @@ -2088,7 +2151,7 @@ wheels = [ [[package]] name = "pydantic-evals" -version = "1.48.0" +version = "1.49.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, @@ -2098,14 +2161,14 @@ dependencies = [ { name = "pyyaml" }, { name = "rich" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/01/a1/50e031c1d11613cb4864d885a2e9ca87ad397b132a570c77b7c4cd2054f3/pydantic_evals-1.48.0.tar.gz", hash = "sha256:cc0d7a2178fcae8696df56a8ed7b44b8eb80991d492b608a83d3dfa80d8e0f35", size = 47191, upload-time = "2026-01-28T00:09:34.69Z" } +sdist = { url = "https://files.pythonhosted.org/packages/b4/17/2d496cbc873c8faf00711eaf884d23b76228fe00505540ee8542d34c6f7f/pydantic_evals-1.49.0.tar.gz", hash = 
"sha256:6bb147d25b07953ee01beffdad9288756fb1923cbe32396839f4c97c31da550e", size = 47190, upload-time = "2026-01-29T18:55:46.629Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/b2/48/f210b0c3b0600e47295c10e532eaa97e0bde4ed5f9c84a5cd87a9401a069/pydantic_evals-1.48.0-py3-none-any.whl", hash = "sha256:cf7982038407f7d905d95a8c9f693b37dfb515024d63f28013e400100d00c5de", size = 56378, upload-time = "2026-01-28T00:09:26.162Z" }, + { url = "https://files.pythonhosted.org/packages/79/35/ceaaa0b50dd6975df752d7489dc3532391211b4a6a6b66ebd7f9ca8d4781/pydantic_evals-1.49.0-py3-none-any.whl", hash = "sha256:6e513d5b79da5856dbe624bc5fa878578cf3553b8dc63f27447e48f5b9a59d6e", size = 56378, upload-time = "2026-01-29T18:55:38.999Z" }, ] [[package]] name = "pydantic-graph" -version = "1.48.0" +version = "1.49.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "httpx" }, @@ -2113,9 +2176,9 @@ dependencies = [ { name = "pydantic" }, { name = "typing-inspection" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/f2/da/ed69a409ff559e1088607b5f65b667edbee0dd0169582c013221bfed4711/pydantic_graph-1.48.0.tar.gz", hash = "sha256:de731137208a80dbf1396f23954e7f920da4b8b9d74d02e5cdb154d01fb878c5", size = 58458, upload-time = "2026-01-28T00:09:36.18Z" } +sdist = { url = "https://files.pythonhosted.org/packages/32/91/156f8762900de5e9325decf6491ba011e9d2e04773f9b81000763110d0d2/pydantic_graph-1.49.0.tar.gz", hash = "sha256:9e1379ac99c56f6d3cc33d7b3524151b64ec12b8c4b099de811727cb64395912", size = 58461, upload-time = "2026-01-29T18:55:49.258Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/16/7f/9db05557d9fdb01e888ce5c18875346868b68d94ab600154d74a34279eca/pydantic_graph-1.48.0-py3-none-any.whl", hash = "sha256:47557ffca1972a8cfb1de0c427bf4eba0a080dbd53e966a94e88b55529442485", size = 72345, upload-time = "2026-01-28T00:09:28.15Z" }, + { url = "https://files.pythonhosted.org/packages/be/d1/2f5497dce7c3950788ae4a5c5b6a7867f104f1bc3582d30d962c437022f0/pydantic_graph-1.49.0-py3-none-any.whl", hash = "sha256:01289209681ab434710aff5bb8e49f7d21d7f9ce6952a78c893584f197da5fb9", size = 72346, upload-time = "2026-01-29T18:55:40.896Z" }, ] [[package]] @@ -2203,6 +2266,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" }, ] +[[package]] +name = "pytest-cov" +version = "7.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "coverage" }, + { name = "pluggy" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" }, +] + [[package]] name = "python-dateutil" version = "2.9.0.post0" @@ -2906,7 +2983,7 @@ wheels = [ [[package]] name = "xai-sdk" -version = "1.6.0" +version = "1.6.1" source = { registry = "https://pypi.org/simple" } 
dependencies = [ { name = "aiohttp" }, @@ -2918,9 +2995,9 @@ dependencies = [ { name = "pydantic" }, { name = "requests" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/8b/38/751421b0b40050582ab5f5ed56f0074d322668cf72b0374b1b6f898388ac/xai_sdk-1.6.0.tar.gz", hash = "sha256:e0fc9df3a1cab1ec744a9c962d6802f0b27cfd91b4ca8c18f74dd5f4f944f7f7", size = 371793, upload-time = "2026-01-27T12:31:18.526Z" } +sdist = { url = "https://files.pythonhosted.org/packages/9e/66/1e0163eac090733d0ed0836a0cd3c14f5b59abeaa6fdba71c7b56b1916e4/xai_sdk-1.6.1.tar.gz", hash = "sha256:b55528df188f8c8448484021d735f75b0e7d71719ddeb432c5f187ac67e3c983", size = 388223, upload-time = "2026-01-29T03:13:07.373Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/7b/ca/9924cf64c1289ff7725572ca17fb2dea7573233d3cc964a16287260145cf/xai_sdk-1.6.0-py3-none-any.whl", hash = "sha256:b1e828dfb631233bfb3dfc3173ee4e37432710de76094b695f2a89f6cd659ba4", size = 224130, upload-time = "2026-01-27T12:31:17.222Z" }, + { url = "https://files.pythonhosted.org/packages/94/98/8b4019b35f2200295c5eec8176da4b779ec3a0fd60eba7196b618f437e1f/xai_sdk-1.6.1-py3-none-any.whl", hash = "sha256:f478dee9bd8839b8d341bd075277d0432aff5cd7120a4284547d25c6c9e7ab3b", size = 240917, upload-time = "2026-01-29T03:13:05.626Z" }, ] [[package]]