diff --git a/.agent/rules/pythonic-fastapi-async.md b/.agent/rules/pythonic-fastapi-async.md
new file mode 100644
index 0000000..f10716b
--- /dev/null
+++ b/.agent/rules/pythonic-fastapi-async.md
@@ -0,0 +1,58 @@
+---
+trigger: glob
+description: Rules for working with Python and FastAPI backend/API development.
+globs: *.py
+---
+
+You are an expert in Python, FastAPI, and scalable API development.
+
+Tech Stack
+
+- FastAPI
+- Pydantic v2
+- Async database libraries
+- SQLAlchemy 2.0 (if using ORM features)
+
+Refer to the FastAPI, Pydantic, SQLAlchemy, and other library documentation for Data Models, Schemas, Path Operations, Middleware, and best practices.
+
+## General Coding Principles
+
+- Write concise, technical responses with accurate Python examples.
+- Prefer iteration and modularization over code duplication.
+- Favor composition over inheritance. Smart core, thin interfaces.
+
+## Naming and File Structure
+
+- Prefer meaningful, descriptive variable names with auxiliary verbs (e.g., is_active, has_permission) over decorative comments; follow PEP 8 style for documenting comments.
+- Use lowercase with underscores for directories and files (e.g., routers/user_routes.py).
+- Favor named exports for routes and utility functions.
+
+## Asynchronous Programming
+
+- Use def for synchronous operations and async def for asynchronous ones.
+
+## Type Safety and Validation
+
+- Use type hints for all function signatures. Prefer Pydantic models over raw dictionaries for input validation.
+- Write strictly typed Python code and avoid the use of `Any`.
+- Use functional components (plain functions) and Pydantic models (`BaseModel`) for consistent input/output validation and response schemas.
+- Ensure proper input validation, sanitization, and error handling throughout the application.
+
+## Error Handling
+
+- Prioritize error handling and edge cases.
+- Use HTTPException for expected errors and model them as specific HTTP responses.
+
+## FastAPI Best Practices
+
+- Use declarative route definitions with clear return type annotations.
+- Minimize @app.on_event("startup") and @app.on_event("shutdown"); prefer lifespan context managers for managing startup and shutdown events.
+- Use middleware for logging, error monitoring, performance instrumentation, and handling unexpected errors.
+
+## Performance Optimization
+
+- Optimize for performance using async functions for I/O-bound tasks, caching strategies, and lazy loading.
+- Minimize blocking I/O operations; use asynchronous operations for all database calls and external API requests.
+- Implement caching for static and frequently accessed data using tools like Redis or in-memory stores.
+- Optimize data serialization and deserialization with Pydantic.
+- Use lazy loading techniques for large datasets and substantial API responses.
diff --git a/.dockerignore b/.dockerignore
index 28a8ee2..0a21abb 100644
--- a/.dockerignore
+++ b/.dockerignore
@@ -56,5 +56,6 @@ test_*.py
 docs/
 *.md
 !README.md
+!Readme.md
 knowledge/
 samples/
\ No newline at end of file
diff --git a/.env.example b/.env.example
deleted file mode 100644
index 65624ff..0000000
--- a/.env.example
+++ /dev/null
@@ -1,20 +0,0 @@
-# BookBytes Environment Variables
-# Copy this file to .env and fill in your actual values
-
-# OpenAI API Key (Required)
-# Get your API key from: https://platform.openai.com/api-keys
-OPENAI_API_KEY=your-openai-api-key-here
-
-# Optional: Flask Configuration
-FLASK_ENV=development
-FLASK_DEBUG=1
-
-# Optional: Database Configuration
-# DATABASE_PATH=bookbytes.db
-
-# Optional: Audio Directory
-# AUDIO_DIR=audio
-
-# Optional: Server Configuration
-# HOST=0.0.0.0
-# PORT=5000
\ No newline at end of file
diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md
index 8bb6da6..25a41d0 100644
--- a/.github/copilot-instructions.md
+++ b/.github/copilot-instructions.md
@@ -1,47 +1,79 @@
 # BookBytes AI Coding Instructions
-## 🧠 Project Overview
-
-BookBytes converts physical books (via ISBN) into chapter-wise audio summaries.
-
-- **Core Logic**: `BookBytesApp` class in `app.py` orchestrates the entire pipeline.
-- **Stack**: Python 3.8+, Flask, SQLite, OpenAI GPT-3.5, gTTS.
-- **Data Flow**: ISBN -> Open Library API (Metadata) -> OpenAI (Chapter extraction & Summaries) -> gTTS (Audio) -> SQLite (Persistence).
-
-## 🏗 Architecture & Patterns
-
-- **Service Layer**: `BookBytesApp` encapsulates all business logic. Do not put logic in Flask routes or CLI commands; they should only call `BookBytesApp` methods.
-- **Data Models**: Use `@dataclass` for entities (`Book`, `Chapter`) defined in `app.py`.
-- **Database**: SQLite with raw SQL queries in `BookBytesApp`. Tables: `books`, `chapters`.
-- **Logging**: MUST use `logger.py`. Import with `from logger import get_logger`.
-  ```python
-  logger = get_logger(__name__)
-  logger.info("Message", extra={"context": "value"})
-  ```
-- **Path Handling**: Always use `pathlib.Path` instead of `os.path`.
-
-## 🛠 Workflows & Commands
-
-- **Run API**: `python app.py` (Starts Flask server on port 5000).
-- **Run CLI**: `python cli.py [command]` (e.g., `process --isbn `).
-- **Docker**: `docker-compose up -d` (Runs app + persists data in `bookbytes-data` volume).
-- **Testing**:
-  - `test_app.py` is a standalone integration test script, NOT a pytest suite.
-  - Run against a running server: `python test_app.py`.
-  - Ensure `OPENAI_API_KEY` is set in `.env` before running.
-
-## 📦 Dependencies & Integrations
-
-- **External APIs**:
-  - Open Library (Book metadata).
-  - OpenAI API (Summarization, Chapter detection).
-- **Audio**: `gTTS` (Google Text-to-Speech) saves files to `audio/` directory.
-- **Environment**: Load vars using `python-dotenv` (handled in `cli.py` and `app.py`).
-
-## 🚨 Critical Conventions
-
-- **Error Handling**: Catch exceptions in `BookBytesApp` methods and return a result dict (`{'success': False, 'message': ...}`) rather than raising exceptions to the caller.
-- **File Structure**:
-  - `app.py`: Monolithic core (API + Logic + Models).
-  - `cli.py`: CLI wrapper around `BookBytesApp`.
-  - `knowledge/`: Documentation storage.
+## Project Overview
+
+BookBytes converts physical books (via ISBN) into chapter-wise audio summaries. Currently being refactored from a Flask monolith to production FastAPI.
+
+- **Stack**: Python 3.13+, FastAPI, PostgreSQL (async), Redis, ARQ workers, OpenAI, gTTS
+- **Data Flow**: ISBN → Open Library API → OpenAI (chapters + summaries) → gTTS → Storage (local/S3)
+
+## Architecture (`src/bookbytes/`)
+
+```
+main.py           # FastAPI app factory with lifespan, middleware, exception handlers
+config.py         # Pydantic Settings with env validation (Settings class, enums)
+dependencies.py   # FastAPI Depends() injection container
+core/
+  exceptions.py   # Exception hierarchy (BookBytesError base, domain-specific subclasses)
+  logging.py      # Structlog config with correlation ID support
+api/v1/           # Versioned API routers
+services/         # Business logic (call from routes, not vice versa)
+repositories/     # Database access layer (SQLAlchemy async)
+schemas/          # Pydantic request/response models (BaseSchema in common.py)
+models/           # SQLAlchemy ORM models
+storage/          # Pluggable storage (local dev, S3 prod)
+workers/          # ARQ background job handlers
+```
+
+## Logging (MUST use structlog)
+
+```python
+from bookbytes.core.logging import get_logger
+logger = get_logger(__name__)
+logger.info("Processing book", isbn="123", user_id="abc")  # Key-value pairs, not f-strings
+```
+
+Correlation IDs are auto-injected via middleware. Use `set_correlation_id()` for background jobs.
+
+## Exceptions Pattern
+
+Raise domain exceptions from `core/exceptions.py`, never raw `Exception`. Global handlers convert to JSON:
+
+```python
+from bookbytes.core.exceptions import BookNotFoundError
+raise BookNotFoundError(isbn="123")  # Returns {"error": {"code": "BOOK_NOT_FOUND", ...}}
+```
+
+## Configuration
+
+All config via `Settings` class in `config.py`. Access with `get_settings()` (cached) or `SettingsDep` in routes:
+
+```python
+from bookbytes.config import get_settings
+settings = get_settings()
+if settings.is_development: ...
+```
+
+## Commands
+
+- **Run API**: `uv run python -m bookbytes.main` or `uv run uvicorn bookbytes.main:app --reload`
+- **Tests**: `uv run pytest tests/` (async fixtures in `tests/conftest.py`)
+- **Lint**: `uv run ruff check src/ tests/` | **Format**: `uv run ruff format src/ tests/`
+- **Type check**: `uv run mypy src/`
+
+## Testing Conventions
+
+- Use fixtures from `tests/conftest.py`: `async_client`, `authenticated_client`, `test_settings`
+- Mock external services (OpenAI, Open Library) using `tests/mocks/`
+
+## Key Conventions
+
+- **Async everywhere**: All DB/HTTP ops must use `async/await`
+- **Pydantic schemas**: Inherit from `BaseSchema` in `schemas/common.py` (auto ORM conversion)
+- **Enums for options**: Use `str, Enum` pattern (e.g., `Environment`, `StorageBackend`) for type-safe configs
+- **Path handling**: Use `pathlib.Path`, never `os.path`
+- **Auth modes**: `API_KEY` for dev (header `X-API-Key`), `JWT` for prod
+
+## Legacy Code (root-level)

+`app.py`, `cli.py`, `test_app.py` are the original Flask implementation; reference for business logic only.
diff --git a/Makefile b/Makefile
new file mode 100644
index 0000000..c4ca7a7
--- /dev/null
+++ b/Makefile
@@ -0,0 +1,144 @@
+.PHONY: help dev prod build-dev build-prod down-dev down-prod \
+	start-dev-infra start-prod-infra \
+	logs-dev logs-prod api-logs-dev api-logs-prod \
+	db-logs-dev db-logs-prod redis-logs-dev redis-logs-prod \
+	migrate shell db-shell test test-cov \
+	clean-dev clean-dev-containers clean-dev-images clean-dev-volumes status
+
+# Variables
+BACKEND_DIR := backend
+COMPOSE_DEV := $(BACKEND_DIR)/docker/docker-compose.dev.yml
+COMPOSE_PROD := $(BACKEND_DIR)/docker/docker-compose.yml
+DC_DEV := docker compose -f $(COMPOSE_DEV)
+DC_PROD := docker compose -f $(COMPOSE_PROD)
+
+help:
+	@echo "BookBytes - Development Commands"
+	@echo "================================="
+	@echo ""
+	@echo "🚀 Environment:"
+	@echo "  dev              - Start dev (hot reload)"
+	@echo "  prod             - Start production"
+	@echo "  build-dev        - Build dev containers"
+	@echo "  build-prod       - Build prod containers"
+	@echo "  down-dev         - Stop dev services"
+	@echo "  down-prod        - Stop prod services"
+	@echo "  start-dev-infra  - Start infra only (postgres, redis)"
+	@echo "  start-prod-infra - Start prod infra only"
+	@echo ""
+	@echo "Logs (composable env-component):"
+	@echo "  logs-dev         - All dev logs"
+	@echo "  logs-prod        - All prod logs"
+	@echo "  api-logs-dev     - Dev API logs"
+	@echo "  api-logs-prod    - Prod API logs"
+	@echo "  db-logs-dev      - Dev DB logs"
+	@echo "  db-logs-prod     - Prod DB logs"
+	@echo "  redis-logs-dev   - Dev Redis logs"
+	@echo "  redis-logs-prod  - Prod Redis logs"
+	@echo ""
+	@echo "Database (dev-container only):"
+	@echo "  migrate          - Run migrations (dev)"
+	@echo "  shell            - Python shell (dev)"
+	@echo "  db-shell         - PostgreSQL shell (dev)"
+	@echo ""
+	@echo "🧪 Testing (dev-container only):"
+	@echo "  test             - Run tests"
+	@echo "  test-cov         - Tests with coverage"
+	@echo ""
+	@echo "🧹 Cleanup (dev-container only):"
+	@echo "  clean-dev        - Remove dev containers/images/volumes"
+	@echo "  status           - Show container status"
+
+# Start
+dev:
+	$(DC_DEV) up -d
+
+prod:
+	$(DC_PROD) up -d
+
+# Build
+build-dev:
+	$(DC_DEV) build
+
+build-prod:
+	$(DC_PROD) build
+
+# Stop
+down-dev:
+	$(DC_DEV) down
+
+down-prod:
+	$(DC_PROD) down
+
+# Infrastructure only (no API - for migrations, local dev)
+start-dev-infra:
+	$(DC_DEV) up -d postgres redis
+
+start-prod-infra:
+	$(DC_PROD) up -d postgres redis
+
+# Logs (composable)
+logs-dev:
+	$(DC_DEV) logs -f
+
+logs-prod:
+	$(DC_PROD) logs -f
+
+api-logs-dev:
+	$(DC_DEV) logs -f api
+
+api-logs-prod:
+	$(DC_PROD) logs -f api
+
+db-logs-dev:
+	$(DC_DEV) logs -f postgres
+
+db-logs-prod:
+	$(DC_PROD) logs -f postgres
+
+redis-logs-dev:
+	$(DC_DEV) logs -f redis
+
+redis-logs-prod:
+	$(DC_PROD) logs -f redis
+
+# Database
+migrate:
+	$(DC_DEV) exec api uv run alembic upgrade head
+
+shell:
+	$(DC_DEV) exec api uv run python
+
+db-shell:
+	$(DC_DEV) exec postgres psql -U bookbytes -d bookbytes
+
+# Test
+test:
+	cd $(BACKEND_DIR) && uv sync --all-extras && uv run pytest
+
+test-cov:
+	cd $(BACKEND_DIR) && uv sync --all-extras && uv run pytest --cov=src/bookbytes --cov-report=html
+
+# Status
+status:
+	@echo "Dev containers:"
+	@$(DC_DEV) ps
+	@echo ""
+	@echo "Prod containers:"
+	@$(DC_PROD) ps
+
+# Cleanup (DEV ONLY - composable)
+clean-dev-containers:
+	@echo "🧹 Removing dev containers..."
+	$(DC_DEV) rm -f
+
+clean-dev-images:
+	@echo "🧹 Removing dev images..."
+	@docker images | grep bookbytes-.*-dev | awk '{print $$3}' | xargs -r docker rmi 2>/dev/null || echo "No dev images to remove"
+
+clean-dev-volumes:
+	@echo "⚠️  Removing dev volumes..."
+	@read -p "Remove dev volumes? [y/N] " confirm && [ "$$confirm" = "y" ] && \
+		docker volume ls | grep bookbytes_dev | awk '{print $$2}' | xargs -r docker volume rm 2>/dev/null || echo "Skipped volume removal"
+
+clean-dev: down-dev clean-dev-containers clean-dev-images clean-dev-volumes
+	@echo "✅ Dev cleanup complete"
diff --git a/backend/Readme.md b/backend/Readme.md
new file mode 100644
index 0000000..5608b51
--- /dev/null
+++ b/backend/Readme.md
@@ -0,0 +1,231 @@
+# BookBytes Backend 📚🎧
+
+> Transform lengthy non-fiction books into concise, chapter-wise audio summaries.
+
+## Overview
+
+BookBytes converts books into digestible 5-minute audio bytes. A 250-page book becomes 15-20 short audio chapters totaling 1.5-2 hours.
+
+**Tech Stack:** Python 3.13+, FastAPI, PostgreSQL, Redis, SQLAlchemy 2.0, httpx
+
+## Architecture
+
+```
+┌──────────────────────────────────────────────────────────────┐
+│                      BookBytes Library                       │
+├──────────────────────────────────────────────────────────────┤
+│  ┌─────────────┐     ┌─────────────┐     ┌─────────────────┐ │
+│  │    Works    │────▶│  Editions   │────▶│   AudioBooks    │ │
+│  │ (our data)  │     │   (ISBNs)   │     │  (our content)  │ │
+│  └──────┬──────┘     └─────────────┘     └─────────────────┘ │
+│         │                                                    │
+│         ▼                                                    │
+│  ┌─────────────┐                                             │
+│  │BookProviders│ ← Maps our IDs to provider IDs (OL, Google) │
+│  └─────────────┘                                             │
+│         │                                                    │
+│         ▼                                                    │
+│  ┌─────────────┐                                             │
+│  │ Redis Cache │ ← Raw API responses (TTL-based)             │
+│  └─────────────┘                                             │
+└──────────────────────────────────────────────────────────────┘
+```
+
+### Data Model Hierarchy
+
+```
+Work (canonical book)
+  ├── Edition (specific ISBN/format)
+  │     └── AudioBook (generated audio)
+  │           └── Chapter (individual segments)
+  └── BookProvider (external ID mappings)
+```
+
+### Book Search Flow
+
+```mermaid
+flowchart TD
+    A[User Search] --> B{Search Type?}
+
+    %% ISBN Search Path
+    B -->|By ISBN| C[Check Library DB]
+    C -->|Found| E[Return Existing Book]
+    C -->|Not Found| F1[Check Cache]
+    F1 -->|Hit| G{Latest Edition?}
+    F1 -->|Miss| F[Query OpenLibrary API]
+    F --> G
+    G -->|Yes| H[Store in Library]
+    G -->|No| I[Find Latest Edition]
+    I --> H
+
+    %% Title/Author Search Path
+    B -->|By Title/Author| D1[Check Search Cache]
+    D1 -->|Hit| K[Display Results]
+    D1 -->|Miss| D4[Query OpenLibrary API]
+    D4 --> K
+
+    %% Selection & Processing
+    K --> L[User Selects Book]
+    L --> M[Store Work + Editions]
+    M --> N[Process AudioBook]
+```
+
+## Project Structure
+
+```
+backend/
+├── src/bookbytes/
+│   ├── api/v1/              # FastAPI routers
+│   │   ├── router.py        # Main v1 router
+│   │   └── search.py        # Book search endpoints
+│   ├── core/                # Config, database, logging
+│   ├── models/              # SQLAlchemy models
+│   │   ├── work.py          # Work entity
+│   │   ├── edition.py       # Edition entity
+│   │   ├── audio_book.py    # AudioBook entity
+│   │   └── book_provider.py
+│   ├── repositories/        # Database access layer
+│   ├── schemas/             # Pydantic schemas
+│   └── services/            # Business logic
+│       ├── cache.py         # Redis cache service
+│       ├── library.py       # Work/Edition persistence
+│       └── openlibrary.py   # OpenLibrary API client
+├── tests/
+│   ├── unit/                # Unit tests (mocked deps)
+│   └── integration/         # Integration tests
+└── pyproject.toml
+```
+
+## Local Development Setup
+
+### Prerequisites
+
+- Python 3.13+
+- Redis (for caching)
+- PostgreSQL (or use SQLite for development)
+
+### 1. Clone and Setup
+
+```bash
+git clone 
+cd bookbytes/backend
+```
+
+### 2. Install Dependencies
+
+```bash
+# Using uv (recommended)
+uv sync
+
+# Or using pip
+pip install -e ".[dev]"
+```
+
+### 3. Environment Configuration
+
+Create a `.env` file:
+
+```bash
+# Database
+DATABASE_URL=sqlite+aiosqlite:///./bookbytes.db
+# For PostgreSQL:
+# DATABASE_URL=postgresql+asyncpg://user:pass@localhost/bookbytes
+
+# Redis
+REDIS_URL=redis://localhost:6379/0
+
+# API Keys
+OPENAI_API_KEY=sk-...
+
+# App Config
+APP_ENV=development
+DEBUG=true
+LOG_LEVEL=DEBUG
+LOG_FORMAT=console
+```
+
+### 4. Start Redis
+
+```bash
+# Using Docker
+docker run -d -p 6379:6379 redis:alpine
+
+# Or install locally
+brew install redis && redis-server
+```
+
+### 5. Run the Server
+
+```bash
+# Development server with auto-reload
+uv run uvicorn bookbytes.main:app --reload --port 8000
+
+# Or using make
+make dev
+```
+
+### 6. Run Tests
+
+```bash
+# All unit tests
+make test
+
+# Integration tests (requires Redis)
+make test-integration
+
+# Coverage report
+make test-cov
+```
+
+## API Endpoints
+
+| Method | Endpoint                         | Description                  |
+| ------ | -------------------------------- | ---------------------------- |
+| `POST` | `/api/v1/books/search`           | Search books by title/author |
+| `GET`  | `/api/v1/books/works/{work_key}` | Get work details             |
+| `GET`  | `/api/v1/books/isbn/{isbn}`      | Lookup by ISBN               |
+| `GET`  | `/health/live`                   | Liveness probe               |
+| `GET`  | `/health/ready`                  | Readiness probe              |
+
+### Example: Search Books
+
+```bash
+curl -X POST http://localhost:8000/api/v1/books/search \
+  -H "Content-Type: application/json" \
+  -d '{"title": "Atomic Habits"}'
+```
+
+## Testing Strategy
+
+| Layer       | Location                                       | Coverage                             |
+| ----------- | ---------------------------------------------- | ------------------------------------ |
+| Unit        | `tests/unit/`                                  | Services, endpoints with mocked deps |
+| Integration | `tests/integration/`                           | API + database + mocked externals    |
+| External    | `tests/integration/` (`@pytest.mark.external`) | Real API calls                       |
+
+Run with markers:
+
+```bash
+# Skip external API tests (for CI)
+pytest -m "not external"
+
+# Only run external tests
+pytest -m external
+```
+
+## Configuration
+
+All settings via environment variables or `.env`:
+
+| Variable         | Default       | Description          |
+| ---------------- | ------------- | -------------------- |
+| `APP_ENV`        | `development` | Environment mode     |
+| `DATABASE_URL`   | Required      | SQLAlchemy async URL |
+| `REDIS_URL`      | Required      | Redis connection URL |
+| `OPENAI_API_KEY` | Required      | OpenAI API key       |
+| `LOG_LEVEL`      | `INFO`        | Logging level        |
+| `LOG_FORMAT`     | `json`        | `json` or `console`  |
+
+## License
+
+Not licensed yet.
diff --git a/Readme.md b/backend/Readme.v0.md similarity index 100% rename from Readme.md rename to backend/Readme.v0.md diff --git a/backend/alembic.ini b/backend/alembic.ini new file mode 100644 index 0000000..cda15d1 --- /dev/null +++ b/backend/alembic.ini @@ -0,0 +1,146 @@ +# A generic, single database configuration. + +[alembic] +# path to migration scripts. +# this is typically a path given in POSIX (e.g. forward slashes) +# format, relative to the token %(here)s which refers to the location of this +# ini file +script_location = %(here)s/alembic + +# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s +# Uncomment the line below if you want the files to be prepended with date and time +# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file +# for all available tokens +# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s + +# sys.path path, will be prepended to sys.path if present. +# defaults to the current working directory. for multiple paths, the path separator +# is defined by "path_separator" below. +prepend_sys_path = . + + +# timezone to use when rendering the date within the migration file +# as well as the filename. +# If specified, requires the tzdata library which can be installed by adding +# `alembic[tz]` to the pip requirements. +# string value is passed to ZoneInfo() +# leave blank for localtime +# timezone = + +# max length of characters to apply to the "slug" field +# truncate_slug_length = 40 + +# set to 'true' to run the environment during +# the 'revision' command, regardless of autogenerate +# revision_environment = false + +# set to 'true' to allow .pyc and .pyo files without +# a source .py file to be detected as revisions in the +# versions/ directory +# sourceless = false + +# version location specification; This defaults +# to /versions. 
When using multiple version +# directories, initial revisions must be specified with --version-path. +# The path separator used here should be the separator specified by "path_separator" +# below. +# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions + +# path_separator; This indicates what character is used to split lists of file +# paths, including version_locations and prepend_sys_path within configparser +# files such as alembic.ini. +# The default rendered in new alembic.ini files is "os", which uses os.pathsep +# to provide os-dependent path splitting. +# +# Note that in order to support legacy alembic.ini files, this default does NOT +# take place if path_separator is not present in alembic.ini. If this +# option is omitted entirely, fallback logic is as follows: +# +# 1. Parsing of the version_locations option falls back to using the legacy +# "version_path_separator" key, which if absent then falls back to the legacy +# behavior of splitting on spaces and/or commas. +# 2. Parsing of the prepend_sys_path option falls back to the legacy +# behavior of splitting on spaces, commas, or colons. +# +# Valid values for path_separator are: +# +# path_separator = : +# path_separator = ; +# path_separator = space +# path_separator = newline +# +# Use os.pathsep. Default configuration used for new projects. +path_separator = os + +# set to 'true' to search source files recursively +# in each "version_locations" directory +# new in Alembic version 1.10 +# recursive_version_locations = false + +# the output encoding used when revision files +# are written from script.py.mako +# output_encoding = utf-8 + +# database URL. This is consumed by the user-maintained env.py script only. +# We read this from the environment in env.py instead. +# sqlalchemy.url = driver://user:pass@localhost/dbname + + +[post_write_hooks] +# post_write_hooks defines scripts or Python functions that are run +# on newly generated revision scripts. 
See the documentation for further +# detail and examples + +# format using "black" - use the console_scripts runner, against the "black" entrypoint +# hooks = black +# black.type = console_scripts +# black.entrypoint = black +# black.options = -l 79 REVISION_SCRIPT_FILENAME + +# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module +# hooks = ruff +# ruff.type = module +# ruff.module = ruff +# ruff.options = check --fix REVISION_SCRIPT_FILENAME + +# Alternatively, use the exec runner to execute a binary found on your PATH +# hooks = ruff +# ruff.type = exec +# ruff.executable = ruff +# ruff.options = check --fix REVISION_SCRIPT_FILENAME + +# Logging configuration. This is also consumed by the user-maintained +# env.py script only. +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARNING +handlers = console +qualname = + +[logger_sqlalchemy] +level = WARNING +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = INFO +handlers = +qualname = alembic + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s +datefmt = %H:%M:%S diff --git a/backend/alembic/README b/backend/alembic/README new file mode 100644 index 0000000..98e4f9c --- /dev/null +++ b/backend/alembic/README @@ -0,0 +1 @@ +Generic single-database configuration. \ No newline at end of file diff --git a/backend/alembic/env.py b/backend/alembic/env.py new file mode 100644 index 0000000..ead1abb --- /dev/null +++ b/backend/alembic/env.py @@ -0,0 +1,101 @@ +"""Alembic environment configuration for async migrations. 
+ +This module configures Alembic to work with: +- Async SQLAlchemy engine (postgresql+asyncpg) +- Environment-based database URL +- Model metadata from bookbytes.models +""" + +import asyncio +from logging.config import fileConfig + +from sqlalchemy import pool +from sqlalchemy.engine import Connection +from sqlalchemy.ext.asyncio import async_engine_from_config + +from alembic import context + +from bookbytes.config import get_settings +from bookbytes.models import Base + +# this is the Alembic Config object, which provides +# access to the values within the .ini file in use. +config = context.config + +# Interpret the config file for Python logging. +# This line sets up loggers basically. +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +# add your model's MetaData object here +# for 'autogenerate' support +target_metadata = Base.metadata + + +def get_url() -> str: + """Get database URL from settings.""" + settings = get_settings() + return settings.database_url + + +def run_migrations_offline() -> None: + """Run migrations in 'offline' mode. + + This configures the context with just a URL + and not an Engine, though an Engine is acceptable + here as well. By skipping the Engine creation + we don't even need a DBAPI to be available. + + Calls to context.execute() here emit the given string to the + script output. + """ + url = get_url() + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + dialect_opts={"paramstyle": "named"}, + ) + + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection: Connection) -> None: + """Run migrations using the given connection.""" + context.configure(connection=connection, target_metadata=target_metadata) + + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + """Run migrations in 'online' mode with async engine. 
+ + In this scenario we need to create an Engine + and associate a connection with the context. + """ + configuration = config.get_section(config.config_ini_section, {}) + configuration["sqlalchemy.url"] = get_url() + + connectable = async_engine_from_config( + configuration, + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + + await connectable.dispose() + + +def run_migrations_online() -> None: + """Run migrations in 'online' mode.""" + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/backend/alembic/script.py.mako b/backend/alembic/script.py.mako new file mode 100644 index 0000000..1101630 --- /dev/null +++ b/backend/alembic/script.py.mako @@ -0,0 +1,28 @@ +"""${message} + +Revision ID: ${up_revision} +Revises: ${down_revision | comma,n} +Create Date: ${create_date} + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa +${imports if imports else ""} + +# revision identifiers, used by Alembic. 
+revision: str = ${repr(up_revision)} +down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)} +branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)} +depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)} + + +def upgrade() -> None: + """Upgrade schema.""" + ${upgrades if upgrades else "pass"} + + +def downgrade() -> None: + """Downgrade schema.""" + ${downgrades if downgrades else "pass"} diff --git a/backend/alembic/versions/2232afba1a25_add_jobs_and_audio_book_jobs_tables.py b/backend/alembic/versions/2232afba1a25_add_jobs_and_audio_book_jobs_tables.py new file mode 100644 index 0000000..38782f2 --- /dev/null +++ b/backend/alembic/versions/2232afba1a25_add_jobs_and_audio_book_jobs_tables.py @@ -0,0 +1,68 @@ +"""add_jobs_and_audio_book_jobs_tables + +Revision ID: 2232afba1a25 +Revises: 80de24a1ee9e +Create Date: 2025-12-11 16:49:08.799639 + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision: str = '2232afba1a25' +down_revision: Union[str, Sequence[str], None] = '80de24a1ee9e' +branch_labels: Union[str, Sequence[str], None] = None +depends_on: Union[str, Sequence[str], None] = None + + +def upgrade() -> None: + """Upgrade schema.""" + # ### commands auto generated by Alembic - please adjust! 
### + op.create_table('jobs', + sa.Column('job_type', sa.String(length=50), nullable=False), + sa.Column('status', sa.String(length=20), nullable=False), + sa.Column('progress', sa.Integer(), nullable=False), + sa.Column('current_step', sa.String(length=100), nullable=True), + sa.Column('error_message', sa.String(length=2000), nullable=True), + sa.Column('error_code', sa.String(length=50), nullable=True), + sa.Column('version', sa.Integer(), nullable=False), + sa.Column('worker_id', sa.String(length=100), nullable=True), + sa.Column('retry_count', sa.Integer(), nullable=False), + sa.Column('max_retries', sa.Integer(), nullable=False), + sa.Column('started_at', sa.DateTime(), nullable=True), + sa.Column('completed_at', sa.DateTime(), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_jobs_job_type'), 'jobs', ['job_type'], unique=False) + op.create_index(op.f('ix_jobs_status'), 'jobs', ['status'], unique=False) + op.create_table('audio_book_jobs', + sa.Column('job_id', sa.Uuid(), nullable=False), + sa.Column('audio_book_id', sa.Uuid(), nullable=False), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.ForeignKeyConstraint(['audio_book_id'], ['audio_books.id'], ondelete='CASCADE'), + sa.ForeignKeyConstraint(['job_id'], ['jobs.id'], ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_audio_book_jobs_audio_book_id'), 'audio_book_jobs', ['audio_book_id'], unique=False) + op.create_index(op.f('ix_audio_book_jobs_job_id'), 'audio_book_jobs', ['job_id'], 
unique=True) + # ### end Alembic commands ### + + +def downgrade() -> None: + """Downgrade schema.""" + # ### commands auto generated by Alembic - please adjust! ### + op.drop_index(op.f('ix_audio_book_jobs_job_id'), table_name='audio_book_jobs') + op.drop_index(op.f('ix_audio_book_jobs_audio_book_id'), table_name='audio_book_jobs') + op.drop_table('audio_book_jobs') + op.drop_index(op.f('ix_jobs_status'), table_name='jobs') + op.drop_index(op.f('ix_jobs_job_type'), table_name='jobs') + op.drop_table('jobs') + # ### end Alembic commands ### diff --git a/backend/alembic/versions/80de24a1ee9e_add_audio_books_library_models.py b/backend/alembic/versions/80de24a1ee9e_add_audio_books_library_models.py new file mode 100644 index 0000000..ed2e9b3 --- /dev/null +++ b/backend/alembic/versions/80de24a1ee9e_add_audio_books_library_models.py @@ -0,0 +1,125 @@ +"""add_audio_books_library_models + +Revision ID: 80de24a1ee9e +Revises: +Create Date: 2025-12-06 20:31:54.570557 + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision: str = '80de24a1ee9e' +down_revision: Union[str, Sequence[str], None] = None +branch_labels: Union[str, Sequence[str], None] = None +depends_on: Union[str, Sequence[str], None] = None + + +def upgrade() -> None: + """Upgrade schema.""" + # ### commands auto generated by Alembic - please adjust! 
### + op.create_table('works', + sa.Column('title', sa.String(length=500), nullable=False), + sa.Column('authors', sa.JSON(), nullable=False), + sa.Column('first_publish_year', sa.Integer(), nullable=True), + sa.Column('subjects', sa.JSON(), nullable=True), + sa.Column('cover_url', sa.String(length=1000), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_works_title'), 'works', ['title'], unique=False) + op.create_table('editions', + sa.Column('work_id', sa.Uuid(), nullable=False), + sa.Column('isbn', sa.String(length=13), nullable=False), + sa.Column('isbn_type', sa.String(length=10), nullable=False), + sa.Column('title', sa.String(length=500), nullable=False), + sa.Column('publisher', sa.String(length=200), nullable=True), + sa.Column('publish_year', sa.Integer(), nullable=True), + sa.Column('language', sa.String(length=3), nullable=False), + sa.Column('pages', sa.Integer(), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.ForeignKeyConstraint(['work_id'], ['works.id'], ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_editions_isbn'), 'editions', ['isbn'], unique=True) + op.create_index(op.f('ix_editions_publish_year'), 'editions', ['publish_year'], unique=False) + op.create_index(op.f('ix_editions_work_id'), 'editions', ['work_id'], unique=False) + op.create_table('audio_books', + sa.Column('edition_id', sa.Uuid(), nullable=False), + sa.Column('status', sa.String(length=20), nullable=False), + sa.Column('version', sa.Integer(), 
nullable=False), + sa.Column('error_message', sa.String(length=1000), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True), + sa.ForeignKeyConstraint(['edition_id'], ['editions.id'], ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_audio_books_deleted_at'), 'audio_books', ['deleted_at'], unique=False) + op.create_index(op.f('ix_audio_books_edition_id'), 'audio_books', ['edition_id'], unique=False) + op.create_index(op.f('ix_audio_books_status'), 'audio_books', ['status'], unique=False) + op.create_table('book_providers', + sa.Column('entity_type', sa.String(length=20), nullable=False), + sa.Column('entity_id', sa.Uuid(), nullable=False), + sa.Column('provider', sa.String(length=50), nullable=False), + sa.Column('external_key', sa.String(length=200), nullable=False), + sa.Column('provider_metadata', sa.JSON(), nullable=True), + sa.Column('work_id', sa.Uuid(), nullable=True), + sa.Column('edition_id', sa.Uuid(), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.ForeignKeyConstraint(['edition_id'], ['editions.id'], ondelete='CASCADE'), + sa.ForeignKeyConstraint(['work_id'], ['works.id'], ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id'), + sa.UniqueConstraint('provider', 'external_key', name='uq_provider_external_key') + ) + op.create_index('ix_book_providers_entity_lookup', 'book_providers', ['entity_type', 'entity_id'], unique=False) + op.create_index(op.f('ix_book_providers_provider'), 'book_providers', ['provider'], 
unique=False) + op.create_table('chapters', + sa.Column('audio_book_id', sa.Uuid(), nullable=False), + sa.Column('chapter_number', sa.Integer(), nullable=False), + sa.Column('title', sa.String(length=500), nullable=False), + sa.Column('summary', sa.Text(), nullable=False), + sa.Column('audio_file_path', sa.String(length=500), nullable=True), + sa.Column('audio_url', sa.String(length=1000), nullable=True), + sa.Column('word_count', sa.Integer(), nullable=True), + sa.Column('duration_seconds', sa.Integer(), nullable=True), + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False), + sa.ForeignKeyConstraint(['audio_book_id'], ['audio_books.id'], ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id'), + sa.UniqueConstraint('audio_book_id', 'chapter_number', name='uq_chapter_audio_book_number') + ) + op.create_index(op.f('ix_chapters_audio_book_id'), 'chapters', ['audio_book_id'], unique=False) + # ### end Alembic commands ### + + +def downgrade() -> None: + """Downgrade schema.""" + # ### commands auto generated by Alembic - please adjust! 
### + op.drop_index(op.f('ix_chapters_audio_book_id'), table_name='chapters') + op.drop_table('chapters') + op.drop_index(op.f('ix_book_providers_provider'), table_name='book_providers') + op.drop_index('ix_book_providers_entity_lookup', table_name='book_providers') + op.drop_table('book_providers') + op.drop_index(op.f('ix_audio_books_status'), table_name='audio_books') + op.drop_index(op.f('ix_audio_books_edition_id'), table_name='audio_books') + op.drop_index(op.f('ix_audio_books_deleted_at'), table_name='audio_books') + op.drop_table('audio_books') + op.drop_index(op.f('ix_editions_work_id'), table_name='editions') + op.drop_index(op.f('ix_editions_publish_year'), table_name='editions') + op.drop_index(op.f('ix_editions_isbn'), table_name='editions') + op.drop_table('editions') + op.drop_index(op.f('ix_works_title'), table_name='works') + op.drop_table('works') + # ### end Alembic commands ### diff --git a/backend/docker/Dockerfile b/backend/docker/Dockerfile new file mode 100644 index 0000000..9c81efb --- /dev/null +++ b/backend/docker/Dockerfile @@ -0,0 +1,73 @@ +# BookBytes Multi-Stage Dockerfile +# Stage 1: Builder - installs dependencies +# Stage 2: Runtime - minimal production image + +# Stage 1: Builder +FROM python:3.13-slim AS builder + +# Install uv for fast package management +COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ + +# Set working directory +WORKDIR /app + +# Copy dependency files +COPY pyproject.toml uv.lock ./ + +# Create virtual environment and install dependencies +RUN uv sync --frozen --no-dev --no-install-project + +# Copy source code and README (needed for hatchling build) +COPY src/ src/ +COPY Readme.md ./ + +# Install the project itself +RUN uv sync --frozen --no-dev + + +# Stage 2: Runtime +FROM python:3.13-slim AS runtime + +# Install runtime dependencies +RUN apt-get update && apt-get install -y --no-install-recommends \ + curl \ + && rm -rf /var/lib/apt/lists/* + +# Create non-root user for security +RUN groupadd --gid 
1000 bookbytes \ + && useradd --uid 1000 --gid 1000 --shell /bin/bash --create-home bookbytes + +# Set working directory +WORKDIR /app + +# Copy virtual environment from builder +COPY --from=builder /app/.venv /app/.venv + +# Copy source code +COPY --from=builder /app/src /app/src + +# Set environment variables +ENV PATH="/app/.venv/bin:$PATH" \ + PYTHONUNBUFFERED=1 \ + PYTHONDONTWRITEBYTECODE=1 \ + # App defaults + APP_ENV=production \ + HOST=0.0.0.0 \ + PORT=8000 + +# Create data directories +RUN mkdir -p /app/data/audio && chown -R bookbytes:bookbytes /app + +# Switch to non-root user +USER bookbytes + +# Expose port +EXPOSE 8000 + +# Health check +HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \ + CMD curl -f http://localhost:${PORT}/health/live || exit 1 + +# Default command (can be overridden in docker-compose) +# Use shell form to interpolate PORT env var +CMD uvicorn bookbytes.main:app --host 0.0.0.0 --port ${PORT} diff --git a/backend/docker/Dockerfile.dev b/backend/docker/Dockerfile.dev new file mode 100644 index 0000000..ec5be29 --- /dev/null +++ b/backend/docker/Dockerfile.dev @@ -0,0 +1,55 @@ +# ============================================================================= +# BookBytes Development Dockerfile +# ============================================================================= +# Simplified image for development - no package installation +# Uses volume mounts for hot reload +# ============================================================================= + +FROM python:3.13-slim + +# Install uv for fast package management +COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ + +# Install runtime dependencies +RUN apt-get update && apt-get install -y --no-install-recommends \ + curl \ + && rm -rf /var/lib/apt/lists/* + +# Create non-root user +RUN groupadd --gid 1000 bookbytes \ + && useradd --uid 1000 --gid 1000 --shell /bin/bash --create-home bookbytes + +# Set working directory +WORKDIR /app + +# Copy only 
dependency files (no source code - will be volume mounted) +COPY pyproject.toml uv.lock ./ + +# Install dependencies only (not the project itself) +RUN uv sync --frozen --no-dev --no-install-project + +# Set environment variables +ENV PATH="/app/.venv/bin:$PATH" \ + PYTHONUNBUFFERED=1 \ + PYTHONDONTWRITEBYTECODE=1 \ + PYTHONPATH="/app/src" \ + APP_ENV=development \ + HOST=0.0.0.0 \ + PORT=8000 + +# Create data directories +RUN mkdir -p /app/data/audio && chown -R bookbytes:bookbytes /app + +# Switch to non-root user +USER bookbytes + +# Expose port +EXPOSE 8000 + +# Health check +HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \ + CMD curl -f http://localhost:${PORT}/health/live || exit 1 + +# Default command - run uvicorn with hot reload +# Use shell form to interpolate PORT env var +CMD uvicorn bookbytes.main:app --host 0.0.0.0 --port ${PORT} --reload diff --git a/backend/docker/docker-compose.dev.yml b/backend/docker/docker-compose.dev.yml new file mode 100644 index 0000000..9c86ab5 --- /dev/null +++ b/backend/docker/docker-compose.dev.yml @@ -0,0 +1,88 @@ +# BookBytes Development Docker Compose +# Use: docker compose -f docker/docker-compose.dev.yml up + +services: + api: + build: + context: .. 
+ dockerfile: docker/Dockerfile.dev + container_name: bookbytes-api-dev + ports: + - "${API_PORT:-8000}:8000" + environment: + - APP_ENV=development + - DEBUG=true + - LOG_LEVEL=DEBUG + - LOG_FORMAT=console + - DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@postgres:5432/bookbytes + - REDIS_URL=redis://redis:6379/0 + - STORAGE_BACKEND=local + - LOCAL_STORAGE_PATH=/app/data/audio + - AUTH_MODE=api_key + - API_KEY=${API_KEY:-dev-api-key} + - JWT_SECRET_KEY=${JWT_SECRET_KEY:-dev-jwt-secret-change-in-production} + - OPENAI_API_KEY=${OPENAI_API_KEY:-sk-placeholder} + volumes: + # Mount source code for hot reload + - ../src:/app/src:cached + # Persist audio data + - audio-data:/app/data/audio + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_healthy + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:8000/health/live"] + interval: 10s + timeout: 5s + retries: 3 + start_period: 10s + networks: + - bookbytes-dev + + postgres: + image: postgres:16-alpine + container_name: bookbytes-postgres-dev + environment: + POSTGRES_USER: bookbytes + POSTGRES_PASSWORD: bookbytes + POSTGRES_DB: bookbytes + ports: + - "${POSTGRES_PORT:-5432}:5432" + volumes: + - postgres-data:/var/lib/postgresql/data + healthcheck: + test: ["CMD-SHELL", "pg_isready -U bookbytes -d bookbytes"] + interval: 5s + timeout: 5s + retries: 5 + networks: + - bookbytes-dev + + redis: + image: redis:7-alpine + container_name: bookbytes-redis-dev + ports: + - "${REDIS_PORT:-6379}:6379" + volumes: + - redis-data:/data + healthcheck: + test: ["CMD", "redis-cli", "ping"] + interval: 5s + timeout: 5s + retries: 5 + networks: + - bookbytes-dev + +volumes: + postgres-data: + name: bookbytes_dev_postgres_data + redis-data: + name: bookbytes_dev_redis_data + audio-data: + name: bookbytes_dev_audio_data + +networks: + bookbytes-dev: + name: bookbytes-dev-network diff --git a/backend/docker/docker-compose.yml b/backend/docker/docker-compose.yml new file mode 100644 index 
0000000..0f0ea37 --- /dev/null +++ b/backend/docker/docker-compose.yml @@ -0,0 +1,148 @@ +# ============================================================================= +# BookBytes Development Stack +# ============================================================================= +# Services: +# - api: FastAPI application server +# - worker: ARQ background job worker (placeholder) +# - postgres: PostgreSQL 16 database +# - redis: Redis 7 for job queue and caching +# ============================================================================= + +services: + # --------------------------------------------------------------------------- + # API Service + # --------------------------------------------------------------------------- + api: + build: + context: .. + dockerfile: docker/Dockerfile + command: uvicorn bookbytes.main:app --host 0.0.0.0 --port 8000 --reload + ports: + - "${API_PORT:-8000}:8000" + environment: + # Application + - APP_ENV=${APP_ENV:-development} + - DEBUG=${DEBUG:-true} + - LOG_LEVEL=${LOG_LEVEL:-INFO} + - LOG_FORMAT=${LOG_FORMAT:-console} + # Database + - DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@postgres:5432/bookbytes + - DATABASE_POOL_MIN=2 + - DATABASE_POOL_MAX=10 + # Redis + - REDIS_URL=redis://redis:6379/0 + # Storage + - STORAGE_BACKEND=${STORAGE_BACKEND:-local} + - LOCAL_STORAGE_PATH=/app/data/audio + # Auth + - AUTH_MODE=${AUTH_MODE:-api_key} + - API_KEY=${API_KEY:-dev-api-key-12345} + - JWT_SECRET_KEY=${JWT_SECRET_KEY:-dev-secret-key-change-in-production} + - JWT_EXPIRE_MINUTES=${JWT_EXPIRE_MINUTES:-30} + # OpenAI (required for book processing) + - OPENAI_API_KEY=${OPENAI_API_KEY:-} + - OPENAI_MODEL=${OPENAI_MODEL:-gpt-4o-mini} + volumes: + # Mount source for hot reload in development + - ../src:/app/src:ro + # Persist audio files + - audio-data:/app/data/audio + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_healthy + healthcheck: + test: ["CMD", "curl", "-f", 
"http://localhost:8000/health/live"] + interval: 10s + timeout: 5s + retries: 3 + start_period: 10s + restart: unless-stopped + + # --------------------------------------------------------------------------- + # Worker Service (ARQ Background Jobs) + # --------------------------------------------------------------------------- + worker: + build: + context: .. + dockerfile: docker/Dockerfile + # Placeholder command - will be enabled in Phase 3 + command: echo "Worker placeholder - enable in Phase 3 with 'arq bookbytes.workers.settings.WorkerSettings'" + environment: + # Application + - APP_ENV=${APP_ENV:-development} + - LOG_LEVEL=${LOG_LEVEL:-INFO} + - LOG_FORMAT=${LOG_FORMAT:-console} + # Database + - DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@postgres:5432/bookbytes + # Redis + - REDIS_URL=redis://redis:6379/0 + # Storage + - STORAGE_BACKEND=${STORAGE_BACKEND:-local} + - LOCAL_STORAGE_PATH=/app/data/audio + # OpenAI + - OPENAI_API_KEY=${OPENAI_API_KEY:-} + - OPENAI_MODEL=${OPENAI_MODEL:-gpt-4o-mini} + # Worker + - WORKER_MAX_JOBS=${WORKER_MAX_JOBS:-5} + volumes: + # Persist audio files (shared with api) + - audio-data:/app/data/audio + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_healthy + profiles: + - worker # Only start with --profile worker + restart: unless-stopped + + # --------------------------------------------------------------------------- + # PostgreSQL Database + # --------------------------------------------------------------------------- + postgres: + image: postgres:16-alpine + environment: + - POSTGRES_USER=bookbytes + - POSTGRES_PASSWORD=bookbytes + - POSTGRES_DB=bookbytes + ports: + - "${POSTGRES_PORT:-5432}:5432" + volumes: + - postgres-data:/var/lib/postgresql/data + healthcheck: + test: ["CMD-SHELL", "pg_isready -U bookbytes -d bookbytes"] + interval: 5s + timeout: 5s + retries: 5 + start_period: 10s + restart: unless-stopped + + # 
--------------------------------------------------------------------------- + # Redis Cache & Job Queue + # --------------------------------------------------------------------------- + redis: + image: redis:7-alpine + command: redis-server --appendonly yes + ports: + - "${REDIS_PORT:-6379}:6379" + volumes: + - redis-data:/data + healthcheck: + test: ["CMD", "redis-cli", "ping"] + interval: 5s + timeout: 5s + retries: 5 + restart: unless-stopped + +# ============================================================================= +# Volumes +# ============================================================================= +volumes: + postgres-data: + name: bookbytes_postgres_data + redis-data: + name: bookbytes_redis_data + audio-data: + name: bookbytes_audio_data diff --git a/backend/pyproject.toml b/backend/pyproject.toml new file mode 100644 index 0000000..7b1c1ce --- /dev/null +++ b/backend/pyproject.toml @@ -0,0 +1,168 @@ +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[project] +name = "bookbytes" +version = "0.1.0" +description = "AI-powered book summarization and audio generation service" +readme = "Readme.md" +requires-python = ">=3.13" +authors = [ + { name = "BookBytes Team" } +] +keywords = ["book", "summary", "audio", "tts", "openai", "fastapi"] +classifiers = [ + "Development Status :: 3 - Alpha", + "Environment :: Web Environment", + "Framework :: FastAPI", + "Intended Audience :: Developers", + "Operating System :: OS Independent", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.13", + "Topic :: Internet :: WWW/HTTP :: HTTP Servers", + "Typing :: Typed", +] +dependencies = [ + # Core + "fastapi", + "uvicorn[standard]", + "python-multipart", + + # Async + "httpx", + "anyio", + "aiofiles", + + # Database + "sqlalchemy[asyncio]", + "asyncpg", + "aiosqlite", + "alembic", + + # Background Jobs + "arq", + "redis", + + # Configuration + "pydantic", + "pydantic-settings", + "python-dotenv", + + # 
Auth (JWT) + "python-jose[cryptography]", + "passlib[bcrypt]", + + # Storage + "boto3", + "aioboto3", + + # External APIs + "openai", + "instructor>=1.0.0", # Structured LLM output with Pydantic + "gtts", + + # Resilience + "tenacity", + + # Observability + "structlog", + + # UUIDs + "uuid6", # RFC 9562 UUIDv7 (backport until Python 3.14) +] + +[project.optional-dependencies] +dev = [ + "pytest", + "pytest-asyncio", + "pytest-cov", + "respx", + "fakeredis", + "ruff", + "mypy", + "pre-commit", +] + +[project.urls] +Homepage = "https://github.com/cryptus-neoxys/bookbytes" +Documentation = "https://github.com/cryptus-neoxys/bookbytes#readme" +Repository = "https://github.com/cryptus-neoxys/bookbytes.git" +Issues = "https://github.com/cryptus-neoxys/bookbytes/issues" + +[project.scripts] +bookbytes = "bookbytes.main:cli" + +[tool.hatch.build.targets.wheel] +packages = ["src/bookbytes"] + +[tool.ruff] +target-version = "py313" # match requires-python = ">=3.13" +line-length = 88 +src = ["src"] + +[tool.ruff.lint] +select = [ + "E", # pycodestyle errors + "W", # pycodestyle warnings + "F", # pyflakes + "I", # isort + "C4", # flake8-comprehensions + "B", # flake8-bugbear + "UP", # pyupgrade + "ARG", # flake8-unused-arguments + "SIM", # flake8-simplify +] +ignore = [ + "E501", # line too long (handled by formatter) + "B008", # do not perform function calls in argument defaults (for Depends()) + "C901", # too complex + "ARG001", # unused function argument (common in FastAPI dependencies) +] + +[tool.ruff.lint.isort] +known-first-party = ["bookbytes"] + +[tool.pytest.ini_options] +asyncio_mode = "auto" +testpaths = ["tests"] +python_files = ["test_*.py"] +python_classes = ["Test*"] +python_functions = ["test_*"] +addopts = "-v --tb=short" +markers = [ + "integration: marks tests as integration tests (deselect with '-m \"not integration\"')", + "external: marks tests that hit external APIs (may be slow/flaky)", + "slow: marks tests as slow-running", +] +filterwarnings = [ + "ignore::DeprecationWarning", +] + 
+[tool.mypy] +python_version = "3.13" # match requires-python = ">=3.13" +strict = true +warn_return_any = true +warn_unused_ignores = true +disallow_untyped_defs = true +plugins = ["pydantic.mypy"] + +[[tool.mypy.overrides]] +module = [ + "gtts.*", + "arq.*", + "aioboto3.*", +] +ignore_missing_imports = true + +[tool.coverage.run] +source = ["src/bookbytes"] +branch = true + +[tool.coverage.report] +exclude_lines = [ + "pragma: no cover", + "def __repr__", + "if TYPE_CHECKING:", + "raise NotImplementedError", +] diff --git a/backend/src/bookbytes/__init__.py b/backend/src/bookbytes/__init__.py new file mode 100644 index 0000000..f535efe --- /dev/null +++ b/backend/src/bookbytes/__init__.py @@ -0,0 +1,3 @@ +"""BookBytes - AI-powered book summarization and audio generation service.""" + +__version__ = "0.1.0" diff --git a/backend/src/bookbytes/api/__init__.py b/backend/src/bookbytes/api/__init__.py new file mode 100644 index 0000000..cfb7534 --- /dev/null +++ b/backend/src/bookbytes/api/__init__.py @@ -0,0 +1 @@ +"""API package for BookBytes.""" diff --git a/backend/src/bookbytes/api/v1/__init__.py b/backend/src/bookbytes/api/v1/__init__.py new file mode 100644 index 0000000..9ee8fc7 --- /dev/null +++ b/backend/src/bookbytes/api/v1/__init__.py @@ -0,0 +1 @@ +"""API v1 package for BookBytes.""" diff --git a/backend/src/bookbytes/api/v1/processing.py b/backend/src/bookbytes/api/v1/processing.py new file mode 100644 index 0000000..bbbd4c1 --- /dev/null +++ b/backend/src/bookbytes/api/v1/processing.py @@ -0,0 +1,170 @@ +"""Audiobook processing endpoints. + +Provides endpoints for starting audiobook processing, +checking job status, and refreshing audiobooks. 
+""" + +from typing import Annotated +from uuid import UUID + +from fastapi import APIRouter, Depends, HTTPException, status +from sqlalchemy.ext.asyncio import AsyncSession + +from bookbytes.core.database import get_async_session +from bookbytes.core.logging import get_logger +from bookbytes.schemas.common import ErrorResponse +from bookbytes.schemas.processing import ( + JobStatusResponse, + ProcessRequest, + ProcessResponse, + RefreshRequest, +) + +logger = get_logger(__name__) + +router = APIRouter() + + +# ============================================================================= +# Dependencies (will be implemented in later phases) +# ============================================================================= + + +# async def get_processing_service(...) -> ProcessingService: +# """Get the processing service with injected dependencies.""" +# ... + + +# ============================================================================= +# Processing Endpoints +# ============================================================================= + + +@router.post( + "/process", + response_model=ProcessResponse, + status_code=status.HTTP_202_ACCEPTED, + summary="Start audiobook processing", + description="Start processing an audiobook from an edition or ISBN. Returns a job ID for tracking progress.", + responses={ + 202: {"description": "Processing started"}, + 400: {"model": ErrorResponse, "description": "Invalid request"}, + 404: {"model": ErrorResponse, "description": "Edition/ISBN not found"}, + 409: {"model": ErrorResponse, "description": "Audiobook already exists"}, + }, +) +async def start_processing( + request: ProcessRequest, + db: Annotated[AsyncSession, Depends(get_async_session)], +) -> ProcessResponse: + """Start audiobook processing for an edition. + + Accepts either an edition_id (UUID) or an ISBN string. + Creates an AudioBook record in PENDING status and queues + a background job for processing. + + Returns immediately with a job_id for status polling. 
+ """ + logger.info( + "start_processing_request", + edition_id=str(request.edition_id) if request.edition_id else None, + isbn=request.isbn, + ) + + # TODO: Implement in Phase 5 (ProcessingService) + # 1. Resolve edition_id or look up by ISBN + # 2. Check if audiobook already exists for this edition + # 3. Create AudioBook record (status=PENDING) + # 4. Create Job record + # 5. Enqueue ARQ task + # 6. Return job_id + + # Placeholder - will be replaced with actual implementation + raise HTTPException( + status_code=status.HTTP_501_NOT_IMPLEMENTED, + detail="Processing service not yet implemented. See Phase 3.1.5 tasks.", + ) + + +@router.post( + "/{audio_book_id}/refresh", + response_model=ProcessResponse, + status_code=status.HTTP_202_ACCEPTED, + summary="Refresh audiobook", + description="Regenerate an audiobook with a new version. Useful when summary quality improves.", + responses={ + 202: {"description": "Refresh started"}, + 404: {"model": ErrorResponse, "description": "Audiobook not found"}, + }, +) +async def refresh_audiobook( + audio_book_id: UUID, + request: RefreshRequest, + db: Annotated[AsyncSession, Depends(get_async_session)], +) -> ProcessResponse: + """Refresh an existing audiobook. + + Creates a new version by incrementing the version number + and reprocessing all chapters. + + Args: + audio_book_id: UUID of the audiobook to refresh + request: Refresh options (force flag) + """ + logger.info( + "refresh_audiobook_request", + audio_book_id=str(audio_book_id), + force=request.force, + ) + + # TODO: Implement in Phase 5 (ProcessingService) + # 1. Look up AudioBook by ID + # 2. Increment version + # 3. Reset status to PENDING + # 4. Create Job record + # 5. Enqueue ARQ task + # 6. Return job_id + + raise HTTPException( + status_code=status.HTTP_501_NOT_IMPLEMENTED, + detail="Refresh service not yet implemented. 
See Phase 3.1.5 tasks.", + ) + + +# ============================================================================= +# Job Status Endpoints +# ============================================================================= + + +@router.get( + "/jobs/{job_id}", + response_model=JobStatusResponse, + status_code=status.HTTP_200_OK, + summary="Get job status", + description="Get the current status and progress of a processing job.", + responses={ + 200: {"description": "Job status"}, + 404: {"model": ErrorResponse, "description": "Job not found"}, + }, +) +async def get_job_status( + job_id: UUID, + db: Annotated[AsyncSession, Depends(get_async_session)], +) -> JobStatusResponse: + """Get the status of a processing job. + + Returns progress percentage, current status, and any error details. + + Args: + job_id: UUID of the job to check + """ + logger.info("get_job_status_request", job_id=str(job_id)) + + # TODO: Implement when Job model exists (Phase 2) + # 1. Query Job by ID + # 2. Return JobStatusResponse + + raise HTTPException( + status_code=status.HTTP_501_NOT_IMPLEMENTED, + detail="Job repository not yet implemented. See Phase 3.1.2 tasks.", + ) diff --git a/backend/src/bookbytes/api/v1/router.py b/backend/src/bookbytes/api/v1/router.py new file mode 100644 index 0000000..2075a33 --- /dev/null +++ b/backend/src/bookbytes/api/v1/router.py @@ -0,0 +1,18 @@ +"""API v1 main router. + +Aggregates all v1 API routers into a single router for inclusion in the app. 
+""" + +from fastapi import APIRouter + +from bookbytes.api.v1.processing import router as processing_router +from bookbytes.api.v1.search import router as search_router + +router = APIRouter() + +# Include sub-routers +router.include_router(search_router, prefix="/books", tags=["Books"]) +router.include_router(processing_router, prefix="/books", tags=["Processing"]) + +# Future routers: +# router.include_router(users_router, prefix="/users", tags=["Users"]) diff --git a/backend/src/bookbytes/api/v1/search.py b/backend/src/bookbytes/api/v1/search.py new file mode 100644 index 0000000..6ae31f6 --- /dev/null +++ b/backend/src/bookbytes/api/v1/search.py @@ -0,0 +1,265 @@ +"""Book search and lookup endpoints. + +Provides endpoints for searching books via OpenLibrary, +fetching work details, and ISBN lookups. +""" + +from typing import Annotated + +from fastapi import APIRouter, Depends, Query, status + +from bookbytes.core.exceptions import BookNotFoundError, ExternalServiceError +from bookbytes.core.logging import get_logger +from bookbytes.schemas.common import ErrorResponse +from bookbytes.schemas.search import ( + BookSearchRequest, + BookSearchResponse, + BookSearchResultItem, + WorkResponse, +) +from bookbytes.services.cache import CacheService, get_cache_service +from bookbytes.services.openlibrary import ( + OpenLibraryError, + OpenLibraryService, +) + +logger = get_logger(__name__) + +router = APIRouter() + + +# ============================================================================= +# Dependencies +# ============================================================================= + + +def get_openlibrary_service( + cache: Annotated[CacheService, Depends(get_cache_service)], +) -> OpenLibraryService: + """Create OpenLibraryService with injected cache.""" + return OpenLibraryService(cache) + + +# ============================================================================= +# Search Endpoints +# 
============================================================================= + + +@router.post( + "/search", + response_model=BookSearchResponse, + status_code=status.HTTP_200_OK, + summary="Search for books", + description="Search books via OpenLibrary by title, author, publisher, or language.", + responses={ + 200: {"description": "Search results"}, + 400: {"model": ErrorResponse, "description": "Invalid request"}, + 502: {"model": ErrorResponse, "description": "External service error"}, + }, +) +async def search_books( + request: BookSearchRequest, + openlibrary: Annotated[OpenLibraryService, Depends(get_openlibrary_service)], + page: Annotated[int, Query(ge=1, description="Page number")] = 1, + page_size: Annotated[int, Query(ge=1, le=100, description="Results per page")] = 20, +) -> BookSearchResponse: + """Search for books by title, author, etc. + + Uses OpenLibrary API with caching. Results are paginated from + a larger cached result set (100 per API call). + """ + logger.info( + "search_books_request", + title=request.title, + author=request.author, + page=page, + page_size=page_size, + ) + + try: + # The service fetches results in batches of 100; align the API offset + # to the start of the 100-result batch containing this page's first item + start_index = (page - 1) * page_size + api_offset = (start_index // 100) * 100 + + result = await openlibrary.search_books( + title=request.title, + author=request.author, + publisher=request.publisher, + language=request.language, + offset=api_offset, + ) + + # Slice the requested page out of the cached 100-result batch + batch_start = start_index - api_offset + batch_end = batch_start + page_size + page_results = result.results[batch_start:batch_end] + + # Convert to response format + items = [ + BookSearchResultItem( + title=r.title, + authors=r.authors, + first_publish_year=r.first_publish_year, + cover_url=r.cover_url, + isbn_list=r.isbn_list, + edition_count=r.edition_count, + subjects=r.subjects, + external_work_key=r.external_work_key, 
+ ) + for r in page_results + ] + + logger.info( + "search_books_success", + total_found=result.total_found, + results_returned=len(items), + ) + + return BookSearchResponse( + results=items, + total_found=result.total_found, + page=page, + page_size=page_size, + has_more=(page * page_size) < result.total_found, + ) + + except OpenLibraryError as e: + logger.error("search_books_failed", error=str(e)) + raise ExternalServiceError( + message="Failed to search books", + details={"provider": "openlibrary", "error": str(e)}, + ) from e + finally: + await openlibrary.close() + + +@router.get( + "/works/{work_key:path}", + response_model=WorkResponse, + status_code=status.HTTP_200_OK, + summary="Get work details", + description="Fetch detailed information about a work by its OpenLibrary key.", + responses={ + 200: {"description": "Work details"}, + 404: {"model": ErrorResponse, "description": "Work not found"}, + 502: {"model": ErrorResponse, "description": "External service error"}, + }, +) +async def get_work_details( + work_key: str, + openlibrary: Annotated[OpenLibraryService, Depends(get_openlibrary_service)], +) -> WorkResponse: + """Get detailed information about a work. 
+ + Args: + work_key: OpenLibrary work key (e.g., "works/OL27448W") + """ + # Normalize work key + if not work_key.startswith("/"): + work_key = f"/{work_key}" + + logger.info("get_work_details_request", work_key=work_key) + + try: + work = await openlibrary.get_work_details(work_key) + + logger.info("get_work_details_success", work_key=work_key, title=work.title) + + return WorkResponse( + id=None, # Not yet in our library + title=work.title, + authors=work.authors, + description=work.description, + subjects=work.subjects, + first_publish_year=work.first_publish_year, + cover_url=work.cover_url, + edition_count=work.edition_count, + external_work_key=work.external_work_key, + editions=[], # Editions fetched separately + ) + + except OpenLibraryError as e: + logger.error("get_work_details_failed", work_key=work_key, error=str(e)) + raise ExternalServiceError( + message="Failed to fetch work details", + details={"work_key": work_key, "error": str(e)}, + ) from e + finally: + await openlibrary.close() + + +@router.get( + "/isbn/{isbn}", + response_model=WorkResponse, + status_code=status.HTTP_200_OK, + summary="Lookup by ISBN", + description="Find a book by ISBN. Checks library first, then queries OpenLibrary.", + responses={ + 200: {"description": "Book details"}, + 404: {"model": ErrorResponse, "description": "ISBN not found"}, + 502: {"model": ErrorResponse, "description": "External service error"}, + }, +) +async def lookup_by_isbn( + isbn: str, + openlibrary: Annotated[OpenLibraryService, Depends(get_openlibrary_service)] = None, +) -> WorkResponse: + """Lookup a book by ISBN. + + First checks if the ISBN exists in our library. + If not, queries OpenLibrary and optionally stores the result. 
+
+    Args:
+        isbn: ISBN-10 or ISBN-13
+    """
+    # Normalize ISBN (remove dashes/spaces)
+    clean_isbn = isbn.replace("-", "").replace(" ", "")
+
+    logger.info("lookup_isbn_request", isbn=clean_isbn)
+
+    # TODO: Check library first (EditionRepository)
+    # For now, search OpenLibrary directly
+
+    try:
+        # Provisional: search_books has no dedicated ISBN parameter yet,
+        # so pass the ISBN string as the title query
+        result = await openlibrary.search_books(title=clean_isbn)
+
+        if not result.results:
+            logger.warning("lookup_isbn_not_found", isbn=clean_isbn)
+            raise BookNotFoundError(isbn=clean_isbn)
+
+        # Get the first match and fetch full work details
+        first_match = result.results[0]
+        work = await openlibrary.get_work_details(first_match.external_work_key)
+
+        logger.info(
+            "lookup_isbn_success",
+            isbn=clean_isbn,
+            work_key=work.external_work_key,
+            title=work.title,
+        )
+
+        return WorkResponse(
+            id=None,
+            title=work.title,
+            authors=work.authors,
+            description=work.description,
+            subjects=work.subjects,
+            first_publish_year=work.first_publish_year,
+            cover_url=work.cover_url,
+            edition_count=work.edition_count,
+            external_work_key=work.external_work_key,
+            editions=[],
+        )
+
+    except BookNotFoundError:
+        raise
+    except OpenLibraryError as e:
+        logger.error("lookup_isbn_failed", isbn=clean_isbn, error=str(e))
+        raise ExternalServiceError(
+            message="Failed to lookup ISBN",
+            details={"isbn": clean_isbn, "error": str(e)},
+        ) from e
+    finally:
+        await openlibrary.close()
diff --git a/backend/src/bookbytes/config.py b/backend/src/bookbytes/config.py
new file mode 100644
index 0000000..fa3cbd1
--- /dev/null
+++ b/backend/src/bookbytes/config.py
@@ -0,0 +1,284 @@
+"""Application configuration using Pydantic Settings.
+
+This module provides centralized configuration management with environment
+variable validation, type coercion, and default values.
+""" + +from enum import Enum +from functools import lru_cache + +from pydantic import Field, SecretStr, field_validator +from pydantic_settings import BaseSettings, SettingsConfigDict + + +class Environment(str, Enum): + """Application environment.""" + + DEVELOPMENT = "development" + STAGING = "staging" + PRODUCTION = "production" + + +class LogLevel(str, Enum): + """Log level options.""" + + DEBUG = "DEBUG" + INFO = "INFO" + WARNING = "WARNING" + ERROR = "ERROR" + CRITICAL = "CRITICAL" + + +class LogFormat(str, Enum): + """Log format options.""" + + JSON = "json" + CONSOLE = "console" + + +class StorageBackend(str, Enum): + """Storage backend options.""" + + LOCAL = "local" + S3 = "s3" + + +class AuthMode(str, Enum): + """Authentication mode options.""" + + JWT = "jwt" + API_KEY = "api_key" + + +class Settings(BaseSettings): + """Application settings with environment variable validation. + + All settings can be overridden via environment variables. + Sensitive values should be provided via environment variables or .env file. 
+ """ + + model_config = SettingsConfigDict( + env_file=".env", + env_file_encoding="utf-8", + case_sensitive=False, + extra="ignore", + ) + + # ======================================== + # Application + # ======================================== + app_env: Environment = Field( + default=Environment.DEVELOPMENT, + description="Application environment", + ) + debug: bool = Field( + default=False, + description="Enable debug mode", + ) + app_name: str = Field( + default="BookBytes", + description="Application name", + ) + app_version: str = Field( + default="0.1.0", + description="Application version", + ) + + # ======================================== + # Logging + # ======================================== + log_level: LogLevel = Field( + default=LogLevel.INFO, + description="Logging level", + ) + log_format: LogFormat = Field( + default=LogFormat.CONSOLE, + description="Log output format (json for production, console for dev)", + ) + + # ======================================== + # Server + # ======================================== + host: str = Field( + default="0.0.0.0", + description="Server host", + ) + port: int = Field( + default=8000, + ge=1, + le=65535, + description="Server port", + ) + + # ======================================== + # Database + # ======================================== + database_url: str = Field( + default="postgresql+asyncpg://bookbytes:bookbytes@localhost:5432/bookbytes", + description="PostgreSQL database URL with async driver", + ) + database_pool_min: int = Field( + default=2, + ge=1, + description="Minimum database connection pool size", + ) + database_pool_max: int = Field( + default=10, + ge=1, + description="Maximum database connection pool size", + ) + + # ======================================== + # Redis + # ======================================== + redis_url: str = Field( + default="redis://localhost:6379/0", + description="Redis connection URL", + ) + + # ======================================== + # Storage + # 
======================================== + storage_backend: StorageBackend = Field( + default=StorageBackend.LOCAL, + description="Storage backend to use (local or s3)", + ) + local_storage_path: str = Field( + default="./data/audio", + description="Local filesystem path for audio storage", + ) + + # S3 Configuration + s3_bucket: str = Field( + default="bookbytes-audio", + description="S3 bucket name for audio storage", + ) + s3_region: str = Field( + default="us-east-1", + description="AWS S3 region", + ) + aws_access_key_id: str | None = Field( + default=None, + description="AWS access key ID (optional, uses IAM role if not provided)", + ) + aws_secret_access_key: SecretStr | None = Field( + default=None, + description="AWS secret access key", + ) + s3_url_expiry_seconds: int = Field( + default=0, + ge=0, + description="S3 pre-signed URL expiry in seconds (0 = no expiry/public)", + ) + + # ======================================== + # External APIs + # ======================================== + openai_api_key: SecretStr = Field( + default=SecretStr(""), + description="OpenAI API key for chapter extraction and summaries", + ) + openai_model: str = Field( + default="gpt-3.5-turbo", + description="OpenAI model to use", + ) + openai_timeout: int = Field( + default=30, + ge=1, + description="OpenAI API request timeout in seconds", + ) + + # OpenLibrary API + openlibrary_base_url: str = Field( + default="https://openlibrary.org", + description="OpenLibrary API base URL", + ) + openlibrary_timeout: int = Field( + default=30, + ge=1, + description="OpenLibrary API request timeout in seconds", + ) + openlibrary_page_size: int = Field( + default=100, + ge=1, + le=1000, + description="Number of results per OpenLibrary search request", + ) + + # ======================================== + # Authentication + # ======================================== + auth_mode: AuthMode = Field( + default=AuthMode.API_KEY, + description="Authentication mode (jwt for production, api_key for 
local dev)", + ) + jwt_secret_key: SecretStr = Field( + default=SecretStr("dev-secret-key-change-in-production-!!!"), + description="Secret key for JWT token signing", + ) + jwt_algorithm: str = Field( + default="HS256", + description="JWT signing algorithm", + ) + jwt_expire_minutes: int = Field( + default=30, + ge=1, + description="JWT access token expiry in minutes", + ) + api_key: SecretStr = Field( + default=SecretStr("dev-api-key-12345"), + description="API key for local development bypass (used when AUTH_MODE=api_key)", + ) + + # ======================================== + # Worker + # ======================================== + worker_max_jobs: int = Field( + default=5, + ge=1, + description="Maximum concurrent jobs per worker", + ) + worker_job_timeout: int = Field( + default=600, + ge=60, + description="Job timeout in seconds (default 10 minutes)", + ) + + # ======================================== + # Derived Properties + # ======================================== + @property + def is_development(self) -> bool: + """Check if running in development mode.""" + return self.app_env == Environment.DEVELOPMENT + + @property + def is_production(self) -> bool: + """Check if running in production mode.""" + return self.app_env == Environment.PRODUCTION + + @property + def use_json_logs(self) -> bool: + """Check if JSON logging should be used.""" + return self.log_format == LogFormat.JSON or self.is_production + + @field_validator("jwt_secret_key") + @classmethod + def validate_jwt_secret(cls, v: SecretStr) -> SecretStr: + """Warn if using default JWT secret in non-dev environment.""" + # This is a soft validation - we log a warning but don't fail + # The actual check happens at runtime in production + return v + + +@lru_cache +def get_settings() -> Settings: + """Get cached application settings. + + This function is cached to avoid re-reading environment variables + on every access. Use dependency injection in FastAPI routes. 
+
+    Returns:
+        Settings: Application settings instance
+    """
+    return Settings()
diff --git a/backend/src/bookbytes/core/__init__.py b/backend/src/bookbytes/core/__init__.py
new file mode 100644
index 0000000..9e7d497
--- /dev/null
+++ b/backend/src/bookbytes/core/__init__.py
@@ -0,0 +1 @@
+"""Core utilities and infrastructure package for BookBytes."""
diff --git a/backend/src/bookbytes/core/database.py b/backend/src/bookbytes/core/database.py
new file mode 100644
index 0000000..5910f82
--- /dev/null
+++ b/backend/src/bookbytes/core/database.py
@@ -0,0 +1,183 @@
+"""Async database engine and session management.
+
+This module provides the core database infrastructure:
+- Async SQLAlchemy engine with connection pooling
+- Async session factory for request-scoped sessions
+- Database lifecycle management (init/close)
+
+Usage:
+    from bookbytes.core.database import init_db, close_db, get_async_session
+
+    # At startup
+    await init_db(settings)
+
+    # In request handlers (via dependency injection); get_async_session
+    # is an async generator, so iterate it rather than using `async with`
+    async for session in get_async_session():
+        ...
+
+    # At shutdown
+    await close_db()
+"""
+
+from collections.abc import AsyncGenerator
+from typing import Any
+
+from sqlalchemy import text
+from sqlalchemy.ext.asyncio import (
+    AsyncEngine,
+    AsyncSession,
+    async_sessionmaker,
+    create_async_engine,
+)
+from sqlalchemy.pool import NullPool
+
+from bookbytes.config import Settings
+from bookbytes.core.logging import get_logger
+
+logger = get_logger(__name__)
+
+# Global engine and session factory (initialized at startup)
+_engine: AsyncEngine | None = None
+_async_session_factory: async_sessionmaker[AsyncSession] | None = None
+
+
+def get_engine() -> AsyncEngine:
+    """Get the database engine.
+
+    Raises:
+        RuntimeError: If database is not initialized
+    """
+    if _engine is None:
+        raise RuntimeError("Database not initialized. Call init_db() first.")
+    return _engine
+
+
+def get_session_factory() -> async_sessionmaker[AsyncSession]:
+    """Get the session factory.
+ + Raises: + RuntimeError: If database is not initialized + """ + if _async_session_factory is None: + raise RuntimeError("Database not initialized. Call init_db() first.") + return _async_session_factory + + +async def get_async_session() -> AsyncGenerator[AsyncSession, None]: + """Yield an async database session. + + This is designed to be used as a FastAPI dependency or in any async context. + The session is automatically closed when the context exits. + + Yields: + AsyncSession: Database session for the current context + """ + factory = get_session_factory() + async with factory() as session: + try: + yield session + except Exception: + await session.rollback() + raise + + +async def init_db(settings: Settings) -> None: + """Initialize the database engine and session factory. + + This should be called once at application startup. + + Args: + settings: Application settings containing database configuration + """ + global _engine, _async_session_factory + + logger.info( + "Initializing database", + database_url=_mask_password(settings.database_url), + ) + + # Determine pool settings based on database type + is_sqlite = settings.database_url.startswith("sqlite") + + engine_kwargs: dict[str, Any] = { + "echo": settings.debug, + } + + if is_sqlite: + # SQLite doesn't support connection pooling the same way + engine_kwargs["poolclass"] = NullPool + # Required for SQLite async + engine_kwargs["connect_args"] = {"check_same_thread": False} + else: + # PostgreSQL connection pool settings + engine_kwargs["pool_size"] = settings.database_pool_min + engine_kwargs["max_overflow"] = ( + settings.database_pool_max - settings.database_pool_min + ) + engine_kwargs["pool_pre_ping"] = True # Verify connections before use + + _engine = create_async_engine( + settings.database_url, + **engine_kwargs, + ) + + _async_session_factory = async_sessionmaker( + bind=_engine, + class_=AsyncSession, + expire_on_commit=False, + autoflush=False, + ) + + logger.info("Database initialized 
successfully") + + +async def close_db() -> None: + """Close the database engine and all connections. + + This should be called at application shutdown. + """ + global _engine, _async_session_factory + + if _engine is not None: + logger.info("Closing database connections") + await _engine.dispose() + _engine = None + _async_session_factory = None + logger.info("Database connections closed") + + +async def check_db_connection() -> bool: + """Check if the database connection is working. + + Returns: + True if connection is healthy, False otherwise + """ + try: + engine = get_engine() + async with engine.connect() as conn: + await conn.execute(text("SELECT 1")) + return True + except Exception as e: + logger.error("Database health check failed", error=str(e)) + return False + + +def _mask_password(url: str) -> str: + """Mask password in database URL for logging. + + Args: + url: Database URL + + Returns: + URL with password masked + """ + # Simple masking - replace password between :// and @ + if "://" in url and "@" in url: + prefix = url.split("://")[0] + "://" + rest = url.split("://")[1] + if "@" in rest: + creds, host = rest.split("@", 1) + if ":" in creds: + user = creds.split(":")[0] + return f"{prefix}{user}:****@{host}" + return url diff --git a/backend/src/bookbytes/core/exceptions.py b/backend/src/bookbytes/core/exceptions.py new file mode 100644 index 0000000..6a39d6c --- /dev/null +++ b/backend/src/bookbytes/core/exceptions.py @@ -0,0 +1,391 @@ +"""Custom exception hierarchy for BookBytes. + +This module defines a consistent exception hierarchy that enables: +- Structured error responses with error codes +- Consistent HTTP status code mapping +- Machine-readable error handling for API consumers + +Usage: + from bookbytes.core.exceptions import BookNotFoundError + + raise BookNotFoundError(isbn="1234567890") +""" + +from typing import Any + + +class BookBytesError(Exception): + """Base exception for all BookBytes errors. 
+ + All custom exceptions should inherit from this class to enable + consistent error handling and response formatting. + + Attributes: + code: Machine-readable error code (e.g., "BOOK_NOT_FOUND") + message: Human-readable error message + status_code: HTTP status code to return + details: Additional error details (optional) + """ + + code: str = "INTERNAL_ERROR" + message: str = "An unexpected error occurred" + status_code: int = 500 + + def __init__( + self, + message: str | None = None, + code: str | None = None, + details: dict[str, Any] | None = None, + ) -> None: + """Initialize the exception. + + Args: + message: Override default message + code: Override default error code + details: Additional error details + """ + if message: + self.message = message + if code: + self.code = code + self.details = details or {} + super().__init__(self.message) + + def to_dict(self, request_id: str | None = None) -> dict[str, Any]: + """Convert exception to API error response format. + + Args: + request_id: Request correlation ID + + Returns: + Error response dictionary + """ + error: dict[str, Any] = { + "code": self.code, + "message": self.message, + } + if request_id: + error["request_id"] = request_id + if self.details: + error["details"] = self.details + return {"error": error} + + +# ============================================================================= +# Resource Not Found Errors (404) +# ============================================================================= + + +class NotFoundError(BookBytesError): + """Base class for resource not found errors.""" + + status_code: int = 404 + + +class BookNotFoundError(NotFoundError): + """Raised when a book cannot be found.""" + + code: str = "BOOK_NOT_FOUND" + message: str = "Book not found" + + def __init__( + self, + book_id: str | None = None, + isbn: str | None = None, + message: str | None = None, + ) -> None: + """Initialize with optional identifiers. 
+ + Args: + book_id: UUID of the book + isbn: ISBN of the book + message: Override default message + """ + details: dict[str, Any] = {} + if book_id: + details["book_id"] = book_id + if isbn: + details["isbn"] = isbn + + if not message: + if isbn: + message = f"Book with ISBN {isbn} not found" + elif book_id: + message = f"Book with ID {book_id} not found" + + super().__init__(message=message, details=details if details else None) + + +class ChapterNotFoundError(NotFoundError): + """Raised when a chapter cannot be found.""" + + code: str = "CHAPTER_NOT_FOUND" + message: str = "Chapter not found" + + def __init__( + self, + chapter_id: str | None = None, + book_id: str | None = None, + chapter_number: int | None = None, + message: str | None = None, + ) -> None: + """Initialize with optional identifiers.""" + details: dict[str, Any] = {} + if chapter_id: + details["chapter_id"] = chapter_id + if book_id: + details["book_id"] = book_id + if chapter_number is not None: + details["chapter_number"] = chapter_number + + if not message and chapter_number is not None and book_id: + message = f"Chapter {chapter_number} not found for book {book_id}" + + super().__init__(message=message, details=details if details else None) + + +class JobNotFoundError(NotFoundError): + """Raised when a job cannot be found.""" + + code: str = "JOB_NOT_FOUND" + message: str = "Job not found" + + def __init__(self, job_id: str | None = None, message: str | None = None) -> None: + """Initialize with optional job ID.""" + details: dict[str, Any] = {} + if job_id: + details["job_id"] = job_id + if not message: + message = f"Job with ID {job_id} not found" + + super().__init__(message=message, details=details if details else None) + + +class UserNotFoundError(NotFoundError): + """Raised when a user cannot be found.""" + + code: str = "USER_NOT_FOUND" + message: str = "User not found" + + def __init__( + self, + user_id: str | None = None, + email: str | None = None, + message: str | None = None, + 
) -> None: + """Initialize with optional identifiers.""" + details: dict[str, Any] = {} + if user_id: + details["user_id"] = user_id + if email: + details["email"] = email + + super().__init__(message=message, details=details if details else None) + + +class ISBNNotFoundError(NotFoundError): + """Raised when an ISBN lookup fails (from external API).""" + + code: str = "ISBN_NOT_FOUND" + message: str = "ISBN not found in metadata service" + + def __init__(self, isbn: str | None = None, message: str | None = None) -> None: + """Initialize with optional ISBN.""" + details: dict[str, Any] = {} + if isbn: + details["isbn"] = isbn + if not message: + message = f"ISBN {isbn} not found in Open Library" + + super().__init__(message=message, details=details if details else None) + + +# ============================================================================= +# Authentication & Authorization Errors (401, 403) +# ============================================================================= + + +class AuthenticationError(BookBytesError): + """Raised when authentication fails.""" + + code: str = "AUTHENTICATION_FAILED" + message: str = "Authentication failed" + status_code: int = 401 + + +class InvalidCredentialsError(AuthenticationError): + """Raised when login credentials are invalid.""" + + code: str = "INVALID_CREDENTIALS" + message: str = "Invalid email or password" + + +class InvalidTokenError(AuthenticationError): + """Raised when JWT token is invalid or expired.""" + + code: str = "INVALID_TOKEN" + message: str = "Invalid or expired token" + + +class AuthorizationError(BookBytesError): + """Raised when user lacks permission for an action.""" + + code: str = "AUTHORIZATION_FAILED" + message: str = "You do not have permission to perform this action" + status_code: int = 403 + + +# ============================================================================= +# Validation Errors (400) +# ============================================================================= + + 
+class ValidationError(BookBytesError): + """Raised when input validation fails.""" + + code: str = "VALIDATION_ERROR" + message: str = "Validation error" + status_code: int = 400 + + def __init__( + self, + message: str | None = None, + field: str | None = None, + details: dict[str, Any] | None = None, + ) -> None: + """Initialize with optional field information.""" + if details is None: + details = {} + if field: + details["field"] = field + super().__init__(message=message, details=details if details else None) + + +class InvalidISBNError(ValidationError): + """Raised when ISBN format is invalid.""" + + code: str = "INVALID_ISBN" + message: str = "Invalid ISBN format" + + def __init__(self, isbn: str | None = None, message: str | None = None) -> None: + """Initialize with optional ISBN.""" + details: dict[str, Any] = {} + if isbn: + details["isbn"] = isbn + super().__init__(message=message, field="isbn", details=details) + + +class DuplicateEmailError(ValidationError): + """Raised when email is already registered.""" + + code: str = "DUPLICATE_EMAIL" + message: str = "Email is already registered" + + +# ============================================================================= +# External Service Errors (502, 503) +# ============================================================================= + + +class ExternalServiceError(BookBytesError): + """Base class for external service errors.""" + + code: str = "EXTERNAL_SERVICE_ERROR" + message: str = "External service error" + status_code: int = 502 + + +class OpenAIServiceError(ExternalServiceError): + """Raised when OpenAI API call fails.""" + + code: str = "OPENAI_SERVICE_ERROR" + message: str = "Failed to communicate with OpenAI" + + +class TTSServiceError(ExternalServiceError): + """Raised when TTS service fails.""" + + code: str = "TTS_SERVICE_ERROR" + message: str = "Failed to generate audio" + + +class MetadataServiceError(ExternalServiceError): + """Raised when book metadata service fails.""" + + code: 
str = "METADATA_SERVICE_ERROR" + message: str = "Failed to fetch book metadata" + + +class StorageServiceError(ExternalServiceError): + """Raised when storage operations fail.""" + + code: str = "STORAGE_SERVICE_ERROR" + message: str = "Failed to store or retrieve file" + + +# ============================================================================= +# Job Processing Errors (409, 500) +# ============================================================================= + + +class JobError(BookBytesError): + """Base class for job processing errors.""" + + code: str = "JOB_ERROR" + message: str = "Job processing error" + + +class JobAlreadyExistsError(JobError): + """Raised when trying to create a duplicate job for a book.""" + + code: str = "JOB_ALREADY_EXISTS" + message: str = "A job for this book is already in progress" + status_code: int = 409 + + def __init__( + self, + book_id: str | None = None, + job_id: str | None = None, + message: str | None = None, + ) -> None: + """Initialize with optional identifiers. 
+ + Args: + book_id: UUID of the book being processed + job_id: ID of the existing job + message: Override default message + """ + details: dict[str, Any] = {} + if book_id: + details["book_id"] = book_id + if job_id: + details["existing_job_id"] = job_id + + if not message and book_id: + message = f"A job for book {book_id} is already in progress" + + super().__init__(message=message, details=details if details else None) + + +class JobProcessingError(JobError): + """Raised when job processing fails.""" + + code: str = "JOB_PROCESSING_FAILED" + message: str = "Job processing failed" + status_code: int = 500 + + def __init__( + self, + job_id: str | None = None, + step: str | None = None, + error: str | None = None, + ) -> None: + """Initialize with optional job details.""" + details: dict[str, Any] = {} + if job_id: + details["job_id"] = job_id + if step: + details["failed_step"] = step + if error: + details["error"] = error + super().__init__(details=details if details else None) diff --git a/backend/src/bookbytes/core/logging.py b/backend/src/bookbytes/core/logging.py new file mode 100644 index 0000000..ed8145c --- /dev/null +++ b/backend/src/bookbytes/core/logging.py @@ -0,0 +1,169 @@ +"""Structured logging configuration using Python's standard logging. 
+ +This module configures logging with: +- JSON output for production (machine-readable) +- Console output for development (human-readable) +- Correlation ID support for request tracing +- Structlog integration for structured log entries + +Usage: + from bookbytes.core.logging import configure_logging, get_logger + + # Configure at app startup + configure_logging(log_level="INFO", json_format=True) + + # Get a logger in any module + logger = get_logger(__name__) + logger.info("Processing book", isbn="1234567890", user_id="abc-123") +""" + +import logging +import sys +from contextvars import ContextVar +from typing import Any + +import structlog +from structlog.types import EventDict, Processor + +from bookbytes.config import Settings + +# Context variable for correlation/request ID +correlation_id_ctx: ContextVar[str | None] = ContextVar("correlation_id", default=None) + + +def get_correlation_id() -> str | None: + """Get the current correlation ID from context.""" + return correlation_id_ctx.get() + + +def set_correlation_id(correlation_id: str) -> None: + """Set the correlation ID in context.""" + correlation_id_ctx.set(correlation_id) + + +def clear_correlation_id() -> None: + """Clear the correlation ID from context.""" + correlation_id_ctx.set(None) + + +def add_correlation_id( + logger: logging.Logger, method_name: str, event_dict: EventDict +) -> EventDict: + """Structlog processor to add correlation ID to log entries.""" + correlation_id = get_correlation_id() + if correlation_id: + event_dict["correlation_id"] = correlation_id + return event_dict + + +def add_app_context( + logger: logging.Logger, method_name: str, event_dict: EventDict +) -> EventDict: + """Add application context to log entries.""" + event_dict["service"] = "bookbytes" + return event_dict + + +def configure_logging(settings: Settings | None = None) -> None: + """Configure structured logging for the application. + + Args: + settings: Application settings. If None, uses default settings. 
+    """
+    if settings is None:
+        from bookbytes.config import get_settings
+
+        settings = get_settings()
+
+    # Determine log level
+    log_level = getattr(logging, settings.log_level.value, logging.INFO)
+
+    # Shared processors for all environments
+    shared_processors: list[Processor] = [
+        structlog.contextvars.merge_contextvars,
+        structlog.stdlib.add_log_level,
+        structlog.stdlib.add_logger_name,
+        structlog.stdlib.PositionalArgumentsFormatter(),
+        add_correlation_id,
+        add_app_context,
+        structlog.processors.TimeStamper(fmt="iso"),
+        structlog.processors.StackInfoRenderer(),
+        structlog.processors.UnicodeDecoder(),
+    ]
+
+    # End the structlog pipeline with wrap_for_formatter so every record is
+    # rendered exactly once, by the handler's ProcessorFormatter below
+    if settings.use_json_logs:
+        # Production: JSON format for log aggregation
+        processors: list[Processor] = [
+            *shared_processors,
+            structlog.processors.format_exc_info,
+            structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
+        ]
+        formatter = structlog.stdlib.ProcessorFormatter(
+            processor=structlog.processors.JSONRenderer(),
+            foreign_pre_chain=shared_processors,
+        )
+    else:
+        # Development: Pretty console output
+        processors = [
+            *shared_processors,
+            structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
+        ]
+        formatter = structlog.stdlib.ProcessorFormatter(
+            processor=structlog.dev.ConsoleRenderer(colors=True),
+            foreign_pre_chain=shared_processors,
+        )
+
+    # Configure structlog
+    structlog.configure(
+        processors=processors,
+        wrapper_class=structlog.stdlib.BoundLogger,
+        context_class=dict,
+        logger_factory=structlog.stdlib.LoggerFactory(),
+        cache_logger_on_first_use=True,
+    )
+
+    # Configure root logger
+    root_logger = logging.getLogger()
+    root_logger.setLevel(log_level)
+
+    # Remove existing handlers
+    for handler in root_logger.handlers[:]:
+        root_logger.removeHandler(handler)
+
+    # Add new handler with structlog formatter
+    handler = logging.StreamHandler(sys.stdout)
+    handler.setFormatter(formatter)
+    handler.setLevel(log_level)
+    root_logger.addHandler(handler)
+
+    # Quiet noisy loggers
+    logging.getLogger("uvicorn.access").setLevel(logging.WARNING)
+
logging.getLogger("httpx").setLevel(logging.WARNING) + logging.getLogger("httpcore").setLevel(logging.WARNING) + + +def get_logger(name: str | None = None) -> structlog.stdlib.BoundLogger: + """Get a structured logger instance. + + Args: + name: Logger name, typically __name__ of the calling module. + + Returns: + A bound structlog logger that outputs structured logs. + + Example: + logger = get_logger(__name__) + logger.info("Processing started", book_id="123", chapter=1) + logger.error("Processing failed", book_id="123", error=str(e)) + """ + return structlog.get_logger(name) + + +def log_context(**kwargs: Any) -> structlog.contextvars.bound_contextvars: + """Context manager to bind values to all logs within the context. + + Example: + with log_context(request_id="abc-123", user_id="user-456"): + logger.info("Processing") # Includes request_id and user_id + """ + return structlog.contextvars.bound_contextvars(**kwargs) diff --git a/backend/src/bookbytes/dependencies.py b/backend/src/bookbytes/dependencies.py new file mode 100644 index 0000000..c9c3475 --- /dev/null +++ b/backend/src/bookbytes/dependencies.py @@ -0,0 +1,178 @@ +"""FastAPI dependency injection container. + +This module provides dependency injection functions for use with FastAPI's +Depends() pattern. Dependencies are organized by functionality and can be +easily mocked for testing. +""" + +from collections.abc import AsyncGenerator +from functools import lru_cache +from typing import Annotated + +from fastapi import Depends, Request +from sqlalchemy.ext.asyncio import AsyncSession + +from bookbytes.config import Settings, get_settings + +# Type alias for common dependency patterns +SettingsDep = Annotated[Settings, Depends(get_settings)] + + +# ======================================== +# Settings Dependencies +# ======================================== +def get_settings_from_request(request: Request) -> Settings: + """Get settings from request state (set during lifespan). 
+ + This is useful when you need settings that were potentially + modified during startup. + + Args: + request: The current request + + Returns: + Settings: Application settings + """ + return request.app.state.settings + + +# ======================================== +# Database Dependencies +# ======================================== +async def get_db_session() -> AsyncGenerator[AsyncSession, None]: + """Get an async database session. + + Yields a database session that automatically handles + commit on success and rollback on exception. + + Yields: + AsyncSession: Database session + """ + + from bookbytes.core.database import get_async_session + + async for session in get_async_session(): + try: + yield session + await session.commit() + except Exception: + await session.rollback() + raise + + +# ======================================== +# Redis Dependencies +# ======================================== +# TODO: Implement in Phase 3 +async def get_redis() -> AsyncGenerator[None, None]: + """Get a Redis connection. + + Yields a Redis connection from the connection pool. + + Yields: + Redis: Redis connection + """ + # Placeholder - will be implemented in Phase 3 + # redis = await aioredis.from_url(settings.redis_url) + # try: + # yield redis + # finally: + # await redis.close() + yield None # type: ignore + + +# ======================================== +# Storage Dependencies +# ======================================== +# TODO: Implement in Phase 5 +@lru_cache +def get_storage() -> None: + """Get the configured storage backend. + + Returns the appropriate storage backend (Local or S3) based on + the STORAGE_BACKEND configuration. 
+ + Returns: + StorageBackend: Configured storage backend + """ + # Placeholder - will be implemented in Phase 5 + # settings = get_settings() + # from bookbytes.storage import get_storage_backend + # return get_storage_backend(settings) + return None + + +# ======================================== +# Service Dependencies +# ======================================== +# TODO: Implement in Phase 5 +def get_metadata_service() -> None: + """Get the book metadata service. + + Returns: + BookMetadataService: Metadata service instance + """ + # Placeholder - will be implemented in Phase 5 + return None + + +def get_openai_service() -> None: + """Get the OpenAI service. + + Returns: + OpenAIService: OpenAI service instance + """ + # Placeholder - will be implemented in Phase 5 + return None + + +def get_tts_service() -> None: + """Get the TTS (Text-to-Speech) service. + + Returns: + TTSService: TTS service instance + """ + # Placeholder - will be implemented in Phase 5 + return None + + +def get_book_service() -> None: + """Get the book processing service. + + Returns: + BookService: Book service instance + """ + # Placeholder - will be implemented in Phase 5 + return None + + +# ======================================== +# Auth Dependencies +# ======================================== +# TODO: Implement in Phase 6 +async def get_current_user() -> None: + """Get the current authenticated user. + + Decodes the JWT token from the Authorization header, + validates it, and returns the user from the database. + + Raises: + HTTPException: 401 if token is invalid or user not found + + Returns: + User: Current authenticated user + """ + # Placeholder - will be implemented in Phase 6 + return None + + +async def get_current_user_optional() -> None: + """Get the current user if authenticated, otherwise None. + + Useful for endpoints that work with or without authentication. 
+ + Returns: + User | None: Current user or None if not authenticated + """ + # Placeholder - will be implemented in Phase 6 + return None diff --git a/backend/src/bookbytes/main.py b/backend/src/bookbytes/main.py new file mode 100644 index 0000000..b3787fa --- /dev/null +++ b/backend/src/bookbytes/main.py @@ -0,0 +1,363 @@ +"""FastAPI application factory for BookBytes. + +This module creates and configures the FastAPI application with: +- Lifespan management for startup/shutdown events +- Middleware configuration (CORS, request ID, logging) +- Exception handlers +- API routers +""" + +import time +import uuid +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager +from typing import Any + +from fastapi import FastAPI, Request, status +from fastapi.middleware.cors import CORSMiddleware +from fastapi.responses import JSONResponse + +from bookbytes.config import Settings, get_settings +from bookbytes.core.exceptions import BookBytesError +from bookbytes.core.logging import ( + clear_correlation_id, + configure_logging, + get_logger, + set_correlation_id, +) + +# Initialize logger for this module +logger = get_logger(__name__) + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]: + """Manage application lifespan events. 
+ + Handles initialization and cleanup of: + - Logging configuration + - Database connection pool + - Redis connection + - Any other resources that need lifecycle management + + Args: + app: The FastAPI application instance + + Yields: + None: Control back to the application + """ + from bookbytes.core.database import close_db, init_db + + settings = get_settings() + + # ======================================== + # Startup + # ======================================== + # Configure logging first + configure_logging(settings) + + # Re-get logger after configuration + startup_logger = get_logger(__name__) + + # Initialize database connection pool + await init_db(settings) + + # TODO: Initialize Redis connection (Phase 3) + + # Store settings in app state for access in dependencies + app.state.settings = settings + + # Log startup + startup_logger.info( + "Application starting", + app_name=settings.app_name, + version=settings.app_version, + environment=settings.app_env.value, + debug=settings.debug, + ) + + yield + + # ======================================== + # Shutdown + # ======================================== + # Close database connections + await close_db() + + # TODO: Close Redis connections (Phase 3) + # TODO: Wait for in-flight requests (Phase 7) + + startup_logger.info("Application shutting down", app_name=settings.app_name) + + +def create_app(settings: Settings | None = None) -> FastAPI: + """Create and configure the FastAPI application. + + This is the application factory function that creates a fully configured + FastAPI instance with all middleware, routes, and exception handlers. + + Args: + settings: Optional settings override for testing + + Returns: + FastAPI: Configured application instance + """ + if settings is None: + settings = get_settings() + + app = FastAPI( + title=settings.app_name, + description=( + "AI-powered book summarization and audio generation service. " + "Upload a book ISBN and get chapter summaries with audio narration." 
+        ),
+        version=settings.app_version,
+        # Expose interactive docs only in development; hide them in production.
+        docs_url="/docs" if settings.is_development else None,
+        redoc_url="/redoc" if settings.is_development else None,
+        openapi_url="/openapi.json" if settings.is_development else None,
+        lifespan=lifespan,
+    )
+
+    # ========================================
+    # Middleware
+    # ========================================
+    configure_middleware(app, settings)
+
+    # ========================================
+    # Exception Handlers
+    # ========================================
+    configure_exception_handlers(app)
+
+    # ========================================
+    # Routes
+    # ========================================
+    configure_routes(app)
+
+    return app
+
+
+def configure_middleware(app: FastAPI, settings: Settings) -> None:
+    """Configure application middleware.
+
+    Args:
+        app: The FastAPI application instance
+        settings: Application settings
+    """
+    # CORS middleware
+    app.add_middleware(
+        CORSMiddleware,
+        allow_origins=["*"] if settings.is_development else [],
+        allow_credentials=True,
+        allow_methods=["*"],
+        allow_headers=["*"],
+    )
+
+    # Request logging middleware
+    @app.middleware("http")
+    async def logging_middleware(request: Request, call_next: Any) -> Any:
+        """Log requests and responses with correlation ID."""
+        # Generate or extract request ID
+        request_id = request.headers.get("X-Request-ID", str(uuid.uuid4()))
+        request.state.request_id = request_id
+
+        # Set correlation ID for all logs in this request context
+        set_correlation_id(request_id)
+
+        # Log request start
+        request_logger = get_logger("bookbytes.request")
+        start_time = time.perf_counter()
+
+        request_logger.info(
+            "Request started",
+            method=request.method,
+            path=request.url.path,
+            query=str(request.query_params) if request.query_params else None,
+        )
+
+        try:
+            response = await call_next(request)
+
+            # Calculate duration
+            duration_ms = (time.perf_counter() - start_time) * 1000
+
+            # Log request completion
+            request_logger.info(
+                "Request
completed", + method=request.method, + path=request.url.path, + status_code=response.status_code, + duration_ms=round(duration_ms, 2), + ) + + # Add request ID to response headers + response.headers["X-Request-ID"] = request_id + + return response + + except Exception as exc: + duration_ms = (time.perf_counter() - start_time) * 1000 + request_logger.error( + "Request failed", + method=request.method, + path=request.url.path, + duration_ms=round(duration_ms, 2), + error=str(exc), + ) + raise + + finally: + # Clear correlation ID + clear_correlation_id() + + +def configure_exception_handlers(app: FastAPI) -> None: + """Configure global exception handlers. + + Args: + app: The FastAPI application instance + """ + exception_logger = get_logger("bookbytes.exceptions") + + @app.exception_handler(BookBytesError) + async def bookbytes_exception_handler( + request: Request, exc: BookBytesError + ) -> JSONResponse: + """Handle BookBytes custom exceptions with structured error response.""" + request_id = getattr(request.state, "request_id", None) + + # Log at appropriate level based on status code + if exc.status_code >= 500: + exception_logger.error( + "Application error", + error_code=exc.code, + error_message=exc.message, + status_code=exc.status_code, + path=request.url.path, + ) + else: + exception_logger.warning( + "Client error", + error_code=exc.code, + error_message=exc.message, + status_code=exc.status_code, + path=request.url.path, + ) + + return JSONResponse( + status_code=exc.status_code, + content=exc.to_dict(request_id=request_id), + ) + + @app.exception_handler(Exception) + async def global_exception_handler( + request: Request, exc: Exception + ) -> JSONResponse: + """Handle unexpected exceptions with a consistent error response.""" + request_id = getattr(request.state, "request_id", None) + + exception_logger.exception( + "Unhandled exception", + error_type=type(exc).__name__, + error_message=str(exc), + path=request.url.path, + ) + + return JSONResponse( + 
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, + content={ + "error": { + "code": "INTERNAL_SERVER_ERROR", + "message": "An unexpected error occurred", + "request_id": request_id, + } + }, + ) + + +def configure_routes(app: FastAPI) -> None: + """Configure application routes. + + Args: + app: The FastAPI application instance + """ + + # Placeholder health check endpoint + @app.get( + "/health/live", + tags=["Health"], + summary="Liveness probe", + description="Returns OK if the service is running", + ) + async def liveness() -> dict[str, str]: + """Liveness probe for container orchestration.""" + return {"status": "ok"} + + @app.get( + "/health/ready", + tags=["Health"], + summary="Readiness probe", + description="Returns OK if the service is ready to accept requests", + ) + async def readiness() -> dict[str, Any]: + """Readiness probe checking dependent services.""" + from bookbytes.core.database import check_db_connection + + # Check database connectivity + db_ok = await check_db_connection() + + # TODO: Check Redis connectivity (Phase 3) + redis_ok = True # Placeholder + + overall_status = "ok" if (db_ok and redis_ok) else "error" + + return { + "status": overall_status, + "checks": { + "database": "ok" if db_ok else "error", + "redis": "ok" if redis_ok else "error", + }, + } + + # Root endpoint + @app.get( + "/", + tags=["Root"], + summary="API root", + description="Returns API information", + ) + async def root() -> dict[str, str]: + """API root endpoint with service information.""" + settings = get_settings() + return { + "service": settings.app_name, + "version": settings.app_version, + "docs": "/docs", + "health": "/health/live", + } + + # Include API v1 router + from bookbytes.api.v1.router import router as v1_router + + app.include_router(v1_router, prefix="/api/v1") + + +# Create the application instance +app = create_app() + + +def cli() -> None: + """CLI entry point for running the application.""" + import uvicorn + + settings = get_settings() + 
uvicorn.run( + "bookbytes.main:app", + host=settings.host, + port=settings.port, + reload=settings.is_development, + log_level=settings.log_level.value.lower(), + ) + + +if __name__ == "__main__": + cli() diff --git a/backend/src/bookbytes/models/__init__.py b/backend/src/bookbytes/models/__init__.py new file mode 100644 index 0000000..fe654ca --- /dev/null +++ b/backend/src/bookbytes/models/__init__.py @@ -0,0 +1,43 @@ +"""Models package for BookBytes. + +This module exports the Base class and all model classes. +Models are added incrementally as each phase is implemented. +""" + +from bookbytes.models.audio_book import AudioBook, AudioBookStatus +from bookbytes.models.audio_book_job import AudioBookJob +from bookbytes.models.base import ( + Base, + SoftDeleteMixin, + TimestampMixin, + UUIDPrimaryKeyMixin, +) +from bookbytes.models.book_provider import BookProvider, BookProviderType +from bookbytes.models.chapter import Chapter +from bookbytes.models.edition import Edition +from bookbytes.models.job import Job, JobStatus, JobType +from bookbytes.models.work import Work + +__all__ = [ + # Base and Mixins + "Base", + "UUIDPrimaryKeyMixin", + "TimestampMixin", + "SoftDeleteMixin", + # Phase 3: Audio Books Library + "Work", + "Edition", + "BookProvider", + "BookProviderType", + "AudioBook", + "AudioBookStatus", + "Chapter", + # Phase 3.1: Audio Books Pipeline + "Job", + "JobStatus", + "JobType", + "AudioBookJob", + # NOTE: No APICache - using Redis-only caching + # Future phases will add: + # Phase 6: User (authentication) +] diff --git a/backend/src/bookbytes/models/audio_book.py b/backend/src/bookbytes/models/audio_book.py new file mode 100644 index 0000000..9de0b1e --- /dev/null +++ b/backend/src/bookbytes/models/audio_book.py @@ -0,0 +1,79 @@ +"""AudioBook model - our generated audiobook content. + +AudioBook is the core output entity - it represents our generated audio +content for a specific Edition. Uses soft delete to preserve history. 
+"""
+
+from __future__ import annotations
+
+from enum import Enum
+from typing import TYPE_CHECKING
+from uuid import UUID
+
+from sqlalchemy import ForeignKey, String
+from sqlalchemy.orm import Mapped, mapped_column, relationship
+
+from bookbytes.models.base import (
+    Base,
+    SoftDeleteMixin,
+    TimestampMixin,
+    UUIDPrimaryKeyMixin,
+)
+
+if TYPE_CHECKING:
+    from bookbytes.models.chapter import Chapter
+    from bookbytes.models.edition import Edition
+
+
+class AudioBookStatus(str, Enum):
+    """Processing status for audiobook generation."""
+
+    PENDING = "pending"
+    PROCESSING = "processing"
+    COMPLETED = "completed"
+    FAILED = "failed"
+
+
+class AudioBook(UUIDPrimaryKeyMixin, TimestampMixin, SoftDeleteMixin, Base):
+    """Generated audiobook for a specific edition.
+
+    Created when a user requests audiobook generation for an Edition.
+    Uses SoftDeleteMixin for soft delete support (preserves history).
+
+    Attributes:
+        edition_id: Foreign key to the Edition this audiobook is for
+        status: Processing status (pending, processing, completed, failed)
+        version: Version number (incremented on refresh/regeneration)
+        error_message: Error details if status is "failed"
+    """
+
+    __tablename__ = "audio_books"
+
+    edition_id: Mapped[UUID] = mapped_column(
+        ForeignKey("editions.id", ondelete="CASCADE"),
+        nullable=False,
+        index=True,
+    )
+    status: Mapped[str] = mapped_column(
+        String(20),
+        nullable=False,
+        default=AudioBookStatus.PENDING.value,
+        index=True,
+    )
+    version: Mapped[int] = mapped_column(nullable=False, default=1)
+    error_message: Mapped[str | None] = mapped_column(String(1000), nullable=True)
+
+    # Relationships
+    edition: Mapped[Edition] = relationship("Edition", back_populates="audio_book")
+    chapters: Mapped[list[Chapter]] = relationship(
+        "Chapter",
+        back_populates="audio_book",
+        cascade="all, delete-orphan",
+        order_by="Chapter.chapter_number",
+    )
+
+    def __repr__(self) -> str:
+        return (
+            f"<AudioBook(id={self.id}, edition_id={self.edition_id}, "
+            f"status={self.status}, version={self.version})>"
+        )
diff --git 
a/backend/src/bookbytes/models/audio_book_job.py b/backend/src/bookbytes/models/audio_book_job.py new file mode 100644 index 0000000..327e816 --- /dev/null +++ b/backend/src/bookbytes/models/audio_book_job.py @@ -0,0 +1,58 @@ +"""AudioBookJob relation - links jobs to audiobooks. + +This relation table maintains the link between generic jobs +and audiobook domain entities, keeping the Job model pure. +""" + +from __future__ import annotations + +from typing import TYPE_CHECKING +from uuid import UUID + +from sqlalchemy import ForeignKey +from sqlalchemy.orm import Mapped, mapped_column, relationship + +from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin + +if TYPE_CHECKING: + from bookbytes.models.audio_book import AudioBook + from bookbytes.models.job import Job + + +class AudioBookJob(UUIDPrimaryKeyMixin, TimestampMixin, Base): + """Relation between Job and AudioBook. + + Links a generic job to an audiobook. One job can be associated + with one audiobook (1:1 via unique constraint on job_id). 
+
+    Using CASCADE delete on both FKs:
+    - If job is deleted, this link is deleted
+    - If audiobook is deleted, this link is deleted
+
+    Attributes:
+        job_id: Foreign key to the generic job
+        audio_book_id: Foreign key to the audiobook being processed
+    """
+
+    __tablename__ = "audio_book_jobs"
+
+    job_id: Mapped[UUID] = mapped_column(
+        ForeignKey("jobs.id", ondelete="CASCADE"),
+        nullable=False,
+        unique=True,  # 1 job : 1 audiobook
+        index=True,
+    )
+    audio_book_id: Mapped[UUID] = mapped_column(
+        ForeignKey("audio_books.id", ondelete="CASCADE"),
+        nullable=False,
+        index=True,
+    )
+
+    # Relationships
+    job: Mapped[Job] = relationship("Job", lazy="joined")
+    audio_book: Mapped[AudioBook] = relationship("AudioBook", lazy="joined")
+
+    def __repr__(self) -> str:
+        return (
+            f"<AudioBookJob(id={self.id}, job_id={self.job_id}, "
+            f"audio_book_id={self.audio_book_id})>"
+        )
diff --git a/backend/src/bookbytes/models/base.py b/backend/src/bookbytes/models/base.py
new file mode 100644
index 0000000..7ae3ad4
--- /dev/null
+++ b/backend/src/bookbytes/models/base.py
@@ -0,0 +1,111 @@
+"""SQLAlchemy Base model and common mixins.
+
+This module provides:
+- Base: Declarative base for all models
+- UUIDPrimaryKeyMixin: UUIDv7 primary key for all entities (time-sortable)
+- TimestampMixin: created_at and updated_at columns
+- SoftDeleteMixin: deleted_at column for soft deletes
+
+Usage:
+    from bookbytes.models.base import Base, UUIDPrimaryKeyMixin, TimestampMixin
+
+    class MyModel(UUIDPrimaryKeyMixin, TimestampMixin, Base):
+        __tablename__ = "my_table"
+        name: Mapped[str]
+"""
+
+import uuid
+from datetime import datetime
+
+from sqlalchemy import DateTime, func
+from sqlalchemy.orm import DeclarativeBase, Mapped, declared_attr, mapped_column
+from uuid6 import uuid7
+
+
+class Base(DeclarativeBase):
+    """Base class for all SQLAlchemy models.
+
+    All models should inherit from this class to be included in migrations.
+    """
+
+    pass
+
+
+class UUIDPrimaryKeyMixin:
+    """Mixin that adds a UUIDv7 primary key column.
+ + All entities use UUIDv7 as primary key for: + - Time-sortable (monotonically increasing within millisecond) + - Better B-tree index performance (sequential inserts) + - Security (not guessable like auto-increment) + - Distributed ID generation (no central sequence) + - URL-safe identifiers + + Uses uuid6 library (RFC 9562 compliant) until Python 3.14 adds native support. + """ + + @declared_attr + def id(cls) -> Mapped[uuid.UUID]: + return mapped_column( + primary_key=True, + default=uuid7, + nullable=False, + ) + + +class TimestampMixin: + """Mixin that adds created_at and updated_at columns. + + - created_at: Set automatically on insert + - updated_at: Set automatically on insert and update + """ + + @declared_attr + def created_at(cls) -> Mapped[datetime]: + return mapped_column( + DateTime(timezone=True), + server_default=func.now(), + nullable=False, + ) + + @declared_attr + def updated_at(cls) -> Mapped[datetime]: + return mapped_column( + DateTime(timezone=True), + server_default=func.now(), + onupdate=func.now(), + nullable=False, + ) + + +class SoftDeleteMixin: + """Mixin that adds soft delete support via deleted_at timestamp. + + Instead of permanently deleting records, this marks them with a timestamp. + Queries should filter out soft-deleted records by default. 
+ + - deleted_at: Nullable timestamp, NULL means not deleted + """ + + @declared_attr + def deleted_at(cls) -> Mapped[datetime | None]: + return mapped_column( + DateTime(timezone=True), + nullable=True, + default=None, + index=True, # Index for efficient filtering + ) + + @property + def is_deleted(self) -> bool: + """Check if this entity has been soft-deleted.""" + return self.deleted_at is not None + + def mark_deleted(self) -> None: + """Mark this entity as soft-deleted.""" + from datetime import UTC, datetime + + self.deleted_at = datetime.now(UTC) + + def restore(self) -> None: + """Restore a soft-deleted entity.""" + self.deleted_at = None diff --git a/backend/src/bookbytes/models/book_provider.py b/backend/src/bookbytes/models/book_provider.py new file mode 100644 index 0000000..a6a5efb --- /dev/null +++ b/backend/src/bookbytes/models/book_provider.py @@ -0,0 +1,100 @@ +"""BookProvider model - maps internal entities to external provider IDs. + +This is a polymorphic table that can link either Works or Editions to +external providers like OpenLibrary, Google Books, etc. + +Benefits: +- Provider-agnostic core models (no openlibrary_key on Work/Edition) +- Easy to add new providers without migrations +- Same Work/Edition can have IDs from multiple providers +""" + +from __future__ import annotations + +from enum import Enum +from typing import TYPE_CHECKING +from uuid import UUID + +from sqlalchemy import JSON, ForeignKey, Index, String, UniqueConstraint +from sqlalchemy.orm import Mapped, mapped_column, relationship + +from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin + +if TYPE_CHECKING: + from bookbytes.models.edition import Edition + from bookbytes.models.work import Work + + +class BookProviderType(str, Enum): + """Supported external data providers.""" + + OPENLIBRARY = "openlibrary" + GOOGLE_BOOKS = "google_books" + # Future: AMAZON, GOODREADS, etc. 
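As a quick illustration of how the provider enum above might be used when the same Work or Edition has records from several providers: the precedence order and `pick_preferred` helper below are hypothetical sketches, not part of this PR, and plain dicts stand in for `BookProvider` rows.

```python
from __future__ import annotations

from enum import Enum


class BookProviderType(str, Enum):
    """Mirrors the enum above so this sketch is self-contained."""

    OPENLIBRARY = "openlibrary"
    GOOGLE_BOOKS = "google_books"


# Hypothetical precedence: prefer OpenLibrary metadata, fall back to Google Books.
PROVIDER_PRECEDENCE = [BookProviderType.OPENLIBRARY, BookProviderType.GOOGLE_BOOKS]


def pick_preferred(records: list[dict]) -> dict | None:
    """Return the highest-precedence provider record, or None if there are none.

    Each record is a plain-dict stand-in for a BookProvider row,
    e.g. {"provider": "openlibrary", "external_key": "/works/OL27448W"}.
    """
    by_provider = {record["provider"]: record for record in records}
    for provider in PROVIDER_PRECEDENCE:
        if provider.value in by_provider:
            return by_provider[provider.value]
    return None
```

If records from both providers exist, the OpenLibrary one wins; reordering `PROVIDER_PRECEDENCE` changes that without touching callers.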
+ + +class BookProvider(UUIDPrimaryKeyMixin, TimestampMixin, Base): + """Maps internal UUIDs to external provider IDs. + + This is a sparse/polymorphic table - either work_id OR edition_id is set, + never both. + + Attributes: + entity_type: "work" or "edition" - which table this maps to + entity_id: The UUID of the Work or Edition + provider: Provider name (e.g., "openlibrary") + external_key: Provider's ID (e.g., "/works/OL27448W") + provider_metadata: Optional JSON with extra provider-specific data + work_id: FK to works table (nullable, set when entity_type="work") + edition_id: FK to editions table (nullable, set when entity_type="edition") + """ + + __tablename__ = "book_providers" + + # Entity mapping + entity_type: Mapped[str] = mapped_column( + String(20), + nullable=False, + ) # "work" or "edition" + entity_id: Mapped[UUID] = mapped_column(nullable=False) + + # Provider info + provider: Mapped[str] = mapped_column(String(50), nullable=False, index=True) + external_key: Mapped[str] = mapped_column(String(200), nullable=False) + + # Optional provider-specific metadata + provider_metadata: Mapped[dict | None] = mapped_column(JSON, nullable=True) + + # Relationships (nullable - only one will be set based on entity_type) + work_id: Mapped[UUID | None] = mapped_column( + ForeignKey("works.id", ondelete="CASCADE"), + nullable=True, + ) + edition_id: Mapped[UUID | None] = mapped_column( + ForeignKey("editions.id", ondelete="CASCADE"), + nullable=True, + ) + + work: Mapped[Work | None] = relationship( + "Work", + back_populates="book_providers", + foreign_keys=[work_id], + ) + edition: Mapped[Edition | None] = relationship( + "Edition", + back_populates="book_providers", + foreign_keys=[edition_id], + ) + + __table_args__ = ( + # Each provider key should be unique globally + UniqueConstraint("provider", "external_key", name="uq_provider_external_key"), + # Index for looking up all providers for an entity + Index("ix_book_providers_entity_lookup", "entity_type", 
"entity_id"),
+    )
+
+    def __repr__(self) -> str:
+        return (
+            f"<BookProvider(id={self.id}, provider={self.provider}, "
+            f"entity_type={self.entity_type}, external_key={self.external_key})>"
+        )
diff --git a/backend/src/bookbytes/models/chapter.py b/backend/src/bookbytes/models/chapter.py
new file mode 100644
index 0000000..b62431e
--- /dev/null
+++ b/backend/src/bookbytes/models/chapter.py
@@ -0,0 +1,66 @@
+"""Chapter model - audio content for a book chapter.
+
+Chapters belong to an AudioBook and contain the generated summary
+and audio file references.
+"""
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+from uuid import UUID
+
+from sqlalchemy import ForeignKey, String, Text, UniqueConstraint
+from sqlalchemy.orm import Mapped, mapped_column, relationship
+
+from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin
+
+if TYPE_CHECKING:
+    from bookbytes.models.audio_book import AudioBook
+
+
+class Chapter(UUIDPrimaryKeyMixin, TimestampMixin, Base):
+    """A chapter within an audiobook.
+
+    Contains the generated summary text and audio file references.
+
+    Attributes:
+        audio_book_id: Foreign key to parent AudioBook
+        chapter_number: Sequential number within the book (1-indexed)
+        title: Chapter title
+        summary: Generated chapter summary text
+        audio_file_path: Local/S3 path to audio file
+        audio_url: Public URL for streaming
+        word_count: Number of words in summary
+        duration_seconds: Audio duration in seconds
+    """
+
+    __tablename__ = "chapters"
+
+    audio_book_id: Mapped[UUID] = mapped_column(
+        ForeignKey("audio_books.id", ondelete="CASCADE"),
+        nullable=False,
+        index=True,
+    )
+    chapter_number: Mapped[int] = mapped_column(nullable=False)
+    title: Mapped[str] = mapped_column(String(500), nullable=False)
+    summary: Mapped[str] = mapped_column(Text, nullable=False)
+    audio_file_path: Mapped[str | None] = mapped_column(String(500), nullable=True)
+    audio_url: Mapped[str | None] = mapped_column(String(1000), nullable=True)
+    word_count: Mapped[int | None] = mapped_column(nullable=True)
+    duration_seconds: Mapped[int | None] = 
mapped_column(nullable=True)
+
+    # Relationships
+    audio_book: Mapped[AudioBook] = relationship("AudioBook", back_populates="chapters")
+
+    __table_args__ = (
+        # Each chapter number must be unique within an audiobook
+        UniqueConstraint(
+            "audio_book_id", "chapter_number", name="uq_chapter_audio_book_number"
+        ),
+    )
+
+    def __repr__(self) -> str:
+        return (
+            f"<Chapter(id={self.id}, audio_book_id={self.audio_book_id}, "
+            f"number={self.chapter_number}, title={self.title!r})>"
+        )
diff --git a/backend/src/bookbytes/models/edition.py b/backend/src/bookbytes/models/edition.py
new file mode 100644
index 0000000..c59fc03
--- /dev/null
+++ b/backend/src/bookbytes/models/edition.py
@@ -0,0 +1,80 @@
+"""Edition model - represents a specific ISBN/format of a Work.
+
+A Work can have many Editions (hardcover, paperback, different years, languages).
+Each Edition has a unique ISBN and can have one AudioBook generated for it.
+"""
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+from uuid import UUID
+
+from sqlalchemy import ForeignKey, String
+from sqlalchemy.orm import Mapped, mapped_column, relationship
+
+from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin
+
+if TYPE_CHECKING:
+    from bookbytes.models.audio_book import AudioBook
+    from bookbytes.models.book_provider import BookProvider
+    from bookbytes.models.work import Work
+
+
+class Edition(UUIDPrimaryKeyMixin, TimestampMixin, Base):
+    """A specific edition of a book work.
+
+    Example: "The Lord of the Rings" hardcover 1954 edition with ISBN 0-618-00222-7.
+
+    Attributes:
+        work_id: Foreign key to parent Work
+        isbn: Normalized ISBN (10 or 13 digits, no dashes)
+        isbn_type: "isbn10" or "isbn13"
+        title: Edition-specific title (may differ from Work title)
+        publisher: Publisher name
+        publish_year: Year this edition was published
+        language: ISO 639-2 language code (default: "eng")
+        pages: Page count
+    """
+
+    __tablename__ = "editions"
+
+    work_id: Mapped[UUID] = mapped_column(
+        ForeignKey("works.id", ondelete="CASCADE"),
+        nullable=False,
+        index=True,
+    )
+    isbn: Mapped[str] = mapped_column(
+        String(13),
+        unique=True,
+        nullable=False,
+        index=True,
+    )
+    isbn_type: Mapped[str] = mapped_column(
+        String(10),
+        nullable=False,
+    )  # "isbn10" or "isbn13"
+    title: Mapped[str] = mapped_column(String(500), nullable=False)
+    publisher: Mapped[str | None] = mapped_column(String(200), nullable=True)
+    publish_year: Mapped[int | None] = mapped_column(nullable=True, index=True)
+    language: Mapped[str] = mapped_column(
+        String(3), nullable=False, default="eng"
+    )  # ISO 639-2/B (bibliographic) code - standard for MARC/ONIX publishing
+    pages: Mapped[int | None] = mapped_column(nullable=True)
+
+    # Relationships
+    work: Mapped[Work] = relationship("Work", back_populates="editions")
+    audio_book: Mapped[AudioBook | None] = relationship(
+        "AudioBook",
+        back_populates="edition",
+        uselist=False,
+        cascade="all, delete-orphan",
+    )
+    book_providers: Mapped[list[BookProvider]] = relationship(
+        "BookProvider",
+        back_populates="edition",
+        cascade="all, delete-orphan",
+        foreign_keys="BookProvider.edition_id",
+    )
+
+    def __repr__(self) -> str:
+        return f"<Edition(id={self.id}, isbn={self.isbn}, title={self.title!r})>"
diff --git a/backend/src/bookbytes/models/job.py b/backend/src/bookbytes/models/job.py
new file mode 100644
index 0000000..0562b4b
--- /dev/null
+++ b/backend/src/bookbytes/models/job.py
@@ -0,0 +1,114 @@
+"""Job model - generic background job tracking.
+
+Job is a fully generic table with no domain-specific columns.
+Domain entities link to jobs via relation tables (e.g., audio_book_jobs). +""" + +from __future__ import annotations + +from datetime import datetime +from enum import Enum + +from sqlalchemy import String +from sqlalchemy.orm import Mapped, mapped_column + +from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin + + +class JobStatus(str, Enum): + """Processing status for background jobs.""" + + PENDING = "pending" + PROCESSING = "processing" + COMPLETED = "completed" + FAILED = "failed" + + +class JobType(str, Enum): + """Types of background jobs. + + Extend this enum as new job types are added. + """ + + AUDIOBOOK_GENERATION = "audiobook_generation" + AUDIOBOOK_REFRESH = "audiobook_refresh" + + +class Job(UUIDPrimaryKeyMixin, TimestampMixin, Base): + """Generic background job tracking. + + This model is intentionally domain-agnostic. Domain entities + (e.g., AudioBook) link to jobs via relation tables. + + State Machine: + pending β†’ processing β†’ completed + β†˜ failed β†’ pending (retry) + + Attributes: + job_type: Type of job (e.g., audiobook_generation) + status: Current status (pending, processing, completed, failed) + progress: Completion percentage (0-100) + current_step: Human-readable current step description + error_message: Error details if status is failed + error_code: Machine-readable error code + version: Optimistic lock for concurrent access + worker_id: Identifier of worker processing this job + retry_count: Number of retry attempts + max_retries: Maximum allowed retries + started_at: When processing started + completed_at: When processing finished (success or failure) + """ + + __tablename__ = "jobs" + + # === Core (Generic) === + job_type: Mapped[str] = mapped_column( + String(50), + nullable=False, + index=True, + ) + status: Mapped[str] = mapped_column( + String(20), + nullable=False, + default=JobStatus.PENDING.value, + index=True, + ) + + # === Progress Tracking === + progress: Mapped[int] = 
mapped_column(nullable=False, default=0) + current_step: Mapped[str | None] = mapped_column(String(100), nullable=True) + + # === Error Handling === + error_message: Mapped[str | None] = mapped_column(String(2000), nullable=True) + error_code: Mapped[str | None] = mapped_column(String(50), nullable=True) + + # === Concurrency Control === + version: Mapped[int] = mapped_column(nullable=False, default=1) + worker_id: Mapped[str | None] = mapped_column(String(100), nullable=True) + + # === Retry Tracking === + retry_count: Mapped[int] = mapped_column(nullable=False, default=0) + max_retries: Mapped[int] = mapped_column(nullable=False, default=3) + + # === Timestamps === + started_at: Mapped[datetime | None] = mapped_column(nullable=True) + completed_at: Mapped[datetime | None] = mapped_column(nullable=True) + + def __repr__(self) -> str: + return ( + f"<Job(id={self.id}, job_type={self.job_type!r}, " + f"status={self.status!r}, progress={self.progress})>" + ) + + @property + def is_terminal(self) -> bool: + """Check if job is in a terminal state (completed or failed).""" + return self.status in (JobStatus.COMPLETED.value, JobStatus.FAILED.value) + + @property + def can_retry(self) -> bool: + """Check if job can be retried.""" + return ( + self.status == JobStatus.FAILED.value + and self.retry_count < self.max_retries + ) diff --git a/backend/src/bookbytes/models/work.py b/backend/src/bookbytes/models/work.py new file mode 100644 index 0000000..92c79ea --- /dev/null +++ b/backend/src/bookbytes/models/work.py @@ -0,0 +1,61 @@ +"""Work model - represents a book across all editions. + +This is the top-level entity in our library hierarchy: +Work -> Edition -> AudioBook -> Chapter + +Works are provider-agnostic - external provider IDs are stored +in the BookProvider table for decoupling.
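The Job state machine documented above (pending to processing to completed, with failed back to pending on retry) can be sketched standalone. The names below are illustrative only and are independent of the ORM model:

```python
# Stdlib-only sketch of the Job state machine; hypothetical names,
# not the project's actual API.
from enum import Enum


class Status(str, Enum):
    PENDING = "pending"
    PROCESSING = "processing"
    COMPLETED = "completed"
    FAILED = "failed"


# Allowed transitions; FAILED -> PENDING models a retry.
TRANSITIONS: dict[Status, set[Status]] = {
    Status.PENDING: {Status.PROCESSING},
    Status.PROCESSING: {Status.COMPLETED, Status.FAILED},
    Status.FAILED: {Status.PENDING},
    Status.COMPLETED: set(),
}


def can_transition(current: Status, target: Status) -> bool:
    return target in TRANSITIONS[current]


def can_retry(status: Status, retry_count: int, max_retries: int = 3) -> bool:
    # Mirrors Job.can_retry: only failed jobs below the retry cap.
    return status is Status.FAILED and retry_count < max_retries
```

Keeping the transition table explicit makes illegal moves (e.g., completed back to pending) easy to reject in one place.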
+""" + +from __future__ import annotations + +from typing import TYPE_CHECKING + +from sqlalchemy import JSON, String +from sqlalchemy.orm import Mapped, mapped_column, relationship + +from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin + +if TYPE_CHECKING: + from bookbytes.models.book_provider import BookProvider + from bookbytes.models.edition import Edition + + +class Work(UUIDPrimaryKeyMixin, TimestampMixin, Base): + """A book work spanning all editions. + + Example: "The Lord of the Rings" is one Work, but may have many Editions + (hardcover 1954, paperback 1965, anniversary edition 2004, etc.). + + Attributes: + title: Primary title of the work + authors: List of author names (JSON array) + first_publish_year: Year of first publication (if known) + subjects: Topic/genre classifications (JSON array) + cover_url: URL to cover image (if available) + """ + + __tablename__ = "works" + + title: Mapped[str] = mapped_column(String(500), nullable=False, index=True) + authors: Mapped[list[str]] = mapped_column(JSON, nullable=False, default=list) + first_publish_year: Mapped[int | None] = mapped_column(nullable=True) + subjects: Mapped[list[str] | None] = mapped_column(JSON, nullable=True) + cover_url: Mapped[str | None] = mapped_column(String(1000), nullable=True) + + # Relationships + editions: Mapped[list[Edition]] = relationship( + "Edition", + back_populates="work", + cascade="all, delete-orphan", + lazy="selectin", + ) + book_providers: Mapped[list[BookProvider]] = relationship( + "BookProvider", + back_populates="work", + cascade="all, delete-orphan", + foreign_keys="BookProvider.work_id", + ) + + def __repr__(self) -> str: + return f"<Work(id={self.id}, title={self.title!r})>" diff --git a/backend/src/bookbytes/repositories/__init__.py b/backend/src/bookbytes/repositories/__init__.py new file mode 100644 index 0000000..01e96c0 --- /dev/null +++ b/backend/src/bookbytes/repositories/__init__.py @@ -0,0 +1,28 @@ +"""Repository pattern package for BookBytes.
+ +This module exports base repository classes and concrete repositories. +""" + +from bookbytes.repositories.audio_book import AudioBookRepository +from bookbytes.repositories.audio_book_job import AudioBookJobRepository +from bookbytes.repositories.base import BaseRepository, SoftDeleteRepository +from bookbytes.repositories.book_provider import BookProviderRepository +from bookbytes.repositories.chapter import ChapterRepository +from bookbytes.repositories.edition import EditionRepository +from bookbytes.repositories.job import JobRepository +from bookbytes.repositories.work import WorkRepository + +__all__ = [ + # Base + "BaseRepository", + "SoftDeleteRepository", + # Phase 3: Audio Books Library + "WorkRepository", + "EditionRepository", + "BookProviderRepository", + "AudioBookRepository", + "ChapterRepository", + # Phase 3.1: Audio Books Pipeline + "JobRepository", + "AudioBookJobRepository", +] diff --git a/backend/src/bookbytes/repositories/audio_book.py b/backend/src/bookbytes/repositories/audio_book.py new file mode 100644 index 0000000..320fc44 --- /dev/null +++ b/backend/src/bookbytes/repositories/audio_book.py @@ -0,0 +1,151 @@ +"""AudioBookRepository for managing AudioBook entities. + +Uses SoftDeleteRepository for soft delete support - audiobooks are never +permanently deleted to preserve history. +""" + +from uuid import UUID + +from sqlalchemy import select + +from bookbytes.models.audio_book import AudioBook, AudioBookStatus +from bookbytes.repositories.base import SoftDeleteRepository + + +class AudioBookRepository(SoftDeleteRepository[AudioBook]): + """Repository for AudioBook entities with soft delete support.""" + + async def get_by_edition( + self, + edition_id: UUID, + *, + include_deleted: bool = False, + ) -> AudioBook | None: + """Find an audiobook for a specific edition. 
+ + Args: + edition_id: Edition UUID + include_deleted: If True, include soft-deleted audiobooks + + Returns: + AudioBook if found, None otherwise + """ + query = select(AudioBook).where(AudioBook.edition_id == edition_id) + + if not include_deleted: + query = query.where(AudioBook.deleted_at.is_(None)) + + result = await self.session.execute(query) + return result.scalar_one_or_none() + + async def get_by_status( + self, + status: AudioBookStatus, + *, + offset: int = 0, + limit: int = 100, + include_deleted: bool = False, + ) -> list[AudioBook]: + """Get audiobooks by processing status. + + Args: + status: Processing status to filter by + offset: Pagination offset + limit: Maximum results + include_deleted: If True, include soft-deleted audiobooks + + Returns: + List of audiobooks with the given status + """ + query = select(AudioBook).where(AudioBook.status == status.value) + + if not include_deleted: + query = query.where(AudioBook.deleted_at.is_(None)) + + result = await self.session.execute(query.offset(offset).limit(limit)) + return list(result.scalars().all()) + + async def get_pending(self, *, limit: int = 100) -> list[AudioBook]: + """Get audiobooks pending processing. + + Args: + limit: Maximum results + + Returns: + List of pending audiobooks + """ + return await self.get_by_status( + AudioBookStatus.PENDING, + limit=limit, + ) + + async def get_processing(self, *, limit: int = 100) -> list[AudioBook]: + """Get audiobooks currently being processed. + + Args: + limit: Maximum results + + Returns: + List of processing audiobooks + """ + return await self.get_by_status( + AudioBookStatus.PROCESSING, + limit=limit, + ) + + async def mark_processing(self, audiobook: AudioBook) -> AudioBook: + """Mark an audiobook as processing. 
+ + Args: + audiobook: AudioBook to update + + Returns: + Updated audiobook + """ + audiobook.status = AudioBookStatus.PROCESSING.value + return await self.update(audiobook) + + async def mark_completed(self, audiobook: AudioBook) -> AudioBook: + """Mark an audiobook as completed. + + Args: + audiobook: AudioBook to update + + Returns: + Updated audiobook + """ + audiobook.status = AudioBookStatus.COMPLETED.value + audiobook.error_message = None + return await self.update(audiobook) + + async def mark_failed( + self, + audiobook: AudioBook, + error_message: str, + ) -> AudioBook: + """Mark an audiobook as failed with error message. + + Args: + audiobook: AudioBook to update + error_message: Error details + + Returns: + Updated audiobook + """ + audiobook.status = AudioBookStatus.FAILED.value + audiobook.error_message = error_message + return await self.update(audiobook) + + async def increment_version(self, audiobook: AudioBook) -> AudioBook: + """Increment audiobook version for refresh/regeneration. + + Args: + audiobook: AudioBook to update + + Returns: + Updated audiobook with incremented version + """ + audiobook.version += 1 + audiobook.status = AudioBookStatus.PENDING.value + audiobook.error_message = None + return await self.update(audiobook) diff --git a/backend/src/bookbytes/repositories/audio_book_job.py b/backend/src/bookbytes/repositories/audio_book_job.py new file mode 100644 index 0000000..055da2e --- /dev/null +++ b/backend/src/bookbytes/repositories/audio_book_job.py @@ -0,0 +1,106 @@ +"""AudioBookJob repository for job-audiobook relations. + +Provides operations to link jobs to audiobooks and query jobs +for a specific audiobook. 
+""" + +from __future__ import annotations + +from uuid import UUID + +from sqlalchemy import select +from sqlalchemy.orm import joinedload + +from bookbytes.models.audio_book_job import AudioBookJob +from bookbytes.models.job import Job +from bookbytes.repositories.base import BaseRepository + + +class AudioBookJobRepository(BaseRepository[AudioBookJob]): + """Repository for AudioBookJob relation model. + + Manages the link between generic jobs and audiobooks. + """ + + async def create_link( + self, + job_id: UUID, + audio_book_id: UUID, + ) -> AudioBookJob: + """Create a link between a job and an audiobook. + + Args: + job_id: The job's UUID + audio_book_id: The audiobook's UUID + + Returns: + The created AudioBookJob link + """ + link = AudioBookJob(job_id=job_id, audio_book_id=audio_book_id) + self.session.add(link) + await self.session.commit() + await self.session.refresh(link) + return link + + async def get_by_job_id(self, job_id: UUID) -> AudioBookJob | None: + """Get the link for a specific job. + + Args: + job_id: The job's UUID + + Returns: + The link if found, None otherwise + """ + query = ( + select(AudioBookJob) + .where(AudioBookJob.job_id == job_id) + .options(joinedload(AudioBookJob.audio_book)) + ) + result = await self.session.execute(query) + return result.scalar_one_or_none() + + async def get_jobs_for_audiobook( + self, + audio_book_id: UUID, + limit: int = 50, + ) -> list[Job]: + """Get all jobs associated with an audiobook. 
+ + Args: + audio_book_id: The audiobook's UUID + limit: Maximum number of results + + Returns: + List of jobs for this audiobook, newest first + """ + query = ( + select(Job) + .join(AudioBookJob, AudioBookJob.job_id == Job.id) + .where(AudioBookJob.audio_book_id == audio_book_id) + .order_by(Job.created_at.desc()) + .limit(limit) + ) + result = await self.session.execute(query) + return list(result.scalars().all()) + + async def get_latest_job_for_audiobook( + self, + audio_book_id: UUID, + ) -> Job | None: + """Get the most recent job for an audiobook. + + Args: + audio_book_id: The audiobook's UUID + + Returns: + The latest job, or None if no jobs exist + """ + query = ( + select(Job) + .join(AudioBookJob, AudioBookJob.job_id == Job.id) + .where(AudioBookJob.audio_book_id == audio_book_id) + .order_by(Job.created_at.desc()) + .limit(1) + ) + result = await self.session.execute(query) + return result.scalar_one_or_none() diff --git a/backend/src/bookbytes/repositories/base.py b/backend/src/bookbytes/repositories/base.py new file mode 100644 index 0000000..f34b879 --- /dev/null +++ b/backend/src/bookbytes/repositories/base.py @@ -0,0 +1,381 @@ +"""Generic base repository with async CRUD operations and soft delete support. 
+ +This module provides a generic repository pattern for SQLAlchemy models: +- BaseRepository[T]: Generic class for standard CRUD operations +- SoftDeleteRepository[T]: Repository with soft delete support +- All methods are async and use SQLAlchemy 2.0 style + +Usage: + from bookbytes.repositories.base import BaseRepository, SoftDeleteRepository + from bookbytes.models.book import Book + + # For models without soft delete + class JobRepository(BaseRepository[Job]): + pass + + # For models with SoftDeleteMixin + class BookRepository(SoftDeleteRepository[Book]): + pass + + # Usage + repo = BookRepository(session) + book = await repo.get_by_id(book_id) + books = await repo.get_all(offset=0, limit=10) + await repo.soft_delete(book) # Sets deleted_at instead of removing +""" + +from datetime import UTC, datetime +from typing import Any, Generic, TypeVar +from uuid import UUID + +from sqlalchemy import func, select +from sqlalchemy.ext.asyncio import AsyncSession + +from bookbytes.models.base import Base, SoftDeleteMixin + +# Type variable for model classes +T = TypeVar("T", bound=Base) + +# Type variable for soft-deletable models +S = TypeVar("S", bound=SoftDeleteMixin) + + +class BaseRepository(Generic[T]): + """Generic repository providing async CRUD operations. + + This base class provides standard database operations that can be + inherited by specific repositories. It uses SQLAlchemy 2.0 style + async queries. + + Type Parameters: + T: The SQLAlchemy model class + + Attributes: + session: The async database session + model_class: The model class for this repository + """ + + model_class: type[T] + + def __init__(self, session: AsyncSession) -> None: + """Initialize the repository with a database session. 
+ + Args: + session: Async database session + """ + self.session = session + + def __init_subclass__(cls, **kwargs: Any) -> None: + """Extract model class from Generic type parameter.""" + super().__init_subclass__(**kwargs) + # Get the model class from the type parameter + for base in cls.__orig_bases__: # type: ignore[attr-defined] + if hasattr(base, "__args__"): + cls.model_class = base.__args__[0] + break + + async def get_by_id(self, id: UUID) -> T | None: + """Get a single entity by its UUID. + + Args: + id: The entity's UUID + + Returns: + The entity if found, None otherwise + """ + result = await self.session.execute( + select(self.model_class).where(self.model_class.id == id) + ) + return result.scalar_one_or_none() + + async def get_all( + self, + *, + offset: int = 0, + limit: int = 100, + ) -> list[T]: + """Get all entities with pagination. + + Args: + offset: Number of records to skip + limit: Maximum records to return + + Returns: + List of entities + """ + result = await self.session.execute( + select(self.model_class).offset(offset).limit(limit) + ) + return list(result.scalars().all()) + + async def count(self) -> int: + """Count total entities. + + Returns: + Total count + """ + result = await self.session.execute( + select(func.count()).select_from(self.model_class) + ) + return result.scalar_one() + + async def create(self, entity: T) -> T: + """Create a new entity. + + Args: + entity: The entity to create + + Returns: + The created entity with generated fields (id, timestamps) + """ + self.session.add(entity) + await self.session.flush() + await self.session.refresh(entity) + return entity + + async def create_many(self, entities: list[T]) -> list[T]: + """Create multiple entities in bulk. 
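The `__init_subclass__` hook above recovers the model class from the `Generic` type parameter via `__orig_bases__`. This works with plain `typing.Generic` and needs no SQLAlchemy; a minimal standalone sketch (class names are hypothetical):

```python
# Sketch of extracting the Generic type argument in __init_subclass__,
# as BaseRepository does; illustrative names only.
from typing import Any, Generic, TypeVar

T = TypeVar("T")


class Repo(Generic[T]):
    model_class: type

    def __init_subclass__(cls, **kwargs: Any) -> None:
        super().__init_subclass__(**kwargs)
        # __orig_bases__ preserves the parameterized base, e.g. Repo[Widget].
        for base in getattr(cls, "__orig_bases__", ()):
            args = getattr(base, "__args__", None)
            if args:
                cls.model_class = args[0]
                break


class Widget: ...


class WidgetRepo(Repo[Widget]):
    pass  # WidgetRepo.model_class is now Widget, with no boilerplate
```

The payoff is that concrete repositories declare only the type parameter; the class attribute is filled in automatically at class-creation time.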
+ + Args: + entities: List of entities to create + + Returns: + List of created entities + """ + self.session.add_all(entities) + await self.session.flush() + for entity in entities: + await self.session.refresh(entity) + return entities + + async def update(self, entity: T) -> T: + """Update an existing entity. + + Args: + entity: The entity with updated fields + + Returns: + The updated entity + """ + await self.session.flush() + await self.session.refresh(entity) + return entity + + async def hard_delete(self, entity: T) -> None: + """Permanently delete an entity from the database. + + WARNING: This is a hard delete. For soft delete, use SoftDeleteRepository. + + Args: + entity: The entity to permanently delete + """ + await self.session.delete(entity) + await self.session.flush() + + async def hard_delete_by_id(self, id: UUID) -> bool: + """Permanently delete an entity by its UUID. + + WARNING: This is a hard delete. For soft delete, use SoftDeleteRepository. + + Args: + id: The entity's UUID + + Returns: + True if entity was deleted, False if not found + """ + entity = await self.get_by_id(id) + if entity is None: + return False + await self.hard_delete(entity) + return True + + async def exists(self, id: UUID) -> bool: + """Check if an entity exists. + + Args: + id: The entity's UUID + + Returns: + True if exists, False otherwise + """ + result = await self.session.execute( + select(func.count()) + .select_from(self.model_class) + .where(self.model_class.id == id) + ) + return result.scalar_one() > 0 + + +class SoftDeleteRepository(BaseRepository[T], Generic[T]): + """Repository with soft delete support. + + This repository is for models that include SoftDeleteMixin. + Instead of permanently deleting records, it sets the deleted_at timestamp. + By default, queries exclude soft-deleted records. 
+ + Type Parameters: + T: The SQLAlchemy model class (must include SoftDeleteMixin) + """ + + async def get_by_id(self, id: UUID, *, include_deleted: bool = False) -> T | None: + """Get a single entity by its UUID. + + Args: + id: The entity's UUID + include_deleted: If True, include soft-deleted entities + + Returns: + The entity if found, None otherwise + """ + query = select(self.model_class).where(self.model_class.id == id) + + if not include_deleted: + query = query.where(self.model_class.deleted_at.is_(None)) + + result = await self.session.execute(query) + return result.scalar_one_or_none() + + async def get_all( + self, + *, + offset: int = 0, + limit: int = 100, + include_deleted: bool = False, + ) -> list[T]: + """Get all entities with pagination. + + Args: + offset: Number of records to skip + limit: Maximum records to return + include_deleted: If True, include soft-deleted entities + + Returns: + List of entities + """ + query = select(self.model_class) + + if not include_deleted: + query = query.where(self.model_class.deleted_at.is_(None)) + + result = await self.session.execute(query.offset(offset).limit(limit)) + return list(result.scalars().all()) + + async def count(self, *, include_deleted: bool = False) -> int: + """Count total entities. + + Args: + include_deleted: If True, include soft-deleted entities + + Returns: + Total count + """ + query = select(func.count()).select_from(self.model_class) + + if not include_deleted: + query = query.where(self.model_class.deleted_at.is_(None)) + + result = await self.session.execute(query) + return result.scalar_one() + + async def soft_delete(self, entity: T) -> T: + """Soft delete an entity by setting deleted_at timestamp. 
+ + Args: + entity: The entity to soft delete + + Returns: + The soft-deleted entity + """ + entity.deleted_at = datetime.now(UTC) # type: ignore[attr-defined] + await self.session.flush() + await self.session.refresh(entity) + return entity + + async def soft_delete_by_id(self, id: UUID) -> bool: + """Soft delete an entity by its UUID. + + Args: + id: The entity's UUID + + Returns: + True if entity was soft-deleted, False if not found + """ + entity = await self.get_by_id(id, include_deleted=False) + if entity is None: + return False + await self.soft_delete(entity) + return True + + async def restore(self, entity: T) -> T: + """Restore a soft-deleted entity. + + Args: + entity: The entity to restore + + Returns: + The restored entity + """ + entity.deleted_at = None # type: ignore[attr-defined] + await self.session.flush() + await self.session.refresh(entity) + return entity + + async def restore_by_id(self, id: UUID) -> bool: + """Restore a soft-deleted entity by its UUID. + + Args: + id: The entity's UUID + + Returns: + True if entity was restored, False if not found + """ + entity = await self.get_by_id(id, include_deleted=True) + if entity is None or entity.deleted_at is None: # type: ignore[attr-defined] + return False + await self.restore(entity) + return True + + async def exists(self, id: UUID, *, include_deleted: bool = False) -> bool: + """Check if an entity exists. + + Args: + id: The entity's UUID + include_deleted: If True, include soft-deleted entities + + Returns: + True if exists, False otherwise + """ + query = ( + select(func.count()) + .select_from(self.model_class) + .where(self.model_class.id == id) + ) + + if not include_deleted: + query = query.where(self.model_class.deleted_at.is_(None)) + + result = await self.session.execute(query) + return result.scalar_one() > 0 + + # Alias for backwards compatibility and explicit naming + async def delete(self, entity: T) -> T: + """Soft delete an entity (alias for soft_delete). 
+ + Args: + entity: The entity to soft delete + + Returns: + The soft-deleted entity + """ + return await self.soft_delete(entity) + + async def delete_by_id(self, id: UUID) -> bool: + """Soft delete an entity by ID (alias for soft_delete_by_id). + + Args: + id: The entity's UUID + + Returns: + True if soft-deleted, False if not found + """ + return await self.soft_delete_by_id(id) diff --git a/backend/src/bookbytes/repositories/book_provider.py b/backend/src/bookbytes/repositories/book_provider.py new file mode 100644 index 0000000..766d37f --- /dev/null +++ b/backend/src/bookbytes/repositories/book_provider.py @@ -0,0 +1,149 @@ +"""BookProviderRepository for managing provider mappings. + +Provides CRUD operations and specialized queries for BookProvider entities. +This is a polymorphic table that maps our internal UUIDs to external provider IDs. +""" + +from uuid import UUID + +from sqlalchemy import select + +from bookbytes.models.book_provider import BookProvider +from bookbytes.repositories.base import BaseRepository + + +class BookProviderRepository(BaseRepository[BookProvider]): + """Repository for BookProvider (external ID mapping) entities.""" + + async def get_by_provider_key( + self, + provider: str, + external_key: str, + ) -> BookProvider | None: + """Find a mapping by provider and external key. + + Args: + provider: Provider name (e.g., "openlibrary") + external_key: External ID (e.g., "/works/OL27448W") + + Returns: + BookProvider mapping if found, None otherwise + """ + result = await self.session.execute( + select(BookProvider) + .where(BookProvider.provider == provider) + .where(BookProvider.external_key == external_key) + ) + return result.scalar_one_or_none() + + async def get_for_work(self, work_id: UUID) -> list[BookProvider]: + """Get all provider mappings for a work. 
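The soft-delete convention `SoftDeleteRepository` implements (set `deleted_at` instead of removing the row, and filter it out unless `include_deleted=True`) reduces to this stdlib-only sketch; the `Row` type here is illustrative, not the project's model:

```python
# Sketch of soft-delete semantics: hidden by default, never removed.
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Row:
    id: int
    deleted_at: datetime | None = None


def visible(rows: list[Row], *, include_deleted: bool = False) -> list[Row]:
    # Default queries exclude rows whose deleted_at is set.
    if include_deleted:
        return list(rows)
    return [r for r in rows if r.deleted_at is None]


def soft_delete(row: Row) -> Row:
    # Timestamp instead of deletion preserves history and allows restore.
    row.deleted_at = datetime.now(timezone.utc)
    return row


def restore(row: Row) -> Row:
    row.deleted_at = None
    return row
```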
+ + Args: + work_id: Work UUID + + Returns: + List of provider mappings + """ + result = await self.session.execute( + select(BookProvider) + .where(BookProvider.entity_type == "work") + .where(BookProvider.work_id == work_id) + ) + return list(result.scalars().all()) + + async def get_for_edition(self, edition_id: UUID) -> list[BookProvider]: + """Get all provider mappings for an edition. + + Args: + edition_id: Edition UUID + + Returns: + List of provider mappings + """ + result = await self.session.execute( + select(BookProvider) + .where(BookProvider.entity_type == "edition") + .where(BookProvider.edition_id == edition_id) + ) + return list(result.scalars().all()) + + async def create_work_mapping( + self, + work_id: UUID, + provider: str, + external_key: str, + provider_metadata: dict | None = None, + ) -> BookProvider: + """Create a provider mapping for a work. + + Args: + work_id: Work UUID + provider: Provider name + external_key: External ID + provider_metadata: Optional extra provider data + + Returns: + Created BookProvider mapping + """ + mapping = BookProvider( + entity_type="work", + entity_id=work_id, + work_id=work_id, + edition_id=None, + provider=provider, + external_key=external_key, + provider_metadata=provider_metadata, + ) + return await self.create(mapping) + + async def create_edition_mapping( + self, + edition_id: UUID, + provider: str, + external_key: str, + provider_metadata: dict | None = None, + ) -> BookProvider: + """Create a provider mapping for an edition. 
+ + Args: + edition_id: Edition UUID + provider: Provider name + external_key: External ID + provider_metadata: Optional extra provider data + + Returns: + Created BookProvider mapping + """ + mapping = BookProvider( + entity_type="edition", + entity_id=edition_id, + work_id=None, + edition_id=edition_id, + provider=provider, + external_key=external_key, + provider_metadata=provider_metadata, + ) + return await self.create(mapping) + + async def provider_key_exists( + self, + provider: str, + external_key: str, + ) -> bool: + """Check if a provider key already exists. + + Args: + provider: Provider name + external_key: External ID + + Returns: + True if exists + """ + result = await self.session.execute( + select(BookProvider.id) + .where(BookProvider.provider == provider) + .where(BookProvider.external_key == external_key) + .limit(1) + ) + return result.scalar_one_or_none() is not None diff --git a/backend/src/bookbytes/repositories/chapter.py b/backend/src/bookbytes/repositories/chapter.py new file mode 100644 index 0000000..6beb10f --- /dev/null +++ b/backend/src/bookbytes/repositories/chapter.py @@ -0,0 +1,115 @@ +"""ChapterRepository for managing Chapter entities. + +Provides CRUD operations and specialized queries for Chapter entities. +""" + +from uuid import UUID + +from sqlalchemy import select + +from bookbytes.models.chapter import Chapter +from bookbytes.repositories.base import BaseRepository + + +class ChapterRepository(BaseRepository[Chapter]): + """Repository for Chapter entities.""" + + async def get_by_audio_book( + self, + audio_book_id: UUID, + *, + offset: int = 0, + limit: int = 100, + ) -> list[Chapter]: + """Get all chapters for an audiobook, ordered by chapter number. 
+ + Args: + audio_book_id: AudioBook UUID + offset: Pagination offset + limit: Maximum results + + Returns: + List of chapters ordered by chapter_number + """ + result = await self.session.execute( + select(Chapter) + .where(Chapter.audio_book_id == audio_book_id) + .order_by(Chapter.chapter_number) + .offset(offset) + .limit(limit) + ) + return list(result.scalars().all()) + + async def get_by_number( + self, + audio_book_id: UUID, + chapter_number: int, + ) -> Chapter | None: + """Get a specific chapter by audiobook and chapter number. + + Args: + audio_book_id: AudioBook UUID + chapter_number: Chapter number (1-indexed) + + Returns: + Chapter if found, None otherwise + """ + result = await self.session.execute( + select(Chapter) + .where(Chapter.audio_book_id == audio_book_id) + .where(Chapter.chapter_number == chapter_number) + ) + return result.scalar_one_or_none() + + async def count_by_audio_book(self, audio_book_id: UUID) -> int: + """Count chapters for an audiobook. + + Args: + audio_book_id: AudioBook UUID + + Returns: + Number of chapters + """ + from sqlalchemy import func + + result = await self.session.execute( + select(func.count()) + .select_from(Chapter) + .where(Chapter.audio_book_id == audio_book_id) + ) + return result.scalar_one() + + async def get_total_duration(self, audio_book_id: UUID) -> int: + """Get total duration of all chapters in seconds. + + Args: + audio_book_id: AudioBook UUID + + Returns: + Total duration in seconds + """ + from sqlalchemy import func + + result = await self.session.execute( + select(func.coalesce(func.sum(Chapter.duration_seconds), 0)).where( + Chapter.audio_book_id == audio_book_id + ) + ) + return result.scalar_one() + + async def delete_all_for_audio_book(self, audio_book_id: UUID) -> int: + """Delete all chapters for an audiobook (for regeneration). 
+ + Args: + audio_book_id: AudioBook UUID + + Returns: + Number of chapters deleted + """ + from sqlalchemy import delete + + result = await self.session.execute( + delete(Chapter).where(Chapter.audio_book_id == audio_book_id) + ) + await self.session.flush() + return result.rowcount diff --git a/backend/src/bookbytes/repositories/edition.py b/backend/src/bookbytes/repositories/edition.py new file mode 100644 index 0000000..6c2f43f --- /dev/null +++ b/backend/src/bookbytes/repositories/edition.py @@ -0,0 +1,90 @@ +"""EditionRepository for managing Edition entities. + +Provides CRUD operations and specialized queries for Edition entities. +""" + +from uuid import UUID + +from sqlalchemy import select + +from bookbytes.models.edition import Edition +from bookbytes.repositories.base import BaseRepository + + +class EditionRepository(BaseRepository[Edition]): + """Repository for Edition entities.""" + + async def get_by_isbn(self, isbn: str) -> Edition | None: + """Find an edition by its normalized ISBN. + + Args: + isbn: Normalized ISBN (10 or 13 digits, no dashes) + + Returns: + Edition if found, None otherwise + """ + result = await self.session.execute(select(Edition).where(Edition.isbn == isbn)) + return result.scalar_one_or_none() + + async def get_by_work_id( + self, + work_id: UUID, + *, + offset: int = 0, + limit: int = 100, + ) -> list[Edition]: + """Get all editions for a work. + + Args: + work_id: Work UUID + offset: Pagination offset + limit: Maximum results + + Returns: + List of editions + """ + result = await self.session.execute( + select(Edition) + .where(Edition.work_id == work_id) + .order_by(Edition.publish_year.desc().nulls_last()) + .offset(offset) + .limit(limit) + ) + return list(result.scalars().all()) + + async def get_latest_by_work( + self, + work_id: UUID, + language: str = "eng", + ) -> Edition | None: + """Get the latest edition of a work by publish year. 
+ + Args: + work_id: Work UUID + language: ISO 639-2/B language code (default: "eng") + + Returns: + Latest edition or None + """ + result = await self.session.execute( + select(Edition) + .where(Edition.work_id == work_id) + .where(Edition.language == language) + .order_by(Edition.publish_year.desc().nulls_last()) + .limit(1) + ) + return result.scalar_one_or_none() + + async def isbn_exists(self, isbn: str) -> bool: + """Check if an ISBN already exists in the library. + + Args: + isbn: Normalized ISBN + + Returns: + True if exists + """ + result = await self.session.execute( + select(Edition.id).where(Edition.isbn == isbn).limit(1) + ) + return result.scalar_one_or_none() is not None diff --git a/backend/src/bookbytes/repositories/job.py b/backend/src/bookbytes/repositories/job.py new file mode 100644 index 0000000..4f75650 --- /dev/null +++ b/backend/src/bookbytes/repositories/job.py @@ -0,0 +1,253 @@ +"""Job repository with worker-safe operations. + +Provides CRUD operations for jobs with atomic claim, progress updates, +and status transitions. Uses optimistic locking for concurrency control. +""" + +from __future__ import annotations + +from datetime import UTC, datetime +from uuid import UUID + +from sqlalchemy import func, select, update + +from bookbytes.models.job import Job, JobStatus, JobType +from bookbytes.repositories.base import BaseRepository + + +class JobRepository(BaseRepository[Job]): + """Repository for Job model with worker-safe operations. + + Provides atomic operations for job claiming and status updates + to ensure safe concurrent access from multiple workers. + """ + + async def claim_next( + self, + worker_id: str, + job_type: str | None = None, + ) -> Job | None: + """Atomically claim the next pending job. + + Uses optimistic locking via version column to prevent + race conditions when multiple workers try to claim jobs. 
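`get_by_isbn` and `isbn_exists` assume the caller passes an already-normalized ISBN (10 or 13 characters, no dashes), matching the Edition column. A hedged sketch of such a normalizer; this is an assumption about the stored format, not the project's actual normalization code:

```python
# Illustrative ISBN normalizer producing the (isbn, isbn_type) pair
# the Edition model stores; checksum validation is intentionally omitted.
def normalize_isbn(raw: str) -> tuple[str, str]:
    """Return (isbn, isbn_type); raise ValueError on a bad length."""
    isbn = raw.replace("-", "").replace(" ", "").upper()
    if len(isbn) == 10:
        return isbn, "isbn10"
    if len(isbn) == 13:
        return isbn, "isbn13"
    raise ValueError(f"Not an ISBN-10/13: {raw!r}")
```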
+ + Args: + worker_id: Identifier of the claiming worker + job_type: Optional filter by job type + + Returns: + The claimed job, or None if no jobs available + """ + # Find oldest pending job + query = ( + select(Job) + .where(Job.status == JobStatus.PENDING.value) + .order_by(Job.created_at) + .limit(1) + ) + + if job_type: + query = query.where(Job.job_type == job_type) + + result = await self.session.execute(query) + job = result.scalar_one_or_none() + + if not job: + return None + + # Atomically claim with optimistic lock + stmt = ( + update(Job) + .where(Job.id == job.id) + .where(Job.version == job.version) # Optimistic lock + .values( + status=JobStatus.PROCESSING.value, + worker_id=worker_id, + started_at=datetime.now(UTC), + version=job.version + 1, + ) + .returning(Job) + ) + + result = await self.session.execute(stmt) + claimed = result.scalar_one_or_none() + + if claimed: + await self.session.commit() + + return claimed + + async def update_progress( + self, + job_id: UUID, + progress: int, + current_step: str | None = None, + ) -> bool: + """Update job progress. + + Args: + job_id: The job's UUID + progress: Progress percentage (0-100) + current_step: Optional human-readable step description + + Returns: + True if update succeeded, False if job not found + """ + values: dict[str, int | str | None] = {"progress": min(100, max(0, progress))} + if current_step is not None: + values["current_step"] = current_step + + stmt = update(Job).where(Job.id == job_id).values(**values) + result = await self.session.execute(stmt) + await self.session.commit() + + return result.rowcount > 0 + + async def mark_completed(self, job_id: UUID) -> bool: + """Mark job as completed. 
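The compare-and-swap claim in `claim_next` (the UPDATE succeeds only if `version` is unchanged since the read, so two workers cannot both win the same job) can be modeled without a database. This is an illustrative stand-in, not the repository's API:

```python
# Sketch of optimistic-lock claiming: the in-memory equivalent of
# UPDATE jobs SET ... WHERE id = :id AND version = :seen_version.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class JobRow:
    status: str = "pending"
    version: int = 1
    worker_id: str | None = None


def try_claim(job: JobRow, seen_version: int, worker_id: str) -> bool:
    # Claim fails if another worker bumped the version after our read.
    if job.version != seen_version or job.status != "pending":
        return False
    job.status = "processing"
    job.worker_id = worker_id
    job.version += 1  # invalidates every other worker's stale read
    return True
```

Because the version check and the write happen as one atomic statement in SQL, a worker that read a stale version simply loses the race and moves on to the next pending job.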
+ + Args: + job_id: The job's UUID + + Returns: + True if update succeeded + """ + stmt = ( + update(Job) + .where(Job.id == job_id) + .values( + status=JobStatus.COMPLETED.value, + progress=100, + completed_at=datetime.now(UTC), + ) + ) + result = await self.session.execute(stmt) + await self.session.commit() + + return result.rowcount > 0 + + async def mark_failed( + self, + job_id: UUID, + error_message: str, + error_code: str | None = None, + ) -> bool: + """Mark job as failed with error details. + + Args: + job_id: The job's UUID + error_message: Human-readable error message + error_code: Optional machine-readable error code + + Returns: + True if update succeeded + """ + stmt = ( + update(Job) + .where(Job.id == job_id) + .values( + status=JobStatus.FAILED.value, + error_message=error_message[:2000], # Truncate to fit + error_code=error_code[:50] if error_code else None, + completed_at=datetime.now(UTC), + ) + ) + result = await self.session.execute(stmt) + await self.session.commit() + + return result.rowcount > 0 + + async def schedule_retry(self, job_id: UUID) -> bool: + """Schedule a failed job for retry. + + Increments retry_count and sets status back to pending. + + Args: + job_id: The job's UUID + + Returns: + True if retry scheduled, False if max retries exceeded + """ + # Get current job state + job = await self.get_by_id(job_id) + if not job or not job.can_retry: + return False + + stmt = ( + update(Job) + .where(Job.id == job_id) + .values( + status=JobStatus.PENDING.value, + retry_count=job.retry_count + 1, + worker_id=None, + started_at=None, + completed_at=None, + error_message=None, + error_code=None, + ) + ) + await self.session.execute(stmt) + await self.session.commit() + + return True + + async def get_by_status( + self, + status: JobStatus, + limit: int = 100, + ) -> list[Job]: + """Get jobs by status. 
+ + Args: + status: The status to filter by + limit: Maximum number of results + + Returns: + List of jobs with the given status + """ + query = ( + select(Job) + .where(Job.status == status.value) + .order_by(Job.created_at) + .limit(limit) + ) + result = await self.session.execute(query) + return list(result.scalars().all()) + + async def get_pending_count(self) -> int: + """Get count of pending jobs (for monitoring). + + Returns: + Number of pending jobs + """ + query = ( + select(func.count()) + .select_from(Job) + .where(Job.status == JobStatus.PENDING.value) + ) + result = await self.session.execute(query) + return result.scalar() or 0 + + async def get_by_job_type( + self, + job_type: JobType, + limit: int = 100, + ) -> list[Job]: + """Get jobs by type. + + Args: + job_type: The job type to filter by + limit: Maximum number of results + + Returns: + List of jobs of the given type + """ + query = ( + select(Job) + .where(Job.job_type == job_type.value) + .order_by(Job.created_at.desc()) + .limit(limit) + ) + result = await self.session.execute(query) + return list(result.scalars().all()) diff --git a/backend/src/bookbytes/repositories/work.py b/backend/src/bookbytes/repositories/work.py new file mode 100644 index 0000000..2266a87 --- /dev/null +++ b/backend/src/bookbytes/repositories/work.py @@ -0,0 +1,76 @@ +"""WorkRepository for managing Work entities. + +Provides CRUD operations and specialized queries for Work entities. +""" + +from uuid import UUID + +from sqlalchemy import select +from sqlalchemy.orm import selectinload + +from bookbytes.models.work import Work +from bookbytes.repositories.base import BaseRepository + + +class WorkRepository(BaseRepository[Work]): + """Repository for Work entities.""" + + async def get_by_title_author( + self, + title: str, + authors: list[str], + ) -> Work | None: + """Find a work by title and author combination.
+ + Args: + title: Work title (case-insensitive partial match) + authors: List of author names + + Returns: + Work if found, None otherwise + """ + # Case-insensitive title match with author overlap + result = await self.session.execute( + select(Work) + .where(Work.title.ilike(f"%{title}%")) + .where(Work.authors.contains(authors)) + ) + return result.scalar_one_or_none() + + async def get_with_editions(self, work_id: UUID) -> Work | None: + """Get a work with its editions eagerly loaded. + + Args: + work_id: Work UUID + + Returns: + Work with editions loaded, or None + """ + result = await self.session.execute( + select(Work).options(selectinload(Work.editions)).where(Work.id == work_id) + ) + return result.scalar_one_or_none() + + async def search_by_title( + self, + title: str, + *, + offset: int = 0, + limit: int = 100, + ) -> list[Work]: + """Search works by title (case-insensitive partial match). + + Args: + title: Search term + offset: Pagination offset + limit: Maximum results + + Returns: + List of matching works + """ + result = await self.session.execute( + select(Work) + .where(Work.title.ilike(f"%{title}%")) + .offset(offset) + .limit(limit) + ) + return list(result.scalars().all()) diff --git a/backend/src/bookbytes/schemas/__init__.py b/backend/src/bookbytes/schemas/__init__.py new file mode 100644 index 0000000..e807615 --- /dev/null +++ b/backend/src/bookbytes/schemas/__init__.py @@ -0,0 +1 @@ +"""Pydantic schemas package for BookBytes.""" diff --git a/backend/src/bookbytes/schemas/common.py b/backend/src/bookbytes/schemas/common.py new file mode 100644 index 0000000..fd02129 --- /dev/null +++ b/backend/src/bookbytes/schemas/common.py @@ -0,0 +1,242 @@ +"""Common Pydantic schemas used across the API.
+ +This module provides shared schemas for: +- Error responses (consistent error format) +- Pagination (standardized list responses) +- Common fields and mixins +""" + +from datetime import datetime +from typing import Generic, TypeVar +from uuid import UUID + +from pydantic import BaseModel, ConfigDict, Field + +# Generic type for paginated responses +T = TypeVar("T") + + +# ============================================================================= +# Base Configuration +# ============================================================================= + + +class BaseSchema(BaseModel): + """Base schema with common configuration.""" + + model_config = ConfigDict( + from_attributes=True, # Allow ORM model conversion + populate_by_name=True, # Allow both alias and field name + str_strip_whitespace=True, # Strip whitespace from strings + ) + + +# ============================================================================= +# Error Schemas +# ============================================================================= + + +class ErrorDetail(BaseModel): + """Details of an error response. + + Follows RFC 7807 Problem Details style. + + Attributes: + code: Machine-readable error code (e.g., "BOOK_NOT_FOUND") + message: Human-readable error description + request_id: Correlation ID for tracing (optional) + details: Additional error context (optional) + """ + + code: str = Field(..., description="Machine-readable error code") + message: str = Field(..., description="Human-readable error description") + request_id: str | None = Field( + None, description="Request correlation ID for tracing" + ) + details: dict[str, str | int | bool | None] | None = Field( + None, description="Additional error context" + ) + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "code": "BOOK_NOT_FOUND", + "message": "Book with ISBN 1234567890 not found", + "request_id": "abc-123-def-456", + } + } + ) + + +class ErrorResponse(BaseModel): + """Standard error response wrapper. 
+ + All API errors should return this format for consistency. + + Attributes: + error: The error details + """ + + error: ErrorDetail + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "error": { + "code": "BOOK_NOT_FOUND", + "message": "Book with ISBN 1234567890 not found", + "request_id": "abc-123-def-456", + } + } + } + ) + + +# ============================================================================= +# Pagination Schemas +# ============================================================================= + + +class PaginationParams(BaseModel): + """Query parameters for pagination. + + Attributes: + page: Page number (1-indexed) + size: Number of items per page + """ + + page: int = Field(1, ge=1, description="Page number (1-indexed)") + size: int = Field(20, ge=1, le=100, description="Items per page (max 100)") + + @property + def offset(self) -> int: + """Calculate the offset for database queries.""" + return (self.page - 1) * self.size + + @property + def limit(self) -> int: + """Get the limit for database queries.""" + return self.size + + +class PaginatedResponse(BaseModel, Generic[T]): + """Generic paginated response wrapper. + + Use this for any list endpoint that supports pagination. + + Attributes: + items: List of items for the current page + total: Total number of items across all pages + page: Current page number + size: Number of items per page + pages: Total number of pages + """ + + items: list[T] = Field(..., description="List of items for current page") + total: int = Field(..., ge=0, description="Total items across all pages") + page: int = Field(..., ge=1, description="Current page number") + size: int = Field(..., ge=1, description="Items per page") + pages: int = Field(..., ge=0, description="Total number of pages") + + @classmethod + def create( + cls, items: list[T], total: int, page: int, size: int + ) -> "PaginatedResponse[T]": + """Factory method to create a paginated response. 
+ + Args: + items: Items for the current page + total: Total number of items + page: Current page number + size: Page size + + Returns: + Paginated response with calculated pages + """ + pages = (total + size - 1) // size if total > 0 else 0 + return cls(items=items, total=total, page=page, size=size, pages=pages) + + +# ============================================================================= +# Common Field Types +# ============================================================================= + + +class UUIDMixin(BaseModel): + """Mixin for models with UUID primary key.""" + + id: UUID = Field(..., description="Unique identifier") + + +class TimestampMixin(BaseModel): + """Mixin for models with timestamp fields.""" + + created_at: datetime = Field(..., description="Creation timestamp") + updated_at: datetime = Field(..., description="Last update timestamp") + + +class UUIDTimestampMixin(UUIDMixin, TimestampMixin): + """Combined mixin for UUID and timestamps.""" + + pass + + +# ============================================================================= +# Health Check Schemas +# ============================================================================= + + +class HealthStatus(BaseModel): + """Individual health check status. + + Attributes: + status: Health status ("ok" or "error") + message: Optional status message + """ + + status: str = Field(..., pattern="^(ok|error)$") + message: str | None = None + + +class HealthCheckResponse(BaseModel): + """Health check endpoint response. 
+ + Attributes: + status: Overall health status + checks: Individual service health checks + """ + + status: str = Field(..., pattern="^(ok|degraded|error)$") + checks: dict[str, HealthStatus | str] = Field( + default_factory=dict, description="Individual service checks" + ) + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "status": "ok", + "checks": { + "database": "ok", + "redis": "ok", + }, + } + } + ) + + +# ============================================================================= +# Message Response Schemas +# ============================================================================= + + +class MessageResponse(BaseModel): + """Simple message response. + + Useful for operations that just need to confirm success. + """ + + message: str = Field(..., description="Response message") + + model_config = ConfigDict( + json_schema_extra={"example": {"message": "Operation completed successfully"}} + ) diff --git a/backend/src/bookbytes/schemas/processing.py b/backend/src/bookbytes/schemas/processing.py new file mode 100644 index 0000000..f2f9b8f --- /dev/null +++ b/backend/src/bookbytes/schemas/processing.py @@ -0,0 +1,137 @@ +"""Schemas for audiobook processing endpoints. + +Request and response models for the processing pipeline including +job creation, status tracking, and audiobook refresh. +""" + +from __future__ import annotations + +from datetime import datetime +from typing import Self +from uuid import UUID + +from pydantic import BaseModel, ConfigDict, Field, model_validator + + +class ProcessRequest(BaseModel): + """Request to start audiobook processing. + + Must provide either edition_id or isbn, not both or neither. 
+ """ + + edition_id: UUID | None = Field( + default=None, + description="UUID of the edition to process", + ) + isbn: str | None = Field( + default=None, + description="ISBN to lookup and process (will find/create edition)", + min_length=10, + max_length=17, # ISBN-13 with hyphens + ) + + model_config = ConfigDict( + json_schema_extra={ + "examples": [ + {"edition_id": "01234567-89ab-cdef-0123-456789abcdef"}, + {"isbn": "978-0-13-468599-1"}, + ] + } + ) + + @model_validator(mode="after") + def require_exactly_one(self) -> Self: + """Validate that exactly one of edition_id or isbn is provided.""" + has_edition = self.edition_id is not None + has_isbn = self.isbn is not None + + if not has_edition and not has_isbn: + msg = "Must provide either edition_id or isbn" + raise ValueError(msg) + + if has_edition and has_isbn: + msg = "Provide only one of edition_id or isbn, not both" + raise ValueError(msg) + + return self + + +class RefreshRequest(BaseModel): + """Request to refresh/regenerate an audiobook.""" + + force: bool = Field( + default=False, + description="Force regeneration even if audiobook is up-to-date", + ) + + +class ProcessResponse(BaseModel): + """Response after starting audiobook processing.""" + + job_id: UUID = Field(description="ID of the processing job") + audio_book_id: UUID = Field(description="ID of the audiobook being processed") + status: str = Field(description="Initial status (typically 'pending')") + message: str = Field(description="Human-readable status message") + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "job_id": "01234567-89ab-cdef-0123-456789abcdef", + "audio_book_id": "fedcba98-7654-3210-fedc-ba9876543210", + "status": "pending", + "message": "Audiobook processing started", + } + } + ) + + +class JobStatusResponse(BaseModel): + """Response for job status query.""" + + id: UUID = Field(description="Job ID") + job_type: str = Field(description="Type of job (e.g., 'audiobook_generation')") + status: str = 
Field( + description="Current status: pending, processing, completed, failed" + ) + audio_book_id: UUID | None = Field( + default=None, + description="Associated audiobook ID (if applicable)", + ) + progress: int = Field( + default=0, + ge=0, + le=100, + description="Progress percentage (0-100)", + ) + error_message: str | None = Field( + default=None, + description="Error details if status is 'failed'", + ) + created_at: datetime = Field(description="When the job was created") + updated_at: datetime = Field(description="When the job was last updated") + started_at: datetime | None = Field( + default=None, + description="When processing started", + ) + completed_at: datetime | None = Field( + default=None, + description="When processing completed (success or failure)", + ) + + model_config = ConfigDict( + from_attributes=True, + json_schema_extra={ + "example": { + "id": "01234567-89ab-cdef-0123-456789abcdef", + "job_type": "audiobook_generation", + "status": "processing", + "audio_book_id": "fedcba98-7654-3210-fedc-ba9876543210", + "progress": 45, + "error_message": None, + "created_at": "2024-12-11T12:00:00Z", + "updated_at": "2024-12-11T12:05:00Z", + "started_at": "2024-12-11T12:01:00Z", + "completed_at": None, + } + }, + ) diff --git a/backend/src/bookbytes/schemas/search.py b/backend/src/bookbytes/schemas/search.py new file mode 100644 index 0000000..d3790d9 --- /dev/null +++ b/backend/src/bookbytes/schemas/search.py @@ -0,0 +1,290 @@ +"""Search and book-related API schemas. + +This module defines Pydantic models for book search, work details, +and audiobook processing endpoints. 
+""" + +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel, ConfigDict, Field + +from bookbytes.schemas.common import BaseSchema + +# ============================================================================= +# Search Request/Response +# ============================================================================= + + +class BookSearchRequest(BaseSchema): + """Request body for book search. + + Attributes: + title: Book title to search (required) + author: Optional author name filter + publisher: Optional publisher filter + language: Optional language code (e.g., "eng") + """ + + title: str = Field( + ..., + min_length=1, + max_length=500, + description="Book title to search", + json_schema_extra={"example": "Lord of the Rings"}, + ) + author: str | None = Field( + None, + max_length=200, + description="Filter by author name", + json_schema_extra={"example": "J.R.R. Tolkien"}, + ) + publisher: str | None = Field( + None, + max_length=200, + description="Filter by publisher", + ) + language: str | None = Field( + None, + min_length=2, + max_length=10, + description="Language code (e.g., 'eng', 'spa')", + json_schema_extra={"example": "eng"}, + ) + + +class BookSearchResultItem(BaseModel): + """Single search result item. + + Represents a book work from search results with key metadata. 
+ """ + + model_config = ConfigDict(from_attributes=True) + + title: str = Field(..., description="Book title") + authors: list[str] = Field(default_factory=list, description="Author names") + first_publish_year: int | None = Field(None, description="First publication year") + cover_url: str | None = Field(None, description="Cover image URL") + isbn_list: list[str] = Field( + default_factory=list, description="Available ISBNs (limited)" + ) + edition_count: int = Field(0, description="Number of editions") + subjects: list[str] = Field(default_factory=list, description="Subject categories") + external_work_key: str = Field(..., description="OpenLibrary work key") + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "title": "The Lord of the Rings", + "authors": ["J. R. R. Tolkien"], + "first_publish_year": 1954, + "cover_url": "https://covers.openlibrary.org/b/id/258027-M.jpg", + "isbn_list": ["9780618640157", "0618640150"], + "edition_count": 120, + "subjects": ["Fantasy", "Epic"], + "external_work_key": "/works/OL27448W", + } + } + ) + + +class BookSearchResponse(BaseModel): + """Search response with pagination info. + + Contains search results and metadata about the total results. + """ + + results: list[BookSearchResultItem] = Field(..., description="Search result items") + total_found: int = Field(..., ge=0, description="Total matching results") + page: int = Field(..., ge=1, description="Current page number") + page_size: int = Field(..., ge=1, description="Results per page") + has_more: bool = Field(..., description="More results available") + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "results": [ + { + "title": "The Lord of the Rings", + "authors": ["J. R. R. 
Tolkien"], + "first_publish_year": 1954, + "cover_url": "https://covers.openlibrary.org/b/id/258027-M.jpg", + "isbn_list": ["9780618640157"], + "edition_count": 120, + "subjects": ["Fantasy"], + "external_work_key": "/works/OL27448W", + } + ], + "total_found": 629, + "page": 1, + "page_size": 20, + "has_more": True, + } + } + ) + + +# ============================================================================= +# Work Details +# ============================================================================= + + +class EditionResponse(BaseSchema): + """Edition details response. + + Represents a specific edition of a work (ISBN-based). + """ + + id: UUID | None = Field(None, description="Internal edition ID (if stored)") + isbn: str = Field(..., description="ISBN (10 or 13)") + isbn_type: str = Field(..., description="ISBN type: 'isbn10' or 'isbn13'") + title: str = Field(..., description="Edition-specific title") + publisher: str | None = Field(None, description="Publisher name") + publish_year: int | None = Field(None, description="Publication year") + language: str = Field("eng", description="Language code") + cover_url: str | None = Field(None, description="Cover image URL") + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "id": "01234567-89ab-cdef-0123-456789abcdef", + "isbn": "9780618640157", + "isbn_type": "isbn13", + "title": "The Lord of the Rings: 50th Anniversary Edition", + "publisher": "Houghton Mifflin", + "publish_year": 2004, + "language": "eng", + "cover_url": "https://covers.openlibrary.org/b/isbn/9780618640157-M.jpg", + } + } + ) + + +class WorkResponse(BaseSchema): + """Work details response. + + Contains full work metadata including all editions. 
+ """ + + id: UUID | None = Field(None, description="Internal work ID (if stored)") + title: str = Field(..., description="Work title") + authors: list[str] = Field(default_factory=list, description="Author names") + description: str | None = Field(None, description="Work description") + subjects: list[str] = Field(default_factory=list, description="Subject categories") + first_publish_year: int | None = Field(None, description="First publication year") + cover_url: str | None = Field(None, description="Cover image URL") + edition_count: int = Field(0, description="Total edition count") + external_work_key: str = Field(..., description="OpenLibrary work key") + editions: list[EditionResponse] = Field( + default_factory=list, description="Available editions" + ) + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "id": None, + "title": "The Lord of the Rings", + "authors": ["J. R. R. Tolkien"], + "description": "An epic fantasy novel...", + "subjects": ["Fantasy", "Adventure"], + "first_publish_year": 1954, + "cover_url": "https://covers.openlibrary.org/b/id/258027-M.jpg", + "edition_count": 120, + "external_work_key": "/works/OL27448W", + "editions": [], + } + } + ) + + +# ============================================================================= +# Processing Requests +# ============================================================================= + + +class ProcessBookRequest(BaseSchema): + """Request to process a book into an audiobook. + + Either edition_id OR isbn must be provided (not both). + """ + + edition_id: UUID | None = Field( + None, description="Internal edition UUID (if already in library)" + ) + isbn: str | None = Field( + None, + min_length=10, + max_length=17, + description="ISBN to process (10 or 13 digits)", + ) + + model_config = ConfigDict( + json_schema_extra={ + "example": {"isbn": "9780618640157"}, + } + ) + + +class JobResponse(BaseSchema): + """Job creation response. + + Returned when a background job is queued. 
+ """ + + job_id: str = Field(..., description="Background job ID") + status: str = Field(..., description="Job status") + message: str = Field(..., description="Status message") + edition_id: UUID | None = Field(None, description="Edition being processed") + + model_config = ConfigDict( + json_schema_extra={ + "example": { + "job_id": "arq:job:abc123", + "status": "queued", + "message": "Audiobook generation queued", + "edition_id": "01234567-89ab-cdef-0123-456789abcdef", + } + } + ) + + +class RefreshBookRequest(BaseSchema): + """Request to refresh an audiobook. + + Optionally specify a new edition to use. + """ + + new_edition_id: UUID | None = Field( + None, description="New edition to use (optional)" + ) + force: bool = Field(False, description="Force refresh even if no changes detected") + + +# ============================================================================= +# AudioBook Response (for GET operations) +# ============================================================================= + + +class ChapterSummary(BaseSchema): + """Chapter summary in audiobook response.""" + + chapter_number: int = Field(..., ge=1, description="Chapter order") + title: str = Field(..., description="Chapter title") + summary: str | None = Field(None, description="Chapter summary") + audio_url: str | None = Field(None, description="Audio file URL") + duration_seconds: int | None = Field(None, description="Audio duration") + + +class AudioBookResponse(BaseSchema): + """AudioBook details response.""" + + id: UUID = Field(..., description="AudioBook ID") + edition_id: UUID = Field(..., description="Source edition ID") + status: str = Field(..., description="Processing status") + voice_id: str | None = Field(None, description="TTS voice used") + total_duration_seconds: int | None = Field(None, description="Total duration") + chapters: list[ChapterSummary] = Field( + default_factory=list, description="Chapter list" + ) + created_at: datetime = Field(..., description="Creation 
timestamp") + updated_at: datetime = Field(..., description="Last update timestamp") diff --git a/backend/src/bookbytes/services/__init__.py b/backend/src/bookbytes/services/__init__.py new file mode 100644 index 0000000..0093f6c --- /dev/null +++ b/backend/src/bookbytes/services/__init__.py @@ -0,0 +1,36 @@ +"""Services package for BookBytes. + +This module exports service classes for business logic. +""" + +from bookbytes.services.cache import ( + CacheService, + get_cache_service, + set_redis_client, +) +from bookbytes.services.openlibrary import ( + BookSearchResult, + OpenLibraryError, + OpenLibraryRateLimitError, + OpenLibraryService, + SearchResponse, + WorkDetails, + get_openlibrary_service, + set_openlibrary_service, +) + +__all__ = [ + # Cache + "CacheService", + "get_cache_service", + "set_redis_client", + # OpenLibrary + "BookSearchResult", + "OpenLibraryError", + "OpenLibraryRateLimitError", + "OpenLibraryService", + "SearchResponse", + "WorkDetails", + "get_openlibrary_service", + "set_openlibrary_service", +] diff --git a/backend/src/bookbytes/services/cache.py b/backend/src/bookbytes/services/cache.py new file mode 100644 index 0000000..cc8e53d --- /dev/null +++ b/backend/src/bookbytes/services/cache.py @@ -0,0 +1,289 @@ +"""CacheService - Redis-only caching with TTL and stale-while-revalidate. + +This service provides a simple, effective caching layer using Redis with: +- TTL-based expiration with jitter to prevent stampede +- Stale-while-revalidate for better UX +- Pattern-based invalidation + +Redis is configured with AOF persistence (appendfsync everysec) for durability. +Search results are transient - if lost, users simply re-search. Important data +(processed books) is stored permanently in PostgreSQL Work/Edition tables. 
+ +Cache Key Types: + - search:{hash} - Search results (24h TTL) + - isbn:{isbn} - Book details by ISBN (7d TTL) + - work:{identifier} - Work details (7d TTL) + +Note: Using canonical data model approach - cache internal representations, +not raw API responses. Provider metadata stored in cached value, not key. +See: tasks/knowledge/multi-provider-integration-patterns.md +""" + +import hashlib +import json +import random +from typing import Any + +import structlog +from redis.asyncio import Redis + +logger = structlog.get_logger(__name__) + + +class CacheService: + """Redis-only caching with TTL and stale-while-revalidate. + + Configuration: + Redis should be configured with: + - appendonly yes + - appendfsync everysec + - maxmemory 256mb + - maxmemory-policy allkeys-lru + + Usage with FastAPI: + ```python + from bookbytes.services.cache import CacheService, get_cache_service + + @router.get("/search") + async def search(cache: CacheService = Depends(get_cache_service)): + ... + ``` + """ + + # TTL constants (in seconds) + TTL_SEARCH_RESULTS = 86400 # 24 hours + TTL_WORK_DETAILS = 604800 # 7 days + TTL_ISBN_DETAILS = 604800 # 7 days + + # Stale-while-revalidate threshold + REVALIDATE_THRESHOLD = 0.2 # Trigger refresh at 20% TTL remaining + + def __init__(self, redis: Redis) -> None: + """Initialize the cache service. + + Args: + redis: Async Redis client + """ + self.redis = redis + + async def get(self, cache_key: str) -> tuple[dict | None, bool]: + """Get from cache with stale-while-revalidate support. + + Returns: + Tuple of (data, needs_revalidation). 
+ - data: Cached data or None if not found + - needs_revalidation: True if data is stale and should be refreshed + """ + try: + result = await self.redis.get(cache_key) + if not result: + return None, False + + ttl = await self.redis.ttl(cache_key) + original_ttl = self._get_original_ttl(cache_key) + + # Check if near expiry (stale-while-revalidate) + if ttl > 0 and original_ttl > 0: + needs_revalidation = (ttl / original_ttl) < self.REVALIDATE_THRESHOLD + else: + needs_revalidation = False + + data = json.loads(result) + return data, needs_revalidation + + except Exception as e: + logger.warning("cache_get_failed", cache_key=cache_key, error=str(e)) + return None, False + + async def set( + self, + cache_key: str, + data: dict[str, Any], + base_ttl: int | None = None, + ) -> None: + """Store in Redis with jittered TTL. + + Args: + cache_key: Cache key + data: Data to cache (must be JSON-serializable) + base_ttl: Base TTL in seconds (auto-detected from key prefix if None) + """ + try: + if base_ttl is None: + base_ttl = self._get_original_ttl(cache_key) + + ttl = self._jitter_ttl(base_ttl) + await self.redis.setex(cache_key, ttl, json.dumps(data)) + logger.debug("cache_set", cache_key=cache_key, ttl=ttl) + + except Exception as e: + logger.warning("cache_set_failed", cache_key=cache_key, error=str(e)) + + async def invalidate(self, cache_key: str) -> None: + """Delete a specific cache key. + + Args: + cache_key: Key to delete + """ + try: + await self.redis.delete(cache_key) + logger.debug("cache_invalidated", cache_key=cache_key) + except Exception as e: + logger.warning("cache_invalidate_failed", cache_key=cache_key, error=str(e)) + + async def invalidate_pattern(self, pattern: str) -> int: + """Delete all keys matching pattern. 
+ + Args: + pattern: Redis glob pattern (e.g., "search:*") + + Returns: + Number of keys deleted + """ + try: + count = 0 + async for key in self.redis.scan_iter(match=pattern): + await self.redis.delete(key) + count += 1 + logger.debug("cache_pattern_invalidated", pattern=pattern, count=count) + return count + except Exception as e: + logger.warning( + "cache_pattern_invalidate_failed", pattern=pattern, error=str(e) + ) + return 0 + + def _jitter_ttl(self, base_ttl: int) -> int: + """Add Β±10% random jitter to prevent cache stampede. + + Args: + base_ttl: Base TTL in seconds + + Returns: + Jittered TTL + """ + jitter = random.uniform(-0.1, 0.1) + return max(1, int(base_ttl * (1 + jitter))) + + def _get_original_ttl(self, cache_key: str) -> int: + """Get original TTL based on key prefix. + + Args: + cache_key: Cache key + + Returns: + Original TTL in seconds + """ + ttl_map = { + "search": self.TTL_SEARCH_RESULTS, + "work": self.TTL_WORK_DETAILS, + "isbn": self.TTL_ISBN_DETAILS, + } + + prefix = cache_key.split(":", 1)[0] if ":" in cache_key else cache_key + return ttl_map.get(prefix, self.TTL_SEARCH_RESULTS) + + # ------------------------------------------------------------------------- + # Cache Key Generators + # ------------------------------------------------------------------------- + + @staticmethod + def search_key( + *, + title: str, + author: str | None = None, + publisher: str | None = None, + language: str | None = None, + ) -> str: + """Generate a deterministic cache key for search parameters. + + Same search parameters = same key = cache hit. + Normalizes input (lowercase, strip, sorted keys). 
+ + Args: + title: Search title + author: Optional author + publisher: Optional publisher + language: Optional language code + + Returns: + Cache key (e.g., "search:a3f2b1c4d5e6f7a8") + """ + # Normalize: lowercase, strip whitespace + normalized = { + "title": title.lower().strip(), + } + if author: + normalized["author"] = author.lower().strip() + if publisher: + normalized["publisher"] = publisher.lower().strip() + if language: + normalized["language"] = language.lower().strip() + + # Create deterministic string (sorted keys) + key_parts = sorted(f"{k}={v}" for k, v in normalized.items() if v) + key_string = "&".join(key_parts) + + # Hash for storage efficiency + hash_digest = hashlib.sha256(key_string.encode()).hexdigest()[:16] + return f"search:{hash_digest}" + + @staticmethod + def isbn_key(isbn: str) -> str: + """Generate cache key for ISBN-based lookup. + + Args: + isbn: ISBN (10 or 13 digits) + + Returns: + Cache key (e.g., "isbn:9780618640157") + """ + return f"isbn:{isbn}" + + @staticmethod + def work_key(identifier: str) -> str: + """Generate cache key for work details. + + Args: + identifier: Work identifier (ISBN or internal ID) + + Returns: + Cache key (e.g., "work:9780618640157") + """ + return f"work:{identifier}" + + +# Global Redis client (set during app startup) +_redis_client: Redis | None = None + + +def set_redis_client(redis: Redis) -> None: + """Set the global Redis client during app startup. + + Call this in your FastAPI lifespan: + ```python + @asynccontextmanager + async def lifespan(app: FastAPI): + redis = Redis.from_url(settings.redis_url) + set_redis_client(redis) + yield + await redis.close() + ``` + """ + global _redis_client + _redis_client = redis + + +def get_cache_service() -> CacheService: + """FastAPI dependency for CacheService. 
+ + Usage: + ```python + @router.get("/search") + async def search(cache: CacheService = Depends(get_cache_service)): + data, stale = await cache.get("search:abc123") + ``` + """ + if _redis_client is None: + raise RuntimeError("Redis client not initialized. Call set_redis_client first.") + return CacheService(_redis_client) diff --git a/backend/src/bookbytes/services/library.py b/backend/src/bookbytes/services/library.py new file mode 100644 index 0000000..8ca5c5d --- /dev/null +++ b/backend/src/bookbytes/services/library.py @@ -0,0 +1,319 @@ +"""Library service for managing books in the database. + +This service orchestrates Work, Edition, and BookProvider repositories +to persist data fetched from external providers like OpenLibrary. +""" + +from uuid import UUID + +import structlog + +from bookbytes.models.edition import Edition +from bookbytes.models.work import Work +from bookbytes.repositories.book_provider import BookProviderRepository +from bookbytes.repositories.edition import EditionRepository +from bookbytes.repositories.work import WorkRepository +from bookbytes.services.openlibrary import BookSearchResult, WorkDetails + +logger = structlog.get_logger(__name__) + + +class LibraryService: + """Service for managing library persistence. + + Handles the creation and retrieval of Works and Editions, + including their mappings to external providers. + + Usage: + ```python + service = LibraryService(work_repo, edition_repo, provider_repo) + work = await service.get_or_create_work(search_result) + ``` + """ + + PROVIDER_OPENLIBRARY = "openlibrary" + + def __init__( + self, + work_repo: WorkRepository, + edition_repo: EditionRepository, + provider_repo: BookProviderRepository, + ) -> None: + """Initialize the service with repositories. 
+ + Args: + work_repo: Repository for Work entities + edition_repo: Repository for Edition entities + provider_repo: Repository for BookProvider mappings + """ + self.work_repo = work_repo + self.edition_repo = edition_repo + self.provider_repo = provider_repo + + # ------------------------------------------------------------------------- + # Work Operations + # ------------------------------------------------------------------------- + + async def find_work_by_provider( + self, + provider: str, + external_key: str, + ) -> Work | None: + """Find a work by its external provider key. + + Args: + provider: Provider name (e.g., "openlibrary") + external_key: External identifier (e.g., "/works/OL27448W") + + Returns: + Work if found, None otherwise + """ + mapping = await self.provider_repo.get_by_provider_key(provider, external_key) + + if mapping and mapping.work_id: + logger.debug( + "found_work_by_provider", + provider=provider, + external_key=external_key, + work_id=str(mapping.work_id), + ) + return await self.work_repo.get(mapping.work_id) + + return None + + async def get_or_create_work( + self, + search_result: BookSearchResult, + ) -> Work: + """Get existing work or create new one from search result. + + If the work already exists (via provider mapping), returns it. + Otherwise, creates a new Work and links it to the provider. 
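+
+        The call is idempotent: repeated calls with the same external work
+        key return the same Work.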
+ + Args: + search_result: BookSearchResult from OpenLibrary + + Returns: + Existing or newly created Work + """ + # Check if work already exists via provider mapping + existing = await self.find_work_by_provider( + provider=self.PROVIDER_OPENLIBRARY, + external_key=search_result.external_work_key, + ) + + if existing: + logger.info( + "work_already_exists", + work_id=str(existing.id), + title=existing.title, + ) + return existing + + # Create new work + work = Work( + title=search_result.title, + authors=search_result.authors, + subjects=search_result.subjects, + first_publish_year=search_result.first_publish_year, + ) + created_work = await self.work_repo.create(work) + + # Create provider mapping + await self.provider_repo.create_work_mapping( + work_id=created_work.id, + provider=self.PROVIDER_OPENLIBRARY, + external_key=search_result.external_work_key, + provider_metadata={ + "fetched_at": search_result.fetched_at, + "edition_count": search_result.edition_count, + }, + ) + + logger.info( + "work_created", + work_id=str(created_work.id), + title=created_work.title, + external_key=search_result.external_work_key, + ) + + return created_work + + async def get_or_create_work_from_details( + self, + work_details: WorkDetails, + ) -> Work: + """Get existing work or create new one from work details. 
+ + Args: + work_details: WorkDetails from OpenLibrary + + Returns: + Existing or newly created Work + """ + # Check if work already exists + existing = await self.find_work_by_provider( + provider=self.PROVIDER_OPENLIBRARY, + external_key=work_details.external_work_key, + ) + + if existing: + return existing + + # Create new work + work = Work( + title=work_details.title, + authors=work_details.authors, + subjects=work_details.subjects, + first_publish_year=work_details.first_publish_year, + ) + created_work = await self.work_repo.create(work) + + # Create provider mapping + await self.provider_repo.create_work_mapping( + work_id=created_work.id, + provider=self.PROVIDER_OPENLIBRARY, + external_key=work_details.external_work_key, + ) + + logger.info( + "work_created_from_details", + work_id=str(created_work.id), + title=created_work.title, + ) + + return created_work + + # ------------------------------------------------------------------------- + # Edition Operations + # ------------------------------------------------------------------------- + + async def find_by_isbn(self, isbn: str) -> Edition | None: + """Find an edition by ISBN. + + Args: + isbn: ISBN-10 or ISBN-13 (normalized, no dashes) + + Returns: + Edition if found, None otherwise + """ + clean_isbn = isbn.replace("-", "").replace(" ", "") + return await self.edition_repo.get_by_isbn(clean_isbn) + + async def find_latest_edition( + self, + work_id: UUID, + language: str = "eng", + ) -> Edition | None: + """Find the latest edition of a work. 
+ + Args: + work_id: Work UUID + language: ISO 639-2/B language code (default: "eng") + + Returns: + Latest edition or None + """ + return await self.edition_repo.get_latest_by_work(work_id, language) + + async def store_edition( + self, + work: Work, + isbn: str, + *, + title: str | None = None, + publisher: str | None = None, + publish_year: int | None = None, + language: str = "eng", + pages: int | None = None, + external_key: str | None = None, + ) -> Edition: + """Store a single edition for a work. + + Args: + work: Parent Work entity + isbn: ISBN-10 or ISBN-13 + title: Edition-specific title (defaults to work title) + publisher: Publisher name + publish_year: Publication year + language: ISO 639-2/B language code + pages: Page count + external_key: External provider key for edition + + Returns: + Created Edition + """ + clean_isbn = isbn.replace("-", "").replace(" ", "") + + # Check if already exists + existing = await self.edition_repo.get_by_isbn(clean_isbn) + if existing: + logger.debug("edition_already_exists", isbn=clean_isbn) + return existing + + # Determine ISBN type + isbn_type = "isbn13" if len(clean_isbn) == 13 else "isbn10" + + # Create edition + edition = Edition( + work_id=work.id, + isbn=clean_isbn, + isbn_type=isbn_type, + title=title or work.title, + publisher=publisher, + publish_year=publish_year, + language=language, + pages=pages, + ) + created_edition = await self.edition_repo.create(edition) + + # Create provider mapping if external key provided + if external_key: + await self.provider_repo.create_edition_mapping( + edition_id=created_edition.id, + provider=self.PROVIDER_OPENLIBRARY, + external_key=external_key, + ) + + logger.info( + "edition_created", + edition_id=str(created_edition.id), + isbn=clean_isbn, + work_id=str(work.id), + ) + + return created_edition + + async def isbn_exists(self, isbn: str) -> bool: + """Check if an ISBN exists in the library. 
+ + Args: + isbn: ISBN to check + + Returns: + True if exists + """ + clean_isbn = isbn.replace("-", "").replace(" ", "") + return await self.edition_repo.isbn_exists(clean_isbn) + + +# ----------------------------------------------------------------------------- +# FastAPI Dependency Injection +# ----------------------------------------------------------------------------- + +_library_service: LibraryService | None = None + + +def set_library_service(service: LibraryService) -> None: + """Set the global library service during app startup.""" + global _library_service + _library_service = service + + +def get_library_service() -> LibraryService: + """FastAPI dependency for LibraryService.""" + if _library_service is None: + raise RuntimeError( + "Library service not initialized. Call set_library_service first." + ) + return _library_service diff --git a/backend/src/bookbytes/services/openlibrary.py b/backend/src/bookbytes/services/openlibrary.py new file mode 100644 index 0000000..1849da1 --- /dev/null +++ b/backend/src/bookbytes/services/openlibrary.py @@ -0,0 +1,480 @@ +"""OpenLibrary API client service. + +This service provides async access to the OpenLibrary API for book search +and metadata retrieval. Uses canonical internal DTOs for all responses. + +See: https://openlibrary.org/dev/docs/api/search +""" + +from dataclasses import dataclass, field +from datetime import UTC, datetime +from typing import Any + +import httpx +import structlog + +from bookbytes.config import get_settings +from bookbytes.services.cache import CacheService + +logger = structlog.get_logger(__name__) + + +# ----------------------------------------------------------------------------- +# DTOs (Canonical Internal Models - Anti-Corruption Layer) +# ----------------------------------------------------------------------------- + + +@dataclass +class BookSearchResult: + """Canonical search result - provider agnostic. + + Maps OpenLibrary (and future provider) responses to internal format. 
+ """ + + title: str + authors: list[str] + first_publish_year: int | None + cover_url: str | None + isbn_list: list[str] + edition_count: int + subjects: list[str] + + # Metadata (stored in cache, not used for key) + external_work_key: str # e.g., "/works/OL27448W" + source_provider: str = "openlibrary" + fetched_at: str = field(default_factory=lambda: datetime.now(UTC).isoformat()) + + def to_dict(self) -> dict[str, Any]: + """Convert to JSON-serializable dict for caching.""" + return { + "title": self.title, + "authors": self.authors, + "first_publish_year": self.first_publish_year, + "cover_url": self.cover_url, + "isbn_list": self.isbn_list, + "edition_count": self.edition_count, + "subjects": self.subjects, + "external_work_key": self.external_work_key, + "source_provider": self.source_provider, + "fetched_at": self.fetched_at, + } + + @classmethod + def from_dict(cls, data: dict[str, Any]) -> "BookSearchResult": + """Create from cached dict.""" + return cls( + title=data["title"], + authors=data["authors"], + first_publish_year=data.get("first_publish_year"), + cover_url=data.get("cover_url"), + isbn_list=data.get("isbn_list", []), + edition_count=data.get("edition_count", 0), + subjects=data.get("subjects", []), + external_work_key=data["external_work_key"], + source_provider=data.get("source_provider", "openlibrary"), + fetched_at=data.get("fetched_at", ""), + ) + + +@dataclass +class WorkDetails: + """Canonical work details - provider agnostic.""" + + title: str + authors: list[str] + description: str | None + subjects: list[str] + first_publish_year: int | None + cover_url: str | None + edition_count: int + isbn_list: list[str] + + # Metadata + external_work_key: str + source_provider: str = "openlibrary" + fetched_at: str = field(default_factory=lambda: datetime.now(UTC).isoformat()) + + def to_dict(self) -> dict[str, Any]: + """Convert to JSON-serializable dict.""" + return { + "title": self.title, + "authors": self.authors, + "description": 
self.description, + "subjects": self.subjects, + "first_publish_year": self.first_publish_year, + "cover_url": self.cover_url, + "edition_count": self.edition_count, + "isbn_list": self.isbn_list, + "external_work_key": self.external_work_key, + "source_provider": self.source_provider, + "fetched_at": self.fetched_at, + } + + @classmethod + def from_dict(cls, data: dict[str, Any]) -> "WorkDetails": + """Create from cached dict.""" + return cls( + title=data["title"], + authors=data["authors"], + description=data.get("description"), + subjects=data.get("subjects", []), + first_publish_year=data.get("first_publish_year"), + cover_url=data.get("cover_url"), + edition_count=data.get("edition_count", 0), + isbn_list=data.get("isbn_list", []), + external_work_key=data["external_work_key"], + source_provider=data.get("source_provider", "openlibrary"), + fetched_at=data.get("fetched_at", ""), + ) + + +@dataclass +class SearchResponse: + """Container for search results with pagination info.""" + + results: list[BookSearchResult] + total_found: int + offset: int + limit: int + + @property + def has_more(self) -> bool: + """Check if there are more results to fetch.""" + return self.offset + len(self.results) < self.total_found + + +# ----------------------------------------------------------------------------- +# OpenLibrary Service +# ----------------------------------------------------------------------------- + + +class OpenLibraryError(Exception): + """Base exception for OpenLibrary API errors.""" + + pass + + +class OpenLibraryRateLimitError(OpenLibraryError): + """Rate limit exceeded.""" + + pass + + +class OpenLibraryService: + """Async client for OpenLibrary API. + + Uses httpx for async HTTP requests and integrates with CacheService + for caching responses. All responses are converted to canonical DTOs. 
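+
+    Cache reads return a (data, needs_revalidation) tuple; entries flagged
+    as stale are refetched from the API and re-cached.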
+ + Usage: + ```python + service = OpenLibraryService(cache_service) + results = await service.search_books(title="Lord of the Rings") + ``` + """ + + COVER_BASE_URL = "https://covers.openlibrary.org/b" + + def __init__(self, cache: CacheService) -> None: + """Initialize the service. + + Args: + cache: CacheService instance for caching responses + """ + self.cache = cache + self._settings = get_settings() + self._client: httpx.AsyncClient | None = None + + @property + def _user_agent(self) -> str: + """User-Agent header for API compliance.""" + return f"{self._settings.app_name}/{self._settings.app_version} (contact@bookbytes.app)" + + async def _get_client(self) -> httpx.AsyncClient: + """Get or create the HTTP client.""" + if self._client is None or self._client.is_closed: + self._client = httpx.AsyncClient( + base_url=self._settings.openlibrary_base_url, + timeout=self._settings.openlibrary_timeout, + headers={"User-Agent": self._user_agent}, + ) + return self._client + + async def close(self) -> None: + """Close the HTTP client.""" + if self._client and not self._client.is_closed: + await self._client.aclose() + self._client = None + + async def search_books( + self, + *, + title: str, + author: str | None = None, + publisher: str | None = None, + language: str | None = None, + offset: int = 0, + ) -> SearchResponse: + """Search for books by title, author, etc. 
+
+        Args:
+            title: Book title to search
+            author: Optional author name
+            publisher: Optional publisher name
+            language: Optional language code (e.g., "eng")
+            offset: Pagination offset
+
+        Returns:
+            SearchResponse with matching books
+
+        Raises:
+            OpenLibraryError: On API errors
+        """
+        # The cache key omits offset, so only first-page results are cached.
+        cache_key = CacheService.search_key(
+            title=title,
+            author=author,
+            publisher=publisher,
+            language=language,
+        )
+
+        if offset == 0:
+            cached_data, needs_revalidation = await self.cache.get(cache_key)
+            if cached_data and not needs_revalidation:
+                logger.debug("search_cache_hit", cache_key=cache_key)
+                return self._parse_cached_search(cached_data)
+
+        # Cache miss or stale - fetch from API
+        logger.debug("search_cache_miss", cache_key=cache_key)
+        response = await self._fetch_search(
+            title=title,
+            author=author,
+            publisher=publisher,
+            language=language,
+            offset=offset,
+        )
+
+        # Cache the response (first page only)
+        if offset == 0:
+            await self.cache.set(cache_key, self._serialize_search(response))
+
+        return response
+
+    async def get_work_details(self, work_key: str) -> WorkDetails:
+        """Get detailed information about a work.
+
+        Args:
+            work_key: OpenLibrary work key (e.g., "/works/OL27448W")
+
+        Returns:
+            WorkDetails with full metadata
+
+        Raises:
+            OpenLibraryError: On API errors
+        """
+        # Cache by work key
+        cache_key = CacheService.work_key(work_key)
+
+        cached_data, needs_revalidation = await self.cache.get(cache_key)
+        if cached_data and not needs_revalidation:
+            logger.debug("work_cache_hit", cache_key=cache_key)
+            return WorkDetails.from_dict(cached_data)
+
+        # Fetch from API
+        logger.debug("work_cache_miss", cache_key=cache_key)
+        work = await self._fetch_work(work_key)
+
+        # Cache the response
+        await self.cache.set(cache_key, work.to_dict())
+
+        return work
+
+    async def get_all_isbns_for_work(self, work_key: str) -> list[str]:
+        """Get all ISBNs associated with a work.
+ + Args: + work_key: OpenLibrary work key + + Returns: + List of ISBNs (both ISBN-10 and ISBN-13) + """ + work = await self.get_work_details(work_key) + return work.isbn_list + + # ------------------------------------------------------------------------- + # Private Methods - API Fetching + # ------------------------------------------------------------------------- + + async def _fetch_search( + self, + *, + title: str, + author: str | None = None, + publisher: str | None = None, + language: str | None = None, + offset: int = 0, + ) -> SearchResponse: + """Fetch search results from OpenLibrary API.""" + client = await self._get_client() + + params: dict[str, Any] = { + "title": title, + "limit": self._settings.openlibrary_page_size, + "offset": offset, + "fields": "key,title,author_name,author_key,first_publish_year," + "edition_count,cover_i,isbn,language,publisher,subject", + } + + if author: + params["author"] = author + if publisher: + params["publisher"] = publisher + if language: + params["language"] = language + + try: + response = await client.get("/search.json", params=params) + response.raise_for_status() + except httpx.HTTPStatusError as e: + if e.response.status_code == 429: + raise OpenLibraryRateLimitError("Rate limit exceeded") from e + logger.error( + "openlibrary_search_failed", + status_code=e.response.status_code, + title=title, + ) + raise OpenLibraryError( + f"API request failed: {e.response.status_code}" + ) from e + except httpx.RequestError as e: + logger.error("openlibrary_request_error", error=str(e), title=title) + raise OpenLibraryError(f"Request failed: {e}") from e + + data = response.json() + return self._parse_search_response(data) + + async def _fetch_work(self, work_key: str) -> WorkDetails: + """Fetch work details from OpenLibrary API.""" + client = await self._get_client() + + # Normalize work key (remove leading slash if present) + work_id = work_key.lstrip("/") + + try: + response = await client.get(f"/{work_id}.json") + 
response.raise_for_status() + except httpx.HTTPStatusError as e: + logger.error( + "openlibrary_work_fetch_failed", + status_code=e.response.status_code, + work_key=work_key, + ) + raise OpenLibraryError( + f"API request failed: {e.response.status_code}" + ) from e + except httpx.RequestError as e: + logger.error("openlibrary_request_error", error=str(e), work_key=work_key) + raise OpenLibraryError(f"Request failed: {e}") from e + + data = response.json() + return self._parse_work_response(data, work_key) + + # ------------------------------------------------------------------------- + # Private Methods - Response Parsing (Anti-Corruption Layer) + # ------------------------------------------------------------------------- + + def _parse_search_response(self, data: dict[str, Any]) -> SearchResponse: + """Parse OpenLibrary search response to canonical format.""" + results = [] + + for doc in data.get("docs", []): + # Build cover URL + cover_id = doc.get("cover_i") + cover_url = ( + f"{self.COVER_BASE_URL}/id/{cover_id}-M.jpg" if cover_id else None + ) + + result = BookSearchResult( + title=doc.get("title", "Unknown"), + authors=doc.get("author_name", []), + first_publish_year=doc.get("first_publish_year"), + cover_url=cover_url, + isbn_list=doc.get("isbn", [])[:20], # Limit ISBNs + edition_count=doc.get("edition_count", 0), + subjects=doc.get("subject", [])[:10], # Limit subjects + external_work_key=doc.get("key", ""), + ) + results.append(result) + + return SearchResponse( + results=results, + total_found=data.get("numFound", 0), + offset=data.get("start", 0), + limit=self._settings.openlibrary_page_size, + ) + + def _parse_work_response(self, data: dict[str, Any], work_key: str) -> WorkDetails: + """Parse OpenLibrary work response to canonical format.""" + # Handle description (can be string or dict) + description = data.get("description") + if isinstance(description, dict): + description = description.get("value", "") + + # Extract cover from covers array + covers 
= data.get("covers", []) + cover_url = f"{self.COVER_BASE_URL}/id/{covers[0]}-M.jpg" if covers else None + + # Get subjects + subjects = data.get("subjects", []) + if isinstance(subjects, list) and subjects and isinstance(subjects[0], dict): + subjects = [s.get("name", "") for s in subjects] + + return WorkDetails( + title=data.get("title", "Unknown"), + authors=[], # Need separate author fetch - simplified for now + description=description, + subjects=subjects[:20] if subjects else [], + first_publish_year=None, # Need editions data + cover_url=cover_url, + edition_count=0, # Need separate query + isbn_list=[], # Need editions query + external_work_key=work_key, + ) + + def _parse_cached_search(self, data: dict[str, Any]) -> SearchResponse: + """Parse cached search data back to SearchResponse.""" + return SearchResponse( + results=[BookSearchResult.from_dict(r) for r in data.get("results", [])], + total_found=data.get("total_found", 0), + offset=data.get("offset", 0), + limit=data.get("limit", self._settings.openlibrary_page_size), + ) + + def _serialize_search(self, response: SearchResponse) -> dict[str, Any]: + """Serialize SearchResponse for caching.""" + return { + "results": [r.to_dict() for r in response.results], + "total_found": response.total_found, + "offset": response.offset, + "limit": response.limit, + } + + +# ----------------------------------------------------------------------------- +# FastAPI Dependency Injection +# ----------------------------------------------------------------------------- + +_openlibrary_service: OpenLibraryService | None = None + + +def set_openlibrary_service(service: OpenLibraryService) -> None: + """Set the global OpenLibrary service during app startup.""" + global _openlibrary_service + _openlibrary_service = service + + +def get_openlibrary_service() -> OpenLibraryService: + """FastAPI dependency for OpenLibraryService.""" + if _openlibrary_service is None: + raise RuntimeError( + "OpenLibrary service not initialized. 
Call set_openlibrary_service first." + ) + return _openlibrary_service diff --git a/backend/src/bookbytes/storage/__init__.py b/backend/src/bookbytes/storage/__init__.py new file mode 100644 index 0000000..32d4afd --- /dev/null +++ b/backend/src/bookbytes/storage/__init__.py @@ -0,0 +1 @@ +"""Storage backend package for BookBytes.""" diff --git a/backend/src/bookbytes/workers/__init__.py b/backend/src/bookbytes/workers/__init__.py new file mode 100644 index 0000000..c79306d --- /dev/null +++ b/backend/src/bookbytes/workers/__init__.py @@ -0,0 +1 @@ +"""Background worker package for BookBytes.""" diff --git a/backend/tests/__init__.py b/backend/tests/__init__.py new file mode 100644 index 0000000..94ee0c2 --- /dev/null +++ b/backend/tests/__init__.py @@ -0,0 +1 @@ +"""Tests package for BookBytes.""" diff --git a/backend/tests/conftest.py b/backend/tests/conftest.py new file mode 100644 index 0000000..ce344f7 --- /dev/null +++ b/backend/tests/conftest.py @@ -0,0 +1,262 @@ +"""Pytest configuration and fixtures for BookBytes tests. + +This module provides reusable fixtures for: +- Async test client +- Test database session (in-memory SQLite) +- Mocked external services +- Settings overrides +- Authentication helpers +""" + +from collections.abc import AsyncGenerator +from typing import Any +from unittest.mock import AsyncMock, MagicMock + +import pytest +from fastapi import FastAPI +from httpx import ASGITransport, AsyncClient + +from bookbytes.config import Settings +from bookbytes.main import create_app + + +# ============================================================================= +# Settings Fixtures +# ============================================================================= + + +@pytest.fixture +def test_settings() -> Settings: + """Create test-specific settings. + + Overrides production settings with test-appropriate values. 
+    """
+    return Settings(
+        app_env="development",  # type: ignore[arg-type]
+        debug=True,
+        log_level="DEBUG",  # type: ignore[arg-type]
+        log_format="console",  # type: ignore[arg-type]
+        database_url="sqlite+aiosqlite:///:memory:",
+        redis_url="redis://localhost:6379/15",  # Use DB 15 for tests
+        storage_backend="local",  # type: ignore[arg-type]
+        local_storage_path="/tmp/bookbytes-test-audio",
+        auth_mode="api_key",  # type: ignore[arg-type]
+        api_key="test-api-key",  # type: ignore[arg-type]
+        jwt_secret_key="test-jwt-secret-key",  # type: ignore[arg-type]
+        openai_api_key="sk-test-key",  # type: ignore[arg-type]
+    )
+
+
+# =============================================================================
+# Application Fixtures
+# =============================================================================
+
+
+@pytest.fixture
+def app(test_settings: Settings) -> FastAPI:
+    """Create a test FastAPI application with test settings."""
+    return create_app(settings=test_settings)
+
+
+@pytest.fixture
+async def async_client(
+    app: FastAPI, test_settings: Settings
+) -> AsyncGenerator[AsyncClient, None]:
+    """Create an async HTTP client for testing.
+
+    This client makes requests to the test app without starting a server.
+    ASGITransport does not run the app lifespan, so database and logging
+    resources are initialized manually in the fixture body.
+
+    Usage:
+        async def test_endpoint(async_client: AsyncClient):
+            response = await async_client.get("/health/live")
+            assert response.status_code == 200
+    """
+    from bookbytes.core.database import close_db, init_db
+    from bookbytes.core.logging import configure_logging
+
+    # Initialize resources that lifespan would normally handle
+    configure_logging(test_settings)
+    await init_db(test_settings)
+
+    async with AsyncClient(
+        transport=ASGITransport(app=app),
+        base_url="http://test",
+    ) as client:
+        yield client
+
+    # Cleanup
+    await close_db()
+
+
+@pytest.fixture
+async def authenticated_client(
+    app: FastAPI, test_settings: Settings
+) -> AsyncGenerator[AsyncClient, None]:
+    """Create an authenticated async client using API key.
+
+    This client automatically includes the X-API-Key header.
+
+    Usage:
+        async def test_protected_endpoint(authenticated_client: AsyncClient):
+            response = await authenticated_client.get("/api/v1/books")
+            assert response.status_code == 200
+    """
+    async with AsyncClient(
+        transport=ASGITransport(app=app),
+        base_url="http://test",
+        headers={"X-API-Key": test_settings.api_key.get_secret_value()},
+    ) as client:
+        yield client
+
+
+# =============================================================================
+# Database Fixtures
+# =============================================================================
+# TODO: Implement in Phase 2 when database layer is ready
+
+
+@pytest.fixture
+async def test_db_session() -> AsyncGenerator[Any, None]:
+    """Create an isolated test database session.
+
+    This fixture will be implemented in Phase 2 when the database layer is ready.
+ It will: + - Create in-memory SQLite database + - Run migrations + - Yield a session + - Rollback after test + + Usage: + async def test_create_book(test_db_session: AsyncSession): + book = Book(title="Test", author="Author") + test_db_session.add(book) + await test_db_session.commit() + """ + # Placeholder - will be implemented in Phase 2 + yield None + + +# ============================================================================= +# Mock Service Fixtures +# ============================================================================= + + +@pytest.fixture +def mock_openai_service() -> MagicMock: + """Create a mock OpenAI service. + + Use this to avoid making real API calls in tests. + + Usage: + async def test_chapter_extraction(mock_openai_service: MagicMock): + mock_openai_service.extract_chapters.return_value = [ + {"number": 1, "title": "Introduction"}, + ] + """ + mock = MagicMock() + mock.extract_chapters = AsyncMock( + return_value=[ + {"number": 1, "title": "Introduction"}, + {"number": 2, "title": "Getting Started"}, + ] + ) + mock.generate_summary = AsyncMock( + return_value="This chapter introduces the main concepts." + ) + return mock + + +@pytest.fixture +def mock_tts_service() -> MagicMock: + """Create a mock TTS service. + + Use this to avoid generating real audio in tests. + + Usage: + async def test_audio_generation(mock_tts_service: MagicMock): + mock_tts_service.generate_audio.return_value = "/path/to/audio.mp3" + """ + mock = MagicMock() + mock.generate_audio = AsyncMock(return_value="/tmp/test-audio.mp3") + return mock + + +@pytest.fixture +def mock_metadata_service() -> MagicMock: + """Create a mock book metadata service. + + Use this to avoid making real Open Library API calls. 
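+
+    Usage:
+        async def test_metadata(mock_metadata_service: MagicMock):
+            data = await mock_metadata_service.fetch_by_isbn("9780134685991")
+            assert data["title"] == "Test Book"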
+ """ + mock = MagicMock() + mock.fetch_by_isbn = AsyncMock( + return_value={ + "title": "Test Book", + "author": "Test Author", + "publisher": "Test Publisher", + "publish_date": "2024", + "pages": 200, + "cover_url": "https://example.com/cover.jpg", + } + ) + return mock + + +@pytest.fixture +def mock_storage() -> MagicMock: + """Create a mock storage backend. + + Use this to avoid file system or S3 operations in tests. + """ + mock = MagicMock() + mock.save = AsyncMock(return_value="test-audio-key") + mock.get_url = AsyncMock(return_value="https://storage.example.com/test-audio.mp3") + mock.delete = AsyncMock(return_value=True) + mock.exists = AsyncMock(return_value=True) + return mock + + +# ============================================================================= +# Helper Fixtures +# ============================================================================= + + +@pytest.fixture +def sample_isbn() -> str: + """Return a sample ISBN for testing.""" + return "9780134685991" + + +@pytest.fixture +def sample_book_data() -> dict[str, Any]: + """Return sample book data for testing.""" + return { + "title": "Effective Python", + "author": "Brett Slatkin", + "language": "en", + "publisher": "Addison-Wesley", + "pages": 480, + "publish_date": "2019", + "cover_url": "https://covers.openlibrary.org/b/isbn/9780134685991-L.jpg", + } + + +@pytest.fixture +def sample_chapter_data() -> list[dict[str, Any]]: + """Return sample chapter data for testing.""" + return [ + { + "chapter_number": 1, + "title": "Pythonic Thinking", + "summary": "This chapter covers Python idioms and best practices.", + "word_count": 150, + }, + { + "chapter_number": 2, + "title": "Lists and Dictionaries", + "summary": "This chapter explores Python's built-in data structures.", + "word_count": 180, + }, + ] diff --git a/backend/tests/integration/__init__.py b/backend/tests/integration/__init__.py new file mode 100644 index 0000000..c66cd71 --- /dev/null +++ b/backend/tests/integration/__init__.py 
@@ -0,0 +1 @@
+"""Integration tests package."""
diff --git a/backend/tests/integration/test_database.py b/backend/tests/integration/test_database.py
new file mode 100644
index 0000000..a531596
--- /dev/null
+++ b/backend/tests/integration/test_database.py
@@ -0,0 +1,66 @@
+"""Tests for database infrastructure."""
+
+import pytest
+from httpx import AsyncClient
+
+
+@pytest.mark.asyncio
+async def test_health_ready_checks_database(async_client: AsyncClient) -> None:
+    """Test that /health/ready checks database connectivity."""
+    response = await async_client.get("/health/ready")
+
+    assert response.status_code == 200
+    data = response.json()
+    assert "status" in data
+    assert "checks" in data
+    assert "database" in data["checks"]
+
+
+@pytest.mark.asyncio
+async def test_database_session_lifecycle() -> None:
+    """Test that database session can be created and closed."""
+    from bookbytes.config import Settings
+    from bookbytes.core.database import close_db, get_async_session, init_db
+
+    # Create test settings with SQLite
+    settings = Settings(
+        app_env="development",  # type: ignore[arg-type]
+        database_url="sqlite+aiosqlite:///:memory:",
+        redis_url="redis://localhost:6379/15",
+        storage_backend="local",  # type: ignore[arg-type]
+        local_storage_path="/tmp/test",
+        auth_mode="api_key",  # type: ignore[arg-type]
+        api_key="test-key",  # type: ignore[arg-type]
+        jwt_secret_key="test-secret",  # type: ignore[arg-type]
+        openai_api_key="sk-test",  # type: ignore[arg-type]
+    )
+
+    # Initialize database
+    await init_db(settings)
+
+    # Get a session and verify it works
+    from sqlalchemy import text
+
+    async for session in get_async_session():
+        assert session is not None
+        # Simple query to verify connection
+        result = await session.execute(text("SELECT 1"))
+        value = result.scalar()
+        assert value == 1
+
+    # Close database
+    await close_db()
+
+
+@pytest.mark.asyncio
+async def test_base_model_has_expected_mixins() -> None:
+    """Test that Base and mixins are properly defined."""
+    from bookbytes.models.base import Base, TimestampMixin, UUIDPrimaryKeyMixin
+
+    # Verify Base exists
+    assert Base is not None
+
+    # Verify mixins have expected attributes
+    assert hasattr(UUIDPrimaryKeyMixin, "id")
+    assert hasattr(TimestampMixin, "created_at")
+    assert hasattr(TimestampMixin, "updated_at")
diff --git a/backend/tests/integration/test_health.py b/backend/tests/integration/test_health.py
new file mode 100644
index 0000000..9dfb2ec
--- /dev/null
+++ b/backend/tests/integration/test_health.py
@@ -0,0 +1,59 @@
+"""Tests for health check endpoints."""
+
+import pytest
+from httpx import AsyncClient
+
+
+@pytest.mark.asyncio
+async def test_liveness_probe(async_client: AsyncClient) -> None:
+    """Test that liveness probe returns OK."""
+    response = await async_client.get("/health/live")
+
+    assert response.status_code == 200
+    data = response.json()
+    assert data["status"] == "ok"
+
+
+@pytest.mark.asyncio
+async def test_readiness_probe(async_client: AsyncClient) -> None:
+    """Test that readiness probe returns OK with checks."""
+    response = await async_client.get("/health/ready")
+
+    assert response.status_code == 200
+    data = response.json()
+    assert data["status"] == "ok"
+    assert "checks" in data
+
+
+@pytest.mark.asyncio
+async def test_root_endpoint(async_client: AsyncClient) -> None:
+    """Test that root endpoint returns service info."""
+    response = await async_client.get("/")
+
+    assert response.status_code == 200
+    data = response.json()
+    assert data["service"] == "BookBytes"
+    assert "version" in data
+    assert data["docs"] == "/docs"
+    assert data["health"] == "/health/live"
+
+
+@pytest.mark.asyncio
+async def test_request_id_header(async_client: AsyncClient) -> None:
+    """Test that response includes X-Request-ID header."""
+    response = await async_client.get("/health/live")
+
+    assert response.status_code == 200
+    assert "X-Request-ID" in response.headers
+
+
+@pytest.mark.asyncio
+async def test_custom_request_id(async_client: AsyncClient) -> None:
+    """Test that custom X-Request-ID is echoed back."""
+    custom_id = "test-request-id-12345"
+    response = await async_client.get(
+        "/health/live", headers={"X-Request-ID": custom_id}
+    )
+
+    assert response.status_code == 200
+    assert response.headers["X-Request-ID"] == custom_id
diff --git a/backend/tests/integration/test_library_service.py b/backend/tests/integration/test_library_service.py
new file mode 100644
index 0000000..2d9b97b
--- /dev/null
+++ b/backend/tests/integration/test_library_service.py
@@ -0,0 +1,332 @@
+"""Integration tests for LibraryService.
+
+These tests verify the LibraryService logic with mocked repositories.
+For true database integration tests, the test_db_session fixture needs
+to be properly implemented (marked as TODO in conftest.py).
+
+Run these tests with:
+    pytest tests/integration/test_library_service.py -v
+"""
+
+from unittest.mock import AsyncMock, MagicMock
+from uuid import UUID
+
+import pytest
+
+from bookbytes.models.edition import Edition
+from bookbytes.models.work import Work
+from bookbytes.services.library import LibraryService
+from bookbytes.services.openlibrary import BookSearchResult, WorkDetails
+
+# =============================================================================
+# Fixtures
+# =============================================================================
+
+
+@pytest.fixture
+def mock_work_repo():
+    """Create a mock WorkRepository with realistic behavior."""
+    repo = MagicMock()
+    repo.get = AsyncMock()
+    repo.create = AsyncMock()
+    return repo
+
+
+@pytest.fixture
+def mock_edition_repo():
+    """Create a mock EditionRepository with realistic behavior."""
+    repo = MagicMock()
+    repo.get_by_isbn = AsyncMock(return_value=None)
+    repo.get_latest_by_work = AsyncMock(return_value=None)
+    repo.isbn_exists = AsyncMock(return_value=False)
+    repo.create = AsyncMock()
+    return repo
+
+
+@pytest.fixture
+def mock_provider_repo():
+    """Create a mock BookProviderRepository with realistic behavior."""
+    repo = MagicMock()
+    repo.get_by_provider_key = AsyncMock(return_value=None)
+    repo.create_work_mapping = AsyncMock()
+    repo.create_edition_mapping = AsyncMock()
+    repo.get_for_edition = AsyncMock(return_value=[])
+    return repo
+
+
+@pytest.fixture
+def library_service(mock_work_repo, mock_edition_repo, mock_provider_repo):
+    """Create LibraryService with mocked repositories."""
+    return LibraryService(
+        work_repo=mock_work_repo,
+        edition_repo=mock_edition_repo,
+        provider_repo=mock_provider_repo,
+    )
+
+
+@pytest.fixture
+def sample_search_result() -> BookSearchResult:
+    """Create a sample BookSearchResult."""
+    return BookSearchResult(
+        title="The Lord of the Rings",
+        authors=["J. R. R. Tolkien"],
+        first_publish_year=1954,
+        cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg",
+        isbn_list=["9780618640157"],
+        edition_count=120,
+        subjects=["Fantasy", "Epic"],
+        external_work_key="/works/OL27448W",
+    )
+
+
+@pytest.fixture
+def sample_work_details() -> WorkDetails:
+    """Create sample work details."""
+    return WorkDetails(
+        title="1984",
+        authors=["George Orwell"],
+        description="A dystopian novel.",
+        subjects=["Dystopia", "Politics"],
+        first_publish_year=1949,
+        cover_url=None,
+        edition_count=200,
+        isbn_list=["9780451524935"],
+        external_work_key="/works/OL1168083W",
+    )
+
+
+@pytest.fixture
+def sample_work() -> Work:
+    """Create a sample Work instance."""
+    work = Work(
+        title="The Lord of the Rings",
+        authors=["J. R. R. Tolkien"],
+        subjects=["Fantasy"],
+        first_publish_year=1954,
+    )
+    work.id = UUID("01234567-89ab-cdef-0123-456789abcdef")
+    return work
+
+
+@pytest.fixture
+def sample_edition(sample_work: Work) -> Edition:
+    """Create a sample Edition instance."""
+    edition = Edition(
+        work_id=sample_work.id,
+        isbn="9780618640157",
+        isbn_type="isbn13",
+        title="The Lord of the Rings",
+        language="eng",
+    )
+    edition.id = UUID("fedcba98-7654-3210-fedc-ba9876543210")
+    return edition
+
+
+# =============================================================================
+# Work Persistence Tests
+# =============================================================================
+
+
+class TestLibraryServiceWorkPersistence:
+    """Integration tests for Work persistence."""
+
+    @pytest.mark.asyncio
+    async def test_create_work_creates_provider_mapping(
+        self,
+        library_service: LibraryService,
+        mock_work_repo: MagicMock,
+        mock_provider_repo: MagicMock,
+        sample_search_result: BookSearchResult,
+        sample_work: Work,
+    ) -> None:
+        """Test that provider mapping is created when work is created."""
+        mock_work_repo.create.return_value = sample_work
+
+        await library_service.get_or_create_work(sample_search_result)
+
+        # Verify provider mapping was created
+        mock_provider_repo.create_work_mapping.assert_called_once()
+        call_kwargs = mock_provider_repo.create_work_mapping.call_args.kwargs
+        assert call_kwargs["provider"] == "openlibrary"
+        assert call_kwargs["external_key"] == "/works/OL27448W"
+
+    @pytest.mark.asyncio
+    async def test_get_or_create_work_returns_existing(
+        self,
+        library_service: LibraryService,
+        mock_work_repo: MagicMock,
+        mock_provider_repo: MagicMock,
+        sample_search_result: BookSearchResult,
+        sample_work: Work,
+    ) -> None:
+        """Test that existing work is returned, not duplicated."""
+        # Setup: work exists via provider mapping
+        mock_mapping = MagicMock()
+        mock_mapping.work_id = sample_work.id
+        mock_provider_repo.get_by_provider_key.return_value = mock_mapping
+        mock_work_repo.get.return_value = sample_work
+
+        result = await library_service.get_or_create_work(sample_search_result)
+
+        assert result == sample_work
+        mock_work_repo.create.assert_not_called()
+
+    @pytest.mark.asyncio
+    async def test_create_work_from_details(
+        self,
+        library_service: LibraryService,
+        mock_work_repo: MagicMock,
+        sample_work_details: WorkDetails,
+        sample_work: Work,
+    ) -> None:
+        """Test creating work from WorkDetails."""
+        mock_work_repo.create.return_value = sample_work
+
+        result = await library_service.get_or_create_work_from_details(
+            sample_work_details
+        )
+
+        assert result.id is not None
+        mock_work_repo.create.assert_called_once()
+
+
+# =============================================================================
+# Edition Persistence Tests
+# =============================================================================
+
+
+class TestLibraryServiceEditionPersistence:
+    """Integration tests for Edition persistence."""
+
+    @pytest.mark.asyncio
+    async def test_store_edition_creates_new(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+        sample_work: Work,
+        sample_edition: Edition,
+    ) -> None:
+        """Test that store_edition creates a new edition."""
+        mock_edition_repo.get_by_isbn.return_value = None
+        mock_edition_repo.create.return_value = sample_edition
+
+        result = await library_service.store_edition(
+            work=sample_work,
+            isbn="9780618640157",
+            title="50th Anniversary Edition",
+            publisher="Houghton Mifflin",
+            publish_year=2004,
+        )
+
+        assert result == sample_edition
+        mock_edition_repo.create.assert_called_once()
+
+    @pytest.mark.asyncio
+    async def test_store_edition_normalizes_isbn(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+        sample_work: Work,
+        sample_edition: Edition,
+    ) -> None:
+        """Test that ISBN is normalized before storing."""
+        mock_edition_repo.get_by_isbn.return_value = None
+        mock_edition_repo.create.return_value = sample_edition
+
+        await library_service.store_edition(
+            work=sample_work,
+            isbn="978-0-618-64015-7",
+        )
+
+        # Check that get_by_isbn was called with normalized ISBN
+        mock_edition_repo.get_by_isbn.assert_called_with("9780618640157")
+
+    @pytest.mark.asyncio
+    async def test_store_edition_returns_existing(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+        sample_work: Work,
+        sample_edition: Edition,
+    ) -> None:
+        """Test that existing edition is returned, not duplicated."""
+        mock_edition_repo.get_by_isbn.return_value = sample_edition
+
+        result = await library_service.store_edition(
+            work=sample_work, isbn="9780618640157"
+        )
+
+        assert result == sample_edition
+        mock_edition_repo.create.assert_not_called()
+
+    @pytest.mark.asyncio
+    async def test_find_by_isbn_normalizes_input(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+    ) -> None:
+        """Test that ISBN is normalized in find_by_isbn."""
+        await library_service.find_by_isbn("978-0-618-64015-7")
+
+        mock_edition_repo.get_by_isbn.assert_called_with("9780618640157")
+
+    @pytest.mark.asyncio
+    async def test_isbn_exists_returns_correct_value(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+    ) -> None:
+        """Test isbn_exists returns correct boolean."""
+        mock_edition_repo.isbn_exists.return_value = True
+
+        result = await library_service.isbn_exists("9780618640157")
+
+        assert result is True
+
+
+# =============================================================================
+# Provider Mapping Tests
+# =============================================================================
+
+
+class TestLibraryServiceProviderMapping:
+    """Integration tests for provider mappings."""
+
+    @pytest.mark.asyncio
+    async def test_edition_provider_mapping_created(
+        self,
+        library_service: LibraryService,
+        mock_edition_repo: MagicMock,
+        mock_provider_repo: MagicMock,
+        sample_work: Work,
+        sample_edition: Edition,
+    ) -> None:
+        """Test that edition can have provider mapping."""
+        mock_edition_repo.get_by_isbn.return_value = None
+        mock_edition_repo.create.return_value = sample_edition
+
+        await library_service.store_edition(
+            work=sample_work,
+            isbn="9780618640157",
+            external_key="/books/OL12345M",
+        )
+
+        mock_provider_repo.create_edition_mapping.assert_called_once()
+        call_kwargs = mock_provider_repo.create_edition_mapping.call_args.kwargs
+        assert call_kwargs["external_key"] == "/books/OL12345M"
+
+    @pytest.mark.asyncio
+    async def test_find_work_by_nonexistent_provider(
+        self,
+        library_service: LibraryService,
+        mock_provider_repo: MagicMock,
+    ) -> None:
+        """Test finding work by provider that doesn't exist."""
+        mock_provider_repo.get_by_provider_key.return_value = None
+
+        result = await library_service.find_work_by_provider(
+            provider="openlibrary",
+            external_key="/works/NONEXISTENT",
+        )
+
+        assert result is None
diff --git a/backend/tests/integration/test_openlibrary_api.py b/backend/tests/integration/test_openlibrary_api.py
new file mode 100644
index 0000000..77c4d05
--- /dev/null
+++ b/backend/tests/integration/test_openlibrary_api.py
@@ -0,0 +1,295 @@
+"""Integration tests for OpenLibraryService against real API.
+
+These tests hit the actual OpenLibrary API to validate:
+- API contract hasn't changed
+- Response parsing works with real data
+- Error handling for real network conditions
+
+Run these tests with:
+    pytest -m integration
+    pytest -m external
+
+Skip these in CI with:
+    pytest -m "not integration"
+"""
+
+import pytest
+
+from bookbytes.services.cache import CacheService
+from bookbytes.services.openlibrary import (
+    OpenLibraryError,
+    OpenLibraryService,
+)
+
+# =============================================================================
+# Fixtures
+# =============================================================================
+
+
+@pytest.fixture
+def mock_cache():
+    """Create a minimal mock cache that always misses.
+
+    For integration tests, we want to hit the real API.
+    """
+    from unittest.mock import AsyncMock, MagicMock
+
+    cache = MagicMock(spec=CacheService)
+    cache.get = AsyncMock(return_value=(None, False))  # Always miss
+    cache.set = AsyncMock(return_value=None)
+    return cache
+
+
+@pytest.fixture
+def openlibrary_service(mock_cache) -> OpenLibraryService:
+    """Create OpenLibraryService with mock cache for real API testing."""
+    return OpenLibraryService(mock_cache)
+
+
+async def check_openlibrary_reachable() -> bool:
+    """Check if OpenLibrary API is reachable."""
+    import httpx
+
+    try:
+        async with httpx.AsyncClient(timeout=10) as client:
+            response = await client.head("https://openlibrary.org")
+            return response.status_code < 500
+    except (httpx.RequestError, httpx.HTTPStatusError):
+        return False
+
+
+@pytest.fixture
+async def skip_if_no_network():
+    """Skip test if OpenLibrary API is unreachable."""
+    is_reachable = await check_openlibrary_reachable()
+    if not is_reachable:
+        pytest.skip("OpenLibrary API is unreachable (network issue)")
+    return True
+
+
+# =============================================================================
+# Search Integration Tests
+# =============================================================================
+
+
+@pytest.mark.integration
+@pytest.mark.external
+class TestOpenLibrarySearchIntegration:
+    """Integration tests for search_books against real API."""
+
+    @pytest.mark.asyncio
+    async def test_search_returns_results(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that search returns real results."""
+        result = await openlibrary_service.search_books(title="Lord of the Rings")
+
+        assert result.total_found > 0
+        assert len(result.results) > 0
+
+        # Verify first result has expected fields
+        first = result.results[0]
+        assert first.title  # Has a title
+        assert first.external_work_key  # Has work key
+        assert first.source_provider == "openlibrary"
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_search_tolkien_returns_correct_author(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that searching for Tolkien returns his works."""
+        result = await openlibrary_service.search_books(
+            title="Hobbit", author="Tolkien"
+        )
+
+        assert result.total_found > 0
+
+        # At least one result should have Tolkien as author
+        has_tolkien = any("Tolkien" in str(r.authors) for r in result.results)
+        assert has_tolkien, "Expected to find Tolkien in authors"
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_search_with_language_filter(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test search with language parameter."""
+        result = await openlibrary_service.search_books(
+            title="Don Quixote", language="spa"
+        )
+
+        # Should return results (Don Quixote exists in Spanish)
+        assert result.total_found >= 0  # May be 0 if API doesn't support filter well
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_search_nonexistent_book(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test search for a book that doesn't exist."""
+        result = await openlibrary_service.search_books(
+            title="xyznonexistentbook123456789"
+        )
+
+        assert result.total_found == 0
+        assert len(result.results) == 0
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_search_result_has_cover_url(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that popular books have cover URLs."""
+        result = await openlibrary_service.search_books(title="Harry Potter")
+
+        # At least one result should have a cover
+        has_cover = any(r.cover_url is not None for r in result.results)
+        assert has_cover, "Expected at least one result with cover"
+
+        await openlibrary_service.close()
+
+
+# =============================================================================
+# Work Details Integration Tests
+# =============================================================================
+
+
+@pytest.mark.integration
+@pytest.mark.external
+class TestOpenLibraryWorkDetailsIntegration:
+    """Integration tests for get_work_details against real API."""
+
+    @pytest.mark.asyncio
+    async def test_get_work_details_lotr(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test fetching work details for Lord of the Rings."""
+        # First search to get a valid work key
+        search = await openlibrary_service.search_books(title="Lord of the Rings")
+        assert len(search.results) > 0
+
+        work_key = search.results[0].external_work_key
+
+        # Now fetch work details
+        work = await openlibrary_service.get_work_details(work_key)
+
+        assert work.title  # Has title
+        assert work.external_work_key == work_key
+        assert work.source_provider == "openlibrary"
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_get_work_details_has_description(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that work details include description when available."""
+        # Use a well-known work that has description
+        search = await openlibrary_service.search_books(title="1984", author="Orwell")
+
+        if len(search.results) > 0:
+            work_key = search.results[0].external_work_key
+            work = await openlibrary_service.get_work_details(work_key)
+
+            # Description may or may not be present depending on the work
+            # Just verify it's a string or None
+            assert work.description is None or isinstance(work.description, str)
+
+        await openlibrary_service.close()
+
+
+# =============================================================================
+# ISBN Helper Integration Tests
+# =============================================================================
+
+
+@pytest.mark.integration
+@pytest.mark.external
+class TestOpenLibraryISBNIntegration:
+    """Integration tests for ISBN-related methods."""
+
+    @pytest.mark.asyncio
+    async def test_search_returns_isbns(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that search results include ISBNs."""
+        result = await openlibrary_service.search_books(title="Clean Code")
+
+        # At least one result should have ISBNs
+        has_isbns = any(len(r.isbn_list) > 0 for r in result.results)
+        assert has_isbns, "Expected at least one result with ISBNs"
+
+        await openlibrary_service.close()
+
+
+# =============================================================================
+# Error Handling Integration Tests
+# =============================================================================
+
+
+@pytest.mark.integration
+@pytest.mark.external
+class TestOpenLibraryErrorHandlingIntegration:
+    """Integration tests for error handling with real API."""
+
+    @pytest.mark.asyncio
+    async def test_invalid_work_key_handling(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test handling of invalid work key."""
+
+        with pytest.raises(OpenLibraryError):
+            await openlibrary_service.get_work_details("/works/INVALID12345")
+
+        await openlibrary_service.close()
+
+
+# =============================================================================
+# Cache Integration Tests
+# =============================================================================
+
+
+@pytest.mark.integration
+@pytest.mark.external
+class TestOpenLibraryCacheIntegration:
+    """Test cache behavior with real API responses."""
+
+    @pytest.mark.asyncio
+    async def test_cache_is_populated_after_search(
+        self, mock_cache, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that cache.set is called after successful search."""
+        await openlibrary_service.search_books(title="Python")
+
+        # Cache should have been set
+        mock_cache.set.assert_called_once()
+
+        # Verify the cached data structure
+        call_args = mock_cache.set.call_args
+        cache_key = call_args[0][0]
+        cache_data = call_args[0][1]
+
+        assert cache_key.startswith("search:")
+        assert "results" in cache_data
+        assert "total_found" in cache_data
+
+        await openlibrary_service.close()
+
+    @pytest.mark.asyncio
+    async def test_cache_key_is_deterministic(
+        self, openlibrary_service: OpenLibraryService
+    ) -> None:
+        """Test that same search params produce same cache key."""
+        key1 = CacheService.search_key(title="Python", author="Guido")
+        key2 = CacheService.search_key(title="Python", author="Guido")
+        key3 = CacheService.search_key(title="python", author="guido")
+
+        assert key1 == key2
+        assert key1 == key3  # Normalized
+
+        await openlibrary_service.close()
diff --git a/backend/tests/integration/test_search_endpoints.py b/backend/tests/integration/test_search_endpoints.py
new file mode 100644
index 0000000..dbfd94b
--- /dev/null
+++ b/backend/tests/integration/test_search_endpoints.py
@@ -0,0 +1,332 @@
+"""Integration tests for search API endpoints.
+
+These tests verify the endpoints work correctly with:
+- Real database (SQLite in-memory for tests)
+- Mocked OpenLibrary API (for predictable responses)
+
+Run these tests with:
+    pytest tests/integration/test_search_endpoints.py -v
+"""
+
+from unittest.mock import AsyncMock, MagicMock
+
+import pytest
+from httpx import AsyncClient
+
+from bookbytes.api.v1.search import get_openlibrary_service
+from bookbytes.services.cache import get_cache_service
+from bookbytes.services.openlibrary import BookSearchResult, SearchResponse, WorkDetails
+
+# =============================================================================
+# Fixtures
+# =============================================================================
+
+
+@pytest.fixture
+def mock_cache():
+    """Create mock cache service."""
+    cache = MagicMock()
+    cache.get = AsyncMock(return_value=(None, False))
+    cache.set = AsyncMock(return_value=None)
+    return cache
+
+
+@pytest.fixture
+def mock_openlibrary():
+    """Create mock OpenLibrary service."""
+    service = MagicMock()
+    service.search_books = AsyncMock()
+    service.get_work_details = AsyncMock()
+    service.close = AsyncMock()
+    return service
+
+
+@pytest.fixture
+def sample_search_result() -> BookSearchResult:
+    """Create sample search result."""
+    return BookSearchResult(
+        title="The Lord of the Rings",
+        authors=["J. R. R. Tolkien"],
+        first_publish_year=1954,
+        cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg",
+        isbn_list=["9780618640157", "0618640150"],
+        edition_count=120,
+        subjects=["Fantasy", "Epic"],
+        external_work_key="/works/OL27448W",
+    )
+
+
+@pytest.fixture
+def sample_work_details() -> WorkDetails:
+    """Create sample work details."""
+    return WorkDetails(
+        title="The Lord of the Rings",
+        authors=["J. R. R. Tolkien"],
+        description="An epic fantasy novel about the quest to destroy the One Ring.",
+        subjects=["Fantasy", "Adventure", "Epic"],
+        first_publish_year=1954,
+        cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg",
+        edition_count=120,
+        isbn_list=["9780618640157"],
+        external_work_key="/works/OL27448W",
+    )
+
+
+# =============================================================================
+# Search Endpoint Integration Tests
+# =============================================================================
+
+
+class TestSearchEndpointIntegration:
+    """Integration tests for POST /api/v1/books/search."""
+
+    @pytest.mark.asyncio
+    async def test_search_returns_200_with_results(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+        sample_search_result: BookSearchResult,
+    ) -> None:
+        """Test search endpoint returns 200 with results."""
+        mock_openlibrary.search_books.return_value = SearchResponse(
+            results=[sample_search_result],
+            total_found=1,
+            offset=0,
+            limit=100,
+        )
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.post(
+            "/api/v1/books/search",
+            json={"title": "Lord of the Rings"},
+        )
+
+        assert response.status_code == 200
+        data = response.json()
+        assert data["total_found"] == 1
+        assert len(data["results"]) == 1
+        assert data["results"][0]["title"] == "The Lord of the Rings"
+
+        app.dependency_overrides.clear()
+
+    @pytest.mark.asyncio
+    async def test_search_validates_request_body(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+    ) -> None:
+        """Test search validates request body."""
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        # Missing title
+        response = await async_client.post(
+            "/api/v1/books/search",
+            json={},
+        )
+
+        assert response.status_code == 422
+
+        app.dependency_overrides.clear()
+
+    @pytest.mark.asyncio
+    async def test_search_with_pagination(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+        sample_search_result: BookSearchResult,
+    ) -> None:
+        """Test search with pagination parameters."""
+        mock_openlibrary.search_books.return_value = SearchResponse(
+            results=[sample_search_result] * 30,
+            total_found=100,
+            offset=0,
+            limit=100,
+        )
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.post(
+            "/api/v1/books/search?page=2&page_size=10",
+            json={"title": "Test"},
+        )
+
+        assert response.status_code == 200
+        data = response.json()
+        assert data["page"] == 2
+        assert data["page_size"] == 10
+        assert data["has_more"] is True
+
+        app.dependency_overrides.clear()
+
+
+# =============================================================================
+# Work Details Endpoint Integration Tests
+# =============================================================================
+
+
+class TestWorkDetailsEndpointIntegration:
+    """Integration tests for GET /api/v1/books/works/{work_key}."""
+
+    @pytest.mark.asyncio
+    async def test_get_work_returns_200(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+        sample_work_details: WorkDetails,
+    ) -> None:
+        """Test get work details returns 200."""
+        mock_openlibrary.get_work_details.return_value = sample_work_details
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.get("/api/v1/books/works/works/OL27448W")
+
+        assert response.status_code == 200
+        data = response.json()
+        assert data["title"] == "The Lord of the Rings"
+        assert data["description"] is not None
+        assert any("Tolkien" in author for author in data["authors"])
+
+        app.dependency_overrides.clear()
+
+
+# =============================================================================
+# ISBN Lookup Endpoint Integration Tests
+# =============================================================================
+
+
+class TestISBNLookupEndpointIntegration:
+    """Integration tests for GET /api/v1/books/isbn/{isbn}."""
+
+    @pytest.mark.asyncio
+    async def test_isbn_lookup_returns_200(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+        sample_search_result: BookSearchResult,
+        sample_work_details: WorkDetails,
+    ) -> None:
+        """Test ISBN lookup returns 200 with book details."""
+        mock_openlibrary.search_books.return_value = SearchResponse(
+            results=[sample_search_result],
+            total_found=1,
+            offset=0,
+            limit=100,
+        )
+        mock_openlibrary.get_work_details.return_value = sample_work_details
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.get("/api/v1/books/isbn/9780618640157")
+
+        assert response.status_code == 200
+        data = response.json()
+        assert data["title"] == "The Lord of the Rings"
+
+        app.dependency_overrides.clear()
+
+    @pytest.mark.asyncio
+    async def test_isbn_lookup_not_found_returns_404(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+    ) -> None:
+        """Test ISBN not found returns 404."""
+        mock_openlibrary.search_books.return_value = SearchResponse(
+            results=[],
+            total_found=0,
+            offset=0,
+            limit=100,
+        )
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.get("/api/v1/books/isbn/0000000000000")
+
+        assert response.status_code == 404
+        data = response.json()
+        assert data["error"]["code"] == "BOOK_NOT_FOUND"
+
+        app.dependency_overrides.clear()
+
+    @pytest.mark.asyncio
+    async def test_isbn_with_dashes_is_handled(
+        self,
+        app,
+        async_client: AsyncClient,
+        mock_cache,
+        mock_openlibrary,
+        sample_search_result: BookSearchResult,
+        sample_work_details: WorkDetails,
+    ) -> None:
+        """Test ISBN with dashes is normalized and handled."""
+        mock_openlibrary.search_books.return_value = SearchResponse(
+            results=[sample_search_result],
+            total_found=1,
+            offset=0,
+            limit=100,
+        )
+        mock_openlibrary.get_work_details.return_value = sample_work_details
+
+        app.dependency_overrides[get_cache_service] = lambda: mock_cache
+        app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary
+
+        response = await async_client.get("/api/v1/books/isbn/978-0-618-64015-7")
+
+        assert response.status_code == 200
+
+        app.dependency_overrides.clear()
+
+
+# =============================================================================
+# Request Headers Integration Tests
+# =============================================================================
+
+
+class TestRequestHeadersIntegration:
+    """Integration tests for request headers."""
+
+    @pytest.mark.asyncio
+    async def test_request_id_header_returned(
+        self,
+        async_client: AsyncClient,
+    ) -> None:
+        """Test X-Request-ID header is returned in response."""
+        response = await async_client.get("/health/live")
+
+        assert "X-Request-ID" in response.headers
+        assert len(response.headers["X-Request-ID"]) > 0
+
+    @pytest.mark.asyncio
+    async def test_custom_request_id_preserved(
+        self,
+        async_client: AsyncClient,
+    ) -> None:
+        """Test custom X-Request-ID is preserved."""
+        custom_id = "my-custom-request-id-123"
+
+        response = await async_client.get(
+            "/health/live",
+            headers={"X-Request-ID": custom_id},
+        )
+
+        assert response.headers["X-Request-ID"] == custom_id
diff --git a/backend/tests/mocks/__init__.py b/backend/tests/mocks/__init__.py
new file mode 100644
index 0000000..5f4d4f4
--- /dev/null
+++ b/backend/tests/mocks/__init__.py
@@ -0,0 +1 @@
+"""Mock responses for external services."""
diff --git a/backend/tests/mocks/openai_responses.py b/backend/tests/mocks/openai_responses.py
new file mode 100644
index 0000000..f7ca788
--- /dev/null
+++ b/backend/tests/mocks/openai_responses.py
@@ -0,0 +1,129 @@
+"""Mock responses for OpenAI API calls.
+
+These mocks allow testing without making real OpenAI API calls.
+"""
+
+from typing import Any
+
+# =============================================================================
+# Chapter Extraction Responses
+# =============================================================================
+
+CHAPTER_EXTRACTION_RESPONSE: dict[str, Any] = {
+    "id": "chatcmpl-test123",
+    "object": "chat.completion",
+    "created": 1699000000,
+    "model": "gpt-4o-mini",
+    "choices": [
+        {
+            "index": 0,
+            "message": {
+                "role": "assistant",
+                "content": """[
+    {"number": 1, "title": "Introduction"},
+    {"number": 2, "title": "Getting Started"},
+    {"number": 3, "title": "Core Concepts"},
+    {"number": 4, "title": "Advanced Topics"},
+    {"number": 5, "title": "Best Practices"},
+    {"number": 6, "title": "Conclusion"}
+]""",
+            },
+            "finish_reason": "stop",
+        }
+    ],
+    "usage": {"prompt_tokens": 50, "completion_tokens": 100, "total_tokens": 150},
+}
+
+CHAPTER_EXTRACTION_PARSED: list[dict[str, Any]] = [
+    {"number": 1, "title": "Introduction"},
+    {"number": 2, "title": "Getting Started"},
+    {"number": 3, "title": "Core Concepts"},
+    {"number": 4, "title": "Advanced Topics"},
+    {"number": 5, "title": "Best Practices"},
+    {"number": 6, "title": "Conclusion"},
+]
+
+
+# =============================================================================
+# Summary Generation Responses
+# =============================================================================
+
+SUMMARY_GENERATION_RESPONSE: dict[str, Any] = {
+    "id": "chatcmpl-test456",
+    "object": "chat.completion",
+    "created": 1699000001,
+    "model": "gpt-4o-mini",
+    "choices": [
+        {
+            "index": 0,
+            "message": {
+                "role": "assistant",
+                "content": (
+                    "This chapter introduces the fundamental concepts of the book. "
+                    "The author explains the motivation behind writing this guide and "
+                    "outlines what readers can expect to learn. Key themes include "
+                    "practical examples, best practices, and real-world applications. "
+                    "The introduction sets the stage for deeper exploration in subsequent chapters."
+                ),
+            },
+            "finish_reason": "stop",
+        }
+    ],
+    "usage": {"prompt_tokens": 30, "completion_tokens": 80, "total_tokens": 110},
+}
+
+SAMPLE_SUMMARIES: dict[int, str] = {
+    1: (
+        "This chapter introduces the fundamental concepts of the book. "
+        "The author explains the motivation and outlines what readers will learn."
+    ),
+    2: (
+        "Getting Started covers the essential setup and configuration needed. "
+        "Readers learn how to prepare their environment and install dependencies."
+    ),
+    3: (
+        "Core Concepts dives deep into the main ideas that form the foundation. "
+        "Key patterns and principles are explained with practical examples."
+    ),
+    4: (
+        "Advanced Topics explores sophisticated techniques for experienced practitioners. "
+        "Complex scenarios and edge cases are addressed with detailed solutions."
+    ),
+    5: (
+        "Best Practices summarizes proven approaches and common pitfalls to avoid. "
+        "The chapter provides actionable guidance for production environments."
+    ),
+    6: (
+        "The Conclusion wraps up the key learnings and provides next steps. "
+        "Resources for further learning and community engagement are shared."
+    ),
+}
+
+
+# =============================================================================
+# Error Responses
+# =============================================================================
+
+RATE_LIMIT_ERROR: dict[str, Any] = {
+    "error": {
+        "message": "Rate limit exceeded. Please retry after 20 seconds.",
+        "type": "rate_limit_error",
+        "code": "rate_limit_exceeded",
+    }
+}
+
+INVALID_API_KEY_ERROR: dict[str, Any] = {
+    "error": {
+        "message": "Invalid API key provided.",
+        "type": "invalid_request_error",
+        "code": "invalid_api_key",
+    }
+}
+
+CONTEXT_LENGTH_ERROR: dict[str, Any] = {
+    "error": {
+        "message": "This model's maximum context length is 16385 tokens.",
+        "type": "invalid_request_error",
+        "code": "context_length_exceeded",
+    }
+}
diff --git a/backend/tests/mocks/openlibrary_responses.py b/backend/tests/mocks/openlibrary_responses.py
new file mode 100644
index 0000000..2c37a8d
--- /dev/null
+++ b/backend/tests/mocks/openlibrary_responses.py
@@ -0,0 +1,160 @@
+"""Mock responses for Open Library API calls.
+
+These mocks allow testing without making real Open Library API calls.
+""" + +from typing import Any + +# ============================================================================= +# Successful Book Lookups +# ============================================================================= + +# ISBN: 9780134685991 - Effective Python +EFFECTIVE_PYTHON_RESPONSE: dict[str, Any] = { + "ISBN:9780134685991": { + "url": "https://openlibrary.org/books/OL27258011M", + "key": "/books/OL27258011M", + "title": "Effective Python: 90 Specific Ways to Write Better Python", + "authors": [{"url": "/authors/OL7373539A", "name": "Brett Slatkin"}], + "publishers": [{"name": "Addison-Wesley Professional"}], + "publish_date": "2019", + "number_of_pages": 480, + "cover": { + "small": "https://covers.openlibrary.org/b/isbn/9780134685991-S.jpg", + "medium": "https://covers.openlibrary.org/b/isbn/9780134685991-M.jpg", + "large": "https://covers.openlibrary.org/b/isbn/9780134685991-L.jpg", + }, + "identifiers": { + "isbn_10": ["0134853989"], + "isbn_13": ["9780134685991"], + "openlibrary": ["OL27258011M"], + }, + "subjects": [ + {"name": "Python (Computer program language)"}, + {"name": "Computer programming"}, + ], + } +} + +# ISBN: 9780596517984 - The Ruby Programming Language +RUBY_BOOK_RESPONSE: dict[str, Any] = { + "ISBN:9780596517984": { + "url": "https://openlibrary.org/books/OL23177938M", + "key": "/books/OL23177938M", + "title": "The Ruby Programming Language", + "authors": [ + {"url": "/authors/OL2734036A", "name": "David Flanagan"}, + {"url": "/authors/OL2734037A", "name": "Yukihiro Matsumoto"}, + ], + "publishers": [{"name": "O'Reilly Media"}], + "publish_date": "2008", + "number_of_pages": 446, + "cover": { + "small": "https://covers.openlibrary.org/b/isbn/9780596517984-S.jpg", + "medium": "https://covers.openlibrary.org/b/isbn/9780596517984-M.jpg", + "large": "https://covers.openlibrary.org/b/isbn/9780596517984-L.jpg", + }, + "identifiers": { + "isbn_10": ["0596516177"], + "isbn_13": ["9780596517984"], + }, + } +} + + +# 
============================================================================= +# Parsed Book Metadata +# ============================================================================= + +EFFECTIVE_PYTHON_METADATA: dict[str, Any] = { + "title": "Effective Python: 90 Specific Ways to Write Better Python", + "author": "Brett Slatkin", + "publisher": "Addison-Wesley Professional", + "publish_date": "2019", + "pages": 480, + "cover_url": "https://covers.openlibrary.org/b/isbn/9780134685991-L.jpg", + "language": "en", + "isbns": [ + {"isbn": "0134853989", "type": "isbn10"}, + {"isbn": "9780134685991", "type": "isbn13"}, + ], +} + +RUBY_BOOK_METADATA: dict[str, Any] = { + "title": "The Ruby Programming Language", + "author": "David Flanagan, Yukihiro Matsumoto", + "publisher": "O'Reilly Media", + "publish_date": "2008", + "pages": 446, + "cover_url": "https://covers.openlibrary.org/b/isbn/9780596517984-L.jpg", + "language": "en", + "isbns": [ + {"isbn": "0596516177", "type": "isbn10"}, + {"isbn": "9780596517984", "type": "isbn13"}, + ], +} + + +# ============================================================================= +# Not Found Responses +# ============================================================================= + +NOT_FOUND_RESPONSE: dict[str, Any] = {} + +INVALID_ISBN_RESPONSE: dict[str, Any] = {} + + +# ============================================================================= +# Error Responses +# ============================================================================= + +SERVICE_UNAVAILABLE_RESPONSE = { + "error": "Service temporarily unavailable", + "status": 503, +} + +RATE_LIMITED_RESPONSE = { + "error": "Rate limit exceeded", + "status": 429, +} + + +# ============================================================================= +# Helper Functions +# ============================================================================= + + +def get_mock_response(isbn: str) -> dict[str, Any]: + """Get a mock response for a given ISBN. 
+ + Args: + isbn: The ISBN to look up + + Returns: + Mock API response or empty dict if not found + """ + responses = { + "9780134685991": EFFECTIVE_PYTHON_RESPONSE, + "0134853989": EFFECTIVE_PYTHON_RESPONSE, + "9780596517984": RUBY_BOOK_RESPONSE, + "0596516177": RUBY_BOOK_RESPONSE, + } + return responses.get(isbn, {}) + + +def get_mock_metadata(isbn: str) -> dict[str, Any] | None: + """Get parsed metadata for a given ISBN. + + Args: + isbn: The ISBN to look up + + Returns: + Parsed metadata dict or None if not found + """ + metadata = { + "9780134685991": EFFECTIVE_PYTHON_METADATA, + "0134853989": EFFECTIVE_PYTHON_METADATA, + "9780596517984": RUBY_BOOK_METADATA, + "0596516177": RUBY_BOOK_METADATA, + } + return metadata.get(isbn) diff --git a/backend/tests/unit/__init__.py b/backend/tests/unit/__init__.py new file mode 100644 index 0000000..ea3f8b9 --- /dev/null +++ b/backend/tests/unit/__init__.py @@ -0,0 +1 @@ +"""Unit tests package.""" diff --git a/backend/tests/unit/test_cache_service.py b/backend/tests/unit/test_cache_service.py new file mode 100644 index 0000000..529ff58 --- /dev/null +++ b/backend/tests/unit/test_cache_service.py @@ -0,0 +1,282 @@ +"""Tests for CacheService. + +Tests the Redis caching layer with TTL jitter, stale-while-revalidate, +and cache key generation. 
+""" + +import json +from unittest.mock import AsyncMock, MagicMock + +import pytest + +from bookbytes.services.cache import CacheService + +# ============================================================================= +# Fixtures +# ============================================================================= + + +@pytest.fixture +def mock_redis() -> MagicMock: + """Create a mock Redis client.""" + redis = MagicMock() + redis.get = AsyncMock(return_value=None) + redis.setex = AsyncMock(return_value=True) + redis.delete = AsyncMock(return_value=1) + redis.ttl = AsyncMock(return_value=3600) + redis.scan_iter = MagicMock(return_value=iter([])) + return redis + + +@pytest.fixture +def cache_service(mock_redis: MagicMock) -> CacheService: + """Create CacheService with mock Redis.""" + return CacheService(mock_redis) + + +# ============================================================================= +# Cache Key Generation Tests +# ============================================================================= + + +class TestCacheKeyGeneration: + """Tests for static cache key generation methods.""" + + def test_search_key_basic(self) -> None: + """Test basic search key generation.""" + key = CacheService.search_key(title="Lord of the Rings") + assert key.startswith("search:") + assert len(key) == len("search:") + 16 # 16-char hash + + def test_search_key_normalized(self) -> None: + """Test that search keys are normalized (lowercase, trimmed).""" + key1 = CacheService.search_key(title="Lord of the Rings") + key2 = CacheService.search_key(title=" LORD OF THE RINGS ") + assert key1 == key2 + + def test_search_key_with_author(self) -> None: + """Test search key with author.""" + key1 = CacheService.search_key(title="Lord of the Rings") + key2 = CacheService.search_key(title="Lord of the Rings", author="Tolkien") + assert key1 != key2 + + def test_search_key_deterministic(self) -> None: + """Test that same inputs always produce same key.""" + key1 = CacheService.search_key( 
+ title="Test", author="Author", publisher="Pub", language="eng" + ) + key2 = CacheService.search_key( + title="Test", author="Author", publisher="Pub", language="eng" + ) + assert key1 == key2 + + def test_isbn_key(self) -> None: + """Test ISBN key generation.""" + key = CacheService.isbn_key("9780618640157") + assert key == "isbn:9780618640157" + + def test_work_key(self) -> None: + """Test work key generation.""" + key = CacheService.work_key("/works/OL27448W") + assert key == "work:/works/OL27448W" + + +# ============================================================================= +# TTL Tests +# ============================================================================= + + +class TestTTLHandling: + """Tests for TTL jitter and original TTL detection.""" + + def test_jitter_ttl_within_range(self, cache_service: CacheService) -> None: + """Test that jittered TTL is within Β±10% of base.""" + base_ttl = 86400 # 24 hours + + # Run multiple times to test randomness + for _ in range(100): + jittered = cache_service._jitter_ttl(base_ttl) + # Should be within Β±10% + assert base_ttl * 0.89 <= jittered <= base_ttl * 1.11 + + def test_jitter_ttl_minimum(self, cache_service: CacheService) -> None: + """Test that jittered TTL is at least 1.""" + jittered = cache_service._jitter_ttl(1) + assert jittered >= 1 + + def test_get_original_ttl_search(self, cache_service: CacheService) -> None: + """Test TTL detection for search keys.""" + ttl = cache_service._get_original_ttl("search:abc123") + assert ttl == CacheService.TTL_SEARCH_RESULTS + + def test_get_original_ttl_work(self, cache_service: CacheService) -> None: + """Test TTL detection for work keys.""" + ttl = cache_service._get_original_ttl("work:/works/OL27448W") + assert ttl == CacheService.TTL_WORK_DETAILS + + def test_get_original_ttl_isbn(self, cache_service: CacheService) -> None: + """Test TTL detection for ISBN keys.""" + ttl = cache_service._get_original_ttl("isbn:9780618640157") + assert ttl == 
CacheService.TTL_ISBN_DETAILS + + def test_get_original_ttl_unknown_defaults( + self, cache_service: CacheService + ) -> None: + """Test that unknown keys default to search TTL.""" + ttl = cache_service._get_original_ttl("unknown:key") + assert ttl == CacheService.TTL_SEARCH_RESULTS + + +# ============================================================================= +# Cache Get Tests +# ============================================================================= + + +class TestCacheGet: + """Tests for cache get operation.""" + + @pytest.mark.asyncio + async def test_get_miss_returns_none( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test cache miss returns (None, False).""" + mock_redis.get.return_value = None + + data, needs_revalidation = await cache_service.get("search:abc123") + + assert data is None + assert needs_revalidation is False + mock_redis.get.assert_called_once_with("search:abc123") + + @pytest.mark.asyncio + async def test_get_hit_returns_data( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test cache hit returns parsed data.""" + cached_data = {"title": "Test Book", "authors": ["Author"]} + mock_redis.get.return_value = json.dumps(cached_data) + mock_redis.ttl.return_value = 50000 # Lots of TTL remaining + + data, needs_revalidation = await cache_service.get("search:abc123") + + assert data == cached_data + assert needs_revalidation is False + + @pytest.mark.asyncio + async def test_get_stale_needs_revalidation( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that near-expiry data triggers revalidation flag.""" + cached_data = {"title": "Test Book"} + mock_redis.get.return_value = json.dumps(cached_data) + # TTL is 10% of original (below 20% threshold) + mock_redis.ttl.return_value = int(CacheService.TTL_SEARCH_RESULTS * 0.1) + + data, needs_revalidation = await cache_service.get("search:abc123") + + assert data == cached_data + assert 
needs_revalidation is True + + @pytest.mark.asyncio + async def test_get_handles_redis_error( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that Redis errors return None without raising.""" + mock_redis.get.side_effect = Exception("Redis connection error") + + data, needs_revalidation = await cache_service.get("search:abc123") + + assert data is None + assert needs_revalidation is False + + +# ============================================================================= +# Cache Set Tests +# ============================================================================= + + +class TestCacheSet: + """Tests for cache set operation.""" + + @pytest.mark.asyncio + async def test_set_stores_data( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that set stores JSON-serialized data.""" + data = {"title": "Test Book", "authors": ["Author"]} + + await cache_service.set("search:abc123", data) + + mock_redis.setex.assert_called_once() + call_args = mock_redis.setex.call_args + assert call_args[0][0] == "search:abc123" + assert json.loads(call_args[0][2]) == data + + @pytest.mark.asyncio + async def test_set_with_custom_ttl( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that custom TTL is applied with jitter.""" + await cache_service.set("search:abc123", {"data": "test"}, base_ttl=3600) + + call_args = mock_redis.setex.call_args + ttl_used = call_args[0][1] + # Should be within Β±10% of 3600 + assert 3600 * 0.89 <= ttl_used <= 3600 * 1.11 + + @pytest.mark.asyncio + async def test_set_handles_redis_error( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that Redis errors don't raise.""" + mock_redis.setex.side_effect = Exception("Redis error") + + # Should not raise + await cache_service.set("search:abc123", {"data": "test"}) + + +# ============================================================================= +# Cache Invalidation Tests +# 
============================================================================= + + +class TestCacheInvalidation: + """Tests for cache invalidation.""" + + @pytest.mark.asyncio + async def test_invalidate_single_key( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test single key invalidation.""" + await cache_service.invalidate("search:abc123") + + mock_redis.delete.assert_called_once_with("search:abc123") + + @pytest.mark.asyncio + async def test_invalidate_pattern( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test pattern-based invalidation.""" + # Mock scan_iter to return some keys + mock_keys = [b"search:key1", b"search:key2", b"search:key3"] + + async def mock_scan_iter(match: str): + for key in mock_keys: + yield key + + mock_redis.scan_iter = mock_scan_iter + + count = await cache_service.invalidate_pattern("search:*") + + assert count == 3 + assert mock_redis.delete.call_count == 3 + + @pytest.mark.asyncio + async def test_invalidate_handles_error( + self, cache_service: CacheService, mock_redis: MagicMock + ) -> None: + """Test that invalidation errors don't raise.""" + mock_redis.delete.side_effect = Exception("Redis error") + + # Should not raise + await cache_service.invalidate("search:abc123") diff --git a/backend/tests/unit/test_library_service.py b/backend/tests/unit/test_library_service.py new file mode 100644 index 0000000..bd00939 --- /dev/null +++ b/backend/tests/unit/test_library_service.py @@ -0,0 +1,322 @@ +"""Tests for LibraryService. + +Tests the library persistence layer with mocked repositories. 
+""" + +from unittest.mock import AsyncMock, MagicMock +from uuid import UUID + +import pytest + +from bookbytes.models.edition import Edition +from bookbytes.models.work import Work +from bookbytes.services.library import LibraryService +from bookbytes.services.openlibrary import BookSearchResult + +# ============================================================================= +# Fixtures +# ============================================================================= + + +@pytest.fixture +def mock_work_repo(): + """Create a mock WorkRepository.""" + repo = MagicMock() + repo.get = AsyncMock(return_value=None) + repo.create = AsyncMock() + return repo + + +@pytest.fixture +def mock_edition_repo(): + """Create a mock EditionRepository.""" + repo = MagicMock() + repo.get_by_isbn = AsyncMock(return_value=None) + repo.get_latest_by_work = AsyncMock(return_value=None) + repo.isbn_exists = AsyncMock(return_value=False) + repo.create = AsyncMock() + return repo + + +@pytest.fixture +def mock_provider_repo(): + """Create a mock BookProviderRepository.""" + repo = MagicMock() + repo.get_by_provider_key = AsyncMock(return_value=None) + repo.create_work_mapping = AsyncMock() + repo.create_edition_mapping = AsyncMock() + return repo + + +@pytest.fixture +def library_service(mock_work_repo, mock_edition_repo, mock_provider_repo): + """Create LibraryService with mocked repositories.""" + return LibraryService( + work_repo=mock_work_repo, + edition_repo=mock_edition_repo, + provider_repo=mock_provider_repo, + ) + + +@pytest.fixture +def sample_search_result() -> BookSearchResult: + """Create a sample BookSearchResult.""" + return BookSearchResult( + title="The Lord of the Rings", + authors=["J. R. R. 
Tolkien"], + first_publish_year=1954, + cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg", + isbn_list=["9780618640157"], + edition_count=120, + subjects=["Fantasy", "Epic"], + external_work_key="/works/OL27448W", + ) + + +@pytest.fixture +def sample_work() -> Work: + """Create a sample Work.""" + work = Work( + title="The Lord of the Rings", + authors=["J. R. R. Tolkien"], + subjects=["Fantasy"], + first_publish_year=1954, + ) + # Mock the ID + work.id = UUID("01234567-89ab-cdef-0123-456789abcdef") + return work + + +@pytest.fixture +def sample_edition() -> Edition: + """Create a sample Edition.""" + edition = Edition( + work_id=UUID("01234567-89ab-cdef-0123-456789abcdef"), + isbn="9780618640157", + isbn_type="isbn13", + title="The Lord of the Rings", + language="eng", + ) + edition.id = UUID("fedcba98-7654-3210-fedc-ba9876543210") + return edition + + +# ============================================================================= +# Work Operations Tests +# ============================================================================= + + +class TestFindWorkByProvider: + """Tests for find_work_by_provider method.""" + + @pytest.mark.asyncio + async def test_find_work_returns_none_when_not_found( + self, + library_service: LibraryService, + mock_provider_repo: MagicMock, + ) -> None: + """Test returns None when no mapping exists.""" + mock_provider_repo.get_by_provider_key.return_value = None + + result = await library_service.find_work_by_provider( + "openlibrary", "/works/OL12345W" + ) + + assert result is None + mock_provider_repo.get_by_provider_key.assert_called_once() + + @pytest.mark.asyncio + async def test_find_work_returns_work_when_found( + self, + library_service: LibraryService, + mock_provider_repo: MagicMock, + mock_work_repo: MagicMock, + sample_work: Work, + ) -> None: + """Test returns Work when mapping exists.""" + mock_mapping = MagicMock() + mock_mapping.work_id = sample_work.id + mock_provider_repo.get_by_provider_key.return_value 
= mock_mapping + mock_work_repo.get.return_value = sample_work + + result = await library_service.find_work_by_provider( + "openlibrary", "/works/OL27448W" + ) + + assert result == sample_work + mock_work_repo.get.assert_called_once_with(sample_work.id) + + +class TestGetOrCreateWork: + """Tests for get_or_create_work method.""" + + @pytest.mark.asyncio + async def test_returns_existing_work( + self, + library_service: LibraryService, + mock_provider_repo: MagicMock, + mock_work_repo: MagicMock, + sample_search_result: BookSearchResult, + sample_work: Work, + ) -> None: + """Test returns existing work if already in library.""" + mock_mapping = MagicMock() + mock_mapping.work_id = sample_work.id + mock_provider_repo.get_by_provider_key.return_value = mock_mapping + mock_work_repo.get.return_value = sample_work + + result = await library_service.get_or_create_work(sample_search_result) + + assert result == sample_work + mock_work_repo.create.assert_not_called() + + @pytest.mark.asyncio + async def test_creates_new_work( + self, + library_service: LibraryService, + mock_provider_repo: MagicMock, + mock_work_repo: MagicMock, + sample_search_result: BookSearchResult, + sample_work: Work, + ) -> None: + """Test creates new work if not in library.""" + mock_provider_repo.get_by_provider_key.return_value = None + mock_work_repo.create.return_value = sample_work + + result = await library_service.get_or_create_work(sample_search_result) + + assert result == sample_work + mock_work_repo.create.assert_called_once() + mock_provider_repo.create_work_mapping.assert_called_once() + + +# ============================================================================= +# Edition Operations Tests +# ============================================================================= + + +class TestFindByISBN: + """Tests for find_by_isbn method.""" + + @pytest.mark.asyncio + async def test_find_by_isbn_normalizes_input( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, 
+ ) -> None: + """Test ISBN is normalized before lookup.""" + await library_service.find_by_isbn("978-0-618-64015-7") + + # Should be called with normalized ISBN + mock_edition_repo.get_by_isbn.assert_called_once_with("9780618640157") + + @pytest.mark.asyncio + async def test_find_by_isbn_returns_edition( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + sample_edition: Edition, + ) -> None: + """Test returns edition when found.""" + mock_edition_repo.get_by_isbn.return_value = sample_edition + + result = await library_service.find_by_isbn("9780618640157") + + assert result == sample_edition + + +class TestStoreEdition: + """Tests for store_edition method.""" + + @pytest.mark.asyncio + async def test_store_edition_returns_existing( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + sample_work: Work, + sample_edition: Edition, + ) -> None: + """Test returns existing edition if ISBN already exists.""" + mock_edition_repo.get_by_isbn.return_value = sample_edition + + result = await library_service.store_edition( + work=sample_work, + isbn="9780618640157", + ) + + assert result == sample_edition + mock_edition_repo.create.assert_not_called() + + @pytest.mark.asyncio + async def test_store_edition_creates_new( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + sample_work: Work, + sample_edition: Edition, + ) -> None: + """Test creates new edition if not exists.""" + mock_edition_repo.get_by_isbn.return_value = None + mock_edition_repo.create.return_value = sample_edition + + result = await library_service.store_edition( + work=sample_work, + isbn="9780618640157", + title="Special Edition", + publisher="Houghton Mifflin", + publish_year=2004, + ) + + assert result == sample_edition + mock_edition_repo.create.assert_called_once() + + @pytest.mark.asyncio + async def test_store_edition_with_provider_mapping( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + 
mock_provider_repo: MagicMock, + sample_work: Work, + sample_edition: Edition, + ) -> None: + """Test creates provider mapping when external_key provided.""" + mock_edition_repo.get_by_isbn.return_value = None + mock_edition_repo.create.return_value = sample_edition + + await library_service.store_edition( + work=sample_work, + isbn="9780618640157", + external_key="/books/OL12345M", + ) + + mock_provider_repo.create_edition_mapping.assert_called_once() + + +class TestISBNExists: + """Tests for isbn_exists method.""" + + @pytest.mark.asyncio + async def test_isbn_exists_true( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + ) -> None: + """Test returns True when ISBN exists.""" + mock_edition_repo.isbn_exists.return_value = True + + result = await library_service.isbn_exists("9780618640157") + + assert result is True + + @pytest.mark.asyncio + async def test_isbn_exists_false( + self, + library_service: LibraryService, + mock_edition_repo: MagicMock, + ) -> None: + """Test returns False when ISBN doesn't exist.""" + mock_edition_repo.isbn_exists.return_value = False + + result = await library_service.isbn_exists("0000000000000") + + assert result is False diff --git a/backend/tests/unit/test_openlibrary_service.py b/backend/tests/unit/test_openlibrary_service.py new file mode 100644 index 0000000..fbdedeb --- /dev/null +++ b/backend/tests/unit/test_openlibrary_service.py @@ -0,0 +1,496 @@ +"""Tests for OpenLibraryService. + +Tests the OpenLibrary API client with mocked HTTP responses. 
+""" + +from unittest.mock import AsyncMock, MagicMock, patch + +import httpx +import pytest + +from bookbytes.services.openlibrary import ( + BookSearchResult, + OpenLibraryError, + OpenLibraryRateLimitError, + OpenLibraryService, + SearchResponse, + WorkDetails, +) + +# ============================================================================= +# Mock Response Data +# ============================================================================= + +MOCK_SEARCH_RESPONSE = { + "numFound": 2, + "start": 0, + "docs": [ + { + "key": "/works/OL27448W", + "title": "The Lord of the Rings", + "author_name": ["J. R. R. Tolkien"], + "author_key": ["OL26320A"], + "first_publish_year": 1954, + "edition_count": 120, + "cover_i": 258027, + "isbn": ["9780618640157", "0618640150"], + "language": ["eng", "spa"], + "publisher": ["Houghton Mifflin"], + "subject": ["Fantasy", "Epic"], + }, + { + "key": "/works/OL12345W", + "title": "The Hobbit", + "author_name": ["J. R. R. Tolkien"], + "first_publish_year": 1937, + "edition_count": 50, + "cover_i": None, + "isbn": ["9780618968633"], + "subject": ["Fantasy"], + }, + ], +} + +MOCK_WORK_RESPONSE = { + "key": "/works/OL27448W", + "title": "The Lord of the Rings", + "description": "An epic fantasy novel.", + "subjects": ["Fantasy", "Adventure", "Epic"], + "covers": [258027, 258028], +} + + +# ============================================================================= +# Fixtures +# ============================================================================= + + +@pytest.fixture +def mock_cache() -> MagicMock: + """Create a mock CacheService.""" + cache = MagicMock() + cache.get = AsyncMock(return_value=(None, False)) # Cache miss by default + cache.set = AsyncMock(return_value=None) + return cache + + +@pytest.fixture +def openlibrary_service(mock_cache: MagicMock) -> OpenLibraryService: + """Create OpenLibraryService with mock cache.""" + return OpenLibraryService(mock_cache) + + +# 
============================================================================= +# DTO Tests +# ============================================================================= + + +class TestBookSearchResult: + """Tests for BookSearchResult DTO.""" + + def test_to_dict(self) -> None: + """Test serialization to dict.""" + result = BookSearchResult( + title="Test Book", + authors=["Author One", "Author Two"], + first_publish_year=2020, + cover_url="https://example.com/cover.jpg", + isbn_list=["9781234567890"], + edition_count=5, + subjects=["Fiction"], + external_work_key="/works/OL12345W", + ) + + data = result.to_dict() + + assert data["title"] == "Test Book" + assert data["authors"] == ["Author One", "Author Two"] + assert data["first_publish_year"] == 2020 + assert data["cover_url"] == "https://example.com/cover.jpg" + assert data["isbn_list"] == ["9781234567890"] + assert data["edition_count"] == 5 + assert data["subjects"] == ["Fiction"] + assert data["external_work_key"] == "/works/OL12345W" + assert data["source_provider"] == "openlibrary" + assert "fetched_at" in data + + def test_from_dict(self) -> None: + """Test deserialization from dict.""" + data = { + "title": "Test Book", + "authors": ["Author"], + "first_publish_year": 2020, + "cover_url": None, + "isbn_list": [], + "edition_count": 1, + "subjects": [], + "external_work_key": "/works/OL12345W", + "source_provider": "openlibrary", + "fetched_at": "2024-01-01T00:00:00", + } + + result = BookSearchResult.from_dict(data) + + assert result.title == "Test Book" + assert result.authors == ["Author"] + assert result.first_publish_year == 2020 + assert result.external_work_key == "/works/OL12345W" + + +class TestWorkDetails: + """Tests for WorkDetails DTO.""" + + def test_to_dict_and_from_dict_roundtrip(self) -> None: + """Test serialization roundtrip.""" + original = WorkDetails( + title="Epic Novel", + authors=["Famous Author"], + description="A great book.", + subjects=["Fiction", "Drama"], + 
first_publish_year=1990, + cover_url="https://example.com/cover.jpg", + edition_count=10, + isbn_list=["9781234567890", "1234567890"], + external_work_key="/works/OL99999W", + ) + + data = original.to_dict() + restored = WorkDetails.from_dict(data) + + assert restored.title == original.title + assert restored.authors == original.authors + assert restored.description == original.description + assert restored.subjects == original.subjects + assert restored.external_work_key == original.external_work_key + + +class TestSearchResponse: + """Tests for SearchResponse container.""" + + def test_has_more_true(self) -> None: + """Test has_more when more results exist.""" + response = SearchResponse( + results=[], + total_found=100, + offset=0, + limit=10, + ) + # At offset 0 with 10 results, 100 total -> has more + response.results = [MagicMock()] * 10 + assert response.has_more is True + + def test_has_more_false(self) -> None: + """Test has_more when no more results.""" + response = SearchResponse( + results=[MagicMock()] * 5, + total_found=5, + offset=0, + limit=10, + ) + assert response.has_more is False + + +# ============================================================================= +# Search Tests +# ============================================================================= + + +class TestSearchBooks: + """Tests for search_books method.""" + + @pytest.mark.asyncio + async def test_search_cache_hit( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test that cached results are returned without API call.""" + cached_data = { + "results": [ + { + "title": "Cached Book", + "authors": ["Author"], + "first_publish_year": 2020, + "cover_url": None, + "isbn_list": [], + "edition_count": 1, + "subjects": [], + "external_work_key": "/works/OL123W", + "source_provider": "openlibrary", + "fetched_at": "2024-01-01", + } + ], + "total_found": 1, + "offset": 0, + "limit": 100, + } + mock_cache.get.return_value = (cached_data, False) + + 
result = await openlibrary_service.search_books(title="Cached Book") + + assert len(result.results) == 1 + assert result.results[0].title == "Cached Book" + mock_cache.get.assert_called_once() + + @pytest.mark.asyncio + async def test_search_cache_miss_calls_api( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test that API is called on cache miss.""" + mock_cache.get.return_value = (None, False) + + mock_response = MagicMock() + mock_response.json.return_value = MOCK_SEARCH_RESPONSE + mock_response.raise_for_status = MagicMock() + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.return_value = mock_response + mock_get_client.return_value = mock_client + + result = await openlibrary_service.search_books(title="Lord of the Rings") + + assert result.total_found == 2 + assert len(result.results) == 2 + assert result.results[0].title == "The Lord of the Rings" + assert result.results[0].authors == ["J. R. R. 
Tolkien"] + mock_cache.set.assert_called_once() + + @pytest.mark.asyncio + async def test_search_with_all_parameters( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test search with all optional parameters.""" + mock_cache.get.return_value = (None, False) + + mock_response = MagicMock() + mock_response.json.return_value = MOCK_SEARCH_RESPONSE + mock_response.raise_for_status = MagicMock() + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.return_value = mock_response + mock_get_client.return_value = mock_client + + await openlibrary_service.search_books( + title="Test", + author="Author", + publisher="Publisher", + language="eng", + ) + + # Verify all params were passed + call_args = mock_client.get.call_args + params = call_args[1]["params"] + assert "author" in params + assert "publisher" in params + assert "language" in params + + +# ============================================================================= +# Work Details Tests +# ============================================================================= + + +class TestGetWorkDetails: + """Tests for get_work_details method.""" + + @pytest.mark.asyncio + async def test_get_work_details_cache_hit( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test cached work details are returned.""" + cached_data = { + "title": "Cached Work", + "authors": [], + "description": "Cached description", + "subjects": [], + "first_publish_year": None, + "cover_url": None, + "edition_count": 0, + "isbn_list": [], + "external_work_key": "/works/OL123W", + "source_provider": "openlibrary", + "fetched_at": "2024-01-01", + } + mock_cache.get.return_value = (cached_data, False) + + result = await openlibrary_service.get_work_details("/works/OL123W") + + assert result.title == "Cached Work" + assert result.description == "Cached description" + + @pytest.mark.asyncio + async def 
test_get_work_details_api_call( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test API is called for uncached work.""" + mock_cache.get.return_value = (None, False) + + mock_response = MagicMock() + mock_response.json.return_value = MOCK_WORK_RESPONSE + mock_response.raise_for_status = MagicMock() + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.return_value = mock_response + mock_get_client.return_value = mock_client + + result = await openlibrary_service.get_work_details("/works/OL27448W") + + assert result.title == "The Lord of the Rings" + assert result.description == "An epic fantasy novel." + assert "Fantasy" in result.subjects + mock_cache.set.assert_called_once() + + +# ============================================================================= +# Error Handling Tests +# ============================================================================= + + +class TestErrorHandling: + """Tests for error handling.""" + + @pytest.mark.asyncio + async def test_rate_limit_error( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test rate limit error is raised properly.""" + mock_cache.get.return_value = (None, False) + + mock_response = MagicMock() + mock_response.status_code = 429 + mock_response.raise_for_status.side_effect = httpx.HTTPStatusError( + "Rate limited", + request=MagicMock(), + response=mock_response, + ) + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.return_value = mock_response + mock_get_client.return_value = mock_client + + with pytest.raises(OpenLibraryRateLimitError): + await openlibrary_service.search_books(title="Test") + + @pytest.mark.asyncio + async def test_api_error( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test generic API error is raised.""" + 
mock_cache.get.return_value = (None, False) + + mock_response = MagicMock() + mock_response.status_code = 500 + mock_response.raise_for_status.side_effect = httpx.HTTPStatusError( + "Server error", + request=MagicMock(), + response=mock_response, + ) + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.return_value = mock_response + mock_get_client.return_value = mock_client + + with pytest.raises(OpenLibraryError): + await openlibrary_service.search_books(title="Test") + + @pytest.mark.asyncio + async def test_network_error( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test network error is handled.""" + mock_cache.get.return_value = (None, False) + + with patch.object(openlibrary_service, "_get_client") as mock_get_client: + mock_client = AsyncMock() + mock_client.get.side_effect = httpx.RequestError( + "Connection failed", request=MagicMock() + ) + mock_get_client.return_value = mock_client + + with pytest.raises(OpenLibraryError): + await openlibrary_service.search_books(title="Test") + + +# ============================================================================= +# Helper Method Tests +# ============================================================================= + + +class TestHelperMethods: + """Tests for helper methods.""" + + @pytest.mark.asyncio + async def test_get_all_isbns_for_work( + self, openlibrary_service: OpenLibraryService, mock_cache: MagicMock + ) -> None: + """Test ISBN collection delegates to get_work_details.""" + cached_data = { + "title": "Test", + "authors": [], + "description": None, + "subjects": [], + "first_publish_year": None, + "cover_url": None, + "edition_count": 5, + "isbn_list": ["9781234567890", "1234567890"], + "external_work_key": "/works/OL123W", + "source_provider": "openlibrary", + "fetched_at": "2024-01-01", + } + mock_cache.get.return_value = (cached_data, False) + + isbns = await 
openlibrary_service.get_all_isbns_for_work("/works/OL123W") + + assert isbns == ["9781234567890", "1234567890"] + + def test_user_agent_format(self, openlibrary_service: OpenLibraryService) -> None: + """Test User-Agent header format.""" + ua = openlibrary_service._user_agent + + assert "BookBytes" in ua + assert "contact" in ua.lower() or "@" in ua + + +# ============================================================================= +# Response Parsing Tests +# ============================================================================= + + +class TestResponseParsing: + """Tests for API response parsing.""" + + def test_parse_search_response_with_cover( + self, openlibrary_service: OpenLibraryService + ) -> None: + """Test search response parsing generates cover URL.""" + response = openlibrary_service._parse_search_response(MOCK_SEARCH_RESPONSE) + + assert response.results[0].cover_url is not None + assert "258027" in response.results[0].cover_url + + def test_parse_search_response_without_cover( + self, openlibrary_service: OpenLibraryService + ) -> None: + """Test search response parsing handles missing cover.""" + response = openlibrary_service._parse_search_response(MOCK_SEARCH_RESPONSE) + + # Second result has no cover + assert response.results[1].cover_url is None + + def test_parse_work_response_dict_description( + self, openlibrary_service: OpenLibraryService + ) -> None: + """Test work parsing handles dict description format.""" + work_data = { + "title": "Test", + "description": {"value": "Description from dict"}, + "subjects": [], + "covers": [], + } + + result = openlibrary_service._parse_work_response(work_data, "/works/OL123W") + + assert result.description == "Description from dict" diff --git a/backend/tests/unit/test_processing_endpoints.py b/backend/tests/unit/test_processing_endpoints.py new file mode 100644 index 0000000..4f16151 --- /dev/null +++ b/backend/tests/unit/test_processing_endpoints.py @@ -0,0 +1,176 @@ +"""Unit tests for processing 
endpoints. + +Tests the processing API endpoints including request validation, +job creation, and status checking. +""" + +import pytest +from httpx import AsyncClient +from pydantic import ValidationError + +from bookbytes.schemas.processing import ( + JobStatusResponse, + ProcessRequest, + RefreshRequest, +) + +# ============================================================================= +# Schema Validation Tests +# ============================================================================= + + +class TestProcessRequestValidation: + """Test ProcessRequest schema validation.""" + + def test_valid_edition_id(self) -> None: + """Edition ID only is valid.""" + request = ProcessRequest( + edition_id="01234567-89ab-cdef-0123-456789abcdef", + ) + assert request.edition_id is not None + assert request.isbn is None + + def test_valid_isbn(self) -> None: + """ISBN only is valid.""" + request = ProcessRequest(isbn="978-0-13-468599-1") + assert request.isbn == "978-0-13-468599-1" + assert request.edition_id is None + + def test_isbn_without_hyphens(self) -> None: + """ISBN without hyphens is valid.""" + request = ProcessRequest(isbn="9780134685991") + assert request.isbn == "9780134685991" + + def test_neither_provided_raises_error(self) -> None: + """Omitting both edition_id and isbn raises a validation error.""" + with pytest.raises(ValidationError) as exc_info: + ProcessRequest() + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert "edition_id or isbn" in str(errors[0]["msg"]).lower() + + def test_both_provided_raises_error(self) -> None: + """Providing both edition_id and isbn raises a validation error.""" + with pytest.raises(ValidationError) as exc_info: + ProcessRequest( + edition_id="01234567-89ab-cdef-0123-456789abcdef", + isbn="978-0-13-468599-1", + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert "only one" in str(errors[0]["msg"]).lower() + + def test_invalid_uuid_raises_error(self) -> None: + """Invalid UUID format raises validation 
error.""" + with pytest.raises(ValidationError): + ProcessRequest(edition_id="not-a-uuid") + + def test_isbn_too_short_raises_error(self) -> None: + """ISBN shorter than 10 chars raises validation error.""" + with pytest.raises(ValidationError): + ProcessRequest(isbn="123456789") # 9 chars + + def test_isbn_too_long_raises_error(self) -> None: + """ISBN longer than 17 chars raises validation error.""" + with pytest.raises(ValidationError): + ProcessRequest(isbn="978-0-13-468599-1-2-3") # too long + + +class TestRefreshRequestValidation: + """Test RefreshRequest schema validation.""" + + def test_default_force_is_false(self) -> None: + """Default force value is False.""" + request = RefreshRequest() + assert request.force is False + + def test_force_true(self) -> None: + """Force can be set to True.""" + request = RefreshRequest(force=True) + assert request.force is True + + +class TestJobStatusResponse: + """Test JobStatusResponse schema.""" + + def test_from_attributes_enabled(self) -> None: + """Can create from ORM model.""" + assert JobStatusResponse.model_config.get("from_attributes") is True + + +# ============================================================================= +# Endpoint Integration Tests +# ============================================================================= + + +@pytest.mark.integration +class TestProcessEndpoint: + """Test POST /books/process endpoint.""" + + async def test_process_returns_501_not_implemented( + self, + async_client: AsyncClient, + ) -> None: + """Process returns 501 until service is implemented.""" + response = await async_client.post( + "/api/v1/books/process", + json={"isbn": "9780134685991"}, + ) + assert response.status_code == 501 + assert "not yet implemented" in response.json()["detail"].lower() + + async def test_process_validates_request_body( + self, + async_client: AsyncClient, + ) -> None: + """Process validates request body.""" + response = await async_client.post( + "/api/v1/books/process", + json={}, 
# Empty body - neither edition_id nor isbn + ) + assert response.status_code == 422 + + +@pytest.mark.integration +class TestRefreshEndpoint: + """Test POST /books/{audio_book_id}/refresh endpoint.""" + + async def test_refresh_returns_501_not_implemented( + self, + async_client: AsyncClient, + ) -> None: + """Refresh returns 501 until service is implemented.""" + response = await async_client.post( + "/api/v1/books/01234567-89ab-cdef-0123-456789abcdef/refresh", + json={}, + ) + assert response.status_code == 501 + assert "not yet implemented" in response.json()["detail"].lower() + + +@pytest.mark.integration +class TestJobStatusEndpoint: + """Test GET /books/jobs/{job_id} endpoint.""" + + async def test_job_status_returns_501_not_implemented( + self, + async_client: AsyncClient, + ) -> None: + """Job status returns 501 until repository is implemented.""" + response = await async_client.get( + "/api/v1/books/jobs/01234567-89ab-cdef-0123-456789abcdef", + ) + assert response.status_code == 501 + assert "not yet implemented" in response.json()["detail"].lower() + + async def test_job_status_invalid_uuid_returns_422( + self, + async_client: AsyncClient, + ) -> None: + """Invalid UUID returns 422.""" + response = await async_client.get( + "/api/v1/books/jobs/not-a-uuid", + ) + assert response.status_code == 422 diff --git a/backend/tests/unit/test_search_endpoints.py b/backend/tests/unit/test_search_endpoints.py new file mode 100644 index 0000000..c47c843 --- /dev/null +++ b/backend/tests/unit/test_search_endpoints.py @@ -0,0 +1,327 @@ +"""Tests for search API endpoints. + +Tests the /api/v1/books/search, /works, and /isbn endpoints. 
+""" + +from unittest.mock import AsyncMock, MagicMock + +import pytest +from httpx import ASGITransport, AsyncClient + +from bookbytes.api.v1.search import get_openlibrary_service +from bookbytes.main import create_app +from bookbytes.services.cache import get_cache_service +from bookbytes.services.openlibrary import BookSearchResult, SearchResponse, WorkDetails + +# ============================================================================= +# Fixtures +# ============================================================================= + + +@pytest.fixture +def mock_cache_service(): + """Create a mock CacheService.""" + cache = MagicMock() + cache.get = AsyncMock(return_value=(None, False)) + cache.set = AsyncMock(return_value=None) + return cache + + +@pytest.fixture +def mock_openlibrary_service(): + """Create a mock OpenLibraryService.""" + service = MagicMock() + service.search_books = AsyncMock() + service.get_work_details = AsyncMock() + service.close = AsyncMock() + return service + + +@pytest.fixture +def sample_search_result() -> BookSearchResult: + """Create a sample search result.""" + return BookSearchResult( + title="The Lord of the Rings", + authors=["J. R. R. Tolkien"], + first_publish_year=1954, + cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg", + isbn_list=["9780618640157", "0618640150"], + edition_count=120, + subjects=["Fantasy", "Epic"], + external_work_key="/works/OL27448W", + ) + + +@pytest.fixture +def sample_work_details() -> WorkDetails: + """Create sample work details.""" + return WorkDetails( + title="The Lord of the Rings", + authors=["J. R. R. 
Tolkien"], + description="An epic fantasy novel.", + subjects=["Fantasy", "Adventure"], + first_publish_year=1954, + cover_url="https://covers.openlibrary.org/b/id/258027-M.jpg", + edition_count=120, + isbn_list=["9780618640157"], + external_work_key="/works/OL27448W", + ) + + +@pytest.fixture +def test_app(mock_cache_service, mock_openlibrary_service): + """Create app with mocked dependencies.""" + app = create_app() + + # Override dependencies + app.dependency_overrides[get_cache_service] = lambda: mock_cache_service + app.dependency_overrides[get_openlibrary_service] = lambda: mock_openlibrary_service + + yield app + + # Cleanup + app.dependency_overrides.clear() + + +# ============================================================================= +# Search Endpoint Tests +# ============================================================================= + + +class TestSearchBooksEndpoint: + """Tests for POST /api/v1/books/search.""" + + @pytest.mark.asyncio + async def test_search_returns_results( + self, + test_app, + mock_openlibrary_service: MagicMock, + sample_search_result: BookSearchResult, + ) -> None: + """Test successful search returns results.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[sample_search_result], + total_found=1, + offset=0, + limit=100, + ) + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.post( + "/api/v1/books/search", + json={"title": "Lord of the Rings"}, + ) + + assert response.status_code == 200 + data = response.json() + assert data["total_found"] == 1 + assert len(data["results"]) == 1 + assert data["results"][0]["title"] == "The Lord of the Rings" + assert data["results"][0]["authors"] == ["J. R. R. 
Tolkien"] + + @pytest.mark.asyncio + async def test_search_with_pagination( + self, + test_app, + mock_openlibrary_service: MagicMock, + sample_search_result: BookSearchResult, + ) -> None: + """Test search with pagination parameters.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[sample_search_result] * 50, + total_found=100, + offset=0, + limit=100, + ) + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.post( + "/api/v1/books/search?page=1&page_size=10", + json={"title": "Test"}, + ) + + assert response.status_code == 200 + data = response.json() + assert data["page"] == 1 + assert data["page_size"] == 10 + assert len(data["results"]) <= 10 + assert data["has_more"] is True + + @pytest.mark.asyncio + async def test_search_with_all_filters( + self, + test_app, + mock_openlibrary_service: MagicMock, + sample_search_result: BookSearchResult, + ) -> None: + """Test search with all filter parameters.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[sample_search_result], + total_found=1, + offset=0, + limit=100, + ) + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.post( + "/api/v1/books/search", + json={ + "title": "Lord of the Rings", + "author": "Tolkien", + "publisher": "Houghton", + "language": "eng", + }, + ) + + assert response.status_code == 200 + # Verify service was called with filters + mock_openlibrary_service.search_books.assert_called_once() + call_kwargs = mock_openlibrary_service.search_books.call_args.kwargs + assert call_kwargs["author"] == "Tolkien" + assert call_kwargs["publisher"] == "Houghton" + assert call_kwargs["language"] == "eng" + + @pytest.mark.asyncio + async def test_search_empty_title_returns_422( + self, + test_app, + ) -> None: + """Test search with empty title returns validation error.""" + 
async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.post( + "/api/v1/books/search", + json={"title": ""}, + ) + + assert response.status_code == 422 # Validation error + + @pytest.mark.asyncio + async def test_search_no_results( + self, + test_app, + mock_openlibrary_service: MagicMock, + ) -> None: + """Test search with no results.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[], + total_found=0, + offset=0, + limit=100, + ) + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.post( + "/api/v1/books/search", + json={"title": "nonexistentbook12345"}, + ) + + assert response.status_code == 200 + data = response.json() + assert data["total_found"] == 0 + assert data["results"] == [] + + +# ============================================================================= +# Work Details Endpoint Tests +# ============================================================================= + + +class TestGetWorkDetailsEndpoint: + """Tests for GET /api/v1/books/works/{work_key}.""" + + @pytest.mark.asyncio + async def test_get_work_details_success( + self, + test_app, + mock_openlibrary_service: MagicMock, + sample_work_details: WorkDetails, + ) -> None: + """Test successful work details fetch.""" + mock_openlibrary_service.get_work_details.return_value = sample_work_details + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.get("/api/v1/books/works/works/OL27448W") + + assert response.status_code == 200 + data = response.json() + assert data["title"] == "The Lord of the Rings" + assert data["description"] == "An epic fantasy novel." 
+ assert data["external_work_key"] == "/works/OL27448W" + + +# ============================================================================= +# ISBN Lookup Endpoint Tests +# ============================================================================= + + +class TestLookupByISBNEndpoint: + """Tests for GET /api/v1/books/isbn/{isbn}.""" + + @pytest.mark.asyncio + async def test_isbn_lookup_success( + self, + test_app, + mock_openlibrary_service: MagicMock, + sample_search_result: BookSearchResult, + sample_work_details: WorkDetails, + ) -> None: + """Test successful ISBN lookup.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[sample_search_result], + total_found=1, + offset=0, + limit=100, + ) + mock_openlibrary_service.get_work_details.return_value = sample_work_details + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.get("/api/v1/books/isbn/9780618640157") + + assert response.status_code == 200 + data = response.json() + assert data["title"] == "The Lord of the Rings" + + @pytest.mark.asyncio + async def test_isbn_not_found_returns_404( + self, + test_app, + mock_openlibrary_service: MagicMock, + ) -> None: + """Test ISBN not found returns 404.""" + mock_openlibrary_service.search_books.return_value = SearchResponse( + results=[], + total_found=0, + offset=0, + limit=100, + ) + + async with AsyncClient( + transport=ASGITransport(app=test_app), + base_url="http://test", + ) as client: + response = await client.get("/api/v1/books/isbn/0000000000000") + + assert response.status_code == 404 + data = response.json() + assert data["error"]["code"] == "BOOK_NOT_FOUND" diff --git a/backend/uv.lock b/backend/uv.lock new file mode 100644 index 0000000..bac3b96 --- /dev/null +++ b/backend/uv.lock @@ -0,0 +1,2305 @@ +version = 1 +revision = 3 +requires-python = ">=3.13" + +[[package]] +name = "aioboto3" +version = "15.5.0" +source = { registry = 
"https://pypi.org/simple" } +dependencies = [ + { name = "aiobotocore", extra = ["boto3"] }, + { name = "aiofiles" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a2/01/92e9ab00f36e2899315f49eefcd5b4685fbb19016c7f19a9edf06da80bb0/aioboto3-15.5.0.tar.gz", hash = "sha256:ea8d8787d315594842fbfcf2c4dce3bac2ad61be275bc8584b2ce9a3402a6979", size = 255069, upload-time = "2025-10-30T13:37:16.122Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/3e/e8f5b665bca646d43b916763c901e00a07e40f7746c9128bdc912a089424/aioboto3-15.5.0-py3-none-any.whl", hash = "sha256:cc880c4d6a8481dd7e05da89f41c384dbd841454fc1998ae25ca9c39201437a6", size = 35913, upload-time = "2025-10-30T13:37:14.549Z" }, +] + +[[package]] +name = "aiobotocore" +version = "2.25.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "aioitertools" }, + { name = "botocore" }, + { name = "jmespath" }, + { name = "multidict" }, + { name = "python-dateutil" }, + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/62/94/2e4ec48cf1abb89971cb2612d86f979a6240520f0a659b53a43116d344dc/aiobotocore-2.25.1.tar.gz", hash = "sha256:ea9be739bfd7ece8864f072ec99bb9ed5c7e78ebb2b0b15f29781fbe02daedbc", size = 120560, upload-time = "2025-10-28T22:33:21.787Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/2a/d275ec4ce5cd0096665043995a7d76f5d0524853c76a3d04656de49f8808/aiobotocore-2.25.1-py3-none-any.whl", hash = "sha256:eb6daebe3cbef5b39a0bb2a97cffbe9c7cb46b2fcc399ad141f369f3c2134b1f", size = 86039, upload-time = "2025-10-28T22:33:19.949Z" }, +] + +[package.optional-dependencies] +boto3 = [ + { name = "boto3" }, +] + +[[package]] +name = "aiofiles" +version = "25.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/41/c3/534eac40372d8ee36ef40df62ec129bee4fdb5ad9706e58a29be53b2c970/aiofiles-25.1.0.tar.gz", hash = 
"sha256:a8d728f0a29de45dc521f18f07297428d56992a742f0cd2701ba86e44d23d5b2", size = 46354, upload-time = "2025-10-09T20:51:04.358Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bc/8a/340a1555ae33d7354dbca4faa54948d76d89a27ceef032c8c3bc661d003e/aiofiles-25.1.0-py3-none-any.whl", hash = "sha256:abe311e527c862958650f9438e859c1fa7568a141b22abcd015e120e86a85695", size = 14668, upload-time = "2025-10-09T20:51:03.174Z" }, +] + +[[package]] +name = "aiohappyeyeballs" +version = "2.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760, upload-time = "2025-03-12T01:42:48.764Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265, upload-time = "2025-03-12T01:42:47.083Z" }, +] + +[[package]] +name = "aiohttp" +version = "3.13.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohappyeyeballs" }, + { name = "aiosignal" }, + { name = "attrs" }, + { name = "frozenlist" }, + { name = "multidict" }, + { name = "propcache" }, + { name = "yarl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1c/ce/3b83ebba6b3207a7135e5fcaba49706f8a4b6008153b4e30540c982fae26/aiohttp-3.13.2.tar.gz", hash = "sha256:40176a52c186aefef6eb3cad2cdd30cd06e3afbe88fe8ab2af9c0b90f228daca", size = 7837994, upload-time = "2025-10-28T20:59:39.937Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bf/78/7e90ca79e5aa39f9694dcfd74f4720782d3c6828113bb1f3197f7e7c4a56/aiohttp-3.13.2-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:7519bdc7dfc1940d201651b52bf5e03f5503bda45ad6eacf64dda98be5b2b6be", size = 732139, upload-time = "2025-10-28T20:57:02.455Z" }, + { url = "https://files.pythonhosted.org/packages/db/ed/1f59215ab6853fbaa5c8495fa6cbc39edfc93553426152b75d82a5f32b76/aiohttp-3.13.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:088912a78b4d4f547a1f19c099d5a506df17eacec3c6f4375e2831ec1d995742", size = 490082, upload-time = "2025-10-28T20:57:04.784Z" }, + { url = "https://files.pythonhosted.org/packages/68/7b/fe0fe0f5e05e13629d893c760465173a15ad0039c0a5b0d0040995c8075e/aiohttp-3.13.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5276807b9de9092af38ed23ce120539ab0ac955547b38563a9ba4f5b07b95293", size = 489035, upload-time = "2025-10-28T20:57:06.894Z" }, + { url = "https://files.pythonhosted.org/packages/d2/04/db5279e38471b7ac801d7d36a57d1230feeee130bbe2a74f72731b23c2b1/aiohttp-3.13.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1237c1375eaef0db4dcd7c2559f42e8af7b87ea7d295b118c60c36a6e61cb811", size = 1720387, upload-time = "2025-10-28T20:57:08.685Z" }, + { url = "https://files.pythonhosted.org/packages/31/07/8ea4326bd7dae2bd59828f69d7fdc6e04523caa55e4a70f4a8725a7e4ed2/aiohttp-3.13.2-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:96581619c57419c3d7d78703d5b78c1e5e5fc0172d60f555bdebaced82ded19a", size = 1688314, upload-time = "2025-10-28T20:57:10.693Z" }, + { url = "https://files.pythonhosted.org/packages/48/ab/3d98007b5b87ffd519d065225438cc3b668b2f245572a8cb53da5dd2b1bc/aiohttp-3.13.2-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a2713a95b47374169409d18103366de1050fe0ea73db358fc7a7acb2880422d4", size = 1756317, upload-time = "2025-10-28T20:57:12.563Z" }, + { url = 
"https://files.pythonhosted.org/packages/97/3d/801ca172b3d857fafb7b50c7c03f91b72b867a13abca982ed6b3081774ef/aiohttp-3.13.2-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:228a1cd556b3caca590e9511a89444925da87d35219a49ab5da0c36d2d943a6a", size = 1858539, upload-time = "2025-10-28T20:57:14.623Z" }, + { url = "https://files.pythonhosted.org/packages/f7/0d/4764669bdf47bd472899b3d3db91fffbe925c8e3038ec591a2fd2ad6a14d/aiohttp-3.13.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ac6cde5fba8d7d8c6ac963dbb0256a9854e9fafff52fbcc58fdf819357892c3e", size = 1739597, upload-time = "2025-10-28T20:57:16.399Z" }, + { url = "https://files.pythonhosted.org/packages/c4/52/7bd3c6693da58ba16e657eb904a5b6decfc48ecd06e9ac098591653b1566/aiohttp-3.13.2-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f2bef8237544f4e42878c61cef4e2839fee6346dc60f5739f876a9c50be7fcdb", size = 1555006, upload-time = "2025-10-28T20:57:18.288Z" }, + { url = "https://files.pythonhosted.org/packages/48/30/9586667acec5993b6f41d2ebcf96e97a1255a85f62f3c653110a5de4d346/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:16f15a4eac3bc2d76c45f7ebdd48a65d41b242eb6c31c2245463b40b34584ded", size = 1683220, upload-time = "2025-10-28T20:57:20.241Z" }, + { url = "https://files.pythonhosted.org/packages/71/01/3afe4c96854cfd7b30d78333852e8e851dceaec1c40fd00fec90c6402dd2/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:bb7fb776645af5cc58ab804c58d7eba545a97e047254a52ce89c157b5af6cd0b", size = 1712570, upload-time = "2025-10-28T20:57:22.253Z" }, + { url = "https://files.pythonhosted.org/packages/11/2c/22799d8e720f4697a9e66fd9c02479e40a49de3de2f0bbe7f9f78a987808/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e1b4951125ec10c70802f2cb09736c895861cd39fd9dcb35107b4dc8ae6220b8", size = 1733407, upload-time = "2025-10-28T20:57:24.37Z" }, + { url = 
"https://files.pythonhosted.org/packages/34/cb/90f15dd029f07cebbd91f8238a8b363978b530cd128488085b5703683594/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:550bf765101ae721ee1d37d8095f47b1f220650f85fe1af37a90ce75bab89d04", size = 1550093, upload-time = "2025-10-28T20:57:26.257Z" }, + { url = "https://files.pythonhosted.org/packages/69/46/12dce9be9d3303ecbf4d30ad45a7683dc63d90733c2d9fe512be6716cd40/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fe91b87fc295973096251e2d25a811388e7d8adf3bd2b97ef6ae78bc4ac6c476", size = 1758084, upload-time = "2025-10-28T20:57:28.349Z" }, + { url = "https://files.pythonhosted.org/packages/f9/c8/0932b558da0c302ffd639fc6362a313b98fdf235dc417bc2493da8394df7/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e0c8e31cfcc4592cb200160344b2fb6ae0f9e4effe06c644b5a125d4ae5ebe23", size = 1716987, upload-time = "2025-10-28T20:57:30.233Z" }, + { url = "https://files.pythonhosted.org/packages/5d/8b/f5bd1a75003daed099baec373aed678f2e9b34f2ad40d85baa1368556396/aiohttp-3.13.2-cp313-cp313-win32.whl", hash = "sha256:0740f31a60848d6edb296a0df827473eede90c689b8f9f2a4cdde74889eb2254", size = 425859, upload-time = "2025-10-28T20:57:32.105Z" }, + { url = "https://files.pythonhosted.org/packages/5d/28/a8a9fc6957b2cee8902414e41816b5ab5536ecf43c3b1843c10e82c559b2/aiohttp-3.13.2-cp313-cp313-win_amd64.whl", hash = "sha256:a88d13e7ca367394908f8a276b89d04a3652044612b9a408a0bb22a5ed976a1a", size = 452192, upload-time = "2025-10-28T20:57:34.166Z" }, + { url = "https://files.pythonhosted.org/packages/9b/36/e2abae1bd815f01c957cbf7be817b3043304e1c87bad526292a0410fdcf9/aiohttp-3.13.2-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:2475391c29230e063ef53a66669b7b691c9bfc3f1426a0f7bcdf1216bdbac38b", size = 735234, upload-time = "2025-10-28T20:57:36.415Z" }, + { url = 
"https://files.pythonhosted.org/packages/ca/e3/1ee62dde9b335e4ed41db6bba02613295a0d5b41f74a783c142745a12763/aiohttp-3.13.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:f33c8748abef4d8717bb20e8fb1b3e07c6adacb7fd6beaae971a764cf5f30d61", size = 490733, upload-time = "2025-10-28T20:57:38.205Z" }, + { url = "https://files.pythonhosted.org/packages/1a/aa/7a451b1d6a04e8d15a362af3e9b897de71d86feac3babf8894545d08d537/aiohttp-3.13.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ae32f24bbfb7dbb485a24b30b1149e2f200be94777232aeadba3eecece4d0aa4", size = 491303, upload-time = "2025-10-28T20:57:40.122Z" }, + { url = "https://files.pythonhosted.org/packages/57/1e/209958dbb9b01174870f6a7538cd1f3f28274fdbc88a750c238e2c456295/aiohttp-3.13.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d7f02042c1f009ffb70067326ef183a047425bb2ff3bc434ead4dd4a4a66a2b", size = 1717965, upload-time = "2025-10-28T20:57:42.28Z" }, + { url = "https://files.pythonhosted.org/packages/08/aa/6a01848d6432f241416bc4866cae8dc03f05a5a884d2311280f6a09c73d6/aiohttp-3.13.2-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:93655083005d71cd6c072cdab54c886e6570ad2c4592139c3fb967bfc19e4694", size = 1667221, upload-time = "2025-10-28T20:57:44.869Z" }, + { url = "https://files.pythonhosted.org/packages/87/4f/36c1992432d31bbc789fa0b93c768d2e9047ec8c7177e5cd84ea85155f36/aiohttp-3.13.2-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0db1e24b852f5f664cd728db140cf11ea0e82450471232a394b3d1a540b0f906", size = 1757178, upload-time = "2025-10-28T20:57:47.216Z" }, + { url = "https://files.pythonhosted.org/packages/ac/b4/8e940dfb03b7e0f68a82b88fd182b9be0a65cb3f35612fe38c038c3112cf/aiohttp-3.13.2-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b009194665bcd128e23eaddef362e745601afa4641930848af4c8559e88f18f9", size = 
1838001, upload-time = "2025-10-28T20:57:49.337Z" }, + { url = "https://files.pythonhosted.org/packages/d7/ef/39f3448795499c440ab66084a9db7d20ca7662e94305f175a80f5b7e0072/aiohttp-3.13.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c038a8fdc8103cd51dbd986ecdce141473ffd9775a7a8057a6ed9c3653478011", size = 1716325, upload-time = "2025-10-28T20:57:51.327Z" }, + { url = "https://files.pythonhosted.org/packages/d7/51/b311500ffc860b181c05d91c59a1313bdd05c82960fdd4035a15740d431e/aiohttp-3.13.2-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:66bac29b95a00db411cd758fea0e4b9bdba6d549dfe333f9a945430f5f2cc5a6", size = 1547978, upload-time = "2025-10-28T20:57:53.554Z" }, + { url = "https://files.pythonhosted.org/packages/31/64/b9d733296ef79815226dab8c586ff9e3df41c6aff2e16c06697b2d2e6775/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4ebf9cfc9ba24a74cf0718f04aac2a3bbe745902cc7c5ebc55c0f3b5777ef213", size = 1682042, upload-time = "2025-10-28T20:57:55.617Z" }, + { url = "https://files.pythonhosted.org/packages/3f/30/43d3e0f9d6473a6db7d472104c4eff4417b1e9df01774cb930338806d36b/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a4b88ebe35ce54205c7074f7302bd08a4cb83256a3e0870c72d6f68a3aaf8e49", size = 1680085, upload-time = "2025-10-28T20:57:57.59Z" }, + { url = "https://files.pythonhosted.org/packages/16/51/c709f352c911b1864cfd1087577760ced64b3e5bee2aa88b8c0c8e2e4972/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:98c4fb90bb82b70a4ed79ca35f656f4281885be076f3f970ce315402b53099ae", size = 1728238, upload-time = "2025-10-28T20:57:59.525Z" }, + { url = "https://files.pythonhosted.org/packages/19/e2/19bd4c547092b773caeb48ff5ae4b1ae86756a0ee76c16727fcfd281404b/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:ec7534e63ae0f3759df3a1ed4fa6bc8f75082a924b590619c0dd2f76d7043caa", size = 1544395, upload-time = 
"2025-10-28T20:58:01.914Z" }, + { url = "https://files.pythonhosted.org/packages/cf/87/860f2803b27dfc5ed7be532832a3498e4919da61299b4a1f8eb89b8ff44d/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5b927cf9b935a13e33644cbed6c8c4b2d0f25b713d838743f8fe7191b33829c4", size = 1742965, upload-time = "2025-10-28T20:58:03.972Z" }, + { url = "https://files.pythonhosted.org/packages/67/7f/db2fc7618925e8c7a601094d5cbe539f732df4fb570740be88ed9e40e99a/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:88d6c017966a78c5265d996c19cdb79235be5e6412268d7e2ce7dee339471b7a", size = 1697585, upload-time = "2025-10-28T20:58:06.189Z" }, + { url = "https://files.pythonhosted.org/packages/0c/07/9127916cb09bb38284db5036036042b7b2c514c8ebaeee79da550c43a6d6/aiohttp-3.13.2-cp314-cp314-win32.whl", hash = "sha256:f7c183e786e299b5d6c49fb43a769f8eb8e04a2726a2bd5887b98b5cc2d67940", size = 431621, upload-time = "2025-10-28T20:58:08.636Z" }, + { url = "https://files.pythonhosted.org/packages/fb/41/554a8a380df6d3a2bba8a7726429a23f4ac62aaf38de43bb6d6cde7b4d4d/aiohttp-3.13.2-cp314-cp314-win_amd64.whl", hash = "sha256:fe242cd381e0fb65758faf5ad96c2e460df6ee5b2de1072fe97e4127927e00b4", size = 457627, upload-time = "2025-10-28T20:58:11Z" }, + { url = "https://files.pythonhosted.org/packages/c7/8e/3824ef98c039d3951cb65b9205a96dd2b20f22241ee17d89c5701557c826/aiohttp-3.13.2-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:f10d9c0b0188fe85398c61147bbd2a657d616c876863bfeff43376e0e3134673", size = 767360, upload-time = "2025-10-28T20:58:13.358Z" }, + { url = "https://files.pythonhosted.org/packages/a4/0f/6a03e3fc7595421274fa34122c973bde2d89344f8a881b728fa8c774e4f1/aiohttp-3.13.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:e7c952aefdf2460f4ae55c5e9c3e80aa72f706a6317e06020f80e96253b1accd", size = 504616, upload-time = "2025-10-28T20:58:15.339Z" }, + { url = 
"https://files.pythonhosted.org/packages/c6/aa/ed341b670f1bc8a6f2c6a718353d13b9546e2cef3544f573c6a1ff0da711/aiohttp-3.13.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c20423ce14771d98353d2e25e83591fa75dfa90a3c1848f3d7c68243b4fbded3", size = 509131, upload-time = "2025-10-28T20:58:17.693Z" }, + { url = "https://files.pythonhosted.org/packages/7f/f0/c68dac234189dae5c4bbccc0f96ce0cc16b76632cfc3a08fff180045cfa4/aiohttp-3.13.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e96eb1a34396e9430c19d8338d2ec33015e4a87ef2b4449db94c22412e25ccdf", size = 1864168, upload-time = "2025-10-28T20:58:20.113Z" }, + { url = "https://files.pythonhosted.org/packages/8f/65/75a9a76db8364b5d0e52a0c20eabc5d52297385d9af9c35335b924fafdee/aiohttp-3.13.2-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:23fb0783bc1a33640036465019d3bba069942616a6a2353c6907d7fe1ccdaf4e", size = 1719200, upload-time = "2025-10-28T20:58:22.583Z" }, + { url = "https://files.pythonhosted.org/packages/f5/55/8df2ed78d7f41d232f6bd3ff866b6f617026551aa1d07e2f03458f964575/aiohttp-3.13.2-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2e1a9bea6244a1d05a4e57c295d69e159a5c50d8ef16aa390948ee873478d9a5", size = 1843497, upload-time = "2025-10-28T20:58:24.672Z" }, + { url = "https://files.pythonhosted.org/packages/e9/e0/94d7215e405c5a02ccb6a35c7a3a6cfff242f457a00196496935f700cde5/aiohttp-3.13.2-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0a3d54e822688b56e9f6b5816fb3de3a3a64660efac64e4c2dc435230ad23bad", size = 1935703, upload-time = "2025-10-28T20:58:26.758Z" }, + { url = "https://files.pythonhosted.org/packages/0b/78/1eeb63c3f9b2d1015a4c02788fb543141aad0a03ae3f7a7b669b2483f8d4/aiohttp-3.13.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:7a653d872afe9f33497215745da7a943d1dc15b728a9c8da1c3ac423af35178e", size = 1792738, upload-time = "2025-10-28T20:58:29.787Z" }, + { url = "https://files.pythonhosted.org/packages/41/75/aaf1eea4c188e51538c04cc568040e3082db263a57086ea74a7d38c39e42/aiohttp-3.13.2-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:56d36e80d2003fa3fc0207fac644216d8532e9504a785ef9a8fd013f84a42c61", size = 1624061, upload-time = "2025-10-28T20:58:32.529Z" }, + { url = "https://files.pythonhosted.org/packages/9b/c2/3b6034de81fbcc43de8aeb209073a2286dfb50b86e927b4efd81cf848197/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:78cd586d8331fb8e241c2dd6b2f4061778cc69e150514b39a9e28dd050475661", size = 1789201, upload-time = "2025-10-28T20:58:34.618Z" }, + { url = "https://files.pythonhosted.org/packages/c9/38/c15dcf6d4d890217dae79d7213988f4e5fe6183d43893a9cf2fe9e84ca8d/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:20b10bbfbff766294fe99987f7bb3b74fdd2f1a2905f2562132641ad434dcf98", size = 1776868, upload-time = "2025-10-28T20:58:38.835Z" }, + { url = "https://files.pythonhosted.org/packages/04/75/f74fd178ac81adf4f283a74847807ade5150e48feda6aef024403716c30c/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9ec49dff7e2b3c85cdeaa412e9d438f0ecd71676fde61ec57027dd392f00c693", size = 1790660, upload-time = "2025-10-28T20:58:41.507Z" }, + { url = "https://files.pythonhosted.org/packages/e7/80/7368bd0d06b16b3aba358c16b919e9c46cf11587dc572091031b0e9e3ef0/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:94f05348c4406450f9d73d38efb41d669ad6cd90c7ee194810d0eefbfa875a7a", size = 1617548, upload-time = "2025-10-28T20:58:43.674Z" }, + { url = "https://files.pythonhosted.org/packages/7d/4b/a6212790c50483cb3212e507378fbe26b5086d73941e1ec4b56a30439688/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:fa4dcb605c6f82a80c7f95713c2b11c3b8e9893b3ebd2bc9bde93165ed6107be", size = 
1817240, upload-time = "2025-10-28T20:58:45.787Z" }, + { url = "https://files.pythonhosted.org/packages/ff/f7/ba5f0ba4ea8d8f3c32850912944532b933acbf0f3a75546b89269b9b7dde/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:cf00e5db968c3f67eccd2778574cf64d8b27d95b237770aa32400bd7a1ca4f6c", size = 1762334, upload-time = "2025-10-28T20:58:47.936Z" }, + { url = "https://files.pythonhosted.org/packages/7e/83/1a5a1856574588b1cad63609ea9ad75b32a8353ac995d830bf5da9357364/aiohttp-3.13.2-cp314-cp314t-win32.whl", hash = "sha256:d23b5fe492b0805a50d3371e8a728a9134d8de5447dce4c885f5587294750734", size = 464685, upload-time = "2025-10-28T20:58:50.642Z" }, + { url = "https://files.pythonhosted.org/packages/9f/4d/d22668674122c08f4d56972297c51a624e64b3ed1efaa40187607a7cb66e/aiohttp-3.13.2-cp314-cp314t-win_amd64.whl", hash = "sha256:ff0a7b0a82a7ab905cbda74006318d1b12e37c797eb1b0d4eb3e316cf47f658f", size = 498093, upload-time = "2025-10-28T20:58:52.782Z" }, +] + +[[package]] +name = "aioitertools" +version = "0.13.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fd/3c/53c4a17a05fb9ea2313ee1777ff53f5e001aefd5cc85aa2f4c2d982e1e38/aioitertools-0.13.0.tar.gz", hash = "sha256:620bd241acc0bbb9ec819f1ab215866871b4bbd1f73836a55f799200ee86950c", size = 19322, upload-time = "2025-11-06T22:17:07.609Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/10/a1/510b0a7fadc6f43a6ce50152e69dbd86415240835868bb0bd9b5b88b1e06/aioitertools-0.13.0-py3-none-any.whl", hash = "sha256:0be0292b856f08dfac90e31f4739432f4cb6d7520ab9eb73e143f4f2fa5259be", size = 24182, upload-time = "2025-11-06T22:17:06.502Z" }, +] + +[[package]] +name = "aiosignal" +version = "1.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "frozenlist" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = 
"sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload-time = "2025-07-03T22:54:43.528Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" }, +] + +[[package]] +name = "aiosqlite" +version = "0.21.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/13/7d/8bca2bf9a247c2c5dfeec1d7a5f40db6518f88d314b8bca9da29670d2671/aiosqlite-0.21.0.tar.gz", hash = "sha256:131bb8056daa3bc875608c631c678cda73922a2d4ba8aec373b19f18c17e7aa3", size = 13454, upload-time = "2025-02-03T07:30:16.235Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f5/10/6c25ed6de94c49f88a91fa5018cb4c0f3625f31d5be9f771ebe5cc7cd506/aiosqlite-0.21.0-py3-none-any.whl", hash = "sha256:2549cf4057f95f53dcba16f2b64e8e2791d7e1adedb13197dd8ed77bb226d7d0", size = 15792, upload-time = "2025-02-03T07:30:13.6Z" }, +] + +[[package]] +name = "alembic" +version = "1.17.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mako" }, + { name = "sqlalchemy" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/02/a6/74c8cadc2882977d80ad756a13857857dbcf9bd405bc80b662eb10651282/alembic-1.17.2.tar.gz", hash = "sha256:bbe9751705c5e0f14877f02d46c53d10885e377e3d90eda810a016f9baa19e8e", size = 1988064, upload-time = "2025-11-14T20:35:04.057Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ba/88/6237e97e3385b57b5f1528647addea5cc03d4d65d5979ab24327d41fb00d/alembic-1.17.2-py3-none-any.whl", hash = "sha256:f483dd1fe93f6c5d49217055e4d15b905b425b6af906746abb35b69c1996c4e6", size = 248554, upload-time = "2025-11-14T20:35:05.699Z" }, 
+] + +[[package]] +name = "annotated-doc" +version = "0.0.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/57/ba/046ceea27344560984e26a590f90bc7f4a75b06701f653222458922b558c/annotated_doc-0.0.4.tar.gz", hash = "sha256:fbcda96e87e9c92ad167c2e53839e57503ecfda18804ea28102353485033faa4", size = 7288, upload-time = "2025-11-10T22:07:42.062Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/d3/26bf1008eb3d2daa8ef4cacc7f3bfdc11818d111f7e2d0201bc6e3b49d45/annotated_doc-0.0.4-py3-none-any.whl", hash = "sha256:571ac1dc6991c450b25a9c2d84a3705e2ae7a53467b5d111c24fa8baabbed320", size = 5303, upload-time = "2025-11-10T22:07:40.673Z" }, +] + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, +] + +[[package]] +name = "anyio" +version = "4.12.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/16/ce/8a777047513153587e5434fd752e89334ac33e379aa3497db860eeb60377/anyio-4.12.0.tar.gz", hash = "sha256:73c693b567b0c55130c104d0b43a9baf3aa6a31fc6110116509f27bf75e21ec0", size = 228266, upload-time = "2025-11-28T23:37:38.911Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/7f/9c/36c5c37947ebfb8c7f22e0eb6e4d188ee2d53aa3880f3f2744fb894f0cb1/anyio-4.12.0-py3-none-any.whl", hash = "sha256:dad2376a628f98eeca4881fc56cd06affd18f659b17a747d3ff0307ced94b1bb", size = 113362, upload-time = "2025-11-28T23:36:57.897Z" }, +] + +[[package]] +name = "arq" +version = "0.26.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "redis", extra = ["hiredis"] }, +] +sdist = { url = "https://files.pythonhosted.org/packages/4f/65/5add7049297a449d1453e26a8d5924f0d5440b3876edc9e80d5dc621f16d/arq-0.26.3.tar.gz", hash = "sha256:362063ea3c726562fb69c723d5b8ee80827fdefda782a8547da5be3d380ac4b1", size = 291111, upload-time = "2025-01-06T22:44:49.771Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/85/b3/a24a183c628da633b7cafd1759b14aaf47958de82ba6bcae9f1c2898781d/arq-0.26.3-py3-none-any.whl", hash = "sha256:9f4b78149a58c9dc4b88454861a254b7c4e7a159f2c973c89b548288b77e9005", size = 25968, upload-time = "2025-01-06T22:44:45.771Z" }, +] + +[[package]] +name = "asyncpg" +version = "0.31.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/cc/d18065ce2380d80b1bcce927c24a2642efd38918e33fd724bc4bca904877/asyncpg-0.31.0.tar.gz", hash = "sha256:c989386c83940bfbd787180f2b1519415e2d3d6277a70d9d0f0145ac73500735", size = 993667, upload-time = "2025-11-24T23:27:00.812Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/11/97b5c2af72a5d0b9bc3fa30cd4b9ce22284a9a943a150fdc768763caf035/asyncpg-0.31.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c204fab1b91e08b0f47e90a75d1b3c62174dab21f670ad6c5d0f243a228f015b", size = 661111, upload-time = "2025-11-24T23:26:04.467Z" }, + { url = "https://files.pythonhosted.org/packages/1b/71/157d611c791a5e2d0423f09f027bd499935f0906e0c2a416ce712ba51ef3/asyncpg-0.31.0-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:54a64f91839ba59008eccf7aad2e93d6e3de688d796f35803235ea1c4898ae1e", size = 636928, upload-time = "2025-11-24T23:26:05.944Z" }, + { url = "https://files.pythonhosted.org/packages/2e/fc/9e3486fb2bbe69d4a867c0b76d68542650a7ff1574ca40e84c3111bb0c6e/asyncpg-0.31.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c0e0822b1038dc7253b337b0f3f676cadc4ac31b126c5d42691c39691962e403", size = 3424067, upload-time = "2025-11-24T23:26:07.957Z" }, + { url = "https://files.pythonhosted.org/packages/12/c6/8c9d076f73f07f995013c791e018a1cd5f31823c2a3187fc8581706aa00f/asyncpg-0.31.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bef056aa502ee34204c161c72ca1f3c274917596877f825968368b2c33f585f4", size = 3518156, upload-time = "2025-11-24T23:26:09.591Z" }, + { url = "https://files.pythonhosted.org/packages/ae/3b/60683a0baf50fbc546499cfb53132cb6835b92b529a05f6a81471ab60d0c/asyncpg-0.31.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0bfbcc5b7ffcd9b75ab1558f00db2ae07db9c80637ad1b2469c43df79d7a5ae2", size = 3319636, upload-time = "2025-11-24T23:26:11.168Z" }, + { url = "https://files.pythonhosted.org/packages/50/dc/8487df0f69bd398a61e1792b3cba0e47477f214eff085ba0efa7eac9ce87/asyncpg-0.31.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:22bc525ebbdc24d1261ecbf6f504998244d4e3be1721784b5f64664d61fbe602", size = 3472079, upload-time = "2025-11-24T23:26:13.164Z" }, + { url = "https://files.pythonhosted.org/packages/13/a1/c5bbeeb8531c05c89135cb8b28575ac2fac618bcb60119ee9696c3faf71c/asyncpg-0.31.0-cp313-cp313-win32.whl", hash = "sha256:f890de5e1e4f7e14023619399a471ce4b71f5418cd67a51853b9910fdfa73696", size = 527606, upload-time = "2025-11-24T23:26:14.78Z" }, + { url = "https://files.pythonhosted.org/packages/91/66/b25ccb84a246b470eb943b0107c07edcae51804912b824054b3413995a10/asyncpg-0.31.0-cp313-cp313-win_amd64.whl", hash = 
"sha256:dc5f2fa9916f292e5c5c8b2ac2813763bcd7f58e130055b4ad8a0531314201ab", size = 596569, upload-time = "2025-11-24T23:26:16.189Z" }, + { url = "https://files.pythonhosted.org/packages/3c/36/e9450d62e84a13aea6580c83a47a437f26c7ca6fa0f0fd40b6670793ea30/asyncpg-0.31.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f6b56b91bb0ffc328c4e3ed113136cddd9deefdf5f79ab448598b9772831df44", size = 660867, upload-time = "2025-11-24T23:26:17.631Z" }, + { url = "https://files.pythonhosted.org/packages/82/4b/1d0a2b33b3102d210439338e1beea616a6122267c0df459ff0265cd5807a/asyncpg-0.31.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:334dec28cf20d7f5bb9e45b39546ddf247f8042a690bff9b9573d00086e69cb5", size = 638349, upload-time = "2025-11-24T23:26:19.689Z" }, + { url = "https://files.pythonhosted.org/packages/41/aa/e7f7ac9a7974f08eff9183e392b2d62516f90412686532d27e196c0f0eeb/asyncpg-0.31.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:98cc158c53f46de7bb677fd20c417e264fc02b36d901cc2a43bd6cb0dc6dbfd2", size = 3410428, upload-time = "2025-11-24T23:26:21.275Z" }, + { url = "https://files.pythonhosted.org/packages/6f/de/bf1b60de3dede5c2731e6788617a512bc0ebd9693eac297ee74086f101d7/asyncpg-0.31.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9322b563e2661a52e3cdbc93eed3be7748b289f792e0011cb2720d278b366ce2", size = 3471678, upload-time = "2025-11-24T23:26:23.627Z" }, + { url = "https://files.pythonhosted.org/packages/46/78/fc3ade003e22d8bd53aaf8f75f4be48f0b460fa73738f0391b9c856a9147/asyncpg-0.31.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:19857a358fc811d82227449b7ca40afb46e75b33eb8897240c3839dd8b744218", size = 3313505, upload-time = "2025-11-24T23:26:25.235Z" }, + { url = "https://files.pythonhosted.org/packages/bf/e9/73eb8a6789e927816f4705291be21f2225687bfa97321e40cd23055e903a/asyncpg-0.31.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = 
"sha256:ba5f8886e850882ff2c2ace5732300e99193823e8107e2c53ef01c1ebfa1e85d", size = 3434744, upload-time = "2025-11-24T23:26:26.944Z" }, + { url = "https://files.pythonhosted.org/packages/08/4b/f10b880534413c65c5b5862f79b8e81553a8f364e5238832ad4c0af71b7f/asyncpg-0.31.0-cp314-cp314-win32.whl", hash = "sha256:cea3a0b2a14f95834cee29432e4ddc399b95700eb1d51bbc5bfee8f31fa07b2b", size = 532251, upload-time = "2025-11-24T23:26:28.404Z" }, + { url = "https://files.pythonhosted.org/packages/d3/2d/7aa40750b7a19efa5d66e67fc06008ca0f27ba1bd082e457ad82f59aba49/asyncpg-0.31.0-cp314-cp314-win_amd64.whl", hash = "sha256:04d19392716af6b029411a0264d92093b6e5e8285ae97a39957b9a9c14ea72be", size = 604901, upload-time = "2025-11-24T23:26:30.34Z" }, + { url = "https://files.pythonhosted.org/packages/ce/fe/b9dfe349b83b9dee28cc42360d2c86b2cdce4cb551a2c2d27e156bcac84d/asyncpg-0.31.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:bdb957706da132e982cc6856bb2f7b740603472b54c3ebc77fe60ea3e57e1bd2", size = 702280, upload-time = "2025-11-24T23:26:32Z" }, + { url = "https://files.pythonhosted.org/packages/6a/81/e6be6e37e560bd91e6c23ea8a6138a04fd057b08cf63d3c5055c98e81c1d/asyncpg-0.31.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6d11b198111a72f47154fa03b85799f9be63701e068b43f84ac25da0bda9cb31", size = 682931, upload-time = "2025-11-24T23:26:33.572Z" }, + { url = "https://files.pythonhosted.org/packages/a6/45/6009040da85a1648dd5bc75b3b0a062081c483e75a1a29041ae63a0bf0dc/asyncpg-0.31.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:18c83b03bc0d1b23e6230f5bf8d4f217dc9bc08644ce0502a9d91dc9e634a9c7", size = 3581608, upload-time = "2025-11-24T23:26:35.638Z" }, + { url = "https://files.pythonhosted.org/packages/7e/06/2e3d4d7608b0b2b3adbee0d0bd6a2d29ca0fc4d8a78f8277df04e2d1fd7b/asyncpg-0.31.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:e009abc333464ff18b8f6fd146addffd9aaf63e79aa3bb40ab7a4c332d0c5e9e", size = 3498738, upload-time = "2025-11-24T23:26:37.275Z" }, + { url = "https://files.pythonhosted.org/packages/7d/aa/7d75ede780033141c51d83577ea23236ba7d3a23593929b32b49db8ed36e/asyncpg-0.31.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3b1fbcb0e396a5ca435a8826a87e5c2c2cc0c8c68eb6fadf82168056b0e53a8c", size = 3401026, upload-time = "2025-11-24T23:26:39.423Z" }, + { url = "https://files.pythonhosted.org/packages/ba/7a/15e37d45e7f7c94facc1e9148c0e455e8f33c08f0b8a0b1deb2c5171771b/asyncpg-0.31.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8df714dba348efcc162d2adf02d213e5fab1bd9f557e1305633e851a61814a7a", size = 3429426, upload-time = "2025-11-24T23:26:41.032Z" }, + { url = "https://files.pythonhosted.org/packages/13/d5/71437c5f6ae5f307828710efbe62163974e71237d5d46ebd2869ea052d10/asyncpg-0.31.0-cp314-cp314t-win32.whl", hash = "sha256:1b41f1afb1033f2b44f3234993b15096ddc9cd71b21a42dbd87fc6a57b43d65d", size = 614495, upload-time = "2025-11-24T23:26:42.659Z" }, + { url = "https://files.pythonhosted.org/packages/3c/d7/8fb3044eaef08a310acfe23dae9a8e2e07d305edc29a53497e52bc76eca7/asyncpg-0.31.0-cp314-cp314t-win_amd64.whl", hash = "sha256:bd4107bb7cdd0e9e65fae66a62afd3a249663b844fa34d479f6d5b3bef9c04c3", size = 706062, upload-time = "2025-11-24T23:26:44.086Z" }, +] + +[[package]] +name = "attrs" +version = "25.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6b/5c/685e6633917e101e5dcb62b9dd76946cbb57c26e133bae9e0cd36033c0a9/attrs-25.4.0.tar.gz", hash = "sha256:16d5969b87f0859ef33a48b35d55ac1be6e42ae49d5e853b597db70c35c57e11", size = 934251, upload-time = "2025-10-06T13:54:44.725Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3a/2a/7cc015f5b9f5db42b7d48157e23356022889fc354a2813c15934b7cb5c0e/attrs-25.4.0-py3-none-any.whl", hash = 
"sha256:adcf7e2a1fb3b36ac48d97835bb6d8ade15b8dcce26aba8bf1d14847b57a3373", size = 67615, upload-time = "2025-10-06T13:54:43.17Z" }, +] + +[[package]] +name = "bcrypt" +version = "5.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d4/36/3329e2518d70ad8e2e5817d5a4cac6bba05a47767ec416c7d020a965f408/bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd", size = 25386, upload-time = "2025-09-25T19:50:47.829Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/85/3e65e01985fddf25b64ca67275bb5bdb4040bd1a53b66d355c6c37c8a680/bcrypt-5.0.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f3c08197f3039bec79cee59a606d62b96b16669cff3949f21e74796b6e3cd2be", size = 481806, upload-time = "2025-09-25T19:49:05.102Z" }, + { url = "https://files.pythonhosted.org/packages/44/dc/01eb79f12b177017a726cbf78330eb0eb442fae0e7b3dfd84ea2849552f3/bcrypt-5.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:200af71bc25f22006f4069060c88ed36f8aa4ff7f53e67ff04d2ab3f1e79a5b2", size = 268626, upload-time = "2025-09-25T19:49:06.723Z" }, + { url = "https://files.pythonhosted.org/packages/8c/cf/e82388ad5959c40d6afd94fb4743cc077129d45b952d46bdc3180310e2df/bcrypt-5.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:baade0a5657654c2984468efb7d6c110db87ea63ef5a4b54732e7e337253e44f", size = 271853, upload-time = "2025-09-25T19:49:08.028Z" }, + { url = "https://files.pythonhosted.org/packages/ec/86/7134b9dae7cf0efa85671651341f6afa695857fae172615e960fb6a466fa/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c58b56cdfb03202b3bcc9fd8daee8e8e9b6d7e3163aa97c631dfcfcc24d36c86", size = 269793, upload-time = "2025-09-25T19:49:09.727Z" }, + { url = 
"https://files.pythonhosted.org/packages/cc/82/6296688ac1b9e503d034e7d0614d56e80c5d1a08402ff856a4549cb59207/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4bfd2a34de661f34d0bda43c3e4e79df586e4716ef401fe31ea39d69d581ef23", size = 289930, upload-time = "2025-09-25T19:49:11.204Z" }, + { url = "https://files.pythonhosted.org/packages/d1/18/884a44aa47f2a3b88dd09bc05a1e40b57878ecd111d17e5bba6f09f8bb77/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ed2e1365e31fc73f1825fa830f1c8f8917ca1b3ca6185773b349c20fd606cec2", size = 272194, upload-time = "2025-09-25T19:49:12.524Z" }, + { url = "https://files.pythonhosted.org/packages/0e/8f/371a3ab33c6982070b674f1788e05b656cfbf5685894acbfef0c65483a59/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:83e787d7a84dbbfba6f250dd7a5efd689e935f03dd83b0f919d39349e1f23f83", size = 269381, upload-time = "2025-09-25T19:49:14.308Z" }, + { url = "https://files.pythonhosted.org/packages/b1/34/7e4e6abb7a8778db6422e88b1f06eb07c47682313997ee8a8f9352e5a6f1/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:137c5156524328a24b9fac1cb5db0ba618bc97d11970b39184c1d87dc4bf1746", size = 271750, upload-time = "2025-09-25T19:49:15.584Z" }, + { url = "https://files.pythonhosted.org/packages/c0/1b/54f416be2499bd72123c70d98d36c6cd61a4e33d9b89562c22481c81bb30/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:38cac74101777a6a7d3b3e3cfefa57089b5ada650dce2baf0cbdd9d65db22a9e", size = 303757, upload-time = "2025-09-25T19:49:17.244Z" }, + { url = "https://files.pythonhosted.org/packages/13/62/062c24c7bcf9d2826a1a843d0d605c65a755bc98002923d01fd61270705a/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:d8d65b564ec849643d9f7ea05c6d9f0cd7ca23bdd4ac0c2dbef1104ab504543d", size = 306740, upload-time = "2025-09-25T19:49:18.693Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/c8/1fdbfc8c0f20875b6b4020f3c7dc447b8de60aa0be5faaf009d24242aec9/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:741449132f64b3524e95cd30e5cd3343006ce146088f074f31ab26b94e6c75ba", size = 334197, upload-time = "2025-09-25T19:49:20.523Z" }, + { url = "https://files.pythonhosted.org/packages/a6/c1/8b84545382d75bef226fbc6588af0f7b7d095f7cd6a670b42a86243183cd/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:212139484ab3207b1f0c00633d3be92fef3c5f0af17cad155679d03ff2ee1e41", size = 352974, upload-time = "2025-09-25T19:49:22.254Z" }, + { url = "https://files.pythonhosted.org/packages/10/a6/ffb49d4254ed085e62e3e5dd05982b4393e32fe1e49bb1130186617c29cd/bcrypt-5.0.0-cp313-cp313t-win32.whl", hash = "sha256:9d52ed507c2488eddd6a95bccee4e808d3234fa78dd370e24bac65a21212b861", size = 148498, upload-time = "2025-09-25T19:49:24.134Z" }, + { url = "https://files.pythonhosted.org/packages/48/a9/259559edc85258b6d5fc5471a62a3299a6aa37a6611a169756bf4689323c/bcrypt-5.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f6984a24db30548fd39a44360532898c33528b74aedf81c26cf29c51ee47057e", size = 145853, upload-time = "2025-09-25T19:49:25.702Z" }, + { url = "https://files.pythonhosted.org/packages/2d/df/9714173403c7e8b245acf8e4be8876aac64a209d1b392af457c79e60492e/bcrypt-5.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:9fffdb387abe6aa775af36ef16f55e318dcda4194ddbf82007a6f21da29de8f5", size = 139626, upload-time = "2025-09-25T19:49:26.928Z" }, + { url = "https://files.pythonhosted.org/packages/f8/14/c18006f91816606a4abe294ccc5d1e6f0e42304df5a33710e9e8e95416e1/bcrypt-5.0.0-cp314-cp314t-macosx_10_12_universal2.whl", hash = "sha256:4870a52610537037adb382444fefd3706d96d663ac44cbb2f37e3919dca3d7ef", size = 481862, upload-time = "2025-09-25T19:49:28.365Z" }, + { url = 
"https://files.pythonhosted.org/packages/67/49/dd074d831f00e589537e07a0725cf0e220d1f0d5d8e85ad5bbff251c45aa/bcrypt-5.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:48f753100931605686f74e27a7b49238122aa761a9aefe9373265b8b7aa43ea4", size = 268544, upload-time = "2025-09-25T19:49:30.39Z" }, + { url = "https://files.pythonhosted.org/packages/f5/91/50ccba088b8c474545b034a1424d05195d9fcbaaf802ab8bfe2be5a4e0d7/bcrypt-5.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f70aadb7a809305226daedf75d90379c397b094755a710d7014b8b117df1ebbf", size = 271787, upload-time = "2025-09-25T19:49:32.144Z" }, + { url = "https://files.pythonhosted.org/packages/aa/e7/d7dba133e02abcda3b52087a7eea8c0d4f64d3e593b4fffc10c31b7061f3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:744d3c6b164caa658adcb72cb8cc9ad9b4b75c7db507ab4bc2480474a51989da", size = 269753, upload-time = "2025-09-25T19:49:33.885Z" }, + { url = "https://files.pythonhosted.org/packages/33/fc/5b145673c4b8d01018307b5c2c1fc87a6f5a436f0ad56607aee389de8ee3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a28bc05039bdf3289d757f49d616ab3efe8cf40d8e8001ccdd621cd4f98f4fc9", size = 289587, upload-time = "2025-09-25T19:49:35.144Z" }, + { url = "https://files.pythonhosted.org/packages/27/d7/1ff22703ec6d4f90e62f1a5654b8867ef96bafb8e8102c2288333e1a6ca6/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7f277a4b3390ab4bebe597800a90da0edae882c6196d3038a73adf446c4f969f", size = 272178, upload-time = "2025-09-25T19:49:36.793Z" }, + { url = "https://files.pythonhosted.org/packages/c8/88/815b6d558a1e4d40ece04a2f84865b0fef233513bd85fd0e40c294272d62/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:79cfa161eda8d2ddf29acad370356b47f02387153b11d46042e93a0a95127493", size = 269295, upload-time = "2025-09-25T19:49:38.164Z" }, + { url = 
"https://files.pythonhosted.org/packages/51/8c/e0db387c79ab4931fc89827d37608c31cc57b6edc08ccd2386139028dc0d/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a5393eae5722bcef046a990b84dff02b954904c36a194f6cfc817d7dca6c6f0b", size = 271700, upload-time = "2025-09-25T19:49:39.917Z" }, + { url = "https://files.pythonhosted.org/packages/06/83/1570edddd150f572dbe9fc00f6203a89fc7d4226821f67328a85c330f239/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7f4c94dec1b5ab5d522750cb059bb9409ea8872d4494fd152b53cca99f1ddd8c", size = 334034, upload-time = "2025-09-25T19:49:41.227Z" }, + { url = "https://files.pythonhosted.org/packages/c9/f2/ea64e51a65e56ae7a8a4ec236c2bfbdd4b23008abd50ac33fbb2d1d15424/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0cae4cb350934dfd74c020525eeae0a5f79257e8a201c0c176f4b84fdbf2a4b4", size = 352766, upload-time = "2025-09-25T19:49:43.08Z" }, + { url = "https://files.pythonhosted.org/packages/d7/d4/1a388d21ee66876f27d1a1f41287897d0c0f1712ef97d395d708ba93004c/bcrypt-5.0.0-cp314-cp314t-win32.whl", hash = "sha256:b17366316c654e1ad0306a6858e189fc835eca39f7eb2cafd6aaca8ce0c40a2e", size = 152449, upload-time = "2025-09-25T19:49:44.971Z" }, + { url = "https://files.pythonhosted.org/packages/3f/61/3291c2243ae0229e5bca5d19f4032cecad5dfb05a2557169d3a69dc0ba91/bcrypt-5.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:92864f54fb48b4c718fc92a32825d0e42265a627f956bc0361fe869f1adc3e7d", size = 149310, upload-time = "2025-09-25T19:49:46.162Z" }, + { url = "https://files.pythonhosted.org/packages/3e/89/4b01c52ae0c1a681d4021e5dd3e45b111a8fb47254a274fa9a378d8d834b/bcrypt-5.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:dd19cf5184a90c873009244586396a6a884d591a5323f0e8a5922560718d4993", size = 143761, upload-time = "2025-09-25T19:49:47.345Z" }, + { url = "https://files.pythonhosted.org/packages/84/29/6237f151fbfe295fe3e074ecc6d44228faa1e842a81f6d34a02937ee1736/bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", 
hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b", size = 494553, upload-time = "2025-09-25T19:49:49.006Z" }, + { url = "https://files.pythonhosted.org/packages/45/b6/4c1205dde5e464ea3bd88e8742e19f899c16fa8916fb8510a851fae985b5/bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb", size = 275009, upload-time = "2025-09-25T19:49:50.581Z" }, + { url = "https://files.pythonhosted.org/packages/3b/71/427945e6ead72ccffe77894b2655b695ccf14ae1866cd977e185d606dd2f/bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef", size = 278029, upload-time = "2025-09-25T19:49:52.533Z" }, + { url = "https://files.pythonhosted.org/packages/17/72/c344825e3b83c5389a369c8a8e58ffe1480b8a699f46c127c34580c4666b/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd", size = 275907, upload-time = "2025-09-25T19:49:54.709Z" }, + { url = "https://files.pythonhosted.org/packages/0b/7e/d4e47d2df1641a36d1212e5c0514f5291e1a956a7749f1e595c07a972038/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd", size = 296500, upload-time = "2025-09-25T19:49:56.013Z" }, + { url = "https://files.pythonhosted.org/packages/0f/c3/0ae57a68be2039287ec28bc463b82e4b8dc23f9d12c0be331f4782e19108/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464", size = 278412, upload-time = "2025-09-25T19:49:57.356Z" }, + { url = "https://files.pythonhosted.org/packages/45/2b/77424511adb11e6a99e3a00dcc7745034bee89036ad7d7e255a7e47be7d8/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = 
"sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75", size = 275486, upload-time = "2025-09-25T19:49:59.116Z" }, + { url = "https://files.pythonhosted.org/packages/43/0a/405c753f6158e0f3f14b00b462d8bca31296f7ecfc8fc8bc7919c0c7d73a/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff", size = 277940, upload-time = "2025-09-25T19:50:00.869Z" }, + { url = "https://files.pythonhosted.org/packages/62/83/b3efc285d4aadc1fa83db385ec64dcfa1707e890eb42f03b127d66ac1b7b/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4", size = 310776, upload-time = "2025-09-25T19:50:02.393Z" }, + { url = "https://files.pythonhosted.org/packages/95/7d/47ee337dacecde6d234890fe929936cb03ebc4c3a7460854bbd9c97780b8/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb", size = 312922, upload-time = "2025-09-25T19:50:04.232Z" }, + { url = "https://files.pythonhosted.org/packages/d6/3a/43d494dfb728f55f4e1cf8fd435d50c16a2d75493225b54c8d06122523c6/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c", size = 341367, upload-time = "2025-09-25T19:50:05.559Z" }, + { url = "https://files.pythonhosted.org/packages/55/ab/a0727a4547e383e2e22a630e0f908113db37904f58719dc48d4622139b5c/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb", size = 359187, upload-time = "2025-09-25T19:50:06.916Z" }, + { url = "https://files.pythonhosted.org/packages/1b/bb/461f352fdca663524b4643d8b09e8435b4990f17fbf4fea6bc2a90aa0cc7/bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538", size = 153752, upload-time = "2025-09-25T19:50:08.515Z" }, + { url = 
"https://files.pythonhosted.org/packages/41/aa/4190e60921927b7056820291f56fc57d00d04757c8b316b2d3c0d1d6da2c/bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9", size = 150881, upload-time = "2025-09-25T19:50:09.742Z" }, + { url = "https://files.pythonhosted.org/packages/54/12/cd77221719d0b39ac0b55dbd39358db1cd1246e0282e104366ebbfb8266a/bcrypt-5.0.0-cp38-abi3-win_arm64.whl", hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980", size = 144931, upload-time = "2025-09-25T19:50:11.016Z" }, + { url = "https://files.pythonhosted.org/packages/5d/ba/2af136406e1c3839aea9ecadc2f6be2bcd1eff255bd451dd39bcf302c47a/bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a", size = 495313, upload-time = "2025-09-25T19:50:12.309Z" }, + { url = "https://files.pythonhosted.org/packages/ac/ee/2f4985dbad090ace5ad1f7dd8ff94477fe089b5fab2040bd784a3d5f187b/bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191", size = 275290, upload-time = "2025-09-25T19:50:13.673Z" }, + { url = "https://files.pythonhosted.org/packages/e4/6e/b77ade812672d15cf50842e167eead80ac3514f3beacac8902915417f8b7/bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254", size = 278253, upload-time = "2025-09-25T19:50:15.089Z" }, + { url = "https://files.pythonhosted.org/packages/36/c4/ed00ed32f1040f7990dac7115f82273e3c03da1e1a1587a778d8cea496d8/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db", size = 276084, upload-time = "2025-09-25T19:50:16.699Z" }, + { url = 
"https://files.pythonhosted.org/packages/e7/c4/fa6e16145e145e87f1fa351bbd54b429354fd72145cd3d4e0c5157cf4c70/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac", size = 297185, upload-time = "2025-09-25T19:50:18.525Z" }, + { url = "https://files.pythonhosted.org/packages/24/b4/11f8a31d8b67cca3371e046db49baa7c0594d71eb40ac8121e2fc0888db0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822", size = 278656, upload-time = "2025-09-25T19:50:19.809Z" }, + { url = "https://files.pythonhosted.org/packages/ac/31/79f11865f8078e192847d2cb526e3fa27c200933c982c5b2869720fa5fce/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8", size = 275662, upload-time = "2025-09-25T19:50:21.567Z" }, + { url = "https://files.pythonhosted.org/packages/d4/8d/5e43d9584b3b3591a6f9b68f755a4da879a59712981ef5ad2a0ac1379f7a/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a", size = 278240, upload-time = "2025-09-25T19:50:23.305Z" }, + { url = "https://files.pythonhosted.org/packages/89/48/44590e3fc158620f680a978aafe8f87a4c4320da81ed11552f0323aa9a57/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1", size = 311152, upload-time = "2025-09-25T19:50:24.597Z" }, + { url = "https://files.pythonhosted.org/packages/5f/85/e4fbfc46f14f47b0d20493669a625da5827d07e8a88ee460af6cd9768b44/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42", size = 313284, upload-time = "2025-09-25T19:50:26.268Z" }, + { url = 
"https://files.pythonhosted.org/packages/25/ae/479f81d3f4594456a01ea2f05b132a519eff9ab5768a70430fa1132384b1/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10", size = 341643, upload-time = "2025-09-25T19:50:28.02Z" }, + { url = "https://files.pythonhosted.org/packages/df/d2/36a086dee1473b14276cd6ea7f61aef3b2648710b5d7f1c9e032c29b859f/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172", size = 359698, upload-time = "2025-09-25T19:50:31.347Z" }, + { url = "https://files.pythonhosted.org/packages/c0/f6/688d2cd64bfd0b14d805ddb8a565e11ca1fb0fd6817175d58b10052b6d88/bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683", size = 153725, upload-time = "2025-09-25T19:50:34.384Z" }, + { url = "https://files.pythonhosted.org/packages/9f/b9/9d9a641194a730bda138b3dfe53f584d61c58cd5230e37566e83ec2ffa0d/bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2", size = 150912, upload-time = "2025-09-25T19:50:35.69Z" }, + { url = "https://files.pythonhosted.org/packages/27/44/d2ef5e87509158ad2187f4dd0852df80695bb1ee0cfe0a684727b01a69e0/bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927", size = 144953, upload-time = "2025-09-25T19:50:37.32Z" }, +] + +[[package]] +name = "bookbytes" +version = "0.1.0" +source = { editable = "." 
} +dependencies = [ + { name = "aioboto3" }, + { name = "aiofiles" }, + { name = "aiosqlite" }, + { name = "alembic" }, + { name = "anyio" }, + { name = "arq" }, + { name = "asyncpg" }, + { name = "boto3" }, + { name = "fastapi" }, + { name = "gtts" }, + { name = "httpx" }, + { name = "instructor" }, + { name = "openai" }, + { name = "passlib", extra = ["bcrypt"] }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-dotenv" }, + { name = "python-jose", extra = ["cryptography"] }, + { name = "python-multipart" }, + { name = "redis" }, + { name = "sqlalchemy", extra = ["asyncio"] }, + { name = "structlog" }, + { name = "tenacity" }, + { name = "uuid6" }, + { name = "uvicorn", extra = ["standard"] }, +] + +[package.optional-dependencies] +dev = [ + { name = "fakeredis" }, + { name = "mypy" }, + { name = "pre-commit" }, + { name = "pytest" }, + { name = "pytest-asyncio" }, + { name = "pytest-cov" }, + { name = "respx" }, + { name = "ruff" }, +] + +[package.metadata] +requires-dist = [ + { name = "aioboto3" }, + { name = "aiofiles" }, + { name = "aiosqlite" }, + { name = "alembic" }, + { name = "anyio" }, + { name = "arq" }, + { name = "asyncpg" }, + { name = "boto3" }, + { name = "fakeredis", marker = "extra == 'dev'" }, + { name = "fastapi" }, + { name = "gtts" }, + { name = "httpx" }, + { name = "instructor", specifier = ">=1.0.0" }, + { name = "mypy", marker = "extra == 'dev'" }, + { name = "openai" }, + { name = "passlib", extras = ["bcrypt"] }, + { name = "pre-commit", marker = "extra == 'dev'" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "pytest", marker = "extra == 'dev'" }, + { name = "pytest-asyncio", marker = "extra == 'dev'" }, + { name = "pytest-cov", marker = "extra == 'dev'" }, + { name = "python-dotenv" }, + { name = "python-jose", extras = ["cryptography"] }, + { name = "python-multipart" }, + { name = "redis" }, + { name = "respx", marker = "extra == 'dev'" }, + { name = "ruff", marker = "extra == 
'dev'" }, + { name = "sqlalchemy", extras = ["asyncio"] }, + { name = "structlog" }, + { name = "tenacity" }, + { name = "uuid6" }, + { name = "uvicorn", extras = ["standard"] }, +] +provides-extras = ["dev"] + +[[package]] +name = "boto3" +version = "1.40.61" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "botocore" }, + { name = "jmespath" }, + { name = "s3transfer" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ed/f9/6ef8feb52c3cce5ec3967a535a6114b57ac7949fd166b0f3090c2b06e4e5/boto3-1.40.61.tar.gz", hash = "sha256:d6c56277251adf6c2bdd25249feae625abe4966831676689ff23b4694dea5b12", size = 111535, upload-time = "2025-10-28T19:26:57.247Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/61/24/3bf865b07d15fea85b63504856e137029b6acbc73762496064219cdb265d/boto3-1.40.61-py3-none-any.whl", hash = "sha256:6b9c57b2a922b5d8c17766e29ed792586a818098efe84def27c8f582b33f898c", size = 139321, upload-time = "2025-10-28T19:26:55.007Z" }, +] + +[[package]] +name = "botocore" +version = "1.40.61" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jmespath" }, + { name = "python-dateutil" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/28/a3/81d3a47c2dbfd76f185d3b894f2ad01a75096c006a2dd91f237dca182188/botocore-1.40.61.tar.gz", hash = "sha256:a2487ad69b090f9cccd64cf07c7021cd80ee9c0655ad974f87045b02f3ef52cd", size = 14393956, upload-time = "2025-10-28T19:26:46.108Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/38/c5/f6ce561004db45f0b847c2cd9b19c67c6bf348a82018a48cb718be6b58b0/botocore-1.40.61-py3-none-any.whl", hash = "sha256:17ebae412692fd4824f99cde0f08d50126dc97954008e5ba2b522eb049238aa7", size = 14055973, upload-time = "2025-10-28T19:26:42.15Z" }, +] + +[[package]] +name = "certifi" +version = "2025.11.12" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/a2/8c/58f469717fa48465e4a50c014a0400602d3c437d7c0c468e17ada824da3a/certifi-2025.11.12.tar.gz", hash = "sha256:d8ab5478f2ecd78af242878415affce761ca6bc54a22a27e026d7c25357c3316", size = 160538, upload-time = "2025-11-12T02:54:51.517Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/70/7d/9bc192684cea499815ff478dfcdc13835ddf401365057044fb721ec6bddb/certifi-2025.11.12-py3-none-any.whl", hash = "sha256:97de8790030bbd5c2d96b7ec782fc2f7820ef8dba6db909ccf95449f2d062d4b", size = 159438, upload-time = "2025-11-12T02:54:49.735Z" }, +] + +[[package]] +name = "cffi" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser", marker = "implementation_name != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, + { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, + { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 
212446, upload-time = "2025-09-08T23:23:03.472Z" }, + { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, + { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, + { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, + { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, + { 
url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, + { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, + { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, + { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, + { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, + { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, + { url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, + { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, + { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" }, + { url = 
"https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" }, + { url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" }, + { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, + { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, + { url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, + { url = 
"https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, + { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, + { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, + { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, + { url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" }, + { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" }, + { url = 
"https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" }, +] + +[[package]] +name = "cfgv" +version = "3.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4e/b5/721b8799b04bf9afe054a3899c6cf4e880fcf8563cc71c15610242490a0c/cfgv-3.5.0.tar.gz", hash = "sha256:d5b1034354820651caa73ede66a6294d6e95c1b00acc5e9b098e917404669132", size = 7334, upload-time = "2025-11-19T20:55:51.612Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/db/3c/33bac158f8ab7f89b2e59426d5fe2e4f63f7ed25df84c036890172b412b5/cfgv-3.5.0-py2.py3-none-any.whl", hash = "sha256:a8dc6b26ad22ff227d2634a65cb388215ce6cc96bbcc5cfde7641ae87e8dacc0", size = 7445, upload-time = "2025-11-19T20:55:50.744Z" }, +] + +[[package]] +name = "charset-normalizer" +version = "3.4.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/13/69/33ddede1939fdd074bce5434295f38fae7136463422fe4fd3e0e89b98062/charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a", size = 129418, upload-time = "2025-10-14T04:42:32.879Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/97/45/4b3a1239bbacd321068ea6e7ac28875b03ab8bc0aa0966452db17cd36714/charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794", size = 208091, upload-time = "2025-10-14T04:41:13.346Z" }, + { url = "https://files.pythonhosted.org/packages/7d/62/73a6d7450829655a35bb88a88fca7d736f9882a27eacdca2c6d505b57e2e/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed", size = 147936, upload-time = "2025-10-14T04:41:14.461Z" }, + { url = "https://files.pythonhosted.org/packages/89/c5/adb8c8b3d6625bef6d88b251bbb0d95f8205831b987631ab0c8bb5d937c2/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72", size = 144180, upload-time = "2025-10-14T04:41:15.588Z" }, + { url = "https://files.pythonhosted.org/packages/91/ed/9706e4070682d1cc219050b6048bfd293ccf67b3d4f5a4f39207453d4b99/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328", size = 161346, upload-time = "2025-10-14T04:41:16.738Z" }, + { url = "https://files.pythonhosted.org/packages/d5/0d/031f0d95e4972901a2f6f09ef055751805ff541511dc1252ba3ca1f80cf5/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede", size = 158874, upload-time = "2025-10-14T04:41:17.923Z" }, + { url = "https://files.pythonhosted.org/packages/f5/83/6ab5883f57c9c801ce5e5677242328aa45592be8a00644310a008d04f922/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894", size = 153076, upload-time = "2025-10-14T04:41:19.106Z" }, + { url = "https://files.pythonhosted.org/packages/75/1e/5ff781ddf5260e387d6419959ee89ef13878229732732ee73cdae01800f2/charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1", size = 150601, upload-time = "2025-10-14T04:41:20.245Z" }, + { url = 
"https://files.pythonhosted.org/packages/d7/57/71be810965493d3510a6ca79b90c19e48696fb1ff964da319334b12677f0/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490", size = 150376, upload-time = "2025-10-14T04:41:21.398Z" }, + { url = "https://files.pythonhosted.org/packages/e5/d5/c3d057a78c181d007014feb7e9f2e65905a6c4ef182c0ddf0de2924edd65/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44", size = 144825, upload-time = "2025-10-14T04:41:22.583Z" }, + { url = "https://files.pythonhosted.org/packages/e6/8c/d0406294828d4976f275ffbe66f00266c4b3136b7506941d87c00cab5272/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133", size = 162583, upload-time = "2025-10-14T04:41:23.754Z" }, + { url = "https://files.pythonhosted.org/packages/d7/24/e2aa1f18c8f15c4c0e932d9287b8609dd30ad56dbe41d926bd846e22fb8d/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3", size = 150366, upload-time = "2025-10-14T04:41:25.27Z" }, + { url = "https://files.pythonhosted.org/packages/e4/5b/1e6160c7739aad1e2df054300cc618b06bf784a7a164b0f238360721ab86/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e", size = 160300, upload-time = "2025-10-14T04:41:26.725Z" }, + { url = "https://files.pythonhosted.org/packages/7a/10/f882167cd207fbdd743e55534d5d9620e095089d176d55cb22d5322f2afd/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc", size = 154465, upload-time = "2025-10-14T04:41:28.322Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/66/c7a9e1b7429be72123441bfdbaf2bc13faab3f90b933f664db506dea5915/charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac", size = 99404, upload-time = "2025-10-14T04:41:29.95Z" }, + { url = "https://files.pythonhosted.org/packages/c4/26/b9924fa27db384bdcd97ab83b4f0a8058d96ad9626ead570674d5e737d90/charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14", size = 107092, upload-time = "2025-10-14T04:41:31.188Z" }, + { url = "https://files.pythonhosted.org/packages/af/8f/3ed4bfa0c0c72a7ca17f0380cd9e4dd842b09f664e780c13cff1dcf2ef1b/charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2", size = 100408, upload-time = "2025-10-14T04:41:32.624Z" }, + { url = "https://files.pythonhosted.org/packages/2a/35/7051599bd493e62411d6ede36fd5af83a38f37c4767b92884df7301db25d/charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd", size = 207746, upload-time = "2025-10-14T04:41:33.773Z" }, + { url = "https://files.pythonhosted.org/packages/10/9a/97c8d48ef10d6cd4fcead2415523221624bf58bcf68a802721a6bc807c8f/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb", size = 147889, upload-time = "2025-10-14T04:41:34.897Z" }, + { url = "https://files.pythonhosted.org/packages/10/bf/979224a919a1b606c82bd2c5fa49b5c6d5727aa47b4312bb27b1734f53cd/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e", size = 143641, upload-time = "2025-10-14T04:41:36.116Z" }, + { url = 
"https://files.pythonhosted.org/packages/ba/33/0ad65587441fc730dc7bd90e9716b30b4702dc7b617e6ba4997dc8651495/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14", size = 160779, upload-time = "2025-10-14T04:41:37.229Z" }, + { url = "https://files.pythonhosted.org/packages/67/ed/331d6b249259ee71ddea93f6f2f0a56cfebd46938bde6fcc6f7b9a3d0e09/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191", size = 159035, upload-time = "2025-10-14T04:41:38.368Z" }, + { url = "https://files.pythonhosted.org/packages/67/ff/f6b948ca32e4f2a4576aa129d8bed61f2e0543bf9f5f2b7fc3758ed005c9/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838", size = 152542, upload-time = "2025-10-14T04:41:39.862Z" }, + { url = "https://files.pythonhosted.org/packages/16/85/276033dcbcc369eb176594de22728541a925b2632f9716428c851b149e83/charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6", size = 149524, upload-time = "2025-10-14T04:41:41.319Z" }, + { url = "https://files.pythonhosted.org/packages/9e/f2/6a2a1f722b6aba37050e626530a46a68f74e63683947a8acff92569f979a/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e", size = 150395, upload-time = "2025-10-14T04:41:42.539Z" }, + { url = "https://files.pythonhosted.org/packages/60/bb/2186cb2f2bbaea6338cad15ce23a67f9b0672929744381e28b0592676824/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = 
"sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c", size = 143680, upload-time = "2025-10-14T04:41:43.661Z" }, + { url = "https://files.pythonhosted.org/packages/7d/a5/bf6f13b772fbb2a90360eb620d52ed8f796f3c5caee8398c3b2eb7b1c60d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090", size = 162045, upload-time = "2025-10-14T04:41:44.821Z" }, + { url = "https://files.pythonhosted.org/packages/df/c5/d1be898bf0dc3ef9030c3825e5d3b83f2c528d207d246cbabe245966808d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152", size = 149687, upload-time = "2025-10-14T04:41:46.442Z" }, + { url = "https://files.pythonhosted.org/packages/a5/42/90c1f7b9341eef50c8a1cb3f098ac43b0508413f33affd762855f67a410e/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828", size = 160014, upload-time = "2025-10-14T04:41:47.631Z" }, + { url = "https://files.pythonhosted.org/packages/76/be/4d3ee471e8145d12795ab655ece37baed0929462a86e72372fd25859047c/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec", size = 154044, upload-time = "2025-10-14T04:41:48.81Z" }, + { url = "https://files.pythonhosted.org/packages/b0/6f/8f7af07237c34a1defe7defc565a9bc1807762f672c0fde711a4b22bf9c0/charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9", size = 99940, upload-time = "2025-10-14T04:41:49.946Z" }, + { url = "https://files.pythonhosted.org/packages/4b/51/8ade005e5ca5b0d80fb4aff72a3775b325bdc3d27408c8113811a7cbe640/charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c", size = 
107104, upload-time = "2025-10-14T04:41:51.051Z" }, + { url = "https://files.pythonhosted.org/packages/da/5f/6b8f83a55bb8278772c5ae54a577f3099025f9ade59d0136ac24a0df4bde/charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2", size = 100743, upload-time = "2025-10-14T04:41:52.122Z" }, + { url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" }, +] + +[[package]] +name = "click" +version = "8.1.8" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593, upload-time = "2024-12-21T18:38:44.339Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188, upload-time = "2024-12-21T18:38:41.666Z" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, +] + +[[package]] +name = "coverage" +version = "7.12.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/89/26/4a96807b193b011588099c3b5c89fbb05294e5b90e71018e065465f34eb6/coverage-7.12.0.tar.gz", hash = "sha256:fc11e0a4e372cb5f282f16ef90d4a585034050ccda536451901abfb19a57f40c", size = 819341, upload-time = "2025-11-18T13:34:20.766Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b8/14/771700b4048774e48d2c54ed0c674273702713c9ee7acdfede40c2666747/coverage-7.12.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:47324fffca8d8eae7e185b5bb20c14645f23350f870c1649003618ea91a78941", size = 217725, upload-time = "2025-11-18T13:32:49.22Z" }, + { url = "https://files.pythonhosted.org/packages/17/a7/3aa4144d3bcb719bf67b22d2d51c2d577bf801498c13cb08f64173e80497/coverage-7.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ccf3b2ede91decd2fb53ec73c1f949c3e034129d1e0b07798ff1d02ea0c8fa4a", size = 218098, upload-time = "2025-11-18T13:32:50.78Z" }, + { url = "https://files.pythonhosted.org/packages/fc/9c/b846bbc774ff81091a12a10203e70562c91ae71badda00c5ae5b613527b1/coverage-7.12.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b365adc70a6936c6b0582dc38746b33b2454148c02349345412c6e743efb646d", size = 249093, upload-time = "2025-11-18T13:32:52.554Z" }, + { url = "https://files.pythonhosted.org/packages/76/b6/67d7c0e1f400b32c883e9342de4a8c2ae7c1a0b57c5de87622b7262e2309/coverage-7.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:bc13baf85cd8a4cfcf4a35c7bc9d795837ad809775f782f697bf630b7e200211", size = 251686, upload-time = 
"2025-11-18T13:32:54.862Z" }, + { url = "https://files.pythonhosted.org/packages/cc/75/b095bd4b39d49c3be4bffbb3135fea18a99a431c52dd7513637c0762fecb/coverage-7.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:099d11698385d572ceafb3288a5b80fe1fc58bf665b3f9d362389de488361d3d", size = 252930, upload-time = "2025-11-18T13:32:56.417Z" }, + { url = "https://files.pythonhosted.org/packages/6e/f3/466f63015c7c80550bead3093aacabf5380c1220a2a93c35d374cae8f762/coverage-7.12.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:473dc45d69694069adb7680c405fb1e81f60b2aff42c81e2f2c3feaf544d878c", size = 249296, upload-time = "2025-11-18T13:32:58.074Z" }, + { url = "https://files.pythonhosted.org/packages/27/86/eba2209bf2b7e28c68698fc13437519a295b2d228ba9e0ec91673e09fa92/coverage-7.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:583f9adbefd278e9de33c33d6846aa8f5d164fa49b47144180a0e037f0688bb9", size = 251068, upload-time = "2025-11-18T13:32:59.646Z" }, + { url = "https://files.pythonhosted.org/packages/ec/55/ca8ae7dbba962a3351f18940b359b94c6bafdd7757945fdc79ec9e452dc7/coverage-7.12.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b2089cc445f2dc0af6f801f0d1355c025b76c24481935303cf1af28f636688f0", size = 249034, upload-time = "2025-11-18T13:33:01.481Z" }, + { url = "https://files.pythonhosted.org/packages/7a/d7/39136149325cad92d420b023b5fd900dabdd1c3a0d1d5f148ef4a8cedef5/coverage-7.12.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:950411f1eb5d579999c5f66c62a40961f126fc71e5e14419f004471957b51508", size = 248853, upload-time = "2025-11-18T13:33:02.935Z" }, + { url = "https://files.pythonhosted.org/packages/fe/b6/76e1add8b87ef60e00643b0b7f8f7bb73d4bf5249a3be19ebefc5793dd25/coverage-7.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b1aab7302a87bafebfe76b12af681b56ff446dc6f32ed178ff9c092ca776e6bc", size = 250619, upload-time = "2025-11-18T13:33:04.336Z" }, + { url = 
"https://files.pythonhosted.org/packages/95/87/924c6dc64f9203f7a3c1832a6a0eee5a8335dbe5f1bdadcc278d6f1b4d74/coverage-7.12.0-cp313-cp313-win32.whl", hash = "sha256:d7e0d0303c13b54db495eb636bc2465b2fb8475d4c8bcec8fe4b5ca454dfbae8", size = 220261, upload-time = "2025-11-18T13:33:06.493Z" }, + { url = "https://files.pythonhosted.org/packages/91/77/dd4aff9af16ff776bf355a24d87eeb48fc6acde54c907cc1ea89b14a8804/coverage-7.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:ce61969812d6a98a981d147d9ac583a36ac7db7766f2e64a9d4d059c2fe29d07", size = 221072, upload-time = "2025-11-18T13:33:07.926Z" }, + { url = "https://files.pythonhosted.org/packages/70/49/5c9dc46205fef31b1b226a6e16513193715290584317fd4df91cdaf28b22/coverage-7.12.0-cp313-cp313-win_arm64.whl", hash = "sha256:bcec6f47e4cb8a4c2dc91ce507f6eefc6a1b10f58df32cdc61dff65455031dfc", size = 219702, upload-time = "2025-11-18T13:33:09.631Z" }, + { url = "https://files.pythonhosted.org/packages/9b/62/f87922641c7198667994dd472a91e1d9b829c95d6c29529ceb52132436ad/coverage-7.12.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:459443346509476170d553035e4a3eed7b860f4fe5242f02de1010501956ce87", size = 218420, upload-time = "2025-11-18T13:33:11.153Z" }, + { url = "https://files.pythonhosted.org/packages/85/dd/1cc13b2395ef15dbb27d7370a2509b4aee77890a464fb35d72d428f84871/coverage-7.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:04a79245ab2b7a61688958f7a855275997134bc84f4a03bc240cf64ff132abf6", size = 218773, upload-time = "2025-11-18T13:33:12.569Z" }, + { url = "https://files.pythonhosted.org/packages/74/40/35773cc4bb1e9d4658d4fb669eb4195b3151bef3bbd6f866aba5cd5dac82/coverage-7.12.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:09a86acaaa8455f13d6a99221d9654df249b33937b4e212b4e5a822065f12aa7", size = 260078, upload-time = "2025-11-18T13:33:14.037Z" }, + { url = 
"https://files.pythonhosted.org/packages/ec/ee/231bb1a6ffc2905e396557585ebc6bdc559e7c66708376d245a1f1d330fc/coverage-7.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:907e0df1b71ba77463687a74149c6122c3f6aac56c2510a5d906b2f368208560", size = 262144, upload-time = "2025-11-18T13:33:15.601Z" }, + { url = "https://files.pythonhosted.org/packages/28/be/32f4aa9f3bf0b56f3971001b56508352c7753915345d45fab4296a986f01/coverage-7.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9b57e2d0ddd5f0582bae5437c04ee71c46cd908e7bc5d4d0391f9a41e812dd12", size = 264574, upload-time = "2025-11-18T13:33:17.354Z" }, + { url = "https://files.pythonhosted.org/packages/68/7c/00489fcbc2245d13ab12189b977e0cf06ff3351cb98bc6beba8bd68c5902/coverage-7.12.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:58c1c6aa677f3a1411fe6fb28ec3a942e4f665df036a3608816e0847fad23296", size = 259298, upload-time = "2025-11-18T13:33:18.958Z" }, + { url = "https://files.pythonhosted.org/packages/96/b4/f0760d65d56c3bea95b449e02570d4abd2549dc784bf39a2d4721a2d8ceb/coverage-7.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4c589361263ab2953e3c4cd2a94db94c4ad4a8e572776ecfbad2389c626e4507", size = 262150, upload-time = "2025-11-18T13:33:20.644Z" }, + { url = "https://files.pythonhosted.org/packages/c5/71/9a9314df00f9326d78c1e5a910f520d599205907432d90d1c1b7a97aa4b1/coverage-7.12.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:91b810a163ccad2e43b1faa11d70d3cf4b6f3d83f9fd5f2df82a32d47b648e0d", size = 259763, upload-time = "2025-11-18T13:33:22.189Z" }, + { url = "https://files.pythonhosted.org/packages/10/34/01a0aceed13fbdf925876b9a15d50862eb8845454301fe3cdd1df08b2182/coverage-7.12.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:40c867af715f22592e0d0fb533a33a71ec9e0f73a6945f722a0c85c8c1cbe3a2", size = 258653, upload-time = "2025-11-18T13:33:24.239Z" }, + { url 
= "https://files.pythonhosted.org/packages/8d/04/81d8fd64928acf1574bbb0181f66901c6c1c6279c8ccf5f84259d2c68ae9/coverage-7.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:68b0d0a2d84f333de875666259dadf28cc67858bc8fd8b3f1eae84d3c2bec455", size = 260856, upload-time = "2025-11-18T13:33:26.365Z" }, + { url = "https://files.pythonhosted.org/packages/f2/76/fa2a37bfaeaf1f766a2d2360a25a5297d4fb567098112f6517475eee120b/coverage-7.12.0-cp313-cp313t-win32.whl", hash = "sha256:73f9e7fbd51a221818fd11b7090eaa835a353ddd59c236c57b2199486b116c6d", size = 220936, upload-time = "2025-11-18T13:33:28.165Z" }, + { url = "https://files.pythonhosted.org/packages/f9/52/60f64d932d555102611c366afb0eb434b34266b1d9266fc2fe18ab641c47/coverage-7.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:24cff9d1f5743f67db7ba46ff284018a6e9aeb649b67aa1e70c396aa1b7cb23c", size = 222001, upload-time = "2025-11-18T13:33:29.656Z" }, + { url = "https://files.pythonhosted.org/packages/77/df/c303164154a5a3aea7472bf323b7c857fed93b26618ed9fc5c2955566bb0/coverage-7.12.0-cp313-cp313t-win_arm64.whl", hash = "sha256:c87395744f5c77c866d0f5a43d97cc39e17c7f1cb0115e54a2fe67ca75c5d14d", size = 220273, upload-time = "2025-11-18T13:33:31.415Z" }, + { url = "https://files.pythonhosted.org/packages/bf/2e/fc12db0883478d6e12bbd62d481210f0c8daf036102aa11434a0c5755825/coverage-7.12.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:a1c59b7dc169809a88b21a936eccf71c3895a78f5592051b1af8f4d59c2b4f92", size = 217777, upload-time = "2025-11-18T13:33:32.86Z" }, + { url = "https://files.pythonhosted.org/packages/1f/c1/ce3e525d223350c6ec16b9be8a057623f54226ef7f4c2fee361ebb6a02b8/coverage-7.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:8787b0f982e020adb732b9f051f3e49dd5054cebbc3f3432061278512a2b1360", size = 218100, upload-time = "2025-11-18T13:33:34.532Z" }, + { url = 
"https://files.pythonhosted.org/packages/15/87/113757441504aee3808cb422990ed7c8bcc2d53a6779c66c5adef0942939/coverage-7.12.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5ea5a9f7dc8877455b13dd1effd3202e0bca72f6f3ab09f9036b1bcf728f69ac", size = 249151, upload-time = "2025-11-18T13:33:36.135Z" }, + { url = "https://files.pythonhosted.org/packages/d9/1d/9529d9bd44049b6b05bb319c03a3a7e4b0a8a802d28fa348ad407e10706d/coverage-7.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fdba9f15849534594f60b47c9a30bc70409b54947319a7c4fd0e8e3d8d2f355d", size = 251667, upload-time = "2025-11-18T13:33:37.996Z" }, + { url = "https://files.pythonhosted.org/packages/11/bb/567e751c41e9c03dc29d3ce74b8c89a1e3396313e34f255a2a2e8b9ebb56/coverage-7.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a00594770eb715854fb1c57e0dea08cce6720cfbc531accdb9850d7c7770396c", size = 253003, upload-time = "2025-11-18T13:33:39.553Z" }, + { url = "https://files.pythonhosted.org/packages/e4/b3/c2cce2d8526a02fb9e9ca14a263ca6fc074449b33a6afa4892838c903528/coverage-7.12.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:5560c7e0d82b42eb1951e4f68f071f8017c824ebfd5a6ebe42c60ac16c6c2434", size = 249185, upload-time = "2025-11-18T13:33:42.086Z" }, + { url = "https://files.pythonhosted.org/packages/0e/a7/967f93bb66e82c9113c66a8d0b65ecf72fc865adfba5a145f50c7af7e58d/coverage-7.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2e26b481c9159c2773a37947a9718cfdc58893029cdfb177531793e375cfc", size = 251025, upload-time = "2025-11-18T13:33:43.634Z" }, + { url = "https://files.pythonhosted.org/packages/b9/b2/f2f6f56337bc1af465d5b2dc1ee7ee2141b8b9272f3bf6213fcbc309a836/coverage-7.12.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:6e1a8c066dabcde56d5d9fed6a66bc19a2883a3fe051f0c397a41fc42aedd4cc", size = 248979, upload-time = 
"2025-11-18T13:33:46.04Z" }, + { url = "https://files.pythonhosted.org/packages/f4/7a/bf4209f45a4aec09d10a01a57313a46c0e0e8f4c55ff2965467d41a92036/coverage-7.12.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:f7ba9da4726e446d8dd8aae5a6cd872511184a5d861de80a86ef970b5dacce3e", size = 248800, upload-time = "2025-11-18T13:33:47.546Z" }, + { url = "https://files.pythonhosted.org/packages/b8/b7/1e01b8696fb0521810f60c5bbebf699100d6754183e6cc0679bf2ed76531/coverage-7.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e0f483ab4f749039894abaf80c2f9e7ed77bbf3c737517fb88c8e8e305896a17", size = 250460, upload-time = "2025-11-18T13:33:49.537Z" }, + { url = "https://files.pythonhosted.org/packages/71/ae/84324fb9cb46c024760e706353d9b771a81b398d117d8c1fe010391c186f/coverage-7.12.0-cp314-cp314-win32.whl", hash = "sha256:76336c19a9ef4a94b2f8dc79f8ac2da3f193f625bb5d6f51a328cd19bfc19933", size = 220533, upload-time = "2025-11-18T13:33:51.16Z" }, + { url = "https://files.pythonhosted.org/packages/e2/71/1033629deb8460a8f97f83e6ac4ca3b93952e2b6f826056684df8275e015/coverage-7.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7c1059b600aec6ef090721f8f633f60ed70afaffe8ecab85b59df748f24b31fe", size = 221348, upload-time = "2025-11-18T13:33:52.776Z" }, + { url = "https://files.pythonhosted.org/packages/0a/5f/ac8107a902f623b0c251abdb749be282dc2ab61854a8a4fcf49e276fce2f/coverage-7.12.0-cp314-cp314-win_arm64.whl", hash = "sha256:172cf3a34bfef42611963e2b661302a8931f44df31629e5b1050567d6b90287d", size = 219922, upload-time = "2025-11-18T13:33:54.316Z" }, + { url = "https://files.pythonhosted.org/packages/79/6e/f27af2d4da367f16077d21ef6fe796c874408219fa6dd3f3efe7751bd910/coverage-7.12.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:aa7d48520a32cb21c7a9b31f81799e8eaec7239db36c3b670be0fa2403828d1d", size = 218511, upload-time = "2025-11-18T13:33:56.343Z" }, + { url = 
"https://files.pythonhosted.org/packages/67/dd/65fd874aa460c30da78f9d259400d8e6a4ef457d61ab052fd248f0050558/coverage-7.12.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:90d58ac63bc85e0fb919f14d09d6caa63f35a5512a2205284b7816cafd21bb03", size = 218771, upload-time = "2025-11-18T13:33:57.966Z" }, + { url = "https://files.pythonhosted.org/packages/55/e0/7c6b71d327d8068cb79c05f8f45bf1b6145f7a0de23bbebe63578fe5240a/coverage-7.12.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:ca8ecfa283764fdda3eae1bdb6afe58bf78c2c3ec2b2edcb05a671f0bba7b3f9", size = 260151, upload-time = "2025-11-18T13:33:59.597Z" }, + { url = "https://files.pythonhosted.org/packages/49/ce/4697457d58285b7200de6b46d606ea71066c6e674571a946a6ea908fb588/coverage-7.12.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:874fe69a0785d96bd066059cd4368022cebbec1a8958f224f0016979183916e6", size = 262257, upload-time = "2025-11-18T13:34:01.166Z" }, + { url = "https://files.pythonhosted.org/packages/2f/33/acbc6e447aee4ceba88c15528dbe04a35fb4d67b59d393d2e0d6f1e242c1/coverage-7.12.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5b3c889c0b8b283a24d721a9eabc8ccafcfc3aebf167e4cd0d0e23bf8ec4e339", size = 264671, upload-time = "2025-11-18T13:34:02.795Z" }, + { url = "https://files.pythonhosted.org/packages/87/ec/e2822a795c1ed44d569980097be839c5e734d4c0c1119ef8e0a073496a30/coverage-7.12.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8bb5b894b3ec09dcd6d3743229dc7f2c42ef7787dc40596ae04c0edda487371e", size = 259231, upload-time = "2025-11-18T13:34:04.397Z" }, + { url = "https://files.pythonhosted.org/packages/72/c5/a7ec5395bb4a49c9b7ad97e63f0c92f6bf4a9e006b1393555a02dae75f16/coverage-7.12.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:79a44421cd5fba96aa57b5e3b5a4d3274c449d4c622e8f76882d76635501fd13", size = 262137, upload-time = 
"2025-11-18T13:34:06.068Z" }, + { url = "https://files.pythonhosted.org/packages/67/0c/02c08858b764129f4ecb8e316684272972e60777ae986f3865b10940bdd6/coverage-7.12.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:33baadc0efd5c7294f436a632566ccc1f72c867f82833eb59820ee37dc811c6f", size = 259745, upload-time = "2025-11-18T13:34:08.04Z" }, + { url = "https://files.pythonhosted.org/packages/5a/04/4fd32b7084505f3829a8fe45c1a74a7a728cb251aaadbe3bec04abcef06d/coverage-7.12.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:c406a71f544800ef7e9e0000af706b88465f3573ae8b8de37e5f96c59f689ad1", size = 258570, upload-time = "2025-11-18T13:34:09.676Z" }, + { url = "https://files.pythonhosted.org/packages/48/35/2365e37c90df4f5342c4fa202223744119fe31264ee2924f09f074ea9b6d/coverage-7.12.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e71bba6a40883b00c6d571599b4627f50c360b3d0d02bfc658168936be74027b", size = 260899, upload-time = "2025-11-18T13:34:11.259Z" }, + { url = "https://files.pythonhosted.org/packages/05/56/26ab0464ca733fa325e8e71455c58c1c374ce30f7c04cebb88eabb037b18/coverage-7.12.0-cp314-cp314t-win32.whl", hash = "sha256:9157a5e233c40ce6613dead4c131a006adfda70e557b6856b97aceed01b0e27a", size = 221313, upload-time = "2025-11-18T13:34:12.863Z" }, + { url = "https://files.pythonhosted.org/packages/da/1c/017a3e1113ed34d998b27d2c6dba08a9e7cb97d362f0ec988fcd873dcf81/coverage-7.12.0-cp314-cp314t-win_amd64.whl", hash = "sha256:e84da3a0fd233aeec797b981c51af1cabac74f9bd67be42458365b30d11b5291", size = 222423, upload-time = "2025-11-18T13:34:15.14Z" }, + { url = "https://files.pythonhosted.org/packages/4c/36/bcc504fdd5169301b52568802bb1b9cdde2e27a01d39fbb3b4b508ab7c2c/coverage-7.12.0-cp314-cp314t-win_arm64.whl", hash = "sha256:01d24af36fedda51c2b1aca56e4330a3710f83b02a5ff3743a6b015ffa7c9384", size = 220459, upload-time = "2025-11-18T13:34:17.222Z" }, + { url = 
"https://files.pythonhosted.org/packages/ce/a3/43b749004e3c09452e39bb56347a008f0a0668aad37324a99b5c8ca91d9e/coverage-7.12.0-py3-none-any.whl", hash = "sha256:159d50c0b12e060b15ed3d39f87ed43d4f7f7ad40b8a534f4dd331adbb51104a", size = 209503, upload-time = "2025-11-18T13:34:18.892Z" }, +] + +[[package]] +name = "cryptography" +version = "46.0.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9f/33/c00162f49c0e2fe8064a62cb92b93e50c74a72bc370ab92f86112b33ff62/cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1", size = 749258, upload-time = "2025-10-15T23:18:31.74Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1d/42/9c391dd801d6cf0d561b5890549d4b27bafcc53b39c31a817e69d87c625b/cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a", size = 7225004, upload-time = "2025-10-15T23:16:52.239Z" }, + { url = "https://files.pythonhosted.org/packages/1c/67/38769ca6b65f07461eb200e85fc1639b438bdc667be02cf7f2cd6a64601c/cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc", size = 4296667, upload-time = "2025-10-15T23:16:54.369Z" }, + { url = "https://files.pythonhosted.org/packages/5c/49/498c86566a1d80e978b42f0d702795f69887005548c041636df6ae1ca64c/cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d", size = 4450807, upload-time = "2025-10-15T23:16:56.414Z" }, + { url = "https://files.pythonhosted.org/packages/4b/0a/863a3604112174c8624a2ac3c038662d9e59970c7f926acdcfaed8d61142/cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = 
"sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb", size = 4299615, upload-time = "2025-10-15T23:16:58.442Z" }, + { url = "https://files.pythonhosted.org/packages/64/02/b73a533f6b64a69f3cd3872acb6ebc12aef924d8d103133bb3ea750dc703/cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849", size = 4016800, upload-time = "2025-10-15T23:17:00.378Z" }, + { url = "https://files.pythonhosted.org/packages/25/d5/16e41afbfa450cde85a3b7ec599bebefaef16b5c6ba4ec49a3532336ed72/cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8", size = 4984707, upload-time = "2025-10-15T23:17:01.98Z" }, + { url = "https://files.pythonhosted.org/packages/c9/56/e7e69b427c3878352c2fb9b450bd0e19ed552753491d39d7d0a2f5226d41/cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec", size = 4482541, upload-time = "2025-10-15T23:17:04.078Z" }, + { url = "https://files.pythonhosted.org/packages/78/f6/50736d40d97e8483172f1bb6e698895b92a223dba513b0ca6f06b2365339/cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91", size = 4299464, upload-time = "2025-10-15T23:17:05.483Z" }, + { url = "https://files.pythonhosted.org/packages/00/de/d8e26b1a855f19d9994a19c702fa2e93b0456beccbcfe437eda00e0701f2/cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e", size = 4950838, upload-time = "2025-10-15T23:17:07.425Z" }, + { url = "https://files.pythonhosted.org/packages/8f/29/798fc4ec461a1c9e9f735f2fc58741b0daae30688f41b2497dcbc9ed1355/cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = 
"sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926", size = 4481596, upload-time = "2025-10-15T23:17:09.343Z" }, + { url = "https://files.pythonhosted.org/packages/15/8d/03cd48b20a573adfff7652b76271078e3045b9f49387920e7f1f631d125e/cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71", size = 4426782, upload-time = "2025-10-15T23:17:11.22Z" }, + { url = "https://files.pythonhosted.org/packages/fa/b1/ebacbfe53317d55cf33165bda24c86523497a6881f339f9aae5c2e13e57b/cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac", size = 4698381, upload-time = "2025-10-15T23:17:12.829Z" }, + { url = "https://files.pythonhosted.org/packages/96/92/8a6a9525893325fc057a01f654d7efc2c64b9de90413adcf605a85744ff4/cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018", size = 3055988, upload-time = "2025-10-15T23:17:14.65Z" }, + { url = "https://files.pythonhosted.org/packages/7e/bf/80fbf45253ea585a1e492a6a17efcb93467701fa79e71550a430c5e60df0/cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb", size = 3514451, upload-time = "2025-10-15T23:17:16.142Z" }, + { url = "https://files.pythonhosted.org/packages/2e/af/9b302da4c87b0beb9db4e756386a7c6c5b8003cd0e742277888d352ae91d/cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c", size = 2928007, upload-time = "2025-10-15T23:17:18.04Z" }, + { url = "https://files.pythonhosted.org/packages/f5/e2/a510aa736755bffa9d2f75029c229111a1d02f8ecd5de03078f4c18d91a3/cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217", size = 7158012, upload-time = 
"2025-10-15T23:17:19.982Z" }, + { url = "https://files.pythonhosted.org/packages/73/dc/9aa866fbdbb95b02e7f9d086f1fccfeebf8953509b87e3f28fff927ff8a0/cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5", size = 4288728, upload-time = "2025-10-15T23:17:21.527Z" }, + { url = "https://files.pythonhosted.org/packages/c5/fd/bc1daf8230eaa075184cbbf5f8cd00ba9db4fd32d63fb83da4671b72ed8a/cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715", size = 4435078, upload-time = "2025-10-15T23:17:23.042Z" }, + { url = "https://files.pythonhosted.org/packages/82/98/d3bd5407ce4c60017f8ff9e63ffee4200ab3e23fe05b765cab805a7db008/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54", size = 4293460, upload-time = "2025-10-15T23:17:24.885Z" }, + { url = "https://files.pythonhosted.org/packages/26/e9/e23e7900983c2b8af7a08098db406cf989d7f09caea7897e347598d4cd5b/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459", size = 3995237, upload-time = "2025-10-15T23:17:26.449Z" }, + { url = "https://files.pythonhosted.org/packages/91/15/af68c509d4a138cfe299d0d7ddb14afba15233223ebd933b4bbdbc7155d3/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422", size = 4967344, upload-time = "2025-10-15T23:17:28.06Z" }, + { url = "https://files.pythonhosted.org/packages/ca/e3/8643d077c53868b681af077edf6b3cb58288b5423610f21c62aadcbe99f4/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7", size = 4466564, upload-time = 
"2025-10-15T23:17:29.665Z" }, + { url = "https://files.pythonhosted.org/packages/0e/43/c1e8726fa59c236ff477ff2b5dc071e54b21e5a1e51aa2cee1676f1c986f/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044", size = 4292415, upload-time = "2025-10-15T23:17:31.686Z" }, + { url = "https://files.pythonhosted.org/packages/42/f9/2f8fefdb1aee8a8e3256a0568cffc4e6d517b256a2fe97a029b3f1b9fe7e/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665", size = 4931457, upload-time = "2025-10-15T23:17:33.478Z" }, + { url = "https://files.pythonhosted.org/packages/79/30/9b54127a9a778ccd6d27c3da7563e9f2d341826075ceab89ae3b41bf5be2/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3", size = 4466074, upload-time = "2025-10-15T23:17:35.158Z" }, + { url = "https://files.pythonhosted.org/packages/ac/68/b4f4a10928e26c941b1b6a179143af9f4d27d88fe84a6a3c53592d2e76bf/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20", size = 4420569, upload-time = "2025-10-15T23:17:37.188Z" }, + { url = "https://files.pythonhosted.org/packages/a3/49/3746dab4c0d1979888f125226357d3262a6dd40e114ac29e3d2abdf1ec55/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de", size = 4681941, upload-time = "2025-10-15T23:17:39.236Z" }, + { url = "https://files.pythonhosted.org/packages/fd/30/27654c1dbaf7e4a3531fa1fc77986d04aefa4d6d78259a62c9dc13d7ad36/cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914", size = 3022339, upload-time = "2025-10-15T23:17:40.888Z" }, + { url = 
"https://files.pythonhosted.org/packages/f6/30/640f34ccd4d2a1bc88367b54b926b781b5a018d65f404d409aba76a84b1c/cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db", size = 3494315, upload-time = "2025-10-15T23:17:42.769Z" }, + { url = "https://files.pythonhosted.org/packages/ba/8b/88cc7e3bd0a8e7b861f26981f7b820e1f46aa9d26cc482d0feba0ecb4919/cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21", size = 2919331, upload-time = "2025-10-15T23:17:44.468Z" }, + { url = "https://files.pythonhosted.org/packages/fd/23/45fe7f376a7df8daf6da3556603b36f53475a99ce4faacb6ba2cf3d82021/cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936", size = 7218248, upload-time = "2025-10-15T23:17:46.294Z" }, + { url = "https://files.pythonhosted.org/packages/27/32/b68d27471372737054cbd34c84981f9edbc24fe67ca225d389799614e27f/cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683", size = 4294089, upload-time = "2025-10-15T23:17:48.269Z" }, + { url = "https://files.pythonhosted.org/packages/26/42/fa8389d4478368743e24e61eea78846a0006caffaf72ea24a15159215a14/cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d", size = 4440029, upload-time = "2025-10-15T23:17:49.837Z" }, + { url = "https://files.pythonhosted.org/packages/5f/eb/f483db0ec5ac040824f269e93dd2bd8a21ecd1027e77ad7bdf6914f2fd80/cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0", size = 4297222, upload-time = "2025-10-15T23:17:51.357Z" }, + { url = 
"https://files.pythonhosted.org/packages/fd/cf/da9502c4e1912cb1da3807ea3618a6829bee8207456fbbeebc361ec38ba3/cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc", size = 4012280, upload-time = "2025-10-15T23:17:52.964Z" }, + { url = "https://files.pythonhosted.org/packages/6b/8f/9adb86b93330e0df8b3dcf03eae67c33ba89958fc2e03862ef1ac2b42465/cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3", size = 4978958, upload-time = "2025-10-15T23:17:54.965Z" }, + { url = "https://files.pythonhosted.org/packages/d1/a0/5fa77988289c34bdb9f913f5606ecc9ada1adb5ae870bd0d1054a7021cc4/cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971", size = 4473714, upload-time = "2025-10-15T23:17:56.754Z" }, + { url = "https://files.pythonhosted.org/packages/14/e5/fc82d72a58d41c393697aa18c9abe5ae1214ff6f2a5c18ac470f92777895/cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac", size = 4296970, upload-time = "2025-10-15T23:17:58.588Z" }, + { url = "https://files.pythonhosted.org/packages/78/06/5663ed35438d0b09056973994f1aec467492b33bd31da36e468b01ec1097/cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04", size = 4940236, upload-time = "2025-10-15T23:18:00.897Z" }, + { url = "https://files.pythonhosted.org/packages/fc/59/873633f3f2dcd8a053b8dd1d38f783043b5fce589c0f6988bf55ef57e43e/cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506", size = 4472642, upload-time = "2025-10-15T23:18:02.749Z" }, + { url = 
"https://files.pythonhosted.org/packages/3d/39/8e71f3930e40f6877737d6f69248cf74d4e34b886a3967d32f919cc50d3b/cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963", size = 4423126, upload-time = "2025-10-15T23:18:04.85Z" }, + { url = "https://files.pythonhosted.org/packages/cd/c7/f65027c2810e14c3e7268353b1681932b87e5a48e65505d8cc17c99e36ae/cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4", size = 4686573, upload-time = "2025-10-15T23:18:06.908Z" }, + { url = "https://files.pythonhosted.org/packages/0a/6e/1c8331ddf91ca4730ab3086a0f1be19c65510a33b5a441cb334e7a2d2560/cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df", size = 3036695, upload-time = "2025-10-15T23:18:08.672Z" }, + { url = "https://files.pythonhosted.org/packages/90/45/b0d691df20633eff80955a0fc7695ff9051ffce8b69741444bd9ed7bd0db/cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f", size = 3501720, upload-time = "2025-10-15T23:18:10.632Z" }, + { url = "https://files.pythonhosted.org/packages/e8/cb/2da4cc83f5edb9c3257d09e1e7ab7b23f049c7962cae8d842bbef0a9cec9/cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372", size = 2918740, upload-time = "2025-10-15T23:18:12.277Z" }, +] + +[[package]] +name = "diskcache" +version = "5.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/3f/21/1c1ffc1a039ddcc459db43cc108658f32c57d271d7289a2794e401d0fdb6/diskcache-5.6.3.tar.gz", hash = "sha256:2c3a3fa2743d8535d832ec61c2054a1641f41775aa7c556758a109941e33e4fc", size = 67916, upload-time = "2023-08-31T06:12:00.316Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl", hash = "sha256:5e31b2d5fbad117cc363ebaf6b689474db18a1f6438bc82358b024abd4c2ca19", size = 45550, upload-time = "2023-08-31T06:11:58.822Z" }, +] + +[[package]] +name = "distlib" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" }, +] + +[[package]] +name = "distro" +version = "1.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fc/f8/98eea607f65de6527f8a2e8885fc8015d3e6f5775df186e443e0964a11c3/distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed", size = 60722, upload-time = "2023-12-24T09:54:32.31Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, +] + +[[package]] +name = "docstring-parser" +version = "0.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = 
"sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" }, +] + +[[package]] +name = "ecdsa" +version = "0.19.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c0/1f/924e3caae75f471eae4b26bd13b698f6af2c44279f67af317439c2f4c46a/ecdsa-0.19.1.tar.gz", hash = "sha256:478cba7b62555866fcb3bb3fe985e06decbdb68ef55713c4e5ab98c57d508e61", size = 201793, upload-time = "2025-03-13T11:52:43.25Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/a3/460c57f094a4a165c84a1341c373b0a4f5ec6ac244b998d5021aade89b77/ecdsa-0.19.1-py2.py3-none-any.whl", hash = "sha256:30638e27cf77b7e15c4c4cc1973720149e1033827cfd00661ca5c8cc0cdb24c3", size = 150607, upload-time = "2025-03-13T11:52:41.757Z" }, +] + +[[package]] +name = "fakeredis" +version = "2.32.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "redis" }, + { name = "sortedcontainers" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/56/14/b47b8471303af7deed7080290c14cff27a831fa47b38f45643e6bf889cee/fakeredis-2.32.1.tar.gz", hash = "sha256:dd8246db159f0b66a1ced7800c9d5ef07769e3d2fde44b389a57f2ce2834e444", size = 171582, upload-time = "2025-11-06T01:40:57.836Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c2/d2/c28f6909864bfdb7411bb8f39fabedb5a50da1cbd7da5a1a3a46dfea2eab/fakeredis-2.32.1-py3-none-any.whl", hash = "sha256:e80c8886db2e47ba784f7dfe66aad6cd2eab76093c6bfda50041e5bc890d46cf", size = 118964, upload-time = "2025-11-06T01:40:55.885Z" }, +] + +[[package]] +name = 
"fastapi" +version = "0.123.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-doc" }, + { name = "pydantic" }, + { name = "starlette" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/2c/01/c3fb48c0135d89586a03c3e2c5bc04540dda52079a1af5cac4a63598efb9/fastapi-0.123.9.tar.gz", hash = "sha256:ab33d672d8e1cc6e0b49777eb73c32ccf20761011f5ca16755889ab406fd1de0", size = 355616, upload-time = "2025-12-04T22:24:47.598Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/db/15/a785e992a27620e022d0bc61b6c897ec14cff07c5ab7ff9f27651a21570b/fastapi-0.123.9-py3-none-any.whl", hash = "sha256:f54c69f23db14bd3dbcdfaf3fdce0483ca5f499512380c8e379a70cda30aa920", size = 111776, upload-time = "2025-12-04T22:24:46.042Z" }, +] + +[[package]] +name = "filelock" +version = "3.20.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/58/46/0028a82567109b5ef6e4d2a1f04a583fb513e6cf9527fcdd09afd817deeb/filelock-3.20.0.tar.gz", hash = "sha256:711e943b4ec6be42e1d4e6690b48dc175c822967466bb31c0c293f34334c13f4", size = 18922, upload-time = "2025-10-08T18:03:50.056Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/91/7216b27286936c16f5b4d0c530087e4a54eead683e6b0b73dd0c64844af6/filelock-3.20.0-py3-none-any.whl", hash = "sha256:339b4732ffda5cd79b13f4e2711a31b0365ce445d95d243bb996273d072546a2", size = 16054, upload-time = "2025-10-08T18:03:48.35Z" }, +] + +[[package]] +name = "frozenlist" +version = "1.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/2d/f5/c831fac6cc817d26fd54c7eaccd04ef7e0288806943f7cc5bbf69f3ac1f0/frozenlist-1.8.0.tar.gz", hash = "sha256:3ede829ed8d842f6cd48fc7081d7a41001a56f1f38603f9d49bf3020d59a31ad", size = 45875, upload-time = "2025-10-06T05:38:17.865Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/2d/40/0832c31a37d60f60ed79e9dfb5a92e1e2af4f40a16a29abcc7992af9edff/frozenlist-1.8.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8d92f1a84bb12d9e56f818b3a746f3efba93c1b63c8387a73dde655e1e42282a", size = 85717, upload-time = "2025-10-06T05:36:27.341Z" }, + { url = "https://files.pythonhosted.org/packages/30/ba/b0b3de23f40bc55a7057bd38434e25c34fa48e17f20ee273bbde5e0650f3/frozenlist-1.8.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:96153e77a591c8adc2ee805756c61f59fef4cf4073a9275ee86fe8cba41241f7", size = 49651, upload-time = "2025-10-06T05:36:28.855Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ab/6e5080ee374f875296c4243c381bbdef97a9ac39c6e3ce1d5f7d42cb78d6/frozenlist-1.8.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f21f00a91358803399890ab167098c131ec2ddd5f8f5fd5fe9c9f2c6fcd91e40", size = 49417, upload-time = "2025-10-06T05:36:29.877Z" }, + { url = "https://files.pythonhosted.org/packages/d5/4e/e4691508f9477ce67da2015d8c00acd751e6287739123113a9fca6f1604e/frozenlist-1.8.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fb30f9626572a76dfe4293c7194a09fb1fe93ba94c7d4f720dfae3b646b45027", size = 234391, upload-time = "2025-10-06T05:36:31.301Z" }, + { url = "https://files.pythonhosted.org/packages/40/76/c202df58e3acdf12969a7895fd6f3bc016c642e6726aa63bd3025e0fc71c/frozenlist-1.8.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaa352d7047a31d87dafcacbabe89df0aa506abb5b1b85a2fb91bc3faa02d822", size = 233048, upload-time = "2025-10-06T05:36:32.531Z" }, + { url = "https://files.pythonhosted.org/packages/f9/c0/8746afb90f17b73ca5979c7a3958116e105ff796e718575175319b5bb4ce/frozenlist-1.8.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:03ae967b4e297f58f8c774c7eabcce57fe3c2434817d4385c50661845a058121", size = 226549, upload-time = 
"2025-10-06T05:36:33.706Z" }, + { url = "https://files.pythonhosted.org/packages/7e/eb/4c7eefc718ff72f9b6c4893291abaae5fbc0c82226a32dcd8ef4f7a5dbef/frozenlist-1.8.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f6292f1de555ffcc675941d65fffffb0a5bcd992905015f85d0592201793e0e5", size = 239833, upload-time = "2025-10-06T05:36:34.947Z" }, + { url = "https://files.pythonhosted.org/packages/c2/4e/e5c02187cf704224f8b21bee886f3d713ca379535f16893233b9d672ea71/frozenlist-1.8.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29548f9b5b5e3460ce7378144c3010363d8035cea44bc0bf02d57f5a685e084e", size = 245363, upload-time = "2025-10-06T05:36:36.534Z" }, + { url = "https://files.pythonhosted.org/packages/1f/96/cb85ec608464472e82ad37a17f844889c36100eed57bea094518bf270692/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ec3cc8c5d4084591b4237c0a272cc4f50a5b03396a47d9caaf76f5d7b38a4f11", size = 229314, upload-time = "2025-10-06T05:36:38.582Z" }, + { url = "https://files.pythonhosted.org/packages/5d/6f/4ae69c550e4cee66b57887daeebe006fe985917c01d0fff9caab9883f6d0/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:517279f58009d0b1f2e7c1b130b377a349405da3f7621ed6bfae50b10adf20c1", size = 243365, upload-time = "2025-10-06T05:36:40.152Z" }, + { url = "https://files.pythonhosted.org/packages/7a/58/afd56de246cf11780a40a2c28dc7cbabbf06337cc8ddb1c780a2d97e88d8/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:db1e72ede2d0d7ccb213f218df6a078a9c09a7de257c2fe8fcef16d5925230b1", size = 237763, upload-time = "2025-10-06T05:36:41.355Z" }, + { url = "https://files.pythonhosted.org/packages/cb/36/cdfaf6ed42e2644740d4a10452d8e97fa1c062e2a8006e4b09f1b5fd7d63/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b4dec9482a65c54a5044486847b8a66bf10c9cb4926d42927ec4e8fd5db7fed8", size = 240110, upload-time = 
"2025-10-06T05:36:42.716Z" }, + { url = "https://files.pythonhosted.org/packages/03/a8/9ea226fbefad669f11b52e864c55f0bd57d3c8d7eb07e9f2e9a0b39502e1/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:21900c48ae04d13d416f0e1e0c4d81f7931f73a9dfa0b7a8746fb2fe7dd970ed", size = 233717, upload-time = "2025-10-06T05:36:44.251Z" }, + { url = "https://files.pythonhosted.org/packages/1e/0b/1b5531611e83ba7d13ccc9988967ea1b51186af64c42b7a7af465dcc9568/frozenlist-1.8.0-cp313-cp313-win32.whl", hash = "sha256:8b7b94a067d1c504ee0b16def57ad5738701e4ba10cec90529f13fa03c833496", size = 39628, upload-time = "2025-10-06T05:36:45.423Z" }, + { url = "https://files.pythonhosted.org/packages/d8/cf/174c91dbc9cc49bc7b7aab74d8b734e974d1faa8f191c74af9b7e80848e6/frozenlist-1.8.0-cp313-cp313-win_amd64.whl", hash = "sha256:878be833caa6a3821caf85eb39c5ba92d28e85df26d57afb06b35b2efd937231", size = 43882, upload-time = "2025-10-06T05:36:46.796Z" }, + { url = "https://files.pythonhosted.org/packages/c1/17/502cd212cbfa96eb1388614fe39a3fc9ab87dbbe042b66f97acb57474834/frozenlist-1.8.0-cp313-cp313-win_arm64.whl", hash = "sha256:44389d135b3ff43ba8cc89ff7f51f5a0bb6b63d829c8300f79a2fe4fe61bcc62", size = 39676, upload-time = "2025-10-06T05:36:47.8Z" }, + { url = "https://files.pythonhosted.org/packages/d2/5c/3bbfaa920dfab09e76946a5d2833a7cbdf7b9b4a91c714666ac4855b88b4/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:e25ac20a2ef37e91c1b39938b591457666a0fa835c7783c3a8f33ea42870db94", size = 89235, upload-time = "2025-10-06T05:36:48.78Z" }, + { url = "https://files.pythonhosted.org/packages/d2/d6/f03961ef72166cec1687e84e8925838442b615bd0b8854b54923ce5b7b8a/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07cdca25a91a4386d2e76ad992916a85038a9b97561bf7a3fd12d5d9ce31870c", size = 50742, upload-time = "2025-10-06T05:36:49.837Z" }, + { url = 
"https://files.pythonhosted.org/packages/1e/bb/a6d12b7ba4c3337667d0e421f7181c82dda448ce4e7ad7ecd249a16fa806/frozenlist-1.8.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4e0c11f2cc6717e0a741f84a527c52616140741cd812a50422f83dc31749fb52", size = 51725, upload-time = "2025-10-06T05:36:50.851Z" }, + { url = "https://files.pythonhosted.org/packages/bc/71/d1fed0ffe2c2ccd70b43714c6cab0f4188f09f8a67a7914a6b46ee30f274/frozenlist-1.8.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b3210649ee28062ea6099cfda39e147fa1bc039583c8ee4481cb7811e2448c51", size = 284533, upload-time = "2025-10-06T05:36:51.898Z" }, + { url = "https://files.pythonhosted.org/packages/c9/1f/fb1685a7b009d89f9bf78a42d94461bc06581f6e718c39344754a5d9bada/frozenlist-1.8.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:581ef5194c48035a7de2aefc72ac6539823bb71508189e5de01d60c9dcd5fa65", size = 292506, upload-time = "2025-10-06T05:36:53.101Z" }, + { url = "https://files.pythonhosted.org/packages/e6/3b/b991fe1612703f7e0d05c0cf734c1b77aaf7c7d321df4572e8d36e7048c8/frozenlist-1.8.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3ef2d026f16a2b1866e1d86fc4e1291e1ed8a387b2c333809419a2f8b3a77b82", size = 274161, upload-time = "2025-10-06T05:36:54.309Z" }, + { url = "https://files.pythonhosted.org/packages/ca/ec/c5c618767bcdf66e88945ec0157d7f6c4a1322f1473392319b7a2501ded7/frozenlist-1.8.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5500ef82073f599ac84d888e3a8c1f77ac831183244bfd7f11eaa0289fb30714", size = 294676, upload-time = "2025-10-06T05:36:55.566Z" }, + { url = "https://files.pythonhosted.org/packages/7c/ce/3934758637d8f8a88d11f0585d6495ef54b2044ed6ec84492a91fa3b27aa/frozenlist-1.8.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:50066c3997d0091c411a66e710f4e11752251e6d2d73d70d8d5d4c76442a199d", size = 300638, upload-time = "2025-10-06T05:36:56.758Z" }, + { url = "https://files.pythonhosted.org/packages/fc/4f/a7e4d0d467298f42de4b41cbc7ddaf19d3cfeabaf9ff97c20c6c7ee409f9/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:5c1c8e78426e59b3f8005e9b19f6ff46e5845895adbde20ece9218319eca6506", size = 283067, upload-time = "2025-10-06T05:36:57.965Z" }, + { url = "https://files.pythonhosted.org/packages/dc/48/c7b163063d55a83772b268e6d1affb960771b0e203b632cfe09522d67ea5/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:eefdba20de0d938cec6a89bd4d70f346a03108a19b9df4248d3cf0d88f1b0f51", size = 292101, upload-time = "2025-10-06T05:36:59.237Z" }, + { url = "https://files.pythonhosted.org/packages/9f/d0/2366d3c4ecdc2fd391e0afa6e11500bfba0ea772764d631bbf82f0136c9d/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cf253e0e1c3ceb4aaff6df637ce033ff6535fb8c70a764a8f46aafd3d6ab798e", size = 289901, upload-time = "2025-10-06T05:37:00.811Z" }, + { url = "https://files.pythonhosted.org/packages/b8/94/daff920e82c1b70e3618a2ac39fbc01ae3e2ff6124e80739ce5d71c9b920/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:032efa2674356903cd0261c4317a561a6850f3ac864a63fc1583147fb05a79b0", size = 289395, upload-time = "2025-10-06T05:37:02.115Z" }, + { url = "https://files.pythonhosted.org/packages/e3/20/bba307ab4235a09fdcd3cc5508dbabd17c4634a1af4b96e0f69bfe551ebd/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6da155091429aeba16851ecb10a9104a108bcd32f6c1642867eadaee401c1c41", size = 283659, upload-time = "2025-10-06T05:37:03.711Z" }, + { url = "https://files.pythonhosted.org/packages/fd/00/04ca1c3a7a124b6de4f8a9a17cc2fcad138b4608e7a3fc5877804b8715d7/frozenlist-1.8.0-cp313-cp313t-win32.whl", hash = "sha256:0f96534f8bfebc1a394209427d0f8a63d343c9779cda6fc25e8e121b5fd8555b", size = 43492, upload-time = 
"2025-10-06T05:37:04.915Z" }, + { url = "https://files.pythonhosted.org/packages/59/5e/c69f733a86a94ab10f68e496dc6b7e8bc078ebb415281d5698313e3af3a1/frozenlist-1.8.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5d63a068f978fc69421fb0e6eb91a9603187527c86b7cd3f534a5b77a592b888", size = 48034, upload-time = "2025-10-06T05:37:06.343Z" }, + { url = "https://files.pythonhosted.org/packages/16/6c/be9d79775d8abe79b05fa6d23da99ad6e7763a1d080fbae7290b286093fd/frozenlist-1.8.0-cp313-cp313t-win_arm64.whl", hash = "sha256:bf0a7e10b077bf5fb9380ad3ae8ce20ef919a6ad93b4552896419ac7e1d8e042", size = 41749, upload-time = "2025-10-06T05:37:07.431Z" }, + { url = "https://files.pythonhosted.org/packages/f1/c8/85da824b7e7b9b6e7f7705b2ecaf9591ba6f79c1177f324c2735e41d36a2/frozenlist-1.8.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cee686f1f4cadeb2136007ddedd0aaf928ab95216e7691c63e50a8ec066336d0", size = 86127, upload-time = "2025-10-06T05:37:08.438Z" }, + { url = "https://files.pythonhosted.org/packages/8e/e8/a1185e236ec66c20afd72399522f142c3724c785789255202d27ae992818/frozenlist-1.8.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:119fb2a1bd47307e899c2fac7f28e85b9a543864df47aa7ec9d3c1b4545f096f", size = 49698, upload-time = "2025-10-06T05:37:09.48Z" }, + { url = "https://files.pythonhosted.org/packages/a1/93/72b1736d68f03fda5fdf0f2180fb6caaae3894f1b854d006ac61ecc727ee/frozenlist-1.8.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4970ece02dbc8c3a92fcc5228e36a3e933a01a999f7094ff7c23fbd2beeaa67c", size = 49749, upload-time = "2025-10-06T05:37:10.569Z" }, + { url = "https://files.pythonhosted.org/packages/a7/b2/fabede9fafd976b991e9f1b9c8c873ed86f202889b864756f240ce6dd855/frozenlist-1.8.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:cba69cb73723c3f329622e34bdbf5ce1f80c21c290ff04256cff1cd3c2036ed2", size = 231298, upload-time = "2025-10-06T05:37:11.993Z" }, + { url = 
"https://files.pythonhosted.org/packages/3a/3b/d9b1e0b0eed36e70477ffb8360c49c85c8ca8ef9700a4e6711f39a6e8b45/frozenlist-1.8.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:778a11b15673f6f1df23d9586f83c4846c471a8af693a22e066508b77d201ec8", size = 232015, upload-time = "2025-10-06T05:37:13.194Z" }, + { url = "https://files.pythonhosted.org/packages/dc/94/be719d2766c1138148564a3960fc2c06eb688da592bdc25adcf856101be7/frozenlist-1.8.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0325024fe97f94c41c08872db482cf8ac4800d80e79222c6b0b7b162d5b13686", size = 225038, upload-time = "2025-10-06T05:37:14.577Z" }, + { url = "https://files.pythonhosted.org/packages/e4/09/6712b6c5465f083f52f50cf74167b92d4ea2f50e46a9eea0523d658454ae/frozenlist-1.8.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:97260ff46b207a82a7567b581ab4190bd4dfa09f4db8a8b49d1a958f6aa4940e", size = 240130, upload-time = "2025-10-06T05:37:15.781Z" }, + { url = "https://files.pythonhosted.org/packages/f8/d4/cd065cdcf21550b54f3ce6a22e143ac9e4836ca42a0de1022da8498eac89/frozenlist-1.8.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:54b2077180eb7f83dd52c40b2750d0a9f175e06a42e3213ce047219de902717a", size = 242845, upload-time = "2025-10-06T05:37:17.037Z" }, + { url = "https://files.pythonhosted.org/packages/62/c3/f57a5c8c70cd1ead3d5d5f776f89d33110b1addae0ab010ad774d9a44fb9/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2f05983daecab868a31e1da44462873306d3cbfd76d1f0b5b69c473d21dbb128", size = 229131, upload-time = "2025-10-06T05:37:18.221Z" }, + { url = "https://files.pythonhosted.org/packages/6c/52/232476fe9cb64f0742f3fde2b7d26c1dac18b6d62071c74d4ded55e0ef94/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = 
"sha256:33f48f51a446114bc5d251fb2954ab0164d5be02ad3382abcbfe07e2531d650f", size = 240542, upload-time = "2025-10-06T05:37:19.771Z" }, + { url = "https://files.pythonhosted.org/packages/5f/85/07bf3f5d0fb5414aee5f47d33c6f5c77bfe49aac680bfece33d4fdf6a246/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:154e55ec0655291b5dd1b8731c637ecdb50975a2ae70c606d100750a540082f7", size = 237308, upload-time = "2025-10-06T05:37:20.969Z" }, + { url = "https://files.pythonhosted.org/packages/11/99/ae3a33d5befd41ac0ca2cc7fd3aa707c9c324de2e89db0e0f45db9a64c26/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:4314debad13beb564b708b4a496020e5306c7333fa9a3ab90374169a20ffab30", size = 238210, upload-time = "2025-10-06T05:37:22.252Z" }, + { url = "https://files.pythonhosted.org/packages/b2/60/b1d2da22f4970e7a155f0adde9b1435712ece01b3cd45ba63702aea33938/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:073f8bf8becba60aa931eb3bc420b217bb7d5b8f4750e6f8b3be7f3da85d38b7", size = 231972, upload-time = "2025-10-06T05:37:23.5Z" }, + { url = "https://files.pythonhosted.org/packages/3f/ab/945b2f32de889993b9c9133216c068b7fcf257d8595a0ac420ac8677cab0/frozenlist-1.8.0-cp314-cp314-win32.whl", hash = "sha256:bac9c42ba2ac65ddc115d930c78d24ab8d4f465fd3fc473cdedfccadb9429806", size = 40536, upload-time = "2025-10-06T05:37:25.581Z" }, + { url = "https://files.pythonhosted.org/packages/59/ad/9caa9b9c836d9ad6f067157a531ac48b7d36499f5036d4141ce78c230b1b/frozenlist-1.8.0-cp314-cp314-win_amd64.whl", hash = "sha256:3e0761f4d1a44f1d1a47996511752cf3dcec5bbdd9cc2b4fe595caf97754b7a0", size = 44330, upload-time = "2025-10-06T05:37:26.928Z" }, + { url = "https://files.pythonhosted.org/packages/82/13/e6950121764f2676f43534c555249f57030150260aee9dcf7d64efda11dd/frozenlist-1.8.0-cp314-cp314-win_arm64.whl", hash = "sha256:d1eaff1d00c7751b7c6662e9c5ba6eb2c17a2306ba5e2a37f24ddf3cc953402b", size = 40627, upload-time = "2025-10-06T05:37:28.075Z" }, + { url = 
"https://files.pythonhosted.org/packages/c0/c7/43200656ecc4e02d3f8bc248df68256cd9572b3f0017f0a0c4e93440ae23/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d3bb933317c52d7ea5004a1c442eef86f426886fba134ef8cf4226ea6ee1821d", size = 89238, upload-time = "2025-10-06T05:37:29.373Z" }, + { url = "https://files.pythonhosted.org/packages/d1/29/55c5f0689b9c0fb765055629f472c0de484dcaf0acee2f7707266ae3583c/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8009897cdef112072f93a0efdce29cd819e717fd2f649ee3016efd3cd885a7ed", size = 50738, upload-time = "2025-10-06T05:37:30.792Z" }, + { url = "https://files.pythonhosted.org/packages/ba/7d/b7282a445956506fa11da8c2db7d276adcbf2b17d8bb8407a47685263f90/frozenlist-1.8.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2c5dcbbc55383e5883246d11fd179782a9d07a986c40f49abe89ddf865913930", size = 51739, upload-time = "2025-10-06T05:37:32.127Z" }, + { url = "https://files.pythonhosted.org/packages/62/1c/3d8622e60d0b767a5510d1d3cf21065b9db874696a51ea6d7a43180a259c/frozenlist-1.8.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:39ecbc32f1390387d2aa4f5a995e465e9e2f79ba3adcac92d68e3e0afae6657c", size = 284186, upload-time = "2025-10-06T05:37:33.21Z" }, + { url = "https://files.pythonhosted.org/packages/2d/14/aa36d5f85a89679a85a1d44cd7a6657e0b1c75f61e7cad987b203d2daca8/frozenlist-1.8.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92db2bf818d5cc8d9c1f1fc56b897662e24ea5adb36ad1f1d82875bd64e03c24", size = 292196, upload-time = "2025-10-06T05:37:36.107Z" }, + { url = "https://files.pythonhosted.org/packages/05/23/6bde59eb55abd407d34f77d39a5126fb7b4f109a3f611d3929f14b700c66/frozenlist-1.8.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2dc43a022e555de94c3b68a4ef0b11c4f747d12c024a520c7101709a2144fb37", size = 273830, upload-time = 
"2025-10-06T05:37:37.663Z" }, + { url = "https://files.pythonhosted.org/packages/d2/3f/22cff331bfad7a8afa616289000ba793347fcd7bc275f3b28ecea2a27909/frozenlist-1.8.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb89a7f2de3602cfed448095bab3f178399646ab7c61454315089787df07733a", size = 294289, upload-time = "2025-10-06T05:37:39.261Z" }, + { url = "https://files.pythonhosted.org/packages/a4/89/5b057c799de4838b6c69aa82b79705f2027615e01be996d2486a69ca99c4/frozenlist-1.8.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:33139dc858c580ea50e7e60a1b0ea003efa1fd42e6ec7fdbad78fff65fad2fd2", size = 300318, upload-time = "2025-10-06T05:37:43.213Z" }, + { url = "https://files.pythonhosted.org/packages/30/de/2c22ab3eb2a8af6d69dc799e48455813bab3690c760de58e1bf43b36da3e/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:168c0969a329b416119507ba30b9ea13688fafffac1b7822802537569a1cb0ef", size = 282814, upload-time = "2025-10-06T05:37:45.337Z" }, + { url = "https://files.pythonhosted.org/packages/59/f7/970141a6a8dbd7f556d94977858cfb36fa9b66e0892c6dd780d2219d8cd8/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:28bd570e8e189d7f7b001966435f9dac6718324b5be2990ac496cf1ea9ddb7fe", size = 291762, upload-time = "2025-10-06T05:37:46.657Z" }, + { url = "https://files.pythonhosted.org/packages/c1/15/ca1adae83a719f82df9116d66f5bb28bb95557b3951903d39135620ef157/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b2a095d45c5d46e5e79ba1e5b9cb787f541a8dee0433836cea4b96a2c439dcd8", size = 289470, upload-time = "2025-10-06T05:37:47.946Z" }, + { url = "https://files.pythonhosted.org/packages/ac/83/dca6dc53bf657d371fbc88ddeb21b79891e747189c5de990b9dfff2ccba1/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:eab8145831a0d56ec9c4139b6c3e594c7a83c2c8be25d5bcf2d86136a532287a", size = 289042, upload-time = 
"2025-10-06T05:37:49.499Z" }, + { url = "https://files.pythonhosted.org/packages/96/52/abddd34ca99be142f354398700536c5bd315880ed0a213812bc491cff5e4/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:974b28cf63cc99dfb2188d8d222bc6843656188164848c4f679e63dae4b0708e", size = 283148, upload-time = "2025-10-06T05:37:50.745Z" }, + { url = "https://files.pythonhosted.org/packages/af/d3/76bd4ed4317e7119c2b7f57c3f6934aba26d277acc6309f873341640e21f/frozenlist-1.8.0-cp314-cp314t-win32.whl", hash = "sha256:342c97bf697ac5480c0a7ec73cd700ecfa5a8a40ac923bd035484616efecc2df", size = 44676, upload-time = "2025-10-06T05:37:52.222Z" }, + { url = "https://files.pythonhosted.org/packages/89/76/c615883b7b521ead2944bb3480398cbb07e12b7b4e4d073d3752eb721558/frozenlist-1.8.0-cp314-cp314t-win_amd64.whl", hash = "sha256:06be8f67f39c8b1dc671f5d83aaefd3358ae5cdcf8314552c57e7ed3e6475bdd", size = 49451, upload-time = "2025-10-06T05:37:53.425Z" }, + { url = "https://files.pythonhosted.org/packages/e0/a3/5982da14e113d07b325230f95060e2169f5311b1017ea8af2a29b374c289/frozenlist-1.8.0-cp314-cp314t-win_arm64.whl", hash = "sha256:102e6314ca4da683dca92e3b1355490fed5f313b768500084fbe6371fddfdb79", size = 42507, upload-time = "2025-10-06T05:37:54.513Z" }, + { url = "https://files.pythonhosted.org/packages/9a/9a/e35b4a917281c0b8419d4207f4334c8e8c5dbf4f3f5f9ada73958d937dcc/frozenlist-1.8.0-py3-none-any.whl", hash = "sha256:0c18a16eab41e82c295618a77502e17b195883241c563b00f0aa5106fc4eaa0d", size = 13409, upload-time = "2025-10-06T05:38:16.721Z" }, +] + +[[package]] +name = "greenlet" +version = "3.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/c7/e5/40dbda2736893e3e53d25838e0f19a2b417dfc122b9989c91918db30b5d3/greenlet-3.3.0.tar.gz", hash = "sha256:a82bb225a4e9e4d653dd2fb7b8b2d36e4fb25bc0165422a11e48b88e9e6f78fb", size = 190651, upload-time = "2025-12-04T14:49:44.05Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/02/2f/28592176381b9ab2cafa12829ba7b472d177f3acc35d8fbcf3673d966fff/greenlet-3.3.0-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:a1e41a81c7e2825822f4e068c48cb2196002362619e2d70b148f20a831c00739", size = 275140, upload-time = "2025-12-04T14:23:01.282Z" }, + { url = "https://files.pythonhosted.org/packages/2c/80/fbe937bf81e9fca98c981fe499e59a3f45df2a04da0baa5c2be0dca0d329/greenlet-3.3.0-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9f515a47d02da4d30caaa85b69474cec77b7929b2e936ff7fb853d42f4bf8808", size = 599219, upload-time = "2025-12-04T14:50:08.309Z" }, + { url = "https://files.pythonhosted.org/packages/c2/ff/7c985128f0514271b8268476af89aee6866df5eec04ac17dcfbc676213df/greenlet-3.3.0-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7d2d9fd66bfadf230b385fdc90426fcd6eb64db54b40c495b72ac0feb5766c54", size = 610211, upload-time = "2025-12-04T14:57:43.968Z" }, + { url = "https://files.pythonhosted.org/packages/79/07/c47a82d881319ec18a4510bb30463ed6891f2ad2c1901ed5ec23d3de351f/greenlet-3.3.0-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:30a6e28487a790417d036088b3bcb3f3ac7d8babaa7d0139edbaddebf3af9492", size = 624311, upload-time = "2025-12-04T15:07:14.697Z" }, + { url = "https://files.pythonhosted.org/packages/fd/8e/424b8c6e78bd9837d14ff7df01a9829fc883ba2ab4ea787d4f848435f23f/greenlet-3.3.0-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:087ea5e004437321508a8d6f20efc4cfec5e3c30118e1417ea96ed1d93950527", size = 612833, upload-time = "2025-12-04T14:26:03.669Z" }, + { url = "https://files.pythonhosted.org/packages/b5/ba/56699ff9b7c76ca12f1cdc27a886d0f81f2189c3455ff9f65246780f713d/greenlet-3.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ab97cf74045343f6c60a39913fa59710e4bd26a536ce7ab2397adf8b27e67c39", size = 1567256, upload-time = "2025-12-04T15:04:25.276Z" }, + { url = 
"https://files.pythonhosted.org/packages/1e/37/f31136132967982d698c71a281a8901daf1a8fbab935dce7c0cf15f942cc/greenlet-3.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5375d2e23184629112ca1ea89a53389dddbffcf417dad40125713d88eb5f96e8", size = 1636483, upload-time = "2025-12-04T14:27:30.804Z" }, + { url = "https://files.pythonhosted.org/packages/7e/71/ba21c3fb8c5dce83b8c01f458a42e99ffdb1963aeec08fff5a18588d8fd7/greenlet-3.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:9ee1942ea19550094033c35d25d20726e4f1c40d59545815e1128ac58d416d38", size = 301833, upload-time = "2025-12-04T14:32:23.929Z" }, + { url = "https://files.pythonhosted.org/packages/d7/7c/f0a6d0ede2c7bf092d00bc83ad5bafb7e6ec9b4aab2fbdfa6f134dc73327/greenlet-3.3.0-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:60c2ef0f578afb3c8d92ea07ad327f9a062547137afe91f38408f08aacab667f", size = 275671, upload-time = "2025-12-04T14:23:05.267Z" }, + { url = "https://files.pythonhosted.org/packages/44/06/dac639ae1a50f5969d82d2e3dd9767d30d6dbdbab0e1a54010c8fe90263c/greenlet-3.3.0-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0a5d554d0712ba1de0a6c94c640f7aeba3f85b3a6e1f2899c11c2c0428da9365", size = 646360, upload-time = "2025-12-04T14:50:10.026Z" }, + { url = "https://files.pythonhosted.org/packages/e0/94/0fb76fe6c5369fba9bf98529ada6f4c3a1adf19e406a47332245ef0eb357/greenlet-3.3.0-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3a898b1e9c5f7307ebbde4102908e6cbfcb9ea16284a3abe15cab996bee8b9b3", size = 658160, upload-time = "2025-12-04T14:57:45.41Z" }, + { url = "https://files.pythonhosted.org/packages/93/79/d2c70cae6e823fac36c3bbc9077962105052b7ef81db2f01ec3b9bf17e2b/greenlet-3.3.0-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:dcd2bdbd444ff340e8d6bdf54d2f206ccddbb3ccfdcd3c25bf4afaa7b8f0cf45", size = 671388, upload-time = "2025-12-04T15:07:15.789Z" }, + { url = 
"https://files.pythonhosted.org/packages/b8/14/bab308fc2c1b5228c3224ec2bf928ce2e4d21d8046c161e44a2012b5203e/greenlet-3.3.0-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5773edda4dc00e173820722711d043799d3adb4f01731f40619e07ea2750b955", size = 660166, upload-time = "2025-12-04T14:26:05.099Z" }, + { url = "https://files.pythonhosted.org/packages/4b/d2/91465d39164eaa0085177f61983d80ffe746c5a1860f009811d498e7259c/greenlet-3.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ac0549373982b36d5fd5d30beb8a7a33ee541ff98d2b502714a09f1169f31b55", size = 1615193, upload-time = "2025-12-04T15:04:27.041Z" }, + { url = "https://files.pythonhosted.org/packages/42/1b/83d110a37044b92423084d52d5d5a3b3a73cafb51b547e6d7366ff62eff1/greenlet-3.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d198d2d977460358c3b3a4dc844f875d1adb33817f0613f663a656f463764ccc", size = 1683653, upload-time = "2025-12-04T14:27:32.366Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9a/9030e6f9aa8fd7808e9c31ba4c38f87c4f8ec324ee67431d181fe396d705/greenlet-3.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:73f51dd0e0bdb596fb0417e475fa3c5e32d4c83638296e560086b8d7da7c4170", size = 305387, upload-time = "2025-12-04T14:26:51.063Z" }, + { url = "https://files.pythonhosted.org/packages/a0/66/bd6317bc5932accf351fc19f177ffba53712a202f9df10587da8df257c7e/greenlet-3.3.0-cp314-cp314t-macosx_11_0_universal2.whl", hash = "sha256:d6ed6f85fae6cdfdb9ce04c9bf7a08d666cfcfb914e7d006f44f840b46741931", size = 282638, upload-time = "2025-12-04T14:25:20.941Z" }, + { url = "https://files.pythonhosted.org/packages/30/cf/cc81cb030b40e738d6e69502ccbd0dd1bced0588e958f9e757945de24404/greenlet-3.3.0-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d9125050fcf24554e69c4cacb086b87b3b55dc395a8b3ebe6487b045b2614388", size = 651145, upload-time = "2025-12-04T14:50:11.039Z" }, + { url = 
"https://files.pythonhosted.org/packages/9c/ea/1020037b5ecfe95ca7df8d8549959baceb8186031da83d5ecceff8b08cd2/greenlet-3.3.0-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:87e63ccfa13c0a0f6234ed0add552af24cc67dd886731f2261e46e241608bee3", size = 654236, upload-time = "2025-12-04T14:57:47.007Z" }, + { url = "https://files.pythonhosted.org/packages/69/cc/1e4bae2e45ca2fa55299f4e85854606a78ecc37fead20d69322f96000504/greenlet-3.3.0-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2662433acbca297c9153a4023fe2161c8dcfdcc91f10433171cf7e7d94ba2221", size = 662506, upload-time = "2025-12-04T15:07:16.906Z" }, + { url = "https://files.pythonhosted.org/packages/57/b9/f8025d71a6085c441a7eaff0fd928bbb275a6633773667023d19179fe815/greenlet-3.3.0-cp314-cp314t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3c6e9b9c1527a78520357de498b0e709fb9e2f49c3a513afd5a249007261911b", size = 653783, upload-time = "2025-12-04T14:26:06.225Z" }, + { url = "https://files.pythonhosted.org/packages/f6/c7/876a8c7a7485d5d6b5c6821201d542ef28be645aa024cfe1145b35c120c1/greenlet-3.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:286d093f95ec98fdd92fcb955003b8a3d054b4e2cab3e2707a5039e7b50520fd", size = 1614857, upload-time = "2025-12-04T15:04:28.484Z" }, + { url = "https://files.pythonhosted.org/packages/4f/dc/041be1dff9f23dac5f48a43323cd0789cb798342011c19a248d9c9335536/greenlet-3.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c10513330af5b8ae16f023e8ddbfb486ab355d04467c4679c5cfe4659975dd9", size = 1676034, upload-time = "2025-12-04T14:27:33.531Z" }, +] + +[[package]] +name = "gtts" +version = "2.5.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/57/79/5ddb1dfcd663581d0d3fca34ccb1d8d841b47c22a24dc8dce416e3d87dfa/gtts-2.5.4.tar.gz", hash = 
"sha256:f5737b585f6442f677dbe8773424fd50697c75bdf3e36443585e30a8d48c1884", size = 24018, upload-time = "2024-11-10T21:58:00.358Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e3/6c/8b8b1fdcaee7e268536f1bb00183a5894627726b54a9ddc6fc9909888447/gTTS-2.5.4-py3-none-any.whl", hash = "sha256:5dd579377f9f5546893bc26315ab1f846933dc27a054764b168f141065ca8436", size = 29184, upload-time = "2024-11-10T21:57:58.448Z" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, +] + +[[package]] +name = "hiredis" +version = "3.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/65/82/d2817ce0653628e0a0cb128533f6af0dd6318a49f3f3a6a7bd1f2f2154af/hiredis-3.3.0.tar.gz", hash = "sha256:105596aad9249634361815c574351f1bd50455dc23b537c2940066c4a9dea685", size = 89048, upload-time = "2025-10-14T16:33:34.263Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6d/39/2b789ebadd1548ccb04a2c18fbc123746ad1a7e248b7f3f3cac618ca10a6/hiredis-3.3.0-cp313-cp313-macosx_10_15_universal2.whl", hash = "sha256:b7048b4ec0d5dddc8ddd03da603de0c4b43ef2540bf6e4c54f47d23e3480a4fa", size = 82035, upload-time = "2025-10-14T16:32:23.715Z" }, + { url = 
"https://files.pythonhosted.org/packages/85/74/4066d9c1093be744158ede277f2a0a4e4cd0fefeaa525c79e2876e9e5c72/hiredis-3.3.0-cp313-cp313-macosx_10_15_x86_64.whl", hash = "sha256:e5f86ce5a779319c15567b79e0be806e8e92c18bb2ea9153e136312fafa4b7d6", size = 46219, upload-time = "2025-10-14T16:32:24.554Z" }, + { url = "https://files.pythonhosted.org/packages/fa/3f/f9e0f6d632f399d95b3635703e1558ffaa2de3aea4cfcbc2d7832606ba43/hiredis-3.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:fbdb97a942e66016fff034df48a7a184e2b7dc69f14c4acd20772e156f20d04b", size = 41860, upload-time = "2025-10-14T16:32:25.356Z" }, + { url = "https://files.pythonhosted.org/packages/4a/c5/b7dde5ec390dabd1cabe7b364a509c66d4e26de783b0b64cf1618f7149fc/hiredis-3.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b0fb4bea72fe45ff13e93ddd1352b43ff0749f9866263b5cca759a4c960c776f", size = 170094, upload-time = "2025-10-14T16:32:26.148Z" }, + { url = "https://files.pythonhosted.org/packages/3e/d6/7f05c08ee74d41613be466935688068e07f7b6c55266784b5ace7b35b766/hiredis-3.3.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:85b9baf98050e8f43c2826ab46aaf775090d608217baf7af7882596aef74e7f9", size = 181746, upload-time = "2025-10-14T16:32:27.844Z" }, + { url = "https://files.pythonhosted.org/packages/0e/d2/aaf9f8edab06fbf5b766e0cae3996324297c0516a91eb2ca3bd1959a0308/hiredis-3.3.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:69079fb0f0ebb61ba63340b9c4bce9388ad016092ca157e5772eb2818209d930", size = 180465, upload-time = "2025-10-14T16:32:29.185Z" }, + { url = "https://files.pythonhosted.org/packages/8d/1e/93ded8b9b484519b211fc71746a231af98c98928e3ebebb9086ed20bb1ad/hiredis-3.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c17f77b79031ea4b0967d30255d2ae6e7df0603ee2426ad3274067f406938236", size = 172419, 
upload-time = "2025-10-14T16:32:30.059Z" }, + { url = "https://files.pythonhosted.org/packages/68/13/02880458e02bbfcedcaabb8f7510f9dda1c89d7c1921b1bb28c22bb38cbf/hiredis-3.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:45d14f745fc177bc05fc24bdf20e2b515e9a068d3d4cce90a0fb78d04c9c9d9a", size = 166400, upload-time = "2025-10-14T16:32:31.173Z" }, + { url = "https://files.pythonhosted.org/packages/11/60/896e03267670570f19f61dc65a2137fcb2b06e83ab0911d58eeec9f3cb88/hiredis-3.3.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:ba063fdf1eff6377a0c409609cbe890389aefddfec109c2d20fcc19cfdafe9da", size = 176845, upload-time = "2025-10-14T16:32:32.12Z" }, + { url = "https://files.pythonhosted.org/packages/f1/90/a1d4bd0cdcf251fda72ac0bd932f547b48ad3420f89bb2ef91bf6a494534/hiredis-3.3.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:1799cc66353ad066bfdd410135c951959da9f16bcb757c845aab2f21fc4ef099", size = 170365, upload-time = "2025-10-14T16:32:33.035Z" }, + { url = "https://files.pythonhosted.org/packages/f1/9a/7c98f7bb76bdb4a6a6003cf8209721f083e65d2eed2b514f4a5514bda665/hiredis-3.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2cbf71a121996ffac82436b6153290815b746afb010cac19b3290a1644381b07", size = 168022, upload-time = "2025-10-14T16:32:34.81Z" }, + { url = "https://files.pythonhosted.org/packages/0d/ca/672ee658ffe9525558615d955b554ecd36aa185acd4431ccc9701c655c9b/hiredis-3.3.0-cp313-cp313-win32.whl", hash = "sha256:a7cbbc6026bf03659f0b25e94bbf6e64f6c8c22f7b4bc52fe569d041de274194", size = 20533, upload-time = "2025-10-14T16:32:35.7Z" }, + { url = "https://files.pythonhosted.org/packages/20/93/511fd94f6a7b6d72a4cf9c2b159bf3d780585a9a1dca52715dd463825299/hiredis-3.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:a8def89dd19d4e2e4482b7412d453dec4a5898954d9a210d7d05f60576cedef6", size = 22387, upload-time = "2025-10-14T16:32:36.441Z" }, + { url = 
"https://files.pythonhosted.org/packages/aa/b3/b948ee76a6b2bc7e45249861646f91f29704f743b52565cf64cee9c4658b/hiredis-3.3.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c135bda87211f7af9e2fd4e046ab433c576cd17b69e639a0f5bb2eed5e0e71a9", size = 82105, upload-time = "2025-10-14T16:32:37.204Z" }, + { url = "https://files.pythonhosted.org/packages/a2/9b/4210f4ebfb3ab4ada964b8de08190f54cbac147198fb463cd3c111cc13e0/hiredis-3.3.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:2f855c678230aed6fc29b962ce1cc67e5858a785ef3a3fd6b15dece0487a2e60", size = 46237, upload-time = "2025-10-14T16:32:38.07Z" }, + { url = "https://files.pythonhosted.org/packages/b3/7a/e38bfd7d04c05036b4ccc6f42b86b1032185cf6ae426e112a97551fece14/hiredis-3.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4059c78a930cbb33c391452ccce75b137d6f89e2eebf6273d75dafc5c2143c03", size = 41894, upload-time = "2025-10-14T16:32:38.929Z" }, + { url = "https://files.pythonhosted.org/packages/28/d3/eae43d9609c5d9a6effef0586ee47e13a0d84b44264b688d97a75cd17ee5/hiredis-3.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:334a3f1d14c253bb092e187736c3384203bd486b244e726319bbb3f7dffa4a20", size = 170486, upload-time = "2025-10-14T16:32:40.147Z" }, + { url = "https://files.pythonhosted.org/packages/c3/fd/34d664554880b27741ab2916d66207357563b1639e2648685f4c84cfb755/hiredis-3.3.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:fd137b147235447b3d067ec952c5b9b95ca54b71837e1b38dbb2ec03b89f24fc", size = 182031, upload-time = "2025-10-14T16:32:41.06Z" }, + { url = "https://files.pythonhosted.org/packages/08/a3/0c69fdde3f4155b9f7acc64ccffde46f312781469260061b3bbaa487fd34/hiredis-3.3.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8f88f4f2aceb73329ece86a1cb0794fdbc8e6d614cb5ca2d1023c9b7eb432db8", size = 180542, upload-time = "2025-10-14T16:32:42.993Z" }, + { url 
= "https://files.pythonhosted.org/packages/68/7a/ad5da4d7bc241e57c5b0c4fe95aa75d1f2116e6e6c51577394d773216e01/hiredis-3.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:550f4d1538822fc75ebf8cf63adc396b23d4958bdbbad424521f2c0e3dfcb169", size = 172353, upload-time = "2025-10-14T16:32:43.965Z" }, + { url = "https://files.pythonhosted.org/packages/4b/dc/c46eace64eb047a5b31acd5e4b0dc6d2f0390a4a3f6d507442d9efa570ad/hiredis-3.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:54b14211fbd5930fc696f6fcd1f1f364c660970d61af065a80e48a1fa5464dd6", size = 166435, upload-time = "2025-10-14T16:32:44.97Z" }, + { url = "https://files.pythonhosted.org/packages/4a/ac/ad13a714e27883a2e4113c980c94caf46b801b810de5622c40f8d3e8335f/hiredis-3.3.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c9e96f63dbc489fc86f69951e9f83dadb9582271f64f6822c47dcffa6fac7e4a", size = 177218, upload-time = "2025-10-14T16:32:45.936Z" }, + { url = "https://files.pythonhosted.org/packages/c2/38/268fabd85b225271fe1ba82cb4a484fcc1bf922493ff2c74b400f1a6f339/hiredis-3.3.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:106e99885d46684d62ab3ec1d6b01573cc0e0083ac295b11aaa56870b536c7ec", size = 170477, upload-time = "2025-10-14T16:32:46.898Z" }, + { url = "https://files.pythonhosted.org/packages/20/6b/02bb8af810ea04247334ab7148acff7a61c08a8832830c6703f464be83a9/hiredis-3.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:087e2ef3206361281b1a658b5b4263572b6ba99465253e827796964208680459", size = 167915, upload-time = "2025-10-14T16:32:47.847Z" }, + { url = "https://files.pythonhosted.org/packages/83/94/901fa817e667b2e69957626395e6dee416e31609dca738f28e6b545ca6c2/hiredis-3.3.0-cp314-cp314-win32.whl", hash = "sha256:80638ebeab1cefda9420e9fedc7920e1ec7b4f0513a6b23d58c9d13c882f8065", size = 21165, upload-time = "2025-10-14T16:32:50.753Z" }, + { url = 
"https://files.pythonhosted.org/packages/b1/7e/4881b9c1d0b4cdaba11bd10e600e97863f977ea9d67c5988f7ec8cd363e5/hiredis-3.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a68aaf9ba024f4e28cf23df9196ff4e897bd7085872f3a30644dca07fa787816", size = 22996, upload-time = "2025-10-14T16:32:51.543Z" }, + { url = "https://files.pythonhosted.org/packages/a7/b6/d7e6c17da032665a954a89c1e6ee3bd12cb51cd78c37527842b03519981d/hiredis-3.3.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:f7f80442a32ce51ee5d89aeb5a84ee56189a0e0e875f1a57bbf8d462555ae48f", size = 83034, upload-time = "2025-10-14T16:32:52.395Z" }, + { url = "https://files.pythonhosted.org/packages/27/6c/6751b698060cdd1b2d8427702cff367c9ed7a1705bcf3792eb5b896f149b/hiredis-3.3.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:a1a67530da714954ed50579f4fe1ab0ddbac9c43643b1721c2cb226a50dde263", size = 46701, upload-time = "2025-10-14T16:32:53.572Z" }, + { url = "https://files.pythonhosted.org/packages/ce/8e/20a5cf2c83c7a7e08c76b9abab113f99f71cd57468a9c7909737ce6e9bf8/hiredis-3.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:616868352e47ab355559adca30f4f3859f9db895b4e7bc71e2323409a2add751", size = 42381, upload-time = "2025-10-14T16:32:54.762Z" }, + { url = "https://files.pythonhosted.org/packages/be/0a/547c29c06e8c9c337d0df3eec39da0cf1aad701daf8a9658dd37f25aca66/hiredis-3.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e799b79f3150083e9702fc37e6243c0bd47a443d6eae3f3077b0b3f510d6a145", size = 180313, upload-time = "2025-10-14T16:32:55.644Z" }, + { url = "https://files.pythonhosted.org/packages/89/8a/488de5469e3d0921a1c425045bf00e983d48b2111a90e47cf5769eaa536c/hiredis-3.3.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9ef1dfb0d2c92c3701655e2927e6bbe10c499aba632c7ea57b6392516df3864b", size = 190488, upload-time = "2025-10-14T16:32:56.649Z" }, + { url = 
"https://files.pythonhosted.org/packages/b5/59/8493edc3eb9ae0dbea2b2230c2041a52bc03e390b02ffa3ac0bca2af9aea/hiredis-3.3.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c290da6bc2a57e854c7da9956cd65013483ede935677e84560da3b848f253596", size = 189210, upload-time = "2025-10-14T16:32:57.759Z" }, + { url = "https://files.pythonhosted.org/packages/f0/de/8c9a653922057b32fb1e2546ecd43ef44c9aa1a7cf460c87cae507eb2bc7/hiredis-3.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fd8c438d9e1728f0085bf9b3c9484d19ec31f41002311464e75b69550c32ffa8", size = 180972, upload-time = "2025-10-14T16:32:58.737Z" }, + { url = "https://files.pythonhosted.org/packages/e4/a3/51e6e6afaef2990986d685ca6e254ffbd191f1635a59b2d06c9e5d10c8a2/hiredis-3.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1bbc6b8a88bbe331e3ebf6685452cebca6dfe6d38a6d4efc5651d7e363ba28bd", size = 175315, upload-time = "2025-10-14T16:32:59.774Z" }, + { url = "https://files.pythonhosted.org/packages/96/54/e436312feb97601f70f8b39263b8da5ac4a5d18305ebdfb08ad7621f6119/hiredis-3.3.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:55d8c18fe9a05496c5c04e6eccc695169d89bf358dff964bcad95696958ec05f", size = 185653, upload-time = "2025-10-14T16:33:00.749Z" }, + { url = "https://files.pythonhosted.org/packages/ed/a3/88e66030d066337c6c0f883a912c6d4b2d6d7173490fbbc113a6cbe414ff/hiredis-3.3.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:4ddc79afa76b805d364e202a754666cb3c4d9c85153cbfed522871ff55827838", size = 179032, upload-time = "2025-10-14T16:33:01.711Z" }, + { url = "https://files.pythonhosted.org/packages/bc/1f/fb7375467e9adaa371cd617c2984fefe44bdce73add4c70b8dd8cab1b33a/hiredis-3.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8e8a4b8540581dcd1b2b25827a54cfd538e0afeaa1a0e3ca87ad7126965981cc", size = 176127, upload-time = "2025-10-14T16:33:02.793Z" }, + { url = 
"https://files.pythonhosted.org/packages/66/14/0dc2b99209c400f3b8f24067273e9c3cb383d894e155830879108fb19e98/hiredis-3.3.0-cp314-cp314t-win32.whl", hash = "sha256:298593bb08487753b3afe6dc38bac2532e9bac8dcee8d992ef9977d539cc6776", size = 22024, upload-time = "2025-10-14T16:33:03.812Z" }, + { url = "https://files.pythonhosted.org/packages/b2/2f/8a0befeed8bbe142d5a6cf3b51e8cbe019c32a64a596b0ebcbc007a8f8f1/hiredis-3.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:b442b6ab038a6f3b5109874d2514c4edf389d8d8b553f10f12654548808683bc", size = 23808, upload-time = "2025-10-14T16:33:04.965Z" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, +] + +[[package]] +name = "httptools" +version = "0.7.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b5/46/120a669232c7bdedb9d52d4aeae7e6c7dfe151e99dc70802e2fc7a5e1993/httptools-0.7.1.tar.gz", hash = "sha256:abd72556974f8e7c74a259655924a717a2365b236c882c3f6f8a45fe94703ac9", size = 258961, upload-time = "2025-10-10T03:55:08.559Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/09/8f/c77b1fcbfd262d422f12da02feb0d218fa228d52485b77b953832105bb90/httptools-0.7.1-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:6babce6cfa2a99545c60bfef8bee0cc0545413cb0018f617c8059a30ad985de3", size = 202889, upload-time = "2025-10-10T03:54:47.089Z" }, + { url = "https://files.pythonhosted.org/packages/0a/1a/22887f53602feaa066354867bc49a68fc295c2293433177ee90870a7d517/httptools-0.7.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:601b7628de7504077dd3dcb3791c6b8694bbd967148a6d1f01806509254fb1ca", size = 108180, upload-time = "2025-10-10T03:54:48.052Z" }, + { url = "https://files.pythonhosted.org/packages/32/6a/6aaa91937f0010d288d3d124ca2946d48d60c3a5ee7ca62afe870e3ea011/httptools-0.7.1-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:04c6c0e6c5fb0739c5b8a9eb046d298650a0ff38cf42537fc372b28dc7e4472c", size = 478596, upload-time = "2025-10-10T03:54:48.919Z" }, + { url = "https://files.pythonhosted.org/packages/6d/70/023d7ce117993107be88d2cbca566a7c1323ccbaf0af7eabf2064fe356f6/httptools-0.7.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:69d4f9705c405ae3ee83d6a12283dc9feba8cc6aaec671b412917e644ab4fa66", size = 473268, upload-time = "2025-10-10T03:54:49.993Z" }, + { url = "https://files.pythonhosted.org/packages/32/4d/9dd616c38da088e3f436e9a616e1d0cc66544b8cdac405cc4e81c8679fc7/httptools-0.7.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:44c8f4347d4b31269c8a9205d8a5ee2df5322b09bbbd30f8f862185bb6b05346", size = 455517, upload-time = "2025-10-10T03:54:51.066Z" }, + { url = "https://files.pythonhosted.org/packages/1d/3a/a6c595c310b7df958e739aae88724e24f9246a514d909547778d776799be/httptools-0.7.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:465275d76db4d554918aba40bf1cbebe324670f3dfc979eaffaa5d108e2ed650", size = 458337, upload-time = "2025-10-10T03:54:52.196Z" }, + { url = "https://files.pythonhosted.org/packages/fd/82/88e8d6d2c51edc1cc391b6e044c6c435b6aebe97b1abc33db1b0b24cd582/httptools-0.7.1-cp313-cp313-win_amd64.whl", hash = 
"sha256:322d00c2068d125bd570f7bf78b2d367dad02b919d8581d7476d8b75b294e3e6", size = 85743, upload-time = "2025-10-10T03:54:53.448Z" }, + { url = "https://files.pythonhosted.org/packages/34/50/9d095fcbb6de2d523e027a2f304d4551855c2f46e0b82befd718b8b20056/httptools-0.7.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:c08fe65728b8d70b6923ce31e3956f859d5e1e8548e6f22ec520a962c6757270", size = 203619, upload-time = "2025-10-10T03:54:54.321Z" }, + { url = "https://files.pythonhosted.org/packages/07/f0/89720dc5139ae54b03f861b5e2c55a37dba9a5da7d51e1e824a1f343627f/httptools-0.7.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:7aea2e3c3953521c3c51106ee11487a910d45586e351202474d45472db7d72d3", size = 108714, upload-time = "2025-10-10T03:54:55.163Z" }, + { url = "https://files.pythonhosted.org/packages/b3/cb/eea88506f191fb552c11787c23f9a405f4c7b0c5799bf73f2249cd4f5228/httptools-0.7.1-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0e68b8582f4ea9166be62926077a3334064d422cf08ab87d8b74664f8e9058e1", size = 472909, upload-time = "2025-10-10T03:54:56.056Z" }, + { url = "https://files.pythonhosted.org/packages/e0/4a/a548bdfae6369c0d078bab5769f7b66f17f1bfaa6fa28f81d6be6959066b/httptools-0.7.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:df091cf961a3be783d6aebae963cc9b71e00d57fa6f149025075217bc6a55a7b", size = 470831, upload-time = "2025-10-10T03:54:57.219Z" }, + { url = "https://files.pythonhosted.org/packages/4d/31/14df99e1c43bd132eec921c2e7e11cda7852f65619bc0fc5bdc2d0cb126c/httptools-0.7.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f084813239e1eb403ddacd06a30de3d3e09a9b76e7894dcda2b22f8a726e9c60", size = 452631, upload-time = "2025-10-10T03:54:58.219Z" }, + { url = "https://files.pythonhosted.org/packages/22/d2/b7e131f7be8d854d48cb6d048113c30f9a46dca0c9a8b08fcb3fcd588cdc/httptools-0.7.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = 
"sha256:7347714368fb2b335e9063bc2b96f2f87a9ceffcd9758ac295f8bbcd3ffbc0ca", size = 452910, upload-time = "2025-10-10T03:54:59.366Z" }, + { url = "https://files.pythonhosted.org/packages/53/cf/878f3b91e4e6e011eff6d1fa9ca39f7eb17d19c9d7971b04873734112f30/httptools-0.7.1-cp314-cp314-win_amd64.whl", hash = "sha256:cfabda2a5bb85aa2a904ce06d974a3f30fb36cc63d7feaddec05d2050acede96", size = 88205, upload-time = "2025-10-10T03:55:00.389Z" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, +] + +[[package]] +name = "identify" +version = "2.6.15" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ff/e7/685de97986c916a6d93b3876139e00eef26ad5bbbd61925d670ae8013449/identify-2.6.15.tar.gz", hash = "sha256:e4f4864b96c6557ef2a1e1c951771838f4edc9df3a72ec7118b338801b11c7bf", size = 99311, upload-time = "2025-10-02T17:43:40.631Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0f/1c/e5fd8f973d4f375adb21565739498e2e9a1e54c858a97b9a8ccfdc81da9b/identify-2.6.15-py2.py3-none-any.whl", hash = "sha256:1181ef7608e00704db228516541eb83a88a9f94433a8c80bb9b5bd54b1d81757", size = 99183, upload-time = "2025-10-02T17:43:39.137Z" }, +] + +[[package]] +name = "idna" +version = "3.11" 
+source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" }, +] + +[[package]] +name = "iniconfig" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" }, +] + +[[package]] +name = "instructor" +version = "1.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "diskcache" }, + { name = "docstring-parser" }, + { name = "jinja2" }, + { name = "jiter" }, + { name = "openai" }, + { name = "pre-commit" }, + { name = "pydantic" }, + { name = "pydantic-core" }, + { name = "requests" }, + { name = "rich" }, + { name = "tenacity" }, + { name = "ty" }, + { name = "typer" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/31/f0/7f31609ec2fb84b140ff573abf1cce78cd3a2a3c6479b60aa82b69d40d2a/instructor-1.13.0.tar.gz", hash = 
"sha256:bf838a5c503fafdd034a9b1f8544c5e1f62462eea9f89932bc75c116ad35ab5a", size = 69898121, upload-time = "2025-11-06T04:19:31.034Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/64/6542ac826a4c9b937b67c096a785af1aaa26b22fcb7c81223cfe4038205b/instructor-1.13.0-py3-none-any.whl", hash = "sha256:2b735b6ea0d3194548369a18254f1dde83cb5ec0b182de77adbadd8be73caddc", size = 160904, upload-time = "2025-11-06T04:19:24.674Z" }, +] + +[[package]] +name = "jinja2" +version = "3.1.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, +] + +[[package]] +name = "jiter" +version = "0.11.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a3/68/0357982493a7b20925aece061f7fb7a2678e3b232f8d73a6edb7e5304443/jiter-0.11.1.tar.gz", hash = "sha256:849dcfc76481c0ea0099391235b7ca97d7279e0fa4c86005457ac7c88e8b76dc", size = 168385, upload-time = "2025-10-17T11:31:15.186Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7c/4b/e4dd3c76424fad02a601d570f4f2a8438daea47ba081201a721a903d3f4c/jiter-0.11.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:71b6a920a5550f057d49d0e8bcc60945a8da998019e83f01adf110e226267663", size = 305272, upload-time = "2025-10-17T11:29:39.249Z" }, + { url = 
"https://files.pythonhosted.org/packages/67/83/2cd3ad5364191130f4de80eacc907f693723beaab11a46c7d155b07a092c/jiter-0.11.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0b3de72e925388453a5171be83379549300db01284f04d2a6f244d1d8de36f94", size = 314038, upload-time = "2025-10-17T11:29:40.563Z" }, + { url = "https://files.pythonhosted.org/packages/d3/3c/8e67d9ba524e97d2f04c8f406f8769a23205026b13b0938d16646d6e2d3e/jiter-0.11.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc19dd65a2bd3d9c044c5b4ebf657ca1e6003a97c0fc10f555aa4f7fb9821c00", size = 345977, upload-time = "2025-10-17T11:29:42.009Z" }, + { url = "https://files.pythonhosted.org/packages/8d/a5/489ce64d992c29bccbffabb13961bbb0435e890d7f2d266d1f3df5e917d2/jiter-0.11.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d58faaa936743cd1464540562f60b7ce4fd927e695e8bc31b3da5b914baa9abd", size = 364503, upload-time = "2025-10-17T11:29:43.459Z" }, + { url = "https://files.pythonhosted.org/packages/d4/c0/e321dd83ee231d05c8fe4b1a12caf1f0e8c7a949bf4724d58397104f10f2/jiter-0.11.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:902640c3103625317291cb73773413b4d71847cdf9383ba65528745ff89f1d14", size = 487092, upload-time = "2025-10-17T11:29:44.835Z" }, + { url = "https://files.pythonhosted.org/packages/f9/5e/8f24ec49c8d37bd37f34ec0112e0b1a3b4b5a7b456c8efff1df5e189ad43/jiter-0.11.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:30405f726e4c2ed487b176c09f8b877a957f535d60c1bf194abb8dadedb5836f", size = 376328, upload-time = "2025-10-17T11:29:46.175Z" }, + { url = "https://files.pythonhosted.org/packages/7f/70/ded107620e809327cf7050727e17ccfa79d6385a771b7fe38fb31318ef00/jiter-0.11.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3217f61728b0baadd2551844870f65219ac4a1285d5e1a4abddff3d51fdabe96", size = 356632, upload-time = "2025-10-17T11:29:47.454Z" }, + { url = 
"https://files.pythonhosted.org/packages/19/53/c26f7251613f6a9079275ee43c89b8a973a95ff27532c421abc2a87afb04/jiter-0.11.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b1364cc90c03a8196f35f396f84029f12abe925415049204446db86598c8b72c", size = 384358, upload-time = "2025-10-17T11:29:49.377Z" }, + { url = "https://files.pythonhosted.org/packages/84/16/e0f2cc61e9c4d0b62f6c1bd9b9781d878a427656f88293e2a5335fa8ff07/jiter-0.11.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:53a54bf8e873820ab186b2dca9f6c3303f00d65ae5e7b7d6bda1b95aa472d646", size = 517279, upload-time = "2025-10-17T11:29:50.968Z" }, + { url = "https://files.pythonhosted.org/packages/60/5c/4cd095eaee68961bca3081acbe7c89e12ae24a5dae5fd5d2a13e01ed2542/jiter-0.11.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7e29aca023627b0e0c2392d4248f6414d566ff3974fa08ff2ac8dbb96dfee92a", size = 508276, upload-time = "2025-10-17T11:29:52.619Z" }, + { url = "https://files.pythonhosted.org/packages/4f/25/f459240e69b0e09a7706d96ce203ad615ca36b0fe832308d2b7123abf2d0/jiter-0.11.1-cp313-cp313-win32.whl", hash = "sha256:f153e31d8bca11363751e875c0a70b3d25160ecbaee7b51e457f14498fb39d8b", size = 205593, upload-time = "2025-10-17T11:29:53.938Z" }, + { url = "https://files.pythonhosted.org/packages/7c/16/461bafe22bae79bab74e217a09c907481a46d520c36b7b9fe71ee8c9e983/jiter-0.11.1-cp313-cp313-win_amd64.whl", hash = "sha256:f773f84080b667c69c4ea0403fc67bb08b07e2b7ce1ef335dea5868451e60fed", size = 203518, upload-time = "2025-10-17T11:29:55.216Z" }, + { url = "https://files.pythonhosted.org/packages/7b/72/c45de6e320edb4fa165b7b1a414193b3cae302dd82da2169d315dcc78b44/jiter-0.11.1-cp313-cp313-win_arm64.whl", hash = "sha256:635ecd45c04e4c340d2187bcb1cea204c7cc9d32c1364d251564bf42e0e39c2d", size = 188062, upload-time = "2025-10-17T11:29:56.631Z" }, + { url = 
"https://files.pythonhosted.org/packages/65/9b/4a57922437ca8753ef823f434c2dec5028b237d84fa320f06a3ba1aec6e8/jiter-0.11.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d892b184da4d94d94ddb4031296931c74ec8b325513a541ebfd6dfb9ae89904b", size = 313814, upload-time = "2025-10-17T11:29:58.509Z" }, + { url = "https://files.pythonhosted.org/packages/76/50/62a0683dadca25490a4bedc6a88d59de9af2a3406dd5a576009a73a1d392/jiter-0.11.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa22c223a3041dacb2fcd37c70dfd648b44662b4a48e242592f95bda5ab09d58", size = 344987, upload-time = "2025-10-17T11:30:00.208Z" }, + { url = "https://files.pythonhosted.org/packages/da/00/2355dbfcbf6cdeaddfdca18287f0f38ae49446bb6378e4a5971e9356fc8a/jiter-0.11.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:330e8e6a11ad4980cd66a0f4a3e0e2e0f646c911ce047014f984841924729789", size = 356399, upload-time = "2025-10-17T11:30:02.084Z" }, + { url = "https://files.pythonhosted.org/packages/c9/07/c2bd748d578fa933d894a55bff33f983bc27f75fc4e491b354bef7b78012/jiter-0.11.1-cp313-cp313t-win_amd64.whl", hash = "sha256:09e2e386ebf298547ca3a3704b729471f7ec666c2906c5c26c1a915ea24741ec", size = 203289, upload-time = "2025-10-17T11:30:03.656Z" }, + { url = "https://files.pythonhosted.org/packages/e6/ee/ace64a853a1acbd318eb0ca167bad1cf5ee037207504b83a868a5849747b/jiter-0.11.1-cp313-cp313t-win_arm64.whl", hash = "sha256:fe4a431c291157e11cee7c34627990ea75e8d153894365a3bc84b7a959d23ca8", size = 188284, upload-time = "2025-10-17T11:30:05.046Z" }, + { url = "https://files.pythonhosted.org/packages/8d/00/d6006d069e7b076e4c66af90656b63da9481954f290d5eca8c715f4bf125/jiter-0.11.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:0fa1f70da7a8a9713ff8e5f75ec3f90c0c870be6d526aa95e7c906f6a1c8c676", size = 304624, upload-time = "2025-10-17T11:30:06.678Z" }, + { url = 
"https://files.pythonhosted.org/packages/fc/45/4a0e31eb996b9ccfddbae4d3017b46f358a599ccf2e19fbffa5e531bd304/jiter-0.11.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:569ee559e5046a42feb6828c55307cf20fe43308e3ae0d8e9e4f8d8634d99944", size = 315042, upload-time = "2025-10-17T11:30:08.87Z" }, + { url = "https://files.pythonhosted.org/packages/e7/91/22f5746f5159a28c76acdc0778801f3c1181799aab196dbea2d29e064968/jiter-0.11.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f69955fa1d92e81987f092b233f0be49d4c937da107b7f7dcf56306f1d3fcce9", size = 346357, upload-time = "2025-10-17T11:30:10.222Z" }, + { url = "https://files.pythonhosted.org/packages/f5/4f/57620857d4e1dc75c8ff4856c90cb6c135e61bff9b4ebfb5dc86814e82d7/jiter-0.11.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:090f4c9d4a825e0fcbd0a2647c9a88a0f366b75654d982d95a9590745ff0c48d", size = 365057, upload-time = "2025-10-17T11:30:11.585Z" }, + { url = "https://files.pythonhosted.org/packages/ce/34/caf7f9cc8ae0a5bb25a5440cc76c7452d264d1b36701b90fdadd28fe08ec/jiter-0.11.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bbf3d8cedf9e9d825233e0dcac28ff15c47b7c5512fdfe2e25fd5bbb6e6b0cee", size = 487086, upload-time = "2025-10-17T11:30:13.052Z" }, + { url = "https://files.pythonhosted.org/packages/50/17/85b5857c329d533d433fedf98804ebec696004a1f88cabad202b2ddc55cf/jiter-0.11.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2aa9b1958f9c30d3d1a558b75f0626733c60eb9b7774a86b34d88060be1e67fe", size = 376083, upload-time = "2025-10-17T11:30:14.416Z" }, + { url = "https://files.pythonhosted.org/packages/85/d3/2d9f973f828226e6faebdef034097a2918077ea776fb4d88489949024787/jiter-0.11.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e42d1ca16590b768c5e7d723055acd2633908baacb3628dd430842e2e035aa90", size = 357825, upload-time = "2025-10-17T11:30:15.765Z" }, + { url = 
"https://files.pythonhosted.org/packages/f4/55/848d4dabf2c2c236a05468c315c2cb9dc736c5915e65449ccecdba22fb6f/jiter-0.11.1-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5db4c2486a023820b701a17aec9c5a6173c5ba4393f26662f032f2de9c848b0f", size = 383933, upload-time = "2025-10-17T11:30:17.34Z" }, + { url = "https://files.pythonhosted.org/packages/0b/6c/204c95a4fbb0e26dfa7776c8ef4a878d0c0b215868011cc904bf44f707e2/jiter-0.11.1-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:4573b78777ccfac954859a6eff45cbd9d281d80c8af049d0f1a3d9fc323d5c3a", size = 517118, upload-time = "2025-10-17T11:30:18.684Z" }, + { url = "https://files.pythonhosted.org/packages/88/25/09956644ea5a2b1e7a2a0f665cb69a973b28f4621fa61fc0c0f06ff40a31/jiter-0.11.1-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:7593ac6f40831d7961cb67633c39b9fef6689a211d7919e958f45710504f52d3", size = 508194, upload-time = "2025-10-17T11:30:20.719Z" }, + { url = "https://files.pythonhosted.org/packages/09/49/4d1657355d7f5c9e783083a03a3f07d5858efa6916a7d9634d07db1c23bd/jiter-0.11.1-cp314-cp314-win32.whl", hash = "sha256:87202ec6ff9626ff5f9351507def98fcf0df60e9a146308e8ab221432228f4ea", size = 203961, upload-time = "2025-10-17T11:30:22.073Z" }, + { url = "https://files.pythonhosted.org/packages/76/bd/f063bd5cc2712e7ca3cf6beda50894418fc0cfeb3f6ff45a12d87af25996/jiter-0.11.1-cp314-cp314-win_amd64.whl", hash = "sha256:a5dd268f6531a182c89d0dd9a3f8848e86e92dfff4201b77a18e6b98aa59798c", size = 202804, upload-time = "2025-10-17T11:30:23.452Z" }, + { url = "https://files.pythonhosted.org/packages/52/ca/4d84193dfafef1020bf0bedd5e1a8d0e89cb67c54b8519040effc694964b/jiter-0.11.1-cp314-cp314-win_arm64.whl", hash = "sha256:5d761f863f912a44748a21b5c4979c04252588ded8d1d2760976d2e42cd8d991", size = 188001, upload-time = "2025-10-17T11:30:24.915Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/fa/3b05e5c9d32efc770a8510eeb0b071c42ae93a5b576fd91cee9af91689a1/jiter-0.11.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2cc5a3965285ddc33e0cab933e96b640bc9ba5940cea27ebbbf6695e72d6511c", size = 312561, upload-time = "2025-10-17T11:30:26.742Z" }, + { url = "https://files.pythonhosted.org/packages/50/d3/335822eb216154ddb79a130cbdce88fdf5c3e2b43dc5dba1fd95c485aaf5/jiter-0.11.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b572b3636a784c2768b2342f36a23078c8d3aa6d8a30745398b1bab58a6f1a8", size = 344551, upload-time = "2025-10-17T11:30:28.252Z" }, + { url = "https://files.pythonhosted.org/packages/31/6d/a0bed13676b1398f9b3ba61f32569f20a3ff270291161100956a577b2dd3/jiter-0.11.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ad93e3d67a981f96596d65d2298fe8d1aa649deb5374a2fb6a434410ee11915e", size = 363051, upload-time = "2025-10-17T11:30:30.009Z" }, + { url = "https://files.pythonhosted.org/packages/a4/03/313eda04aa08545a5a04ed5876e52f49ab76a4d98e54578896ca3e16313e/jiter-0.11.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a83097ce379e202dcc3fe3fc71a16d523d1ee9192c8e4e854158f96b3efe3f2f", size = 485897, upload-time = "2025-10-17T11:30:31.429Z" }, + { url = "https://files.pythonhosted.org/packages/5f/13/a1011b9d325e40b53b1b96a17c010b8646013417f3902f97a86325b19299/jiter-0.11.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7042c51e7fbeca65631eb0c332f90c0c082eab04334e7ccc28a8588e8e2804d9", size = 375224, upload-time = "2025-10-17T11:30:33.18Z" }, + { url = "https://files.pythonhosted.org/packages/92/da/1b45026b19dd39b419e917165ff0ea629dbb95f374a3a13d2df95e40a6ac/jiter-0.11.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0a68d679c0e47649a61df591660507608adc2652442de7ec8276538ac46abe08", size = 356606, upload-time = "2025-10-17T11:30:34.572Z" }, + { url = 
"https://files.pythonhosted.org/packages/7a/0c/9acb0e54d6a8ba59ce923a180ebe824b4e00e80e56cefde86cc8e0a948be/jiter-0.11.1-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a1b0da75dbf4b6ec0b3c9e604d1ee8beaf15bc046fff7180f7d89e3cdbd3bb51", size = 384003, upload-time = "2025-10-17T11:30:35.987Z" }, + { url = "https://files.pythonhosted.org/packages/3f/2b/e5a5fe09d6da2145e4eed651e2ce37f3c0cf8016e48b1d302e21fb1628b7/jiter-0.11.1-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:69dd514bf0fa31c62147d6002e5ca2b3e7ef5894f5ac6f0a19752385f4e89437", size = 516946, upload-time = "2025-10-17T11:30:37.425Z" }, + { url = "https://files.pythonhosted.org/packages/5f/fe/db936e16e0228d48eb81f9934e8327e9fde5185e84f02174fcd22a01be87/jiter-0.11.1-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:bb31ac0b339efa24c0ca606febd8b77ef11c58d09af1b5f2be4c99e907b11111", size = 507614, upload-time = "2025-10-17T11:30:38.977Z" }, + { url = "https://files.pythonhosted.org/packages/86/db/c4438e8febfb303486d13c6b72f5eb71cf851e300a0c1f0b4140018dd31f/jiter-0.11.1-cp314-cp314t-win32.whl", hash = "sha256:b2ce0d6156a1d3ad41da3eec63b17e03e296b78b0e0da660876fccfada86d2f7", size = 204043, upload-time = "2025-10-17T11:30:40.308Z" }, + { url = "https://files.pythonhosted.org/packages/36/59/81badb169212f30f47f817dfaabf965bc9b8204fed906fab58104ee541f9/jiter-0.11.1-cp314-cp314t-win_amd64.whl", hash = "sha256:f4db07d127b54c4a2d43b4cf05ff0193e4f73e0dd90c74037e16df0b29f666e1", size = 204046, upload-time = "2025-10-17T11:30:41.692Z" }, + { url = "https://files.pythonhosted.org/packages/dd/01/43f7b4eb61db3e565574c4c5714685d042fb652f9eef7e5a3de6aafa943a/jiter-0.11.1-cp314-cp314t-win_arm64.whl", hash = "sha256:28e4fdf2d7ebfc935523e50d1efa3970043cfaa161674fe66f9642409d001dfe", size = 188069, upload-time = "2025-10-17T11:30:43.23Z" }, +] + +[[package]] +name = "jmespath" +version = "1.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, +] + +[[package]] +name = "librt" +version = "0.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/37/c3/cdff3c10e2e608490dc0a310ccf11ba777b3943ad4fcead2a2ade98c21e1/librt-0.6.3.tar.gz", hash = "sha256:c724a884e642aa2bbad52bb0203ea40406ad742368a5f90da1b220e970384aae", size = 54209, upload-time = "2025-11-29T14:01:56.058Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/dd/aa/3055dd440f8b8b3b7e8624539a0749dd8e1913e978993bcca9ce7e306231/librt-0.6.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9e716f9012148a81f02f46a04fc4c663420c6fbfeacfac0b5e128cf43b4413d3", size = 27874, upload-time = "2025-11-29T14:01:10.615Z" }, + { url = "https://files.pythonhosted.org/packages/ef/93/226d7dd455eaa4c26712b5ccb2dfcca12831baa7f898c8ffd3a831e29fda/librt-0.6.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:669ff2495728009a96339c5ad2612569c6d8be4474e68f3f3ac85d7c3261f5f5", size = 27852, upload-time = "2025-11-29T14:01:11.535Z" }, + { url = "https://files.pythonhosted.org/packages/4e/8b/db9d51191aef4e4cc06285250affe0bb0ad8b2ed815f7ca77951655e6f02/librt-0.6.3-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:349b6873ebccfc24c9efd244e49da9f8a5c10f60f07575e248921aae2123fc42", size = 84264, upload-time = "2025-11-29T14:01:12.461Z" }, + { url = 
"https://files.pythonhosted.org/packages/8d/53/297c96bda3b5a73bdaf748f1e3ae757edd29a0a41a956b9c10379f193417/librt-0.6.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0c74c26736008481c9f6d0adf1aedb5a52aff7361fea98276d1f965c0256ee70", size = 88432, upload-time = "2025-11-29T14:01:13.405Z" }, + { url = "https://files.pythonhosted.org/packages/54/3a/c005516071123278e340f22de72fa53d51e259d49215295c212da16c4dc2/librt-0.6.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:408a36ddc75e91918cb15b03460bdc8a015885025d67e68c6f78f08c3a88f522", size = 89014, upload-time = "2025-11-29T14:01:14.373Z" }, + { url = "https://files.pythonhosted.org/packages/8e/9b/ea715f818d926d17b94c80a12d81a79e95c44f52848e61e8ca1ff29bb9a9/librt-0.6.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e61ab234624c9ffca0248a707feffe6fac2343758a36725d8eb8a6efef0f8c30", size = 90807, upload-time = "2025-11-29T14:01:15.377Z" }, + { url = "https://files.pythonhosted.org/packages/f0/fc/4e2e4c87e002fa60917a8e474fd13c4bac9a759df82be3778573bb1ab954/librt-0.6.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:324462fe7e3896d592b967196512491ec60ca6e49c446fe59f40743d08c97917", size = 88890, upload-time = "2025-11-29T14:01:16.633Z" }, + { url = "https://files.pythonhosted.org/packages/70/7f/c7428734fbdfd4db3d5b9237fc3a857880b2ace66492836f6529fef25d92/librt-0.6.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:36b2ec8c15030002c7f688b4863e7be42820d7c62d9c6eece3db54a2400f0530", size = 92300, upload-time = "2025-11-29T14:01:17.658Z" }, + { url = "https://files.pythonhosted.org/packages/f9/0c/738c4824fdfe74dc0f95d5e90ef9e759d4ecf7fd5ba964d54a7703322251/librt-0.6.3-cp313-cp313-win32.whl", hash = "sha256:25b1b60cb059471c0c0c803e07d0dfdc79e41a0a122f288b819219ed162672a3", size = 20159, upload-time = "2025-11-29T14:01:18.61Z" }, + { url = 
"https://files.pythonhosted.org/packages/f2/95/93d0e61bc617306ecf4c54636b5cbde4947d872563565c4abdd9d07a39d3/librt-0.6.3-cp313-cp313-win_amd64.whl", hash = "sha256:10a95ad074e2a98c9e4abc7f5b7d40e5ecbfa84c04c6ab8a70fabf59bd429b88", size = 21484, upload-time = "2025-11-29T14:01:19.506Z" }, + { url = "https://files.pythonhosted.org/packages/10/23/abd7ace79ab54d1dbee265f13529266f686a7ce2d21ab59a992f989009b6/librt-0.6.3-cp313-cp313-win_arm64.whl", hash = "sha256:17000df14f552e86877d67e4ab7966912224efc9368e998c96a6974a8d609bf9", size = 20935, upload-time = "2025-11-29T14:01:20.415Z" }, + { url = "https://files.pythonhosted.org/packages/83/14/c06cb31152182798ed98be73f54932ab984894f5a8fccf9b73130897a938/librt-0.6.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8e695f25d1a425ad7a272902af8ab8c8d66c1998b177e4b5f5e7b4e215d0c88a", size = 27566, upload-time = "2025-11-29T14:01:21.609Z" }, + { url = "https://files.pythonhosted.org/packages/0c/b1/ce83ca7b057b06150519152f53a0b302d7c33c8692ce2f01f669b5a819d9/librt-0.6.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:3e84a4121a7ae360ca4da436548a9c1ca8ca134a5ced76c893cc5944426164bd", size = 27753, upload-time = "2025-11-29T14:01:22.558Z" }, + { url = "https://files.pythonhosted.org/packages/3b/ec/739a885ef0a2839b6c25f1b01c99149d2cb6a34e933ffc8c051fcd22012e/librt-0.6.3-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:05f385a414de3f950886ea0aad8f109650d4b712cf9cc14cc17f5f62a9ab240b", size = 83178, upload-time = "2025-11-29T14:01:23.555Z" }, + { url = "https://files.pythonhosted.org/packages/db/bd/dc18bb1489d48c0911b9f4d72eae2d304ea264e215ba80f1e6ba4a9fc41d/librt-0.6.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:36a8e337461150b05ca2c7bdedb9e591dfc262c5230422cea398e89d0c746cdc", size = 87266, upload-time = "2025-11-29T14:01:24.532Z" }, + { url = 
"https://files.pythonhosted.org/packages/94/f3/d0c5431b39eef15e48088b2d739ad84b17c2f1a22c0345c6d4c4a42b135e/librt-0.6.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dcbe48f6a03979384f27086484dc2a14959be1613cb173458bd58f714f2c48f3", size = 87623, upload-time = "2025-11-29T14:01:25.798Z" }, + { url = "https://files.pythonhosted.org/packages/3b/15/9a52e90834e4bd6ee16cdbaf551cb32227cbaad27398391a189c489318bc/librt-0.6.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4bca9e4c260233fba37b15c4ec2f78aa99c1a79fbf902d19dd4a763c5c3fb751", size = 89436, upload-time = "2025-11-29T14:01:26.769Z" }, + { url = "https://files.pythonhosted.org/packages/c3/8a/a7e78e46e8486e023c50f21758930ef4793999115229afd65de69e94c9cc/librt-0.6.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:760c25ed6ac968e24803eb5f7deb17ce026902d39865e83036bacbf5cf242aa8", size = 87540, upload-time = "2025-11-29T14:01:27.756Z" }, + { url = "https://files.pythonhosted.org/packages/49/01/93799044a1cccac31f1074b07c583e181829d240539657e7f305ae63ae2a/librt-0.6.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:4aa4a93a353ccff20df6e34fa855ae8fd788832c88f40a9070e3ddd3356a9f0e", size = 90597, upload-time = "2025-11-29T14:01:29.35Z" }, + { url = "https://files.pythonhosted.org/packages/a7/29/00c7f58b8f8eb1bad6529ffb6c9cdcc0890a27dac59ecda04f817ead5277/librt-0.6.3-cp314-cp314-win32.whl", hash = "sha256:cb92741c2b4ea63c09609b064b26f7f5d9032b61ae222558c55832ec3ad0bcaf", size = 18955, upload-time = "2025-11-29T14:01:30.325Z" }, + { url = "https://files.pythonhosted.org/packages/d7/13/2739e6e197a9f751375a37908a6a5b0bff637b81338497a1bcb5817394da/librt-0.6.3-cp314-cp314-win_amd64.whl", hash = "sha256:fdcd095b1b812d756fa5452aca93b962cf620694c0cadb192cec2bb77dcca9a2", size = 20263, upload-time = "2025-11-29T14:01:31.287Z" }, + { url = 
"https://files.pythonhosted.org/packages/e1/73/393868fc2158705ea003114a24e73bb10b03bda31e9ad7b5c5ec6575338b/librt-0.6.3-cp314-cp314-win_arm64.whl", hash = "sha256:822ca79e28720a76a935c228d37da6579edef048a17cd98d406a2484d10eda78", size = 19575, upload-time = "2025-11-29T14:01:32.229Z" }, + { url = "https://files.pythonhosted.org/packages/48/6d/3c8ff3dec21bf804a205286dd63fd28dcdbe00b8dd7eb7ccf2e21a40a0b0/librt-0.6.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:078cd77064d1640cb7b0650871a772956066174d92c8aeda188a489b58495179", size = 28732, upload-time = "2025-11-29T14:01:33.165Z" }, + { url = "https://files.pythonhosted.org/packages/f4/90/e214b8b4aa34ed3d3f1040719c06c4d22472c40c5ef81a922d5af7876eb4/librt-0.6.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5cc22f7f5c0cc50ed69f4b15b9c51d602aabc4500b433aaa2ddd29e578f452f7", size = 29065, upload-time = "2025-11-29T14:01:34.088Z" }, + { url = "https://files.pythonhosted.org/packages/ab/90/ef61ed51f0a7770cc703422d907a757bbd8811ce820c333d3db2fd13542a/librt-0.6.3-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:14b345eb7afb61b9fdcdfda6738946bd11b8e0f6be258666b0646af3b9bb5916", size = 93703, upload-time = "2025-11-29T14:01:35.057Z" }, + { url = "https://files.pythonhosted.org/packages/a8/ae/c30bb119c35962cbe9a908a71da99c168056fc3f6e9bbcbc157d0b724d89/librt-0.6.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6d46aa46aa29b067f0b8b84f448fd9719aaf5f4c621cc279164d76a9dc9ab3e8", size = 98890, upload-time = "2025-11-29T14:01:36.031Z" }, + { url = "https://files.pythonhosted.org/packages/d1/96/47a4a78d252d36f072b79d592df10600d379a895c3880c8cbd2ac699f0ad/librt-0.6.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1b51ba7d9d5d9001494769eca8c0988adce25d0a970c3ba3f2eb9df9d08036fc", size = 98255, upload-time = "2025-11-29T14:01:37.058Z" }, + { url = 
"https://files.pythonhosted.org/packages/e5/28/779b5cc3cd9987683884eb5f5672e3251676bebaaae6b7da1cf366eb1da1/librt-0.6.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ced0925a18fddcff289ef54386b2fc230c5af3c83b11558571124bfc485b8c07", size = 100769, upload-time = "2025-11-29T14:01:38.413Z" }, + { url = "https://files.pythonhosted.org/packages/28/d7/771755e57c375cb9d25a4e106f570607fd856e2cb91b02418db1db954796/librt-0.6.3-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:6bac97e51f66da2ca012adddbe9fd656b17f7368d439de30898f24b39512f40f", size = 98580, upload-time = "2025-11-29T14:01:39.459Z" }, + { url = "https://files.pythonhosted.org/packages/d0/ec/8b157eb8fbc066339a2f34b0aceb2028097d0ed6150a52e23284a311eafe/librt-0.6.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b2922a0e8fa97395553c304edc3bd36168d8eeec26b92478e292e5d4445c1ef0", size = 101706, upload-time = "2025-11-29T14:01:40.474Z" }, + { url = "https://files.pythonhosted.org/packages/82/a8/4aaead9a06c795a318282aebf7d3e3e578fa889ff396e1b640c3be4c7806/librt-0.6.3-cp314-cp314t-win32.whl", hash = "sha256:f33462b19503ba68d80dac8a1354402675849259fb3ebf53b67de86421735a3a", size = 19465, upload-time = "2025-11-29T14:01:41.77Z" }, + { url = "https://files.pythonhosted.org/packages/3a/61/b7e6a02746c1731670c19ba07d86da90b1ae45d29e405c0b5615abf97cde/librt-0.6.3-cp314-cp314t-win_amd64.whl", hash = "sha256:04f8ce401d4f6380cfc42af0f4e67342bf34c820dae01343f58f472dbac75dcf", size = 21042, upload-time = "2025-11-29T14:01:42.865Z" }, + { url = "https://files.pythonhosted.org/packages/0e/3d/72cc9ec90bb80b5b1a65f0bb74a0f540195837baaf3b98c7fa4a7aa9718e/librt-0.6.3-cp314-cp314t-win_arm64.whl", hash = "sha256:afb39550205cc5e5c935762c6bf6a2bb34f7d21a68eadb25e2db7bf3593fecc0", size = 20246, upload-time = "2025-11-29T14:01:44.13Z" }, +] + +[[package]] +name = "mako" +version = "1.3.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" }, +] + +[[package]] +name = "markdown-it-py" +version = "4.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" }, +] + +[[package]] +name = "markupsafe" +version = "3.0.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size 
= 11622, upload-time = "2025-09-27T18:36:41.777Z" }, + { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" }, + { url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" }, + { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" }, + { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" }, + { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" }, + { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size 
= 21588, upload-time = "2025-09-27T18:36:48.82Z" }, + { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" }, + { url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" }, + { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" }, + { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" }, + { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" }, + { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" }, + { url = 
"https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" }, + { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" }, + { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" }, + { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" }, + { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" }, + { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" }, + 
{ url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" }, + { url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" }, + { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" }, + { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" }, + { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" }, + { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" }, + { url = 
"https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" }, + { url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" }, + { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" }, + { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" }, + { url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" }, + { url = 
"https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" }, + { url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" }, + { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" }, + { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" }, + { url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" }, + { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" }, + { url = 
"https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" }, + { url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" }, + { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" }, + { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" }, + { url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" }, + { url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" }, + { url = 
"https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, +] + +[[package]] +name = "multidict" +version = "6.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/80/1e/5492c365f222f907de1039b91f922b93fa4f764c713ee858d235495d8f50/multidict-6.7.0.tar.gz", hash = "sha256:c6e99d9a65ca282e578dfea819cfa9c0a62b2499d8677392e09feaf305e9e6f5", size = 101834, upload-time = "2025-10-06T14:52:30.657Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/86/33272a544eeb36d66e4d9a920602d1a2f57d4ebea4ef3cdfe5a912574c95/multidict-6.7.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:bee7c0588aa0076ce77c0ea5d19a68d76ad81fcd9fe8501003b9a24f9d4000f6", size = 76135, upload-time = "2025-10-06T14:49:54.26Z" }, + { url = "https://files.pythonhosted.org/packages/91/1c/eb97db117a1ebe46d457a3d235a7b9d2e6dcab174f42d1b67663dd9e5371/multidict-6.7.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:7ef6b61cad77091056ce0e7ce69814ef72afacb150b7ac6a3e9470def2198159", size = 45117, upload-time = 
"2025-10-06T14:49:55.82Z" }, + { url = "https://files.pythonhosted.org/packages/f1/d8/6c3442322e41fb1dd4de8bd67bfd11cd72352ac131f6368315617de752f1/multidict-6.7.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:9c0359b1ec12b1d6849c59f9d319610b7f20ef990a6d454ab151aa0e3b9f78ca", size = 43472, upload-time = "2025-10-06T14:49:57.048Z" }, + { url = "https://files.pythonhosted.org/packages/75/3f/e2639e80325af0b6c6febdf8e57cc07043ff15f57fa1ef808f4ccb5ac4cd/multidict-6.7.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:cd240939f71c64bd658f186330603aac1a9a81bf6273f523fca63673cb7378a8", size = 249342, upload-time = "2025-10-06T14:49:58.368Z" }, + { url = "https://files.pythonhosted.org/packages/5d/cc/84e0585f805cbeaa9cbdaa95f9a3d6aed745b9d25700623ac89a6ecff400/multidict-6.7.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a60a4d75718a5efa473ebd5ab685786ba0c67b8381f781d1be14da49f1a2dc60", size = 257082, upload-time = "2025-10-06T14:49:59.89Z" }, + { url = "https://files.pythonhosted.org/packages/b0/9c/ac851c107c92289acbbf5cfb485694084690c1b17e555f44952c26ddc5bd/multidict-6.7.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53a42d364f323275126aff81fb67c5ca1b7a04fda0546245730a55c8c5f24bc4", size = 240704, upload-time = "2025-10-06T14:50:01.485Z" }, + { url = "https://files.pythonhosted.org/packages/50/cc/5f93e99427248c09da95b62d64b25748a5f5c98c7c2ab09825a1d6af0e15/multidict-6.7.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3b29b980d0ddbecb736735ee5bef69bb2ddca56eff603c86f3f29a1128299b4f", size = 266355, upload-time = "2025-10-06T14:50:02.955Z" }, + { url = "https://files.pythonhosted.org/packages/ec/0c/2ec1d883ceb79c6f7f6d7ad90c919c898f5d1c6ea96d322751420211e072/multidict-6.7.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:f8a93b1c0ed2d04b97a5e9336fd2d33371b9a6e29ab7dd6503d63407c20ffbaf", size = 267259, upload-time = "2025-10-06T14:50:04.446Z" }, + { url = "https://files.pythonhosted.org/packages/c6/2d/f0b184fa88d6630aa267680bdb8623fb69cb0d024b8c6f0d23f9a0f406d3/multidict-6.7.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ff96e8815eecacc6645da76c413eb3b3d34cfca256c70b16b286a687d013c32", size = 254903, upload-time = "2025-10-06T14:50:05.98Z" }, + { url = "https://files.pythonhosted.org/packages/06/c9/11ea263ad0df7dfabcad404feb3c0dd40b131bc7f232d5537f2fb1356951/multidict-6.7.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7516c579652f6a6be0e266aec0acd0db80829ca305c3d771ed898538804c2036", size = 252365, upload-time = "2025-10-06T14:50:07.511Z" }, + { url = "https://files.pythonhosted.org/packages/41/88/d714b86ee2c17d6e09850c70c9d310abac3d808ab49dfa16b43aba9d53fd/multidict-6.7.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:040f393368e63fb0f3330e70c26bfd336656bed925e5cbe17c9da839a6ab13ec", size = 250062, upload-time = "2025-10-06T14:50:09.074Z" }, + { url = "https://files.pythonhosted.org/packages/15/fe/ad407bb9e818c2b31383f6131ca19ea7e35ce93cf1310fce69f12e89de75/multidict-6.7.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b3bc26a951007b1057a1c543af845f1c7e3e71cc240ed1ace7bf4484aa99196e", size = 249683, upload-time = "2025-10-06T14:50:10.714Z" }, + { url = "https://files.pythonhosted.org/packages/8c/a4/a89abdb0229e533fb925e7c6e5c40201c2873efebc9abaf14046a4536ee6/multidict-6.7.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:7b022717c748dd1992a83e219587aabe45980d88969f01b316e78683e6285f64", size = 261254, upload-time = "2025-10-06T14:50:12.28Z" }, + { url = "https://files.pythonhosted.org/packages/8d/aa/0e2b27bd88b40a4fb8dc53dd74eecac70edaa4c1dd0707eb2164da3675b3/multidict-6.7.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:9600082733859f00d79dee64effc7aef1beb26adb297416a4ad2116fd61374bd", 
size = 257967, upload-time = "2025-10-06T14:50:14.16Z" }, + { url = "https://files.pythonhosted.org/packages/d0/8e/0c67b7120d5d5f6d874ed85a085f9dc770a7f9d8813e80f44a9fec820bb7/multidict-6.7.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:94218fcec4d72bc61df51c198d098ce2b378e0ccbac41ddbed5ef44092913288", size = 250085, upload-time = "2025-10-06T14:50:15.639Z" }, + { url = "https://files.pythonhosted.org/packages/ba/55/b73e1d624ea4b8fd4dd07a3bb70f6e4c7c6c5d9d640a41c6ffe5cdbd2a55/multidict-6.7.0-cp313-cp313-win32.whl", hash = "sha256:a37bd74c3fa9d00be2d7b8eca074dc56bd8077ddd2917a839bd989612671ed17", size = 41713, upload-time = "2025-10-06T14:50:17.066Z" }, + { url = "https://files.pythonhosted.org/packages/32/31/75c59e7d3b4205075b4c183fa4ca398a2daf2303ddf616b04ae6ef55cffe/multidict-6.7.0-cp313-cp313-win_amd64.whl", hash = "sha256:30d193c6cc6d559db42b6bcec8a5d395d34d60c9877a0b71ecd7c204fcf15390", size = 45915, upload-time = "2025-10-06T14:50:18.264Z" }, + { url = "https://files.pythonhosted.org/packages/31/2a/8987831e811f1184c22bc2e45844934385363ee61c0a2dcfa8f71b87e608/multidict-6.7.0-cp313-cp313-win_arm64.whl", hash = "sha256:ea3334cabe4d41b7ccd01e4d349828678794edbc2d3ae97fc162a3312095092e", size = 43077, upload-time = "2025-10-06T14:50:19.853Z" }, + { url = "https://files.pythonhosted.org/packages/e8/68/7b3a5170a382a340147337b300b9eb25a9ddb573bcdfff19c0fa3f31ffba/multidict-6.7.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:ad9ce259f50abd98a1ca0aa6e490b58c316a0fce0617f609723e40804add2c00", size = 83114, upload-time = "2025-10-06T14:50:21.223Z" }, + { url = "https://files.pythonhosted.org/packages/55/5c/3fa2d07c84df4e302060f555bbf539310980362236ad49f50eeb0a1c1eb9/multidict-6.7.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07f5594ac6d084cbb5de2df218d78baf55ef150b91f0ff8a21cc7a2e3a5a58eb", size = 48442, upload-time = "2025-10-06T14:50:22.871Z" }, + { url = 
"https://files.pythonhosted.org/packages/fc/56/67212d33239797f9bd91962bb899d72bb0f4c35a8652dcdb8ed049bef878/multidict-6.7.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:0591b48acf279821a579282444814a2d8d0af624ae0bc600aa4d1b920b6e924b", size = 46885, upload-time = "2025-10-06T14:50:24.258Z" }, + { url = "https://files.pythonhosted.org/packages/46/d1/908f896224290350721597a61a69cd19b89ad8ee0ae1f38b3f5cd12ea2ac/multidict-6.7.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:749a72584761531d2b9467cfbdfd29487ee21124c304c4b6cb760d8777b27f9c", size = 242588, upload-time = "2025-10-06T14:50:25.716Z" }, + { url = "https://files.pythonhosted.org/packages/ab/67/8604288bbd68680eee0ab568fdcb56171d8b23a01bcd5cb0c8fedf6e5d99/multidict-6.7.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b4c3d199f953acd5b446bf7c0de1fe25d94e09e79086f8dc2f48a11a129cdf1", size = 249966, upload-time = "2025-10-06T14:50:28.192Z" }, + { url = "https://files.pythonhosted.org/packages/20/33/9228d76339f1ba51e3efef7da3ebd91964d3006217aae13211653193c3ff/multidict-6.7.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9fb0211dfc3b51efea2f349ec92c114d7754dd62c01f81c3e32b765b70c45c9b", size = 228618, upload-time = "2025-10-06T14:50:29.82Z" }, + { url = "https://files.pythonhosted.org/packages/f8/2d/25d9b566d10cab1c42b3b9e5b11ef79c9111eaf4463b8c257a3bd89e0ead/multidict-6.7.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a027ec240fe73a8d6281872690b988eed307cd7d91b23998ff35ff577ca688b5", size = 257539, upload-time = "2025-10-06T14:50:31.731Z" }, + { url = "https://files.pythonhosted.org/packages/b6/b1/8d1a965e6637fc33de3c0d8f414485c2b7e4af00f42cab3d84e7b955c222/multidict-6.7.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:d1d964afecdf3a8288789df2f5751dc0a8261138c3768d9af117ed384e538fad", size = 256345, upload-time = "2025-10-06T14:50:33.26Z" }, + { url = "https://files.pythonhosted.org/packages/ba/0c/06b5a8adbdeedada6f4fb8d8f193d44a347223b11939b42953eeb6530b6b/multidict-6.7.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:caf53b15b1b7df9fbd0709aa01409000a2b4dd03a5f6f5cc548183c7c8f8b63c", size = 247934, upload-time = "2025-10-06T14:50:34.808Z" }, + { url = "https://files.pythonhosted.org/packages/8f/31/b2491b5fe167ca044c6eb4b8f2c9f3b8a00b24c432c365358eadac5d7625/multidict-6.7.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:654030da3197d927f05a536a66186070e98765aa5142794c9904555d3a9d8fb5", size = 245243, upload-time = "2025-10-06T14:50:36.436Z" }, + { url = "https://files.pythonhosted.org/packages/61/1a/982913957cb90406c8c94f53001abd9eafc271cb3e70ff6371590bec478e/multidict-6.7.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:2090d3718829d1e484706a2f525e50c892237b2bf9b17a79b059cb98cddc2f10", size = 235878, upload-time = "2025-10-06T14:50:37.953Z" }, + { url = "https://files.pythonhosted.org/packages/be/c0/21435d804c1a1cf7a2608593f4d19bca5bcbd7a81a70b253fdd1c12af9c0/multidict-6.7.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:2d2cfeec3f6f45651b3d408c4acec0ebf3daa9bc8a112a084206f5db5d05b754", size = 243452, upload-time = "2025-10-06T14:50:39.574Z" }, + { url = "https://files.pythonhosted.org/packages/54/0a/4349d540d4a883863191be6eb9a928846d4ec0ea007d3dcd36323bb058ac/multidict-6.7.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:4ef089f985b8c194d341eb2c24ae6e7408c9a0e2e5658699c92f497437d88c3c", size = 252312, upload-time = "2025-10-06T14:50:41.612Z" }, + { url = "https://files.pythonhosted.org/packages/26/64/d5416038dbda1488daf16b676e4dbfd9674dde10a0cc8f4fc2b502d8125d/multidict-6.7.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = 
"sha256:e93a0617cd16998784bf4414c7e40f17a35d2350e5c6f0bd900d3a8e02bd3762", size = 246935, upload-time = "2025-10-06T14:50:43.972Z" }, + { url = "https://files.pythonhosted.org/packages/9f/8c/8290c50d14e49f35e0bd4abc25e1bc7711149ca9588ab7d04f886cdf03d9/multidict-6.7.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f0feece2ef8ebc42ed9e2e8c78fc4aa3cf455733b507c09ef7406364c94376c6", size = 243385, upload-time = "2025-10-06T14:50:45.648Z" }, + { url = "https://files.pythonhosted.org/packages/ef/a0/f83ae75e42d694b3fbad3e047670e511c138be747bc713cf1b10d5096416/multidict-6.7.0-cp313-cp313t-win32.whl", hash = "sha256:19a1d55338ec1be74ef62440ca9e04a2f001a04d0cc49a4983dc320ff0f3212d", size = 47777, upload-time = "2025-10-06T14:50:47.154Z" }, + { url = "https://files.pythonhosted.org/packages/dc/80/9b174a92814a3830b7357307a792300f42c9e94664b01dee8e457551fa66/multidict-6.7.0-cp313-cp313t-win_amd64.whl", hash = "sha256:3da4fb467498df97e986af166b12d01f05d2e04f978a9c1c680ea1988e0bc4b6", size = 53104, upload-time = "2025-10-06T14:50:48.851Z" }, + { url = "https://files.pythonhosted.org/packages/cc/28/04baeaf0428d95bb7a7bea0e691ba2f31394338ba424fb0679a9ed0f4c09/multidict-6.7.0-cp313-cp313t-win_arm64.whl", hash = "sha256:b4121773c49a0776461f4a904cdf6264c88e42218aaa8407e803ca8025872792", size = 45503, upload-time = "2025-10-06T14:50:50.16Z" }, + { url = "https://files.pythonhosted.org/packages/e2/b1/3da6934455dd4b261d4c72f897e3a5728eba81db59959f3a639245891baa/multidict-6.7.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3bab1e4aff7adaa34410f93b1f8e57c4b36b9af0426a76003f441ee1d3c7e842", size = 75128, upload-time = "2025-10-06T14:50:51.92Z" }, + { url = "https://files.pythonhosted.org/packages/14/2c/f069cab5b51d175a1a2cb4ccdf7a2c2dabd58aa5bd933fa036a8d15e2404/multidict-6.7.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:b8512bac933afc3e45fb2b18da8e59b78d4f408399a960339598374d4ae3b56b", size = 44410, upload-time = "2025-10-06T14:50:53.275Z" }, + { url = 
"https://files.pythonhosted.org/packages/42/e2/64bb41266427af6642b6b128e8774ed84c11b80a90702c13ac0a86bb10cc/multidict-6.7.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:79dcf9e477bc65414ebfea98ffd013cb39552b5ecd62908752e0e413d6d06e38", size = 43205, upload-time = "2025-10-06T14:50:54.911Z" }, + { url = "https://files.pythonhosted.org/packages/02/68/6b086fef8a3f1a8541b9236c594f0c9245617c29841f2e0395d979485cde/multidict-6.7.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:31bae522710064b5cbeddaf2e9f32b1abab70ac6ac91d42572502299e9953128", size = 245084, upload-time = "2025-10-06T14:50:56.369Z" }, + { url = "https://files.pythonhosted.org/packages/15/ee/f524093232007cd7a75c1d132df70f235cfd590a7c9eaccd7ff422ef4ae8/multidict-6.7.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4a0df7ff02397bb63e2fd22af2c87dfa39e8c7f12947bc524dbdc528282c7e34", size = 252667, upload-time = "2025-10-06T14:50:57.991Z" }, + { url = "https://files.pythonhosted.org/packages/02/a5/eeb3f43ab45878f1895118c3ef157a480db58ede3f248e29b5354139c2c9/multidict-6.7.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7a0222514e8e4c514660e182d5156a415c13ef0aabbd71682fc714e327b95e99", size = 233590, upload-time = "2025-10-06T14:50:59.589Z" }, + { url = "https://files.pythonhosted.org/packages/6a/1e/76d02f8270b97269d7e3dbd45644b1785bda457b474315f8cf999525a193/multidict-6.7.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2397ab4daaf2698eb51a76721e98db21ce4f52339e535725de03ea962b5a3202", size = 264112, upload-time = "2025-10-06T14:51:01.183Z" }, + { url = "https://files.pythonhosted.org/packages/76/0b/c28a70ecb58963847c2a8efe334904cd254812b10e535aefb3bcce513918/multidict-6.7.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:8891681594162635948a636c9fe0ff21746aeb3dd5463f6e25d9bea3a8a39ca1", size = 261194, upload-time = "2025-10-06T14:51:02.794Z" }, + { url = "https://files.pythonhosted.org/packages/b4/63/2ab26e4209773223159b83aa32721b4021ffb08102f8ac7d689c943fded1/multidict-6.7.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18706cc31dbf402a7945916dd5cddf160251b6dab8a2c5f3d6d5a55949f676b3", size = 248510, upload-time = "2025-10-06T14:51:04.724Z" }, + { url = "https://files.pythonhosted.org/packages/93/cd/06c1fa8282af1d1c46fd55c10a7930af652afdce43999501d4d68664170c/multidict-6.7.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f844a1bbf1d207dd311a56f383f7eda2d0e134921d45751842d8235e7778965d", size = 248395, upload-time = "2025-10-06T14:51:06.306Z" }, + { url = "https://files.pythonhosted.org/packages/99/ac/82cb419dd6b04ccf9e7e61befc00c77614fc8134362488b553402ecd55ce/multidict-6.7.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:d4393e3581e84e5645506923816b9cc81f5609a778c7e7534054091acc64d1c6", size = 239520, upload-time = "2025-10-06T14:51:08.091Z" }, + { url = "https://files.pythonhosted.org/packages/fa/f3/a0f9bf09493421bd8716a362e0cd1d244f5a6550f5beffdd6b47e885b331/multidict-6.7.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:fbd18dc82d7bf274b37aa48d664534330af744e03bccf696d6f4c6042e7d19e7", size = 245479, upload-time = "2025-10-06T14:51:10.365Z" }, + { url = "https://files.pythonhosted.org/packages/8d/01/476d38fc73a212843f43c852b0eee266b6971f0e28329c2184a8df90c376/multidict-6.7.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:b6234e14f9314731ec45c42fc4554b88133ad53a09092cc48a88e771c125dadb", size = 258903, upload-time = "2025-10-06T14:51:12.466Z" }, + { url = "https://files.pythonhosted.org/packages/49/6d/23faeb0868adba613b817d0e69c5f15531b24d462af8012c4f6de4fa8dc3/multidict-6.7.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = 
"sha256:08d4379f9744d8f78d98c8673c06e202ffa88296f009c71bbafe8a6bf847d01f", size = 252333, upload-time = "2025-10-06T14:51:14.48Z" }, + { url = "https://files.pythonhosted.org/packages/1e/cc/48d02ac22b30fa247f7dad82866e4b1015431092f4ba6ebc7e77596e0b18/multidict-6.7.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:9fe04da3f79387f450fd0061d4dd2e45a72749d31bf634aecc9e27f24fdc4b3f", size = 243411, upload-time = "2025-10-06T14:51:16.072Z" }, + { url = "https://files.pythonhosted.org/packages/4a/03/29a8bf5a18abf1fe34535c88adbdfa88c9fb869b5a3b120692c64abe8284/multidict-6.7.0-cp314-cp314-win32.whl", hash = "sha256:fbafe31d191dfa7c4c51f7a6149c9fb7e914dcf9ffead27dcfd9f1ae382b3885", size = 40940, upload-time = "2025-10-06T14:51:17.544Z" }, + { url = "https://files.pythonhosted.org/packages/82/16/7ed27b680791b939de138f906d5cf2b4657b0d45ca6f5dd6236fdddafb1a/multidict-6.7.0-cp314-cp314-win_amd64.whl", hash = "sha256:2f67396ec0310764b9222a1728ced1ab638f61aadc6226f17a71dd9324f9a99c", size = 45087, upload-time = "2025-10-06T14:51:18.875Z" }, + { url = "https://files.pythonhosted.org/packages/cd/3c/e3e62eb35a1950292fe39315d3c89941e30a9d07d5d2df42965ab041da43/multidict-6.7.0-cp314-cp314-win_arm64.whl", hash = "sha256:ba672b26069957ee369cfa7fc180dde1fc6f176eaf1e6beaf61fbebbd3d9c000", size = 42368, upload-time = "2025-10-06T14:51:20.225Z" }, + { url = "https://files.pythonhosted.org/packages/8b/40/cd499bd0dbc5f1136726db3153042a735fffd0d77268e2ee20d5f33c010f/multidict-6.7.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:c1dcc7524066fa918c6a27d61444d4ee7900ec635779058571f70d042d86ed63", size = 82326, upload-time = "2025-10-06T14:51:21.588Z" }, + { url = "https://files.pythonhosted.org/packages/13/8a/18e031eca251c8df76daf0288e6790561806e439f5ce99a170b4af30676b/multidict-6.7.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:27e0b36c2d388dc7b6ced3406671b401e84ad7eb0656b8f3a2f46ed0ce483718", size = 48065, upload-time = "2025-10-06T14:51:22.93Z" }, + { url = 
"https://files.pythonhosted.org/packages/40/71/5e6701277470a87d234e433fb0a3a7deaf3bcd92566e421e7ae9776319de/multidict-6.7.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2a7baa46a22e77f0988e3b23d4ede5513ebec1929e34ee9495be535662c0dfe2", size = 46475, upload-time = "2025-10-06T14:51:24.352Z" }, + { url = "https://files.pythonhosted.org/packages/fe/6a/bab00cbab6d9cfb57afe1663318f72ec28289ea03fd4e8236bb78429893a/multidict-6.7.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:7bf77f54997a9166a2f5675d1201520586439424c2511723a7312bdb4bcc034e", size = 239324, upload-time = "2025-10-06T14:51:25.822Z" }, + { url = "https://files.pythonhosted.org/packages/2a/5f/8de95f629fc22a7769ade8b41028e3e5a822c1f8904f618d175945a81ad3/multidict-6.7.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e011555abada53f1578d63389610ac8a5400fc70ce71156b0aa30d326f1a5064", size = 246877, upload-time = "2025-10-06T14:51:27.604Z" }, + { url = "https://files.pythonhosted.org/packages/23/b4/38881a960458f25b89e9f4a4fdcb02ac101cfa710190db6e5528841e67de/multidict-6.7.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:28b37063541b897fd6a318007373930a75ca6d6ac7c940dbe14731ffdd8d498e", size = 225824, upload-time = "2025-10-06T14:51:29.664Z" }, + { url = "https://files.pythonhosted.org/packages/1e/39/6566210c83f8a261575f18e7144736059f0c460b362e96e9cf797a24b8e7/multidict-6.7.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:05047ada7a2fde2631a0ed706f1fd68b169a681dfe5e4cf0f8e4cb6618bbc2cd", size = 253558, upload-time = "2025-10-06T14:51:31.684Z" }, + { url = "https://files.pythonhosted.org/packages/00/a3/67f18315100f64c269f46e6c0319fa87ba68f0f64f2b8e7fd7c72b913a0b/multidict-6.7.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:716133f7d1d946a4e1b91b1756b23c088881e70ff180c24e864c26192ad7534a", size = 252339, upload-time = "2025-10-06T14:51:33.699Z" }, + { url = "https://files.pythonhosted.org/packages/c8/2a/1cb77266afee2458d82f50da41beba02159b1d6b1f7973afc9a1cad1499b/multidict-6.7.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d1bed1b467ef657f2a0ae62844a607909ef1c6889562de5e1d505f74457d0b96", size = 244895, upload-time = "2025-10-06T14:51:36.189Z" }, + { url = "https://files.pythonhosted.org/packages/dd/72/09fa7dd487f119b2eb9524946ddd36e2067c08510576d43ff68469563b3b/multidict-6.7.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ca43bdfa5d37bd6aee89d85e1d0831fb86e25541be7e9d376ead1b28974f8e5e", size = 241862, upload-time = "2025-10-06T14:51:41.291Z" }, + { url = "https://files.pythonhosted.org/packages/65/92/bc1f8bd0853d8669300f732c801974dfc3702c3eeadae2f60cef54dc69d7/multidict-6.7.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:44b546bd3eb645fd26fb949e43c02a25a2e632e2ca21a35e2e132c8105dc8599", size = 232376, upload-time = "2025-10-06T14:51:43.55Z" }, + { url = "https://files.pythonhosted.org/packages/09/86/ac39399e5cb9d0c2ac8ef6e10a768e4d3bc933ac808d49c41f9dc23337eb/multidict-6.7.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:a6ef16328011d3f468e7ebc326f24c1445f001ca1dec335b2f8e66bed3006394", size = 240272, upload-time = "2025-10-06T14:51:45.265Z" }, + { url = "https://files.pythonhosted.org/packages/3d/b6/fed5ac6b8563ec72df6cb1ea8dac6d17f0a4a1f65045f66b6d3bf1497c02/multidict-6.7.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:5aa873cbc8e593d361ae65c68f85faadd755c3295ea2c12040ee146802f23b38", size = 248774, upload-time = "2025-10-06T14:51:46.836Z" }, + { url = "https://files.pythonhosted.org/packages/6b/8d/b954d8c0dc132b68f760aefd45870978deec6818897389dace00fcde32ff/multidict-6.7.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = 
"sha256:3d7b6ccce016e29df4b7ca819659f516f0bc7a4b3efa3bb2012ba06431b044f9", size = 242731, upload-time = "2025-10-06T14:51:48.541Z" }, + { url = "https://files.pythonhosted.org/packages/16/9d/a2dac7009125d3540c2f54e194829ea18ac53716c61b655d8ed300120b0f/multidict-6.7.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:171b73bd4ee683d307599b66793ac80981b06f069b62eea1c9e29c9241aa66b0", size = 240193, upload-time = "2025-10-06T14:51:50.355Z" }, + { url = "https://files.pythonhosted.org/packages/39/ca/c05f144128ea232ae2178b008d5011d4e2cea86e4ee8c85c2631b1b94802/multidict-6.7.0-cp314-cp314t-win32.whl", hash = "sha256:b2d7f80c4e1fd010b07cb26820aae86b7e73b681ee4889684fb8d2d4537aab13", size = 48023, upload-time = "2025-10-06T14:51:51.883Z" }, + { url = "https://files.pythonhosted.org/packages/ba/8f/0a60e501584145588be1af5cc829265701ba3c35a64aec8e07cbb71d39bb/multidict-6.7.0-cp314-cp314t-win_amd64.whl", hash = "sha256:09929cab6fcb68122776d575e03c6cc64ee0b8fca48d17e135474b042ce515cd", size = 53507, upload-time = "2025-10-06T14:51:53.672Z" }, + { url = "https://files.pythonhosted.org/packages/7f/ae/3148b988a9c6239903e786eac19c889fab607c31d6efa7fb2147e5680f23/multidict-6.7.0-cp314-cp314t-win_arm64.whl", hash = "sha256:cc41db090ed742f32bd2d2c721861725e6109681eddf835d0a82bd3a5c382827", size = 44804, upload-time = "2025-10-06T14:51:55.415Z" }, + { url = "https://files.pythonhosted.org/packages/b7/da/7d22601b625e241d4f23ef1ebff8acfc60da633c9e7e7922e24d10f592b3/multidict-6.7.0-py3-none-any.whl", hash = "sha256:394fc5c42a333c9ffc3e421a4c85e08580d990e08b99f6bf35b4132114c5dcb3", size = 12317, upload-time = "2025-10-06T14:52:29.272Z" }, +] + +[[package]] +name = "mypy" +version = "1.19.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "librt" }, + { name = "mypy-extensions" }, + { name = "pathspec" }, + { name = "typing-extensions" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/f9/b5/b58cdc25fadd424552804bf410855d52324183112aa004f0732c5f6324cf/mypy-1.19.0.tar.gz", hash = "sha256:f6b874ca77f733222641e5c46e4711648c4037ea13646fd0cdc814c2eaec2528", size = 3579025, upload-time = "2025-11-28T15:49:01.26Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/0d/a1357e6bb49e37ce26fcf7e3cc55679ce9f4ebee0cd8b6ee3a0e301a9210/mypy-1.19.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:7686ed65dbabd24d20066f3115018d2dce030d8fa9db01aa9f0a59b6813e9f9e", size = 13191993, upload-time = "2025-11-28T15:47:22.336Z" }, + { url = "https://files.pythonhosted.org/packages/5d/75/8e5d492a879ec4490e6ba664b5154e48c46c85b5ac9785792a5ec6a4d58f/mypy-1.19.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:fd4a985b2e32f23bead72e2fb4bbe5d6aceee176be471243bd831d5b2644672d", size = 12174411, upload-time = "2025-11-28T15:44:55.492Z" }, + { url = "https://files.pythonhosted.org/packages/71/31/ad5dcee9bfe226e8eaba777e9d9d251c292650130f0450a280aec3485370/mypy-1.19.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fc51a5b864f73a3a182584b1ac75c404396a17eced54341629d8bdcb644a5bba", size = 12727751, upload-time = "2025-11-28T15:44:14.169Z" }, + { url = "https://files.pythonhosted.org/packages/77/06/b6b8994ce07405f6039701f4b66e9d23f499d0b41c6dd46ec28f96d57ec3/mypy-1.19.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:37af5166f9475872034b56c5efdcf65ee25394e9e1d172907b84577120714364", size = 13593323, upload-time = "2025-11-28T15:46:34.699Z" }, + { url = "https://files.pythonhosted.org/packages/68/b1/126e274484cccdf099a8e328d4fda1c7bdb98a5e888fa6010b00e1bbf330/mypy-1.19.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:510c014b722308c9bd377993bcbf9a07d7e0692e5fa8fc70e639c1eb19fc6bee", size = 13818032, upload-time = "2025-11-28T15:46:18.286Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/56/53a8f70f562dfc466c766469133a8a4909f6c0012d83993143f2a9d48d2d/mypy-1.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:cabbee74f29aa9cd3b444ec2f1e4fa5a9d0d746ce7567a6a609e224429781f53", size = 10120644, upload-time = "2025-11-28T15:47:43.99Z" }, + { url = "https://files.pythonhosted.org/packages/b0/f4/7751f32f56916f7f8c229fe902cbdba3e4dd3f3ea9e8b872be97e7fc546d/mypy-1.19.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f2e36bed3c6d9b5f35d28b63ca4b727cb0228e480826ffc8953d1892ddc8999d", size = 13185236, upload-time = "2025-11-28T15:45:20.696Z" }, + { url = "https://files.pythonhosted.org/packages/35/31/871a9531f09e78e8d145032355890384f8a5b38c95a2c7732d226b93242e/mypy-1.19.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a18d8abdda14035c5718acb748faec09571432811af129bf0d9e7b2d6699bf18", size = 12213902, upload-time = "2025-11-28T15:46:10.117Z" }, + { url = "https://files.pythonhosted.org/packages/58/b8/af221910dd40eeefa2077a59107e611550167b9994693fc5926a0b0f87c0/mypy-1.19.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f75e60aca3723a23511948539b0d7ed514dda194bc3755eae0bfc7a6b4887aa7", size = 12738600, upload-time = "2025-11-28T15:44:22.521Z" }, + { url = "https://files.pythonhosted.org/packages/11/9f/c39e89a3e319c1d9c734dedec1183b2cc3aefbab066ec611619002abb932/mypy-1.19.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8f44f2ae3c58421ee05fe609160343c25f70e3967f6e32792b5a78006a9d850f", size = 13592639, upload-time = "2025-11-28T15:48:08.55Z" }, + { url = "https://files.pythonhosted.org/packages/97/6d/ffaf5f01f5e284d9033de1267e6c1b8f3783f2cf784465378a86122e884b/mypy-1.19.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:63ea6a00e4bd6822adbfc75b02ab3653a17c02c4347f5bb0cf1d5b9df3a05835", size = 13799132, upload-time = "2025-11-28T15:47:06.032Z" }, + { url = 
"https://files.pythonhosted.org/packages/fe/b0/c33921e73aaa0106224e5a34822411bea38046188eb781637f5a5b07e269/mypy-1.19.0-cp314-cp314-win_amd64.whl", hash = "sha256:3ad925b14a0bb99821ff6f734553294aa6a3440a8cb082fe1f5b84dfb662afb1", size = 10269832, upload-time = "2025-11-28T15:47:29.392Z" }, + { url = "https://files.pythonhosted.org/packages/09/0e/fe228ed5aeab470c6f4eb82481837fadb642a5aa95cc8215fd2214822c10/mypy-1.19.0-py3-none-any.whl", hash = "sha256:0c01c99d626380752e527d5ce8e69ffbba2046eb8a060db0329690849cf9b6f9", size = 2469714, upload-time = "2025-11-28T15:45:33.22Z" }, +] + +[[package]] +name = "mypy-extensions" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" }, +] + +[[package]] +name = "nodeenv" +version = "1.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" }, 
+] + +[[package]] +name = "openai" +version = "2.9.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "distro" }, + { name = "httpx" }, + { name = "jiter" }, + { name = "pydantic" }, + { name = "sniffio" }, + { name = "tqdm" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/09/48/516290f38745cc1e72856f50e8afed4a7f9ac396a5a18f39e892ab89dfc2/openai-2.9.0.tar.gz", hash = "sha256:b52ec65727fc8f1eed2fbc86c8eac0998900c7ef63aa2eb5c24b69717c56fa5f", size = 608202, upload-time = "2025-12-04T18:15:09.01Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/59/fd/ae2da789cd923dd033c99b8d544071a827c92046b150db01cfa5cea5b3fd/openai-2.9.0-py3-none-any.whl", hash = "sha256:0d168a490fbb45630ad508a6f3022013c155a68fd708069b6a1a01a5e8f0ffad", size = 1030836, upload-time = "2025-12-04T18:15:07.063Z" }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" }, +] + +[[package]] +name = "passlib" +version = "1.7.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b6/06/9da9ee59a67fae7761aab3ccc84fa4f3f33f125b370f1ccdb915bf967c11/passlib-1.7.4.tar.gz", hash = "sha256:defd50f72b65c5402ab2c573830a6978e5f202ad0d984793c8dde2c4152ebe04", size = 689844, upload-time = 
"2020-10-08T19:00:52.121Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/a4/ab6b7589382ca3df236e03faa71deac88cae040af60c071a78d254a62172/passlib-1.7.4-py2.py3-none-any.whl", hash = "sha256:aa6bca462b8d8bda89c70b382f0c298a20b5560af6cbfa2dce410c0a2fb669f1", size = 525554, upload-time = "2020-10-08T19:00:49.856Z" }, +] + +[package.optional-dependencies] +bcrypt = [ + { name = "bcrypt" }, +] + +[[package]] +name = "pathspec" +version = "0.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" }, +] + +[[package]] +name = "platformdirs" +version = "4.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/61/33/9611380c2bdb1225fdef633e2a9610622310fed35ab11dac9620972ee088/platformdirs-4.5.0.tar.gz", hash = "sha256:70ddccdd7c99fc5942e9fc25636a8b34d04c24b335100223152c2803e4063312", size = 21632, upload-time = "2025-10-08T17:44:48.791Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/73/cb/ac7874b3e5d58441674fb70742e6c374b28b0c7cb988d37d991cde47166c/platformdirs-4.5.0-py3-none-any.whl", hash = "sha256:e578a81bb873cbb89a41fcc904c7ef523cc18284b7e3b3ccf06aca1403b7ebd3", size = 18651, upload-time = "2025-10-08T17:44:47.223Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "pre-commit" +version = "4.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cfgv" }, + { name = "identify" }, + { name = "nodeenv" }, + { name = "pyyaml" }, + { name = "virtualenv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f4/9b/6a4ffb4ed980519da959e1cf3122fc6cb41211daa58dbae1c73c0e519a37/pre_commit-4.5.0.tar.gz", hash = "sha256:dc5a065e932b19fc1d4c653c6939068fe54325af8e741e74e88db4d28a4dd66b", size = 198428, upload-time = "2025-11-22T21:02:42.304Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5d/c4/b2d28e9d2edf4f1713eb3c29307f1a63f3d67cf09bdda29715a36a68921a/pre_commit-4.5.0-py2.py3-none-any.whl", hash = "sha256:25e2ce09595174d9c97860a95609f9f852c0614ba602de3561e267547f2335e1", size = 226429, upload-time = "2025-11-22T21:02:40.836Z" }, +] + +[[package]] +name = "propcache" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9e/da/e9fc233cf63743258bff22b3dfa7ea5baef7b5bc324af47a0ad89b8ffc6f/propcache-0.4.1.tar.gz", hash = "sha256:f48107a8c637e80362555f37ecf49abe20370e557cc4ab374f04ec4423c97c3d", size = 46442, upload-time = "2025-10-08T19:49:02.291Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/bf/df/6d9c1b6ac12b003837dde8a10231a7344512186e87b36e855bef32241942/propcache-0.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:43eedf29202c08550aac1d14e0ee619b0430aaef78f85864c1a892294fbc28cf", size = 77750, upload-time = "2025-10-08T19:47:07.648Z" }, + { url = "https://files.pythonhosted.org/packages/8b/e8/677a0025e8a2acf07d3418a2e7ba529c9c33caf09d3c1f25513023c1db56/propcache-0.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d62cdfcfd89ccb8de04e0eda998535c406bf5e060ffd56be6c586cbcc05b3311", size = 44780, upload-time = "2025-10-08T19:47:08.851Z" }, + { url = "https://files.pythonhosted.org/packages/89/a4/92380f7ca60f99ebae761936bc48a72a639e8a47b29050615eef757cb2a7/propcache-0.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cae65ad55793da34db5f54e4029b89d3b9b9490d8abe1b4c7ab5d4b8ec7ebf74", size = 46308, upload-time = "2025-10-08T19:47:09.982Z" }, + { url = "https://files.pythonhosted.org/packages/2d/48/c5ac64dee5262044348d1d78a5f85dd1a57464a60d30daee946699963eb3/propcache-0.4.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:333ddb9031d2704a301ee3e506dc46b1fe5f294ec198ed6435ad5b6a085facfe", size = 208182, upload-time = "2025-10-08T19:47:11.319Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0c/cd762dd011a9287389a6a3eb43aa30207bde253610cca06824aeabfe9653/propcache-0.4.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:fd0858c20f078a32cf55f7e81473d96dcf3b93fd2ccdb3d40fdf54b8573df3af", size = 211215, upload-time = "2025-10-08T19:47:13.146Z" }, + { url = "https://files.pythonhosted.org/packages/30/3e/49861e90233ba36890ae0ca4c660e95df565b2cd15d4a68556ab5865974e/propcache-0.4.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:678ae89ebc632c5c204c794f8dab2837c5f159aeb59e6ed0539500400577298c", size = 218112, upload-time = 
"2025-10-08T19:47:14.913Z" }, + { url = "https://files.pythonhosted.org/packages/f1/8b/544bc867e24e1bd48f3118cecd3b05c694e160a168478fa28770f22fd094/propcache-0.4.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d472aeb4fbf9865e0c6d622d7f4d54a4e101a89715d8904282bb5f9a2f476c3f", size = 204442, upload-time = "2025-10-08T19:47:16.277Z" }, + { url = "https://files.pythonhosted.org/packages/50/a6/4282772fd016a76d3e5c0df58380a5ea64900afd836cec2c2f662d1b9bb3/propcache-0.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4d3df5fa7e36b3225954fba85589da77a0fe6a53e3976de39caf04a0db4c36f1", size = 199398, upload-time = "2025-10-08T19:47:17.962Z" }, + { url = "https://files.pythonhosted.org/packages/3e/ec/d8a7cd406ee1ddb705db2139f8a10a8a427100347bd698e7014351c7af09/propcache-0.4.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ee17f18d2498f2673e432faaa71698032b0127ebf23ae5974eeaf806c279df24", size = 196920, upload-time = "2025-10-08T19:47:19.355Z" }, + { url = "https://files.pythonhosted.org/packages/f6/6c/f38ab64af3764f431e359f8baf9e0a21013e24329e8b85d2da32e8ed07ca/propcache-0.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:580e97762b950f993ae618e167e7be9256b8353c2dcd8b99ec100eb50f5286aa", size = 203748, upload-time = "2025-10-08T19:47:21.338Z" }, + { url = "https://files.pythonhosted.org/packages/d6/e3/fa846bd70f6534d647886621388f0a265254d30e3ce47e5c8e6e27dbf153/propcache-0.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:501d20b891688eb8e7aa903021f0b72d5a55db40ffaab27edefd1027caaafa61", size = 205877, upload-time = "2025-10-08T19:47:23.059Z" }, + { url = "https://files.pythonhosted.org/packages/e2/39/8163fc6f3133fea7b5f2827e8eba2029a0277ab2c5beee6c1db7b10fc23d/propcache-0.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a0bd56e5b100aef69bd8562b74b46254e7c8812918d3baa700c8a8009b0af66", size = 199437, upload-time = "2025-10-08T19:47:24.445Z" }, + { url = 
"https://files.pythonhosted.org/packages/93/89/caa9089970ca49c7c01662bd0eeedfe85494e863e8043565aeb6472ce8fe/propcache-0.4.1-cp313-cp313-win32.whl", hash = "sha256:bcc9aaa5d80322bc2fb24bb7accb4a30f81e90ab8d6ba187aec0744bc302ad81", size = 37586, upload-time = "2025-10-08T19:47:25.736Z" }, + { url = "https://files.pythonhosted.org/packages/f5/ab/f76ec3c3627c883215b5c8080debb4394ef5a7a29be811f786415fc1e6fd/propcache-0.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:381914df18634f5494334d201e98245c0596067504b9372d8cf93f4bb23e025e", size = 40790, upload-time = "2025-10-08T19:47:26.847Z" }, + { url = "https://files.pythonhosted.org/packages/59/1b/e71ae98235f8e2ba5004d8cb19765a74877abf189bc53fc0c80d799e56c3/propcache-0.4.1-cp313-cp313-win_arm64.whl", hash = "sha256:8873eb4460fd55333ea49b7d189749ecf6e55bf85080f11b1c4530ed3034cba1", size = 37158, upload-time = "2025-10-08T19:47:27.961Z" }, + { url = "https://files.pythonhosted.org/packages/83/ce/a31bbdfc24ee0dcbba458c8175ed26089cf109a55bbe7b7640ed2470cfe9/propcache-0.4.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:92d1935ee1f8d7442da9c0c4fa7ac20d07e94064184811b685f5c4fada64553b", size = 81451, upload-time = "2025-10-08T19:47:29.445Z" }, + { url = "https://files.pythonhosted.org/packages/25/9c/442a45a470a68456e710d96cacd3573ef26a1d0a60067e6a7d5e655621ed/propcache-0.4.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:473c61b39e1460d386479b9b2f337da492042447c9b685f28be4f74d3529e566", size = 46374, upload-time = "2025-10-08T19:47:30.579Z" }, + { url = "https://files.pythonhosted.org/packages/f4/bf/b1d5e21dbc3b2e889ea4327044fb16312a736d97640fb8b6aa3f9c7b3b65/propcache-0.4.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c0ef0aaafc66fbd87842a3fe3902fd889825646bc21149eafe47be6072725835", size = 48396, upload-time = "2025-10-08T19:47:31.79Z" }, + { url = 
"https://files.pythonhosted.org/packages/f4/04/5b4c54a103d480e978d3c8a76073502b18db0c4bc17ab91b3cb5092ad949/propcache-0.4.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f95393b4d66bfae908c3ca8d169d5f79cd65636ae15b5e7a4f6e67af675adb0e", size = 275950, upload-time = "2025-10-08T19:47:33.481Z" }, + { url = "https://files.pythonhosted.org/packages/b4/c1/86f846827fb969c4b78b0af79bba1d1ea2156492e1b83dea8b8a6ae27395/propcache-0.4.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c07fda85708bc48578467e85099645167a955ba093be0a2dcba962195676e859", size = 273856, upload-time = "2025-10-08T19:47:34.906Z" }, + { url = "https://files.pythonhosted.org/packages/36/1d/fc272a63c8d3bbad6878c336c7a7dea15e8f2d23a544bda43205dfa83ada/propcache-0.4.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:af223b406d6d000830c6f65f1e6431783fc3f713ba3e6cc8c024d5ee96170a4b", size = 280420, upload-time = "2025-10-08T19:47:36.338Z" }, + { url = "https://files.pythonhosted.org/packages/07/0c/01f2219d39f7e53d52e5173bcb09c976609ba30209912a0680adfb8c593a/propcache-0.4.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a78372c932c90ee474559c5ddfffd718238e8673c340dc21fe45c5b8b54559a0", size = 263254, upload-time = "2025-10-08T19:47:37.692Z" }, + { url = "https://files.pythonhosted.org/packages/2d/18/cd28081658ce597898f0c4d174d4d0f3c5b6d4dc27ffafeef835c95eb359/propcache-0.4.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:564d9f0d4d9509e1a870c920a89b2fec951b44bf5ba7d537a9e7c1ccec2c18af", size = 261205, upload-time = "2025-10-08T19:47:39.659Z" }, + { url = "https://files.pythonhosted.org/packages/7a/71/1f9e22eb8b8316701c2a19fa1f388c8a3185082607da8e406a803c9b954e/propcache-0.4.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = 
"sha256:17612831fda0138059cc5546f4d12a2aacfb9e47068c06af35c400ba58ba7393", size = 247873, upload-time = "2025-10-08T19:47:41.084Z" }, + { url = "https://files.pythonhosted.org/packages/4a/65/3d4b61f36af2b4eddba9def857959f1016a51066b4f1ce348e0cf7881f58/propcache-0.4.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:41a89040cb10bd345b3c1a873b2bf36413d48da1def52f268a055f7398514874", size = 262739, upload-time = "2025-10-08T19:47:42.51Z" }, + { url = "https://files.pythonhosted.org/packages/2a/42/26746ab087faa77c1c68079b228810436ccd9a5ce9ac85e2b7307195fd06/propcache-0.4.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e35b88984e7fa64aacecea39236cee32dd9bd8c55f57ba8a75cf2399553f9bd7", size = 263514, upload-time = "2025-10-08T19:47:43.927Z" }, + { url = "https://files.pythonhosted.org/packages/94/13/630690fe201f5502d2403dd3cfd451ed8858fe3c738ee88d095ad2ff407b/propcache-0.4.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f8b465489f927b0df505cbe26ffbeed4d6d8a2bbc61ce90eb074ff129ef0ab1", size = 257781, upload-time = "2025-10-08T19:47:45.448Z" }, + { url = "https://files.pythonhosted.org/packages/92/f7/1d4ec5841505f423469efbfc381d64b7b467438cd5a4bbcbb063f3b73d27/propcache-0.4.1-cp313-cp313t-win32.whl", hash = "sha256:2ad890caa1d928c7c2965b48f3a3815c853180831d0e5503d35cf00c472f4717", size = 41396, upload-time = "2025-10-08T19:47:47.202Z" }, + { url = "https://files.pythonhosted.org/packages/48/f0/615c30622316496d2cbbc29f5985f7777d3ada70f23370608c1d3e081c1f/propcache-0.4.1-cp313-cp313t-win_amd64.whl", hash = "sha256:f7ee0e597f495cf415bcbd3da3caa3bd7e816b74d0d52b8145954c5e6fd3ff37", size = 44897, upload-time = "2025-10-08T19:47:48.336Z" }, + { url = "https://files.pythonhosted.org/packages/fd/ca/6002e46eccbe0e33dcd4069ef32f7f1c9e243736e07adca37ae8c4830ec3/propcache-0.4.1-cp313-cp313t-win_arm64.whl", hash = "sha256:929d7cbe1f01bb7baffb33dc14eb5691c95831450a26354cd210a8155170c93a", size = 39789, upload-time = "2025-10-08T19:47:49.876Z" }, + { url = 
"https://files.pythonhosted.org/packages/8e/5c/bca52d654a896f831b8256683457ceddd490ec18d9ec50e97dfd8fc726a8/propcache-0.4.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3f7124c9d820ba5548d431afb4632301acf965db49e666aa21c305cbe8c6de12", size = 78152, upload-time = "2025-10-08T19:47:51.051Z" }, + { url = "https://files.pythonhosted.org/packages/65/9b/03b04e7d82a5f54fb16113d839f5ea1ede58a61e90edf515f6577c66fa8f/propcache-0.4.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c0d4b719b7da33599dfe3b22d3db1ef789210a0597bc650b7cee9c77c2be8c5c", size = 44869, upload-time = "2025-10-08T19:47:52.594Z" }, + { url = "https://files.pythonhosted.org/packages/b2/fa/89a8ef0468d5833a23fff277b143d0573897cf75bd56670a6d28126c7d68/propcache-0.4.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9f302f4783709a78240ebc311b793f123328716a60911d667e0c036bc5dcbded", size = 46596, upload-time = "2025-10-08T19:47:54.073Z" }, + { url = "https://files.pythonhosted.org/packages/86/bd/47816020d337f4a746edc42fe8d53669965138f39ee117414c7d7a340cfe/propcache-0.4.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c80ee5802e3fb9ea37938e7eecc307fb984837091d5fd262bb37238b1ae97641", size = 206981, upload-time = "2025-10-08T19:47:55.715Z" }, + { url = "https://files.pythonhosted.org/packages/df/f6/c5fa1357cc9748510ee55f37173eb31bfde6d94e98ccd9e6f033f2fc06e1/propcache-0.4.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ed5a841e8bb29a55fb8159ed526b26adc5bdd7e8bd7bf793ce647cb08656cdf4", size = 211490, upload-time = "2025-10-08T19:47:57.499Z" }, + { url = "https://files.pythonhosted.org/packages/80/1e/e5889652a7c4a3846683401a48f0f2e5083ce0ec1a8a5221d8058fbd1adf/propcache-0.4.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:55c72fd6ea2da4c318e74ffdf93c4fe4e926051133657459131a95c846d16d44", size = 215371, upload-time = 
"2025-10-08T19:47:59.317Z" }, + { url = "https://files.pythonhosted.org/packages/b2/f2/889ad4b2408f72fe1a4f6a19491177b30ea7bf1a0fd5f17050ca08cfc882/propcache-0.4.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8326e144341460402713f91df60ade3c999d601e7eb5ff8f6f7862d54de0610d", size = 201424, upload-time = "2025-10-08T19:48:00.67Z" }, + { url = "https://files.pythonhosted.org/packages/27/73/033d63069b57b0812c8bd19f311faebeceb6ba31b8f32b73432d12a0b826/propcache-0.4.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:060b16ae65bc098da7f6d25bf359f1f31f688384858204fe5d652979e0015e5b", size = 197566, upload-time = "2025-10-08T19:48:02.604Z" }, + { url = "https://files.pythonhosted.org/packages/dc/89/ce24f3dc182630b4e07aa6d15f0ff4b14ed4b9955fae95a0b54c58d66c05/propcache-0.4.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:89eb3fa9524f7bec9de6e83cf3faed9d79bffa560672c118a96a171a6f55831e", size = 193130, upload-time = "2025-10-08T19:48:04.499Z" }, + { url = "https://files.pythonhosted.org/packages/a9/24/ef0d5fd1a811fb5c609278d0209c9f10c35f20581fcc16f818da959fc5b4/propcache-0.4.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:dee69d7015dc235f526fe80a9c90d65eb0039103fe565776250881731f06349f", size = 202625, upload-time = "2025-10-08T19:48:06.213Z" }, + { url = "https://files.pythonhosted.org/packages/f5/02/98ec20ff5546f68d673df2f7a69e8c0d076b5abd05ca882dc7ee3a83653d/propcache-0.4.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5558992a00dfd54ccbc64a32726a3357ec93825a418a401f5cc67df0ac5d9e49", size = 204209, upload-time = "2025-10-08T19:48:08.432Z" }, + { url = "https://files.pythonhosted.org/packages/a0/87/492694f76759b15f0467a2a93ab68d32859672b646aa8a04ce4864e7932d/propcache-0.4.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c9b822a577f560fbd9554812526831712c1436d2c046cedee4c3796d3543b144", size = 197797, upload-time = "2025-10-08T19:48:09.968Z" }, + { url = 
"https://files.pythonhosted.org/packages/ee/36/66367de3575db1d2d3f3d177432bd14ee577a39d3f5d1b3d5df8afe3b6e2/propcache-0.4.1-cp314-cp314-win32.whl", hash = "sha256:ab4c29b49d560fe48b696cdcb127dd36e0bc2472548f3bf56cc5cb3da2b2984f", size = 38140, upload-time = "2025-10-08T19:48:11.232Z" }, + { url = "https://files.pythonhosted.org/packages/0c/2a/a758b47de253636e1b8aef181c0b4f4f204bf0dd964914fb2af90a95b49b/propcache-0.4.1-cp314-cp314-win_amd64.whl", hash = "sha256:5a103c3eb905fcea0ab98be99c3a9a5ab2de60228aa5aceedc614c0281cf6153", size = 41257, upload-time = "2025-10-08T19:48:12.707Z" }, + { url = "https://files.pythonhosted.org/packages/34/5e/63bd5896c3fec12edcbd6f12508d4890d23c265df28c74b175e1ef9f4f3b/propcache-0.4.1-cp314-cp314-win_arm64.whl", hash = "sha256:74c1fb26515153e482e00177a1ad654721bf9207da8a494a0c05e797ad27b992", size = 38097, upload-time = "2025-10-08T19:48:13.923Z" }, + { url = "https://files.pythonhosted.org/packages/99/85/9ff785d787ccf9bbb3f3106f79884a130951436f58392000231b4c737c80/propcache-0.4.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:824e908bce90fb2743bd6b59db36eb4f45cd350a39637c9f73b1c1ea66f5b75f", size = 81455, upload-time = "2025-10-08T19:48:15.16Z" }, + { url = "https://files.pythonhosted.org/packages/90/85/2431c10c8e7ddb1445c1f7c4b54d886e8ad20e3c6307e7218f05922cad67/propcache-0.4.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c2b5e7db5328427c57c8e8831abda175421b709672f6cfc3d630c3b7e2146393", size = 46372, upload-time = "2025-10-08T19:48:16.424Z" }, + { url = "https://files.pythonhosted.org/packages/01/20/b0972d902472da9bcb683fa595099911f4d2e86e5683bcc45de60dd05dc3/propcache-0.4.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6f6ff873ed40292cd4969ef5310179afd5db59fdf055897e282485043fc80ad0", size = 48411, upload-time = "2025-10-08T19:48:17.577Z" }, + { url = 
"https://files.pythonhosted.org/packages/e2/e3/7dc89f4f21e8f99bad3d5ddb3a3389afcf9da4ac69e3deb2dcdc96e74169/propcache-0.4.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:49a2dc67c154db2c1463013594c458881a069fcf98940e61a0569016a583020a", size = 275712, upload-time = "2025-10-08T19:48:18.901Z" }, + { url = "https://files.pythonhosted.org/packages/20/67/89800c8352489b21a8047c773067644e3897f02ecbbd610f4d46b7f08612/propcache-0.4.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:005f08e6a0529984491e37d8dbc3dd86f84bd78a8ceb5fa9a021f4c48d4984be", size = 273557, upload-time = "2025-10-08T19:48:20.762Z" }, + { url = "https://files.pythonhosted.org/packages/e2/a1/b52b055c766a54ce6d9c16d9aca0cad8059acd9637cdf8aa0222f4a026ef/propcache-0.4.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5c3310452e0d31390da9035c348633b43d7e7feb2e37be252be6da45abd1abcc", size = 280015, upload-time = "2025-10-08T19:48:22.592Z" }, + { url = "https://files.pythonhosted.org/packages/48/c8/33cee30bd890672c63743049f3c9e4be087e6780906bfc3ec58528be59c1/propcache-0.4.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4c3c70630930447f9ef1caac7728c8ad1c56bc5015338b20fed0d08ea2480b3a", size = 262880, upload-time = "2025-10-08T19:48:23.947Z" }, + { url = "https://files.pythonhosted.org/packages/0c/b1/8f08a143b204b418285c88b83d00edbd61afbc2c6415ffafc8905da7038b/propcache-0.4.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e57061305815dfc910a3634dcf584f08168a8836e6999983569f51a8544cd89", size = 260938, upload-time = "2025-10-08T19:48:25.656Z" }, + { url = "https://files.pythonhosted.org/packages/cf/12/96e4664c82ca2f31e1c8dff86afb867348979eb78d3cb8546a680287a1e9/propcache-0.4.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = 
"sha256:521a463429ef54143092c11a77e04056dd00636f72e8c45b70aaa3140d639726", size = 247641, upload-time = "2025-10-08T19:48:27.207Z" }, + { url = "https://files.pythonhosted.org/packages/18/ed/e7a9cfca28133386ba52278136d42209d3125db08d0a6395f0cba0c0285c/propcache-0.4.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:120c964da3fdc75e3731aa392527136d4ad35868cc556fd09bb6d09172d9a367", size = 262510, upload-time = "2025-10-08T19:48:28.65Z" }, + { url = "https://files.pythonhosted.org/packages/f5/76/16d8bf65e8845dd62b4e2b57444ab81f07f40caa5652b8969b87ddcf2ef6/propcache-0.4.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:d8f353eb14ee3441ee844ade4277d560cdd68288838673273b978e3d6d2c8f36", size = 263161, upload-time = "2025-10-08T19:48:30.133Z" }, + { url = "https://files.pythonhosted.org/packages/e7/70/c99e9edb5d91d5ad8a49fa3c1e8285ba64f1476782fed10ab251ff413ba1/propcache-0.4.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ab2943be7c652f09638800905ee1bab2c544e537edb57d527997a24c13dc1455", size = 257393, upload-time = "2025-10-08T19:48:31.567Z" }, + { url = "https://files.pythonhosted.org/packages/08/02/87b25304249a35c0915d236575bc3574a323f60b47939a2262b77632a3ee/propcache-0.4.1-cp314-cp314t-win32.whl", hash = "sha256:05674a162469f31358c30bcaa8883cb7829fa3110bf9c0991fe27d7896c42d85", size = 42546, upload-time = "2025-10-08T19:48:32.872Z" }, + { url = "https://files.pythonhosted.org/packages/cb/ef/3c6ecf8b317aa982f309835e8f96987466123c6e596646d4e6a1dfcd080f/propcache-0.4.1-cp314-cp314t-win_amd64.whl", hash = "sha256:990f6b3e2a27d683cb7602ed6c86f15ee6b43b1194736f9baaeb93d0016633b1", size = 46259, upload-time = "2025-10-08T19:48:34.226Z" }, + { url = "https://files.pythonhosted.org/packages/c4/2d/346e946d4951f37eca1e4f55be0f0174c52cd70720f84029b02f296f4a38/propcache-0.4.1-cp314-cp314t-win_arm64.whl", hash = "sha256:ecef2343af4cc68e05131e45024ba34f6095821988a9d0a02aa7c73fcc448aa9", size = 40428, upload-time = "2025-10-08T19:48:35.441Z" }, + { url = 
"https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" }, +] + +[[package]] +name = "pyasn1" +version = "0.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322, upload-time = "2024-09-10T22:41:42.55Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135, upload-time = "2024-09-11T16:00:36.122Z" }, +] + +[[package]] +name = "pycparser" +version = "2.23" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" }, +] + +[[package]] +name = "pydantic" +version = "2.12.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/69/44/36f1a6e523abc58ae5f928898e4aca2e0ea509b5aa6f6f392a5d882be928/pydantic-2.12.5.tar.gz", hash = "sha256:4d351024c75c0f085a9febbb665ce8c0c6ec5d30e903bdb6394b7ede26aebb49", size = 821591, upload-time = "2025-11-26T15:11:46.471Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5a/87/b70ad306ebb6f9b585f114d0ac2137d792b48be34d732d60e597c2f8465a/pydantic-2.12.5-py3-none-any.whl", hash = "sha256:e561593fccf61e8a20fc46dfc2dfe075b8be7d0188df33f221ad1f0139180f9d", size = 463580, upload-time = "2025-11-26T15:11:44.605Z" }, +] + +[[package]] +name = "pydantic-core" +version = "2.41.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/71/70/23b021c950c2addd24ec408e9ab05d59b035b39d97cdc1130e1bce647bb6/pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e", size = 460952, upload-time = "2025-11-04T13:43:49.098Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/06/8806241ff1f70d9939f9af039c6c35f2360cf16e93c2ca76f184e76b1564/pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9", size = 2120403, upload-time = "2025-11-04T13:40:25.248Z" }, + { url = "https://files.pythonhosted.org/packages/94/02/abfa0e0bda67faa65fef1c84971c7e45928e108fe24333c81f3bfe35d5f5/pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34", size = 1896206, upload-time = "2025-11-04T13:40:27.099Z" }, + { url = "https://files.pythonhosted.org/packages/15/df/a4c740c0943e93e6500f9eb23f4ca7ec9bf71b19e608ae5b579678c8d02f/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0", size = 1919307, 
upload-time = "2025-11-04T13:40:29.806Z" }, + { url = "https://files.pythonhosted.org/packages/9a/e3/6324802931ae1d123528988e0e86587c2072ac2e5394b4bc2bc34b61ff6e/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33", size = 2063258, upload-time = "2025-11-04T13:40:33.544Z" }, + { url = "https://files.pythonhosted.org/packages/c9/d4/2230d7151d4957dd79c3044ea26346c148c98fbf0ee6ebd41056f2d62ab5/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e", size = 2214917, upload-time = "2025-11-04T13:40:35.479Z" }, + { url = "https://files.pythonhosted.org/packages/e6/9f/eaac5df17a3672fef0081b6c1bb0b82b33ee89aa5cec0d7b05f52fd4a1fa/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2", size = 2332186, upload-time = "2025-11-04T13:40:37.436Z" }, + { url = "https://files.pythonhosted.org/packages/cf/4e/35a80cae583a37cf15604b44240e45c05e04e86f9cfd766623149297e971/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586", size = 2073164, upload-time = "2025-11-04T13:40:40.289Z" }, + { url = "https://files.pythonhosted.org/packages/bf/e3/f6e262673c6140dd3305d144d032f7bd5f7497d3871c1428521f19f9efa2/pydantic_core-2.41.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b93590ae81f7010dbe380cdeab6f515902ebcbefe0b9327cc4804d74e93ae69d", size = 2179146, upload-time = "2025-11-04T13:40:42.809Z" }, + { url = "https://files.pythonhosted.org/packages/75/c7/20bd7fc05f0c6ea2056a4565c6f36f8968c0924f19b7d97bbfea55780e73/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = 
"sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740", size = 2137788, upload-time = "2025-11-04T13:40:44.752Z" }, + { url = "https://files.pythonhosted.org/packages/3a/8d/34318ef985c45196e004bc46c6eab2eda437e744c124ef0dbe1ff2c9d06b/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e", size = 2340133, upload-time = "2025-11-04T13:40:46.66Z" }, + { url = "https://files.pythonhosted.org/packages/9c/59/013626bf8c78a5a5d9350d12e7697d3d4de951a75565496abd40ccd46bee/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858", size = 2324852, upload-time = "2025-11-04T13:40:48.575Z" }, + { url = "https://files.pythonhosted.org/packages/1a/d9/c248c103856f807ef70c18a4f986693a46a8ffe1602e5d361485da502d20/pydantic_core-2.41.5-cp313-cp313-win32.whl", hash = "sha256:650ae77860b45cfa6e2cdafc42618ceafab3a2d9a3811fcfbd3bbf8ac3c40d36", size = 1994679, upload-time = "2025-11-04T13:40:50.619Z" }, + { url = "https://files.pythonhosted.org/packages/9e/8b/341991b158ddab181cff136acd2552c9f35bd30380422a639c0671e99a91/pydantic_core-2.41.5-cp313-cp313-win_amd64.whl", hash = "sha256:79ec52ec461e99e13791ec6508c722742ad745571f234ea6255bed38c6480f11", size = 2019766, upload-time = "2025-11-04T13:40:52.631Z" }, + { url = "https://files.pythonhosted.org/packages/73/7d/f2f9db34af103bea3e09735bb40b021788a5e834c81eedb541991badf8f5/pydantic_core-2.41.5-cp313-cp313-win_arm64.whl", hash = "sha256:3f84d5c1b4ab906093bdc1ff10484838aca54ef08de4afa9de0f5f14d69639cd", size = 1981005, upload-time = "2025-11-04T13:40:54.734Z" }, + { url = "https://files.pythonhosted.org/packages/ea/28/46b7c5c9635ae96ea0fbb779e271a38129df2550f763937659ee6c5dbc65/pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a", size = 2119622, upload-time = 
"2025-11-04T13:40:56.68Z" }, + { url = "https://files.pythonhosted.org/packages/74/1a/145646e5687e8d9a1e8d09acb278c8535ebe9e972e1f162ed338a622f193/pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14", size = 1891725, upload-time = "2025-11-04T13:40:58.807Z" }, + { url = "https://files.pythonhosted.org/packages/23/04/e89c29e267b8060b40dca97bfc64a19b2a3cf99018167ea1677d96368273/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1", size = 1915040, upload-time = "2025-11-04T13:41:00.853Z" }, + { url = "https://files.pythonhosted.org/packages/84/a3/15a82ac7bd97992a82257f777b3583d3e84bdb06ba6858f745daa2ec8a85/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66", size = 2063691, upload-time = "2025-11-04T13:41:03.504Z" }, + { url = "https://files.pythonhosted.org/packages/74/9b/0046701313c6ef08c0c1cf0e028c67c770a4e1275ca73131563c5f2a310a/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869", size = 2213897, upload-time = "2025-11-04T13:41:05.804Z" }, + { url = "https://files.pythonhosted.org/packages/8a/cd/6bac76ecd1b27e75a95ca3a9a559c643b3afcd2dd62086d4b7a32a18b169/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2", size = 2333302, upload-time = "2025-11-04T13:41:07.809Z" }, + { url = "https://files.pythonhosted.org/packages/4c/d2/ef2074dc020dd6e109611a8be4449b98cd25e1b9b8a303c2f0fca2f2bcf7/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375", 
size = 2064877, upload-time = "2025-11-04T13:41:09.827Z" }, + { url = "https://files.pythonhosted.org/packages/18/66/e9db17a9a763d72f03de903883c057b2592c09509ccfe468187f2a2eef29/pydantic_core-2.41.5-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2782c870e99878c634505236d81e5443092fba820f0373997ff75f90f68cd553", size = 2180680, upload-time = "2025-11-04T13:41:12.379Z" }, + { url = "https://files.pythonhosted.org/packages/d3/9e/3ce66cebb929f3ced22be85d4c2399b8e85b622db77dad36b73c5387f8f8/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90", size = 2138960, upload-time = "2025-11-04T13:41:14.627Z" }, + { url = "https://files.pythonhosted.org/packages/a6/62/205a998f4327d2079326b01abee48e502ea739d174f0a89295c481a2272e/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07", size = 2339102, upload-time = "2025-11-04T13:41:16.868Z" }, + { url = "https://files.pythonhosted.org/packages/3c/0d/f05e79471e889d74d3d88f5bd20d0ed189ad94c2423d81ff8d0000aab4ff/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb", size = 2326039, upload-time = "2025-11-04T13:41:18.934Z" }, + { url = "https://files.pythonhosted.org/packages/ec/e1/e08a6208bb100da7e0c4b288eed624a703f4d129bde2da475721a80cab32/pydantic_core-2.41.5-cp314-cp314-win32.whl", hash = "sha256:aec5cf2fd867b4ff45b9959f8b20ea3993fc93e63c7363fe6851424c8a7e7c23", size = 1995126, upload-time = "2025-11-04T13:41:21.418Z" }, + { url = "https://files.pythonhosted.org/packages/48/5d/56ba7b24e9557f99c9237e29f5c09913c81eeb2f3217e40e922353668092/pydantic_core-2.41.5-cp314-cp314-win_amd64.whl", hash = "sha256:8e7c86f27c585ef37c35e56a96363ab8de4e549a95512445b85c96d3e2f7c1bf", size = 2015489, upload-time = "2025-11-04T13:41:24.076Z" }, + { url = 
"https://files.pythonhosted.org/packages/4e/bb/f7a190991ec9e3e0ba22e4993d8755bbc4a32925c0b5b42775c03e8148f9/pydantic_core-2.41.5-cp314-cp314-win_arm64.whl", hash = "sha256:e672ba74fbc2dc8eea59fb6d4aed6845e6905fc2a8afe93175d94a83ba2a01a0", size = 1977288, upload-time = "2025-11-04T13:41:26.33Z" }, + { url = "https://files.pythonhosted.org/packages/92/ed/77542d0c51538e32e15afe7899d79efce4b81eee631d99850edc2f5e9349/pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a", size = 2120255, upload-time = "2025-11-04T13:41:28.569Z" }, + { url = "https://files.pythonhosted.org/packages/bb/3d/6913dde84d5be21e284439676168b28d8bbba5600d838b9dca99de0fad71/pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3", size = 1863760, upload-time = "2025-11-04T13:41:31.055Z" }, + { url = "https://files.pythonhosted.org/packages/5a/f0/e5e6b99d4191da102f2b0eb9687aaa7f5bea5d9964071a84effc3e40f997/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c", size = 1878092, upload-time = "2025-11-04T13:41:33.21Z" }, + { url = "https://files.pythonhosted.org/packages/71/48/36fb760642d568925953bcc8116455513d6e34c4beaa37544118c36aba6d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612", size = 2053385, upload-time = "2025-11-04T13:41:35.508Z" }, + { url = "https://files.pythonhosted.org/packages/20/25/92dc684dd8eb75a234bc1c764b4210cf2646479d54b47bf46061657292a8/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d", size = 2218832, upload-time = "2025-11-04T13:41:37.732Z" }, + { url = 
"https://files.pythonhosted.org/packages/e2/09/f53e0b05023d3e30357d82eb35835d0f6340ca344720a4599cd663dca599/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9", size = 2327585, upload-time = "2025-11-04T13:41:40Z" }, + { url = "https://files.pythonhosted.org/packages/aa/4e/2ae1aa85d6af35a39b236b1b1641de73f5a6ac4d5a7509f77b814885760c/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660", size = 2041078, upload-time = "2025-11-04T13:41:42.323Z" }, + { url = "https://files.pythonhosted.org/packages/cd/13/2e215f17f0ef326fc72afe94776edb77525142c693767fc347ed6288728d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0d2568a8c11bf8225044aa94409e21da0cb09dcdafe9ecd10250b2baad531a9", size = 2173914, upload-time = "2025-11-04T13:41:45.221Z" }, + { url = "https://files.pythonhosted.org/packages/02/7a/f999a6dcbcd0e5660bc348a3991c8915ce6599f4f2c6ac22f01d7a10816c/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3", size = 2129560, upload-time = "2025-11-04T13:41:47.474Z" }, + { url = "https://files.pythonhosted.org/packages/3a/b1/6c990ac65e3b4c079a4fb9f5b05f5b013afa0f4ed6780a3dd236d2cbdc64/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf", size = 2329244, upload-time = "2025-11-04T13:41:49.992Z" }, + { url = "https://files.pythonhosted.org/packages/d9/02/3c562f3a51afd4d88fff8dffb1771b30cfdfd79befd9883ee094f5b6c0d8/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470", size = 2331955, upload-time = "2025-11-04T13:41:54.079Z" }, + { url = 
"https://files.pythonhosted.org/packages/5c/96/5fb7d8c3c17bc8c62fdb031c47d77a1af698f1d7a406b0f79aaa1338f9ad/pydantic_core-2.41.5-cp314-cp314t-win32.whl", hash = "sha256:b4ececa40ac28afa90871c2cc2b9ffd2ff0bf749380fbdf57d165fd23da353aa", size = 1988906, upload-time = "2025-11-04T13:41:56.606Z" }, + { url = "https://files.pythonhosted.org/packages/22/ed/182129d83032702912c2e2d8bbe33c036f342cc735737064668585dac28f/pydantic_core-2.41.5-cp314-cp314t-win_amd64.whl", hash = "sha256:80aa89cad80b32a912a65332f64a4450ed00966111b6615ca6816153d3585a8c", size = 1981607, upload-time = "2025-11-04T13:41:58.889Z" }, + { url = "https://files.pythonhosted.org/packages/9f/ed/068e41660b832bb0b1aa5b58011dea2a3fe0ba7861ff38c4d4904c1c1a99/pydantic_core-2.41.5-cp314-cp314t-win_arm64.whl", hash = "sha256:35b44f37a3199f771c3eaa53051bc8a70cd7b54f333531c59e29fd4db5d15008", size = 1974769, upload-time = "2025-11-04T13:42:01.186Z" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.12.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/43/4b/ac7e0aae12027748076d72a8764ff1c9d82ca75a7a52622e67ed3f765c54/pydantic_settings-2.12.0.tar.gz", hash = "sha256:005538ef951e3c2a68e1c08b292b5f2e71490def8589d4221b95dab00dafcfd0", size = 194184, upload-time = "2025-11-10T14:25:47.013Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/60/5d4751ba3f4a40a6891f24eec885f51afd78d208498268c734e256fb13c4/pydantic_settings-2.12.0-py3-none-any.whl", hash = "sha256:fddb9fd99a5b18da837b29710391e945b1e30c135477f484084ee513adb93809", size = 51880, upload-time = "2025-11-10T14:25:45.546Z" }, +] + +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pyjwt" +version = "2.10.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e7/46/bd74733ff231675599650d3e47f361794b22ef3e3770998dda30d3b63726/pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953", size = 87785, upload-time = "2024-11-28T03:43:29.933Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/61/ad/689f02752eeec26aed679477e80e632ef1b682313be70793d798c1d5fc8f/PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb", size = 22997, upload-time = "2024-11-28T03:43:27.893Z" }, +] + +[[package]] +name = "pytest" +version = "9.0.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/07/56/f013048ac4bc4c1d9be45afd4ab209ea62822fb1598f40687e6bf45dcea4/pytest-9.0.1.tar.gz", hash = "sha256:3e9c069ea73583e255c3b21cf46b8d3c56f6e3a1a8f6da94ccb0fcf57b9d73c8", size = 1564125, upload-time = "2025-11-12T13:05:09.333Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/0b/8b/6300fb80f858cda1c51ffa17075df5d846757081d11ab4aa35cef9e6258b/pytest-9.0.1-py3-none-any.whl", hash = "sha256:67be0030d194df2dfa7b556f2e56fb3c3315bd5c8822c6951162b92b32ce7dad", size = 373668, upload-time = "2025-11-12T13:05:07.379Z" }, +] + +[[package]] +name = "pytest-asyncio" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/90/2c/8af215c0f776415f3590cac4f9086ccefd6fd463befeae41cd4d3f193e5a/pytest_asyncio-1.3.0.tar.gz", hash = "sha256:d7f52f36d231b80ee124cd216ffb19369aa168fc10095013c6b014a34d3ee9e5", size = 50087, upload-time = "2025-11-10T16:07:47.256Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/35/f8b19922b6a25bc0880171a2f1a003eaeb93657475193ab516fd87cac9da/pytest_asyncio-1.3.0-py3-none-any.whl", hash = "sha256:611e26147c7f77640e6d0a92a38ed17c3e9848063698d5c93d5aa7aa11cebff5", size = 15075, upload-time = "2025-11-10T16:07:45.537Z" }, +] + +[[package]] +name = "pytest-cov" +version = "7.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "coverage" }, + { name = "pluggy" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" }, +] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] 
+sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, +] + +[[package]] +name = "python-dotenv" +version = "1.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f0/26/19cadc79a718c5edbec86fd4919a6b6d3f681039a2f6d66d14be94e75fb9/python_dotenv-1.2.1.tar.gz", hash = "sha256:42667e897e16ab0d66954af0e60a9caa94f0fd4ecf3aaf6d2d260eec1aa36ad6", size = 44221, upload-time = "2025-10-26T15:12:10.434Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/14/1b/a298b06749107c305e1fe0f814c6c74aea7b2f1e10989cb30f544a1b3253/python_dotenv-1.2.1-py3-none-any.whl", hash = "sha256:b81ee9561e9ca4004139c6cbba3a238c32b03e4894671e181b671e8cb8425d61", size = 21230, upload-time = "2025-10-26T15:12:09.109Z" }, +] + +[[package]] +name = "python-jose" +version = "3.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "ecdsa" }, + { name = "pyasn1" }, + { name = "rsa" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c6/77/3a1c9039db7124eb039772b935f2244fbb73fc8ee65b9acf2375da1c07bf/python_jose-3.5.0.tar.gz", hash = "sha256:fb4eaa44dbeb1c26dcc69e4bd7ec54a1cb8dd64d3b4d81ef08d90ff453f2b01b", size = 92726, upload-time = "2025-05-28T17:31:54.288Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d9/c3/0bd11992072e6a1c513b16500a5d07f91a24017c5909b02c72c62d7ad024/python_jose-3.5.0-py2.py3-none-any.whl", hash 
= "sha256:abd1202f23d34dfad2c3d28cb8617b90acf34132c7afd60abd0b0b7d3cb55771", size = 34624, upload-time = "2025-05-28T17:31:52.802Z" }, +] + +[package.optional-dependencies] +cryptography = [ + { name = "cryptography" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, +] + +[[package]] +name = "pyyaml" +version = "6.0.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" }, + { url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" }, + { url = 
"https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" }, + { url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" }, + { url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" }, + { url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" }, + { url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" }, + { url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" }, + { url = 
"https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" }, + { url = "https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" }, + { url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" }, + { url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" }, + { url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" }, + { url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" }, + { url = 
"https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = "2025-09-25T21:32:40.865Z" }, + { url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" }, + { url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" }, + { url = "https://files.pythonhosted.org/packages/23/20/bb6982b26a40bb43951265ba29d4c246ef0ff59c9fdcdf0ed04e0687de4d/pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", size = 156429, upload-time = "2025-09-25T21:32:57.844Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f4/a4541072bb9422c8a883ab55255f918fa378ecf083f5b85e87fc2b4eda1b/pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", size = 143912, upload-time = "2025-09-25T21:32:59.247Z" }, + { url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" }, + { url = 
"https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" }, + { url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" }, + { url = "https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" }, + { url = "https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" }, + { url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" }, + { url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" }, + { url 
= "https://files.pythonhosted.org/packages/f0/7a/1c7270340330e575b92f397352af856a8c06f230aa3e76f86b39d01b416a/pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", size = 174062, upload-time = "2025-09-25T21:32:55.767Z" }, + { url = "https://files.pythonhosted.org/packages/f1/12/de94a39c2ef588c7e6455cfbe7343d3b2dc9d6b6b2f40c4c6565744c873d/pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", size = 149341, upload-time = "2025-09-25T21:32:56.828Z" }, +] + +[[package]] +name = "redis" +version = "5.3.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyjwt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6a/cf/128b1b6d7086200c9f387bd4be9b2572a30b90745ef078bd8b235042dc9f/redis-5.3.1.tar.gz", hash = "sha256:ca49577a531ea64039b5a36db3d6cd1a0c7a60c34124d46924a45b956e8cf14c", size = 4626200, upload-time = "2025-07-25T08:06:27.778Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7f/26/5c5fa0e83c3621db835cfc1f1d789b37e7fa99ed54423b5f519beb931aa7/redis-5.3.1-py3-none-any.whl", hash = "sha256:dc1909bd24669cc31b5f67a039700b16ec30571096c5f1f0d9d2324bff31af97", size = 272833, upload-time = "2025-07-25T08:06:26.317Z" }, +] + +[package.optional-dependencies] +hiredis = [ + { name = "hiredis" }, +] + +[[package]] +name = "requests" +version = "2.32.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" }, +] + +[[package]] +name = "respx" +version = "0.22.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "httpx" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f4/7c/96bd0bc759cf009675ad1ee1f96535edcb11e9666b985717eb8c87192a95/respx-0.22.0.tar.gz", hash = "sha256:3c8924caa2a50bd71aefc07aa812f2466ff489f1848c96e954a5362d17095d91", size = 28439, upload-time = "2024-12-19T22:33:59.374Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8e/67/afbb0978d5399bc9ea200f1d4489a23c9a1dad4eee6376242b8182389c79/respx-0.22.0-py2.py3-none-any.whl", hash = "sha256:631128d4c9aba15e56903fb5f66fb1eff412ce28dd387ca3a81339e52dbd3ad0", size = 25127, upload-time = "2024-12-19T22:33:57.837Z" }, +] + +[[package]] +name = "rich" +version = "14.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" }, +] + +[[package]] +name = "rsa" +version = "4.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyasn1" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/da/8a/22b7beea3ee0d44b1916c0c1cb0ee3af23b700b6da9f04991899d0c555d4/rsa-4.9.1.tar.gz", hash = "sha256:e7bdbfdb5497da4c07dfd35530e1a902659db6ff241e39d9953cad06ebd0ae75", size = 29034, upload-time = "2025-04-16T09:51:18.218Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/64/8d/0133e4eb4beed9e425d9a98ed6e081a55d195481b7632472be1af08d2f6b/rsa-4.9.1-py3-none-any.whl", hash = "sha256:68635866661c6836b8d39430f97a996acbd61bfa49406748ea243539fe239762", size = 34696, upload-time = "2025-04-16T09:51:17.142Z" }, +] + +[[package]] +name = "ruff" +version = "0.14.8" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ed/d9/f7a0c4b3a2bf2556cd5d99b05372c29980249ef71e8e32669ba77428c82c/ruff-0.14.8.tar.gz", hash = "sha256:774ed0dd87d6ce925e3b8496feb3a00ac564bea52b9feb551ecd17e0a23d1eed", size = 5765385, upload-time = "2025-12-04T15:06:17.669Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/48/b8/9537b52010134b1d2b72870cc3f92d5fb759394094741b09ceccae183fbe/ruff-0.14.8-py3-none-linux_armv6l.whl", hash = "sha256:ec071e9c82eca417f6111fd39f7043acb53cd3fde9b1f95bbed745962e345afb", size = 13441540, upload-time = "2025-12-04T15:06:14.896Z" }, + { url = "https://files.pythonhosted.org/packages/24/00/99031684efb025829713682012b6dd37279b1f695ed1b01725f85fd94b38/ruff-0.14.8-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:8cdb162a7159f4ca36ce980a18c43d8f036966e7f73f866ac8f493b75e0c27e9", size = 13669384, upload-time = "2025-12-04T15:06:51.809Z" }, + { url = "https://files.pythonhosted.org/packages/72/64/3eb5949169fc19c50c04f28ece2c189d3b6edd57e5b533649dae6ca484fe/ruff-0.14.8-py3-none-macosx_11_0_arm64.whl", hash = "sha256:2e2fcbefe91f9fad0916850edf0854530c15bd1926b6b779de47e9ab619ea38f", size = 12806917, upload-time = "2025-12-04T15:06:08.925Z" }, + { url = 
"https://files.pythonhosted.org/packages/c4/08/5250babb0b1b11910f470370ec0cbc67470231f7cdc033cee57d4976f941/ruff-0.14.8-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9d70721066a296f45786ec31916dc287b44040f553da21564de0ab4d45a869b", size = 13256112, upload-time = "2025-12-04T15:06:23.498Z" }, + { url = "https://files.pythonhosted.org/packages/78/4c/6c588e97a8e8c2d4b522c31a579e1df2b4d003eddfbe23d1f262b1a431ff/ruff-0.14.8-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2c87e09b3cd9d126fc67a9ecd3b5b1d3ded2b9c7fce3f16e315346b9d05cfb52", size = 13227559, upload-time = "2025-12-04T15:06:33.432Z" }, + { url = "https://files.pythonhosted.org/packages/23/ce/5f78cea13eda8eceac71b5f6fa6e9223df9b87bb2c1891c166d1f0dce9f1/ruff-0.14.8-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d62cb310c4fbcb9ee4ac023fe17f984ae1e12b8a4a02e3d21489f9a2a5f730c", size = 13896379, upload-time = "2025-12-04T15:06:02.687Z" }, + { url = "https://files.pythonhosted.org/packages/cf/79/13de4517c4dadce9218a20035b21212a4c180e009507731f0d3b3f5df85a/ruff-0.14.8-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:1af35c2d62633d4da0521178e8a2641c636d2a7153da0bac1b30cfd4ccd91344", size = 15372786, upload-time = "2025-12-04T15:06:29.828Z" }, + { url = "https://files.pythonhosted.org/packages/00/06/33df72b3bb42be8a1c3815fd4fae83fa2945fc725a25d87ba3e42d1cc108/ruff-0.14.8-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:25add4575ffecc53d60eed3f24b1e934493631b48ebbc6ebaf9d8517924aca4b", size = 14990029, upload-time = "2025-12-04T15:06:36.812Z" }, + { url = "https://files.pythonhosted.org/packages/64/61/0f34927bd90925880394de0e081ce1afab66d7b3525336f5771dcf0cb46c/ruff-0.14.8-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4c943d847b7f02f7db4201a0600ea7d244d8a404fbb639b439e987edcf2baf9a", size = 14407037, upload-time = "2025-12-04T15:06:39.979Z" }, + { url = 
"https://files.pythonhosted.org/packages/96/bc/058fe0aefc0fbf0d19614cb6d1a3e2c048f7dc77ca64957f33b12cfdc5ef/ruff-0.14.8-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb6e8bf7b4f627548daa1b69283dac5a296bfe9ce856703b03130732e20ddfe2", size = 14102390, upload-time = "2025-12-04T15:06:46.372Z" }, + { url = "https://files.pythonhosted.org/packages/af/a4/e4f77b02b804546f4c17e8b37a524c27012dd6ff05855d2243b49a7d3cb9/ruff-0.14.8-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:7aaf2974f378e6b01d1e257c6948207aec6a9b5ba53fab23d0182efb887a0e4a", size = 14230793, upload-time = "2025-12-04T15:06:20.497Z" }, + { url = "https://files.pythonhosted.org/packages/3f/52/bb8c02373f79552e8d087cedaffad76b8892033d2876c2498a2582f09dcf/ruff-0.14.8-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e5758ca513c43ad8a4ef13f0f081f80f08008f410790f3611a21a92421ab045b", size = 13160039, upload-time = "2025-12-04T15:06:49.06Z" }, + { url = "https://files.pythonhosted.org/packages/1f/ad/b69d6962e477842e25c0b11622548df746290cc6d76f9e0f4ed7456c2c31/ruff-0.14.8-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:f74f7ba163b6e85a8d81a590363bf71618847e5078d90827749bfda1d88c9cdf", size = 13205158, upload-time = "2025-12-04T15:06:54.574Z" }, + { url = "https://files.pythonhosted.org/packages/06/63/54f23da1315c0b3dfc1bc03fbc34e10378918a20c0b0f086418734e57e74/ruff-0.14.8-py3-none-musllinux_1_2_i686.whl", hash = "sha256:eed28f6fafcc9591994c42254f5a5c5ca40e69a30721d2ab18bb0bb3baac3ab6", size = 13469550, upload-time = "2025-12-04T15:05:59.209Z" }, + { url = "https://files.pythonhosted.org/packages/70/7d/a4d7b1961e4903bc37fffb7ddcfaa7beb250f67d97cfd1ee1d5cddb1ec90/ruff-0.14.8-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:21d48fa744c9d1cb8d71eb0a740c4dd02751a5de9db9a730a8ef75ca34cf138e", size = 14211332, upload-time = "2025-12-04T15:06:06.027Z" }, + { url = 
"https://files.pythonhosted.org/packages/5d/93/2a5063341fa17054e5c86582136e9895db773e3c2ffb770dde50a09f35f0/ruff-0.14.8-py3-none-win32.whl", hash = "sha256:15f04cb45c051159baebb0f0037f404f1dc2f15a927418f29730f411a79bc4e7", size = 13151890, upload-time = "2025-12-04T15:06:11.668Z" }, + { url = "https://files.pythonhosted.org/packages/02/1c/65c61a0859c0add13a3e1cbb6024b42de587456a43006ca2d4fd3d1618fe/ruff-0.14.8-py3-none-win_amd64.whl", hash = "sha256:9eeb0b24242b5bbff3011409a739929f497f3fb5fe3b5698aba5e77e8c833097", size = 14537826, upload-time = "2025-12-04T15:06:26.409Z" }, + { url = "https://files.pythonhosted.org/packages/6d/63/8b41cea3afd7f58eb64ac9251668ee0073789a3bc9ac6f816c8c6fef986d/ruff-0.14.8-py3-none-win_arm64.whl", hash = "sha256:965a582c93c63fe715fd3e3f8aa37c4b776777203d8e1d8aa3cc0c14424a4b99", size = 13634522, upload-time = "2025-12-04T15:06:43.212Z" }, +] + +[[package]] +name = "s3transfer" +version = "0.14.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "botocore" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/62/74/8d69dcb7a9efe8baa2046891735e5dfe433ad558ae23d9e3c14c633d1d58/s3transfer-0.14.0.tar.gz", hash = "sha256:eff12264e7c8b4985074ccce27a3b38a485bb7f7422cc8046fee9be4983e4125", size = 151547, upload-time = "2025-09-09T19:23:31.089Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/48/f0/ae7ca09223a81a1d890b2557186ea015f6e0502e9b8cb8e1813f1d8cfa4e/s3transfer-0.14.0-py3-none-any.whl", hash = "sha256:ea3b790c7077558ed1f02a3072fb3cb992bbbd253392f4b6e9e8976941c7d456", size = 85712, upload-time = "2025-09-09T19:23:30.041Z" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, 
upload-time = "2023-10-24T04:13:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, +] + +[[package]] +name = "six" +version = "1.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, +] + +[[package]] +name = "sortedcontainers" +version = "2.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594, upload-time = "2021-05-16T22:03:42.897Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575, upload-time = "2021-05-16T22:03:41.177Z" }, +] + +[[package]] +name = "sqlalchemy" +version = "2.0.44" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f0/f2/840d7b9496825333f532d2e3976b8eadbf52034178aac53630d09fe6e1ef/sqlalchemy-2.0.44.tar.gz", hash = "sha256:0ae7454e1ab1d780aee69fd2aae7d6b8670a581d8847f2d1e0f7ddfbf47e5a22", size = 9819830, upload-time = "2025-10-10T14:39:12.935Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/d3/c67077a2249fdb455246e6853166360054c331db4613cda3e31ab1cadbef/sqlalchemy-2.0.44-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ff486e183d151e51b1d694c7aa1695747599bb00b9f5f604092b54b74c64a8e1", size = 2135479, upload-time = "2025-10-10T16:03:37.671Z" }, + { url = "https://files.pythonhosted.org/packages/2b/91/eabd0688330d6fd114f5f12c4f89b0d02929f525e6bf7ff80aa17ca802af/sqlalchemy-2.0.44-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0b1af8392eb27b372ddb783b317dea0f650241cea5bd29199b22235299ca2e45", size = 2123212, upload-time = "2025-10-10T16:03:41.755Z" }, + { url = 
"https://files.pythonhosted.org/packages/b0/bb/43e246cfe0e81c018076a16036d9b548c4cc649de241fa27d8d9ca6f85ab/sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2b61188657e3a2b9ac4e8f04d6cf8e51046e28175f79464c67f2fd35bceb0976", size = 3255353, upload-time = "2025-10-10T15:35:31.221Z" }, + { url = "https://files.pythonhosted.org/packages/b9/96/c6105ed9a880abe346b64d3b6ddef269ddfcab04f7f3d90a0bf3c5a88e82/sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b87e7b91a5d5973dda5f00cd61ef72ad75a1db73a386b62877d4875a8840959c", size = 3260222, upload-time = "2025-10-10T15:43:50.124Z" }, + { url = "https://files.pythonhosted.org/packages/44/16/1857e35a47155b5ad927272fee81ae49d398959cb749edca6eaa399b582f/sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:15f3326f7f0b2bfe406ee562e17f43f36e16167af99c4c0df61db668de20002d", size = 3189614, upload-time = "2025-10-10T15:35:32.578Z" }, + { url = "https://files.pythonhosted.org/packages/88/ee/4afb39a8ee4fc786e2d716c20ab87b5b1fb33d4ac4129a1aaa574ae8a585/sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1e77faf6ff919aa8cd63f1c4e561cac1d9a454a191bb864d5dd5e545935e5a40", size = 3226248, upload-time = "2025-10-10T15:43:51.862Z" }, + { url = "https://files.pythonhosted.org/packages/32/d5/0e66097fc64fa266f29a7963296b40a80d6a997b7ac13806183700676f86/sqlalchemy-2.0.44-cp313-cp313-win32.whl", hash = "sha256:ee51625c2d51f8baadf2829fae817ad0b66b140573939dd69284d2ba3553ae73", size = 2101275, upload-time = "2025-10-10T15:03:26.096Z" }, + { url = "https://files.pythonhosted.org/packages/03/51/665617fe4f8c6450f42a6d8d69243f9420f5677395572c2fe9d21b493b7b/sqlalchemy-2.0.44-cp313-cp313-win_amd64.whl", hash = "sha256:c1c80faaee1a6c3428cecf40d16a2365bcf56c424c92c2b6f0f9ad204b899e9e", size = 2127901, upload-time = "2025-10-10T15:03:27.548Z" }, + { url = 
"https://files.pythonhosted.org/packages/9c/5e/6a29fa884d9fb7ddadf6b69490a9d45fded3b38541713010dad16b77d015/sqlalchemy-2.0.44-py3-none-any.whl", hash = "sha256:19de7ca1246fbef9f9d1bff8f1ab25641569df226364a0e40457dc5457c54b05", size = 1928718, upload-time = "2025-10-10T15:29:45.32Z" }, +] + +[package.optional-dependencies] +asyncio = [ + { name = "greenlet" }, +] + +[[package]] +name = "starlette" +version = "0.50.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ba/b8/73a0e6a6e079a9d9cfa64113d771e421640b6f679a52eeb9b32f72d871a1/starlette-0.50.0.tar.gz", hash = "sha256:a2a17b22203254bcbc2e1f926d2d55f3f9497f769416b3190768befe598fa3ca", size = 2646985, upload-time = "2025-11-01T15:25:27.516Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d9/52/1064f510b141bd54025f9b55105e26d1fa970b9be67ad766380a3c9b74b0/starlette-0.50.0-py3-none-any.whl", hash = "sha256:9e5391843ec9b6e472eed1365a78c8098cfceb7a74bfd4d6b1c0c0095efb3bca", size = 74033, upload-time = "2025-11-01T15:25:25.461Z" }, +] + +[[package]] +name = "structlog" +version = "25.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ef/52/9ba0f43b686e7f3ddfeaa78ac3af750292662284b3661e91ad5494f21dbc/structlog-25.5.0.tar.gz", hash = "sha256:098522a3bebed9153d4570c6d0288abf80a031dfdb2048d59a49e9dc2190fc98", size = 1460830, upload-time = "2025-10-27T08:28:23.028Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a8/45/a132b9074aa18e799b891b91ad72133c98d8042c70f6240e4c5f9dabee2f/structlog-25.5.0-py3-none-any.whl", hash = "sha256:a8453e9b9e636ec59bd9e79bbd4a72f025981b3ba0f5837aebf48f02f37a7f9f", size = 72510, upload-time = "2025-10-27T08:28:21.535Z" }, +] + +[[package]] +name = "tenacity" +version = "9.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/0a/d4/2b0cd0fe285e14b36db076e78c93766ff1d529d70408bd1d2a5a84f1d929/tenacity-9.1.2.tar.gz", hash = "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb", size = 48036, upload-time = "2025-04-02T08:25:09.966Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/30/643397144bfbfec6f6ef821f36f33e57d35946c44a2352d3c9f0ae847619/tenacity-9.1.2-py3-none-any.whl", hash = "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138", size = 28248, upload-time = "2025-04-02T08:25:07.678Z" }, +] + +[[package]] +name = "tqdm" +version = "4.67.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737, upload-time = "2024-11-24T20:12:22.481Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" }, +] + +[[package]] +name = "ty" +version = "0.0.1a33" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/60/34/82f76e63277f0a6585ea48a8d373cfee417a73755daa078250af65421c77/ty-0.0.1a33.tar.gz", hash = "sha256:1db139aa7cbc9879e93146c99bf5f1f5273ca608683f71b3a9a75f9f812b729f", size = 4704365, upload-time = "2025-12-09T22:35:19.424Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/99/4c/4aec80e452268432f60f17da3840ffd6fef46394300808d0af32766dc989/ty-0.0.1a33-py3-none-linux_armv6l.whl", hash = "sha256:2126e6b62a50dc807d45f56629668861bac95944c77b4b6b6dc13f629d5a5a7e", size = 
9674171, upload-time = "2025-12-09T22:34:59.757Z" }, + { url = "https://files.pythonhosted.org/packages/fe/71/ad51a14e00aa0d7e57533f2a68f0865b240bd197c36b87ddab1dd12a1cdd/ty-0.0.1a33-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f171a278a242b06c2f99327dacfa9c7f2d0328140f2976a46ca46e18cde2d6e3", size = 9466420, upload-time = "2025-12-09T22:34:54.884Z" }, + { url = "https://files.pythonhosted.org/packages/79/fa/72bf596a977e5d5343893bb1eb4092fdd0f22ed8c0f11427cc2201225bdb/ty-0.0.1a33-py3-none-macosx_11_0_arm64.whl", hash = "sha256:7b4249f030d24deeae7b25949d33832b4a25b5c893d679b32df1042584b9091f", size = 9009208, upload-time = "2025-12-09T22:35:27.871Z" }, + { url = "https://files.pythonhosted.org/packages/c4/0d/0e20c21e4473a6ea7109c252f6c6bbc41f895b18307e507d6c12a636e6b6/ty-0.0.1a33-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:176f56fc7a6ea176b36d397c42b35efebb441f1fa42524a010579d7019ca8b67", size = 9280560, upload-time = "2025-12-09T22:35:24.258Z" }, + { url = "https://files.pythonhosted.org/packages/6e/dd/627a0a3e2a270b7200c5f92cb01382a3f9ac4f072abf5e7eb3be8f2f4267/ty-0.0.1a33-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8ace2379e9c915c4c6d4dfd3737b290ebe2b008c20031233f4a6e9df0758f427", size = 9457161, upload-time = "2025-12-09T22:35:02.394Z" }, + { url = "https://files.pythonhosted.org/packages/ad/5a/974a48b39c885a17471c3f0847165567a77f05beef3b2573984b9b722378/ty-0.0.1a33-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4341a1daa7857b4de3a68658bad7aaa85577a82448182af2c6b412da02b19c14", size = 9873399, upload-time = "2025-12-09T22:34:49.919Z" }, + { url = "https://files.pythonhosted.org/packages/04/0e/8c09a95b91e3ba0d75a6cea69b06b0a070f085de7dd2aabf86d999175f29/ty-0.0.1a33-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:42c45b50b242af5868198131569d9f4ea37f83212a72494b2553e60f385874cc", size = 10487274, upload-time = "2025-12-09T22:34:52.579Z" }, + { url = 
"https://files.pythonhosted.org/packages/56/37/8d6e898ecf85f67a9bfaaff9c5194d9eaf4d826363a7dab27460eb2d630c/ty-0.0.1a33-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:229b8d927d7815ba4af0b45f1a97766813b62ee97599199900b8ccc1be911284", size = 10244389, upload-time = "2025-12-09T22:35:09.667Z" }, + { url = "https://files.pythonhosted.org/packages/e2/c4/f98a35b12b552d28feb4157334484aa5f472c30944418e23b4a49fad2e40/ty-0.0.1a33-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5018ac5b64865d416b098246da38d2809fdc69e9d86b4e1cf94266e102e7c77f", size = 10224857, upload-time = "2025-12-09T22:35:04.661Z" }, + { url = "https://files.pythonhosted.org/packages/3b/5d/fffb85c5fd7bdffcef212f514b439f229ecaee14e9bf7c199a625819c502/ty-0.0.1a33-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2cd6d6304302ad28e0412d80118a5f63d01af37d3cb39abf33856d348c0819e1", size = 9792377, upload-time = "2025-12-09T22:34:44.634Z" }, + { url = "https://files.pythonhosted.org/packages/6e/ea/0664b0e4a2c286bd880be47121c781befad8077e15cd8a50b9b1f51b8676/ty-0.0.1a33-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:38b75adc050d26a88bbf85d55a4f7633216e455b76e9ee21d6f38640aa040d73", size = 9262018, upload-time = "2025-12-09T22:34:47.274Z" }, + { url = "https://files.pythonhosted.org/packages/89/dd/3d99564d7e649326c98c72b863f0aad771abfc75140413e7b70559ae4850/ty-0.0.1a33-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:efabd881d5b00058c3945b08abbe853b19c93cd0c7148bbcfd27c5d9e6c738f3", size = 9494056, upload-time = "2025-12-09T22:35:21.967Z" }, + { url = "https://files.pythonhosted.org/packages/49/9b/3471118edc5f945e2589c66c27e71b5d9a9efe21c82ced03ea698dbe9a19/ty-0.0.1a33-py3-none-musllinux_1_2_i686.whl", hash = "sha256:ca3b8f84fe661bfb60d1e7665e54dd9c6c84769bff117b00e76ef537473cc59c", size = 9623498, upload-time = "2025-12-09T22:35:07.072Z" }, + { url = 
"https://files.pythonhosted.org/packages/b8/6d/12dcb22b015a4d3e677f394ab7dc80307f2b59f898ea785ea6bfcef8cffa/ty-0.0.1a33-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:f46ae07e353a54512b64b590eae4d82eb22c3a5f5947cea04f950dc1993f64f1", size = 9904193, upload-time = "2025-12-09T22:34:57.106Z" }, + { url = "https://files.pythonhosted.org/packages/c2/e8/628063386fda2f9182089bfe0c8a27ede0c1a120bef74294008468cd2d7d/ty-0.0.1a33-py3-none-win32.whl", hash = "sha256:9020b8be11a184bbe26d07b1a8f0b2e3b75302b08b98b4b1fb6d5d2d03e64aca", size = 9095241, upload-time = "2025-12-09T22:35:14.475Z" }, + { url = "https://files.pythonhosted.org/packages/e0/fe/8ad29c47c9499132849cd5401f67c6bdd2912be8dcb298e774b4f39e1cce/ty-0.0.1a33-py3-none-win_amd64.whl", hash = "sha256:553b5281d424c69389508a60dfd8af8e3014529ca6856dfed1f231020bc58d09", size = 9948007, upload-time = "2025-12-09T22:35:17.043Z" }, + { url = "https://files.pythonhosted.org/packages/2f/fc/1825f1f8c77d4d8fe75543882d9ad5934e568aa807e1a4cb7e999f701750/ty-0.0.1a33-py3-none-win_arm64.whl", hash = "sha256:d9937e9ddc7b383c6b1ab3065982fb2b8d0a2884ae5bd7b542e4208a807e326e", size = 9471473, upload-time = "2025-12-09T22:35:12.105Z" }, +] + +[[package]] +name = "typer" +version = "0.20.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8f/28/7c85c8032b91dbe79725b6f17d2fffc595dff06a35c7a30a37bef73a1ab4/typer-0.20.0.tar.gz", hash = "sha256:1aaf6494031793e4876fb0bacfa6a912b551cf43c1e63c800df8b1a866720c37", size = 106492, upload-time = "2025-10-20T17:03:49.445Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/78/64/7713ffe4b5983314e9d436a90d5bd4f63b6054e2aca783a3cfc44cb95bbf/typer-0.20.0-py3-none-any.whl", hash = "sha256:5b463df6793ec1dca6213a3cf4c0f03bc6e322ac5e16e13ddd622a889489784a", size = 47028, upload-time = 
"2025-10-20T17:03:47.617Z" }, +] + +[[package]] +name = "typing-extensions" +version = "4.15.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload-time = "2025-10-01T02:14:41.687Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" }, +] + +[[package]] +name = "urllib3" +version = "2.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" }, +] + +[[package]] +name = "uuid6" +version = "2025.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ca/b7/4c0f736ca824b3a25b15e8213d1bcfc15f8ac2ae48d1b445b310892dc4da/uuid6-2025.0.1.tar.gz", hash = "sha256:cd0af94fa428675a44e32c5319ec5a3485225ba2179eefcf4c3f205ae30a81bd", size = 13932, upload-time = "2025-07-04T18:30:35.186Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3d/b2/93faaab7962e2aa8d6e174afb6f76be2ca0ce89fde14d3af835acebcaa59/uuid6-2025.0.1-py3-none-any.whl", hash = "sha256:80530ce4d02a93cdf82e7122ca0da3ebbbc269790ec1cb902481fa3e9cc9ff99", size = 6979, upload-time = "2025-07-04T18:30:34.001Z" }, +] + +[[package]] +name = "uvicorn" +version = "0.38.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cb/ce/f06b84e2697fef4688ca63bdb2fdf113ca0a3be33f94488f2cadb690b0cf/uvicorn-0.38.0.tar.gz", hash = "sha256:fd97093bdd120a2609fc0d3afe931d4d4ad688b6e75f0f929fde1bc36fe0e91d", size = 80605, upload-time = "2025-10-18T13:46:44.63Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ee/d9/d88e73ca598f4f6ff671fb5fde8a32925c2e08a637303a1d12883c7305fa/uvicorn-0.38.0-py3-none-any.whl", hash = "sha256:48c0afd214ceb59340075b4a052ea1ee91c16fbc2a9b1469cca0e54566977b02", size = 68109, upload-time = "2025-10-18T13:46:42.958Z" }, +] + +[package.optional-dependencies] +standard = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "httptools" }, + { name = "python-dotenv" }, + { name = "pyyaml" }, + { name = "uvloop", marker = "platform_python_implementation != 
'PyPy' and sys_platform != 'cygwin' and sys_platform != 'win32'" }, + { name = "watchfiles" }, + { name = "websockets" }, +] + +[[package]] +name = "uvloop" +version = "0.22.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/06/f0/18d39dbd1971d6d62c4629cc7fa67f74821b0dc1f5a77af43719de7936a7/uvloop-0.22.1.tar.gz", hash = "sha256:6c84bae345b9147082b17371e3dd5d42775bddce91f885499017f4607fdaf39f", size = 2443250, upload-time = "2025-10-16T22:17:19.342Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/89/8c/182a2a593195bfd39842ea68ebc084e20c850806117213f5a299dfc513d9/uvloop-0.22.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:561577354eb94200d75aca23fbde86ee11be36b00e52a4eaf8f50fb0c86b7705", size = 1358611, upload-time = "2025-10-16T22:16:36.833Z" }, + { url = "https://files.pythonhosted.org/packages/d2/14/e301ee96a6dc95224b6f1162cd3312f6d1217be3907b79173b06785f2fe7/uvloop-0.22.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1cdf5192ab3e674ca26da2eada35b288d2fa49fdd0f357a19f0e7c4e7d5077c8", size = 751811, upload-time = "2025-10-16T22:16:38.275Z" }, + { url = "https://files.pythonhosted.org/packages/b7/02/654426ce265ac19e2980bfd9ea6590ca96a56f10c76e63801a2df01c0486/uvloop-0.22.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e2ea3d6190a2968f4a14a23019d3b16870dd2190cd69c8180f7c632d21de68d", size = 4288562, upload-time = "2025-10-16T22:16:39.375Z" }, + { url = "https://files.pythonhosted.org/packages/15/c0/0be24758891ef825f2065cd5db8741aaddabe3e248ee6acc5e8a80f04005/uvloop-0.22.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0530a5fbad9c9e4ee3f2b33b148c6a64d47bbad8000ea63704fa8260f4cf728e", size = 4366890, upload-time = "2025-10-16T22:16:40.547Z" }, + { url = 
"https://files.pythonhosted.org/packages/d2/53/8369e5219a5855869bcee5f4d317f6da0e2c669aecf0ef7d371e3d084449/uvloop-0.22.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bc5ef13bbc10b5335792360623cc378d52d7e62c2de64660616478c32cd0598e", size = 4119472, upload-time = "2025-10-16T22:16:41.694Z" }, + { url = "https://files.pythonhosted.org/packages/f8/ba/d69adbe699b768f6b29a5eec7b47dd610bd17a69de51b251126a801369ea/uvloop-0.22.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1f38ec5e3f18c8a10ded09742f7fb8de0108796eb673f30ce7762ce1b8550cad", size = 4239051, upload-time = "2025-10-16T22:16:43.224Z" }, + { url = "https://files.pythonhosted.org/packages/90/cd/b62bdeaa429758aee8de8b00ac0dd26593a9de93d302bff3d21439e9791d/uvloop-0.22.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3879b88423ec7e97cd4eba2a443aa26ed4e59b45e6b76aabf13fe2f27023a142", size = 1362067, upload-time = "2025-10-16T22:16:44.503Z" }, + { url = "https://files.pythonhosted.org/packages/0d/f8/a132124dfda0777e489ca86732e85e69afcd1ff7686647000050ba670689/uvloop-0.22.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:4baa86acedf1d62115c1dc6ad1e17134476688f08c6efd8a2ab076e815665c74", size = 752423, upload-time = "2025-10-16T22:16:45.968Z" }, + { url = "https://files.pythonhosted.org/packages/a3/94/94af78c156f88da4b3a733773ad5ba0b164393e357cc4bd0ab2e2677a7d6/uvloop-0.22.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:297c27d8003520596236bdb2335e6b3f649480bd09e00d1e3a99144b691d2a35", size = 4272437, upload-time = "2025-10-16T22:16:47.451Z" }, + { url = "https://files.pythonhosted.org/packages/b5/35/60249e9fd07b32c665192cec7af29e06c7cd96fa1d08b84f012a56a0b38e/uvloop-0.22.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c1955d5a1dd43198244d47664a5858082a3239766a839b2102a269aaff7a4e25", size = 4292101, upload-time = "2025-10-16T22:16:49.318Z" }, + { url = 
"https://files.pythonhosted.org/packages/02/62/67d382dfcb25d0a98ce73c11ed1a6fba5037a1a1d533dcbb7cab033a2636/uvloop-0.22.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b31dc2fccbd42adc73bc4e7cdbae4fc5086cf378979e53ca5d0301838c5682c6", size = 4114158, upload-time = "2025-10-16T22:16:50.517Z" }, + { url = "https://files.pythonhosted.org/packages/f0/7a/f1171b4a882a5d13c8b7576f348acfe6074d72eaf52cccef752f748d4a9f/uvloop-0.22.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:93f617675b2d03af4e72a5333ef89450dfaa5321303ede6e67ba9c9d26878079", size = 4177360, upload-time = "2025-10-16T22:16:52.646Z" }, + { url = "https://files.pythonhosted.org/packages/79/7b/b01414f31546caf0919da80ad57cbfe24c56b151d12af68cee1b04922ca8/uvloop-0.22.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:37554f70528f60cad66945b885eb01f1bb514f132d92b6eeed1c90fd54ed6289", size = 1454790, upload-time = "2025-10-16T22:16:54.355Z" }, + { url = "https://files.pythonhosted.org/packages/d4/31/0bb232318dd838cad3fa8fb0c68c8b40e1145b32025581975e18b11fab40/uvloop-0.22.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:b76324e2dc033a0b2f435f33eb88ff9913c156ef78e153fb210e03c13da746b3", size = 796783, upload-time = "2025-10-16T22:16:55.906Z" }, + { url = "https://files.pythonhosted.org/packages/42/38/c9b09f3271a7a723a5de69f8e237ab8e7803183131bc57c890db0b6bb872/uvloop-0.22.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:badb4d8e58ee08dad957002027830d5c3b06aea446a6a3744483c2b3b745345c", size = 4647548, upload-time = "2025-10-16T22:16:57.008Z" }, + { url = "https://files.pythonhosted.org/packages/c1/37/945b4ca0ac27e3dc4952642d4c900edd030b3da6c9634875af6e13ae80e5/uvloop-0.22.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b91328c72635f6f9e0282e4a57da7470c7350ab1c9f48546c0f2866205349d21", size = 4467065, upload-time = "2025-10-16T22:16:58.206Z" }, + { url = 
"https://files.pythonhosted.org/packages/97/cc/48d232f33d60e2e2e0b42f4e73455b146b76ebe216487e862700457fbf3c/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:daf620c2995d193449393d6c62131b3fbd40a63bf7b307a1527856ace637fe88", size = 4328384, upload-time = "2025-10-16T22:16:59.36Z" }, + { url = "https://files.pythonhosted.org/packages/e4/16/c1fd27e9549f3c4baf1dc9c20c456cd2f822dbf8de9f463824b0c0357e06/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6cde23eeda1a25c75b2e07d39970f3374105d5eafbaab2a4482be82f272d5a5e", size = 4296730, upload-time = "2025-10-16T22:17:00.744Z" }, +] + +[[package]] +name = "virtualenv" +version = "20.35.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "distlib" }, + { name = "filelock" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/20/28/e6f1a6f655d620846bd9df527390ecc26b3805a0c5989048c210e22c5ca9/virtualenv-20.35.4.tar.gz", hash = "sha256:643d3914d73d3eeb0c552cbb12d7e82adf0e504dbf86a3182f8771a153a1971c", size = 6028799, upload-time = "2025-10-29T06:57:40.511Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/79/0c/c05523fa3181fdf0c9c52a6ba91a23fbf3246cc095f26f6516f9c60e6771/virtualenv-20.35.4-py3-none-any.whl", hash = "sha256:c21c9cede36c9753eeade68ba7d523529f228a403463376cf821eaae2b650f1b", size = 6005095, upload-time = "2025-10-29T06:57:37.598Z" }, +] + +[[package]] +name = "watchfiles" +version = "1.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c2/c9/8869df9b2a2d6c59d79220a4db37679e74f807c559ffe5265e08b227a210/watchfiles-1.1.1.tar.gz", hash = "sha256:a173cb5c16c4f40ab19cecf48a534c409f7ea983ab8fed0741304a1c0a31b3f2", size = 94440, upload-time = "2025-10-14T15:06:21.08Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/bb/f4/f750b29225fe77139f7ae5de89d4949f5a99f934c65a1f1c0b248f26f747/watchfiles-1.1.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:130e4876309e8686a5e37dba7d5e9bc77e6ed908266996ca26572437a5271e18", size = 404321, upload-time = "2025-10-14T15:05:02.063Z" }, + { url = "https://files.pythonhosted.org/packages/2b/f9/f07a295cde762644aa4c4bb0f88921d2d141af45e735b965fb2e87858328/watchfiles-1.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5f3bde70f157f84ece3765b42b4a52c6ac1a50334903c6eaf765362f6ccca88a", size = 391783, upload-time = "2025-10-14T15:05:03.052Z" }, + { url = "https://files.pythonhosted.org/packages/bc/11/fc2502457e0bea39a5c958d86d2cb69e407a4d00b85735ca724bfa6e0d1a/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14e0b1fe858430fc0251737ef3824c54027bedb8c37c38114488b8e131cf8219", size = 449279, upload-time = "2025-10-14T15:05:04.004Z" }, + { url = "https://files.pythonhosted.org/packages/e3/1f/d66bc15ea0b728df3ed96a539c777acfcad0eb78555ad9efcaa1274688f0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f27db948078f3823a6bb3b465180db8ebecf26dd5dae6f6180bd87383b6b4428", size = 459405, upload-time = "2025-10-14T15:05:04.942Z" }, + { url = "https://files.pythonhosted.org/packages/be/90/9f4a65c0aec3ccf032703e6db02d89a157462fbb2cf20dd415128251cac0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:059098c3a429f62fc98e8ec62b982230ef2c8df68c79e826e37b895bc359a9c0", size = 488976, upload-time = "2025-10-14T15:05:05.905Z" }, + { url = "https://files.pythonhosted.org/packages/37/57/ee347af605d867f712be7029bb94c8c071732a4b44792e3176fa3c612d39/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfb5862016acc9b869bb57284e6cb35fdf8e22fe59f7548858e2f971d045f150", size = 595506, upload-time = "2025-10-14T15:05:06.906Z" }, + { url = 
"https://files.pythonhosted.org/packages/a8/78/cc5ab0b86c122047f75e8fc471c67a04dee395daf847d3e59381996c8707/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:319b27255aacd9923b8a276bb14d21a5f7ff82564c744235fc5eae58d95422ae", size = 474936, upload-time = "2025-10-14T15:05:07.906Z" }, + { url = "https://files.pythonhosted.org/packages/62/da/def65b170a3815af7bd40a3e7010bf6ab53089ef1b75d05dd5385b87cf08/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c755367e51db90e75b19454b680903631d41f9e3607fbd941d296a020c2d752d", size = 456147, upload-time = "2025-10-14T15:05:09.138Z" }, + { url = "https://files.pythonhosted.org/packages/57/99/da6573ba71166e82d288d4df0839128004c67d2778d3b566c138695f5c0b/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c22c776292a23bfc7237a98f791b9ad3144b02116ff10d820829ce62dff46d0b", size = 630007, upload-time = "2025-10-14T15:05:10.117Z" }, + { url = "https://files.pythonhosted.org/packages/a8/51/7439c4dd39511368849eb1e53279cd3454b4a4dbace80bab88feeb83c6b5/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:3a476189be23c3686bc2f4321dd501cb329c0a0469e77b7b534ee10129ae6374", size = 622280, upload-time = "2025-10-14T15:05:11.146Z" }, + { url = "https://files.pythonhosted.org/packages/95/9c/8ed97d4bba5db6fdcdb2b298d3898f2dd5c20f6b73aee04eabe56c59677e/watchfiles-1.1.1-cp313-cp313-win32.whl", hash = "sha256:bf0a91bfb5574a2f7fc223cf95eeea79abfefa404bf1ea5e339c0c1560ae99a0", size = 272056, upload-time = "2025-10-14T15:05:12.156Z" }, + { url = "https://files.pythonhosted.org/packages/1f/f3/c14e28429f744a260d8ceae18bf58c1d5fa56b50d006a7a9f80e1882cb0d/watchfiles-1.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:52e06553899e11e8074503c8e716d574adeeb7e68913115c4b3653c53f9bae42", size = 288162, upload-time = "2025-10-14T15:05:13.208Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/61/fe0e56c40d5cd29523e398d31153218718c5786b5e636d9ae8ae79453d27/watchfiles-1.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:ac3cc5759570cd02662b15fbcd9d917f7ecd47efe0d6b40474eafd246f91ea18", size = 277909, upload-time = "2025-10-14T15:05:14.49Z" }, + { url = "https://files.pythonhosted.org/packages/79/42/e0a7d749626f1e28c7108a99fb9bf524b501bbbeb9b261ceecde644d5a07/watchfiles-1.1.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:563b116874a9a7ce6f96f87cd0b94f7faf92d08d0021e837796f0a14318ef8da", size = 403389, upload-time = "2025-10-14T15:05:15.777Z" }, + { url = "https://files.pythonhosted.org/packages/15/49/08732f90ce0fbbc13913f9f215c689cfc9ced345fb1bcd8829a50007cc8d/watchfiles-1.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3ad9fe1dae4ab4212d8c91e80b832425e24f421703b5a42ef2e4a1e215aff051", size = 389964, upload-time = "2025-10-14T15:05:16.85Z" }, + { url = "https://files.pythonhosted.org/packages/27/0d/7c315d4bd5f2538910491a0393c56bf70d333d51bc5b34bee8e68e8cea19/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce70f96a46b894b36eba678f153f052967a0d06d5b5a19b336ab0dbbd029f73e", size = 448114, upload-time = "2025-10-14T15:05:17.876Z" }, + { url = "https://files.pythonhosted.org/packages/c3/24/9e096de47a4d11bc4df41e9d1e61776393eac4cb6eb11b3e23315b78b2cc/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:cb467c999c2eff23a6417e58d75e5828716f42ed8289fe6b77a7e5a91036ca70", size = 460264, upload-time = "2025-10-14T15:05:18.962Z" }, + { url = "https://files.pythonhosted.org/packages/cc/0f/e8dea6375f1d3ba5fcb0b3583e2b493e77379834c74fd5a22d66d85d6540/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:836398932192dae4146c8f6f737d74baeac8b70ce14831a239bdb1ca882fc261", size = 487877, upload-time = "2025-10-14T15:05:20.094Z" }, + { url = 
"https://files.pythonhosted.org/packages/ac/5b/df24cfc6424a12deb41503b64d42fbea6b8cb357ec62ca84a5a3476f654a/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:743185e7372b7bc7c389e1badcc606931a827112fbbd37f14c537320fca08620", size = 595176, upload-time = "2025-10-14T15:05:21.134Z" }, + { url = "https://files.pythonhosted.org/packages/8f/b5/853b6757f7347de4e9b37e8cc3289283fb983cba1ab4d2d7144694871d9c/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:afaeff7696e0ad9f02cbb8f56365ff4686ab205fcf9c4c5b6fdfaaa16549dd04", size = 473577, upload-time = "2025-10-14T15:05:22.306Z" }, + { url = "https://files.pythonhosted.org/packages/e1/f7/0a4467be0a56e80447c8529c9fce5b38eab4f513cb3d9bf82e7392a5696b/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f7eb7da0eb23aa2ba036d4f616d46906013a68caf61b7fdbe42fc8b25132e77", size = 455425, upload-time = "2025-10-14T15:05:23.348Z" }, + { url = "https://files.pythonhosted.org/packages/8e/e0/82583485ea00137ddf69bc84a2db88bd92ab4a6e3c405e5fb878ead8d0e7/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:831a62658609f0e5c64178211c942ace999517f5770fe9436be4c2faeba0c0ef", size = 628826, upload-time = "2025-10-14T15:05:24.398Z" }, + { url = "https://files.pythonhosted.org/packages/28/9a/a785356fccf9fae84c0cc90570f11702ae9571036fb25932f1242c82191c/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf", size = 622208, upload-time = "2025-10-14T15:05:25.45Z" }, + { url = "https://files.pythonhosted.org/packages/c3/f4/0872229324ef69b2c3edec35e84bd57a1289e7d3fe74588048ed8947a323/watchfiles-1.1.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:d1715143123baeeaeadec0528bb7441103979a1d5f6fd0e1f915383fea7ea6d5", size = 404315, upload-time = "2025-10-14T15:05:26.501Z" }, + { url = 
"https://files.pythonhosted.org/packages/7b/22/16d5331eaed1cb107b873f6ae1b69e9ced582fcf0c59a50cd84f403b1c32/watchfiles-1.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:39574d6370c4579d7f5d0ad940ce5b20db0e4117444e39b6d8f99db5676c52fd", size = 390869, upload-time = "2025-10-14T15:05:27.649Z" }, + { url = "https://files.pythonhosted.org/packages/b2/7e/5643bfff5acb6539b18483128fdc0ef2cccc94a5b8fbda130c823e8ed636/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7365b92c2e69ee952902e8f70f3ba6360d0d596d9299d55d7d386df84b6941fb", size = 449919, upload-time = "2025-10-14T15:05:28.701Z" }, + { url = "https://files.pythonhosted.org/packages/51/2e/c410993ba5025a9f9357c376f48976ef0e1b1aefb73b97a5ae01a5972755/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bfff9740c69c0e4ed32416f013f3c45e2ae42ccedd1167ef2d805c000b6c71a5", size = 460845, upload-time = "2025-10-14T15:05:30.064Z" }, + { url = "https://files.pythonhosted.org/packages/8e/a4/2df3b404469122e8680f0fcd06079317e48db58a2da2950fb45020947734/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b27cf2eb1dda37b2089e3907d8ea92922b673c0c427886d4edc6b94d8dfe5db3", size = 489027, upload-time = "2025-10-14T15:05:31.064Z" }, + { url = "https://files.pythonhosted.org/packages/ea/84/4587ba5b1f267167ee715b7f66e6382cca6938e0a4b870adad93e44747e6/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:526e86aced14a65a5b0ec50827c745597c782ff46b571dbfe46192ab9e0b3c33", size = 595615, upload-time = "2025-10-14T15:05:32.074Z" }, + { url = "https://files.pythonhosted.org/packages/6a/0f/c6988c91d06e93cd0bb3d4a808bcf32375ca1904609835c3031799e3ecae/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04e78dd0b6352db95507fd8cb46f39d185cf8c74e4cf1e4fbad1d3df96faf510", size = 474836, upload-time = "2025-10-14T15:05:33.209Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/36/ded8aebea91919485b7bbabbd14f5f359326cb5ec218cd67074d1e426d74/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c85794a4cfa094714fb9c08d4a218375b2b95b8ed1666e8677c349906246c05", size = 455099, upload-time = "2025-10-14T15:05:34.189Z" }, + { url = "https://files.pythonhosted.org/packages/98/e0/8c9bdba88af756a2fce230dd365fab2baf927ba42cd47521ee7498fd5211/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:74d5012b7630714b66be7b7b7a78855ef7ad58e8650c73afc4c076a1f480a8d6", size = 630626, upload-time = "2025-10-14T15:05:35.216Z" }, + { url = "https://files.pythonhosted.org/packages/2a/84/a95db05354bf2d19e438520d92a8ca475e578c647f78f53197f5a2f17aaf/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:8fbe85cb3201c7d380d3d0b90e63d520f15d6afe217165d7f98c9c649654db81", size = 622519, upload-time = "2025-10-14T15:05:36.259Z" }, + { url = "https://files.pythonhosted.org/packages/1d/ce/d8acdc8de545de995c339be67711e474c77d643555a9bb74a9334252bd55/watchfiles-1.1.1-cp314-cp314-win32.whl", hash = "sha256:3fa0b59c92278b5a7800d3ee7733da9d096d4aabcfabb9a928918bd276ef9b9b", size = 272078, upload-time = "2025-10-14T15:05:37.63Z" }, + { url = "https://files.pythonhosted.org/packages/c4/c9/a74487f72d0451524be827e8edec251da0cc1fcf111646a511ae752e1a3d/watchfiles-1.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:c2047d0b6cea13b3316bdbafbfa0c4228ae593d995030fda39089d36e64fc03a", size = 287664, upload-time = "2025-10-14T15:05:38.95Z" }, + { url = "https://files.pythonhosted.org/packages/df/b8/8ac000702cdd496cdce998c6f4ee0ca1f15977bba51bdf07d872ebdfc34c/watchfiles-1.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:842178b126593addc05acf6fce960d28bc5fae7afbaa2c6c1b3a7b9460e5be02", size = 277154, upload-time = "2025-10-14T15:05:39.954Z" }, + { url = 
"https://files.pythonhosted.org/packages/47/a8/e3af2184707c29f0f14b1963c0aace6529f9d1b8582d5b99f31bbf42f59e/watchfiles-1.1.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:88863fbbc1a7312972f1c511f202eb30866370ebb8493aef2812b9ff28156a21", size = 403820, upload-time = "2025-10-14T15:05:40.932Z" }, + { url = "https://files.pythonhosted.org/packages/c0/ec/e47e307c2f4bd75f9f9e8afbe3876679b18e1bcec449beca132a1c5ffb2d/watchfiles-1.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:55c7475190662e202c08c6c0f4d9e345a29367438cf8e8037f3155e10a88d5a5", size = 390510, upload-time = "2025-10-14T15:05:41.945Z" }, + { url = "https://files.pythonhosted.org/packages/d5/a0/ad235642118090f66e7b2f18fd5c42082418404a79205cdfca50b6309c13/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f53fa183d53a1d7a8852277c92b967ae99c2d4dcee2bfacff8868e6e30b15f7", size = 448408, upload-time = "2025-10-14T15:05:43.385Z" }, + { url = "https://files.pythonhosted.org/packages/df/85/97fa10fd5ff3332ae17e7e40e20784e419e28521549780869f1413742e9d/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6aae418a8b323732fa89721d86f39ec8f092fc2af67f4217a2b07fd3e93c6101", size = 458968, upload-time = "2025-10-14T15:05:44.404Z" }, + { url = "https://files.pythonhosted.org/packages/47/c2/9059c2e8966ea5ce678166617a7f75ecba6164375f3b288e50a40dc6d489/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f096076119da54a6080e8920cbdaac3dbee667eb91dcc5e5b78840b87415bd44", size = 488096, upload-time = "2025-10-14T15:05:45.398Z" }, + { url = "https://files.pythonhosted.org/packages/94/44/d90a9ec8ac309bc26db808a13e7bfc0e4e78b6fc051078a554e132e80160/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:00485f441d183717038ed2e887a7c868154f216877653121068107b227a2f64c", size = 596040, upload-time = "2025-10-14T15:05:46.502Z" }, + { url = 
"https://files.pythonhosted.org/packages/95/68/4e3479b20ca305cfc561db3ed207a8a1c745ee32bf24f2026a129d0ddb6e/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a55f3e9e493158d7bfdb60a1165035f1cf7d320914e7b7ea83fe22c6023b58fc", size = 473847, upload-time = "2025-10-14T15:05:47.484Z" }, + { url = "https://files.pythonhosted.org/packages/4f/55/2af26693fd15165c4ff7857e38330e1b61ab8c37d15dc79118cdba115b7a/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c91ed27800188c2ae96d16e3149f199d62f86c7af5f5f4d2c61a3ed8cd3666c", size = 455072, upload-time = "2025-10-14T15:05:48.928Z" }, + { url = "https://files.pythonhosted.org/packages/66/1d/d0d200b10c9311ec25d2273f8aad8c3ef7cc7ea11808022501811208a750/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:311ff15a0bae3714ffb603e6ba6dbfba4065ab60865d15a6ec544133bdb21099", size = 629104, upload-time = "2025-10-14T15:05:49.908Z" }, + { url = "https://files.pythonhosted.org/packages/e3/bd/fa9bb053192491b3867ba07d2343d9f2252e00811567d30ae8d0f78136fe/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:a916a2932da8f8ab582f242c065f5c81bed3462849ca79ee357dd9551b0e9b01", size = 622112, upload-time = "2025-10-14T15:05:50.941Z" }, +] + +[[package]] +name = "websockets" +version = "15.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = 
"2025-03-05T20:02:36.695Z" }, + { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" }, + { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" }, + { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" }, + { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" }, + { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" }, + { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" }, + { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" }, + { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" }, + { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" }, + { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" }, +] + +[[package]] +name = "wrapt" +version = "1.17.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/95/8f/aeb76c5b46e273670962298c23e7ddde79916cb74db802131d49a85e4b7d/wrapt-1.17.3.tar.gz", hash = "sha256:f66eb08feaa410fe4eebd17f2a2c8e2e46d3476e9f8c783daa8e09e0faa666d0", 
size = 55547, upload-time = "2025-08-12T05:53:21.714Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fc/f6/759ece88472157acb55fc195e5b116e06730f1b651b5b314c66291729193/wrapt-1.17.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a47681378a0439215912ef542c45a783484d4dd82bac412b71e59cf9c0e1cea0", size = 54003, upload-time = "2025-08-12T05:51:48.627Z" }, + { url = "https://files.pythonhosted.org/packages/4f/a9/49940b9dc6d47027dc850c116d79b4155f15c08547d04db0f07121499347/wrapt-1.17.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:54a30837587c6ee3cd1a4d1c2ec5d24e77984d44e2f34547e2323ddb4e22eb77", size = 39025, upload-time = "2025-08-12T05:51:37.156Z" }, + { url = "https://files.pythonhosted.org/packages/45/35/6a08de0f2c96dcdd7fe464d7420ddb9a7655a6561150e5fc4da9356aeaab/wrapt-1.17.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:16ecf15d6af39246fe33e507105d67e4b81d8f8d2c6598ff7e3ca1b8a37213f7", size = 39108, upload-time = "2025-08-12T05:51:58.425Z" }, + { url = "https://files.pythonhosted.org/packages/0c/37/6faf15cfa41bf1f3dba80cd3f5ccc6622dfccb660ab26ed79f0178c7497f/wrapt-1.17.3-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6fd1ad24dc235e4ab88cda009e19bf347aabb975e44fd5c2fb22a3f6e4141277", size = 88072, upload-time = "2025-08-12T05:52:37.53Z" }, + { url = "https://files.pythonhosted.org/packages/78/f2/efe19ada4a38e4e15b6dff39c3e3f3f73f5decf901f66e6f72fe79623a06/wrapt-1.17.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ed61b7c2d49cee3c027372df5809a59d60cf1b6c2f81ee980a091f3afed6a2d", size = 88214, upload-time = "2025-08-12T05:52:15.886Z" }, + { url = "https://files.pythonhosted.org/packages/40/90/ca86701e9de1622b16e09689fc24b76f69b06bb0150990f6f4e8b0eeb576/wrapt-1.17.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:423ed5420ad5f5529db9ce89eac09c8a2f97da18eb1c870237e84c5a5c2d60aa", size = 87105, upload-time = 
"2025-08-12T05:52:17.914Z" }, + { url = "https://files.pythonhosted.org/packages/fd/e0/d10bd257c9a3e15cbf5523025252cc14d77468e8ed644aafb2d6f54cb95d/wrapt-1.17.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e01375f275f010fcbf7f643b4279896d04e571889b8a5b3f848423d91bf07050", size = 87766, upload-time = "2025-08-12T05:52:39.243Z" }, + { url = "https://files.pythonhosted.org/packages/e8/cf/7d848740203c7b4b27eb55dbfede11aca974a51c3d894f6cc4b865f42f58/wrapt-1.17.3-cp313-cp313-win32.whl", hash = "sha256:53e5e39ff71b3fc484df8a522c933ea2b7cdd0d5d15ae82e5b23fde87d44cbd8", size = 36711, upload-time = "2025-08-12T05:53:10.074Z" }, + { url = "https://files.pythonhosted.org/packages/57/54/35a84d0a4d23ea675994104e667ceff49227ce473ba6a59ba2c84f250b74/wrapt-1.17.3-cp313-cp313-win_amd64.whl", hash = "sha256:1f0b2f40cf341ee8cc1a97d51ff50dddb9fcc73241b9143ec74b30fc4f44f6cb", size = 38885, upload-time = "2025-08-12T05:53:08.695Z" }, + { url = "https://files.pythonhosted.org/packages/01/77/66e54407c59d7b02a3c4e0af3783168fff8e5d61def52cda8728439d86bc/wrapt-1.17.3-cp313-cp313-win_arm64.whl", hash = "sha256:7425ac3c54430f5fc5e7b6f41d41e704db073309acfc09305816bc6a0b26bb16", size = 36896, upload-time = "2025-08-12T05:52:55.34Z" }, + { url = "https://files.pythonhosted.org/packages/02/a2/cd864b2a14f20d14f4c496fab97802001560f9f41554eef6df201cd7f76c/wrapt-1.17.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cf30f6e3c077c8e6a9a7809c94551203c8843e74ba0c960f4a98cd80d4665d39", size = 54132, upload-time = "2025-08-12T05:51:49.864Z" }, + { url = "https://files.pythonhosted.org/packages/d5/46/d011725b0c89e853dc44cceb738a307cde5d240d023d6d40a82d1b4e1182/wrapt-1.17.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e228514a06843cae89621384cfe3a80418f3c04aadf8a3b14e46a7be704e4235", size = 39091, upload-time = "2025-08-12T05:51:38.935Z" }, + { url = 
"https://files.pythonhosted.org/packages/2e/9e/3ad852d77c35aae7ddebdbc3b6d35ec8013af7d7dddad0ad911f3d891dae/wrapt-1.17.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:5ea5eb3c0c071862997d6f3e02af1d055f381b1d25b286b9d6644b79db77657c", size = 39172, upload-time = "2025-08-12T05:51:59.365Z" }, + { url = "https://files.pythonhosted.org/packages/c3/f7/c983d2762bcce2326c317c26a6a1e7016f7eb039c27cdf5c4e30f4160f31/wrapt-1.17.3-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:281262213373b6d5e4bb4353bc36d1ba4084e6d6b5d242863721ef2bf2c2930b", size = 87163, upload-time = "2025-08-12T05:52:40.965Z" }, + { url = "https://files.pythonhosted.org/packages/e4/0f/f673f75d489c7f22d17fe0193e84b41540d962f75fce579cf6873167c29b/wrapt-1.17.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dc4a8d2b25efb6681ecacad42fca8859f88092d8732b170de6a5dddd80a1c8fa", size = 87963, upload-time = "2025-08-12T05:52:20.326Z" }, + { url = "https://files.pythonhosted.org/packages/df/61/515ad6caca68995da2fac7a6af97faab8f78ebe3bf4f761e1b77efbc47b5/wrapt-1.17.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:373342dd05b1d07d752cecbec0c41817231f29f3a89aa8b8843f7b95992ed0c7", size = 86945, upload-time = "2025-08-12T05:52:21.581Z" }, + { url = "https://files.pythonhosted.org/packages/d3/bd/4e70162ce398462a467bc09e768bee112f1412e563620adc353de9055d33/wrapt-1.17.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d40770d7c0fd5cbed9d84b2c3f2e156431a12c9a37dc6284060fb4bec0b7ffd4", size = 86857, upload-time = "2025-08-12T05:52:43.043Z" }, + { url = "https://files.pythonhosted.org/packages/2b/b8/da8560695e9284810b8d3df8a19396a6e40e7518059584a1a394a2b35e0a/wrapt-1.17.3-cp314-cp314-win32.whl", hash = "sha256:fbd3c8319de8e1dc79d346929cd71d523622da527cca14e0c1d257e31c2b8b10", size = 37178, upload-time = "2025-08-12T05:53:12.605Z" }, + { url = 
"https://files.pythonhosted.org/packages/db/c8/b71eeb192c440d67a5a0449aaee2310a1a1e8eca41676046f99ed2487e9f/wrapt-1.17.3-cp314-cp314-win_amd64.whl", hash = "sha256:e1a4120ae5705f673727d3253de3ed0e016f7cd78dc463db1b31e2463e1f3cf6", size = 39310, upload-time = "2025-08-12T05:53:11.106Z" }, + { url = "https://files.pythonhosted.org/packages/45/20/2cda20fd4865fa40f86f6c46ed37a2a8356a7a2fde0773269311f2af56c7/wrapt-1.17.3-cp314-cp314-win_arm64.whl", hash = "sha256:507553480670cab08a800b9463bdb881b2edeed77dc677b0a5915e6106e91a58", size = 37266, upload-time = "2025-08-12T05:52:56.531Z" }, + { url = "https://files.pythonhosted.org/packages/77/ed/dd5cf21aec36c80443c6f900449260b80e2a65cf963668eaef3b9accce36/wrapt-1.17.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ed7c635ae45cfbc1a7371f708727bf74690daedc49b4dba310590ca0bd28aa8a", size = 56544, upload-time = "2025-08-12T05:51:51.109Z" }, + { url = "https://files.pythonhosted.org/packages/8d/96/450c651cc753877ad100c7949ab4d2e2ecc4d97157e00fa8f45df682456a/wrapt-1.17.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:249f88ed15503f6492a71f01442abddd73856a0032ae860de6d75ca62eed8067", size = 40283, upload-time = "2025-08-12T05:51:39.912Z" }, + { url = "https://files.pythonhosted.org/packages/d1/86/2fcad95994d9b572db57632acb6f900695a648c3e063f2cd344b3f5c5a37/wrapt-1.17.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5a03a38adec8066d5a37bea22f2ba6bbf39fcdefbe2d91419ab864c3fb515454", size = 40366, upload-time = "2025-08-12T05:52:00.693Z" }, + { url = "https://files.pythonhosted.org/packages/64/0e/f4472f2fdde2d4617975144311f8800ef73677a159be7fe61fa50997d6c0/wrapt-1.17.3-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5d4478d72eb61c36e5b446e375bbc49ed002430d17cdec3cecb36993398e1a9e", size = 108571, upload-time = "2025-08-12T05:52:44.521Z" }, + { url = 
"https://files.pythonhosted.org/packages/cc/01/9b85a99996b0a97c8a17484684f206cbb6ba73c1ce6890ac668bcf3838fb/wrapt-1.17.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:223db574bb38637e8230eb14b185565023ab624474df94d2af18f1cdb625216f", size = 113094, upload-time = "2025-08-12T05:52:22.618Z" }, + { url = "https://files.pythonhosted.org/packages/25/02/78926c1efddcc7b3aa0bc3d6b33a822f7d898059f7cd9ace8c8318e559ef/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e405adefb53a435f01efa7ccdec012c016b5a1d3f35459990afc39b6be4d5056", size = 110659, upload-time = "2025-08-12T05:52:24.057Z" }, + { url = "https://files.pythonhosted.org/packages/dc/ee/c414501ad518ac3e6fe184753632fe5e5ecacdcf0effc23f31c1e4f7bfcf/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:88547535b787a6c9ce4086917b6e1d291aa8ed914fdd3a838b3539dc95c12804", size = 106946, upload-time = "2025-08-12T05:52:45.976Z" }, + { url = "https://files.pythonhosted.org/packages/be/44/a1bd64b723d13bb151d6cc91b986146a1952385e0392a78567e12149c7b4/wrapt-1.17.3-cp314-cp314t-win32.whl", hash = "sha256:41b1d2bc74c2cac6f9074df52b2efbef2b30bdfe5f40cb78f8ca22963bc62977", size = 38717, upload-time = "2025-08-12T05:53:15.214Z" }, + { url = "https://files.pythonhosted.org/packages/79/d9/7cfd5a312760ac4dd8bf0184a6ee9e43c33e47f3dadc303032ce012b8fa3/wrapt-1.17.3-cp314-cp314t-win_amd64.whl", hash = "sha256:73d496de46cd2cdbdbcce4ae4bcdb4afb6a11234a1df9c085249d55166b95116", size = 41334, upload-time = "2025-08-12T05:53:14.178Z" }, + { url = "https://files.pythonhosted.org/packages/46/78/10ad9781128ed2f99dbc474f43283b13fea8ba58723e98844367531c18e9/wrapt-1.17.3-cp314-cp314t-win_arm64.whl", hash = "sha256:f38e60678850c42461d4202739f9bf1e3a737c7ad283638251e79cc49effb6b6", size = 38471, upload-time = "2025-08-12T05:52:57.784Z" }, + { url = 
"https://files.pythonhosted.org/packages/1f/f6/a933bd70f98e9cf3e08167fc5cd7aaaca49147e48411c0bd5ae701bb2194/wrapt-1.17.3-py3-none-any.whl", hash = "sha256:7171ae35d2c33d326ac19dd8facb1e82e5fd04ef8c6c0e394d7af55a55051c22", size = 23591, upload-time = "2025-08-12T05:53:20.674Z" }, +] + +[[package]] +name = "yarl" +version = "1.22.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, + { name = "multidict" }, + { name = "propcache" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/57/63/0c6ebca57330cd313f6102b16dd57ffaf3ec4c83403dcb45dbd15c6f3ea1/yarl-1.22.0.tar.gz", hash = "sha256:bebf8557577d4401ba8bd9ff33906f1376c877aa78d1fe216ad01b4d6745af71", size = 187169, upload-time = "2025-10-06T14:12:55.963Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ea/f3/d67de7260456ee105dc1d162d43a019ecad6b91e2f51809d6cddaa56690e/yarl-1.22.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8dee9c25c74997f6a750cd317b8ca63545169c098faee42c84aa5e506c819b53", size = 139980, upload-time = "2025-10-06T14:10:14.601Z" }, + { url = "https://files.pythonhosted.org/packages/01/88/04d98af0b47e0ef42597b9b28863b9060bb515524da0a65d5f4db160b2d5/yarl-1.22.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:01e73b85a5434f89fc4fe27dcda2aff08ddf35e4d47bbbea3bdcd25321af538a", size = 93424, upload-time = "2025-10-06T14:10:16.115Z" }, + { url = "https://files.pythonhosted.org/packages/18/91/3274b215fd8442a03975ce6bee5fe6aa57a8326b29b9d3d56234a1dca244/yarl-1.22.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:22965c2af250d20c873cdbee8ff958fb809940aeb2e74ba5f20aaf6b7ac8c70c", size = 93821, upload-time = "2025-10-06T14:10:17.993Z" }, + { url = "https://files.pythonhosted.org/packages/61/3a/caf4e25036db0f2da4ca22a353dfeb3c9d3c95d2761ebe9b14df8fc16eb0/yarl-1.22.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:b4f15793aa49793ec8d1c708ab7f9eded1aa72edc5174cae703651555ed1b601", size = 373243, upload-time = "2025-10-06T14:10:19.44Z" }, + { url = "https://files.pythonhosted.org/packages/6e/9e/51a77ac7516e8e7803b06e01f74e78649c24ee1021eca3d6a739cb6ea49c/yarl-1.22.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5542339dcf2747135c5c85f68680353d5cb9ffd741c0f2e8d832d054d41f35a", size = 342361, upload-time = "2025-10-06T14:10:21.124Z" }, + { url = "https://files.pythonhosted.org/packages/d4/f8/33b92454789dde8407f156c00303e9a891f1f51a0330b0fad7c909f87692/yarl-1.22.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5c401e05ad47a75869c3ab3e35137f8468b846770587e70d71e11de797d113df", size = 387036, upload-time = "2025-10-06T14:10:22.902Z" }, + { url = "https://files.pythonhosted.org/packages/d9/9a/c5db84ea024f76838220280f732970aa4ee154015d7f5c1bfb60a267af6f/yarl-1.22.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:243dda95d901c733f5b59214d28b0120893d91777cb8aa043e6ef059d3cddfe2", size = 397671, upload-time = "2025-10-06T14:10:24.523Z" }, + { url = "https://files.pythonhosted.org/packages/11/c9/cd8538dc2e7727095e0c1d867bad1e40c98f37763e6d995c1939f5fdc7b1/yarl-1.22.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bec03d0d388060058f5d291a813f21c011041938a441c593374da6077fe21b1b", size = 377059, upload-time = "2025-10-06T14:10:26.406Z" }, + { url = "https://files.pythonhosted.org/packages/a1/b9/ab437b261702ced75122ed78a876a6dec0a1b0f5e17a4ac7a9a2482d8abe/yarl-1.22.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b0748275abb8c1e1e09301ee3cf90c8a99678a4e92e4373705f2a2570d581273", size = 365356, upload-time = "2025-10-06T14:10:28.461Z" }, + { url = 
"https://files.pythonhosted.org/packages/b2/9d/8e1ae6d1d008a9567877b08f0ce4077a29974c04c062dabdb923ed98e6fe/yarl-1.22.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:47fdb18187e2a4e18fda2c25c05d8251a9e4a521edaed757fef033e7d8498d9a", size = 361331, upload-time = "2025-10-06T14:10:30.541Z" }, + { url = "https://files.pythonhosted.org/packages/ca/5a/09b7be3905962f145b73beb468cdd53db8aa171cf18c80400a54c5b82846/yarl-1.22.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c7044802eec4524fde550afc28edda0dd5784c4c45f0be151a2d3ba017daca7d", size = 382590, upload-time = "2025-10-06T14:10:33.352Z" }, + { url = "https://files.pythonhosted.org/packages/aa/7f/59ec509abf90eda5048b0bc3e2d7b5099dffdb3e6b127019895ab9d5ef44/yarl-1.22.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:139718f35149ff544caba20fce6e8a2f71f1e39b92c700d8438a0b1d2a631a02", size = 385316, upload-time = "2025-10-06T14:10:35.034Z" }, + { url = "https://files.pythonhosted.org/packages/e5/84/891158426bc8036bfdfd862fabd0e0fa25df4176ec793e447f4b85cf1be4/yarl-1.22.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e1b51bebd221006d3d2f95fbe124b22b247136647ae5dcc8c7acafba66e5ee67", size = 374431, upload-time = "2025-10-06T14:10:37.76Z" }, + { url = "https://files.pythonhosted.org/packages/bb/49/03da1580665baa8bef5e8ed34c6df2c2aca0a2f28bf397ed238cc1bbc6f2/yarl-1.22.0-cp313-cp313-win32.whl", hash = "sha256:d3e32536234a95f513bd374e93d717cf6b2231a791758de6c509e3653f234c95", size = 81555, upload-time = "2025-10-06T14:10:39.649Z" }, + { url = "https://files.pythonhosted.org/packages/9a/ee/450914ae11b419eadd067c6183ae08381cfdfcb9798b90b2b713bbebddda/yarl-1.22.0-cp313-cp313-win_amd64.whl", hash = "sha256:47743b82b76d89a1d20b83e60d5c20314cbd5ba2befc9cda8f28300c4a08ed4d", size = 86965, upload-time = "2025-10-06T14:10:41.313Z" }, + { url = "https://files.pythonhosted.org/packages/98/4d/264a01eae03b6cf629ad69bae94e3b0e5344741e929073678e84bf7a3e3b/yarl-1.22.0-cp313-cp313-win_arm64.whl", hash = 
"sha256:5d0fcda9608875f7d052eff120c7a5da474a6796fe4d83e152e0e4d42f6d1a9b", size = 81205, upload-time = "2025-10-06T14:10:43.167Z" }, + { url = "https://files.pythonhosted.org/packages/88/fc/6908f062a2f77b5f9f6d69cecb1747260831ff206adcbc5b510aff88df91/yarl-1.22.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:719ae08b6972befcba4310e49edb1161a88cdd331e3a694b84466bd938a6ab10", size = 146209, upload-time = "2025-10-06T14:10:44.643Z" }, + { url = "https://files.pythonhosted.org/packages/65/47/76594ae8eab26210b4867be6f49129861ad33da1f1ebdf7051e98492bf62/yarl-1.22.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:47d8a5c446df1c4db9d21b49619ffdba90e77c89ec6e283f453856c74b50b9e3", size = 95966, upload-time = "2025-10-06T14:10:46.554Z" }, + { url = "https://files.pythonhosted.org/packages/ab/ce/05e9828a49271ba6b5b038b15b3934e996980dd78abdfeb52a04cfb9467e/yarl-1.22.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cfebc0ac8333520d2d0423cbbe43ae43c8838862ddb898f5ca68565e395516e9", size = 97312, upload-time = "2025-10-06T14:10:48.007Z" }, + { url = "https://files.pythonhosted.org/packages/d1/c5/7dffad5e4f2265b29c9d7ec869c369e4223166e4f9206fc2243ee9eea727/yarl-1.22.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4398557cbf484207df000309235979c79c4356518fd5c99158c7d38203c4da4f", size = 361967, upload-time = "2025-10-06T14:10:49.997Z" }, + { url = "https://files.pythonhosted.org/packages/50/b2/375b933c93a54bff7fc041e1a6ad2c0f6f733ffb0c6e642ce56ee3b39970/yarl-1.22.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2ca6fd72a8cd803be290d42f2dec5cdcd5299eeb93c2d929bf060ad9efaf5de0", size = 323949, upload-time = "2025-10-06T14:10:52.004Z" }, + { url = "https://files.pythonhosted.org/packages/66/50/bfc2a29a1d78644c5a7220ce2f304f38248dc94124a326794e677634b6cf/yarl-1.22.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:ca1f59c4e1ab6e72f0a23c13fca5430f889634166be85dbf1013683e49e3278e", size = 361818, upload-time = "2025-10-06T14:10:54.078Z" }, + { url = "https://files.pythonhosted.org/packages/46/96/f3941a46af7d5d0f0498f86d71275696800ddcdd20426298e572b19b91ff/yarl-1.22.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c5010a52015e7c70f86eb967db0f37f3c8bd503a695a49f8d45700144667708", size = 372626, upload-time = "2025-10-06T14:10:55.767Z" }, + { url = "https://files.pythonhosted.org/packages/c1/42/8b27c83bb875cd89448e42cd627e0fb971fa1675c9ec546393d18826cb50/yarl-1.22.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d7672ecf7557476642c88497c2f8d8542f8e36596e928e9bcba0e42e1e7d71f", size = 341129, upload-time = "2025-10-06T14:10:57.985Z" }, + { url = "https://files.pythonhosted.org/packages/49/36/99ca3122201b382a3cf7cc937b95235b0ac944f7e9f2d5331d50821ed352/yarl-1.22.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3b7c88eeef021579d600e50363e0b6ee4f7f6f728cd3486b9d0f3ee7b946398d", size = 346776, upload-time = "2025-10-06T14:10:59.633Z" }, + { url = "https://files.pythonhosted.org/packages/85/b4/47328bf996acd01a4c16ef9dcd2f59c969f495073616586f78cd5f2efb99/yarl-1.22.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:f4afb5c34f2c6fecdcc182dfcfc6af6cccf1aa923eed4d6a12e9d96904e1a0d8", size = 334879, upload-time = "2025-10-06T14:11:01.454Z" }, + { url = "https://files.pythonhosted.org/packages/c2/ad/b77d7b3f14a4283bffb8e92c6026496f6de49751c2f97d4352242bba3990/yarl-1.22.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:59c189e3e99a59cf8d83cbb31d4db02d66cda5a1a4374e8a012b51255341abf5", size = 350996, upload-time = "2025-10-06T14:11:03.452Z" }, + { url = "https://files.pythonhosted.org/packages/81/c8/06e1d69295792ba54d556f06686cbd6a7ce39c22307100e3fb4a2c0b0a1d/yarl-1.22.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = 
"sha256:5a3bf7f62a289fa90f1990422dc8dff5a458469ea71d1624585ec3a4c8d6960f", size = 356047, upload-time = "2025-10-06T14:11:05.115Z" }, + { url = "https://files.pythonhosted.org/packages/4b/b8/4c0e9e9f597074b208d18cef227d83aac36184bfbc6eab204ea55783dbc5/yarl-1.22.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:de6b9a04c606978fdfe72666fa216ffcf2d1a9f6a381058d4378f8d7b1e5de62", size = 342947, upload-time = "2025-10-06T14:11:08.137Z" }, + { url = "https://files.pythonhosted.org/packages/e0/e5/11f140a58bf4c6ad7aca69a892bff0ee638c31bea4206748fc0df4ebcb3a/yarl-1.22.0-cp313-cp313t-win32.whl", hash = "sha256:1834bb90991cc2999f10f97f5f01317f99b143284766d197e43cd5b45eb18d03", size = 86943, upload-time = "2025-10-06T14:11:10.284Z" }, + { url = "https://files.pythonhosted.org/packages/31/74/8b74bae38ed7fe6793d0c15a0c8207bbb819cf287788459e5ed230996cdd/yarl-1.22.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ff86011bd159a9d2dfc89c34cfd8aff12875980e3bd6a39ff097887520e60249", size = 93715, upload-time = "2025-10-06T14:11:11.739Z" }, + { url = "https://files.pythonhosted.org/packages/69/66/991858aa4b5892d57aef7ee1ba6b4d01ec3b7eb3060795d34090a3ca3278/yarl-1.22.0-cp313-cp313t-win_arm64.whl", hash = "sha256:7861058d0582b847bc4e3a4a4c46828a410bca738673f35a29ba3ca5db0b473b", size = 83857, upload-time = "2025-10-06T14:11:13.586Z" }, + { url = "https://files.pythonhosted.org/packages/46/b3/e20ef504049f1a1c54a814b4b9bed96d1ac0e0610c3b4da178f87209db05/yarl-1.22.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:34b36c2c57124530884d89d50ed2c1478697ad7473efd59cfd479945c95650e4", size = 140520, upload-time = "2025-10-06T14:11:15.465Z" }, + { url = "https://files.pythonhosted.org/packages/e4/04/3532d990fdbab02e5ede063676b5c4260e7f3abea2151099c2aa745acc4c/yarl-1.22.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:0dd9a702591ca2e543631c2a017e4a547e38a5c0f29eece37d9097e04a7ac683", size = 93504, upload-time = "2025-10-06T14:11:17.106Z" }, + { url = 
"https://files.pythonhosted.org/packages/11/63/ff458113c5c2dac9a9719ac68ee7c947cb621432bcf28c9972b1c0e83938/yarl-1.22.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:594fcab1032e2d2cc3321bb2e51271e7cd2b516c7d9aee780ece81b07ff8244b", size = 94282, upload-time = "2025-10-06T14:11:19.064Z" }, + { url = "https://files.pythonhosted.org/packages/a7/bc/315a56aca762d44a6aaaf7ad253f04d996cb6b27bad34410f82d76ea8038/yarl-1.22.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f3d7a87a78d46a2e3d5b72587ac14b4c16952dd0887dbb051451eceac774411e", size = 372080, upload-time = "2025-10-06T14:11:20.996Z" }, + { url = "https://files.pythonhosted.org/packages/3f/3f/08e9b826ec2e099ea6e7c69a61272f4f6da62cb5b1b63590bb80ca2e4a40/yarl-1.22.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:852863707010316c973162e703bddabec35e8757e67fcb8ad58829de1ebc8590", size = 338696, upload-time = "2025-10-06T14:11:22.847Z" }, + { url = "https://files.pythonhosted.org/packages/e3/9f/90360108e3b32bd76789088e99538febfea24a102380ae73827f62073543/yarl-1.22.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:131a085a53bfe839a477c0845acf21efc77457ba2bcf5899618136d64f3303a2", size = 387121, upload-time = "2025-10-06T14:11:24.889Z" }, + { url = "https://files.pythonhosted.org/packages/98/92/ab8d4657bd5b46a38094cfaea498f18bb70ce6b63508fd7e909bd1f93066/yarl-1.22.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:078a8aefd263f4d4f923a9677b942b445a2be970ca24548a8102689a3a8ab8da", size = 394080, upload-time = "2025-10-06T14:11:27.307Z" }, + { url = "https://files.pythonhosted.org/packages/f5/e7/d8c5a7752fef68205296201f8ec2bf718f5c805a7a7e9880576c67600658/yarl-1.22.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:bca03b91c323036913993ff5c738d0842fc9c60c4648e5c8d98331526df89784", size = 372661, upload-time = "2025-10-06T14:11:29.387Z" }, + { url = "https://files.pythonhosted.org/packages/b6/2e/f4d26183c8db0bb82d491b072f3127fb8c381a6206a3a56332714b79b751/yarl-1.22.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:68986a61557d37bb90d3051a45b91fa3d5c516d177dfc6dd6f2f436a07ff2b6b", size = 364645, upload-time = "2025-10-06T14:11:31.423Z" }, + { url = "https://files.pythonhosted.org/packages/80/7c/428e5812e6b87cd00ee8e898328a62c95825bf37c7fa87f0b6bb2ad31304/yarl-1.22.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:4792b262d585ff0dff6bcb787f8492e40698443ec982a3568c2096433660c694", size = 355361, upload-time = "2025-10-06T14:11:33.055Z" }, + { url = "https://files.pythonhosted.org/packages/ec/2a/249405fd26776f8b13c067378ef4d7dd49c9098d1b6457cdd152a99e96a9/yarl-1.22.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ebd4549b108d732dba1d4ace67614b9545b21ece30937a63a65dd34efa19732d", size = 381451, upload-time = "2025-10-06T14:11:35.136Z" }, + { url = "https://files.pythonhosted.org/packages/67/a8/fb6b1adbe98cf1e2dd9fad71003d3a63a1bc22459c6e15f5714eb9323b93/yarl-1.22.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f87ac53513d22240c7d59203f25cc3beac1e574c6cd681bbfd321987b69f95fd", size = 383814, upload-time = "2025-10-06T14:11:37.094Z" }, + { url = "https://files.pythonhosted.org/packages/d9/f9/3aa2c0e480fb73e872ae2814c43bc1e734740bb0d54e8cb2a95925f98131/yarl-1.22.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:22b029f2881599e2f1b06f8f1db2ee63bd309e2293ba2d566e008ba12778b8da", size = 370799, upload-time = "2025-10-06T14:11:38.83Z" }, + { url = "https://files.pythonhosted.org/packages/50/3c/af9dba3b8b5eeb302f36f16f92791f3ea62e3f47763406abf6d5a4a3333b/yarl-1.22.0-cp314-cp314-win32.whl", hash = "sha256:6a635ea45ba4ea8238463b4f7d0e721bad669f80878b7bfd1f89266e2ae63da2", size = 82990, upload-time = "2025-10-06T14:11:40.624Z" }, + { url = 
"https://files.pythonhosted.org/packages/ac/30/ac3a0c5bdc1d6efd1b41fa24d4897a4329b3b1e98de9449679dd327af4f0/yarl-1.22.0-cp314-cp314-win_amd64.whl", hash = "sha256:0d6e6885777af0f110b0e5d7e5dda8b704efed3894da26220b7f3d887b839a79", size = 88292, upload-time = "2025-10-06T14:11:42.578Z" }, + { url = "https://files.pythonhosted.org/packages/df/0a/227ab4ff5b998a1b7410abc7b46c9b7a26b0ca9e86c34ba4b8d8bc7c63d5/yarl-1.22.0-cp314-cp314-win_arm64.whl", hash = "sha256:8218f4e98d3c10d683584cb40f0424f4b9fd6e95610232dd75e13743b070ee33", size = 82888, upload-time = "2025-10-06T14:11:44.863Z" }, + { url = "https://files.pythonhosted.org/packages/06/5e/a15eb13db90abd87dfbefb9760c0f3f257ac42a5cac7e75dbc23bed97a9f/yarl-1.22.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:45c2842ff0e0d1b35a6bf1cd6c690939dacb617a70827f715232b2e0494d55d1", size = 146223, upload-time = "2025-10-06T14:11:46.796Z" }, + { url = "https://files.pythonhosted.org/packages/18/82/9665c61910d4d84f41a5bf6837597c89e665fa88aa4941080704645932a9/yarl-1.22.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:d947071e6ebcf2e2bee8fce76e10faca8f7a14808ca36a910263acaacef08eca", size = 95981, upload-time = "2025-10-06T14:11:48.845Z" }, + { url = "https://files.pythonhosted.org/packages/5d/9a/2f65743589809af4d0a6d3aa749343c4b5f4c380cc24a8e94a3c6625a808/yarl-1.22.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:334b8721303e61b00019474cc103bdac3d7b1f65e91f0bfedeec2d56dfe74b53", size = 97303, upload-time = "2025-10-06T14:11:50.897Z" }, + { url = "https://files.pythonhosted.org/packages/b0/ab/5b13d3e157505c43c3b43b5a776cbf7b24a02bc4cccc40314771197e3508/yarl-1.22.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1e7ce67c34138a058fd092f67d07a72b8e31ff0c9236e751957465a24b28910c", size = 361820, upload-time = "2025-10-06T14:11:52.549Z" }, + { url = 
"https://files.pythonhosted.org/packages/fb/76/242a5ef4677615cf95330cfc1b4610e78184400699bdda0acb897ef5e49a/yarl-1.22.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d77e1b2c6d04711478cb1c4ab90db07f1609ccf06a287d5607fcd90dc9863acf", size = 323203, upload-time = "2025-10-06T14:11:54.225Z" }, + { url = "https://files.pythonhosted.org/packages/8c/96/475509110d3f0153b43d06164cf4195c64d16999e0c7e2d8a099adcd6907/yarl-1.22.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c4647674b6150d2cae088fc07de2738a84b8bcedebef29802cf0b0a82ab6face", size = 363173, upload-time = "2025-10-06T14:11:56.069Z" }, + { url = "https://files.pythonhosted.org/packages/c9/66/59db471aecfbd559a1fd48aedd954435558cd98c7d0da8b03cc6c140a32c/yarl-1.22.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efb07073be061c8f79d03d04139a80ba33cbd390ca8f0297aae9cce6411e4c6b", size = 373562, upload-time = "2025-10-06T14:11:58.783Z" }, + { url = "https://files.pythonhosted.org/packages/03/1f/c5d94abc91557384719da10ff166b916107c1b45e4d0423a88457071dd88/yarl-1.22.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e51ac5435758ba97ad69617e13233da53908beccc6cfcd6c34bbed8dcbede486", size = 339828, upload-time = "2025-10-06T14:12:00.686Z" }, + { url = "https://files.pythonhosted.org/packages/5f/97/aa6a143d3afba17b6465733681c70cf175af89f76ec8d9286e08437a7454/yarl-1.22.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:33e32a0dd0c8205efa8e83d04fc9f19313772b78522d1bdc7d9aed706bfd6138", size = 347551, upload-time = "2025-10-06T14:12:02.628Z" }, + { url = "https://files.pythonhosted.org/packages/43/3c/45a2b6d80195959239a7b2a8810506d4eea5487dce61c2a3393e7fc3c52e/yarl-1.22.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:bf4a21e58b9cde0e401e683ebd00f6ed30a06d14e93f7c8fd059f8b6e8f87b6a", size = 334512, 
upload-time = "2025-10-06T14:12:04.871Z" }, + { url = "https://files.pythonhosted.org/packages/86/a0/c2ab48d74599c7c84cb104ebd799c5813de252bea0f360ffc29d270c2caa/yarl-1.22.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:e4b582bab49ac33c8deb97e058cd67c2c50dac0dd134874106d9c774fd272529", size = 352400, upload-time = "2025-10-06T14:12:06.624Z" }, + { url = "https://files.pythonhosted.org/packages/32/75/f8919b2eafc929567d3d8411f72bdb1a2109c01caaab4ebfa5f8ffadc15b/yarl-1.22.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:0b5bcc1a9c4839e7e30b7b30dd47fe5e7e44fb7054ec29b5bb8d526aa1041093", size = 357140, upload-time = "2025-10-06T14:12:08.362Z" }, + { url = "https://files.pythonhosted.org/packages/cf/72/6a85bba382f22cf78add705d8c3731748397d986e197e53ecc7835e76de7/yarl-1.22.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:c0232bce2170103ec23c454e54a57008a9a72b5d1c3105dc2496750da8cfa47c", size = 341473, upload-time = "2025-10-06T14:12:10.994Z" }, + { url = "https://files.pythonhosted.org/packages/35/18/55e6011f7c044dc80b98893060773cefcfdbf60dfefb8cb2f58b9bacbd83/yarl-1.22.0-cp314-cp314t-win32.whl", hash = "sha256:8009b3173bcd637be650922ac455946197d858b3630b6d8787aa9e5c4564533e", size = 89056, upload-time = "2025-10-06T14:12:13.317Z" }, + { url = "https://files.pythonhosted.org/packages/f9/86/0f0dccb6e59a9e7f122c5afd43568b1d31b8ab7dda5f1b01fb5c7025c9a9/yarl-1.22.0-cp314-cp314t-win_amd64.whl", hash = "sha256:9fb17ea16e972c63d25d4a97f016d235c78dd2344820eb35bc034bc32012ee27", size = 96292, upload-time = "2025-10-06T14:12:15.398Z" }, + { url = "https://files.pythonhosted.org/packages/48/b7/503c98092fb3b344a179579f55814b613c1fbb1c23b3ec14a7b008a66a6e/yarl-1.22.0-cp314-cp314t-win_arm64.whl", hash = "sha256:9f6d73c1436b934e3f01df1e1b21ff765cd1d28c77dfb9ace207f746d4610ee1", size = 85171, upload-time = "2025-10-06T14:12:16.935Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/ae/b48f95715333080afb75a4504487cbe142cae1268afc482d06692d605ae6/yarl-1.22.0-py3-none-any.whl", hash = "sha256:1380560bdba02b6b6c90de54133c81c9f2a453dee9912fe58c1dcced1edb7cff", size = 46814, upload-time = "2025-10-06T14:12:53.872Z" }, +] diff --git a/env.example b/env.example new file mode 100644 index 0000000..4f8716f --- /dev/null +++ b/env.example @@ -0,0 +1,120 @@ +# ============================================================================== +# BookBytes Environment Configuration +# ============================================================================== +# Copy this file to .env and fill in the values for your environment. +# Never commit .env to version control! +# ============================================================================== + +# ============================================================================== +# APPLICATION +# ============================================================================== +# Environment: development, staging, production +APP_ENV=development + +# Enable debug mode (additional logging, stack traces in responses) +DEBUG=true + +# Application name (shown in logs and API docs) +APP_NAME=BookBytes + +# Application version +APP_VERSION=0.1.0 + +# ============================================================================== +# LOGGING +# ============================================================================== +# Log level: DEBUG, INFO, WARNING, ERROR, CRITICAL +LOG_LEVEL=INFO + +# Log format: json (production) or console (development) +LOG_FORMAT=console + +# ============================================================================== +# SERVER +# ============================================================================== +# Server bind address +HOST=0.0.0.0 + +# Server port +PORT=8000 + +# ============================================================================== +# DATABASE +# 
============================================================================== +# PostgreSQL connection URL with async driver +# Format: postgresql+asyncpg://user:password@host:port/database +DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@localhost:5432/bookbytes + +# Connection pool settings +DATABASE_POOL_MIN=2 +DATABASE_POOL_MAX=10 + +# ============================================================================== +# REDIS +# ============================================================================== +# Redis connection URL for background jobs and caching +# Format: redis://host:port/db +REDIS_URL=redis://localhost:6379/0 + +# ============================================================================== +# STORAGE +# ============================================================================== +# Storage backend: local (filesystem) or s3 (AWS S3) +STORAGE_BACKEND=local + +# Local storage path (used when STORAGE_BACKEND=local) +LOCAL_STORAGE_PATH=./data/audio + +# S3 configuration (used when STORAGE_BACKEND=s3) +S3_BUCKET=bookbytes-audio +S3_REGION=us-east-1 + +# AWS credentials (optional if using IAM roles) +# AWS_ACCESS_KEY_ID=your-access-key-id +# AWS_SECRET_ACCESS_KEY=your-secret-access-key + +# S3 pre-signed URL expiry in seconds (0 = public/no expiry) +S3_URL_EXPIRY_SECONDS=0 + +# ============================================================================== +# EXTERNAL APIS +# ============================================================================== +# OpenAI API key for chapter extraction and summary generation +# Get your key at: https://platform.openai.com/api-keys +OPENAI_API_KEY=sk-your-openai-api-key + +# OpenAI model to use (gpt-3.5-turbo, gpt-4, etc.) 
+OPENAI_MODEL=gpt-4o-mini + +# OpenAI request timeout in seconds +OPENAI_TIMEOUT=30 + +# ============================================================================== +# AUTHENTICATION +# ============================================================================== +# Auth mode: jwt (production) or api_key (local development bypass) +AUTH_MODE=api_key + +# JWT secret key for token signing +# IMPORTANT: Change this in production! Use a secure random string. +# Generate with: python -c "import secrets; print(secrets.token_urlsafe(32))" +JWT_SECRET_KEY=dev-secret-key-change-in-production + +# JWT algorithm (HS256, HS384, HS512) +JWT_ALGORITHM=HS256 + +# JWT access token expiry in minutes +JWT_EXPIRE_MINUTES=30 + +# API key for local development (used when AUTH_MODE=api_key) +# Include this in requests as X-API-Key header +API_KEY=dev-api-key-12345 + +# ============================================================================== +# WORKER +# ============================================================================== +# Maximum concurrent jobs per worker +WORKER_MAX_JOBS=5 + +# Job timeout in seconds (default 10 minutes for book processing) +WORKER_JOB_TIMEOUT=600 diff --git a/tasks/backend-productionisation/audio-books-library/design-doc.md b/tasks/backend-productionisation/audio-books-library/design-doc.md new file mode 100644 index 0000000..6c65250 --- /dev/null +++ b/tasks/backend-productionisation/audio-books-library/design-doc.md @@ -0,0 +1,899 @@ +# Phase 3: Enhanced Book Search Flow - Design Document + +> **Status:** Approved for Implementation +> **Last Updated:** 4 December 2025 + +## Background + +Revising Phase 3 to implement a more intuitive book discovery experience before background job processing. The discovered books will become part of our permanent library, not just a transient cache. 
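+
+The discovery experience sketched here leans on OpenLibrary's public `/search.json` endpoint. As a quick orientation, a minimal stdlib probe of that endpoint might look like the following. This is a sketch only: the endpoint and the `fields` parameter are from OpenLibrary's documented API, while the function names and the `User-Agent` value are illustrative placeholders.

```python
import json
import urllib.parse
import urllib.request

SEARCH_ENDPOINT = "https://openlibrary.org/search.json"


def build_search_url(title: str, limit: int = 5) -> str:
    """Assemble a search URL; `fields` trims the response to what ingest needs."""
    query = urllib.parse.urlencode({
        "title": title,
        "limit": limit,
        "fields": "key,title,author_name,first_publish_year,isbn",
    })
    return f"{SEARCH_ENDPOINT}?{query}"


def search_books(title: str, limit: int = 5) -> list[dict]:
    """Fetch matching works; OpenLibrary expects a descriptive User-Agent."""
    request = urllib.request.Request(
        build_search_url(title, limit),
        # Illustrative User-Agent; requests without one risk being blocked.
        headers={"User-Agent": "BookBytes/0.1 (dev@example.com)"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)["docs"]
```

In the real service this would be an async `httpx` call sitting behind the caching layers this document describes; the sketch only shows the request shape.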
+ +--- + +## OpenLibrary API Research Summary + +### Endpoints + +| Endpoint | Purpose | | -------------------------- | ------------------------------------------------------- | | `/search.json` | Search books by title, author, publisher, general query | | `/works/{work_id}.json` | Get work details (all editions, subjects, etc.) | | `/books/{edition_id}.json` | Get specific edition details | | `/search/authors.json` | Search for authors | | `covers.openlibrary.org` | Get cover images by ISBN or OLID | + +### Query Parameters + +``` +title - Search by title +author - Search by author name +publisher - Search by publisher +q - General search query +lang - Filter by language (e.g., "en", "fr") +limit - Results per page (default: 20, max: 1000) +offset - Pagination offset +sort - new, old, random, key, or by rating +fields - Specify return fields (include "editions" for edition data) +``` + +### Response Structure + +```json +{ + "numFound": 629, + "start": 0, + "docs": [ + { + "key": "/works/OL27448W", + "title": "The Lord of the Rings", + "author_name": ["J. R. R. Tolkien"], + "author_key": ["OL26320A"], + "first_publish_year": 1954, + "edition_count": 120, + "cover_i": 258027, + "isbn": ["9780618640157", "0618640150", ...], + "language": ["eng", "spa", "fre", ...], + "publisher": ["Houghton Mifflin", ...], + "editions": { "numFound": 120, "docs": [...] } + } + ] +} +``` + +### Rate Limits + +- Covers API: 100 requests/5 min per IP (non-OLID) +- General: Include `User-Agent` header or get blocked +- Bulk: Don't bulk download, use monthly dumps + +--- + +## Architecture Decisions + +### 1. Storage Strategy: PostgreSQL (Persistent Library) + +The "cache" is not just a cache - it's **building our library** by owning the data discovered from external APIs. 
Using PostgreSQL for persistence: + +``` +┌──────────────────────────────────────────────────────────────┐ +│                     BookBytes Library                        │ +├──────────────────────────────────────────────────────────────┤ +│  ┌─────────────┐    ┌─────────────┐    ┌─────────────────┐   │ +│  │ Works       │───▶│ Editions    │───▶│ AudioBooks      │   │ +│  │ (our data)  │    │ (ISBNs)     │    │ (our content)   │   │ +│  └──────┬──────┘    └─────────────┘    └─────────────────┘   │ +│         │                                                    │ +│         ▼                                                    │ +│  ┌───────────────┐                                           │ +│  │ BookProviders │ ← Maps our IDs to provider IDs (OL, Google) │ +│  └───────────────┘                                           │ +│         │                                                    │ +│         ▼                                                    │ +│  ┌───────────────┐                                           │ +│  │ API Cache     │ ← Raw API responses (TTL-based)           │ +│  └───────────────┘                                           │ +└──────────────────────────────────────────────────────────────┘ +``` + +**Rationale:** + +- **Library data is provider-agnostic** - no `openlibrary_key` on core models +- **BookProviders table** maps our UUIDs to provider-specific identifiers +- **Easy to add new providers** (Google Books, etc.) without migrations +- **API Cache** is TTL-based for raw responses +- **Single PostgreSQL database** for consistency + +### 2. 
Edition Handling Strategy (with Cache Integration)
+
+```mermaid
+flowchart TD
+    A[User Search] --> B{Search Type?}
+
+    %% ISBN Search Path
+    B -->|By ISBN| C[Check Library DB]
+    C -->|Found| E[Return Existing Book]
+    C -->|Not Found| F1[Check ISBN Cache]
+    F1 -->|Hit| F2[Return Cached Edition Info]
+    F1 -->|Miss| F[Query OpenLibrary by ISBN]
+    F --> F3[Cache Result Async]
+    F3 --> G{ISBN Edition Check}
+    G -->|Latest Edition| H[Process & Store in Library]
+    G -->|Older Edition| I[Check if Latest Exists in Library]
+    I -->|Yes| J[Return Latest + Note about edition]
+    I -->|No| H
+
+    %% Title/Author Search Path
+    B -->|By Title/Author| D1[Check Search Cache]
+    D1 -->|Hit| D2{Near Expiry?}
+    D2 -->|Yes| D3[Return Stale + Background Refresh]
+    D2 -->|No| K[Display Results]
+    D1 -->|Miss| D4[Query OpenLibrary API]
+    D4 --> D5[Cache Result Async]
+    D5 --> K
+    D3 --> K
+
+    %% Selection & Processing
+    K --> L[User Selects Book]
+    L --> M[Store Work + Editions in Library]
+    M --> N[Process Latest Edition]
+
+    %% Future Refresh
+    N --> O{Later: New Edition Search}
+    O --> P[Allow Refresh AudioBook]
+```
+
+**Cache Touchpoints:**
+
+| Step               | Cache Layer            | Action                          |
+| ------------------ | ---------------------- | ------------------------------- |
+| Check Library DB   | PostgreSQL (permanent) | Query `editions` table by ISBN  |
+| Check ISBN Cache   | Redis (AOF-persisted)  | Lookup by ISBN cache key        |
+| Check Search Cache | Redis (AOF-persisted)  | Lookup by search params hash    |
+| Cache Result Async | Redis                  | Fire-and-forget storage         |
+| Store in Library   | PostgreSQL (permanent) | Insert into `works`, `editions` |
+
+**Key Rules:**
+
+1. Default to **latest edition** for new processing
+2. If user searches a specific ISBN for an older edition:
+   - If latest not in library → process the requested edition
+   - If latest exists → suggest latest, allow override
+3. When a new edition of an existing book is searched → allow **refresh** to regenerate audio
+
+### 3. 
Pagination: Page Size 100 + +- Fetch 100 results from OpenLibrary per request +- Internal pagination to client as needed +- Store all results in library on first fetch + +--- + +## Data Models + +### Core Library Models (Provider-Agnostic) + +```python +# Work - represents a book across all editions (NO provider-specific keys) +class Work(UUIDPrimaryKeyMixin, TimestampMixin, Base): + __tablename__ = "works" + + # Our owned data - no openlibrary_key here + title: Mapped[str] + authors: Mapped[list[str]] # JSON array + first_publish_year: Mapped[int | None] + subjects: Mapped[list[str] | None] # JSON array + cover_url: Mapped[str | None] + + # Relationships + editions: Mapped[list["Edition"]] = relationship(back_populates="work") + book_providers: Mapped[list["BookProvider"]] = relationship(back_populates="work") + +# Edition - specific ISBN/format of a work (NO provider-specific keys) +class Edition(UUIDPrimaryKeyMixin, TimestampMixin, Base): + __tablename__ = "editions" + + work_id: Mapped[UUID] = mapped_column(ForeignKey("works.id")) + isbn: Mapped[str] = mapped_column(unique=True, index=True) # Normalized + isbn_type: Mapped[str] # "isbn10" or "isbn13" + title: Mapped[str] # Edition-specific title + publisher: Mapped[str | None] + publish_year: Mapped[int | None] + language: Mapped[str] = "eng" + pages: Mapped[int | None] + + # Relationships + work: Mapped["Work"] = relationship(back_populates="editions") + audio_book: Mapped["AudioBook | None"] = relationship(back_populates="edition") + book_providers: Mapped[list["BookProvider"]] = relationship(back_populates="edition") + +# AudioBook - our generated content +class AudioBook(UUIDPrimaryKeyMixin, TimestampMixin, SoftDeleteMixin, Base): + __tablename__ = "audio_books" + + edition_id: Mapped[UUID] = mapped_column(ForeignKey("editions.id")) + status: Mapped[str] # pending, processing, completed, failed + version: Mapped[int] = 1 # For refresh/regeneration + + # Relationships + edition: Mapped["Edition"] = 
relationship(back_populates="audio_book") + chapters: Mapped[list["Chapter"]] = relationship(back_populates="audio_book") + +# Chapter - audio content for a chapter +class Chapter(UUIDPrimaryKeyMixin, TimestampMixin, Base): + __tablename__ = "chapters" + + audio_book_id: Mapped[UUID] = mapped_column(ForeignKey("audio_books.id")) + chapter_number: Mapped[int] + title: Mapped[str] + summary: Mapped[str] + audio_file_path: Mapped[str | None] + audio_url: Mapped[str | None] + word_count: Mapped[int | None] + duration_seconds: Mapped[int | None] + + # Relationships + audio_book: Mapped["AudioBook"] = relationship(back_populates="chapters") +``` + +### BookProvider Model (Provider-Specific Mapping) + +```python +class BookProviderType(str, Enum): + """Supported external data providers.""" + OPENLIBRARY = "openlibrary" + GOOGLE_BOOKS = "google_books" + # Future: AMAZON, GOODREADS, etc. + +class BookProvider(UUIDPrimaryKeyMixin, TimestampMixin, Base): + """ + Maps our internal UUIDs to external provider IDs. + Decouples our data model from any specific provider. 
+ """ + __tablename__ = "book_providers" + + # Polymorphic: can link to Work or Edition + entity_type: Mapped[str] # "work" or "edition" + entity_id: Mapped[UUID] # Our internal UUID + + # Provider info + provider: Mapped[str] # e.g., "openlibrary", "google_books" + external_key: Mapped[str] # e.g., "/works/OL27448W", "google:abc123" + + # Optional: provider-specific metadata + provider_metadata: Mapped[dict | None] # JSON - any extra provider data + + # Relationships (nullable - only one will be set) + work_id: Mapped[UUID | None] = mapped_column(ForeignKey("works.id")) + edition_id: Mapped[UUID | None] = mapped_column(ForeignKey("editions.id")) + work: Mapped["Work | None"] = relationship(back_populates="book_providers") + edition: Mapped["Edition | None"] = relationship(back_populates="book_providers") + + # Unique constraint: one provider key per entity + __table_args__ = ( + UniqueConstraint("provider", "external_key", name="uq_provider_external_key"), + Index("ix_entity_lookup", "entity_type", "entity_id"), + ) +``` + +**Benefits of BookProvider Table:** + +| Benefit | Description | +| --------------------- | ---------------------------------------------------- | +| **Provider-agnostic** | Core models don't know about OpenLibrary | +| **Multi-provider** | Same Work can have IDs from multiple providers | +| **No migrations** | Adding a new provider = new rows, not schema changes | +| **Easy lookups** | Find Work by OpenLibrary key OR Google Books ID | + +**Example Usage:** + +```python +# Find our Work by any provider's external ID +async def find_work_by_provider(provider: str, external_key: str) -> Work | None: + bp = await db.query(BookProvider).filter_by( + provider=provider, + external_key=external_key, + entity_type="work" + ).first() + return bp.work if bp else None + +# Store provider mapping when importing +async def link_to_provider(work: Work, provider: str, key: str) -> None: + await db.add(BookProvider( + entity_type="work", + entity_id=work.id, + 
work_id=work.id,
+        provider=provider,
+        external_key=key
+    ))
+```
+
+### API Cache Model
+
+> **REMOVED:** The `APICache` PostgreSQL table has been removed. See below for rationale.
+
+### Cache TTL Policy
+
+| Data Type       | Redis TTL | Rationale                      |
+| --------------- | --------- | ------------------------------ |
+| Search results  | 24 hours  | Balance freshness vs API calls |
+| Work details    | 7 days    | Stable metadata                |
+| Edition details | 7 days    | Stable metadata                |
+
+### 4. Cache Key Design (Pagination & Ordering)
+
+**Problem:** How to ensure consistent pagination across requests and cache matching across users?
+
+**Example Scenario:**
+
+1. User A searches for "Book-A" → We fetch 100 results from OpenLibrary, cache them
+2. User A views page 1 (results 1-10) from our API
+3. User A goes to page 2 (results 11-20) from our API
+4. User B searches for "Book-A" → Should hit the same cache
+
+**Solution: Deterministic Cache Key**
+
+```python
+def generate_cache_key(params: SearchParams) -> str:
+    """
+    Generate a deterministic cache key from search parameters.
+    Same search = same key = cache hit. 
+ """ + # Normalize: lowercase, strip whitespace, sort keys + normalized = { + "title": params.title.lower().strip(), + "author": (params.author or "").lower().strip(), + "publisher": (params.publisher or "").lower().strip(), + "language": params.language.lower().strip(), + } + # Remove empty values + normalized = {k: v for k, v in normalized.items() if v} + + # Create deterministic string + key_parts = sorted(f"{k}={v}" for k, v in normalized.items()) + key_string = "&".join(key_parts) + + # Hash for storage efficiency + return f"search:{hashlib.sha256(key_string.encode()).hexdigest()[:16]}" +``` + +**Example Cache Keys:** + +``` +Search: title="Lord of the Rings", author="Tolkien" +Key: search:a3f2b1c4d5e6f7a8 + +Search: title="lord of the rings", author="tolkien" (different case) +Key: search:a3f2b1c4d5e6f7a8 (SAME - normalized) + +Search: title="Lord of the Rings" (no author) +Key: search:b9c8d7e6f5a4b3c2 (DIFFERENT - different params) +``` + +### 5. Cache Strategy: Redis-Only with AOF Persistence + +> **Key Clarification:** There are two distinct data categories: +> +> | Category | Storage | Purpose | Persistence | +> | ---------------- | ---------- | ------------------------ | ------------------------------------- | +> | **Search Cache** | Redis only | Avoid repeated API calls | Temporary (survives restarts via AOF) | +> | **Library Data** | PostgreSQL | Our processed books | Permanent (Works, Editions) | +> +> Search results are **transient** - if lost, users simply re-search. The important data (processed books) lives permanently in PostgreSQL Work/Edition tables. + +**Why Not Two-Tier (Redis + PostgreSQL)?** + +Originally we planned PostgreSQL as L2 cache backup. 
After analysis:
+
+| Concern                | Reality                                                                                     |
+| ---------------------- | ------------------------------------------------------------------------------------------- |
+| **Survives restarts?** | Redis AOF with `appendfsync everysec` survives restarts (max 1 sec data loss)               |
+| **Cache important?**   | Search cache is convenience, not critical. Permanent data is already in Work/Edition tables |
+| **Complexity cost**    | Two-tier adds: APICache table, sync logic, two TTL systems                                  |
+| **Benefit**            | Marginal - we'd only save re-fetching from OpenLibrary                                      |
+
+**Decision:** Use **Redis-only** caching with AOF persistence. Simpler architecture, sufficient durability.
+
+**Architecture:**
+
+```
+API #1 ─┐
+API #2 ─┼──▶ Redis (cache layer - shared, TTL-based)
+API #3 ─┘      - search results (24h TTL)
+               - AOF persistence (survives restart)
+               - LRU eviction on memory limit
+                    │
+                    │ (on book selection/processing)
+                    ▼
+             PostgreSQL (permanent library - our owned data)
+               - works table (book metadata)
+               - editions table (ISBN, publisher)
+               - audio_books table (our content)
+
+(API #1-#3: horizontally scaled API instances)
+```
+
+**Cache Flow:**
+
+```mermaid
+flowchart TD
+    A[Search Request] --> B[Generate Cache Key]
+    B --> C{Redis Hit?}
+    C -->|Yes| D{Near Expiry?}
+    D -->|Yes| E[Return Stale + Background Refresh]
+    D -->|No| F[Return Fresh]
+    C -->|No| G[Query OpenLibrary API]
+    G --> H[Return Response Immediately]
+    H -.->|Async| I[Store in Redis]
+
+    J[User Selects Book] --> K[Store in PostgreSQL]
+    K --> L[Work + Edition Tables]
+    L --> M[Process AudioBook]
+```
+
+### Caching Policy (Finalized)
+
+**Pattern:** Cache-Aside + TTL + Stale-While-Revalidate
+
+#### TTL Values (with ±10% Jitter)
+
+| Cache Type      | Redis TTL | Jitter |
+| --------------- | --------- | ------ |
+| Search results  | 24 hours  | ±10%   |
+| Work details    | 7 days    | ±10%   |
+| Edition details | 7 days    | ±10%   |
+
+**TTL Jitter:** Prevents cache stampede (all keys expiring simultaneously).
+
+```python
+def calculate_ttl_with_jitter(base_ttl: int) -> int:
+    """Add ±10% random jitter to TTL to prevent stampede."""
+    jitter = random.uniform(-0.1, 0.1)
+    return int(base_ttl * (1 + jitter))
+```
+
+#### Stale-While-Revalidate
+
+When a cache entry is near expiry (< 20% TTL remaining):
+
+1. **Return stale data immediately** (fast response)
+2. **Trigger background refresh** (update cache async)
+3. 
**Next request gets fresh data**
+
+```python
+REVALIDATE_THRESHOLD = 0.2  # 20% of TTL remaining
+
+async def get_with_stale_revalidate(
+    cache_key: str,
+    original_ttl: int
+) -> tuple[dict | None, bool]:
+    """
+    Returns (data, is_stale).
+    If stale, caller should trigger background refresh.
+    """
+    result = await redis.get(cache_key)
+    if not result:
+        return None, False
+
+    remaining_ttl = await redis.ttl(cache_key)
+    needs_revalidation = remaining_ttl < (original_ttl * REVALIDATE_THRESHOLD)
+    return json.loads(result), needs_revalidation
+```
+
+#### Invalidation Policy
+
+| Trigger               | Action              | Scope                         |
+| --------------------- | ------------------- | ----------------------------- |
+| **TTL expiry**        | Auto-delete         | Single entry                  |
+| **User refresh book** | Manual delete       | Work + related search pattern |
+| **Book processed**    | Invalidate searches | `search:*` pattern            |
+| **Memory pressure**   | LRU eviction        | Least recently accessed       |
+
+#### Redis Configuration
+
+```redis
+# Persistence: AOF for durability
+appendonly yes
+appendfsync everysec  # Sync every second (max 1 sec data loss)
+auto-aof-rewrite-percentage 100
+auto-aof-rewrite-min-size 64mb
+
+# Memory management
+maxmemory 256mb
+maxmemory-policy allkeys-lru  # Evict least-recently-used when full
+```
+
+When Redis hits the memory limit, the least-recently-used keys are evicted first.
+
+### CacheService Implementation (Redis-Only)
+
+```python
+class CacheService:
+    """Redis-only caching with TTL and stale-while-revalidate."""
+
+    REVALIDATE_THRESHOLD = 0.2  # Trigger refresh at 20% TTL remaining
+
+    async def get(self, cache_key: str) -> tuple[dict | None, bool]:
+        """
+        Get from cache. Returns (data, needs_revalidation).
+        If needs_revalidation=True, caller should refresh in background. 
+ """ + result = await self.redis.get(cache_key) + if not result: + return None, False + + ttl = await self.redis.ttl(cache_key) + original_ttl = self._get_original_ttl(cache_key) # From key prefix + needs_revalidation = (ttl / original_ttl) < self.REVALIDATE_THRESHOLD + + return json.loads(result), needs_revalidation + + async def set( + self, + cache_key: str, + data: dict, + base_ttl: int + ) -> None: + """Store in Redis with jittered TTL.""" + ttl = self._jitter_ttl(base_ttl) + await self.redis.setex(cache_key, ttl, json.dumps(data)) + + async def invalidate(self, cache_key: str) -> None: + """Delete a specific cache key.""" + await self.redis.delete(cache_key) + + async def invalidate_pattern(self, pattern: str) -> None: + """Delete all keys matching pattern (e.g., 'search:*').""" + async for key in self.redis.scan_iter(match=pattern): + await self.redis.delete(key) + + def _jitter_ttl(self, base_ttl: int) -> int: + """Add Β±10% jitter to prevent stampede.""" + jitter = random.uniform(-0.1, 0.1) + return int(base_ttl * (1 + jitter)) + + def _get_original_ttl(self, cache_key: str) -> int: + """Get original TTL based on key prefix.""" + if cache_key.startswith("search:"): + return 86400 # 24 hours + elif cache_key.startswith("work:") or cache_key.startswith("edition:"): + return 604800 # 7 days + return 86400 # Default 24 hours +``` + +**Key Points:** + +1. **Redis only:** Simple, fast, shared across all instances +2. **AOF persistence:** Survives restarts (max 1 second data loss with `appendfsync everysec`) +3. **LRU eviction:** Automatically evicts old keys when memory limit reached +4. **Stale-While-Revalidate:** Return stale data fast, refresh in background +5. **TTL Jitter:** Β±10% prevents cache stampede +6. **Invalidation:** Pattern-based deletion for search caches +7. **No PostgreSQL cache table:** Library data (Work/Edition) is permanent, search cache is transient + +--- + +### 6. 
Future Scope: Multi-API Support + +**Problem:** What if we want to integrate APIs other than OpenLibrary, or use multiple APIs to avoid rate limits? + +**Solution: Provider Abstraction** + +```python +from abc import ABC, abstractmethod + +class BookSearchProvider(ABC): + """Abstract base for book search APIs.""" + + @property + @abstractmethod + def provider_name(self) -> str: + """Unique identifier for caching.""" + pass + + @abstractmethod + async def search(self, params: SearchParams) -> list[BookResult]: + pass + + @abstractmethod + async def get_work_details(self, work_id: str) -> WorkDetails: + pass + + +class OpenLibraryProvider(BookSearchProvider): + provider_name = "openlibrary" + + async def search(self, params: SearchParams) -> list[BookResult]: + # OpenLibrary implementation + pass + + +class GoogleBooksProvider(BookSearchProvider): + provider_name = "googlebooks" + + async def search(self, params: SearchParams) -> list[BookResult]: + # Google Books API implementation + pass + + +class BookSearchService: + """Orchestrates multiple providers with rate limit handling.""" + + def __init__(self, providers: list[BookSearchProvider]): + self.providers = providers + self.rate_limiters = {p.provider_name: RateLimiter() for p in providers} + + async def search(self, params: SearchParams) -> list[BookResult]: + # Try providers in order, skip if rate limited + for provider in self.providers: + if not self.rate_limiters[provider.provider_name].is_limited(): + try: + return await provider.search(params) + except RateLimitError: + self.rate_limiters[provider.provider_name].mark_limited() + + raise AllProvidersRateLimitedError() +``` + +**Cache Key with Provider:** + +```python +# Cache key includes provider for multi-API support +cache_key = f"{provider_name}:search:{hash}" +# e.g., "openlibrary:search:a3f2b1c4d5e6f7a8" +# e.g., "googlebooks:search:a3f2b1c4d5e6f7a8" +``` + +**Phase 3 Scope:** OpenLibrary only. Multi-API is future enhancement. 
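The `BookSearchService` above references a `RateLimiter`, `RateLimitError`, and `AllProvidersRateLimitedError` that are not defined in this document. A minimal cooldown-based sketch (the names and the fixed cooldown policy are assumptions, not a final design):

```python
import time


class RateLimitError(Exception):
    """Raised by a provider when the upstream API rejects us for rate limiting."""


class AllProvidersRateLimitedError(Exception):
    """Raised when every configured provider is in cooldown."""


class RateLimiter:
    """Tracks a per-provider cooldown window after a rate-limit hit."""

    def __init__(self, cooldown_seconds: float = 300.0) -> None:
        self.cooldown_seconds = cooldown_seconds
        self._limited_until = 0.0

    def mark_limited(self) -> None:
        """Open the cooldown window, starting now."""
        self._limited_until = time.monotonic() + self.cooldown_seconds

    def is_limited(self) -> bool:
        """True while the cooldown window is still open."""
        return time.monotonic() < self._limited_until
```

With this shape, `BookSearchService` skips a provider for `cooldown_seconds` after it raises `RateLimitError`, then automatically retries it on later searches.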
+ +--- + +## API Endpoints + +### Search Endpoints + +```python +@router.post("/books/search") +async def search_books( + request: BookSearchRequest +) -> BookSearchResponse: + """ + Search OpenLibrary for books. + Results are cached and stored in library. + + Request: { title, author?, publisher?, language? } + Response: { results: [...], total, page, page_size } + """ + +@router.get("/books/works/{work_id}") +async def get_work(work_id: UUID) -> WorkResponse: + """Get work details including all editions.""" + +@router.get("/books/isbn/{isbn}") +async def get_by_isbn(isbn: str) -> EditionResponse: + """ + Lookup by ISBN. + 1. Check library + 2. If not found, query OpenLibrary + 3. Store and return + """ +``` + +### Processing Endpoints + +```python +@router.post("/books/process") +async def process_book( + request: ProcessBookRequest # edition_id OR isbn +) -> ProcessBookResponse: + """ + Start audiobook generation for an edition. + Returns job_id for tracking. + """ + +@router.post("/books/{audio_book_id}/refresh") +async def refresh_audiobook( + audio_book_id: UUID +) -> ProcessBookResponse: + """ + Regenerate audiobook (new version). + Used when new edition available or content update. + """ +``` + +--- + +## Service Layer + +### OpenLibraryService + +```python +class OpenLibraryService: + """Client for OpenLibrary API with caching.""" + + BASE_URL = "https://openlibrary.org" + PAGE_SIZE = 100 # Fetch 100 results per API call + + async def search_books( + self, + title: str, + author: str | None = None, + publisher: str | None = None, + language: str = "eng", + page: int = 1 + ) -> SearchResult: + """ + 1. Check API cache + 2. If miss, query OpenLibrary + 3. Store in cache + library + 4. 
Return results + """ + + async def get_work_details( + self, + work_key: str # e.g., "OL27448W" + ) -> Work: + """Fetch and store work details.""" + + async def get_all_isbns_for_work( + self, + work_key: str + ) -> list[str]: + """Get all ISBNs across all editions of a work.""" +``` + +### LibraryService (Provider-Agnostic) + +```python +class LibraryService: + """Manages our book library with provider-agnostic lookups.""" + + async def find_work_by_provider( + self, + provider: str, + external_key: str + ) -> Work | None: + """Find a work by any external provider ID.""" + bp = await self.db.query(BookProvider).filter_by( + provider=provider, + external_key=external_key, + entity_type="work" + ).first() + return bp.work if bp else None + + async def get_or_create_work( + self, + title: str, + authors: list[str], + provider: str | None = None, + external_key: str | None = None, + **data + ) -> Work: + """ + Get existing or create new work. + If provider/external_key given, check BookProvider first. 
+        """
+        if provider and external_key:
+            existing = await self.find_work_by_provider(provider, external_key)
+            if existing:
+                return existing
+
+        # Create new work
+        work = Work(title=title, authors=authors, **data)
+        self.db.add(work)
+        await self.db.flush()
+
+        # Link to provider if given
+        if provider and external_key:
+            await self.link_to_provider(work, "work", provider, external_key)
+
+        return work
+
+    async def link_to_provider(
+        self,
+        entity: Work | Edition,
+        entity_type: str,
+        provider: str,
+        external_key: str
+    ) -> BookProvider:
+        """Create mapping from our entity to external provider ID."""
+        bp = BookProvider(
+            entity_type=entity_type,
+            entity_id=entity.id,
+            provider=provider,
+            external_key=external_key,
+            work_id=entity.id if entity_type == "work" else None,
+            edition_id=entity.id if entity_type == "edition" else None
+        )
+        self.db.add(bp)
+        return bp
+
+    async def find_by_isbn(self, isbn: str) -> Edition | None:
+        """Find edition by ISBN in library."""
+        return await self.db.query(Edition).filter_by(isbn=isbn).first()
+
+    async def find_latest_edition(
+        self,
+        work_id: UUID,
+        language: str = "eng"
+    ) -> Edition | None:
+        """Find latest edition of a work by publish year."""
+        return await self.db.query(Edition).filter_by(
+            work_id=work_id,
+            language=language
+        ).order_by(Edition.publish_year.desc()).first()
+```
+
+---
+
+## Implementation Phases
+
+### Phase 3.A: Data Models & Migrations
+
+- [ ] Create `models/work.py` (provider-agnostic)
+- [ ] Create `models/edition.py` (provider-agnostic)
+- [ ] Create `models/book_provider.py` (provider mapping)
+- [ ] Create `models/audio_book.py`
+- [ ] Create `models/chapter.py`
+- ~~Create `models/api_cache.py`~~ **REMOVED** - using Redis-only caching
+- [ ] Generate and run migration
+
+### Phase 3.B: Cache Service (Redis-Only)
+
+- [ ] Create `services/cache.py` (Redis-only)
+- [ ] Implement Redis client with AOF persistence
+- [ ] Add TTL jitter
+- [ ] Add stale-while-revalidate
+- [ ] Add async storage on API response
+- [ ] Add invalidation methods
+
+
+### Phase 3.C: OpenLibrary Service
+
+- [ ] Create `services/openlibrary.py`
+- [ ] Implement search with caching
+- [ ] Implement work/edition fetching
+- [ ] Add User-Agent header for rate limit compliance
+- [ ] Map responses to provider-agnostic models
+
+### Phase 3.D: Library Service
+
+- [ ] Create `services/library.py`
+- [ ] Implement `find_work_by_provider()`
+- [ ] Implement `get_or_create_work()` with provider linking
+- [ ] Implement `link_to_provider()`
+- [ ] Implement ISBN lookup
+- [ ] Implement latest edition finder
+
+### Phase 3.E: API Endpoints
+
+- [ ] Create `api/v1/search.py`
+- [ ] Create search endpoints
+- [ ] Update `api/v1/books.py` with process/refresh
+
+### Phase 3.F: Background Jobs
+
+- [ ] Create Job model
+- [ ] Create ARQ worker settings
+- [ ] Implement `process_book_task`
+
+### Phase 3.G: Testing
+
+- [ ] Unit tests for services
+- [ ] Integration tests for search flow
+- [ ] Mock OpenLibrary responses
+
+---
+
+## Success Metrics
+
+| Metric                    | Target                       |
+| ------------------------- | ---------------------------- |
+| Library Hit Rate          | >70% after 1 month           |
+| Search Latency (cached)   | <200ms                       |
+| Search Latency (API call) | <2s                          |
+| ISBN Collection           | 100% of work ISBNs stored    |
+| Edition Accuracy          | Latest edition selected 95%+ |
diff --git a/tasks/backend-productionisation/audio-books-library/tasks-audio-books-library.md b/tasks/backend-productionisation/audio-books-library/tasks-audio-books-library.md
new file mode 100644
index 0000000..68f6b68
--- /dev/null
+++ b/tasks/backend-productionisation/audio-books-library/tasks-audio-books-library.md
@@ -0,0 +1,353 @@
+# Tasks: Audio Books Library
+
+**Design Reference:** [design-doc.md](./design-doc.md)
+**Parent:** [tasks-backend-productionisation.md](../tasks-backend-productionisation.md)
+**Branch:** `feat/productionise-and-saasify`
+
+---
+
+## Overview
+
+This module implements the enhanced book search flow with:
+
+- OpenLibrary API integration
+- Redis-only caching (AOF-persisted)
+- Provider-agnostic data models
+- Background job processing for audiobook generation
+
+---
+
+## Relevant Files
+
+### Models
+
+- `src/bookbytes/models/work.py` - Work model (provider-agnostic)
+- `src/bookbytes/models/edition.py` - Edition model with ISBN (ISO 639-2/B language codes)
+- `src/bookbytes/models/book_provider.py` - Provider mapping (polymorphic)
+- `src/bookbytes/models/audio_book.py` - AudioBook model
+- `src/bookbytes/models/chapter.py` - Chapter model
+
+> **Note:** No `api_cache.py` - using Redis-only caching with AOF persistence.
+
+### Repositories
+
+- `src/bookbytes/repositories/work.py` - WorkRepository
+- `src/bookbytes/repositories/edition.py` - EditionRepository
+- `src/bookbytes/repositories/book_provider.py` - BookProviderRepository
+- `src/bookbytes/repositories/audio_book.py` - AudioBookRepository
+- `src/bookbytes/repositories/chapter.py` - ChapterRepository (update existing)
+
+### Services
+
+- `src/bookbytes/services/cache.py` - Redis-only cache service
+- `src/bookbytes/services/openlibrary.py` - OpenLibrary API client
+- `src/bookbytes/services/library.py` - Library management service
+
+### API
+
+- `src/bookbytes/api/v1/search.py` - Book search endpoints
+
+### Workers
+
+- `src/bookbytes/workers/tasks.py` - Background job tasks
+
+---
+
+## Tasks
+
+### Phase 3.0: UUIDv7 Foundation
+
+> **Prerequisite:** Before creating new models, set up UUIDv7 infrastructure. 
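The time-ordering property the checklist below relies on can be illustrated with the stdlib alone. This is a hand-rolled sketch of the RFC 9562 UUIDv7 layout (48-bit unix-millisecond timestamp, version 7, random tail); the real code uses `uuid6.uuid7()`, and this helper exists only for illustration:

```python
import os
import time
import uuid


def uuid7_sketch() -> uuid.UUID:
    """Build an RFC 9562 UUIDv7: 48-bit unix-ms timestamp, version 7, random tail."""
    unix_ms = time.time_ns() // 1_000_000
    rand_a = int.from_bytes(os.urandom(2), "big") & 0x0FFF  # 12 random bits
    rand_b = int.from_bytes(os.urandom(8), "big") & ((1 << 62) - 1)  # 62 random bits
    value = (
        (unix_ms & ((1 << 48) - 1)) << 80  # timestamp in the top 48 bits
        | 0x7 << 76                        # version = 7
        | rand_a << 64
        | 0b10 << 62                       # RFC 4122/9562 variant
        | rand_b
    )
    return uuid.UUID(int=value)
```

Because the timestamp occupies the most significant bits, an ID generated later always compares (and therefore indexes) higher - which is exactly what task 3.0.4 verifies against PostgreSQL's native UUID type.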
+ +- [x] 3.0.1 Add `uuid6` library to `pyproject.toml` + + - RFC 9562 compliant backport for Python <3.14 + - Will switch to stdlib `uuid.uuid7()` when on Python 3.14 + +- [x] ~~3.0.2 Install `pg_idkit` PostgreSQL extension~~ **REMOVED** - Using app-generated UUIDs + + - pg_idkit not available in standard PostgreSQL images + - UUIDv7 generated in Python via `uuid6.uuid7()`, stored in native UUID column + - No database extension required + +- [x] 3.0.3 Update `models/base.py` `UUIDPrimaryKeyMixin` + - Change from `uuid.uuid4` to `uuid6.uuid7` + - Keep PostgreSQL default as app-generated (no DB function) + +```python +# Before +from uuid import uuid4 +default=uuid4 + +# After +from uuid6 import uuid7 +default=uuid7 +``` + +- [ ] 3.0.4 Verify UUIDv7 sorting works in PostgreSQL + - UUIDv7 is time-sortable when stored as native UUID type + +--- + +### Phase 3.A: Data Models & Migrations + +- [x] 3.A.1 Create `models/work.py` with `Work` model + + - UUID primary key (v7), title, authors (JSON), first_publish_year, subjects (JSON), cover_url + - Relationship to `editions` and `book_providers` + - No provider-specific fields + +- [x] 3.A.2 Create `models/edition.py` with `Edition` model + + - work_id FK, isbn (unique, indexed), isbn_type, title, publisher, publish_year, language, pages + - Relationship to `work`, `audio_book`, `book_providers` + +- [x] 3.A.3 Create `models/book_provider.py` with `BookProvider` model + + - entity_type ("work" | "edition"), entity_id, provider, external_key + - Nullable FKs: work_id, edition_id (polymorphic pattern) + - Unique constraint on (provider, external_key) + - Index on (entity_type, entity_id) + +- [x] 3.A.4 Create `models/audio_book.py` with `AudioBook` model + + - edition_id FK, status enum, version + - Uses SoftDeleteMixin + +- [x] 3.A.5 Create `models/chapter.py` for AudioBook + + - audio_book_id FK, chapter_number, title, summary, audio paths, word_count, duration + +- [x] ~~3.A.6 Create `models/api_cache.py`~~ **REMOVED** - 
Using Redis-only caching + +- [x] 3.A.7 Update `models/__init__.py` with new model exports + +- [x] 3.A.8 Generate migration: `alembic revision --autogenerate -m "add_audio_books_library_models"` + +- [x] 3.A.9 Run migration: `alembic upgrade head` + +--- + +### Phase 3.A-R: Repositories + +> **Note:** No APICacheRepository - CacheService handles all cache operations directly. + +- [x] 3.A-R.1 Create `repositories/work.py` with `WorkRepository` + + - `get_by_title_author()` - find by title and author combination + - `get_with_editions()` - eager load editions + +- [x] 3.A-R.2 Create `repositories/edition.py` with `EditionRepository` + + - `get_by_isbn()` - find by normalized ISBN + - `get_by_work_id()` - all editions for a work + - `get_latest_by_work()` - latest by publish_year + +- [x] 3.A-R.3 Create `repositories/book_provider.py` with `BookProviderRepository` + + - `get_by_provider_key()` - find by (provider, external_key) + - `get_for_entity()` - all providers for a work/edition + - `create_mapping()` - link entity to provider + +- [x] 3.A-R.4 Create `repositories/audio_book.py` with `AudioBookRepository` + + - Extends `SoftDeleteRepository` for soft delete support + - `get_by_edition()` - find audiobook for edition + - `get_by_status()` - filter by processing status + +- [x] 3.A-R.5 Create `repositories/chapter.py` with `ChapterRepository` + + - `get_by_audio_book()` - all chapters for audiobook + - `get_by_number()` - specific chapter + +- [x] 3.A-R.6 Update `repositories/__init__.py` with new exports + +--- + +### Phase 3.B: Cache Service (Redis-Only) + +> **Simplified:** Using Redis-only caching with AOF persistence. No PostgreSQL cache table. 
+ +- [x] 3.B.1 Create `services/cache.py` with `CacheService` class + + - Redis-only with AOF persistence (survives restarts) + - Inject Redis client + +- [x] 3.B.2 Implement `get()` method + + - Check Redis, return (data, needs_revalidation) tuple + - Track remaining TTL for stale-while-revalidate + +- [x] 3.B.3 Implement `set()` method with TTL jitter + + - Store in Redis with Β±10% jitter to prevent stampede + +- [x] 3.B.4 Implement `invalidate()` and `invalidate_pattern()` + + - Delete by key or by pattern (e.g., `search:*`) + +- [x] 3.B.5 Implement stale-while-revalidate logic + + - Return stale data at <20% TTL remaining + - Trigger background refresh + +- [x] 3.B.6 Add cache key generation helper + - Normalize params (lowercase, trim, sort) + - SHA256 hash for storage efficiency + +--- + +### Phase 3.C: OpenLibrary Service + +- [x] 3.C.1 Create `services/openlibrary.py` with `OpenLibraryService` class + + - BASE_URL, PAGE_SIZE=100 + - Use httpx async client + +- [x] 3.C.2 Add User-Agent header + + - Include app name and contact for API compliance + +- [x] 3.C.3 Implement `search_books()` method + + - Accept title, author, publisher, language + - Check cache first (via CacheService) + - Query API on miss, cache result async + +- [x] 3.C.4 Implement `get_work_details()` method + + - Fetch work by OpenLibrary key + - Cache with 7-day TTL + +- [x] 3.C.5 Implement `get_all_isbns_for_work()` method + + - Collect ISBNs from all editions + +- [x] 3.C.6 Map API responses to provider-agnostic schemas + - Create DTOs for search results, work details + +--- + +### Phase 3.D: Library Service + +- [x] 3.D.1 Create `services/library.py` with `LibraryService` class + + - Inject: WorkRepository, EditionRepository, BookProviderRepository + +- [x] 3.D.2 Implement `find_work_by_provider()` + + - Query BookProvider by (provider, external_key) + - Return associated Work + +- [x] 3.D.3 Implement `get_or_create_work()` + + - Check if work exists via provider lookup + - Create new 
Work if not found + - Link to provider + +- [x] 3.D.4 Implement `link_to_provider()` + + - Create BookProvider mapping for Work or Edition + +- [x] 3.D.5 Implement `find_by_isbn()` + + - Query Edition by normalized ISBN + +- [x] 3.D.6 Implement `find_latest_edition()` + + - Order by publish_year descending + - Filter by language + +- [x] 3.D.7 Implement `store_editions()` + + - Bulk insert editions for a work + - Create BookProvider mappings + +--- + +### Phase 3.E: API Endpoints + +- [x] 3.E.1 Create `api/v1/search.py` with router + +- [x] 3.E.2 Create `POST /books/search` endpoint + + - Accept BookSearchRequest (title, author?, publisher?, language?) + - Return paginated results with page, page_size params + +- [x] 3.E.3 Create `GET /books/works/{work_id}` endpoint + + - Return work details with all editions + +- [x] 3.E.4 Create `GET /books/isbn/{isbn}` endpoint + + - Check library first, query API if not found + - Store in library on fetch + +- [x] 3.E.5 ~~Create `POST /books/process` endpoint~~ β†’ **Moved to** [audio-books-pipeline Phase 1](../audio-books-pipeline/tasks-audio-books-pipeline.md) + +- [x] 3.E.6 ~~Create `POST /books/{audio_book_id}/refresh` endpoint~~ β†’ **Moved to** [audio-books-pipeline Phase 1](../audio-books-pipeline/tasks-audio-books-pipeline.md) + +- [x] 3.E.7 Create schemas in `schemas/search.py` + + - BookSearchRequest, BookSearchResponse, WorkResponse, EditionResponse + +- [x] 3.E.8 Include search router in v1 router + +--- + +### Phase 3.F: Background Jobs + +> **Moved to:** [audio-books-pipeline/tasks-audio-books-pipeline.md](../audio-books-pipeline/tasks-audio-books-pipeline.md) +> +> The background jobs, processing pipeline, and worker tasks are now defined in the +> Audio Books Pipeline module (Phase 3.1), which handles: +> - Job model and infrastructure +> - LLM/TTS services with Protocol-based abstraction +> - ProcessingService orchestration +> - ARQ worker tasks + +--- + +### Phase 3.G: Testing + +- [x] 3.G.1 Create 
`tests/unit/test_cache_service.py` + + - Test cache get/set flow + - Test TTL jitter + +- [x] 3.G.2 Create `tests/unit/test_openlibrary_service.py` + + - Mock HTTP responses + - Test search, work details, ISBN collection + +- [x] 3.G.3 Create `tests/unit/test_library_service.py` + + - Test provider lookups + - Test work/edition storage + +- [ ] 3.G.4 Create `tests/unit/test_repositories.py` + + - Test Work, Edition, BookProvider, AudioBook repositories + +- [x] 3.G.5 Create `tests/integration/test_search_api.py` + + - Test full search flow with mocked OpenLibrary + - Test caching behavior + - Test pagination + +- [ ] 3.G.6 Add OpenLibrary mock responses to `tests/mocks/` + +--- + +## Notes + +- **UUIDv7:** Using `uuid6` library (RFC 9562 compliant) until Python 3.14, app-generated (no DB extension needed) +- **Language Codes:** Edition uses ISO 639-2/B (bibliographic) codes per MARC/ONIX standards +- All models use `UUIDPrimaryKeyMixin` (now v7) and `TimestampMixin` +- AudioBook uses `SoftDeleteMixin` for soft deletion +- AudioBookRepository extends `SoftDeleteRepository` +- BookProvider is a sparse/polymorphic table (see design doc for query patterns) +- **Redis-only caching:** No PostgreSQL cache table, uses AOF persistence (`appendfsync everysec`) +- **Cache vs Library:** Search cache is transient (Redis), library data is permanent (Work/Edition in PostgreSQL) +- Redis memory policy: `allkeys-lru` with 256mb limit +- OpenLibrary requires User-Agent header to avoid rate limiting diff --git a/tasks/backend-productionisation/audio-books-pipeline/design-doc.md b/tasks/backend-productionisation/audio-books-pipeline/design-doc.md new file mode 100644 index 0000000..ffa0198 --- /dev/null +++ b/tasks/backend-productionisation/audio-books-pipeline/design-doc.md @@ -0,0 +1,345 @@ +# Audio Books Pipeline - Design Document + +> **Status:** Draft - Planning Phase +> **Last Updated:** 11 December 2025 + +## Background + +This document outlines the design for the audiobook 
processing pipeline that transforms book editions into chapter-wise audio summaries using LLM and TTS services.
+
+---
+
+## TTS & LLM Provider Research
+
+### TTS Provider Comparison
+
+| Provider          | Quality                 | Cost/1M chars                 | Latency        | Best For                       |
+| ----------------- | ----------------------- | ----------------------------- | -------------- | ------------------------------ |
+| **OpenAI TTS**    | High, natural           | $15 (standard) / $30 (HD)     | Low, streaming | General purpose                |
+| **ElevenLabs**    | Premium, most realistic | $45+                          | Low            | Audiobooks, storytelling       |
+| **Google Cloud**  | Good, stable            | $4 (standard) / $16 (Neural2) | Very fast      | Scale, enterprise              |
+| **Azure**         | Natural, broadcast-like | ~$16                          | Medium         | Long-form, Microsoft ecosystem |
+| **gTTS (Google)** | Basic, robotic          | Free                          | Medium         | MVP, cost-sensitive            |
+
+#### Recommendation: **OpenAI TTS**
+
+**Why:**
+1. **Quality-to-cost ratio** - Best balance for audiobook summaries
+2. **Same ecosystem** - Already using OpenAI for LLM (shared API key, billing)
+3. **Streaming support** - Can stream audio as it's generated
+4. **Simple API** - `openai.audio.speech.create()`
+
+**Upgrade path:** ElevenLabs for premium voice quality if needed later.
+
+### LLM Abstraction Options
+
+| Library            | Providers                    | Structured Output    | Maturity     |
+| ------------------ | ---------------------------- | -------------------- | ------------ |
+| **LiteLLM**        | 100+                         | Via function calling | High         |
+| **PydanticAI**     | OpenAI, Claude, Gemini, etc. | Native Pydantic      | Medium       |
+| **Instructor**     | OpenAI, Claude + LiteLLM     | Native Pydantic      | High         |
+| **Custom wrapper** | Any                          | Manual               | Full control |
+
+#### Recommendation: **Instructor + Custom Interface**
+
+**Why:**
+1. **Structured output** - Native Pydantic integration for chapter extraction
+2. **Provider agnostic** - Works directly with OpenAI, can use LiteLLM as backend
+3. **Type-safe** - Matches our codebase patterns
+4. 
**Proven** - Used in production by many teams + +--- + +## Proposed Architecture + +``` +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ ProcessingService β”‚ +β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€ +β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ +β”‚ β”‚ LLMService β”‚ β”‚ TTSService β”‚ β”‚ +β”‚ β”‚ (Instructor) β”‚ β”‚ (Protocol-based)β”‚ β”‚ +β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ +β”‚ β”‚ β”‚ β”‚ +β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ +β”‚ β”‚ OpenAI Provider β”‚ β”‚ OpenAI Provider β”‚ β”‚ +β”‚ β”‚ (default) β”‚ β”‚ (default) β”‚ β”‚ +β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ +β”‚ β”‚ β”‚ β”‚ +β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ +β”‚ β”‚ LiteLLM Backend β”‚ β”‚ ElevenLabs β”‚ β”‚ +β”‚ β”‚ (optional) β”‚ β”‚ (upgrade path) β”‚ β”‚ +β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +``` + +--- + +## Processing Flow + +```mermaid +flowchart TD + A[POST /books/process] --> B[Create AudioBook: PENDING] + B --> C[Create Job record] + C --> D[Return job_id immediately] 
+ + D -.-> E[Background Task picks up] + E --> F[Update AudioBook: PROCESSING] + F --> G[Extract chapters via LLM] + G --> H[Generate summaries] + H --> I[Generate audio TTS] + I --> J[Upload to storage] + J --> K[Update AudioBook: COMPLETED] + + G --> |Error| L[Update AudioBook: FAILED] + H --> |Error| L + I --> |Error| L +``` + +--- + +## Service Interfaces + +### Design Decision: Protocol-Based Abstraction for Both LLM and TTS + +**Why?** Same reasoning applies to both services: +- Decouple business logic from provider-specific libraries +- Enable A/B testing between providers +- Allow switching without changing domain code +- Consistent architecture pattern + +### LLMProvider (Protocol-based) + +```python +from typing import Protocol +from pydantic import BaseModel + +# Domain models - NO external library imports +class ChapterInfo(BaseModel): + number: int + title: str + summary: str + +class ChapterExtraction(BaseModel): + chapters: list[ChapterInfo] + +class BookContext(BaseModel): + """Input context for chapter extraction.""" + title: str + author: str + num_chapters: int = 10 + language: str = "en" + +# Provider interface - implementations are free to use any library +class LLMProvider(Protocol): + """Interface for LLM providers.""" + + async def extract_chapters( + self, + context: BookContext, + ) -> ChapterExtraction: + """Extract chapter summaries from a book.""" + ... + +# Implementations can use Instructor, LiteLLM, raw OpenAI, etc. 
+class InstructorLLMProvider: + """OpenAI via Instructor for structured output.""" + +class LiteLLMProvider: + """Multi-provider via LiteLLM.""" + +class AnthropicProvider: + """Claude direct integration (future).""" + +# Service delegates to configured provider +class LLMService: + """Thin wrapper - delegates to provider.""" + + def __init__(self, provider: LLMProvider): + self._provider = provider + + async def extract_chapters( + self, + book_title: str, + book_author: str, + num_chapters: int = 10, + ) -> ChapterExtraction: + context = BookContext( + title=book_title, + author=book_author, + num_chapters=num_chapters, + ) + return await self._provider.extract_chapters(context) +``` + +### TTSProvider (Protocol-based) + +```python +from typing import Protocol, AsyncIterator + +class TTSProvider(Protocol): + """Interface for TTS providers.""" + + async def synthesize( + self, + text: str, + voice: str = "alloy", + ) -> bytes: + """Generate audio from text.""" + + async def synthesize_stream( + self, + text: str, + voice: str = "alloy", + ) -> AsyncIterator[bytes]: + """Stream audio as it's generated.""" + +class OpenAITTSProvider: + """OpenAI TTS implementation.""" + +class ElevenLabsTTSProvider: + """ElevenLabs TTS implementation (future).""" + +class TTSService: + """Wrapper that delegates to configured provider.""" + + def __init__(self, provider: TTSProvider): + self.provider = provider +``` + +--- + +## Technology Decisions + +| Component | Choice | Rationale | +| ---------------- | ----------------------- | --------------------------------------- | +| **Job Queue** | ARQ (Redis) | Already in deps, native async | +| **TTS Provider** | OpenAI TTS | Quality/cost balance, same ecosystem | +| **LLM Library** | Instructor | Structured output, Pydantic integration | +| **Storage** | Local (dev) / S3 (prod) | Per PRD | +| **Migrations** | Alembic | Already configured | + +### Provider Abstraction + +```python +# Protocol-based interface for TTS +class 
TTSProvider(Protocol): + async def synthesize(self, text: str, voice: str) -> bytes: ... + async def synthesize_stream(self, text: str, voice: str) -> AsyncIterator[bytes]: ... + +# Easy to add ElevenLabs, Azure, etc. later +class OpenAITTSProvider(TTSProvider): ... +class ElevenLabsTTSProvider(TTSProvider): ... # Future +``` + +--- + +## API Endpoints + +| Endpoint | Method | Description | +| -------------------------------- | ------ | --------------------------- | +| `/books/process` | POST | Start audiobook generation | +| `/books/{audio_book_id}/refresh` | POST | Regenerate with new version | +| `/jobs/{job_id}` | GET | Check job status | + +### Request/Response Schemas + +```python +class ProcessRequest(BaseModel): + edition_id: UUID | None = None + isbn: str | None = None + + @model_validator(mode="after") + def require_one(self) -> Self: + if not self.edition_id and not self.isbn: + raise ValueError("Provide edition_id or isbn") + return self + +class ProcessResponse(BaseModel): + job_id: UUID + audio_book_id: UUID + status: str + message: str + +class JobStatusResponse(BaseModel): + id: UUID + status: str # pending, processing, completed, failed + audio_book_id: UUID | None + progress: int | None # 0-100 percentage + error_message: str | None + created_at: datetime + updated_at: datetime +``` + +--- + +## Data Models + +### Job Model + +```python +class JobStatus(str, Enum): + PENDING = "pending" + PROCESSING = "processing" + COMPLETED = "completed" + FAILED = "failed" + +class JobType(str, Enum): + AUDIOBOOK_GENERATION = "audiobook_generation" + AUDIOBOOK_REFRESH = "audiobook_refresh" + +class Job(UUIDPrimaryKeyMixin, TimestampMixin, Base): + __tablename__ = "jobs" + + job_type: Mapped[str] + status: Mapped[str] = default(JobStatus.PENDING) + audio_book_id: Mapped[UUID | None] # FK to audiobooks + progress: Mapped[int] = default(0) + error_message: Mapped[str | None] + started_at: Mapped[datetime | None] + completed_at: Mapped[datetime | None] +``` + +--- 
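The processing flow's staged progress updates (20% → 50% → 80% → 100%) pair with the Job model above; a toy sketch of that orchestration with stubbed LLM/TTS stages (`FakeJob` and the stage boundaries are illustrative, not the real `ProcessingService`):

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class FakeJob:
    """Stand-in for the Job row; real code would persist via JobRepository."""
    status: str = "pending"
    progress: int = 0
    history: list[int] = field(default_factory=list)

    def update(self, status: str, progress: int) -> None:
        self.status, self.progress = status, progress
        self.history.append(progress)


async def process_audiobook(job: FakeJob) -> None:
    """Staged pipeline: LLM -> TTS -> storage, with progress checkpoints."""
    try:
        job.update("processing", 20)  # chapters extracted via LLM
        await asyncio.sleep(0)        # stand-in for the TTS synthesis calls
        job.update("processing", 50)  # summaries converted to audio
        await asyncio.sleep(0)        # stand-in for the storage upload
        job.update("processing", 80)  # audio files uploaded
        job.update("completed", 100)
    except Exception:
        # Any stage failure marks the job failed, keeping the last progress.
        job.update("failed", job.progress)
        raise
```

Running `asyncio.run(process_audiobook(FakeJob()))` walks the job through the same status/progress transitions the flowchart describes.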
+ +## Dependencies + +```toml +[project.dependencies] +instructor = ">=1.0.0" # Structured LLM output +# openai already in deps +# arq already in deps +``` + +--- + +## Implementation Order + +### Phase 1: Endpoints (2-3 hours) +1. Create `schemas/processing.py` +2. Create `api/v1/processing.py` router +3. Add to v1 router +4. Unit tests + +### Phase 2: Job Infrastructure (2-3 hours) +1. Create `models/job.py` +2. Generate Alembic migration +3. Create `repositories/job.py` + +### Phase 3: LLM Service (3-4 hours) +1. Add instructor to deps +2. Create `services/llm.py` with chapter extraction +3. Unit tests with mocked responses + +### Phase 4: TTS Service (2-3 hours) +1. Create `services/tts.py` with Protocol +2. Implement OpenAITTSProvider +3. Unit tests + +### Phase 5: Processing Pipeline (3-4 hours) +1. Create `services/processing.py` +2. Create `workers/tasks.py` with ARQ task +3. Create `workers/settings.py` +4. Integration tests + +**Total Estimated Time:** 12-14 hours diff --git a/tasks/backend-productionisation/audio-books-pipeline/prd-audio-books-pipeline.md b/tasks/backend-productionisation/audio-books-pipeline/prd-audio-books-pipeline.md new file mode 100644 index 0000000..d823052 --- /dev/null +++ b/tasks/backend-productionisation/audio-books-pipeline/prd-audio-books-pipeline.md @@ -0,0 +1,184 @@ +# PRD: Audio Books Pipeline + +**Document Version:** 1.0 +**Created:** 11 December 2025 +**Status:** Draft +**Timeline:** Urgent (hours, target completion in 1-2 days) + +--- + +## 1. Introduction/Overview + +The Audio Books Pipeline transforms book editions into chapter-wise audio summaries using LLM and TTS services. This is the core value-delivery component of BookBytes - converting a 250-page book into 15-20 audio summaries of ~5 minutes each. + +### Problem Statement + +Users want to consume books faster without losing key insights. Reading full books is time-consuming, and existing audiobooks are just narrations of the full text. 
+ +### Solution + +An automated pipeline that: +1. Uses LLM to extract chapter summaries from book metadata +2. Converts summaries to natural-sounding audio via TTS +3. Stores and serves audio files to users + +--- + +## 2. Goals + +| # | Goal | Success Criteria | +| --- | ---------------------------- | ------------------------------------------------------------------- | +| G1 | **Production Architecture** | Provider-agnostic design allowing LLM/TTS switching without code changes | +| G2 | **Revenue Enablement** | Working pipeline that processes books end-to-end | +| G3 | **Quality Output** | Structured summaries with clear, natural audio | +| G4 | **Reliability** | Retry logic, partial success handling, job progress tracking | +| G5 | **Observability** | Full job status visibility, error diagnostics | + +--- + +## 3. User Stories + +### API Consumer Stories + +| ID | Story | Acceptance Criteria | +| --- | ------------------------------------------------------------------------------------------ | -------------------------------------------------------- | +| US1 | As a user, I want to submit a book for processing and get a job ID | `POST /books/process` returns `{job_id, status}` immediately | +| US2 | As a user, I want to poll job status to know when my audiobook is ready | `GET /jobs/{id}` shows progress 0-100% | +| US3 | As a user, I want to listen to generated chapter summaries | Chapter audio files are accessible via URL | +| US4 | As a user, I want to refresh an audiobook when I want updated summaries | `POST /books/{id}/refresh` creates new version | + +### Developer Stories + +| ID | Story | Acceptance Criteria | +| --- | ------------------------------------------------------------------------------------------ | -------------------------------------------------------- | +| DS1 | As a developer, I want to switch LLM providers without changing business logic | Provider configured via DI, Protocol interface | +| DS2 | As a developer, I want to switch TTS 
providers without changing business logic | Provider configured via DI, Protocol interface | +| DS3 | As a developer, I want failed jobs to retry automatically | ARQ retry logic with exponential backoff | +| DS4 | As a developer, I want to test the pipeline without calling real APIs | Mocked providers work in tests | + +--- + +## 4. Functional Requirements + +### Phase 1: API Endpoints + +| # | Requirement | +| ------ | -------------------------------------------------------------------------- | +| FR1.1 | Create `POST /api/v1/books/process` endpoint accepting `edition_id` or `isbn` | +| FR1.2 | Return `job_id` and `audio_book_id` immediately (async processing) | +| FR1.3 | Create `GET /api/v1/jobs/{job_id}` endpoint for status polling | +| FR1.4 | Job status includes: `pending`, `processing`, `completed`, `failed` | +| FR1.5 | Job response includes `progress` (0-100) and `error_message` if failed | +| FR1.6 | Create `POST /api/v1/books/{audio_book_id}/refresh` for regeneration | +| FR1.7 | Refresh increments audiobook `version` and creates new job | + +### Phase 2: Job Infrastructure + +| # | Requirement | +| ------ | -------------------------------------------------------------------------- | +| FR2.1 | Create `Job` model with `job_type`, `status`, `progress`, `audio_book_id` | +| FR2.2 | Create Alembic migration for `jobs` table | +| FR2.3 | Create `JobRepository` with async CRUD | +| FR2.4 | Configure ARQ worker settings in `workers/settings.py` | + +### Phase 3: LLM Service + +| # | Requirement | +| ------ | -------------------------------------------------------------------------- | +| FR3.1 | Define `LLMProvider` Protocol with `extract_chapters()` method | +| FR3.2 | Create `BookContext` and `ChapterExtraction` Pydantic models | +| FR3.3 | Implement `InstructorLLMProvider` using Instructor library | +| FR3.4 | Create `LLMService` thin wrapper that delegates to provider | +| FR3.5 | Extract chapter number, title, and summary for each chapter | +| FR3.6 | 
Provider selection via dependency injection | + +### Phase 4: TTS Service + +| # | Requirement | +| ------ | -------------------------------------------------------------------------- | +| FR4.1 | Define `TTSProvider` Protocol with `synthesize()` and `synthesize_stream()` | +| FR4.2 | Implement `OpenAITTSProvider` using OpenAI TTS API | +| FR4.3 | Create `TTSService` wrapper that delegates to provider | +| FR4.4 | Support voice selection (default: "alloy") | +| FR4.5 | Return audio as bytes or async stream | + +### Phase 5: Processing Pipeline + +| # | Requirement | +| ------ | -------------------------------------------------------------------------- | +| FR5.1 | Create `ProcessingService` orchestrating LLM β†’ TTS β†’ Storage | +| FR5.2 | Create ARQ task `process_audiobook_task` in `workers/tasks.py` | +| FR5.3 | Update job progress at each stage (20%, 50%, 80%, 100%) | +| FR5.4 | Handle partial failures (some chapters fail, others succeed) | +| FR5.5 | Implement retry logic with exponential backoff (3 attempts) | +| FR5.6 | Store audio files via StorageService (local dev / S3 prod) | +| FR5.7 | Update `AudioBook` status on completion/failure | +| FR5.8 | Store chapter audio paths/URLs in `Chapter` records | + +--- + +## 5. Non-Goals (Out of Scope) + +- **Multiple TTS providers implemented** - Architecture ready, only OpenAI initially +- **Multiple LLM providers implemented** - Architecture ready, only OpenAI/Instructor initially +- **User notifications** - No push notifications on job completion (polling only) +- **Audio file management UI** - API only, no admin interface +- **Voice customization per user** - Single voice initially +- **Real-time streaming to client** - Files generated then served, no live streaming + +--- + +## 6. 
Technical Considerations + +### Architecture + +``` +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ ProcessingService β”‚ +β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€ +β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ +β”‚ β”‚ LLMService β”‚ β”‚ TTSService β”‚ β”‚ +β”‚ β”‚ (Protocol-based)β”‚ β”‚ (Protocol-based)β”‚ β”‚ +β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ +β”‚ β”‚ β”‚ β”‚ +β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ +β”‚ β”‚InstructorProviderβ”‚ β”‚ OpenAITTSProviderβ”‚ β”‚ +β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +``` + +### Dependencies + +```toml +instructor = ">=1.0.0" # New - for structured LLM output +# openai, arq already present +``` + +### Key Design Decisions + +| Decision | Choice | Rationale | +|----------|--------|-----------| +| Job queue | ARQ | Already in deps, native async | +| LLM library | Instructor (behind Protocol) | Structured output, but decoupled | +| TTS provider | OpenAI | Quality/cost balance | +| Provider abstraction | Protocol pattern | Enables switching without code changes | + +--- + +## 7. 
Success Metrics + +| Metric | Target | +|--------|--------| +| Processing success rate | >95% of jobs complete successfully | +| Average processing time | <5 minutes per book (15-20 chapters) | +| Audio quality | No user complaints about clarity | +| Provider switch time | <30 minutes to add new provider | + +--- + +## 8. Open Questions + +1. **Chapter count estimation** - How do we know how many chapters a book has? (Use metadata or ask LLM?) +2. **Summary length** - Target word count per chapter summary? (Affects audio duration) +3. **Voice selection** - Stick with one voice or offer options? +4. **Rate limiting** - How to handle OpenAI rate limits during batch processing? diff --git a/tasks/backend-productionisation/audio-books-pipeline/tasks-audio-books-pipeline.md b/tasks/backend-productionisation/audio-books-pipeline/tasks-audio-books-pipeline.md new file mode 100644 index 0000000..7399b04 --- /dev/null +++ b/tasks/backend-productionisation/audio-books-pipeline/tasks-audio-books-pipeline.md @@ -0,0 +1,235 @@ +# Tasks: Audio Books Pipeline + +**PRD Reference:** [prd-audio-books-pipeline.md](./prd-audio-books-pipeline.md) +**Design Doc:** [design-doc.md](./design-doc.md) +**Parent:** [tasks-backend-productionisation.md](../tasks-backend-productionisation.md) + +--- + +### Phase 0: Storage Service (Prerequisite) + +> **Shared with:** [Phase 5 Storage Infrastructure](../tasks-backend-productionisation.md#phase-5-storage--external-services) + +- `src/bookbytes/storage/base.py` - Abstract StorageBackend interface +- `src/bookbytes/storage/local.py` - Local filesystem implementation +- `src/bookbytes/storage/s3.py` - S3 implementation with presigned URLs +- `src/bookbytes/storage/__init__.py` - Factory function + +### Phase 1: Processing API Endpoints + +- `src/bookbytes/schemas/processing.py` - Request/response schemas +- `src/bookbytes/api/v1/processing.py` - Processing router +- `tests/unit/test_processing_endpoints.py` - Unit tests + +### Phase 2: Job 
Infrastructure + +- `src/bookbytes/models/job.py` - Job model with status enum +- `src/bookbytes/repositories/job.py` - Job repository +- `alembic/versions/xxx_add_jobs_table.py` - Migration + +### Phase 3: LLM Service + +- `src/bookbytes/services/llm.py` - LLMProvider protocol + InstructorProvider +- `tests/unit/test_llm_service.py` - Unit tests with mocked responses + +### Phase 4: TTS Service + +- `src/bookbytes/services/tts.py` - TTSProvider protocol + OpenAIProvider +- `tests/unit/test_tts_service.py` - Unit tests + +### Phase 5: Processing Pipeline + +- `src/bookbytes/services/processing.py` - ProcessingService orchestration +- `src/bookbytes/workers/settings.py` - ARQ worker settings +- `src/bookbytes/workers/tasks.py` - ARQ task definitions +- `tests/integration/test_processing_pipeline.py` - Integration tests + +--- + +## Dependencies + +- **Storage Service** (Phase 5 of backend-productionisation) - Required for storing audio files +- **AudioBook/Chapter models** (Phase 3.A of audio-books-library) - Already exist + +--- + +## Tasks + +### Phase 0: Storage Service (If not already implemented) + +> Check if storage exists. If not, implement from [Phase 5 Storage tasks](../tasks-backend-productionisation.md). 
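The `StorageBackend` interface described in task 0.1 might look like the following dev-only sketch with a local-filesystem backend; class and method bodies are assumptions (a production version would use aiofiles/aioboto3 as the tasks note):

```python
import asyncio
from abc import ABC, abstractmethod
from pathlib import Path


class StorageBackend(ABC):
    """Abstract storage interface (see task 0.1)."""

    @abstractmethod
    async def save(self, key: str, data: bytes) -> str: ...

    @abstractmethod
    async def get_url(self, key: str) -> str: ...

    @abstractmethod
    async def delete(self, key: str) -> bool: ...

    @abstractmethod
    async def exists(self, key: str) -> bool: ...


class LocalStorage(StorageBackend):
    """Local-filesystem backend for dev; offloads blocking I/O to a thread."""

    def __init__(self, root: Path):
        self._root = root

    async def save(self, key: str, data: bytes) -> str:
        path = self._root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        await asyncio.to_thread(path.write_bytes, data)  # don't block the loop
        return await self.get_url(key)

    async def get_url(self, key: str) -> str:
        return (self._root / key).resolve().as_uri()  # file:// URL for dev

    async def delete(self, key: str) -> bool:
        path = self._root / key
        if path.is_file():
            path.unlink()
            return True
        return False

    async def exists(self, key: str) -> bool:
        return (self._root / key).is_file()
```

An `S3Storage` subclass would implement the same four methods, so swapping backends via the `get_storage_backend()` factory requires no caller changes.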
+ +- [ ] 0.1 Create `storage/base.py` with `StorageBackend` abstract class + - [ ] `async save(key: str, data: bytes) -> str` + - [ ] `async get_url(key: str) -> str` + - [ ] `async delete(key: str) -> bool` + - [ ] `async exists(key: str) -> bool` + +- [ ] 0.2 Create `storage/local.py` with `LocalStorage` + - [ ] Save to `LOCAL_STORAGE_PATH` config + - [ ] Return file:// URLs for local dev + - [ ] Use aiofiles for async I/O + +- [ ] 0.3 Create `storage/s3.py` with `S3Storage` + - [ ] Use aioboto3 for async S3 operations + - [ ] Generate pre-signed URLs + +- [ ] 0.4 Create `storage/__init__.py` factory + - [ ] `get_storage_backend(settings) -> StorageBackend` + +- [ ] 0.5 Add `get_storage()` dependency in `dependencies.py` + +--- + +### Phase 1: Processing API Endpoints + +- [x] 1.1 Create `schemas/processing.py` + - [x] `ProcessRequest(edition_id: UUID | None, isbn: str | None)` with validator + - [x] `ProcessResponse(job_id, audio_book_id, status, message)` + - [x] `JobStatusResponse(id, status, progress, error_message, timestamps)` + - [x] `RefreshRequest(force: bool = False)` + +- [x] 1.2 Create `api/v1/processing.py` router + - [x] `POST /books/process` - Accept edition_id OR isbn, create job, return job_id + - [x] `POST /books/{audio_book_id}/refresh` - Regenerate audiobook + - [x] `GET /jobs/{job_id}` - Return job status with progress + +- [x] 1.3 Add processing router to v1 router + - [x] Include in `api/v1/router.py` + +- [x] 1.4 Unit tests for endpoints + - [x] Test process endpoint validation (16 tests) + - [x] Test job status endpoint + - [x] Test refresh endpoint + +--- + +### Phase 2: Job Infrastructure + +- [x] 2.1 Create `models/job.py` + - [x] `JobStatus` enum (pending, processing, completed, failed) + - [x] `JobType` enum (audiobook_generation, audiobook_refresh) + - [x] `Job` model - **GENERIC** (no FK to AudioBook) + - [x] `models/audio_book_job.py` - Relation table for job↔audiobook link + +- [x] 2.2 Generate Alembic migration + - [x] Run 
`alembic revision --autogenerate -m "add_jobs_and_audio_book_jobs_tables"` + - [x] Verified migration creates both tables with indexes + - [x] Run migration `alembic upgrade head` + +- [x] 2.3 Create `repositories/job.py` + - [x] `JobRepository` with `claim_next()` (optimistic locking) + - [x] `update_progress()`, `mark_completed()`, `mark_failed()` + - [x] `schedule_retry()`, `get_by_status()`, `get_pending_count()` + - [x] `AudioBookJobRepository` for managing job↔audiobook links + +- [ ] 2.4 Configure ARQ worker + - [ ] Create `workers/settings.py` with `WorkerSettings` + - [ ] Configure Redis connection from settings + +--- + +### Phase 3: LLM Service + +- [ ] 3.1 Add instructor dependency + - [ ] Add `instructor>=1.0.0` to pyproject.toml + - [ ] Run `uv sync` + +- [ ] 3.2 Create domain models (no library imports) + - [ ] `ChapterInfo(number, title, summary)` Pydantic model + - [ ] `ChapterExtraction(chapters: list[ChapterInfo])` Pydantic model + - [ ] `BookContext(title, author, num_chapters, language)` input model + +- [ ] 3.3 Create `LLMProvider` Protocol + - [ ] Define `extract_chapters(context: BookContext) -> ChapterExtraction` + - [ ] Document interface contract + +- [ ] 3.4 Implement `InstructorLLMProvider` + - [ ] Initialize with OpenAI client + - [ ] Use Instructor for structured output + - [ ] Handle rate limits and retries + +- [ ] 3.5 Create `LLMService` wrapper + - [ ] Accept `LLMProvider` via DI + - [ ] Provide convenience methods + +- [ ] 3.6 Unit tests with mocked LLM responses + - [ ] Test chapter extraction + - [ ] Test error handling + +--- + +### Phase 4: TTS Service + +- [ ] 4.1 Create `TTSProvider` Protocol + - [ ] `synthesize(text, voice) -> bytes` + - [ ] `synthesize_stream(text, voice) -> AsyncIterator[bytes]` + +- [ ] 4.2 Implement `OpenAITTSProvider` + - [ ] Use `openai.audio.speech.create()` + - [ ] Support voice selection (alloy, echo, fable, onyx, nova, shimmer) + - [ ] Handle streaming response + +- [ ] 4.3 Create `TTSService` 
wrapper + - [ ] Accept `TTSProvider` via DI + - [ ] Provide convenience methods + +- [ ] 4.4 Unit tests + - [ ] Test audio generation (mocked) + - [ ] Test streaming (mocked) + +--- + +### Phase 5: Processing Pipeline + +- [ ] 5.1 Create `services/processing.py` + - [ ] `ProcessingService` class + - [ ] `start_processing(edition_id) -> (Job, AudioBook)` + - [ ] `process_audiobook(job_id, audio_book_id)` - main pipeline + - [ ] `refresh_audiobook(audio_book_id) -> (Job, AudioBook)` + +- [ ] 5.2 Implement processing pipeline + - [ ] Create AudioBook record (PENDING) + - [ ] Create Job record + - [ ] Extract chapters via LLMService + - [ ] Generate audio for each chapter via TTSService + - [ ] Store audio files via StorageService + - [ ] Update Chapter records with audio paths + - [ ] Update AudioBook status (COMPLETED) + +- [ ] 5.3 Create ARQ tasks + - [ ] `process_audiobook_task(ctx, job_id, audio_book_id)` + - [ ] Progress updates at stages (20%, 50%, 80%, 100%) + - [ ] Error handling with job failure status + +- [ ] 5.4 Implement retry logic + - [ ] Exponential backoff (3 attempts) + - [ ] Partial failure handling (some chapters fail) + +- [ ] 5.5 Integration tests + - [ ] Full pipeline with mocked LLM/TTS + - [ ] Job status progression + - [ ] Error scenarios + +--- + +### Phase 6: Dependency Injection Setup + +- [ ] 6.1 Add provider factory functions + - [ ] `get_llm_service()` - returns configured LLMService + - [ ] `get_tts_service()` - returns configured TTSService + - [ ] `get_processing_service()` - returns configured ProcessingService + +- [ ] 6.2 Provider selection via config + - [ ] `LLM_PROVIDER` env var (default: "instructor") + - [ ] `TTS_PROVIDER` env var (default: "openai") + +--- + +## Completion Checklist + +- [ ] All unit tests passing +- [ ] All integration tests passing +- [ ] ARQ worker starts successfully +- [ ] Full E2E: POST /process β†’ job complete β†’ audio accessible +- [ ] Documentation updated diff --git 
a/tasks/backend-productionisation/prd-backend-productionisation.md b/tasks/backend-productionisation/prd-backend-productionisation.md new file mode 100644 index 0000000..cf04f9d --- /dev/null +++ b/tasks/backend-productionisation/prd-backend-productionisation.md @@ -0,0 +1,285 @@ +# PRD: Backend Productionisation + +**Document Version:** 1.1 +**Created:** 4 December 2025 +**Last Updated:** 4 December 2025 +**Status:** Approved for Implementation +**Timeline:** 1-2 days (aggressive) + +--- + +## 1. Introduction/Overview + +BookBytes is an application that converts physical books (via ISBN) into chapter-wise audio summaries. The current implementation is a working prototype with a monolithic architecture (`app.py` ~700 lines), synchronous processing, SQLite database, and Flask-based API. + +This PRD outlines the transformation of the backend from a junior-developer prototype into a production-ready, senior-engineer-quality SaaS backend. The focus is on **technical excellence** with JWT authentication included in this phase (OAuth deferred). + +### Problem Statement + +The current codebase has several production blockers: + +- **Monolithic architecture** makes testing and maintenance difficult +- **Synchronous processing** blocks API responses during long OpenAI/gTTS operations +- **SQLite** doesn't scale and lacks proper connection management +- **No async runtime** limits concurrency and throughput +- **Mixed error handling** leads to inconsistent API responses +- **No job tracking** means clients can't monitor long-running operations +- **Tight coupling** to external services makes testing difficult + +### Solution + +Restructure the backend using FastAPI with async patterns, PostgreSQL with SQLAlchemy 2.0 async, ARQ for background jobs, and a clean modular architecture following repository/service patterns. + +--- + +## 2. 
Goals + +| # | Goal | Success Criteria | +| --- | ------------------------- | --------------------------------------------------------------------------------------------- | +| G1 | **Modular Architecture** | Codebase split into api/, models/, services/, repositories/ with clear separation of concerns | +| G2 | **Async Runtime** | All database and HTTP operations use async/await | +| G3 | **Background Processing** | Book processing happens in worker mode with job status tracking | +| G4 | **Production Database** | PostgreSQL with async driver, migrations, and connection pooling | +| G5 | **Consistent API** | Versioned endpoints, Pydantic validation, standardized error responses | +| G6 | **Storage Abstraction** | Pluggable storage (local dev / S3 prod) without code changes | +| G7 | **Observability** | Structured JSON logging, health checks, graceful shutdown | +| G8 | **Testability** | Integration tests with mocked external services | +| G9 | **Containerization** | Docker Compose stack with API, Worker, Postgres, Redis | +| G10 | **JWT Authentication** | Protected endpoints with JWT tokens; API key mode for local development | + +--- + +## 3. 
User Stories + +### API Consumer Stories + +| ID | Story | Acceptance Criteria | +| --- | ------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------- | +| US1 | As an API consumer, I want to submit a book for processing and receive a job ID, so I can track progress without waiting | POST `/api/v1/books/process` returns `{job_id, status}` immediately | +| US2 | As an API consumer, I want to poll job status, so I know when processing is complete | GET `/api/v1/jobs/{job_id}` returns current status and progress | +| US3 | As an API consumer, I want consistent error responses, so I can handle failures programmatically | All errors return `{error: {code, message}}` format | +| US4 | As an API consumer, I want to retrieve audio files via URL, so I can stream them directly | Audio endpoints return pre-signed URLs (S3) or direct paths (local) | + +### Developer Stories + +| ID | Story | Acceptance Criteria | +| --- | --------------------------------------------------------------------------- | ------------------------------------------------------- | +| DS1 | As a developer, I want to run the full stack locally with one command | `docker-compose up` starts API, Worker, Postgres, Redis | +| DS2 | As a developer, I want to run tests with mocked external services | `pytest` runs without OpenAI/gTTS API calls | +| DS3 | As a developer, I want to add new features without touching unrelated code | Each module has single responsibility | +| DS4 | As a developer, I want database migrations, so schema changes are versioned | Alembic migrations track all schema changes | + +### Operations Stories + +| ID | Story | Acceptance Criteria | +| --- | -------------------------------------------------------------------------------------- | -------------------------------------------------- | +| OS1 | As an operator, I want health check endpoints, so I can 
configure load balancer probes | `/health/live` and `/health/ready` endpoints exist | +| OS2 | As an operator, I want structured logs, so I can aggregate and search them | All logs output as JSON with correlation IDs | +| OS3 | As an operator, I want graceful shutdown, so in-flight requests complete | SIGTERM allows 30s for cleanup | + +--- + +## 4. Functional Requirements + +### Phase 1: Project Structure & Configuration + +| # | Requirement | +| ----- | ------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| FR1.1 | Create new folder structure under `src/bookbytes/` with modules: `api/`, `models/`, `schemas/`, `repositories/`, `services/`, `workers/`, `core/`, `storage/` | +| FR1.2 | Implement Pydantic Settings class in `config.py` with validation for all environment variables | +| FR1.3 | Configure FastAPI application factory in `main.py` with lifespan events | +| FR1.4 | Set up dependency injection using FastAPI's `Depends()` pattern | +| FR1.5 | Create `pyproject.toml` for modern Python packaging with all dependencies | +| FR1.6 | Create `.env.example` with all required environment variables documented | + +### Phase 2: Database Layer + +| # | Requirement | +| ----- | ---------------------------------------------------------------------------- | +| FR2.1 | Create SQLAlchemy 2.0 async models for `Book`, `BookIsbn`, `Chapter`, `Job`, `User` in `models/` | +| FR2.2 | **Book model**: UUID primary key (not ISBN). Group editions by language, keep latest edition per language. Fields: `id` (UUID PK), `title`, `author`, `language` (ISO 639-1), `edition`, `publisher`, `pages`, `publish_date`, `cover_url`, `created_at`, `updated_at` | +| FR2.3 | **BookIsbn model**: Normalized ISBN storage (1:N with Book). Fields: `id` (UUID PK), `book_id` (FK), `isbn` (VARCHAR 13, unique), `isbn_type` (enum: isbn10, isbn13), `created_at`. 
Index on `isbn` for fast lookups | +| FR2.4 | **Chapter model**: Reference Book by UUID. Fields: `id` (UUID PK), `book_id` (FK), `chapter_number`, `title`, `summary`, `audio_file_path`, `audio_url`, `word_count`, `created_at`, `updated_at`. Unique constraint on `(book_id, chapter_number)` | +| FR2.5 | Implement async database engine and session factory in `core/database.py` | +| FR2.6 | Configure Alembic for async migrations with `alembic/` directory | +| FR2.7 | Create initial migration with new schema (fresh start, no SQLite migration) | +| FR2.8 | Implement repository classes with async CRUD operations | +| FR2.9 | Add connection pooling configuration (min 2, max 10 connections) | +| FR2.10 | Support SQLite (aiosqlite) for local development via config flag | + +### Phase 3: Background Job Queue + +| # | Requirement | +| ----- | -------------------------------------------------------------------------------------- | +| FR3.1 | Set up ARQ with Redis connection in `workers/settings.py` | +| FR3.2 | Create `process_book_task` that orchestrates the full pipeline | +| FR3.3 | Implement `Job` model with status enum: `pending`, `processing`, `completed`, `failed` | +| FR3.4 | Add job progress tracking (percentage, current step) | +| FR3.5 | Create job status API endpoint: GET `/api/v1/jobs/{job_id}` | +| FR3.6 | Implement job result storage (success data or error details) | +| FR3.7 | Add worker entry point: `arq src.bookbytes.workers.settings.WorkerSettings` | + +### Phase 4: API Layer & Error Handling + +| # | Requirement | +| ----- | ------------------------------------------------------------- | +| FR4.1 | Create versioned router structure: `/api/v1/` | +| FR4.2 | Migrate all Flask endpoints to FastAPI with async handlers | +| FR4.3 | Implement Pydantic request/response schemas for all endpoints | +| FR4.4 | Create custom exception hierarchy in `core/exceptions.py` | +| FR4.5 | Add global exception handlers for consistent error responses | +| FR4.6 | Configure 
OpenAPI documentation with examples | + | FR4.7 | Add request ID middleware for correlation | + +### Phase 5: Storage & External Services + +| # | Requirement | +| ----- | ---------------------------------------------------------------------------- | +| FR5.1 | Create abstract `StorageBackend` interface in `storage/base.py` | +| FR5.2 | Implement `LocalStorage` for development (saves to `data/audio/`) | +| FR5.3 | Implement `S3Storage` for production with pre-signed URLs | +| FR5.4 | Add storage backend selection via config: `STORAGE_BACKEND=local \| s3` | +| FR5.5 | Wrap OpenAI calls with retry logic (3 retries, exponential backoff) | +| FR5.6 | Wrap gTTS calls with retry logic (3 retries) | +| FR5.7 | Add configurable timeouts for all external HTTP calls (30s default) | +| FR5.8 | Create service classes: `OpenAIService`, `TTSService`, `BookMetadataService` | + +### Phase 6: JWT Authentication + +| # | Requirement | +| ----- | ------------------------------------------------------------------- | +| FR6.1 | Implement JWT token structure with `sub`, `exp`, `iat`, `scope` claims | +| FR6.2 | Create `core/security.py` with JWT encode/decode utilities, password hashing | +| FR6.3 | Create `User` model with fields: `id` (UUID), `email`, `hashed_password`, `is_active`, `created_at` | +| FR6.4 | Implement auth middleware with `get_current_user` dependency | +| FR6.5 | Create auth endpoints: POST `/api/v1/auth/register`, POST `/api/v1/auth/login`, GET `/api/v1/auth/me` | +| FR6.6 | Add `AUTH_MODE` config: `jwt` (production) or `api_key` (local dev bypass) | +| FR6.7 | For `api_key` mode: Accept `X-API-Key` header with configurable static key for local development | +| FR6.8 | Protect all `/api/v1/books/*` and `/api/v1/jobs/*` endpoints with auth | +| FR6.9 | Document OAuth integration points for future implementation (design only) | + +### Phase 7: Observability & Deployment + +| # | Requirement | +| ----- | 
----------------------------------------------------------------------- | + | FR7.1 | Configure structlog for JSON logging in `core/logging.py` | + | FR7.2 | Replace ad-hoc logging with structured log calls throughout the codebase | + | FR7.3 | Add correlation ID to all log entries | + | FR7.4 | Create `/health/live` endpoint (always returns 200 if app is running) | + | FR7.5 | Create `/health/ready` endpoint (checks DB and Redis connectivity) | + | FR7.6 | Implement graceful shutdown with 30s timeout | + | FR7.7 | Create multi-stage `Dockerfile` for optimized image | + | FR7.8 | Create `docker-compose.yml` with services: api, worker, postgres, redis | + | FR7.9 | Add volume mounts for local development data persistence | + +--- + +## 5. Non-Goals (Out of Scope) + +| # | Excluded Item | Reason | +| --- | ----------------------------------- | ---------------------------------------------- | +| NG1 | OAuth/Social login | Deferred to future phase; JWT auth only for now | +| NG2 | Billing and subscription management | Deferred to future phase | +| NG3 | CDN configuration | S3 storage only; CDN is infrastructure concern | +| NG4 | Kubernetes deployment | Docker Compose is target | +| NG5 | Load testing | Deferred; project is at internal release stage | +| NG6 | CLI tool updates | Deferred to a later phase | +| NG7 | Frontend changes | Backend-only scope | +| NG8 | Rate limiting implementation | Design only; implement later | +| NG9 | User management UI | API-only; auth endpoints only | +| NG10 | Email verification flow | Deferred; users active by default | +| NG11 | Password reset flow | Deferred to future phase | + +--- + +## 6. 
Technical Considerations + +### Technology Stack + +| Component | Technology | Rationale | +| ------------- | -------------------- | ------------------------------------------------ | +| Web Framework | FastAPI 0.109+ | Async-native, auto OpenAPI, Pydantic integration | +| Async Runtime | uvicorn + uvloop | High-performance ASGI server | +| Database | PostgreSQL 16 | Production-grade, async support | +| ORM | SQLAlchemy 2.0 async | Modern async patterns, type hints | +| Migrations | Alembic | SQLAlchemy-native, versioned migrations | +| Job Queue | ARQ | Async-native, Redis-based, simple | +| Validation | Pydantic v2 | FastAPI integration, performance | +| HTTP Client | httpx | Async HTTP client | +| Logging | structlog | Structured JSON logging | +| Storage | boto3/aioboto3 | S3-compatible storage | + +--- + +## 7. Success Metrics + +| Metric | Target | Measurement | +| ----------------------- | -------------------------- | ------------------------- | +| API Response Time (p95) | < 200ms for sync endpoints | Logs / APM | +| Job Processing Time | < 5min per book | Job completion timestamps | +| Test Coverage | > 70% on critical paths | pytest-cov | +| Error Rate | < 1% | Log aggregation | +| Build Time | < 2min | CI/CD pipeline | +| Container Image Size | < 500MB | Docker | + +--- + +## 8. Open Questions + +| # | Question | Status | Decision | +| --- | ----------------------------------------------------------- | ------- | ----------------------------------------- | +| Q1 | Should we keep backward compatibility with Flask endpoints? | Decided | No - clean break with v1 API | +| Q2 | How to handle in-flight jobs during deployment? | Decided | Graceful shutdown with proper status tracking; worker waits for current job before terminating | +| Q3 | Should audio files have expiring URLs? | Decided | No - audio files managed via Books/Chapters lifecycle; no URL expiration | +| Q4 | Max concurrent jobs per worker? 
| Decided | 5 concurrent jobs (configurable via `WORKER_MAX_JOBS` env var) | + | Q5 | How to handle Book identity across editions? | Decided | UUID primary key; ISBNs stored in a normalized `BookIsbn` table (per FR2.3); group by language; keep latest edition per language | + +--- + +## 9. Implementation Phases + +### Phase 1: Project Structure & Configuration + +**Duration:** 2-3 hours +**Files:** `pyproject.toml`, `src/bookbytes/config.py`, `src/bookbytes/main.py`, folder structure +**Deliverable:** Runnable FastAPI app with config system + +### Phase 2: Database Layer + +**Duration:** 2-3 hours +**Files:** `models/`, `repositories/`, `core/database.py`, `alembic/` +**Deliverable:** Async database with migrations + +### Phase 3: Background Job Queue + +**Duration:** 2-3 hours +**Files:** `workers/`, `models/job.py`, `api/v1/jobs.py` +**Deliverable:** ARQ worker processing books asynchronously + +### Phase 4: API Layer & Error Handling + +**Duration:** 2-3 hours +**Files:** `api/v1/`, `schemas/`, `core/exceptions.py` +**Deliverable:** Full API with validation and error handling + +### Phase 5: Storage & External Services + +**Duration:** 1-2 hours +**Files:** `storage/`, `services/` +**Deliverable:** Pluggable storage and resilient external calls + +### Phase 6: JWT Authentication + +**Duration:** 1-2 hours +**Files:** `core/security.py`, `models/user.py`, `api/v1/auth.py`, `schemas/auth.py` +**Deliverable:** Working JWT auth with user registration/login; API key bypass for local dev + +### Phase 7: Observability & Deployment + +**Duration:** 1-2 hours +**Files:** `core/logging.py`, `docker/`, health endpoints +**Deliverable:** Production-ready Docker stack + +--- + +_This PRD will be used to generate detailed task lists for each phase._ diff --git a/tasks/backend-productionisation/tasks-backend-productionisation.md b/tasks/backend-productionisation/tasks-backend-productionisation.md new file mode 100644 index 0000000..f5b70d9 --- /dev/null +++ 
b/tasks/backend-productionisation/tasks-backend-productionisation.md @@ -0,0 +1,262 @@ +# Tasks: Backend Productionisation + +**PRD Reference:** [prd-backend-productionisation.md](./prd-backend-productionisation.md) +**Technical Reference:** [technical-considerations-backend-productionisation.md](./technical-considerations-backend-productionisation.md) +**Branch:** `feat/productionise-and-saasify` + +--- + +## Relevant Files + +### Phase 1: Project Structure & Configuration + +- `pyproject.toml` - Modern Python packaging with all dependencies +- `src/bookbytes/__init__.py` - Package initialization +- `src/bookbytes/main.py` - FastAPI application factory with lifespan events +- `src/bookbytes/config.py` - Pydantic Settings class for environment validation +- `src/bookbytes/dependencies.py` - FastAPI dependency injection container +- `.env.example` - Documented environment variables template + +### Incremental Phase (2): Database Layer + +- `src/bookbytes/core/__init__.py` - Core module initialization +- `src/bookbytes/core/database.py` - Async engine, session factory, connection pooling +- `src/bookbytes/models/__init__.py` - Models package with exports +- `src/bookbytes/models/base.py` - SQLAlchemy base model with common mixins +- `src/bookbytes/models/user.py` - User model for authentication +- `src/bookbytes/models/book.py` - Book and BookIsbn models +- `src/bookbytes/models/chapter.py` - Chapter model +- `src/bookbytes/models/job.py` - Job model with status enum +- `src/bookbytes/repositories/__init__.py` - Repositories package +- `src/bookbytes/repositories/base.py` - Generic async repository base class +- `src/bookbytes/repositories/user.py` - User repository +- `src/bookbytes/repositories/book.py` - Book and BookIsbn repositories +- `src/bookbytes/repositories/chapter.py` - Chapter repository +- `src/bookbytes/repositories/job.py` - Job repository +- `alembic.ini` - Alembic configuration +- `alembic/env.py` - Async migration environment +- 
`alembic/script.py.mako` - Migration template +- `alembic/versions/001_initial_schema.py` - Initial database migration + +### Phase 3: Background Job Queue + +- `src/bookbytes/workers/__init__.py` - Workers package +- `src/bookbytes/workers/settings.py` - ARQ worker configuration +- `src/bookbytes/workers/tasks.py` - Job task definitions (process_book_task) +- `src/bookbytes/api/v1/jobs.py` - Job status API endpoints + +### Phase 4: API Layer & Error Handling + +- `src/bookbytes/core/exceptions.py` - Custom exception hierarchy +- `src/bookbytes/api/__init__.py` - API package +- `src/bookbytes/api/v1/__init__.py` - V1 API package +- `src/bookbytes/api/v1/router.py` - Main v1 router combining all endpoints +- `src/bookbytes/api/v1/books.py` - Book endpoints (process, list, get, chapters) +- `src/bookbytes/api/v1/chapters.py` - Chapter endpoints (get, audio) +- `src/bookbytes/api/v1/health.py` - Health check endpoints +- `src/bookbytes/schemas/__init__.py` - Schemas package +- `src/bookbytes/schemas/common.py` - Shared schemas (ErrorResponse, PaginatedResponse) +- `src/bookbytes/schemas/book.py` - Book request/response schemas +- `src/bookbytes/schemas/chapter.py` - Chapter schemas +- `src/bookbytes/schemas/job.py` - Job schemas + +### Phase 5: Storage & External Services + +- `src/bookbytes/storage/__init__.py` - Storage package +- `src/bookbytes/storage/base.py` - Abstract StorageBackend interface +- `src/bookbytes/storage/local.py` - LocalStorage implementation +- `src/bookbytes/storage/s3.py` - S3Storage implementation with pre-signed URLs +- `src/bookbytes/services/__init__.py` - Services package +- `src/bookbytes/services/book_service.py` - Book processing orchestration +- `src/bookbytes/services/metadata_service.py` - Open Library API client +- `src/bookbytes/services/openai_service.py` - OpenAI API wrapper with retries +- `src/bookbytes/services/tts_service.py` - gTTS wrapper with retries + +### Phase 6: JWT Authentication + +- `src/bookbytes/core/security.py` 
- JWT encode/decode, password hashing +- `src/bookbytes/api/v1/auth.py` - Auth endpoints (register, login, me) +- `src/bookbytes/schemas/auth.py` - Auth request/response schemas + +### Phase 7: Observability & Deployment + +- `src/bookbytes/core/logging.py` - Structlog configuration with correlation IDs +- `docker/Dockerfile` - Multi-stage production Dockerfile +- `docker/docker-compose.yml` - Full stack: api, worker, postgres, redis + +### Tests + +- `tests/__init__.py` - Tests package +- `tests/conftest.py` - Pytest fixtures (async client, mock services, test DB) +- `tests/mocks/__init__.py` - Mock responses package +- `tests/mocks/openai_responses.py` - OpenAI API mock responses +- `tests/mocks/openlibrary_responses.py` - Open Library mock responses +- `tests/integration/__init__.py` - Integration tests package +- `tests/integration/test_auth_api.py` - Auth endpoint tests +- `tests/integration/test_books_api.py` - Books endpoint tests +- `tests/integration/test_jobs_api.py` - Jobs endpoint tests + +### Notes + +- This task list follows the 7-phase structure defined in the PRD +- All source code lives under `src/bookbytes/` for proper Python packaging +- Tests use pytest-asyncio for async test support +- Use `pytest tests/` to run all tests, or `pytest tests/integration/test_books_api.py` for specific tests +- Use `alembic upgrade head` to run migrations +- Use `docker-compose -f docker/docker-compose.yml up` to start the full stack +- Environment variables are validated at startup via Pydantic Settings + +--- + +## Instructions for Completing Tasks + +**IMPORTANT:** As you complete each task, you must check it off in this markdown file by changing `- [ ]` to `- [x]`. This helps track progress and ensures you don't skip any steps. + +Example: + +- `- [ ] 1.1 Read file` β†’ `- [x] 1.1 Read file` (after completing) + +Update the file after completing each sub-task, not just after completing an entire parent task. 
+ +--- + +## Tasks + +- [x] 1.0 **Phase 1: Project Structure & Configuration** + + - [x] 1.1 Create the new folder structure: `src/bookbytes/` with subdirectories `api/`, `api/v1/`, `models/`, `schemas/`, `repositories/`, `services/`, `workers/`, `core/`, `storage/` + - [x] 1.2 Create `__init__.py` files in each package directory to make them proper Python packages + - [x] 1.3 Create `pyproject.toml` with project metadata, all dependencies from technical-considerations doc, and optional dev dependencies (pytest, ruff, etc.) + - [x] 1.4 Create `src/bookbytes/config.py` with Pydantic `Settings` class that validates all environment variables: `APP_ENV`, `DEBUG`, `LOG_LEVEL`, `DATABASE_URL`, `REDIS_URL`, `STORAGE_BACKEND`, `AUTH_MODE`, `JWT_SECRET_KEY`, `OPENAI_API_KEY`, etc. + - [x] 1.5 Create `src/bookbytes/main.py` with FastAPI application factory using `@asynccontextmanager` lifespan for startup/shutdown events (initialize DB, Redis connections) + - [x] 1.6 Create `src/bookbytes/dependencies.py` with dependency injection functions: `get_settings()`, `get_db_session()`, `get_redis()` using FastAPI's `Depends()` pattern + - [x] 1.7 Create `.env.example` with all environment variables documented with comments explaining each one + - [x] 1.8 Verify the app starts: `uvicorn src.bookbytes.main:app --reload` should show FastAPI running (even with placeholder routes) + +- [x] 2.0 **Phase 2: Database Foundation** + + - [x] 2.1 Create `src/bookbytes/core/database.py` with async SQLAlchemy engine using `create_async_engine()`, `async_sessionmaker`, and connection pool settings (pool_size=2, max_overflow=8) + - [x] 2.2 Create `src/bookbytes/models/base.py` with `Base = declarative_base()` and a `TimestampMixin` class that adds `created_at` and `updated_at` columns with auto-update triggers + - [x] 2.3 Create `src/bookbytes/models/__init__.py` that exports Base (will be extended as models are added in subsequent phases) + - [x] 2.4 Initialize Alembic: Run `alembic init alembic` 
and configure `alembic.ini` with async driver support + - [x] 2.5 Update `alembic/env.py` to use async migrations with `run_async_migrations()` function, import Base from models, use `target_metadata = Base.metadata` + - [x] 2.6 Create `src/bookbytes/repositories/base.py` with generic `BaseRepository[T]` class providing async `get_by_id()`, `get_all()`, `create()`, `update()`, `delete()` methods + - [x] 2.7 Add database session dependency to `dependencies.py`: async generator `get_db_session()` that yields session and handles commit/rollback + - [x] 2.8 Update `main.py` lifespan to initialize and close database connection + - [x] 2.9 Update `/health/ready` to check database connectivity with `SELECT 1` + - [x] 2.10 Create `tests/integration/test_database.py` to verify database connection and session lifecycle + - [ ] 2.11 Test database setup: Start postgres via docker-compose, verify connection works + +- [ ] 3.0 **Phase 3: Audio Books Library** β†’ [tasks-audio-books-library.md](./audio-books-library/tasks-audio-books-library.md) + + > **Design Doc:** [design-doc.md](./audio-books-library/design-doc.md) + + This phase has been moved to a dedicated module. 
Key components: + + - 3.A: Data Models (Work, Edition, BookProvider, AudioBook, Chapter, APICache) + - 3.B: Cache Service (Redis) + - 3.C: OpenLibrary Service (API client with caching) + - 3.D: Library Service (Provider-agnostic book management) + - 3.E: API Endpoints (Search, works, isbn) + - 3.F: Background Jobs (Audiobook generation) + - 3.G: Testing + +- [ ] 3.1 **Phase 3.1: Audio Books Pipeline** β†’ [tasks-audio-books-pipeline.md](./audio-books-pipeline/tasks-audio-books-pipeline.md) + + > **PRD:** [prd-audio-books-pipeline.md](./audio-books-pipeline/prd-audio-books-pipeline.md) + > **Design Doc:** [design-doc.md](./audio-books-pipeline/design-doc.md) + + LLM + TTS processing pipeline for chapter summaries: + + - Processing API endpoints (process, refresh, job status) + - Job infrastructure (ARQ, Job model) + - LLM Service (Protocol-based, Instructor implementation) + - TTS Service (Protocol-based, OpenAI implementation) + - ProcessingService orchestration + +- [ ] 4.0 **Phase 4: API Layer & Error Handling** + + - [x] 4.1 Create `src/bookbytes/core/exceptions.py` with exception hierarchy: `BookBytesError(Exception)` base class with `code` and `message` attributes, then `BookNotFoundError`, `ChapterNotFoundError`, `JobNotFoundError`, `ISBNNotFoundError`, `AuthenticationError`, `AuthorizationError` _(DONE - moved to auxiliary foundation)_ + - [x] 4.2 Create `src/bookbytes/schemas/common.py` with shared schemas: `ErrorDetail(code: str, message: str, request_id: str | None)`, `ErrorResponse(error: ErrorDetail)`, `PaginatedResponse[T](items: list[T], total: int, page: int, size: int)` _(DONE - moved to auxiliary foundation)_ + - [x] 4.3 Add global exception handlers in `main.py`: register handlers for `BookBytesError` (return 400 with ErrorResponse), `HTTPException` (pass through), `Exception` (log and return 500) _(DONE - integrated with 4.1)_ + - [x] 4.4 Create request ID middleware in `main.py`: Use `starlette.middleware` to add `X-Request-ID` header (generate 
UUID if not present), store in request state for logging _(DONE - completed in Phase 1 and logging setup)_ + - [ ] 4.5 Create `src/bookbytes/models/book.py` with `Book` model: `id` (UUID, PK), `title`, `author`, `language` (default 'en'), `edition`, `publisher`, `pages`, `publish_date`, `cover_url`, timestamps. Add relationship to `BookIsbn` and `Chapter` + - [ ] 4.6 Create `src/bookbytes/models/book.py` with `BookIsbn` model in same file: `id` (UUID, PK), `book_id` (FK to books), `isbn` (unique), `isbn_type` (Enum: isbn10, isbn13), `created_at`. Add index on `isbn` + - [ ] 4.7 Create `src/bookbytes/models/chapter.py` with `Chapter` model: `id` (UUID, PK), `book_id` (FK), `chapter_number`, `title`, `summary`, `audio_file_path`, `audio_url`, `word_count`, timestamps. Add unique constraint on `(book_id, chapter_number)` + - [ ] 4.8 Create `src/bookbytes/repositories/book.py` with `BookRepository` (add `get_by_language()`, `get_latest_by_title_language()`) and `BookIsbnRepository` (add `get_by_isbn()`, `get_isbns_for_book()`) + - [ ] 4.9 Create `src/bookbytes/repositories/chapter.py` with `ChapterRepository` adding `get_by_book_id()`, `get_by_book_and_number()` + - [ ] 4.10 Generate migration for Book, BookIsbn, Chapter models: `alembic revision --autogenerate -m "add_book_chapter_models"` + - [ ] 4.11 Create `src/bookbytes/schemas/book.py` with schemas: `BookCreate(isbn: str)`, `BookIsbnResponse(isbn, isbn_type)`, `BookResponse(id, title, author, language, ..., isbns: list[BookIsbnResponse])`, `BookListResponse(books: list[BookResponse])`, `ProcessBookRequest(isbn: str)`, `ProcessBookResponse(job_id, status)` + - [ ] 4.12 Create `src/bookbytes/api/v1/books.py` with endpoints: `POST /books/process` (enqueue processing, return job_id), `GET /books` (list all books), `GET /books/{book_id}` (get by UUID), `GET /books/isbn/{isbn}` (get by ISBN), `GET /books/{book_id}/chapters` (list chapters) + - [ ] 4.13 Create `src/bookbytes/api/v1/chapters.py` with endpoints: `GET 
/chapters/{chapter_id}` (get chapter details), `GET /chapters/{chapter_id}/audio` (return audio URL or redirect) + - [ ] 4.14 Create `src/bookbytes/api/v1/health.py` with endpoints: `GET /health/live` (always returns 200 `{"status": "ok"}`), `GET /health/ready` (checks DB and Redis, returns checks object) + - [ ] 4.15 Create `src/bookbytes/api/v1/router.py` that combines all routers using `APIRouter()` and `include_router()` with appropriate prefixes and tags + - [ ] 4.16 Include v1 router in `main.py` under `/api/v1` prefix + - [ ] 4.17 Configure OpenAPI in `main.py`: Set title, description, version, add example responses to endpoints using `responses` parameter + - [ ] 4.18 Verify API documentation: Access `/docs` and `/redoc` endpoints, ensure all endpoints are documented with request/response examples + - [ ] 4.19 Create `tests/integration/test_books_api.py` with tests: process book (returns job_id), list books, get book by id, get book by isbn (404 for unknown), get chapters for book + +- [ ] 5.0 **Phase 5: Storage & External Services** + + > **Note:** Storage tasks (5.1-5.5) are shared infrastructure used by the Audio Books Pipeline. + > Service tasks (5.6-5.14) have been **superseded** by Protocol-based services in Phase 3.1. 
+ + #### Storage Infrastructure (Active) + + - [ ] 5.1 Create `src/bookbytes/storage/base.py` with abstract `StorageBackend` class defining interface: `async save(key: str, data: bytes) -> str`, `async get_url(key: str) -> str`, `async delete(key: str) -> bool`, `async exists(key: str) -> bool` + - [ ] 5.2 Create `src/bookbytes/storage/local.py` with `LocalStorage(StorageBackend)` implementation: saves files to `LOCAL_STORAGE_PATH`, returns file:// URLs for local dev, uses aiofiles for async I/O + - [ ] 5.3 Create `src/bookbytes/storage/s3.py` with `S3Storage(StorageBackend)` implementation: uses aioboto3 for async S3 operations, generates pre-signed URLs with configurable expiry + - [ ] 5.4 Create `src/bookbytes/storage/__init__.py` with factory function `get_storage_backend(settings) -> StorageBackend` that returns LocalStorage or S3Storage based on `STORAGE_BACKEND` config + - [ ] 5.5 Add storage backend dependency in `dependencies.py`: `get_storage()` that uses the factory function + + #### External Services (Superseded by Phase 3.1) + + The following tasks have been replaced by Protocol-based services in [audio-books-pipeline](./audio-books-pipeline/tasks-audio-books-pipeline.md): + + - [x] 5.6 ~~`BookMetadataService`~~ β†’ Replaced by `OpenLibraryService` in Phase 3.C + - [x] 5.7 ~~Retry logic for metadata~~ β†’ Included in OpenLibraryService + - [x] 5.8 ~~`OpenAIService`~~ β†’ Replaced by `LLMProvider` protocol + `InstructorLLMProvider` in Phase 3.1 + - [x] 5.9 ~~Retry logic for OpenAI~~ β†’ Included in LLMProvider implementations + - [x] 5.10 ~~`TTSService` (gTTS)~~ β†’ Replaced by `TTSProvider` protocol + `OpenAITTSProvider` in Phase 3.1 + - [x] 5.11 ~~Retry logic for TTS~~ β†’ Included in TTSProvider implementations + - [x] 5.12 ~~`BookService` orchestration~~ β†’ Replaced by `ProcessingService` in Phase 3.1 + - [x] 5.13 ~~Configurable timeouts~~ β†’ Configured in service implementations + - [x] 5.14 ~~Wire up BookService~~ β†’ Wired in ProcessingService 
+ - [x] 5.15 ~~Service dependencies~~ → Configured via DI in Phase 3.1 + +- [ ] 6.0 **Phase 6: JWT Authentication** + + - [ ] 6.1 Create `src/bookbytes/models/user.py` with `User` model: `id` (UUID, PK), `email` (unique), `hashed_password`, `is_active` (default True), timestamps + - [ ] 6.2 Create `src/bookbytes/repositories/user.py` with `UserRepository` extending base, adding `get_by_email()` method + - [ ] 6.3 Generate migration for User model: `alembic revision --autogenerate -m "add_user_model"` + - [ ] 6.4 Create `src/bookbytes/core/security.py` with password utilities: `hash_password(password: str) -> str` using passlib bcrypt, `verify_password(plain: str, hashed: str) -> bool` + - [ ] 6.5 Add JWT utilities in `security.py`: `create_access_token(data: dict, expires_delta: timedelta | None) -> str` using python-jose, `decode_access_token(token: str) -> dict` with validation + - [ ] 6.6 Define JWT payload structure in `security.py`: `TokenPayload` dataclass with `sub` (user_id), `exp`, `iat`, `scope` (default: "access") + - [ ] 6.7 Create `src/bookbytes/schemas/auth.py` with schemas: `UserCreate(email: EmailStr, password: str)`, `UserLogin(email: EmailStr, password: str)`, `UserResponse(id, email, is_active, created_at)`, `TokenResponse(access_token: str, token_type: str = "bearer")` + - [ ] 6.8 Create `src/bookbytes/api/v1/auth.py` with endpoints: `POST /auth/register` (create user, return UserResponse), `POST /auth/login` (validate credentials, return TokenResponse), `GET /auth/me` (return current user) + - [ ] 6.9 Create auth dependency in `dependencies.py`: `get_current_user(token: str = Depends(oauth2_scheme))` that decodes JWT, fetches user from DB, raises 401 if invalid + - [ ] 6.10 Create optional auth dependency: `get_current_user_optional()` that returns None if no token, for endpoints that work with or without auth + - [ ] 6.11 Implement API key bypass for local dev: In `get_current_user()`, if `AUTH_MODE=api_key`, check `X-API-Key` header against `API_KEY` config, return a mock/system user + - [ ] 6.12 Add `oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/v1/auth/login")` in dependencies for Swagger UI integration + - [ ] 6.13 Protect book and job endpoints: Add `current_user: User = Depends(get_current_user)` to all endpoints in `books.py` and `jobs.py` + - [ ] 6.14 Filter jobs by user: Update `GET /jobs` to only return jobs where `job.user_id == current_user.id` + - [ ] 6.15 Include auth router in v1 router with `/auth` prefix + - [ ] 6.16 Test auth flow: Verify register → login → access protected endpoint with token works; verify 401 without token + - [ ] 6.17 Create `tests/integration/test_auth_api.py` with tests: register new user, login with valid credentials, login with invalid credentials (401), access /me with token, access /me without token (401) + +- [ ] 7.0 **Phase 7: Observability & Deployment** + - [x] 7.1 Create `src/bookbytes/core/logging.py` with structlog configuration: Configure `structlog.configure()` with processors for JSON output (prod) or console (dev), add timestamp, log level, logger name _(DONE - moved to auxiliary foundation)_ + - [x] 7.2 Add correlation ID processor in logging: Extract `request_id` from context, add to all log entries _(DONE - moved to auxiliary foundation)_ + - [x] 7.3 Create logging middleware in `main.py`: Log request start (method, path, request_id), log request end (status_code, duration_ms), bind request_id to structlog context _(DONE - moved to auxiliary foundation)_ + - [x] ~~7.4 Replace all `print()` and basic logging calls throughout codebase with structlog~~ _(OBSOLETE - logging established from start, no print statements to replace)_ + - [ ] 7.5 Add structured logging to worker tasks: Log job start, each step transition, completion/failure with job_id context + - [ ] 7.6 Enhance health endpoints: `/health/ready` should check DB connection (`SELECT 1`), Redis ping, return `{"status": "ok", "checks": {"database": "ok", "redis": "ok"}}` or
appropriate error status + - [ ] 7.7 Implement graceful shutdown in `main.py` lifespan: On shutdown, wait for in-flight requests (30s timeout), close DB connections, close Redis connections + - [ ] 7.8 Add graceful shutdown to worker: Configure ARQ's `on_shutdown` hook to wait for current job completion before exiting + - [x] 7.9 Create `docker/Dockerfile` with multi-stage build: Stage 1 (builder) installs dependencies, Stage 2 (runtime) copies only needed files, uses slim Python image, runs as non-root user _(DONE - moved to auxiliary foundation)_ + - [x] 7.10 Create `docker/docker-compose.yml` with services: `api` (uvicorn), `worker` (arq), `postgres` (postgres:16-alpine), `redis` (redis:7-alpine) with health checks, volumes, and proper depends_on conditions _(DONE - moved to auxiliary foundation)_ + - [x] 7.11 Add volume mounts in docker-compose: `postgres-data` for database persistence, `redis-data` for Redis persistence, `audio-data` for local audio file storage _(DONE - moved to auxiliary foundation)_ + - [x] 7.12 Configure environment variables in docker-compose: Use `${VAR:-default}` syntax for secrets, set development defaults for local use _(DONE - moved to auxiliary foundation)_ + - [ ] 7.13 Test full stack: Run `docker-compose up --build`, verify all services start, health checks pass, can register user and process book + - [x] 7.14 Create `tests/conftest.py` with pytest fixtures: `async_client` (TestClient with async support), `test_db_session` (isolated test database), `mock_openai_service`, `mock_tts_service`, `authenticated_client` (client with valid JWT) _(DONE - moved to auxiliary foundation)_ + - [ ] 7.15 Verify all tests pass: Run `pytest tests/ -v` and ensure all tests pass with mocked external services diff --git a/tasks/backend-productionisation/technical-considerations-backend-productionisation.md b/tasks/backend-productionisation/technical-considerations-backend-productionisation.md new file mode 100644 index 0000000..aa78d86 --- /dev/null
+++ b/tasks/backend-productionisation/technical-considerations-backend-productionisation.md @@ -0,0 +1,479 @@ +### Dependencies (pyproject.toml) + +```toml +[project] +dependencies = [ + # Core + "fastapi>=0.109.0", + "uvicorn[standard]>=0.27.0", + "python-multipart>=0.0.6", + + # Async + "httpx>=0.26.0", + "anyio>=4.2.0", + + # Database + "sqlalchemy[asyncio]>=2.0.25", + "asyncpg>=0.29.0", + "aiosqlite>=0.19.0", + "alembic>=1.13.0", + + # Background Jobs + "arq>=0.25.0", + "redis>=5.0.0", + + # Configuration + "pydantic>=2.5.0", + "pydantic-settings>=2.1.0", + "python-dotenv>=1.0.0", + + # Auth (JWT) + "python-jose[cryptography]>=3.3.0", + "passlib[bcrypt]>=1.7.4", + + # Storage + "boto3>=1.34.0", + "aioboto3>=12.0.0", + + # External APIs + "openai>=1.0.0", + "gtts>=2.5.0", + + # Resilience + "tenacity>=8.2.0", + + # Observability + "structlog>=24.1.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=7.4.0", + "pytest-asyncio>=0.23.0", + "pytest-cov>=4.1.0", + "respx>=0.20.0", + "fakeredis>=2.20.0", + "ruff>=0.1.0", +] +``` + +### Folder Structure + +``` +bookbytes/ +├── alembic/ +│ ├── versions/ +│ ├── env.py +│ └── script.py.mako +├── src/ +│ └── bookbytes/ +│ ├── __init__.py +│ ├── main.py # FastAPI app factory +│ ├── config.py # Pydantic settings +│ ├── dependencies.py # DI container +│ │ +│ ├── api/ +│ │ ├── __init__.py +│ │ └── v1/ +│ │ ├── __init__.py +│ │ ├── router.py # Main v1 router +│ │ ├── auth.py # Auth endpoints (register, login, me) +│ │ ├── books.py # Book endpoints +│ │ ├── chapters.py # Chapter endpoints +│ │ ├── jobs.py # Job status endpoints +│ │ └── health.py # Health checks +│ │ +│ ├── models/ +│ │ ├── __init__.py +│ │ ├── base.py # Base model, mixins +│ │ ├── user.py # User model +│ │ ├── book.py # Book + BookIsbn models +│
│ ├── chapter.py +│ │ └── job.py +│ │ +│ ├── schemas/ +│ │ ├── __init__.py +│ │ ├── common.py # Shared schemas +│ │ ├── auth.py # Auth request/response schemas +│ │ ├── book.py +│ │ ├── chapter.py +│ │ └── job.py +│ │ +│ ├── repositories/ +│ │ ├── __init__.py +│ │ ├── base.py # Generic repository +│ │ ├── user.py # User repository +│ │ ├── book.py # Book + BookIsbn repositories +│ │ ├── chapter.py +│ │ └── job.py +│ │ +│ ├── services/ +│ │ ├── __init__.py +│ │ ├── book_service.py # Book processing orchestration +│ │ ├── openai_service.py # OpenAI API wrapper +│ │ ├── tts_service.py # gTTS wrapper +│ │ └── metadata_service.py # Open Library API +│ │ +│ ├── workers/ +│ │ ├── __init__.py +│ │ ├── settings.py # ARQ worker config +│ │ └── tasks.py # Job definitions +│ │ +│ ├── core/ +│ │ ├── __init__.py +│ │ ├── database.py # Async engine & sessions +│ │ ├── exceptions.py # Exception hierarchy +│ │ ├── logging.py # Structured logging +│ │ └── security.py # JWT utilities +│ │ +│ └── storage/ +│ ├── __init__.py +│ ├── base.py # Abstract interface +│ ├── local.py # Local filesystem +│ └── s3.py # S3 implementation +│ +├── tests/ +│ ├── conftest.py +│ ├── integration/ +│ │ ├── test_books_api.py +│ │ └── test_jobs_api.py +│ └── mocks/ +│ ├── openai_responses.py +│ └── openlibrary_responses.py +│ +├── docker/ +│ ├── Dockerfile +│ └── docker-compose.yml +│ +├── alembic.ini +├── pyproject.toml +├── .env.example +└── README.md +``` + +### Database Schema + +```sql +-- ============================================ +-- USERS (for JWT auth) +--
============================================ +CREATE TABLE users ( + id UUID PRIMARY KEY DEFAULT gen_random_uuid(), + email VARCHAR(255) NOT NULL UNIQUE, + hashed_password VARCHAR(255) NOT NULL, + is_active BOOLEAN DEFAULT TRUE, + created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW() +); + +CREATE INDEX idx_users_email ON users(email); + +-- ============================================ +-- BOOKS (UUID primary key, grouped by language) +-- ============================================ +CREATE TABLE books ( + id UUID PRIMARY KEY DEFAULT gen_random_uuid(), + title VARCHAR(500) NOT NULL, + author VARCHAR(500) NOT NULL, + language VARCHAR(10) NOT NULL DEFAULT 'en', -- ISO 639-1 code + edition VARCHAR(100), -- e.g., "1st", "Revised" + publisher VARCHAR(255), + pages INTEGER, + publish_date VARCHAR(50), + cover_url VARCHAR(1000), + created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW() +); + +CREATE INDEX idx_books_language ON books(language); +CREATE INDEX idx_books_title ON books(title); + +-- ============================================ +-- BOOK_ISBNS (normalized 1:N relationship) +-- ============================================ +CREATE TYPE isbn_type AS ENUM ('isbn10', 'isbn13'); + +CREATE TABLE book_isbns ( + id UUID PRIMARY KEY DEFAULT gen_random_uuid(), + book_id UUID NOT NULL REFERENCES books(id) ON DELETE CASCADE, + isbn VARCHAR(13) NOT NULL UNIQUE, + isbn_type isbn_type NOT NULL, + created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW() +); + +CREATE INDEX idx_book_isbns_isbn ON book_isbns(isbn); +CREATE INDEX idx_book_isbns_book_id ON book_isbns(book_id); + +-- ============================================ +-- CHAPTERS (references Book by UUID) +-- ============================================ +CREATE TABLE chapters ( + id UUID PRIMARY KEY DEFAULT gen_random_uuid(), + book_id UUID NOT NULL REFERENCES books(id) ON DELETE CASCADE, + chapter_number INTEGER NOT NULL, + title 
VARCHAR(500) NOT NULL, + summary TEXT NOT NULL, + audio_file_path VARCHAR(1000), + audio_url VARCHAR(1000), + word_count INTEGER, + created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + UNIQUE(book_id, chapter_number) +); + +CREATE INDEX idx_chapters_book_id ON chapters(book_id); + +-- ============================================ +-- JOBS (background processing) +-- ============================================ +CREATE TYPE job_status AS ENUM ('pending', 'processing', 'completed', 'failed'); +CREATE TYPE job_type AS ENUM ('process_book'); + +CREATE TABLE jobs ( + id UUID PRIMARY KEY DEFAULT gen_random_uuid(), + user_id UUID REFERENCES users(id) ON DELETE SET NULL, -- nullable for system jobs + type job_type NOT NULL, + status job_status NOT NULL DEFAULT 'pending', + book_id UUID REFERENCES books(id) ON DELETE SET NULL, -- reference to book being processed + isbn VARCHAR(13), -- input ISBN used for processing + error TEXT, + progress INTEGER DEFAULT 0 CHECK (progress >= 0 AND progress <= 100), + current_step VARCHAR(100), + created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(), + started_at TIMESTAMP WITH TIME ZONE, + completed_at TIMESTAMP WITH TIME ZONE +); + +CREATE INDEX idx_jobs_status ON jobs(status); +CREATE INDEX idx_jobs_user_id ON jobs(user_id); +CREATE INDEX idx_jobs_book_id ON jobs(book_id); +``` + +### API Endpoints + +#### Auth Endpoints (Public) + +| Method | Endpoint | Description | Request | Response | +| ------ | ----------------------- | ----------------- | ------------------- | ---------------------------- | +| POST | `/api/v1/auth/register` | Register new user | `{email, password}` | `{user: {...}}` | +| POST | `/api/v1/auth/login` | Login user | `{email, password}` | `{access_token, token_type}` | +| GET | `/api/v1/auth/me` | Get current user | - | `{user: {...}}` | + +#### Protected Endpoints (Require JWT or API Key) + +| Method | Endpoint | 
Description | Request | Response | +| ------ | ---------------------------------- | --------------------- | ---------------- | ---------------------------------------------- | +| POST | `/api/v1/books/process` | Start book processing | `{isbn: string}` | `{job_id, status}` | +| GET | `/api/v1/books` | List all books | - | `{books: [...]}` | +| GET | `/api/v1/books/{book_id}` | Get book by UUID | - | `{book: {...}, isbns: [...], chapters: [...]}` | +| GET | `/api/v1/books/isbn/{isbn}` | Get book by ISBN | - | `{book: {...}}` | +| GET | `/api/v1/books/{book_id}/chapters` | Get book chapters | - | `{chapters: [...]}` | +| GET | `/api/v1/chapters/{id}` | Get chapter details | - | `{chapter: {...}}` | +| GET | `/api/v1/chapters/{id}/audio` | Get audio URL | - | `{url: string}` or redirect | +| GET | `/api/v1/jobs` | List user's jobs | - | `{jobs: [...]}` | +| GET | `/api/v1/jobs/{job_id}` | Get job status | - | `{job: {...}}` | + +#### Health Endpoints (Public) + +| Method | Endpoint | Description | Request | Response | +| ------ | --------------- | --------------- | ------- | ------------------------------- | +| GET | `/health/live` | Liveness probe | - | `{status: "ok"}` | +| GET | `/health/ready` | Readiness probe | - | `{status: "ok", checks: {...}}` | + +### Error Response Format + +```json +{ + "error": { + "code": "BOOK_NOT_FOUND", + "message": "Book with ISBN 1234567890 not found", + "request_id": "abc-123-def" + } +} +``` + +### Environment Variables + +```bash +# Application +APP_ENV=development|staging|production +DEBUG=true|false +LOG_LEVEL=DEBUG|INFO|WARNING|ERROR +LOG_FORMAT=json|console + +# Server +HOST=0.0.0.0 +PORT=8000 + +# Database +DATABASE_URL=postgresql+asyncpg://user:pass@localhost:5432/bookbytes +DATABASE_POOL_MIN=2 +DATABASE_POOL_MAX=10 + +# Redis +REDIS_URL=redis://localhost:6379/0 + +# Storage +STORAGE_BACKEND=local|s3 +LOCAL_STORAGE_PATH=./data/audio +S3_BUCKET=bookbytes-audio +S3_REGION=us-east-1 +AWS_ACCESS_KEY_ID=... 
+AWS_SECRET_ACCESS_KEY=... + +# External APIs +OPENAI_API_KEY=... +OPENAI_MODEL=gpt-3.5-turbo +OPENAI_TIMEOUT=30 + +# Auth +AUTH_MODE=jwt|api_key # jwt for production, api_key for local dev +JWT_SECRET_KEY=... # Required for jwt mode +JWT_ALGORITHM=HS256 +JWT_EXPIRE_MINUTES=30 +API_KEY=dev-api-key-12345 # Only used when AUTH_MODE=api_key + +# Worker +WORKER_MAX_JOBS=5 # Max concurrent jobs per worker +``` + +--- + +## Appendix A: Migration from Current Codebase + +### Mapping: Old → New + +| Old Location | New Location | +| ------------------------------------------ | ------------------------------------------------ | +| `app.py:Book` dataclass | `src/bookbytes/models/book.py` (Book + BookIsbn) | +| `app.py:Chapter` dataclass | `src/bookbytes/models/chapter.py` | +| `app.py:BookBytesApp._init_database()` | `alembic/versions/001_initial.py` | +| `app.py:BookBytesApp.fetch_book_details()` | `src/bookbytes/services/metadata_service.py` | +| `app.py:BookBytesApp.get_chapter_list()` | `src/bookbytes/services/openai_service.py` | +| `app.py:BookBytesApp.text_to_speech()` | `src/bookbytes/services/tts_service.py` | +| `app.py:BookBytesApp.save_book()` | `src/bookbytes/repositories/book.py` | +| `app.py:process_book_api()` | `src/bookbytes/api/v1/books.py` | +| `logger.py` | `src/bookbytes/core/logging.py` | +| N/A (new) | `src/bookbytes/models/user.py` | +| N/A (new) | `src/bookbytes/api/v1/auth.py` | +| N/A (new) | `src/bookbytes/core/security.py` | + +### Preserved Functionality + +All existing functionality will be preserved: + +- Book metadata fetching from Open Library +- Chapter extraction via OpenAI +- Summary generation via OpenAI +- Audio generation via gTTS +- Book/Chapter CRUD operations + +### New Functionality + +- JWT authentication with user registration/login +- API key mode for local development bypass +- Background job processing with status tracking +- Multiple ISBNs per book (normalized) +- Book grouping by language + +### Breaking Changes + +- API
endpoints move to `/api/v1/` prefix +- Book identified by UUID, not ISBN (ISBNs are now secondary lookup) +- Processing returns job ID instead of blocking +- Audio served via URL instead of direct file serving +- All protected endpoints require authentication (JWT or API key) +- Response format standardized + +--- + +## Appendix B: Docker Compose Reference + +```yaml +version: "3.8" + +services: + api: + build: + context: . + dockerfile: docker/Dockerfile + command: uvicorn src.bookbytes.main:app --host 0.0.0.0 --port 8000 + ports: + - "8000:8000" + environment: + - APP_ENV=development + - DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@postgres:5432/bookbytes + - REDIS_URL=redis://redis:6379/0 + - STORAGE_BACKEND=local + - LOCAL_STORAGE_PATH=/data/audio + - AUTH_MODE=api_key + - API_KEY=dev-api-key-12345 + - JWT_SECRET_KEY=${JWT_SECRET_KEY:-dev-secret-key-change-in-prod} + volumes: + - audio-data:/data/audio + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_started + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:8000/health/live"] + interval: 10s + timeout: 5s + retries: 3 + + worker: + build: + context: . 
+ dockerfile: docker/Dockerfile + command: arq src.bookbytes.workers.settings.WorkerSettings + environment: + - APP_ENV=development + - DATABASE_URL=postgresql+asyncpg://bookbytes:bookbytes@postgres:5432/bookbytes + - REDIS_URL=redis://redis:6379/0 + - STORAGE_BACKEND=local + - LOCAL_STORAGE_PATH=/data/audio + - OPENAI_API_KEY=${OPENAI_API_KEY} + - WORKER_MAX_JOBS=5 + volumes: + - audio-data:/data/audio + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_started + + postgres: + image: postgres:16-alpine + environment: + - POSTGRES_USER=bookbytes + - POSTGRES_PASSWORD=bookbytes + - POSTGRES_DB=bookbytes + ports: + - "5432:5432" + volumes: + - postgres-data:/var/lib/postgresql/data + healthcheck: + test: ["CMD-SHELL", "pg_isready -U bookbytes"] + interval: 5s + timeout: 5s + retries: 5 + + redis: + image: redis:7-alpine + ports: + - "6379:6379" + volumes: + - redis-data:/data + healthcheck: + test: ["CMD", "redis-cli", "ping"] + interval: 5s + timeout: 5s + retries: 5 + +volumes: + postgres-data: + redis-data: + audio-data: +``` diff --git a/tasks/knowledge/multi-provider-integration-patterns.md b/tasks/knowledge/multi-provider-integration-patterns.md new file mode 100644 index 0000000..5fe0044 --- /dev/null +++ b/tasks/knowledge/multi-provider-integration-patterns.md @@ -0,0 +1,115 @@ +# Multi-Provider Data Integration Patterns + +> **Status:** Research document for future multi-provider phase +> **Created:** 2024-12-07 +> **Context:** BookBytes Audio Books Library - caching strategy for external book APIs + +--- + +## Summary + +When integrating multiple external data sources (OpenLibrary, Google Books, Goodreads), use: + +1. **Canonical Data Model (CDM)** - unified internal representation +2. **Anti-Corruption Layer (ACL)** - adapter per provider + +--- + +## Industry Patterns + +### Canonical Data Model (CDM) + +A standardized, application-agnostic data representation that acts as a "universal translator." 
+ +**Key Benefits:** + +- `2n` mappings instead of `n²` (each provider to canonical, canonical to consumer) +- Data consistency across providers +- Easy to add new providers (just add adapter) +- Cache layer stays provider-agnostic + +``` +OpenLibrary API ──▶ ┌──────────────────┐ ◀── Google Books + │ Canonical Model │ + │ (BookSearchResult│ + │ Edition, Work) │ + └────────┬─────────┘ + ▼ + ┌──────────────────┐ + │ Redis Cache │ + │ (Canonical JSON) │ + └──────────────────┘ +``` + +### Anti-Corruption Layer (ACL) + +From Domain-Driven Design (DDD): a translation layer that protects the domain from external system quirks. + +```python +class OpenLibraryAdapter: + """Transforms OpenLibrary responses to canonical models.""" + + def to_search_result(self, raw: dict) -> BookSearchResult: + return BookSearchResult( + title=raw["title"], + authors=self._extract_authors(raw), + source_provider="openlibrary", + ... + ) +``` + +--- + +## Cache Key Strategy + +### Current (Phase 1 - Single Provider) + +Provider-agnostic keys: +| Key | Description | +|-----|-------------| +| `search:{hash}` | Search results | +| `isbn:{isbn}` | Book by ISBN | +| `work:{identifier}` | Work details | + +### Future (Multi-Provider Phase) + +Same keys, but cached data includes metadata: + +```python +{ + "data": { ...
}, # Canonical model + "metadata": { + "source_provider": "openlibrary", + "fetched_at": "2024-12-07T...", + } +} +``` + +--- + +## Trade-offs Analysis + +| Trade-off | Decision | Rationale | +| ------------------- | ----------------------------------------- | ----------------------------------------- | +| First provider wins | ✅ Accepted | Book metadata is similar across providers | +| Transform overhead | ✅ Accepted | Happens once at fetch time | +| Provider debugging | Include `source_provider` in cached value | | +| Schema evolution | Version canonical models carefully | | + +--- + +## Deferred to Multi-Provider Phase + +- [ ] Provider-specific adapters (ACL) +- [ ] Provider rotation/selection strategy +- [ ] Provider-specific rate limiting +- [ ] Provider health checks and failover +- [ ] Provider-specific cache invalidation patterns + +--- + +## References + +- Eric Evans, Domain-Driven Design (Anti-Corruption Layer) +- Enterprise Integration Patterns (Canonical Data Model) +- Microsoft Azure Architecture: [Anti-Corruption Layer](https://learn.microsoft.com/en-us/azure/architecture/patterns/anti-corruption-layer)