diff --git a/README.md b/README.md
index b6759da86..f5b0a91df 100644
--- a/README.md
+++ b/README.md
@@ -160,6 +160,7 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
   - [Docker](#docker)
 - [Environment Variables](#environment-variables)
 - [Setup](#setup)
+  - [Supported AI Providers](#supported-ai-providers)
   - [Per-Pattern Model Mapping](#per-pattern-model-mapping)
   - [Add aliases for all patterns](#add-aliases-for-all-patterns)
     - [Save your files in markdown using aliases](#save-your-files-in-markdown-using-aliases)
@@ -172,12 +173,15 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
   - [Fish Completion](#fish-completion)
 - [Usage](#usage)
   - [Debug Levels](#debug-levels)
+  - [Dry Run Mode](#dry-run-mode)
   - [Extensions](#extensions)
   - [REST API Server](#rest-api-server)
+    - [Ollama Compatibility Mode](#ollama-compatibility-mode)
 - [Our approach to prompting](#our-approach-to-prompting)
   - [Examples](#examples)
 - [Just use the Patterns](#just-use-the-patterns)
 - [Prompt Strategies](#prompt-strategies)
+  - [Available Strategies](#available-strategies)
 - [Custom Patterns](#custom-patterns)
   - [Setting Up Custom Patterns](#setting-up-custom-patterns)
   - [Using Custom Patterns](#using-custom-patterns)
@@ -186,6 +190,7 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
   - [`to_pdf`](#to_pdf)
     - [`to_pdf` Installation](#to_pdf-installation)
   - [`code2context`](#code2context)
+  - [`generate_changelog`](#generate_changelog)
 - [pbpaste](#pbpaste)
 - [Web Interface (Fabric Web App)](#web-interface-fabric-web-app)
 - [Meta](#meta)
@@ -349,6 +354,43 @@ fabric --setup
 
 If everything works you are good to go.
 
+### Supported AI Providers
+
+Fabric supports a wide range of AI providers:
+
+**Native Integrations:**
+
+- OpenAI
+- Anthropic (Claude)
+- Google Gemini
+- Ollama (local models)
+- Azure OpenAI
+- Amazon Bedrock
+- Vertex AI
+- LM Studio
+- Perplexity
+
+**OpenAI-Compatible Providers:**
+
+- Abacus
+- AIML
+- Cerebras
+- DeepSeek
+- GitHub Models
+- GrokAI
+- Groq
+- Langdock
+- LiteLLM
+- MiniMax
+- Mistral
+- OpenRouter
+- SiliconCloud
+- Together
+- Venice AI
+- Z AI
+
+Run `fabric --setup` to configure your preferred provider(s), or use `fabric --listvendors` to see all available vendors.
+
 ### Per-Pattern Model Mapping
 
 You can configure specific models for individual patterns using environment variables
@@ -720,6 +762,16 @@ Use the `--debug` flag to control runtime logging:
 - `2`: detailed debugging
 - `3`: trace level
 
+### Dry Run Mode
+
+Use `--dry-run` to preview what would be sent to the AI model without making an API call:
+
+```bash
+echo "test input" | fabric --dry-run -p summarize
+```
+
+This is useful for debugging patterns, checking prompt construction, and verifying input formatting before spending API credits.
+
 ### Extensions
 
 Fabric supports extensions that can be called within patterns. See the [Extension Guide](internal/plugins/template/Examples/README.md) for complete documentation.
@@ -745,6 +797,22 @@ The server provides endpoints for:
 
 For complete endpoint documentation, authentication setup, and usage examples, see [REST API Documentation](docs/rest-api.md).
 
+### Ollama Compatibility Mode
+
+Fabric can serve as a drop-in replacement for Ollama by exposing Ollama-compatible API endpoints. Start the server with:
+
+```bash
+fabric --serve --serveOllama
+```
+
+This enables the following Ollama-compatible endpoints:
+
+- `GET /api/tags` - List available patterns as models
+- `POST /api/chat` - Chat completions
+- `GET /api/version` - Server version
+
+Applications configured to use the Ollama API can point to your Fabric server instead, allowing you to use any of Fabric's supported AI providers through the Ollama interface. Patterns appear as models (e.g., `summarize:latest`).
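+
+As a quick smoke test, you can exercise these endpoints with `curl`. The sketch below assumes the server is listening on `localhost:8080`; adjust the address and port to match your serve settings. The request body follows Ollama's standard chat format:
+
+```bash
+# List Fabric patterns exposed as Ollama-style models
+curl http://localhost:8080/api/tags
+
+# Send input to a pattern through the Ollama-compatible chat endpoint
+curl http://localhost:8080/api/chat -d '{
+  "model": "summarize:latest",
+  "messages": [{"role": "user", "content": "Fabric is an open-source framework for augmenting humans using AI."}]
+}'
+```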
+
 ## Our approach to prompting
 
 Fabric _Patterns_ are different than most prompts you'll see.
@@ -825,6 +893,34 @@ LLM in the chat session.
 Use `fabric -S` and select the option to install the strategies
 in your `~/.config/fabric` directory.
 
+#### Available Strategies
+
+Fabric includes several prompt strategies:
+
+- `cot` - Chain-of-Thought: Step-by-step reasoning
+- `cod` - Chain-of-Draft: Iterative drafting with minimal notes (5 words max per step)
+- `tot` - Tree-of-Thought: Generate multiple reasoning paths and select the best one
+- `aot` - Atom-of-Thought: Break problems into their smallest independent atomic sub-problems
+- `ltm` - Least-to-Most: Solve problems from the easiest to the hardest sub-problem
+- `self-consistent` - Self-Consistency: Multiple reasoning paths with consensus
+- `self-refine` - Self-Refinement: Answer, critique, and refine
+- `reflexion` - Reflexion: Answer, critique briefly, and provide a refined answer
+- `standard` - Standard: Direct answer without explanation
+
+Use the `--strategy` flag to apply a strategy:
+
+```bash
+echo "Analyze this code" | fabric --strategy cot -p analyze_code
+```
+
+List all available strategies with:
+
+```bash
+fabric --liststrategies
+```
+
+Strategies are stored as JSON files in `~/.config/fabric/strategies/`. See the default strategies for the format specification.
+
 ## Custom Patterns
 
 You may want to use Fabric to create your own custom Patterns—but not share them with others. No problem!
@@ -918,6 +1014,24 @@ Install it first using:
 go install github.com/danielmiessler/fabric/cmd/code2context@latest
 ```
 
+### `generate_changelog`
+
+`generate_changelog` generates changelogs from git commit history and GitHub pull requests. It walks through your repository's git history, extracts PR information, and produces a well-formatted markdown changelog.
+
+View the available options with:
+
+```bash
+generate_changelog --help
+```
+
+Features include SQLite caching for fast incremental updates, GitHub GraphQL API integration for efficient PR fetching, and optional AI-enhanced summaries using Fabric.
+
+Install it using:
+
+```bash
+go install github.com/danielmiessler/fabric/cmd/generate_changelog@latest
+```
+
+See the [generate_changelog README](./cmd/generate_changelog/README.md) for detailed usage and options.
+
 ## pbpaste
 
 The [examples](#examples) use the macOS program `pbpaste` to paste content from the clipboard to pipe into `fabric` as the input. `pbpaste` is not available on Windows or Linux, but there are alternatives.
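+
+As a sketch of those alternatives (assuming `xclip` or `wl-clipboard` is installed on Linux; the right command depends on your platform and clipboard environment):
+
+```bash
+# Linux (X11): read the clipboard with xclip
+xclip -selection clipboard -o | fabric -p summarize
+
+# Linux (Wayland): read the clipboard with wl-paste
+wl-paste | fabric -p summarize
+
+# Windows (PowerShell): read the clipboard with Get-Clipboard
+Get-Clipboard | fabric -p summarize
+```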
diff --git a/cmd/generate_changelog/incoming/1925.txt b/cmd/generate_changelog/incoming/1925.txt
new file mode 100644
index 000000000..ed114f891
--- /dev/null
+++ b/cmd/generate_changelog/incoming/1925.txt
@@ -0,0 +1,7 @@
+### PR [#1925](https://github.com/danielmiessler/Fabric/pull/1925) by [ksylvan](https://github.com/ksylvan): docs: update README to document new AI providers and features
+
+- Docs: update README to document new AI providers and features
+- List supported native and OpenAI-compatible AI provider integrations
+- Document dry run mode for previewing prompt construction
+- Explain Ollama compatibility mode for exposing API endpoints
+- Detail available prompt strategies like chain-of-thought and reflexion