- [Docker](#docker)
- [Environment Variables](#environment-variables)
- [Setup](#setup)
- [Supported AI Providers](#supported-ai-providers)
- [Per-Pattern Model Mapping](#per-pattern-model-mapping)
- [Add aliases for all patterns](#add-aliases-for-all-patterns)
- [Save your files in markdown using aliases](#save-your-files-in-markdown-using-aliases)
- [Fish Completion](#fish-completion)
- [Usage](#usage)
- [Debug Levels](#debug-levels)
- [Dry Run Mode](#dry-run-mode)
- [Extensions](#extensions)
- [REST API Server](#rest-api-server)
- [Ollama Compatibility Mode](#ollama-compatibility-mode)
- [Our approach to prompting](#our-approach-to-prompting)
- [Examples](#examples)
- [Just use the Patterns](#just-use-the-patterns)
- [Prompt Strategies](#prompt-strategies)
- [Available Strategies](#available-strategies)
- [Custom Patterns](#custom-patterns)
- [Setting Up Custom Patterns](#setting-up-custom-patterns)
- [Using Custom Patterns](#using-custom-patterns)
- [`to_pdf`](#to_pdf)
- [`to_pdf` Installation](#to_pdf-installation)
- [`code2context`](#code2context)
- [`generate_changelog`](#generate_changelog)
- [pbpaste](#pbpaste)
- [Web Interface (Fabric Web App)](#web-interface-fabric-web-app)
- [Meta](#meta)
```bash
fabric --setup
```

If everything works you are good to go.

### Supported AI Providers

Fabric supports a wide range of AI providers:

**Native Integrations:**

- OpenAI
- Anthropic (Claude)
- Google Gemini
- Ollama (local models)
- Azure OpenAI
- Amazon Bedrock
- Vertex AI
- LM Studio
- Perplexity

**OpenAI-Compatible Providers:**

- Abacus
- AIML
- Cerebras
- DeepSeek
- GitHub Models
- GrokAI
- Groq
- Langdock
- LiteLLM
- MiniMax
- Mistral
- OpenRouter
- SiliconCloud
- Together
- Venice AI
- Z AI

Run `fabric --setup` to configure your preferred provider(s), or use `fabric --listvendors` to see all available vendors.

### Per-Pattern Model Mapping

You can configure specific models for individual patterns using environment variables.
Use the `--debug` flag to control runtime logging:
- `2`: detailed debugging
- `3`: trace level

### Dry Run Mode

Use `--dry-run` to preview what would be sent to the AI model without making an API call:

```bash
echo "test input" | fabric --dry-run -p summarize
```

This is useful for debugging patterns, checking prompt construction, and verifying input formatting before using API credits.

### Extensions

Fabric supports extensions that can be called within patterns. See the [Extension Guide](internal/plugins/template/Examples/README.md) for complete documentation.
### REST API Server

For complete endpoint documentation, authentication setup, and usage examples, see [REST API Documentation](docs/rest-api.md).

### Ollama Compatibility Mode

Fabric can serve as a drop-in replacement for Ollama by exposing Ollama-compatible API endpoints. Start the server with:

```bash
fabric --serve --serveOllama
```

This enables the following Ollama-compatible endpoints:

- `GET /api/tags` - List available patterns as models
- `POST /api/chat` - Chat completions
- `GET /api/version` - Server version

Applications configured to use the Ollama API can point to your Fabric server instead, allowing you to use any of Fabric's supported AI providers through the Ollama interface. Patterns appear as models (e.g., `summarize:latest`).
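
As a quick sketch, you can exercise these endpoints with `curl` (the `localhost:8080` address here is an assumption; adjust the host and port to match your server configuration):

```shell
# List Fabric patterns exposed as Ollama-style "models"
curl http://localhost:8080/api/tags

# Send a chat request to a pattern, using Ollama's request shape
curl http://localhost:8080/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "summarize:latest",
    "messages": [
      {"role": "user", "content": "Fabric is an open-source framework for augmenting humans using AI."}
    ]
  }'
```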

## Our approach to prompting

Fabric _Patterns_ are different from most prompts you'll see.
### Prompt Strategies

Use `fabric -S` and select the option to install the strategies in your `~/.config/fabric` directory.

#### Available Strategies

Fabric includes several prompt strategies:

- `cot` - Chain-of-Thought: Step-by-step reasoning
- `cod` - Chain-of-Draft: Iterative drafting with minimal notes (5 words max per step)
- `tot` - Tree-of-Thought: Generate multiple reasoning paths and select the best one
- `aot` - Atom-of-Thought: Break problems into smallest independent atomic sub-problems
- `ltm` - Least-to-Most: Solve problems from easiest to hardest sub-problems
- `self-consistent` - Self-Consistency: Multiple reasoning paths with consensus
- `self-refine` - Self-Refinement: Answer, critique, and refine
- `reflexion` - Reflexion: Answer, critique briefly, and provide refined answer
- `standard` - Standard: Direct answer without explanation

Use the `--strategy` flag to apply a strategy:

```bash
echo "Analyze this code" | fabric --strategy cot -p analyze_code
```

List all available strategies with:

```bash
fabric --liststrategies
```

Strategies are stored as JSON files in `~/.config/fabric/strategies/`. See the default strategies for the format specification.
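
As a purely hypothetical sketch of such a file (the field names here are assumptions, not the documented schema; consult the bundled default strategies in `~/.config/fabric/strategies/` for the actual format), a strategy pairs a description with the instruction text applied to the prompt:

```json
{
  "description": "Chain-of-Thought: Step-by-step reasoning",
  "prompt": "Think through the problem step by step before giving your final answer."
}
```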

## Custom Patterns

You may want to use Fabric to create your own custom Patterns—but not share them with others. No problem!
Install it first using:

```bash
go install github.com/danielmiessler/fabric/cmd/code2context@latest
```

### `generate_changelog`

`generate_changelog` generates changelogs from git commit history and GitHub pull requests. It walks through your repository's git history, extracts PR information, and produces well-formatted markdown changelogs.

```bash
generate_changelog --help
```

Features include SQLite caching for fast incremental updates, GitHub GraphQL API integration for efficient PR fetching, and optional AI-enhanced summaries using Fabric.

Install it using:

```bash
go install github.com/danielmiessler/fabric/cmd/generate_changelog@latest
```

See the [generate_changelog README](./cmd/generate_changelog/README.md) for detailed usage and options.

## pbpaste

The [examples](#examples) use the macOS program `pbpaste` to paste content from the clipboard to pipe into `fabric` as the input. `pbpaste` is not available on Windows or Linux, but there are alternatives.
The PR also adds `cmd/generate_changelog/incoming/1925.txt`:
### PR [#1925](https://github.com/danielmiessler/Fabric/pull/1925) by [ksylvan](https://github.com/ksylvan): docs: update README to document new AI providers and features

- Docs: update README to document new AI providers and features
- List supported native and OpenAI-compatible AI provider integrations
- Document dry run mode for previewing prompt construction
- Explain Ollama compatibility mode for exposing API endpoints
- Detail available prompt strategies like chain-of-thought and reflexion