Opsy is an intelligent command-line assistant designed for Site Reliability Engineers (SREs), DevOps professionals, and platform engineers. It uses AI to help you navigate operational challenges, troubleshoot issues, and automate routine workflows. Opsy integrates with your existing tools and provides contextual assistance to make your daily operations more efficient.
Opsy uses a "tools-as-agents" architecture where each tool functions as a specialized AI agent with expertise in its domain (Kubernetes, Git, AWS, etc.). The main Opsy agent orchestrates these specialized agents, breaking down complex tasks and delegating them to the appropriate tools. This approach provides domain-specific expertise, improved safety through tool-specific validation, better context management for multi-step operations, and modular extensibility for adding new capabilities.
Warning
Opsy is currently in early development. While the core functionality works well, some features are still being refined. We recommend using it in non-production environments for now. We welcome your feedback to help improve Opsy.
The demo above shows Opsy handling this complex task:

> Analyze the pods in the current namespace. If there are any pods that are failing, I need you to analyze the reason they are failing. Then, create a single Jira task named `Kubernetes issues` in the `OPSY` project reporting the issue. The task description must contain your analysis of the failing pods. In addition, I want to have backups for our deployments: extract the deployment manifests and push them into a new private `backup` repo in `datolabs-io-sandbox`.
Click on the screenshot to watch the full demonstration.
Opsy uses Anthropic's Claude AI models to provide intelligent assistance. You'll need an Anthropic API key:
- Create an account at Anthropic's website
- Generate an API key from your account dashboard
- Set the API key in your Opsy configuration (see Configuration section) or as an environment variable:
```sh
export ANTHROPIC_API_KEY=your_api_key_here
```
Opsy works with standard command-line tools. While none are strictly required to run Opsy, having them installed expands its capabilities:
- Git - Version control
- GitHub CLI - GitHub integration
- kubectl - Kubernetes management
- AWS CLI - AWS management
- Helm - Kubernetes package manager
- Google Cloud CLI (gcloud) - Google Cloud management
- Jira CLI - Jira automation
Opsy adapts to your environment and only uses tools that are installed on your system.
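To see at a glance which of these optional tools Opsy can pick up, a quick check such as the one below works in most POSIX shells (the binary names are the usual ones for each CLI and are an assumption; adjust them if your installs differ):

```sh
# Report which of the optional CLIs are available on this machine.
for tool in git gh kubectl aws helm gcloud jira; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: not installed"
  fi
done
```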
For users with Go 1.24 or later:
```sh
go install github.com/datolabs-io/opsy/cmd/opsy@latest
```
Ensure your Go bin directory is in your PATH.
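If the `opsy` command isn't found after installation, the Go bin directory is probably missing from your `PATH`. For a default Go setup (no custom `GOBIN`), adding it looks like this:

```sh
# Add Go's default install location to PATH for the current shell session;
# put this in your shell profile to make it permanent.
export PATH="$PATH:$(go env GOPATH)/bin"
```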
For macOS and Linux users with Homebrew:
```sh
brew tap datolabs-io/opsy https://github.com/datolabs-io/opsy
brew install datolabs-io/opsy/opsy
```
Each release includes binaries for various platforms:
- Download the appropriate binary for your operating system
- Make it executable (Unix-based systems): `chmod +x opsy`
- Move it to a directory in your `PATH`: `mv opsy /usr/local/bin/` (or another directory in your `PATH`); see the combined example below
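Putting those steps together, a manual install looks roughly like the sketch below. The release asset name is a placeholder: check the project's Releases page for the exact file name for your operating system and architecture.

```sh
# Download a release binary; <os> and <arch> are placeholders (e.g. linux_amd64,
# darwin_arm64) and the actual asset name may differ between releases.
curl -L -o opsy "https://github.com/datolabs-io/opsy/releases/latest/download/opsy_<os>_<arch>"

# Make it executable and move it onto your PATH.
chmod +x opsy
sudo mv opsy /usr/local/bin/
```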
Opsy is simple to use. Just describe what you want to do in plain language, and Opsy will handle the rest.
```sh
opsy 'Your task description here'
```
For example:
```sh
# Repository management
opsy 'Create a new private repository in datolabs-io organization named backup'

# Kubernetes troubleshooting
opsy 'Check why pods in the production namespace are crashing'

# Log analysis
opsy 'Find errors in the application logs from the last hour'
```
Opsy interprets your instructions, builds a plan, and executes the necessary actions to complete your task—no additional input required.
Opsy is configured via a YAML file located at `~/.opsy/config.yaml`:
```yaml
# UI configuration
ui:
  # Theme for the UI (default: "default")
  theme: default

# Logging configuration
logging:
  # Path to the log file (default: "~/.opsy/log.log")
  path: ~/.opsy/log.log
  # Logging level: debug, info, warn, error (default: "info")
  level: info

# Anthropic API configuration
anthropic:
  # Your Anthropic API key (required)
  api_key: your_api_key_here
  # Model to use (default: "claude-3-7-sonnet-latest")
  model: claude-3-7-sonnet-latest
  # Temperature for generation (default: 0.5)
  temperature: 0.5
  # Maximum tokens to generate (default: 1024)
  max_tokens: 1024

# Tools configuration
tools:
  # Maximum duration in seconds for a tool to execute (default: 120)
  timeout: 120
  # Exec tool configuration
  exec:
    # Timeout for exec tool (0 means use global timeout) (default: 0)
    timeout: 0
    # Shell to use for execution (default: "/bin/bash")
    shell: /bin/bash
```
You can also set configuration using environment variables with the prefix `OPSY_` followed by the configuration path in uppercase with underscores:
```sh
# Set the logging level
export OPSY_LOGGING_LEVEL=debug

# Set the tools timeout
export OPSY_TOOLS_TIMEOUT=180
```
The Anthropic API key can also be set via `ANTHROPIC_API_KEY` (without the `OPSY_` prefix).
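The same mapping should apply to the other keys in the configuration file above. For example (variable names derived from the documented prefix rule, so verify them against your version):

```sh
# anthropic.model -> OPSY_ANTHROPIC_MODEL
export OPSY_ANTHROPIC_MODEL=claude-3-7-sonnet-latest

# tools.exec.shell -> OPSY_TOOLS_EXEC_SHELL
export OPSY_TOOLS_EXEC_SHELL=/bin/zsh
```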
We welcome contributions to Opsy! The project is designed to be easily extended.
To contribute:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Please update tests as appropriate and follow the existing coding style.
Here's how you can extend Opsy's capabilities:
System prompts in `./assets/prompts` define how Opsy understands and responds to user tasks:
The agent system prompt (`assets/prompts/agent_system.tmpl`) guides Opsy's overall behavior, establishing its identity as an AI assistant for SREs and DevOps professionals and defining the format for execution plans.
The tool system prompt (`assets/prompts/tool_system.tmpl`) defines how Opsy interacts with external tools, ensuring interactions are safe, effective, and follow best practices.
The tool user prompt (`assets/prompts/tool_user.tmpl`) defines the format for requesting tool execution, maintaining consistency in how tools are invoked.
To contribute a new prompt or modify an existing one, add it to the repository and submit a pull request.
Tool definitions in `assets/tools/` allow Opsy to interact with various systems and services:
```yaml
---
display_name: Tool Name
executable: command-name
description: Description of what the tool does
inputs:
  parameter1:
    type: string
    description: Description of the first parameter
    default: "default-value" # Optional default value
    examples:
      - "example1"
      - "example2"
    optional: false # Whether this parameter is required
rules:
  - 'Rule 1 for using this tool'
  - 'Rule 2 for using this tool'
```
Theme definitions in `assets/themes/` control Opsy's visual appearance:
```yaml
base:
  base00: "#1A1B26" # Primary background
  base01: "#24283B" # Secondary background
  base02: "#292E42" # Borders and dividers
  base03: "#565F89" # Muted text
  base04: "#A9B1D6" # Primary text
accent:
  accent0: "#FF9E64" # Command text
  accent1: "#9ECE6A" # Agent messages
  accent2: "#7AA2F7" # Tool output
```
- Charm for their TUI libraries
- Anthropic for their Go SDK for Claude AI models
- Viper for configuration management
- Various Go libraries for schema validation, data structures, and YAML parsing
- The Go community for excellent tooling and support
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.