This repository provides Apache Airflow operators for managing Restic backups. Each operator runs Restic in a Docker container, letting you integrate backup operations directly into your Airflow DAGs.
- Docker-based Restic operators for Airflow
- Support for S3 and local repositories
- Built-in notification system for task/DAG success and failure
- Configurable backup retention policies
- Repository health checking capabilities
- `ResticInitOperator`: Initialize a new Restic repository
- `ResticBackupOperator`: Create backups with configurable tags and paths
- `ResticForgetAndPruneOperator`: Manage backup retention and cleanup (see the retention sketch below)
- `ResticCheckOperator`: Verify repository integrity
- `ResticUnlockOperator`: Remove stale repository locks
- `ResticPruneOperator`: Clean up unused data
- `ResticRepositoryExistsOperator`: Check if a repository exists
A minimal backup task looks like this:

```python
from airflow import DAG
from restic_airflow.operators.restic import ResticBackupOperator

with DAG('backup_dag', ...) as dag:
    backup_task = ResticBackupOperator(
        task_id='backup_data',
        repository='/path/to/repo',
        backup_from_path='/data/to/backup',
        cache_directory='/tmp/restic-cache',
        tags=['daily'],
        password='your-repository-password',
        hostname='backup-host'
    )
```
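Retention policies are applied with `ResticForgetAndPruneOperator`. Here is a minimal sketch, assuming the operator accepts `keep_*` parameters modeled on restic's `forget --keep-daily/--keep-weekly/--keep-monthly` flags; the actual constructor signature may differ:

```python
from restic_airflow.operators.restic import ResticForgetAndPruneOperator

# Sketch only: the keep_* parameter names below are assumptions modeled
# on restic's `forget --keep-*` flags; check the operator's signature.
retention_task = ResticForgetAndPruneOperator(
    task_id='apply_retention',
    repository='/path/to/repo',
    cache_directory='/tmp/restic-cache',
    password='your-repository-password',
    keep_daily=7,    # hypothetical: keep the last 7 daily snapshots
    keep_weekly=4,   # hypothetical: keep the last 4 weekly snapshots
    keep_monthly=6,  # hypothetical: keep the last 6 monthly snapshots
)
```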
See `sample.py` for a complete DAG example including initialization, backup, health checks, and retention management.
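In outline, such a DAG chains the operators with Airflow's usual `>>` dependency syntax. A condensed sketch follows, with constructor arguments modeled on the backup example above; whether every operator accepts exactly these keywords is an assumption:

```python
from airflow import DAG
from restic_airflow.operators.restic import (
    ResticBackupOperator,
    ResticCheckOperator,
    ResticForgetAndPruneOperator,
    ResticInitOperator,
)

# Shared arguments, modeled on the backup example above (assumption:
# each operator accepts these same keywords).
common = dict(
    repository='/path/to/repo',
    cache_directory='/tmp/restic-cache',
    password='your-repository-password',
)

with DAG('restic_pipeline', ...) as dag:  # scheduling kwargs elided, as above
    init = ResticInitOperator(task_id='init_repo', **common)
    backup = ResticBackupOperator(
        task_id='backup_data',
        backup_from_path='/data/to/backup',
        tags=['daily'],
        hostname='backup-host',
        **common,
    )
    forget = ResticForgetAndPruneOperator(task_id='apply_retention', **common)
    check = ResticCheckOperator(task_id='verify_repo', **common)

    # Initialize first, then back up, apply retention, and verify integrity.
    init >> backup >> forget >> check
```

A real DAG might also branch on `ResticRepositoryExistsOperator` first, so initialization is skipped for a repository that already exists.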
The operators support configuration through environment variables:
- `RESTIC_PASSWORD`: Repository password
- `AWS_ACCESS_KEY_ID`: For S3 repositories
- `AWS_SECRET_ACCESS_KEY`: For S3 repositories
- `ALERT_EMAIL`: Email address for notifications
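For an S3-backed repository, credentials can live in the worker environment instead of the DAG file. A sketch, assuming the operator forwards the `AWS_*` variables to the Restic container; the bucket URL below is a placeholder following restic's `s3:` repository syntax:

```python
import os

from restic_airflow.operators.restic import ResticBackupOperator

s3_backup = ResticBackupOperator(
    task_id='backup_to_s3',
    # restic addresses S3 repositories as s3:<endpoint>/<bucket>[/<path>];
    # the bucket name here is hypothetical.
    repository='s3:s3.amazonaws.com/my-backup-bucket/restic',
    backup_from_path='/data/to/backup',
    cache_directory='/tmp/restic-cache',
    tags=['daily'],
    # Read the password from the environment rather than hardcoding it.
    password=os.environ['RESTIC_PASSWORD'],
    hostname='backup-host',
)
```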
The package includes a notification system that can send emails on:
- DAG success/failure
- Individual task success/failure
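One way to wire these is through Airflow's standard callback hooks. The sketch below uses Airflow's built-in `send_email` helper together with the `ALERT_EMAIL` variable; the package may ship its own ready-made callbacks, so treat this as one possible wiring rather than the package's API:

```python
import os

from airflow.utils.email import send_email


def notify_failure(context):
    # `context` is the standard Airflow callback context; send a short
    # alert to the address configured via ALERT_EMAIL.
    send_email(
        to=os.environ['ALERT_EMAIL'],
        subject=f"Task failed: {context['task_instance'].task_id}",
        html_content=f"DAG {context['dag'].dag_id} reported a failure.",
    )


# Attach via the usual hooks, e.g.:
# ResticBackupOperator(..., on_failure_callback=notify_failure)
```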
This project is released into the public domain under the Unlicense.