A FastAPI application with Celery workers using RabbitMQ for task queuing.
## Features

- **FastAPI server** with three endpoints:
  - `GET /` - returns "Hello World"
  - `GET /long-process` - simulates a long-running process (5 seconds)
  - `POST /worker` - sends a number to the Celery worker via RabbitMQ
- **Celery worker** that processes numbers from RabbitMQ and prints "hello {number}"
## Prerequisites

- Python 3.11+ (currently using Python 3.12)
- Docker and Docker Compose
- Virtual environment (already configured)
## Setup

Start RabbitMQ and Redis using Docker Compose:

```bash
docker-compose up -d
```

This will start:

- RabbitMQ on port 5672 (Management UI on port 15672)
- Redis on port 6379
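A docker-compose.yml exposing those ports might look roughly like this sketch (the image tags are assumptions; this README does not specify them):

```yaml
# Sketch of docker-compose.yml; image tags are assumptions.
services:
  rabbitmq:
    image: rabbitmq:3-management   # the -management tag serves the UI on 15672
    ports:
      - "5672:5672"
      - "15672:15672"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```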
Dependencies are already installed in the virtual environment, but if needed:

```bash
pip install -r requirements.txt
```

## Running

In one terminal, start the Celery worker:

```bash
celery -A worker worker --loglevel=info
```

In another terminal, start the FastAPI server:

```bash
uvicorn app:app --reload --host 0.0.0.0 --port 8000
```

Or run directly:

```bash
python app.py
```

## API Endpoints

### GET /

Returns a simple "Hello World" message.
Example:

```bash
curl http://localhost:8000/
```

Response:

```json
{
  "message": "Hello World"
}
```

### GET /long-process

Simulates a long-running process that takes 5 seconds to complete.
Example:

```bash
curl http://localhost:8000/long-process
```

Response:

```json
{
  "message": "Long process completed after 5 seconds"
}
```

### POST /worker

Sends a number to the Celery worker via RabbitMQ. The worker will print "hello {number}".
Example:

```bash
curl -X POST http://localhost:8000/worker \
  -H "Content-Type: application/json" \
  -d '{"number": 42}'
```

Response:

```json
{
  "message": "Task sent to worker with number: 42",
  "task_id": "12345678-1234-1234-1234-123456789012"
}
```

## Useful Links

- RabbitMQ Management UI: http://localhost:15672 (guest/guest)
- FastAPI Docs: http://localhost:8000/docs
- FastAPI ReDoc: http://localhost:8000/redoc
## Project Structure

```
.
├── app.py               # FastAPI application
├── worker.py            # Celery worker
├── requirements.txt     # Python dependencies
├── docker-compose.yml   # Docker services (RabbitMQ, Redis)
└── README.md            # This file
```
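requirements.txt would presumably list at least the following packages (an assumption; exact names and version pins are not shown in this README):

```
fastapi
uvicorn[standard]
celery
redis
```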
## Stopping Services

To stop all services:

```bash
# Stop FastAPI and Celery worker (Ctrl+C in their terminals)
# Stop Docker services
docker-compose down
```

## Troubleshooting

- **Connection refused errors**: make sure RabbitMQ and Redis are running via Docker Compose
- **Port conflicts**: check if ports 5672, 6379, 8000, or 15672 are already in use
- **Worker not receiving tasks**: ensure the Celery worker is running and connected to RabbitMQ