Use celery for task queue #979

Merged (3 commits) on May 7, 2025
1 change: 1 addition & 0 deletions .env.example
@@ -10,6 +10,7 @@ DATABASE_URL=sqlite:///db.sqlite3

REDIS_PORT=7963
CACHE_URL=redis://127.0.0.1:${REDIS_PORT}/0
TASK_BROKER_URL=redis://127.0.0.1:${REDIS_PORT}/1

# Used to select which other services to run alongside
# manage.py, pytest and runserver
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -11,6 +11,7 @@ env:
SECRET_KEY: notTheRealOne
DATABASE_URL: sqlite:///db.sqlite3
CACHE_URL: redis://127.0.0.1:6379/0
TASK_BROKER_URL: redis://127.0.0.1:6379/1

jobs:
pre-commit:
1 change: 1 addition & 0 deletions Procfile.service
@@ -1 +1,2 @@
redis: redis-server --port $REDIS_PORT
celery: uv run celery -A sith worker --beat -l INFO
25 changes: 25 additions & 0 deletions docs/explanation/technos.md
@@ -131,6 +131,31 @@ databases work with one as well as the other.
Fortunately, thanks to Django's ORM, this
dual compatibility is almost always achievable.

### Celery

[Official site](https://docs.celeryq.dev/en/stable/)

In some situations, we want to split a task off
so that it runs on its own.
Two cases that match this situation are:

- long-running tasks
  (such as sending emails or generating documents),
  for which we want to be able to tell the user
  that their request has been received, without
  making them wait too long
- recurring tasks that live outside the request/response cycle.

For this, we use Celery.
Thanks to its Django integration,
it lets us set up a message queue
with fairly little added complexity.

In addition, its `django-celery-results`
and `django-celery-beat` extensions enrich its
integration with django and provide ways to manage
certain tasks directly from the django admin interface.

## Frontend

### Jinja2
Expand Down
24 changes: 22 additions & 2 deletions docs/tutorial/install-advanced.md
@@ -120,7 +120,7 @@ conflicts with redis instances that are already running.

```dotenv
REDIS_PORT=6379
CACHE_URL=redis://127.0.0.1:${REDIS_PORT}/0
CACHE_URL=redis://127.0.0.1:6379/0
```

If you want to configure redis to communicate through a socket:
@@ -151,7 +151,7 @@ ALTER ROLE sith SET client_encoding TO 'utf8';
ALTER ROLE sith SET default_transaction_isolation TO 'read committed';
ALTER ROLE sith SET timezone TO 'UTC';

GRANT ALL PRIVILEGES ON DATABASE sith TO SITH;
GRANT ALL PRIVILEGES ON DATABASE sith TO sith;
\q
```

@@ -279,6 +279,26 @@ All requests for static files and public media
will be served directly by nginx.
All other requests will be forwarded to the django server.

## Celery

Celery does not run inside django.
It is a separate application, with its own processes,
which runs independently and communicates
with the django instance only through messages.

To run Celery, run the following command
in a separate terminal:

```bash
uv run celery -A sith worker --beat -l INFO
```

!!!note

    We use Redis as the broker for Celery,
    so you must also configure the broker URL,
    in the same way as described above
    for Redis.
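Based on the `.env.example` change in this diff, a minimal broker configuration could look like this (Redis database 1 for the broker, database 0 for the cache):

```dotenv
REDIS_PORT=6379
CACHE_URL=redis://127.0.0.1:${REDIS_PORT}/0
TASK_BROKER_URL=redis://127.0.0.1:${REDIS_PORT}/1
```

The `${REDIS_PORT}` expansion works because `sith/settings.py` now constructs `Env(expand_vars=True)`.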

## Mettre à jour la base de données antispam

1 change: 0 additions & 1 deletion docs/tutorial/install.md
@@ -100,7 +100,6 @@ cd /mnt/<la_lettre_du_disque>/vos/fichiers/comme/dhab
Python is not part of the dependencies, since it is
installed automatically by uv.


## Finalize the installation

Clone the project (from your WSL console, if you are using WSL)
11 changes: 7 additions & 4 deletions manage.py
@@ -18,10 +18,10 @@
import os
import sys

from django.conf import settings
from django.utils.autoreload import DJANGO_AUTORELOAD_ENV

from sith.composer import start_composer, stop_composer
from sith.settings import PROCFILE_SERVICE

if __name__ == "__main__":
logging.basicConfig(encoding="utf-8", level=logging.INFO)
@@ -30,8 +30,11 @@

from django.core.management import execute_from_command_line

if os.environ.get(DJANGO_AUTORELOAD_ENV) is None and PROCFILE_SERVICE is not None:
start_composer(PROCFILE_SERVICE)
_ = atexit.register(stop_composer, procfile=PROCFILE_SERVICE)
if (
os.environ.get(DJANGO_AUTORELOAD_ENV) is None
and settings.PROCFILE_SERVICE is not None
):
start_composer(settings.PROCFILE_SERVICE)
_ = atexit.register(stop_composer, procfile=settings.PROCFILE_SERVICE)

execute_from_command_line(sys.argv)
3 changes: 3 additions & 0 deletions pyproject.toml
@@ -49,6 +49,9 @@ dependencies = [
"requests>=2.32.3",
"honcho>=2.0.0",
"psutil>=7.0.0",
"celery[redis]>=5.5.1",
"django-celery-results>=2.5.1",
"django-celery-beat>=2.7.0",
]

[project.urls]
6 changes: 6 additions & 0 deletions sith/__init__.py
@@ -12,3 +12,9 @@
# OR WITHIN THE LOCAL FILE "LICENSE"
#
#

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ("celery_app",)
17 changes: 17 additions & 0 deletions sith/celery.py
@@ -0,0 +1,17 @@
# Set the default Django settings module for the 'celery' program.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "sith.settings")

app = Celery("sith")

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django apps.
app.autodiscover_tasks()
12 changes: 11 additions & 1 deletion sith/settings.py
@@ -47,7 +47,7 @@

from .honeypot import custom_honeypot_error

env = Env()
env = Env(expand_vars=True)
env.read_env()


@@ -106,6 +106,8 @@ def optional_file_parser(value: str) -> Path | None:
"django_jinja",
"ninja_extra",
"haystack",
"django_celery_results",
"django_celery_beat",
"captcha",
"core",
"club",
@@ -336,6 +338,14 @@ def optional_file_parser(value: str) -> Path | None:
EMAIL_HOST = env.str("EMAIL_HOST", default="localhost")
EMAIL_PORT = env.int("EMAIL_PORT", default=25)

# Celery
CELERY_TIMEZONE = TIME_ZONE
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_BROKER_URL = env.str("TASK_BROKER_URL")
CELERY_RESULT_BACKEND = "django-db"
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"

# Below this line, only Sith-specific variables are defined

SITH_URL = env.str("SITH_URL", default="127.0.0.1:8000")