Binary file added .DS_Store
Binary file not shown.
2 changes: 2 additions & 0 deletions .env
@@ -43,3 +43,5 @@ SENTRY_DSN=
# Configure these with your own Docker registry images
DOCKER_IMAGE_BACKEND=backend
DOCKER_IMAGE_FRONTEND=frontend

REDIS_URL=redis://redis:6379/0
3 changes: 3 additions & 0 deletions .gitignore
@@ -4,3 +4,6 @@ node_modules/
/playwright-report/
/blob-report/
/playwright/.cache/

# macOS
.DS_Store
16 changes: 16 additions & 0 deletions README.md
@@ -148,6 +148,22 @@ Copy the content and use that as password / secret key. And run that again to ge

## How To Use It - Alternative With Copier

## PostgreSQL 18 + pgvector (local Docker)

This project includes a Docker image that runs PostgreSQL 18 with the `pgvector` v0.8 extension built in.

- Build and start the full stack (uses the custom image for the `db` service):

```bash
docker compose up --build -d
```

- The container image is built from `docker/postgres-pgvector/Dockerfile`, and the initialization SQL in
  `docker/postgres-pgvector/initdb/01-enable-pgvector.sql` creates the `vector` extension the first time
  the database is initialized.

If you prefer to use a pre-built image, modify `docker-compose.yml` to point `db.image` to your image.
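Once `CREATE EXTENSION vector` has run, similarity queries use pgvector's distance operators (`<->` for L2, `<=>` for cosine distance, `<#>` for negative inner product). As a minimal sketch of what the `<=>` operator computes, here is the cosine-distance formula in plain Python (the 3-dimensional vectors are purely illustrative):

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """Cosine distance as pgvector's `<=>` operator computes it: 1 - cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

# Identical vectors have distance 0; orthogonal vectors have distance 1.
print(cosine_distance([1.0, 0.0, 0.0], [1.0, 0.0, 0.0]))  # → 0.0
print(cosine_distance([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # → 1.0
```

In SQL this corresponds to ordering by `embedding <=> $1` against a `vector` column; the Python version is only meant to make the metric concrete.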

This repository also supports generating a new project using [Copier](https://copier.readthedocs.io).

It will copy all the files, ask you configuration questions, and update the `.env` files with your answers.
29 changes: 0 additions & 29 deletions SECURITY.md

This file was deleted.

55 changes: 48 additions & 7 deletions backend/README.md
@@ -2,8 +2,8 @@

## Requirements

* [Docker](https://www.docker.com/).
* [uv](https://docs.astral.sh/uv/) for Python package and environment management.
- [Docker](https://www.docker.com/).
- [uv](https://docs.astral.sh/uv/) for Python package and environment management.

## Docker Compose

@@ -127,23 +127,23 @@ As during local development your app directory is mounted as a volume inside the

Make sure you create a "revision" of your models and "upgrade" your database with that revision every time you change them, as that is what updates the tables in your database. Otherwise, your application will have errors.

* Start an interactive session in the backend container:
- Start an interactive session in the backend container:

```console
$ docker compose exec backend bash
```

* Alembic is already configured to import your SQLModel models from `./backend/app/models.py`.
- Alembic is already configured to import your SQLModel models from `./backend/app/models.py`.

* After changing a model (for example, adding a column), inside the container, create a revision, e.g.:
- After changing a model (for example, adding a column), inside the container, create a revision, e.g.:

```console
$ alembic revision --autogenerate -m "Add column last_name to User model"
```

* Commit to the git repository the files generated in the alembic directory.
- Commit to the git repository the files generated in the alembic directory.

* After creating the revision, run the migration in the database (this is what will actually change the database):
- After creating the revision, run the migration in the database (this is what will actually change the database):

```console
$ alembic upgrade head
@@ -170,3 +170,44 @@ The email templates are in `./backend/app/email-templates/`. Here, there are two
Before continuing, ensure you have the [MJML extension](https://marketplace.visualstudio.com/items?itemName=attilabuti.vscode-mjml) installed in your VS Code.

Once you have the MJML extension installed, you can create a new email template in the `src` directory. After creating the new email template and with the `.mjml` file open in your editor, open the command palette with `Ctrl+Shift+P` and search for `MJML: Export to HTML`. This will convert the `.mjml` file to a `.html` file and now you can save it in the build directory.

## Background Tasks (Celery) and Upstash Redis

This project supports running background tasks with Celery, using Redis as the
broker and result backend. You can use a local Redis (via Docker Compose) or a
hosted provider such as Upstash. The project reads these settings from the
environment via the settings in `app.core.config`.

- **Configure via `.env` or environment**: set either `REDIS_URL` (recommended)
  or `CELERY_BROKER_URL` and `CELERY_RESULT_BACKEND` explicitly. For Upstash,
  use the `rediss://` URL it provides (it contains the host and token).

Example `.env` entries for Upstash (replace with your values):

```
REDIS_URL=rediss://default:[email protected]:6379
# or explicit celery vars
CELERY_BROKER_URL=rediss://default:[email protected]:6379
CELERY_RESULT_BACKEND=rediss://default:[email protected]:6379
```
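The fallback described above (explicit `CELERY_*` variables win, otherwise both fall back to `REDIS_URL`) can be sketched as follows. This is an illustrative helper, not the project's actual code; the real resolution lives in the project's settings:

```python
import os

def celery_urls() -> tuple[str, str]:
    """Resolve Celery broker and result-backend URLs from the environment.

    Explicit CELERY_BROKER_URL / CELERY_RESULT_BACKEND win; otherwise both
    fall back to REDIS_URL (hypothetical sketch of the documented behavior).
    """
    redis_url = os.environ.get("REDIS_URL", "redis://redis:6379/0")
    broker = os.environ.get("CELERY_BROKER_URL", redis_url)
    backend = os.environ.get("CELERY_RESULT_BACKEND", redis_url)
    return broker, backend
```

With only `REDIS_URL` set, both URLs resolve to it; setting the explicit variables overrides it independently for broker and backend.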

- **Run worker (recommended)**: from the `backend/` directory, use either the
  Celery CLI or the lightweight Python entrypoint:

```bash
# using Celery CLI (preferred)
celery -A app.core.celery_app.celery_app worker --loglevel=info

# quick start via python entrypoint (run from the `backend/` directory)
# module form:
python -m app.workers.celery_worker
```

- **Test a task**: in a Python shell (with your virtualenv activated):

```bash
python -c "from app.workers import add; res = add.delay(2,3); print(res.get(timeout=10))"
```

The example tasks are in `app/tasks.py`. Replace `send_welcome_email` with
your real email sending logic to run it asynchronously.
32 changes: 32 additions & 0 deletions backend/WEBSOCKETS.md
@@ -0,0 +1,32 @@
**WebSocket Infrastructure**: Quick setup

- **Purpose**: Provide real-time sync across connected clients and across multiple app instances using Redis pub/sub.
- **Components**:

- `app.api.websocket_manager.WebSocketManager`: manages local WebSocket connections and subscribes to Redis channels `ws:{room}`.
- `app.api.routes.ws`: WebSocket endpoint at `GET /api/v1/ws/{room}` (path under API prefix).
- Uses existing Redis client configured via `REDIS_URL` in `app.core.config.Settings`.

- **How it works**:

- Each connected client opens a WebSocket to `/api/v1/ws/{room}`.
- When a client sends a text message, the endpoint publishes the message to Redis channel `ws:{room}`.
- The `WebSocketManager` subscribes to `ws:*` and forwards published messages to all local WebSocket connections in the given room.
- This allows multiple app instances to broadcast to each other's connected clients.

- **Env / Config**:

- Ensure `REDIS_URL` is configured in the project's environment (default: `redis://redis:6379/0`).

- **Frontend example** (browser JS):

```js
const ws = new WebSocket(`wss://your-backend.example.com/api/v1/ws/room-123`);
ws.addEventListener("message", (ev) => console.log("msg", ev.data));
ws.addEventListener("open", () => ws.send(JSON.stringify({ type: "hello" })));
```

- **Notes & next steps**:
- Messages are sent/received as plain text; consider JSON schema enforcement and auth.
- Add authentication (JWT in query param/header) and room access checks as needed.
- Consider rate limiting and maximum connections per client.
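The fan-out behavior described above can be sketched with a toy in-memory hub. This is illustrative only: the real `WebSocketManager` is asynchronous and relays through Redis pub/sub on `ws:{room}` channels, so that every app instance forwards each message to its own local connections; here a plain list stands in for a client's WebSocket.

```python
from collections import defaultdict

class FanOutHub:
    """In-memory stand-in for the WebSocketManager fan-out (hypothetical names)."""

    def __init__(self) -> None:
        # room name -> list of connected "clients" (each an inbox list)
        self.rooms: dict[str, list[list[str]]] = defaultdict(list)

    def connect(self, room: str) -> list[str]:
        inbox: list[str] = []  # stands in for one client's WebSocket
        self.rooms[room].append(inbox)
        return inbox

    def publish(self, room: str, message: str) -> None:
        # The real code publishes to Redis channel f"ws:{room}"; each instance
        # then delivers to its local connections in that room.
        for inbox in self.rooms[room]:
            inbox.append(message)

hub = FanOutHub()
a = hub.connect("room-123")
b = hub.connect("room-123")
hub.publish("room-123", '{"type": "hello"}')
# Both clients in the room receive the message; other rooms are unaffected.
```

Swapping the in-process loop for a Redis subscription is what lets multiple app instances reach each other's clients.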
28 changes: 24 additions & 4 deletions backend/app/alembic/env.py
@@ -1,4 +1,5 @@
import os
import sys
from logging.config import fileConfig

from alembic import context
@@ -12,14 +13,26 @@
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# Ensure the project root (backend/) is on sys.path so imports like
# `import app` work when Alembic loads this env.py from elsewhere.
project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
if project_root not in sys.path:
sys.path.insert(0, project_root)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
# target_metadata = None

from app.models import SQLModel # noqa
from app.core.config import settings # noqa
from sqlmodel import SQLModel # noqa
from app.core.config import settings # noqa

# Ensure all model modules are imported so SQLModel.metadata is populated.
# Alembic's autogenerate inspects `target_metadata` to detect differences
# between models and the database; importing the model modules registers
# their tables on the metadata.
import app.models # noqa: F401

target_metadata = SQLModel.metadata

@@ -47,7 +60,11 @@ def run_migrations_offline():
"""
url = get_url()
context.configure(
url=url, target_metadata=target_metadata, literal_binds=True, compare_type=True
url=url,
target_metadata=target_metadata,
literal_binds=True,
compare_type=True,
compare_server_default=True,
)

with context.begin_transaction():
@@ -71,7 +88,10 @@ def run_migrations_online():

with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata, compare_type=True
connection=connection,
target_metadata=target_metadata,
compare_type=True,
compare_server_default=True,
)

with context.begin_transaction():
29 changes: 29 additions & 0 deletions backend/app/alembic/versions/049f237840d6_migration_cha.py
@@ -0,0 +1,29 @@
"""migration-cha

Revision ID: 049f237840d6
Revises: 792708ed624f
Create Date: 2025-12-12 16:14:20.104058

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel.sql.sqltypes


# revision identifiers, used by Alembic.
revision = '049f237840d6'
down_revision = '792708ed624f'
branch_labels = None
depends_on = None


def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_index(op.f('ix_otp_user_id'), 'otp', ['user_id'], unique=False)
# ### end Alembic commands ###


def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_otp_user_id'), table_name='otp')
# ### end Alembic commands ###
29 changes: 29 additions & 0 deletions backend/app/alembic/versions/0dcd3acf382f_change_relation_ship.py
@@ -0,0 +1,29 @@
"""change-relation-ship

Revision ID: 0dcd3acf382f
Revises: adb437cb796b
Create Date: 2025-12-12 15:23:06.136110

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel.sql.sqltypes


# revision identifiers, used by Alembic.
revision = '0dcd3acf382f'
down_revision = 'adb437cb796b'
branch_labels = None
depends_on = None


def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###


def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###

This file was deleted.

29 changes: 29 additions & 0 deletions backend/app/alembic/versions/792708ed624f_migration_chag.py
@@ -0,0 +1,29 @@
"""migration-chag

Revision ID: 792708ed624f
Revises: ebf4b66990a5
Create Date: 2025-12-12 16:04:30.397740

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel.sql.sqltypes


# revision identifiers, used by Alembic.
revision = '792708ed624f'
down_revision = 'ebf4b66990a5'
branch_labels = None
depends_on = None


def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_otp_user_id'), table_name='otp')
# ### end Alembic commands ###


def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_index(op.f('ix_otp_user_id'), 'otp', ['user_id'], unique=False)
# ### end Alembic commands ###