diff --git a/.DS_Store b/.DS_Store new file mode 100644 index 0000000000..06493750e5 Binary files /dev/null and b/.DS_Store differ diff --git a/.env b/.env index 1d44286e25..6cbb185186 100644 --- a/.env +++ b/.env @@ -43,3 +43,5 @@ SENTRY_DSN= # Configure these with your own Docker registry images DOCKER_IMAGE_BACKEND=backend DOCKER_IMAGE_FRONTEND=frontend + +REDIS_URL=redis://redis:6379/0 diff --git a/.gitignore b/.gitignore index a6dd346572..60a1969682 100644 --- a/.gitignore +++ b/.gitignore @@ -4,3 +4,6 @@ node_modules/ /playwright-report/ /blob-report/ /playwright/.cache/ + +# macOS +.DS_Store diff --git a/README.md b/README.md index a9049b4779..f23dc46d77 100644 --- a/README.md +++ b/README.md @@ -148,6 +148,22 @@ Copy the content and use that as password / secret key. And run that again to ge ## How To Use It - Alternative With Copier +## PostgreSQL 18 + pgvector (local Docker) + +This project includes a Docker image to run PostgreSQL 18 with the `pgvector` v0.8 extension built-in. + +- Build and start the full stack (uses the custom image for the `db` service): + +```bash +docker compose up --build -d +``` + +- The container image is built from `docker/postgres-pgvector/Dockerfile` and the initialization SQL + `docker/postgres-pgvector/initdb/01-enable-pgvector.sql` creates the `vector` extension on first + initialization. + +If you prefer to use a pre-built image, modify `docker-compose.yml` to point `db.image` to your image. + This repository also supports generating a new project using [Copier](https://copier.readthedocs.io). It will copy all the files, ask you configuration questions, and update the `.env` files with your answers. diff --git a/SECURITY.md b/SECURITY.md deleted file mode 100644 index 0045fb8182..0000000000 --- a/SECURITY.md +++ /dev/null @@ -1,29 +0,0 @@ -# Security Policy - -Security is very important for this project and its community. 🔒 - -Learn more about it below. 👇 - -## Versions - -The latest version or release is supported. 
- -You are encouraged to write tests for your application and update your versions frequently after ensuring that your tests are passing. This way you will benefit from the latest features, bug fixes, and **security fixes**. - -## Reporting a Vulnerability - -If you think you found a vulnerability, and even if you are not sure about it, please report it right away by sending an email to: security@tiangolo.com. Please try to be as explicit as possible, describing all the steps and example code to reproduce the security issue. - -I (the author, [@tiangolo](https://twitter.com/tiangolo)) will review it thoroughly and get back to you. - -## Public Discussions - -Please restrain from publicly discussing a potential security vulnerability. 🙊 - -It's better to discuss privately and try to find a solution first, to limit the potential impact as much as possible. - ---- - -Thanks for your help! - -The community and I thank you for that. 🙇 diff --git a/backend/README.md b/backend/README.md index c217000fc2..3363cc9e11 100644 --- a/backend/README.md +++ b/backend/README.md @@ -2,8 +2,8 @@ ## Requirements -* [Docker](https://www.docker.com/). -* [uv](https://docs.astral.sh/uv/) for Python package and environment management. +- [Docker](https://www.docker.com/). +- [uv](https://docs.astral.sh/uv/) for Python package and environment management. ## Docker Compose @@ -127,23 +127,23 @@ As during local development your app directory is mounted as a volume inside the Make sure you create a "revision" of your models and that you "upgrade" your database with that revision every time you change them. As this is what will update the tables in your database. Otherwise, your application will have errors. -* Start an interactive session in the backend container: +- Start an interactive session in the backend container: ```console $ docker compose exec backend bash ``` -* Alembic is already configured to import your SQLModel models from `./backend/app/models.py`. 
+- Alembic is already configured to import your SQLModel models from `./backend/app/models.py`. -* After changing a model (for example, adding a column), inside the container, create a revision, e.g.: +- After changing a model (for example, adding a column), inside the container, create a revision, e.g.: ```console $ alembic revision --autogenerate -m "Add column last_name to User model" ``` -* Commit to the git repository the files generated in the alembic directory. +- Commit to the git repository the files generated in the alembic directory. -* After creating the revision, run the migration in the database (this is what will actually change the database): +- After creating the revision, run the migration in the database (this is what will actually change the database): ```console $ alembic upgrade head @@ -170,3 +170,44 @@ The email templates are in `./backend/app/email-templates/`. Here, there are two Before continuing, ensure you have the [MJML extension](https://marketplace.visualstudio.com/items?itemName=attilabuti.vscode-mjml) installed in your VS Code. Once you have the MJML extension installed, you can create a new email template in the `src` directory. After creating the new email template and with the `.mjml` file open in your editor, open the command palette with `Ctrl+Shift+P` and search for `MJML: Export to HTML`. This will convert the `.mjml` file to a `.html` file and now you can save it in the build directory. + +## Background Tasks (Celery) and Upstash Redis + +This project supports running background tasks using Celery with Redis as +broker/result backend. You can use a local Redis (via Docker Compose) or a +hosted provider such as Upstash. The project reads these settings from the +environment via the `app.core.settings` values. + +- **Configure via `.env` or environment**: set either `REDIS_URL` (recommended) + or `CELERY_BROKER_URL` and `CELERY_RESULT_BACKEND` explicitly. 
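The precedence described above (explicit `CELERY_BROKER_URL` / `CELERY_RESULT_BACKEND` win, otherwise both fall back to the shared `REDIS_URL`) can be sketched as a small helper. This is an illustrative sketch only — the function name `resolve_celery_urls` is hypothetical and not part of the project:

```python
import os


def resolve_celery_urls() -> tuple[str, str]:
    """Resolve Celery broker/result-backend URLs from the environment.

    Explicit CELERY_* variables take precedence; otherwise both fall back
    to REDIS_URL, and finally to the local Docker Compose Redis instance.
    """
    default = os.environ.get("REDIS_URL", "redis://redis:6379/0")
    broker = os.environ.get("CELERY_BROKER_URL", default)
    backend = os.environ.get("CELERY_RESULT_BACKEND", default)
    return broker, backend
```

With only `REDIS_URL` set, broker and backend resolve to the same Redis database; an Upstash `rediss://` URL passes through unchanged, since the scheme already encodes TLS.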
For Upstash + use the `rediss://` URL provided by Upstash (it contains the host and token). + +Example `.env` entries for Upstash (replace with your values): + +``` +REDIS_URL=rediss://default:REPLACE_WITH_YOUR_TOKEN@global-xxxx.upstash.io:6379 +# or explicit celery vars +CELERY_BROKER_URL=rediss://default:REPLACE_WITH_YOUR_TOKEN@global-xxxx.upstash.io:6379 +CELERY_RESULT_BACKEND=rediss://default:REPLACE_WITH_YOUR_TOKEN@global-xxxx.upstash.io:6379 +``` + +- **Run worker (recommended)**: from the `backend/` directory either use the + Celery CLI or the lightweight Python entrypoint: + +``` +# using Celery CLI (preferred) +celery -A app.core.celery_app.celery_app worker --loglevel=info + +# quick start via python entrypoint (run from the `backend/` directory) +# module form: +python -m app.workers.celery_worker +``` + +- **Test a task**: in a Python shell (with your virtualenv activated): + +``` +python -c "from app.workers import add; res = add.delay(2,3); print(res.get(timeout=10))" +``` + +The example tasks are in `app/tasks.py`. Replace `send_welcome_email` with +your real email sending logic to run it asynchronously. diff --git a/backend/WEBSOCKETS.md b/backend/WEBSOCKETS.md new file mode 100644 index 0000000000..3f1253043a --- /dev/null +++ b/backend/WEBSOCKETS.md @@ -0,0 +1,32 @@ +**WebSocket Infrastructure**: Quick setup + +- **Purpose**: Provide real-time sync across connected clients and across multiple app instances using Redis pub/sub. +- **Components**: + + - `app.api.websocket_manager.WebSocketManager`: manages local WebSocket connections and subscribes to Redis channels `ws:{room}`. + - `app.api.routes.ws`: WebSocket endpoint at `GET /api/v1/ws/{room}` (path under API prefix). + - Uses existing Redis client configured via `REDIS_URL` in `app.core.config.Settings`. + +- **How it works**: + + - Each connected client opens a WebSocket to `/api/v1/ws/{room}`. 
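The per-room fan-out that each app instance performs for its local connections can be modelled with a plain-Python sketch (no Redis or FastAPI involved; the class and method names below are illustrative, not the project's actual `WebSocketManager` API):

```python
from collections import defaultdict
from typing import Callable


class RoomBroadcaster:
    """Minimal in-memory model of per-room fan-out: a message published to a
    room is delivered to every connection currently registered in that room."""

    def __init__(self) -> None:
        self._rooms: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def connect(self, room: str, send: Callable[[str], None]) -> None:
        # `send` stands in for a WebSocket connection's send callable.
        self._rooms[room].append(send)

    def disconnect(self, room: str, send: Callable[[str], None]) -> None:
        self._rooms[room].remove(send)
        if not self._rooms[room]:
            del self._rooms[room]

    def broadcast(self, room: str, message: str) -> None:
        # In the real setup this is triggered by a Redis `ws:{room}` message,
        # so every app instance fans out to its own local connections only.
        for send in list(self._rooms.get(room, [])):
            send(message)
```

In the actual infrastructure, Redis pub/sub replaces the in-process `broadcast` trigger: each instance publishes to `ws:{room}` and every subscribed instance performs this local fan-out.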
+ - When a client sends a text message, the endpoint publishes the message to Redis channel `ws:{room}`. + - The `WebSocketManager` subscribes to `ws:*` and forwards published messages to all local WebSocket connections in the given room. + - This allows multiple app instances to broadcast to each other's connected clients. + +- **Env / Config**: + + - Ensure `REDIS_URL` is configured in the project's environment (default: `redis://redis:6379/0`). + +- **Frontend example** (browser JS): + +```js +const ws = new WebSocket(`wss://your-backend.example.com/api/v1/ws/room-123`); +ws.addEventListener("message", (ev) => console.log("msg", ev.data)); +ws.addEventListener("open", () => ws.send(JSON.stringify({ type: "hello" }))); +``` + +- **Notes & next steps**: + - Messages are sent/received as plain text; consider JSON schema enforcement and auth. + - Add authentication (JWT in query param/header) and room access checks as needed. + - Consider rate limiting and maximum connections per client. diff --git a/backend/app/alembic/env.py b/backend/app/alembic/env.py index 7f29c04680..0619ee7e21 100755 --- a/backend/app/alembic/env.py +++ b/backend/app/alembic/env.py @@ -1,4 +1,5 @@ import os +import sys from logging.config import fileConfig from alembic import context @@ -12,14 +13,26 @@ # This line sets up loggers basically. fileConfig(config.config_file_name) +# Ensure the project root (backend/) is on sys.path so imports like +# `import app` work when Alembic loads this env.py from elsewhere. 
+project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")) +if project_root not in sys.path: + sys.path.insert(0, project_root) + # add your model's MetaData object here # for 'autogenerate' support # from myapp import mymodel # target_metadata = mymodel.Base.metadata # target_metadata = None -from app.models import SQLModel # noqa -from app.core.config import settings # noqa +from sqlmodel import SQLModel # noqa +from app.core.config import settings # noqa + +# Ensure all model modules are imported so SQLModel.metadata is populated. +# Alembic's autogenerate inspects `target_metadata` to detect differences +# between models and the database; importing the model modules registers +# their tables on the metadata. +import app.models # noqa: F401 target_metadata = SQLModel.metadata @@ -47,7 +60,11 @@ def run_migrations_offline(): """ url = get_url() context.configure( - url=url, target_metadata=target_metadata, literal_binds=True, compare_type=True + url=url, + target_metadata=target_metadata, + literal_binds=True, + compare_type=True, + compare_server_default=True, ) with context.begin_transaction(): @@ -71,7 +88,10 @@ def run_migrations_online(): with connectable.connect() as connection: context.configure( - connection=connection, target_metadata=target_metadata, compare_type=True + connection=connection, + target_metadata=target_metadata, + compare_type=True, + compare_server_default=True, ) with context.begin_transaction(): diff --git a/backend/app/alembic/versions/049f237840d6_migration_cha.py b/backend/app/alembic/versions/049f237840d6_migration_cha.py new file mode 100644 index 0000000000..5dadc8713d --- /dev/null +++ b/backend/app/alembic/versions/049f237840d6_migration_cha.py @@ -0,0 +1,29 @@ +"""migration-cha + +Revision ID: 049f237840d6 +Revises: 792708ed624f +Create Date: 2025-12-12 16:14:20.104058 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. 
+revision = '049f237840d6' +down_revision = '792708ed624f' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.create_index(op.f('ix_otp_user_id'), 'otp', ['user_id'], unique=False) + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.drop_index(op.f('ix_otp_user_id'), table_name='otp') + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/0dcd3acf382f_change_relation_ship.py b/backend/app/alembic/versions/0dcd3acf382f_change_relation_ship.py new file mode 100644 index 0000000000..72a08a6a8c --- /dev/null +++ b/backend/app/alembic/versions/0dcd3acf382f_change_relation_ship.py @@ -0,0 +1,29 @@ +"""change-relation-ship + +Revision ID: 0dcd3acf382f +Revises: adb437cb796b +Create Date: 2025-12-12 15:23:06.136110 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. +revision = '0dcd3acf382f' +down_revision = 'adb437cb796b' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/1a31ce608336_add_cascade_delete_relationships.py b/backend/app/alembic/versions/1a31ce608336_add_cascade_delete_relationships.py deleted file mode 100644 index 10e47a1456..0000000000 --- a/backend/app/alembic/versions/1a31ce608336_add_cascade_delete_relationships.py +++ /dev/null @@ -1,37 +0,0 @@ -"""Add cascade delete relationships - -Revision ID: 1a31ce608336 -Revises: d98dd8ec85a3 -Create Date: 2024-07-31 22:24:34.447891 - -""" -from alembic import op -import sqlalchemy as sa -import sqlmodel.sql.sqltypes - - -# revision identifiers, used by Alembic. 
-revision = '1a31ce608336' -down_revision = 'd98dd8ec85a3' -branch_labels = None -depends_on = None - - -def upgrade(): - # ### commands auto generated by Alembic - please adjust! ### - op.alter_column('item', 'owner_id', - existing_type=sa.UUID(), - nullable=False) - op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') - op.create_foreign_key(None, 'item', 'user', ['owner_id'], ['id'], ondelete='CASCADE') - # ### end Alembic commands ### - - -def downgrade(): - # ### commands auto generated by Alembic - please adjust! ### - op.drop_constraint(None, 'item', type_='foreignkey') - op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) - op.alter_column('item', 'owner_id', - existing_type=sa.UUID(), - nullable=True) - # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/792708ed624f_migration_chag.py b/backend/app/alembic/versions/792708ed624f_migration_chag.py new file mode 100644 index 0000000000..a1dca605fd --- /dev/null +++ b/backend/app/alembic/versions/792708ed624f_migration_chag.py @@ -0,0 +1,29 @@ +"""migration-chag + +Revision ID: 792708ed624f +Revises: ebf4b66990a5 +Create Date: 2025-12-12 16:04:30.397740 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. +revision = '792708ed624f' +down_revision = 'ebf4b66990a5' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.drop_index(op.f('ix_otp_user_id'), table_name='otp') + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.create_index(op.f('ix_otp_user_id'), 'otp', ['user_id'], unique=False) + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/9c0a54914c78_add_max_length_for_string_varchar_.py b/backend/app/alembic/versions/9c0a54914c78_add_max_length_for_string_varchar_.py deleted file mode 100755 index 78a41773b9..0000000000 --- a/backend/app/alembic/versions/9c0a54914c78_add_max_length_for_string_varchar_.py +++ /dev/null @@ -1,69 +0,0 @@ -"""Add max length for string(varchar) fields in User and Items models - -Revision ID: 9c0a54914c78 -Revises: e2412789c190 -Create Date: 2024-06-17 14:42:44.639457 - -""" -from alembic import op -import sqlalchemy as sa -import sqlmodel.sql.sqltypes - - -# revision identifiers, used by Alembic. -revision = '9c0a54914c78' -down_revision = 'e2412789c190' -branch_labels = None -depends_on = None - - -def upgrade(): - # Adjust the length of the email field in the User table - op.alter_column('user', 'email', - existing_type=sa.String(), - type_=sa.String(length=255), - existing_nullable=False) - - # Adjust the length of the full_name field in the User table - op.alter_column('user', 'full_name', - existing_type=sa.String(), - type_=sa.String(length=255), - existing_nullable=True) - - # Adjust the length of the title field in the Item table - op.alter_column('item', 'title', - existing_type=sa.String(), - type_=sa.String(length=255), - existing_nullable=False) - - # Adjust the length of the description field in the Item table - op.alter_column('item', 'description', - existing_type=sa.String(), - type_=sa.String(length=255), - existing_nullable=True) - - -def downgrade(): - # Revert the length of the email field in the User table - op.alter_column('user', 'email', - existing_type=sa.String(length=255), - type_=sa.String(), - existing_nullable=False) - - # Revert the length of the full_name field in the User table - op.alter_column('user', 'full_name', - existing_type=sa.String(length=255), - type_=sa.String(), - 
existing_nullable=True) - - # Revert the length of the title field in the Item table - op.alter_column('item', 'title', - existing_type=sa.String(length=255), - type_=sa.String(), - existing_nullable=False) - - # Revert the length of the description field in the Item table - op.alter_column('item', 'description', - existing_type=sa.String(length=255), - type_=sa.String(), - existing_nullable=True) diff --git a/backend/app/alembic/versions/adb437cb796b_initial.py b/backend/app/alembic/versions/adb437cb796b_initial.py new file mode 100644 index 0000000000..95d2af8622 --- /dev/null +++ b/backend/app/alembic/versions/adb437cb796b_initial.py @@ -0,0 +1,60 @@ +"""initial + +Revision ID: adb437cb796b +Revises: +Create Date: 2025-12-12 12:12:10.473657 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. +revision = 'adb437cb796b' +down_revision = None +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.create_table('user', + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('first_name', sqlmodel.sql.sqltypes.AutoString(), nullable=True), + sa.Column('last_name', sqlmodel.sql.sqltypes.AutoString(), nullable=True), + sa.Column('email', sqlmodel.sql.sqltypes.AutoString(), nullable=False), + sa.Column('hashed_password', sqlmodel.sql.sqltypes.AutoString(), nullable=False), + sa.Column('phone_number', sqlmodel.sql.sqltypes.AutoString(), nullable=True), + sa.Column('role', sa.Enum('user', 'admin', name='userrole'), nullable=True), + sa.Column('created_at', sa.DateTime(timezone=True), nullable=True), + sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_user_email'), 'user', ['email'], unique=True) + op.create_index(op.f('ix_user_id'), 'user', ['id'], unique=False) + op.create_table('otp', + sa.Column('id', sa.Uuid(), nullable=False), + sa.Column('user_id', sa.Uuid(), nullable=False), + sa.Column('code', sa.Integer(), nullable=False), + sa.Column('type', sa.Enum('password_reset', 'email_verification', 'signup_confirmation', 'login_confirmation', name='otptype'), nullable=False), + sa.Column('created_at', sa.DateTime(timezone=True), nullable=True), + sa.Column('expires_at', sa.DateTime(), nullable=False), + sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True), + sa.ForeignKeyConstraint(['user_id'], ['user.id'], ), + sa.PrimaryKeyConstraint('id') + ) + op.create_index(op.f('ix_otp_id'), 'otp', ['id'], unique=False) + op.create_index(op.f('ix_otp_user_id'), 'otp', ['user_id'], unique=False) + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.drop_index(op.f('ix_otp_user_id'), table_name='otp') + op.drop_index(op.f('ix_otp_id'), table_name='otp') + op.drop_table('otp') + op.drop_index(op.f('ix_user_id'), table_name='user') + op.drop_index(op.f('ix_user_email'), table_name='user') + op.drop_table('user') + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/b96941f53a7b_migration.py b/backend/app/alembic/versions/b96941f53a7b_migration.py new file mode 100644 index 0000000000..4d59329a91 --- /dev/null +++ b/backend/app/alembic/versions/b96941f53a7b_migration.py @@ -0,0 +1,29 @@ +"""migration + +Revision ID: b96941f53a7b +Revises: d48192107d61 +Create Date: 2025-12-12 15:44:00.498929 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. +revision = 'b96941f53a7b' +down_revision = 'd48192107d61' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/d48192107d61_migration_failed.py b/backend/app/alembic/versions/d48192107d61_migration_failed.py new file mode 100644 index 0000000000..1aaaabb38d --- /dev/null +++ b/backend/app/alembic/versions/d48192107d61_migration_failed.py @@ -0,0 +1,29 @@ +"""migration-failed + +Revision ID: d48192107d61 +Revises: 0dcd3acf382f +Create Date: 2025-12-12 15:34:13.945725 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes + + +# revision identifiers, used by Alembic. +revision = 'd48192107d61' +down_revision = '0dcd3acf382f' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py b/backend/app/alembic/versions/d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py deleted file mode 100755 index 37af1fa215..0000000000 --- a/backend/app/alembic/versions/d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py +++ /dev/null @@ -1,90 +0,0 @@ -"""Edit replace id integers in all models to use UUID instead - -Revision ID: d98dd8ec85a3 -Revises: 9c0a54914c78 -Create Date: 2024-07-19 04:08:04.000976 - -""" -from alembic import op -import sqlalchemy as sa -import sqlmodel.sql.sqltypes -from sqlalchemy.dialects import postgresql - - -# revision identifiers, used by Alembic. -revision = 'd98dd8ec85a3' -down_revision = '9c0a54914c78' -branch_labels = None -depends_on = None - - -def upgrade(): - # Ensure uuid-ossp extension is available - op.execute('CREATE EXTENSION IF NOT EXISTS "uuid-ossp"') - - # Create a new UUID column with a default UUID value - op.add_column('user', sa.Column('new_id', postgresql.UUID(as_uuid=True), default=sa.text('uuid_generate_v4()'))) - op.add_column('item', sa.Column('new_id', postgresql.UUID(as_uuid=True), default=sa.text('uuid_generate_v4()'))) - op.add_column('item', sa.Column('new_owner_id', postgresql.UUID(as_uuid=True), nullable=True)) - - # Populate the new columns with UUIDs - op.execute('UPDATE "user" SET new_id = uuid_generate_v4()') - op.execute('UPDATE item SET new_id = uuid_generate_v4()') - op.execute('UPDATE item SET new_owner_id = (SELECT new_id FROM "user" WHERE "user".id = item.owner_id)') - - # Set the new_id as not nullable - op.alter_column('user', 'new_id', nullable=False) - op.alter_column('item', 'new_id', nullable=False) - - # Drop old columns and rename new columns - op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') 
- op.drop_column('item', 'owner_id') - op.alter_column('item', 'new_owner_id', new_column_name='owner_id') - - op.drop_column('user', 'id') - op.alter_column('user', 'new_id', new_column_name='id') - - op.drop_column('item', 'id') - op.alter_column('item', 'new_id', new_column_name='id') - - # Create primary key constraint - op.create_primary_key('user_pkey', 'user', ['id']) - op.create_primary_key('item_pkey', 'item', ['id']) - - # Recreate foreign key constraint - op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) - -def downgrade(): - # Reverse the upgrade process - op.add_column('user', sa.Column('old_id', sa.Integer, autoincrement=True)) - op.add_column('item', sa.Column('old_id', sa.Integer, autoincrement=True)) - op.add_column('item', sa.Column('old_owner_id', sa.Integer, nullable=True)) - - # Populate the old columns with default values - # Generate sequences for the integer IDs if not exist - op.execute('CREATE SEQUENCE IF NOT EXISTS user_id_seq AS INTEGER OWNED BY "user".old_id') - op.execute('CREATE SEQUENCE IF NOT EXISTS item_id_seq AS INTEGER OWNED BY item.old_id') - - op.execute('SELECT setval(\'user_id_seq\', COALESCE((SELECT MAX(old_id) + 1 FROM "user"), 1), false)') - op.execute('SELECT setval(\'item_id_seq\', COALESCE((SELECT MAX(old_id) + 1 FROM item), 1), false)') - - op.execute('UPDATE "user" SET old_id = nextval(\'user_id_seq\')') - op.execute('UPDATE item SET old_id = nextval(\'item_id_seq\'), old_owner_id = (SELECT old_id FROM "user" WHERE "user".id = item.owner_id)') - - # Drop new columns and rename old columns back - op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') - op.drop_column('item', 'owner_id') - op.alter_column('item', 'old_owner_id', new_column_name='owner_id') - - op.drop_column('user', 'id') - op.alter_column('user', 'old_id', new_column_name='id') - - op.drop_column('item', 'id') - op.alter_column('item', 'old_id', new_column_name='id') - - # Create primary key constraint - 
op.create_primary_key('user_pkey', 'user', ['id']) - op.create_primary_key('item_pkey', 'item', ['id']) - - # Recreate foreign key constraint - op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) diff --git a/backend/app/alembic/versions/e2412789c190_initialize_models.py b/backend/app/alembic/versions/e2412789c190_initialize_models.py deleted file mode 100644 index 7529ea91fa..0000000000 --- a/backend/app/alembic/versions/e2412789c190_initialize_models.py +++ /dev/null @@ -1,54 +0,0 @@ -"""Initialize models - -Revision ID: e2412789c190 -Revises: -Create Date: 2023-11-24 22:55:43.195942 - -""" -import sqlalchemy as sa -import sqlmodel.sql.sqltypes -from alembic import op - -# revision identifiers, used by Alembic. -revision = "e2412789c190" -down_revision = None -branch_labels = None -depends_on = None - - -def upgrade(): - # ### commands auto generated by Alembic - please adjust! ### - op.create_table( - "user", - sa.Column("email", sqlmodel.sql.sqltypes.AutoString(), nullable=False), - sa.Column("is_active", sa.Boolean(), nullable=False), - sa.Column("is_superuser", sa.Boolean(), nullable=False), - sa.Column("full_name", sqlmodel.sql.sqltypes.AutoString(), nullable=True), - sa.Column("id", sa.Integer(), nullable=False), - sa.Column( - "hashed_password", sqlmodel.sql.sqltypes.AutoString(), nullable=False - ), - sa.PrimaryKeyConstraint("id"), - ) - op.create_index(op.f("ix_user_email"), "user", ["email"], unique=True) - op.create_table( - "item", - sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True), - sa.Column("id", sa.Integer(), nullable=False), - sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False), - sa.Column("owner_id", sa.Integer(), nullable=False), - sa.ForeignKeyConstraint( - ["owner_id"], - ["user.id"], - ), - sa.PrimaryKeyConstraint("id"), - ) - # ### end Alembic commands ### - - -def downgrade(): - # ### commands auto generated by Alembic - please adjust! 
### - op.drop_table("item") - op.drop_index(op.f("ix_user_email"), table_name="user") - op.drop_table("user") - # ### end Alembic commands ### diff --git a/backend/app/alembic/versions/ebf4b66990a5_migration_chaged.py b/backend/app/alembic/versions/ebf4b66990a5_migration_chaged.py new file mode 100644 index 0000000000..d1459a464a --- /dev/null +++ b/backend/app/alembic/versions/ebf4b66990a5_migration_chaged.py @@ -0,0 +1,65 @@ +"""migration-chaged + +Revision ID: ebf4b66990a5 +Revises: b96941f53a7b +Create Date: 2025-12-12 16:00:09.983884 + +""" +from alembic import op +import sqlalchemy as sa +import sqlmodel.sql.sqltypes +from sqlalchemy.dialects import postgresql + +# revision identifiers, used by Alembic. +revision = 'ebf4b66990a5' +down_revision = 'b96941f53a7b' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.alter_column('otp', 'created_at', + existing_type=postgresql.TIMESTAMP(timezone=True), + type_=sa.DateTime(), + nullable=False) + op.alter_column('otp', 'updated_at', + existing_type=postgresql.TIMESTAMP(timezone=True), + type_=sa.DateTime(), + nullable=False) + op.alter_column('user', 'role', + existing_type=postgresql.ENUM('user', 'admin', name='userrole'), + nullable=False) + op.alter_column('user', 'created_at', + existing_type=postgresql.TIMESTAMP(timezone=True), + type_=sa.DateTime(), + nullable=False) + op.alter_column('user', 'updated_at', + existing_type=postgresql.TIMESTAMP(timezone=True), + type_=sa.DateTime(), + nullable=False) + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
###
+    op.alter_column('user', 'updated_at',
+               existing_type=sa.DateTime(),
+               type_=postgresql.TIMESTAMP(timezone=True),
+               nullable=True)
+    op.alter_column('user', 'created_at',
+               existing_type=sa.DateTime(),
+               type_=postgresql.TIMESTAMP(timezone=True),
+               nullable=True)
+    op.alter_column('user', 'role',
+               existing_type=postgresql.ENUM('user', 'admin', name='userrole'),
+               nullable=True)
+    op.alter_column('otp', 'updated_at',
+               existing_type=sa.DateTime(),
+               type_=postgresql.TIMESTAMP(timezone=True),
+               nullable=True)
+    op.alter_column('otp', 'created_at',
+               existing_type=sa.DateTime(),
+               type_=postgresql.TIMESTAMP(timezone=True),
+               nullable=True)
+    # ### end Alembic commands ###
diff --git a/backend/app/api/controllers/auth_controller.py b/backend/app/api/controllers/auth_controller.py
new file mode 100644
index 0000000000..964c8b54c4
--- /dev/null
+++ b/backend/app/api/controllers/auth_controller.py
@@ -0,0 +1,119 @@
+from typing import Any, Type
+
+from fastapi.responses import JSONResponse
+from starlette import status
+
+from app.schemas.user import LoginSchema, RegisterSchema
+from app.services.auth_service import AuthService
+from app.utils_helper.messages import Messages as MSG
+from app.schemas.response import ResponseSchema
+from app.core.exceptions import AppException
+
+
+class UserController:
+
+    def __init__(self) -> None:
+        self.service = AuthService()
+        self.response_class: Type[ResponseSchema] = ResponseSchema
+        self.error_class = AppException
+
+    def _success(self, data: Any = None, message: str = "OK", status_code: int = status.HTTP_200_OK) -> JSONResponse:
+        msg = message
+        data_payload = data
+
+        if isinstance(data, dict):
+            msg = data.get("message") or message
+            if "user" in data:
+                data_payload = data.get("user")
+            elif "data" in data:
+                data_payload = data.get("data")
+            if isinstance(data_payload, dict) and "message" in data_payload:
+                data_payload = {k: v for k, v in data_payload.items() if k != "message"}
+
+        payload = self.response_class(
+            success=True,
+            message=msg,
+            data=data_payload,
+            errors=None,
+            meta=None,
+        ).model_dump(exclude_none=True)
+
+        return JSONResponse(status_code=status_code, content=payload)
+
+    def _error(self, message: Any = "Error", errors: Any = None, status_code: int | None = None) -> JSONResponse:
+        code = status_code
+        if isinstance(message, self.error_class):
+            exc = message
+            code = code or getattr(exc, "status_code", status.HTTP_400_BAD_REQUEST)
+            payload = self.response_class(
+                success=False,
+                message=getattr(exc, "message", str(exc)),
+                errors=getattr(exc, "details", None),
+                data=None,
+            ).model_dump(exclude_none=True)
+            return JSONResponse(status_code=code, content=payload)
+
+        # Plain strings and other exceptions share the same fallback handling.
+        code = code or status.HTTP_400_BAD_REQUEST
+        msg = str(message)
+
+        payload = self.response_class(
+            success=False,
+            message=msg,
+            errors=errors,
+            data=None,
+        ).model_dump(exclude_none=True)
+
+        return JSONResponse(status_code=code, content=payload)
+
+    async def login(self, request: LoginSchema) -> JSONResponse:
+        try:
+            result = await self.service.login(request.email, request.password)
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["USER_LOGGED_IN"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def register(self, request: RegisterSchema) -> JSONResponse:
+        try:
+            result = await self.service.register(request.email, request.password, request.first_name, request.last_name, request.phone_number)
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["USER_REGISTERED"], status_code=status.HTTP_201_CREATED)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def verify(self) -> JSONResponse:
+        try:
+            result = await self.service.verify()
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["EMAIL_VERIFIED"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def forgot_password(self) -> JSONResponse:
+        try:
+            result = await self.service.forgot_password(email=None)
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["PASSWORD_RESET_EMAIL_SENT"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def reset_password(self) -> JSONResponse:
+        try:
+            result = await self.service.reset_password(token=None, new_password=None)
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["PASSWORD_HAS_BEEN_RESET"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def resend_email(self) -> JSONResponse:
+        try:
+            result = await self.service.resend_email(email=None)
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["VERIFICATION_EMAIL_RESENT"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
+
+    async def logout(self) -> JSONResponse:
+        try:
+            result = await self.service.logout()
+            return self._success(data=result, message=MSG.AUTH["SUCCESS"]["LOGGED_OUT"], status_code=status.HTTP_200_OK)
+        except Exception as exc:
+            return self._error(exc)
diff --git a/backend/app/api/deps.py b/backend/app/api/deps.py
index c2b83c841d..b2f412f2ec 100644
--- a/backend/app/api/deps.py
+++ b/backend/app/api/deps.py
@@ -1,17 +1,11 @@
 from collections.abc import Generator
 from typing import Annotated
 
-import jwt
-from fastapi import Depends, HTTPException, status
+from fastapi import Depends
 from fastapi.security import OAuth2PasswordBearer
-from jwt.exceptions import InvalidTokenError
-from pydantic import ValidationError
 from sqlmodel import Session
-
-from app.core import security
 from app.core.config import settings
 from app.core.db import engine
-from app.models import TokenPayload, User
 
 reusable_oauth2 = OAuth2PasswordBearer(
     tokenUrl=f"{settings.API_V1_STR}/login/access-token"
 )
@@ -25,33 +19,3 @@ def get_db() -> Generator[Session, None, None]:
 
 SessionDep = Annotated[Session, Depends(get_db)]
 TokenDep = Annotated[str, Depends(reusable_oauth2)]
-
-
-def get_current_user(session: SessionDep, token: TokenDep) -> User:
-    try:
-        payload = jwt.decode(
-            token, settings.SECRET_KEY, algorithms=[security.ALGORITHM]
-        )
-        token_data = TokenPayload(**payload)
-    except (InvalidTokenError, ValidationError):
-        raise HTTPException(
-            status_code=status.HTTP_403_FORBIDDEN,
-            detail="Could not validate credentials",
-        )
-    user = session.get(User, token_data.sub)
-    if not user:
-        raise HTTPException(status_code=404, detail="User not found")
-    if not user.is_active:
-        raise HTTPException(status_code=400, detail="Inactive user")
-    return user
-
-
-CurrentUser = Annotated[User, Depends(get_current_user)]
-
-
-def get_current_active_superuser(current_user: CurrentUser) -> User:
-    if not current_user.is_superuser:
-        raise HTTPException(
-            status_code=403, detail="The user doesn't have enough privileges"
-        )
-    return current_user
diff --git a/backend/app/api/main.py b/backend/app/api/main.py
index eac18c8e8f..b6099194d8 100644
--- a/backend/app/api/main.py
+++ b/backend/app/api/main.py
@@ -1,14 +1,7 @@
 from fastapi import APIRouter
 
-from app.api.routes import items, login, private, users, utils
-from app.core.config import settings
+from app.api.routes import auth, ws
 
 api_router = APIRouter()
-api_router.include_router(login.router)
-api_router.include_router(users.router)
-api_router.include_router(utils.router)
-api_router.include_router(items.router)
-
-
-if settings.ENVIRONMENT == "local":
-    api_router.include_router(private.router)
+api_router.include_router(auth.router)
+api_router.include_router(ws.router)
diff --git a/backend/app/api/routes/auth.py b/backend/app/api/routes/auth.py
new file mode 100644
index 0000000000..a74aed434b
--- /dev/null
+++ b/backend/app/api/routes/auth.py
@@ -0,0 +1,37 @@
+from fastapi import APIRouter
+
+from app.schemas.user import LoginSchema, RegisterSchema
+from app.api.controllers.auth_controller import UserController
+
+router = APIRouter(prefix="/auth", tags=["auth"])
+
+controller = UserController()
+
+
+@router.post("/login")
+async def login(request: LoginSchema):
+    return await controller.login(request)
+
+
+@router.post("/register")
+async def register(request: RegisterSchema):
+    return await controller.register(request)
+
+
+@router.post("/verify")
+async def verify():
+    return await controller.verify()
+
+
+@router.post("/forgot-password")
+async def forgot_password():
+    return await controller.forgot_password()
+
+
+@router.post("/reset-password")
+async def reset_password():
+    return await controller.reset_password()
+
+
+@router.post("/resend-email")
+async def resend_email():
+    return await controller.resend_email()
+
+
+@router.post("/logout")
+async def logout():
+    return await controller.logout()
diff --git a/backend/app/api/routes/items.py b/backend/app/api/routes/items.py
deleted file mode 100644
index 177dc1e476..0000000000
--- a/backend/app/api/routes/items.py
+++ /dev/null
@@ -1,109 +0,0 @@
-import uuid
-from typing import Any
-
-from fastapi import APIRouter, HTTPException
-from sqlmodel import func, select
-
-from app.api.deps import CurrentUser, SessionDep
-from app.models import Item, ItemCreate, ItemPublic, ItemsPublic, ItemUpdate, Message
-
-router = APIRouter(prefix="/items", tags=["items"])
-
-
-@router.get("/", response_model=ItemsPublic)
-def read_items(
-    session: SessionDep, current_user: CurrentUser, skip: int = 0, limit: int = 100
-) -> Any:
-    """
-    Retrieve items.
-    """
-
-    if current_user.is_superuser:
-        count_statement = select(func.count()).select_from(Item)
-        count = session.exec(count_statement).one()
-        statement = select(Item).offset(skip).limit(limit)
-        items = session.exec(statement).all()
-    else:
-        count_statement = (
-            select(func.count())
-            .select_from(Item)
-            .where(Item.owner_id == current_user.id)
-        )
-        count = session.exec(count_statement).one()
-        statement = (
-            select(Item)
-            .where(Item.owner_id == current_user.id)
-            .offset(skip)
-            .limit(limit)
-        )
-        items = session.exec(statement).all()
-
-    return ItemsPublic(data=items, count=count)
-
-
-@router.get("/{id}", response_model=ItemPublic)
-def read_item(session: SessionDep, current_user: CurrentUser, id: uuid.UUID) -> Any:
-    """
-    Get item by ID.
-    """
-    item = session.get(Item, id)
-    if not item:
-        raise HTTPException(status_code=404, detail="Item not found")
-    if not current_user.is_superuser and (item.owner_id != current_user.id):
-        raise HTTPException(status_code=400, detail="Not enough permissions")
-    return item
-
-
-@router.post("/", response_model=ItemPublic)
-def create_item(
-    *, session: SessionDep, current_user: CurrentUser, item_in: ItemCreate
-) -> Any:
-    """
-    Create new item.
-    """
-    item = Item.model_validate(item_in, update={"owner_id": current_user.id})
-    session.add(item)
-    session.commit()
-    session.refresh(item)
-    return item
-
-
-@router.put("/{id}", response_model=ItemPublic)
-def update_item(
-    *,
-    session: SessionDep,
-    current_user: CurrentUser,
-    id: uuid.UUID,
-    item_in: ItemUpdate,
-) -> Any:
-    """
-    Update an item.
-    """
-    item = session.get(Item, id)
-    if not item:
-        raise HTTPException(status_code=404, detail="Item not found")
-    if not current_user.is_superuser and (item.owner_id != current_user.id):
-        raise HTTPException(status_code=400, detail="Not enough permissions")
-    update_dict = item_in.model_dump(exclude_unset=True)
-    item.sqlmodel_update(update_dict)
-    session.add(item)
-    session.commit()
-    session.refresh(item)
-    return item
-
-
-@router.delete("/{id}")
-def delete_item(
-    session: SessionDep, current_user: CurrentUser, id: uuid.UUID
-) -> Message:
-    """
-    Delete an item.
-    """
-    item = session.get(Item, id)
-    if not item:
-        raise HTTPException(status_code=404, detail="Item not found")
-    if not current_user.is_superuser and (item.owner_id != current_user.id):
-        raise HTTPException(status_code=400, detail="Not enough permissions")
-    session.delete(item)
-    session.commit()
-    return Message(message="Item deleted successfully")
diff --git a/backend/app/api/routes/login.py b/backend/app/api/routes/login.py
deleted file mode 100644
index 980c66f86f..0000000000
--- a/backend/app/api/routes/login.py
+++ /dev/null
@@ -1,124 +0,0 @@
-from datetime import timedelta
-from typing import Annotated, Any
-
-from fastapi import APIRouter, Depends, HTTPException
-from fastapi.responses import HTMLResponse
-from fastapi.security import OAuth2PasswordRequestForm
-
-from app import crud
-from app.api.deps import CurrentUser, SessionDep, get_current_active_superuser
-from app.core import security
-from app.core.config import settings
-from app.core.security import get_password_hash
-from app.models import Message, NewPassword, Token, UserPublic
-from app.utils import (
-    generate_password_reset_token,
-    generate_reset_password_email,
-    send_email,
-    verify_password_reset_token,
-)
-
-router = APIRouter(tags=["login"])
-
-
-@router.post("/login/access-token")
-def login_access_token(
-    session: SessionDep, form_data: Annotated[OAuth2PasswordRequestForm, Depends()]
-) -> Token:
-    """
-    OAuth2 compatible token login, get an access token for future requests
-    """
-    user = crud.authenticate(
-        session=session, email=form_data.username, password=form_data.password
-    )
-    if not user:
-        raise HTTPException(status_code=400, detail="Incorrect email or password")
-    elif not user.is_active:
-        raise HTTPException(status_code=400, detail="Inactive user")
-    access_token_expires = timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
-    return Token(
-        access_token=security.create_access_token(
-            user.id, expires_delta=access_token_expires
-        )
-    )
-
-
-@router.post("/login/test-token", response_model=UserPublic)
-def test_token(current_user: CurrentUser) -> Any:
-    """
-    Test access token
-    """
-    return current_user
-
-
-@router.post("/password-recovery/{email}")
-def recover_password(email: str, session: SessionDep) -> Message:
-    """
-    Password Recovery
-    """
-    user = crud.get_user_by_email(session=session, email=email)
-
-    if not user:
-        raise HTTPException(
-            status_code=404,
-            detail="The user with this email does not exist in the system.",
-        )
-    password_reset_token = generate_password_reset_token(email=email)
-    email_data = generate_reset_password_email(
-        email_to=user.email, email=email, token=password_reset_token
-    )
-    send_email(
-        email_to=user.email,
-        subject=email_data.subject,
-        html_content=email_data.html_content,
-    )
-    return Message(message="Password recovery email sent")
-
-
-@router.post("/reset-password/")
-def reset_password(session: SessionDep, body: NewPassword) -> Message:
-    """
-    Reset password
-    """
-    email = verify_password_reset_token(token=body.token)
-    if not email:
-        raise HTTPException(status_code=400, detail="Invalid token")
-    user = crud.get_user_by_email(session=session, email=email)
-    if not user:
-        raise HTTPException(
-            status_code=404,
-            detail="The user with this email does not exist in the system.",
-        )
-    elif not user.is_active:
-        raise HTTPException(status_code=400, detail="Inactive user")
-    hashed_password = get_password_hash(password=body.new_password)
-    user.hashed_password = hashed_password
-    session.add(user)
-    session.commit()
-    return Message(message="Password updated successfully")
-
-
-@router.post(
-    "/password-recovery-html-content/{email}",
-    dependencies=[Depends(get_current_active_superuser)],
-    response_class=HTMLResponse,
-)
-def recover_password_html_content(email: str, session: SessionDep) -> Any:
-    """
-    HTML Content for Password Recovery
-    """
-    user = crud.get_user_by_email(session=session, email=email)
-
-    if not user:
-        raise HTTPException(
-            status_code=404,
-            detail="The user with this username does not exist in the system.",
-        )
-    password_reset_token = generate_password_reset_token(email=email)
-    email_data = generate_reset_password_email(
-        email_to=user.email, email=email, token=password_reset_token
-    )
-
-    return HTMLResponse(
-        content=email_data.html_content, headers={"subject:": email_data.subject}
-    )
diff --git a/backend/app/api/routes/private.py b/backend/app/api/routes/private.py
deleted file mode 100644
index 9f33ef1900..0000000000
--- a/backend/app/api/routes/private.py
+++ /dev/null
@@ -1,38 +0,0 @@
-from typing import Any
-
-from fastapi import APIRouter
-from pydantic import BaseModel
-
-from app.api.deps import SessionDep
-from app.core.security import get_password_hash
-from app.models import (
-    User,
-    UserPublic,
-)
-
-router = APIRouter(tags=["private"], prefix="/private")
-
-
-class PrivateUserCreate(BaseModel):
-    email: str
-    password: str
-    full_name: str
-    is_verified: bool = False
-
-
-@router.post("/users/", response_model=UserPublic)
-def create_user(user_in: PrivateUserCreate, session: SessionDep) -> Any:
-    """
-    Create a new user.
-    """
-
-    user = User(
-        email=user_in.email,
-        full_name=user_in.full_name,
-        hashed_password=get_password_hash(user_in.password),
-    )
-
-    session.add(user)
-    session.commit()
-
-    return user
diff --git a/backend/app/api/routes/users.py b/backend/app/api/routes/users.py
deleted file mode 100644
index 6429818458..0000000000
--- a/backend/app/api/routes/users.py
+++ /dev/null
@@ -1,226 +0,0 @@
-import uuid
-from typing import Any
-
-from fastapi import APIRouter, Depends, HTTPException
-from sqlmodel import col, delete, func, select
-
-from app import crud
-from app.api.deps import (
-    CurrentUser,
-    SessionDep,
-    get_current_active_superuser,
-)
-from app.core.config import settings
-from app.core.security import get_password_hash, verify_password
-from app.models import (
-    Item,
-    Message,
-    UpdatePassword,
-    User,
-    UserCreate,
-    UserPublic,
-    UserRegister,
-    UsersPublic,
-    UserUpdate,
-    UserUpdateMe,
-)
-from app.utils import generate_new_account_email, send_email
-
-router = APIRouter(prefix="/users", tags=["users"])
-
-
-@router.get(
-    "/",
-    dependencies=[Depends(get_current_active_superuser)],
-    response_model=UsersPublic,
-)
-def read_users(session: SessionDep, skip: int = 0, limit: int = 100) -> Any:
-    """
-    Retrieve users.
-    """
-
-    count_statement = select(func.count()).select_from(User)
-    count = session.exec(count_statement).one()
-
-    statement = select(User).offset(skip).limit(limit)
-    users = session.exec(statement).all()
-
-    return UsersPublic(data=users, count=count)
-
-
-@router.post(
-    "/", dependencies=[Depends(get_current_active_superuser)], response_model=UserPublic
-)
-def create_user(*, session: SessionDep, user_in: UserCreate) -> Any:
-    """
-    Create new user.
-    """
-    user = crud.get_user_by_email(session=session, email=user_in.email)
-    if user:
-        raise HTTPException(
-            status_code=400,
-            detail="The user with this email already exists in the system.",
-        )
-
-    user = crud.create_user(session=session, user_create=user_in)
-    if settings.emails_enabled and user_in.email:
-        email_data = generate_new_account_email(
-            email_to=user_in.email, username=user_in.email, password=user_in.password
-        )
-        send_email(
-            email_to=user_in.email,
-            subject=email_data.subject,
-            html_content=email_data.html_content,
-        )
-    return user
-
-
-@router.patch("/me", response_model=UserPublic)
-def update_user_me(
-    *, session: SessionDep, user_in: UserUpdateMe, current_user: CurrentUser
-) -> Any:
-    """
-    Update own user.
-    """
-
-    if user_in.email:
-        existing_user = crud.get_user_by_email(session=session, email=user_in.email)
-        if existing_user and existing_user.id != current_user.id:
-            raise HTTPException(
-                status_code=409, detail="User with this email already exists"
-            )
-    user_data = user_in.model_dump(exclude_unset=True)
-    current_user.sqlmodel_update(user_data)
-    session.add(current_user)
-    session.commit()
-    session.refresh(current_user)
-    return current_user
-
-
-@router.patch("/me/password", response_model=Message)
-def update_password_me(
-    *, session: SessionDep, body: UpdatePassword, current_user: CurrentUser
-) -> Any:
-    """
-    Update own password.
-    """
-    if not verify_password(body.current_password, current_user.hashed_password):
-        raise HTTPException(status_code=400, detail="Incorrect password")
-    if body.current_password == body.new_password:
-        raise HTTPException(
-            status_code=400, detail="New password cannot be the same as the current one"
-        )
-    hashed_password = get_password_hash(body.new_password)
-    current_user.hashed_password = hashed_password
-    session.add(current_user)
-    session.commit()
-    return Message(message="Password updated successfully")
-
-
-@router.get("/me", response_model=UserPublic)
-def read_user_me(current_user: CurrentUser) -> Any:
-    """
-    Get current user.
-    """
-    return current_user
-
-
-@router.delete("/me", response_model=Message)
-def delete_user_me(session: SessionDep, current_user: CurrentUser) -> Any:
-    """
-    Delete own user.
-    """
-    if current_user.is_superuser:
-        raise HTTPException(
-            status_code=403, detail="Super users are not allowed to delete themselves"
-        )
-    session.delete(current_user)
-    session.commit()
-    return Message(message="User deleted successfully")
-
-
-@router.post("/signup", response_model=UserPublic)
-def register_user(session: SessionDep, user_in: UserRegister) -> Any:
-    """
-    Create new user without the need to be logged in.
-    """
-    user = crud.get_user_by_email(session=session, email=user_in.email)
-    if user:
-        raise HTTPException(
-            status_code=400,
-            detail="The user with this email already exists in the system",
-        )
-    user_create = UserCreate.model_validate(user_in)
-    user = crud.create_user(session=session, user_create=user_create)
-    return user
-
-
-@router.get("/{user_id}", response_model=UserPublic)
-def read_user_by_id(
-    user_id: uuid.UUID, session: SessionDep, current_user: CurrentUser
-) -> Any:
-    """
-    Get a specific user by id.
-    """
-    user = session.get(User, user_id)
-    if user == current_user:
-        return user
-    if not current_user.is_superuser:
-        raise HTTPException(
-            status_code=403,
-            detail="The user doesn't have enough privileges",
-        )
-    return user
-
-
-@router.patch(
-    "/{user_id}",
-    dependencies=[Depends(get_current_active_superuser)],
-    response_model=UserPublic,
-)
-def update_user(
-    *,
-    session: SessionDep,
-    user_id: uuid.UUID,
-    user_in: UserUpdate,
-) -> Any:
-    """
-    Update a user.
-    """
-
-    db_user = session.get(User, user_id)
-    if not db_user:
-        raise HTTPException(
-            status_code=404,
-            detail="The user with this id does not exist in the system",
-        )
-    if user_in.email:
-        existing_user = crud.get_user_by_email(session=session, email=user_in.email)
-        if existing_user and existing_user.id != user_id:
-            raise HTTPException(
-                status_code=409, detail="User with this email already exists"
-            )
-
-    db_user = crud.update_user(session=session, db_user=db_user, user_in=user_in)
-    return db_user
-
-
-@router.delete("/{user_id}", dependencies=[Depends(get_current_active_superuser)])
-def delete_user(
-    session: SessionDep, current_user: CurrentUser, user_id: uuid.UUID
-) -> Message:
-    """
-    Delete a user.
-    """
-    user = session.get(User, user_id)
-    if not user:
-        raise HTTPException(status_code=404, detail="User not found")
-    if user == current_user:
-        raise HTTPException(
-            status_code=403, detail="Super users are not allowed to delete themselves"
-        )
-    statement = delete(Item).where(col(Item.owner_id) == user_id)
-    session.exec(statement)  # type: ignore
-    session.delete(user)
-    session.commit()
-    return Message(message="User deleted successfully")
diff --git a/backend/app/api/routes/utils.py b/backend/app/api/routes/utils.py
deleted file mode 100644
index fc093419b3..0000000000
--- a/backend/app/api/routes/utils.py
+++ /dev/null
@@ -1,31 +0,0 @@
-from fastapi import APIRouter, Depends
-from pydantic.networks import EmailStr
-
-from app.api.deps import get_current_active_superuser
-from app.models import Message
-from app.utils import generate_test_email, send_email
-
-router = APIRouter(prefix="/utils", tags=["utils"])
-
-
-@router.post(
-    "/test-email/",
-    dependencies=[Depends(get_current_active_superuser)],
-    status_code=201,
-)
-def test_email(email_to: EmailStr) -> Message:
-    """
-    Test emails.
-    """
-    email_data = generate_test_email(email_to=email_to)
-    send_email(
-        email_to=email_to,
-        subject=email_data.subject,
-        html_content=email_data.html_content,
-    )
-    return Message(message="Test email sent")
-
-
-@router.get("/health-check/")
-async def health_check() -> bool:
-    return True
diff --git a/backend/app/api/routes/ws.py b/backend/app/api/routes/ws.py
new file mode 100644
index 0000000000..4f55e6aca2
--- /dev/null
+++ b/backend/app/api/routes/ws.py
@@ -0,0 +1,128 @@
+from fastapi import APIRouter, WebSocket, WebSocketDisconnect, status
+import json
+import time
+from datetime import datetime
+from sqlmodel import Session, select
+from app.core.db import engine
+import jwt
+
+from app.core.config import settings
+from app.core import security
+from app.models.user import User
+
+router = APIRouter()
+
+
+def _sanitize_text(value: str) -> str:
+    # Escape angle brackets so clients that render messages as HTML
+    # cannot be fed raw markup.
+    return value.replace("<", "&lt;").replace(">", "&gt;")
+
+
+async def _verify_websocket_token(token: str) -> User | None:
+    try:
+        payload = jwt.decode(token, settings.SECRET_KEY, algorithms=[security.ALGORITHM])
+        subject = payload.get("sub")
+        if not subject:
+            return None
+        with Session(engine) as session:
+            statement = select(User).where(User.id == subject)
+            user = session.exec(statement).first()
+            return user
+    except jwt.ExpiredSignatureError:
+        return None
+    except jwt.PyJWTError:
+        return None
+
+
+async def _verify_room_access(room: str, user: User) -> bool:
+    # Placeholder: replace with real ACL checks (room membership, roles, etc.)
+    # For now allow access to all rooms; restrict if necessary.
+    return True
+
+
+@router.websocket("/ws/{room}")
+async def websocket_endpoint(websocket: WebSocket, room: str):
+
+    # 1. Authenticate (token passed as query param `?token=...`)
+    token = websocket.query_params.get("token")
+    if not token:
+        await websocket.close(code=status.WS_1008_POLICY_VIOLATION)
+        return
+
+    user = await _verify_websocket_token(token)
+    if not user:
+        await websocket.close(code=status.WS_1008_POLICY_VIOLATION)
+        return
+
+    # 2. Authorize room access
+    if not await _verify_room_access(room, user):
+        await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)
+        return
+
+    manager = websocket.app.state.ws_manager
+    # Attach user context on connect
+    await manager.connect(websocket, room, user_id=str(user.id))
+
+    # Rate limiting params
+    MAX_MESSAGE_SIZE = 64 * 1024
+    MAX_MESSAGES_PER_MINUTE = 60
+    window = 60
+
+    try:
+        while True:
+            data = await websocket.receive_text()
+
+            if len(data) > MAX_MESSAGE_SIZE:
+                # ignore oversized messages
+                continue
+
+            # Redis-backed per-user+room rate limiting (zset window)
+            redis = getattr(websocket.app.state, "redis", None)
+            if redis:
+                now = time.time()
+                window_start = now - window
+                key = f"ws_rate:{room}:{user.id}"
+                member = f"{now}-{now}"
+                try:
+                    pipe = redis.pipeline()
+                    pipe.zremrangebyscore(key, 0, window_start)
+                    pipe.zadd(key, {member: now})
+                    pipe.zcard(key)
+                    pipe.expire(key, window)
+                    results = await pipe.execute()
+                    current_count = int(results[2]) if len(results) >= 3 and results[2] is not None else 0
+                except Exception:
+                    current_count = 0
+                if current_count > MAX_MESSAGES_PER_MINUTE:
+                    # Optionally notify client before closing
+                    await websocket.send_text(json.dumps({"error": "rate_limit_exceeded"}))
+                    await websocket.close(code=status.WS_1011_INTERNAL_ERROR)
+                    break
+
+            # Validate JSON and message shape
+            try:
+                message = json.loads(data)
+            except json.JSONDecodeError:
+                # ignore invalid JSON
+                continue
+
+            # Minimal validation: expect an object with 'text' field (string)
+            text = message.get("text")
+            if not isinstance(text, str):
+                continue
+
+            # Sanitize text to mitigate XSS if frontends render unescaped
+            message["text"] = _sanitize_text(text)
+            message["user_id"] = str(user.id)
+            message["timestamp"] = datetime.utcnow().isoformat()
+
+            await manager.publish(room, json.dumps(message))
+    except WebSocketDisconnect:
+        try:
+            await manager.disconnect(websocket, room)
+        except Exception:
+            pass
+    finally:
+        try:
+            await manager.disconnect(websocket, room)
+        except Exception:
+            pass
diff --git a/backend/app/api/websocket_manager.py b/backend/app/api/websocket_manager.py
new file mode 100644
index 0000000000..e0c5fbae0c
--- /dev/null
+++ b/backend/app/api/websocket_manager.py
@@ -0,0 +1,109 @@
+import asyncio
+import logging
+from typing import Dict, Set
+
+from fastapi import WebSocket
+
+logger = logging.getLogger(__name__)
+
+
+class WebSocketManager:
+    def __init__(self, redis_client):
+        self.redis = redis_client
+        self.connections: Dict[str, Set[WebSocket]] = {}
+        self._pubsub = None
+        self._listen_task: asyncio.Task | None = None
+
+    async def start(self) -> None:
+        if not self.redis:
+            logger.warning("WebSocketManager start skipped: no redis client provided")
+            return
+
+        max_retries = 5
+        base_delay = 1.0
+        for attempt in range(1, max_retries + 1):
+            try:
+                result = await self.redis.ping()
+                if not result:
+                    raise Exception("Redis ping failed")
+                logger.info("Starting WebSocketManager redis listener: %s", getattr(self.redis, "pubsub", None))
+                self._pubsub = self.redis.pubsub()
+                await self._pubsub.psubscribe("ws:*")
+                self._listen_task = asyncio.create_task(self._reader_loop())
+                return
+            except Exception as e:
+                logger.warning(f"WebSocketManager start attempt {attempt} failed: {e}")
+                if attempt == max_retries:
+                    logger.warning("WebSocketManager failed to start after retries; continuing without Redis listener")
+                    return
+                await asyncio.sleep(base_delay * (2 ** (attempt - 1)))
+
+    async def _reader_loop(self) -> None:
+        try:
+            async for message in self._pubsub.listen():
+                if not message:
+                    continue
+                mtype = message.get("type")
+                # handle pmessage (pattern) and message
+                if mtype not in ("pmessage", "message"):
+                    continue
+                # redis.asyncio returns bytes for channel/data in some setups
+                channel = message.get("channel") or message.get("pattern")
+                data = message.get("data")
+                if isinstance(channel, (bytes, bytearray)):
+                    channel = channel.decode()
+                if isinstance(data, (bytes, bytearray)):
+                    data = data.decode()
+                # channel format: ws:<room>
+                try:
+                    room = str(channel).split("ws:", 1)[1]
+                except Exception:
+                    continue
+                await self._broadcast_to_local(room, data)
+        except asyncio.CancelledError:
+            logger.info("WebSocketManager listener task cancelled")
+        except Exception as e:
+            logger.exception(f"WebSocketManager listener error: {e}")
+
+    async def publish(self, room: str, message: str) -> None:
+        try:
+            await self.redis.publish(f"ws:{room}", message)
+        except Exception as e:
+            logger.warning(f"Failed to publish websocket message: {e}")
+
+    async def connect(self, websocket: WebSocket, room: str, user_id: str | None = None) -> None:
+        await websocket.accept()
+        self.connections.setdefault(room, set()).add(websocket)
+
+    async def disconnect(self, websocket: WebSocket, room: str) -> None:
+        conns = self.connections.get(room)
+        if not conns:
+            return
+        conns.discard(websocket)
+        if not conns:
+            self.connections.pop(room, None)
+
+    async def send_personal(self, websocket: WebSocket, message: str) -> None:
+        await websocket.send_text(message)
+
+    async def _broadcast_to_local(self, room: str, message: str) -> None:
+        conns = list(self.connections.get(room, []))
+        for ws in conns:
+            try:
+                await ws.send_text(message)
+            except Exception:
+                logger.exception("Error sending websocket message to local connection")
+
+    async def stop(self) -> None:
+        if self._listen_task:
+            self._listen_task.cancel()
+            try:
+                await self._listen_task
+            except Exception:
+                logger.exception("Error waiting for websocket listener task to stop")
+        if self._pubsub:
+            try:
+                await self._pubsub.close()
+            except Exception:
+                logger.exception("Error closing websocket pubsub")
diff --git a/backend/app/backend_pre_start.py b/backend/app/backend_pre_start.py
index c2f8e29ae1..8ac968576b 100644
--- a/backend/app/backend_pre_start.py
+++ b/backend/app/backend_pre_start.py
@@ -4,7 +4,7 @@
 from sqlmodel import Session, select
 from tenacity import after_log, before_log, retry, stop_after_attempt, wait_fixed
 
-from app.core.db import engine
+from app.core.db import get_engine
 
 logging.basicConfig(level=logging.INFO)
 logger = logging.getLogger(__name__)
@@ -31,7 +31,7 @@ def init(db_engine: Engine) -> None:
 
 def main() -> None:
     logger.info("Initializing service")
-    init(engine)
+    init(get_engine())
     logger.info("Service finished initializing")
diff --git a/backend/app/core/celery_app.py b/backend/app/core/celery_app.py
new file mode 100644
index 0000000000..903311b9a1
--- /dev/null
+++ b/backend/app/core/celery_app.py
@@ -0,0 +1,28 @@
+from __future__ import annotations
+
+from celery import Celery
+
+from .config import settings
+
+
+broker_url = settings.CELERY_BROKER_URL or settings.REDIS_URL
+result_backend = settings.CELERY_RESULT_BACKEND or settings.REDIS_URL
+
+
+celery_app = Celery(
+    settings.PROJECT_NAME if getattr(settings, "PROJECT_NAME", None) else "app",
+    broker=broker_url,
+    backend=result_backend,
+)
+
+
+celery_app.conf.update(
+    result_expires=3600,
+    task_serializer="json",
+    result_serializer="json",
+    accept_content=["json"],
+    timezone="UTC",
+    enable_utc=True,
+)
+
+celery_app.autodiscover_tasks(["app.workers"])
diff --git a/backend/app/core/config.py b/backend/app/core/config.py
index 650b9f7910..e9f6204eac 100644
--- a/backend/app/core/config.py
+++ b/backend/app/core/config.py
@@ -55,6 +55,21 @@ def all_cors_origins(self) -> list[str]:
     POSTGRES_USER: str
     POSTGRES_PASSWORD: str = ""
     POSTGRES_DB: str = ""
+    # Redis connection URL. Default points to the compose service `redis`.
+ REDIS_URL: str = "redis://redis:6379/0" + # Celery broker/result backend. By default reuse `REDIS_URL` so you can + # configure an Upstash or other hosted Redis via `REDIS_URL` or explicitly + # via `CELERY_BROKER_URL` / `CELERY_RESULT_BACKEND` env vars. + CELERY_BROKER_URL: str | None = None + CELERY_RESULT_BACKEND: str | None = None + + # Cloudflare R2 (S3 compatible) settings + R2_ENABLED: bool = False + R2_ACCOUNT_ID: str | None = None + R2_ACCESS_KEY_ID: str | None = None + R2_SECRET_ACCESS_KEY: str | None = None + R2_BUCKET: str | None = None + R2_ENDPOINT_URL: AnyUrl | None = None @computed_field # type: ignore[prop-decorator] @property @@ -90,10 +105,47 @@ def _set_default_emails_from(self) -> Self: def emails_enabled(self) -> bool: return bool(self.SMTP_HOST and self.EMAILS_FROM_EMAIL) + @computed_field # type: ignore[prop-decorator] + @property + def r2_endpoint(self) -> str | None: + """Return explicit endpoint URL if set, otherwise construct from account id.""" + if self.R2_ENDPOINT_URL: + return str(self.R2_ENDPOINT_URL) + if self.R2_ACCOUNT_ID: + return f"https://{self.R2_ACCOUNT_ID}.r2.cloudflarestorage.com" + return None + + @computed_field # type: ignore[prop-decorator] + @property + def r2_enabled(self) -> bool: + """Whether R2 integration is configured/enabled.""" + if not self.R2_ENABLED: + return False + return bool(self.R2_BUCKET and self.R2_ACCESS_KEY_ID and self.R2_SECRET_ACCESS_KEY) + + @computed_field # type: ignore[prop-decorator] + @property + def r2_boto3_config(self) -> dict[str, Any]: + """Return a dict of kwargs suitable for boto3/aioboto3 client creation.""" + if not self.r2_enabled: + return {} + cfg: dict[str, Any] = { + "aws_access_key_id": self.R2_ACCESS_KEY_ID, + "aws_secret_access_key": self.R2_SECRET_ACCESS_KEY, + } + endpoint = self.r2_endpoint + if endpoint: + cfg["endpoint_url"] = endpoint + return cfg + EMAIL_TEST_USER: EmailStr = "test@example.com" FIRST_SUPERUSER: EmailStr FIRST_SUPERUSER_PASSWORD: str + # WebEngage 
transactional email settings + WEBENGAGE_API_URL: HttpUrl | None = None + WEBENGAGE_API_KEY: str | None = None + def _check_default_secret(self, var_name: str, value: str | None) -> None: if value == "changethis": message = ( @@ -115,5 +167,11 @@ def _enforce_non_default_secrets(self) -> Self: return self + @computed_field # type: ignore[prop-decorator] + @property + def webengage_enabled(self) -> bool: + """Whether WebEngage transactional email integration is configured.""" + return bool(self.WEBENGAGE_API_URL and self.WEBENGAGE_API_KEY) + settings = Settings() # type: ignore diff --git a/backend/app/core/db.py b/backend/app/core/db.py index ba991fb36d..93dcc14c23 100644 --- a/backend/app/core/db.py +++ b/backend/app/core/db.py @@ -1,33 +1,18 @@ -from sqlmodel import Session, create_engine, select +from sqlmodel import create_engine +from sqlalchemy import Engine -from app import crud from app.core.config import settings -from app.models import User, UserCreate -engine = create_engine(str(settings.SQLALCHEMY_DATABASE_URI)) +_engine: Engine | None = None -# make sure all SQLModel models are imported (app.models) before initializing DB -# otherwise, SQLModel might fail to initialize relationships properly -# for more details: https://github.com/fastapi/full-stack-fastapi-template/issues/28 +def get_engine() -> Engine: + """Return a cached SQLAlchemy Engine instance.""" + global _engine + if _engine is None: + _engine = create_engine(str(settings.SQLALCHEMY_DATABASE_URI)) + return _engine -def init_db(session: Session) -> None: - # Tables should be created with Alembic migrations - # But if you don't want to use migrations, create - # the tables un-commenting the next lines - # from sqlmodel import SQLModel +engine = get_engine() - # This works because the models are already imported and registered from app.models - # SQLModel.metadata.create_all(engine) - - user = session.exec( - select(User).where(User.email == settings.FIRST_SUPERUSER) - ).first() - if not user: - 
user_in = UserCreate( - email=settings.FIRST_SUPERUSER, - password=settings.FIRST_SUPERUSER_PASSWORD, - is_superuser=True, - ) - user = crud.create_user(session=session, user_create=user_in) diff --git a/backend/app/core/exceptions.py b/backend/app/core/exceptions.py new file mode 100644 index 0000000000..7157a4b2a8 --- /dev/null +++ b/backend/app/core/exceptions.py @@ -0,0 +1,36 @@ + +from typing import Any, Optional + + +class AppException(Exception): + + def __init__(self, message: str = "Application error", status_code: int = 400, details: Optional[Any] = None): + super().__init__(message) + self.message = message + self.status_code = status_code + self.details = details + + def to_dict(self) -> dict: + return {"success": False, "message": self.message, "errors": self.details} + + def __str__(self) -> str: + return self.message + + +class NotFoundException(AppException): + + def __init__(self, message: str = "Not found", details: Optional[Any] = None): + super().__init__(message=message, status_code=404, details=details) + + +class UnauthorizedException(AppException): + + def __init__(self, message: str = "Unauthorized", details: Optional[Any] = None): + super().__init__(message=message, status_code=401, details=details) + + +class ForbiddenException(AppException): + + def __init__(self, message: str = "Forbidden", details: Optional[Any] = None): + super().__init__(message=message, status_code=403, details=details) + diff --git a/backend/app/core/r2.py b/backend/app/core/r2.py new file mode 100644 index 0000000000..fcf5306b92 --- /dev/null +++ b/backend/app/core/r2.py @@ -0,0 +1,68 @@ +from __future__ import annotations + +from typing import Optional + +import aioboto3 +from botocore.exceptions import ClientError + +from .config import settings + + +async def upload_bytes( + key: str, + data: bytes, + bucket: Optional[str] = None, + content_type: Optional[str] = None, +) -> None: + bucket = bucket or settings.R2_BUCKET + if not settings.r2_enabled: + raise 
RuntimeError("R2 is not configured")
+
+    # aioboto3 >= 8 removed the module-level client() helper; clients must be
+    # created from an aioboto3.Session.
+    session = aioboto3.Session()
+    async with session.client("s3", **settings.r2_boto3_config) as client:
+        params = {"Bucket": bucket, "Key": key, "Body": data}
+        if content_type:
+            params["ContentType"] = content_type
+        await client.put_object(**params)
+
+
+async def download_bytes(key: str, bucket: Optional[str] = None) -> bytes:
+    bucket = bucket or settings.R2_BUCKET
+    if not settings.r2_enabled:
+        raise RuntimeError("R2 is not configured")
+
+    session = aioboto3.Session()
+    async with session.client("s3", **settings.r2_boto3_config) as client:
+        resp = await client.get_object(Bucket=bucket, Key=key)
+        async with resp["Body"] as stream:
+            return await stream.read()
+
+
+async def delete_object(key: str, bucket: Optional[str] = None) -> None:
+    bucket = bucket or settings.R2_BUCKET
+    if not settings.r2_enabled:
+        raise RuntimeError("R2 is not configured")
+
+    session = aioboto3.Session()
+    async with session.client("s3", **settings.r2_boto3_config) as client:
+        await client.delete_object(Bucket=bucket, Key=key)
+
+
+async def generate_presigned_url(key: str, expires_in: int = 3600, bucket: Optional[str] = None) -> str:
+    bucket = bucket or settings.R2_BUCKET
+    if not settings.r2_enabled:
+        raise RuntimeError("R2 is not configured")
+
+    session = aioboto3.Session()
+    async with session.client("s3", **settings.r2_boto3_config) as client:
+        # aiobotocore exposes generate_presigned_url as a coroutine, so it
+        # must be awaited before the URL string is returned.
+        return await client.generate_presigned_url(
+            "get_object",
+            Params={"Bucket": bucket, "Key": key},
+            ExpiresIn=expires_in,
+        )
+
+
+__all__ = [
+    "upload_bytes",
+    "download_bytes",
+    "delete_object",
+    "generate_presigned_url",
+]
diff --git a/backend/app/core/redis.py b/backend/app/core/redis.py
new file mode 100644
index 0000000000..d44b6474f0
--- /dev/null
+++ b/backend/app/core/redis.py
@@ -0,0 +1,93 @@
+import asyncio
+import json
+import logging
+from typing import Optional, AsyncGenerator
+
+import redis.asyncio as aioredis
+
+from app.core.config import settings
+
+logger = logging.getLogger(__name__)
+
+_redis_pool:
Optional[aioredis.ConnectionPool] = None +_pool_lock = asyncio.Lock() + + +async def get_redis_pool() -> aioredis.ConnectionPool: + global _redis_pool + if _redis_pool is None: + async with _pool_lock: + if _redis_pool is None: + _redis_pool = aioredis.ConnectionPool.from_url( + settings.REDIS_URL, + encoding="utf-8", + decode_responses=True, + max_connections=50, + socket_connect_timeout=5, + health_check_interval=30, + ) + logger.info("Redis connection pool created") + return _redis_pool + + +async def get_redis() -> AsyncGenerator[aioredis.Redis, None]: + pool = await get_redis_pool() + redis = aioredis.Redis(connection_pool=pool) + try: + yield redis + finally: + try: + await redis.close() + except Exception: + logger.exception("Error closing temporary Redis client") + + +async def create_redis_client() -> aioredis.Redis: + pool = await get_redis_pool() + client = aioredis.Redis(connection_pool=pool) + logger.info("Redis client created from pool") + return client + + +async def close_redis_pool(): + global _redis_pool + if _redis_pool is not None: + try: + await _redis_pool.disconnect() + logger.info("Redis connection pool disconnected") + except Exception: + logger.exception("Error disconnecting Redis pool") + finally: + _redis_pool = None + + +class CacheService: + def __init__(self, redis_client: aioredis.Redis): + self.redis = redis_client + + async def get(self, key: str) -> Optional[dict]: + try: + value = await self.redis.get(key) + return json.loads(value) if value else None + except Exception as e: + logger.error(f"Redis GET error: {e}") + return None + + async def set(self, key: str, value: dict, expire: int = 3600): + try: + await self.redis.set(key, json.dumps(value), ex=expire) + except Exception as e: + logger.error(f"Redis SET error: {e}") + + async def delete(self, key: str): + try: + await self.redis.delete(key) + except Exception as e: + logger.error(f"Redis DELETE error: {e}") + + async def exists(self, key: str) -> bool: + try: + return 
await self.redis.exists(key) > 0 + except Exception as e: + logger.error(f"Redis EXISTS error: {e}") + return False diff --git a/backend/app/crud.py b/backend/app/crud.py deleted file mode 100644 index 905bf48724..0000000000 --- a/backend/app/crud.py +++ /dev/null @@ -1,54 +0,0 @@ -import uuid -from typing import Any - -from sqlmodel import Session, select - -from app.core.security import get_password_hash, verify_password -from app.models import Item, ItemCreate, User, UserCreate, UserUpdate - - -def create_user(*, session: Session, user_create: UserCreate) -> User: - db_obj = User.model_validate( - user_create, update={"hashed_password": get_password_hash(user_create.password)} - ) - session.add(db_obj) - session.commit() - session.refresh(db_obj) - return db_obj - - -def update_user(*, session: Session, db_user: User, user_in: UserUpdate) -> Any: - user_data = user_in.model_dump(exclude_unset=True) - extra_data = {} - if "password" in user_data: - password = user_data["password"] - hashed_password = get_password_hash(password) - extra_data["hashed_password"] = hashed_password - db_user.sqlmodel_update(user_data, update=extra_data) - session.add(db_user) - session.commit() - session.refresh(db_user) - return db_user - - -def get_user_by_email(*, session: Session, email: str) -> User | None: - statement = select(User).where(User.email == email) - session_user = session.exec(statement).first() - return session_user - - -def authenticate(*, session: Session, email: str, password: str) -> User | None: - db_user = get_user_by_email(session=session, email=email) - if not db_user: - return None - if not verify_password(password, db_user.hashed_password): - return None - return db_user - - -def create_item(*, session: Session, item_in: ItemCreate, owner_id: uuid.UUID) -> Item: - db_item = Item.model_validate(item_in, update={"owner_id": owner_id}) - session.add(db_item) - session.commit() - session.refresh(db_item) - return db_item diff --git 
a/backend/app/enums/__init__.py b/backend/app/enums/__init__.py new file mode 100644 index 0000000000..8b13789179 --- /dev/null +++ b/backend/app/enums/__init__.py @@ -0,0 +1 @@ + diff --git a/backend/app/enums/otp_enum.py b/backend/app/enums/otp_enum.py new file mode 100644 index 0000000000..c607f0fabf --- /dev/null +++ b/backend/app/enums/otp_enum.py @@ -0,0 +1,7 @@ +from enum import Enum + +class OTPType(str, Enum): + password_reset = "password_reset" + email_verification = "email_verification" + signup_confirmation = "signup_confirmation" + login_confirmation = "login_confirmation" diff --git a/backend/app/enums/user_enum.py b/backend/app/enums/user_enum.py new file mode 100644 index 0000000000..200260530f --- /dev/null +++ b/backend/app/enums/user_enum.py @@ -0,0 +1,6 @@ +from enum import Enum + + +class UserRole(str, Enum): + user = "user" + admin = "admin" diff --git a/backend/app/initial_data.py b/backend/app/initial_data.py deleted file mode 100644 index d806c3d381..0000000000 --- a/backend/app/initial_data.py +++ /dev/null @@ -1,23 +0,0 @@ -import logging - -from sqlmodel import Session - -from app.core.db import engine, init_db - -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger(__name__) - - -def init() -> None: - with Session(engine) as session: - init_db(session) - - -def main() -> None: - logger.info("Creating initial data") - init() - logger.info("Initial data created") - - -if __name__ == "__main__": - main() diff --git a/backend/app/main.py b/backend/app/main.py index 9a95801e74..005550a362 100644 --- a/backend/app/main.py +++ b/backend/app/main.py @@ -1,11 +1,34 @@ +import sys +sys.dont_write_bytecode = True + +import logging import sentry_sdk from fastapi import FastAPI from fastapi.routing import APIRoute from starlette.middleware.cors import CORSMiddleware +import asyncio from app.api.main import api_router from app.core.config import settings +# middlewares +from app.middlewares.logger import RequestLoggerMiddleware +from 
app.middlewares.rate_limiter import RateLimiterMiddleware +from app.middlewares.error_handler import ( + app_exception_handler, + validation_exception_handler, + http_exception_handler, + unhandled_exception_handler, +) +from app.core.exceptions import AppException +from fastapi.exceptions import RequestValidationError +from starlette.exceptions import HTTPException as StarletteHTTPException + +# redis client and threading utils +from app.core.redis import create_redis_client, close_redis_pool +from app.utils_helper.threading import ThreadingUtils +from app.api.websocket_manager import WebSocketManager + def custom_generate_unique_id(route: APIRoute) -> str: return f"{route.tags[0]}-{route.name}" @@ -31,3 +54,55 @@ def custom_generate_unique_id(route: APIRoute) -> str: ) app.include_router(api_router, prefix=settings.API_V1_STR) + +# Register additional middlewares +app.add_middleware(RequestLoggerMiddleware) +app.add_middleware(RateLimiterMiddleware, requests_per_minute=100) + +# Register global exception handlers +app.add_exception_handler(AppException, app_exception_handler) +app.add_exception_handler(RequestValidationError, validation_exception_handler) +app.add_exception_handler(StarletteHTTPException, http_exception_handler) +app.add_exception_handler(Exception, unhandled_exception_handler) + + +@app.on_event("startup") +async def startup_event(): + # Configure basic logging + logging.basicConfig(level=logging.INFO) + + # Initialize redis client and attach to app.state + try: + app.state.redis = await create_redis_client() + # Initialize WebSocket manager and start Redis listener + try: + app.state.ws_manager = WebSocketManager(app.state.redis) + # start the manager which spawns a background redis subscription + await app.state.ws_manager.start() + except Exception as e: + logging.getLogger(__name__).warning(f"WS manager init failed: {e}") + except Exception as e: + logging.getLogger(__name__).warning(f"Redis init failed: {e}") + + # Attach threading 
utilities to app state for global access + app.state.threading = ThreadingUtils + + +@app.on_event("shutdown") +async def shutdown_event(): + try: + # Close long-lived redis client and disconnect pool + if getattr(app.state, "redis", None): + try: + await app.state.redis.close() + except Exception: + logging.getLogger(__name__).warning("Redis client close failed") + await close_redis_pool() + except Exception as e: + logging.getLogger(__name__).warning(f"Redis close failed: {e}") + # stop websocket manager if present + try: + if getattr(app.state, "ws_manager", None): + await app.state.ws_manager.stop() + except Exception as e: + logging.getLogger(__name__).warning(f"WS manager stop failed: {e}") diff --git a/backend/app/middlewares/cors.py b/backend/app/middlewares/cors.py new file mode 100644 index 0000000000..44dea926e0 --- /dev/null +++ b/backend/app/middlewares/cors.py @@ -0,0 +1,13 @@ +from fastapi.middleware.cors import CORSMiddleware +from app.config.settings import settings + + +def setup_cors(app): + app.add_middleware( + CORSMiddleware, + allow_origins=settings.CORS_ORIGINS, + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], + expose_headers=["X-Request-ID", "X-Process-Time"] + ) diff --git a/backend/app/middlewares/error_handler.py b/backend/app/middlewares/error_handler.py new file mode 100644 index 0000000000..8420d64b51 --- /dev/null +++ b/backend/app/middlewares/error_handler.py @@ -0,0 +1,71 @@ +import logging +from fastapi import Request, status +from fastapi.responses import JSONResponse +from fastapi.exceptions import RequestValidationError +from starlette.exceptions import HTTPException as StarletteHTTPException +from app.core.exceptions import AppException +from app.schemas.response import ResponseSchema +from app.core.config import settings + +logger = logging.getLogger(__name__) + + +async def app_exception_handler(request: Request, exc: AppException): + logger.error(f"AppException: {exc.message} - Details: 
{exc.details}") + return JSONResponse( + status_code=exc.status_code, + content=ResponseSchema( + success=False, + message=exc.message, + errors=exc.details, + data=None + ).model_dump(exclude_none=True) + ) + + +async def validation_exception_handler(request: Request, exc: RequestValidationError): + errors = [ + { + "field": ".".join(str(loc) for loc in err["loc"]), + "message": err["msg"], + "type": err["type"] + } + for err in exc.errors() + ] + + logger.warning(f"Validation error: {errors}") + return JSONResponse( + status_code=status.HTTP_422_UNPROCESSABLE_ENTITY, + content=ResponseSchema( + success=False, + message="Validation error", + errors=errors, + data=None + ).model_dump(exclude_none=True) + ) + + +async def http_exception_handler(request: Request, exc: StarletteHTTPException): + logger.error(f"HTTPException: {exc.status_code} - {exc.detail}") + return JSONResponse( + status_code=exc.status_code, + content=ResponseSchema( + success=False, + message=exc.detail, + errors=None, + data=None + ).model_dump(exclude_none=True) + ) + + +async def unhandled_exception_handler(request: Request, exc: Exception): + logger.exception(f"Unhandled exception: {str(exc)}") + return JSONResponse( + status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, + content=ResponseSchema( + success=False, + message="Internal server error", + errors=str(exc) if settings.DEBUG else None, + data=None + ).model_dump(exclude_none=True) + ) diff --git a/backend/app/middlewares/logger.py b/backend/app/middlewares/logger.py new file mode 100644 index 0000000000..0dd2a990d9 --- /dev/null +++ b/backend/app/middlewares/logger.py @@ -0,0 +1,33 @@ +import time +import logging +from fastapi import Request +from starlette.middleware.base import BaseHTTPMiddleware +from typing import Callable + +logger = logging.getLogger(__name__) + + +class RequestLoggerMiddleware(BaseHTTPMiddleware): + async def dispatch(self, request: Request, call_next: Callable): + request_id = request.headers.get("X-Request-ID", 
"N/A") + start_time = time.time() + + logger.info( + f"Request started: {request.method} {request.url.path} " + f"[Request ID: {request_id}]" + ) + + response = await call_next(request) + + process_time = time.time() - start_time + response.headers["X-Process-Time"] = str(process_time) + response.headers["X-Request-ID"] = request_id + + logger.info( + f"Request completed: {request.method} {request.url.path} " + f"Status: {response.status_code} " + f"Duration: {process_time:.3f}s " + f"[Request ID: {request_id}]" + ) + + return response diff --git a/backend/app/middlewares/rate_limiter.py b/backend/app/middlewares/rate_limiter.py new file mode 100644 index 0000000000..81add8748e --- /dev/null +++ b/backend/app/middlewares/rate_limiter.py @@ -0,0 +1,69 @@ +import time +import uuid +from typing import Callable, Optional +from fastapi import Request, HTTPException, status +from starlette.middleware.base import BaseHTTPMiddleware +from redis.asyncio import Redis + + +class RateLimiterMiddleware(BaseHTTPMiddleware): + + def __init__(self, app, requests_per_minute: int = 100, window_seconds: int = 60): + super().__init__(app) + self.requests_per_minute = requests_per_minute + self.window = window_seconds + + async def _get_redis(self) -> Optional[Redis]: + current = getattr(self, "app", None) + seen = set() + while current is not None and id(current) not in seen: + seen.add(id(current)) + state = getattr(current, "state", None) + if state is not None: + return getattr(state, "redis", None) + current = getattr(current, "app", None) + return None + + async def dispatch(self, request: Request, call_next: Callable): + client_ip = request.client.host if request.client else "unknown" + now = time.time() + window_start = now - self.window + + redis = await self._get_redis() + + if not redis: + response = await call_next(request) + response.headers["X-RateLimit-Limit"] = str(self.requests_per_minute) + response.headers["X-RateLimit-Remaining"] = "-1" + return response + + key = 
f"rate_limit:{client_ip}"
+
+        member = f"{now}-{uuid.uuid4().hex}"
+
+        try:
+            pipe = redis.pipeline()
+            pipe.zremrangebyscore(key, 0, window_start)
+            pipe.zadd(key, {member: now})
+            pipe.zcard(key)
+            pipe.expire(key, self.window)
+            results = await pipe.execute()
+            current_count = int(results[2]) if len(results) >= 3 and results[2] is not None else 0
+        except Exception:
+            # Fail open if Redis is unavailable
+            response = await call_next(request)
+            response.headers["X-RateLimit-Limit"] = str(self.requests_per_minute)
+            response.headers["X-RateLimit-Remaining"] = "-1"
+            return response
+
+        remaining = max(0, self.requests_per_minute - current_count)
+
+        if current_count > self.requests_per_minute:
+            # Exceptions raised inside BaseHTTPMiddleware bypass FastAPI's
+            # exception handlers (they surface as 500s), so return the 429
+            # response directly instead of raising HTTPException.
+            from starlette.responses import JSONResponse
+
+            return JSONResponse(
+                status_code=status.HTTP_429_TOO_MANY_REQUESTS,
+                content={"detail": "Rate limit exceeded. Please try again later."},
+                headers={
+                    "X-RateLimit-Limit": str(self.requests_per_minute),
+                    "X-RateLimit-Remaining": "0",
+                },
+            )
+
+        response = await call_next(request)
+        response.headers["X-RateLimit-Limit"] = str(self.requests_per_minute)
+        response.headers["X-RateLimit-Remaining"] = str(remaining)
+        return response
diff --git a/backend/app/middlewares/response.py b/backend/app/middlewares/response.py
new file mode 100644
index 0000000000..5065474a74
--- /dev/null
+++ b/backend/app/middlewares/response.py
@@ -0,0 +1,96 @@
+from fastapi import Request
+from starlette.middleware.base import BaseHTTPMiddleware
+from starlette.responses import JSONResponse
+from typing import Callable, Any
+import json
+
+from app.schemas.response import ResponseSchema
+
+
+class ResponseFormatterMiddleware(BaseHTTPMiddleware):
+    async def dispatch(self, request: Request, call_next: Callable):
+        response = await call_next(request)
+
+        content_type = response.headers.get("content-type", "")
+        if not content_type.startswith("application/json"):
+            return response
+
+        # Read response body (consume iterator when present)
+        body_bytes = b""
+        body_iterator = getattr(response, "body_iterator", None)
+        if body_iterator is not None:
+            try:
+                async for chunk in body_iterator:
+                    body_bytes += chunk
+            except Exception:
+                # fallback to body
attribute if iteration fails + body_bytes = getattr(response, "body", b"") or b"" + else: + body_bytes = getattr(response, "body", b"") or b"" + + try: + text = body_bytes.decode("utf-8") if body_bytes else "" + except Exception: + return response + + if not text: + original = None + else: + try: + original = json.loads(text) + except Exception: + # Not JSON-parsable — leave response unchanged + return response + + status_code = getattr(response, "status_code", 200) + + # If already formatted (has `success` key), return as-is + if isinstance(original, dict) and "success" in original: + new_content = original + else: + # For error HTTP statuses, mark success=False + if status_code >= 400: + message = None + errors: Any = None + if isinstance(original, dict): + message = original.get("detail") or original.get("message") or str(original) + errors = original.get("errors") or original.get("detail") + else: + message = str(original) if original is not None else "Error" + + # Use model_dump(exclude_none=True) so `data` is removed when None + new_content = ResponseSchema( + success=False, + message=message or "Error", + errors=errors, + data=None, + ).model_dump(exclude_none=True) + else: + # Build success response. 
Normalize common shapes: + # - If original has top-level "message", use it as top-level message + # - If original has "user", place that object into `data` + # - If original has `data` and that inner dict contains a "message", remove that inner "message" + message = "Success" + data_payload = original + + if isinstance(original, dict): + message = original.get("message") or message + + if "user" in original: + data_payload = original.get("user") + elif "data" in original: + data_payload = original.get("data") + if isinstance(data_payload, dict) and "message" in data_payload: + # remove nested message inside data + data_payload = {k: v for k, v in data_payload.items() if k != "message"} + + new_content = ResponseSchema( + success=True, + message=message, + data=data_payload, + errors=None, + ).model_dump(exclude_none=True) + + # Preserve headers (except content-length which will be recalculated) + headers = {k: v for k, v in response.headers.items() if k.lower() != "content-length"} + return JSONResponse(status_code=status_code, content=new_content, headers=headers) diff --git a/backend/app/models.py b/backend/app/models.py deleted file mode 100644 index 2d060ba0b4..0000000000 --- a/backend/app/models.py +++ /dev/null @@ -1,113 +0,0 @@ -import uuid - -from pydantic import EmailStr -from sqlmodel import Field, Relationship, SQLModel - - -# Shared properties -class UserBase(SQLModel): - email: EmailStr = Field(unique=True, index=True, max_length=255) - is_active: bool = True - is_superuser: bool = False - full_name: str | None = Field(default=None, max_length=255) - - -# Properties to receive via API on creation -class UserCreate(UserBase): - password: str = Field(min_length=8, max_length=128) - - -class UserRegister(SQLModel): - email: EmailStr = Field(max_length=255) - password: str = Field(min_length=8, max_length=128) - full_name: str | None = Field(default=None, max_length=255) - - -# Properties to receive via API on update, all are optional -class 
UserUpdate(UserBase): - email: EmailStr | None = Field(default=None, max_length=255) # type: ignore - password: str | None = Field(default=None, min_length=8, max_length=128) - - -class UserUpdateMe(SQLModel): - full_name: str | None = Field(default=None, max_length=255) - email: EmailStr | None = Field(default=None, max_length=255) - - -class UpdatePassword(SQLModel): - current_password: str = Field(min_length=8, max_length=128) - new_password: str = Field(min_length=8, max_length=128) - - -# Database model, database table inferred from class name -class User(UserBase, table=True): - id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True) - hashed_password: str - items: list["Item"] = Relationship(back_populates="owner", cascade_delete=True) - - -# Properties to return via API, id is always required -class UserPublic(UserBase): - id: uuid.UUID - - -class UsersPublic(SQLModel): - data: list[UserPublic] - count: int - - -# Shared properties -class ItemBase(SQLModel): - title: str = Field(min_length=1, max_length=255) - description: str | None = Field(default=None, max_length=255) - - -# Properties to receive on item creation -class ItemCreate(ItemBase): - pass - - -# Properties to receive on item update -class ItemUpdate(ItemBase): - title: str | None = Field(default=None, min_length=1, max_length=255) # type: ignore - - -# Database model, database table inferred from class name -class Item(ItemBase, table=True): - id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True) - owner_id: uuid.UUID = Field( - foreign_key="user.id", nullable=False, ondelete="CASCADE" - ) - owner: User | None = Relationship(back_populates="items") - - -# Properties to return via API, id is always required -class ItemPublic(ItemBase): - id: uuid.UUID - owner_id: uuid.UUID - - -class ItemsPublic(SQLModel): - data: list[ItemPublic] - count: int - - -# Generic message -class Message(SQLModel): - message: str - - -# JSON payload containing access token -class 
Token(SQLModel): - access_token: str - token_type: str = "bearer" - - -# Contents of JWT token -class TokenPayload(SQLModel): - sub: str | None = None - - -class NewPassword(SQLModel): - token: str - new_password: str = Field(min_length=8, max_length=128) diff --git a/backend/app/models/__init__.py b/backend/app/models/__init__.py new file mode 100644 index 0000000000..7c5cb610af --- /dev/null +++ b/backend/app/models/__init__.py @@ -0,0 +1,14 @@ +"""Package initializer for app models. + +Import each model module here so that when Alembic imports +`app.models` during `env.py` execution, all SQLModel models are +registered on `SQLModel.metadata` and are available for +`--autogenerate` migrations. + +Keep imports explicit to avoid accidental heavy imports at runtime. +""" + +from . import user # noqa: F401 +from . import otp # noqa: F401 + +__all__ = ["user", "otp"] diff --git a/backend/app/models/otp.py b/backend/app/models/otp.py new file mode 100644 index 0000000000..691f0c6f28 --- /dev/null +++ b/backend/app/models/otp.py @@ -0,0 +1,18 @@ +import uuid +from datetime import datetime +from typing import Optional +from sqlmodel import Field, Relationship, SQLModel +from app.enums.otp_enum import OTPType + +class OTP(SQLModel, table=True): + id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True, index=True) + user_id: uuid.UUID = Field(foreign_key="user.id", nullable=False, index=True) + code: int = Field(nullable=False) + type: OTPType = Field(nullable=False) + + created_at: datetime = Field(default_factory=datetime.utcnow) + expires_at: datetime + updated_at: datetime = Field(default_factory=datetime.utcnow) + + # Many-to-one relationship + user: Optional["User"] = Relationship(back_populates="otp") diff --git a/backend/app/models/user.py b/backend/app/models/user.py new file mode 100644 index 0000000000..f94e34f8ee --- /dev/null +++ b/backend/app/models/user.py @@ -0,0 +1,40 @@ +import uuid +from typing import List, Optional +from datetime import 
datetime +from pydantic import EmailStr +from sqlmodel import Field, Relationship, SQLModel +from app.enums.user_enum import UserRole + +class User(SQLModel, table=True): + id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True, index=True) + first_name: Optional[str] = None + last_name: Optional[str] = None + email: EmailStr = Field(index=True, unique=True, nullable=False) + hashed_password: str = Field(nullable=False) + phone_number: Optional[str] = None + role: UserRole = Field(default=UserRole.user) + + created_at: datetime = Field(default_factory=datetime.utcnow) + updated_at: datetime = Field(default_factory=datetime.utcnow) + + # One-to-many relationship + otp: List["OTP"] = Relationship(back_populates="user") + + +# Pydantic/SQLModel helper schemas used by the tests and API +class UserBase(SQLModel): + email: EmailStr + first_name: Optional[str] = None + last_name: Optional[str] = None + phone_number: Optional[str] = None + + +class UserCreate(UserBase): + password: str + + +class UserUpdate(SQLModel): + first_name: Optional[str] = None + last_name: Optional[str] = None + password: Optional[str] = None + phone_number: Optional[str] = None diff --git a/backend/app/schemas/__init__.py b/backend/app/schemas/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/backend/app/schemas/base.py b/backend/app/schemas/base.py new file mode 100644 index 0000000000..a479398fbc --- /dev/null +++ b/backend/app/schemas/base.py @@ -0,0 +1,12 @@ +from pydantic import BaseModel, ConfigDict +from datetime import datetime +from typing import Optional + + +class BaseSchema(BaseModel): + model_config = ConfigDict(from_attributes=True) + + +class TimestampMixin(BaseModel): + created_at: Optional[datetime] = None + updated_at: Optional[datetime] = None diff --git a/backend/app/schemas/response.py b/backend/app/schemas/response.py new file mode 100644 index 0000000000..c584594500 --- /dev/null +++ b/backend/app/schemas/response.py @@ -0,0 +1,34 @@ 
+from typing import Optional, Any, Generic, TypeVar
+from pydantic import BaseModel, ConfigDict, Field
+
+T = TypeVar('T')
+
+
+class ResponseSchema(BaseModel, Generic[T]):
+    success: bool = Field(default=True, description="Operation success status")
+    message: str = Field(default="Success", description="Response message")
+    data: Optional[T] = Field(default=None, description="Response data")
+    errors: Optional[Any] = Field(default=None, description="Error details")
+    meta: Optional[dict] = Field(default=None, description="Additional metadata")
+
+    # Pydantic v2 style: `model_config` replaces the deprecated inner `class Config`.
+    model_config = ConfigDict(
+        json_schema_extra={
+            "example": {
+                "success": True,
+                "message": "Operation completed successfully",
+                "data": {"id": 1, "name": "Example"},
+                "errors": None,
+                "meta": {"timestamp": "2024-01-01T00:00:00"}
+            }
+        }
+    )
+
+
+class PaginationMeta(BaseModel):
+    page: int
+    page_size: int
+    total_items: int
+    total_pages: int
+
+
+class PaginatedResponseSchema(ResponseSchema[T], Generic[T]):
+    meta: Optional[PaginationMeta] = None
diff --git a/backend/app/schemas/user.py b/backend/app/schemas/user.py
new file mode 100644
index 0000000000..9569abfd8c
--- /dev/null
+++ b/backend/app/schemas/user.py
@@ -0,0 +1,30 @@
+from typing import Optional
+from pydantic import BaseModel, EmailStr, field_validator
+from app.utils_helper.regex import RegexClass
+from app.utils_helper.messages import MSG
+
+
+class LoginSchema(BaseModel):
+    email: EmailStr
+    password: str
+
+    # Pydantic v2: `field_validator` replaces the deprecated v1 `validator`.
+    @field_validator('password')
+    @classmethod
+    def password_strength(cls, v: str) -> str:
+        if not RegexClass.is_strong_password(v):
+            raise ValueError(MSG.VALIDATION["PASSWORD_TOO_WEAK"])
+        return v
+
+
+class RegisterSchema(BaseModel):
+    first_name: str
+    last_name: str
+    email: EmailStr
+    password: str
+    phone_number: Optional[str] = None
+
+    @field_validator('password')
+    @classmethod
+    def password_strength(cls, v: str) -> str:
+        if not RegexClass.is_strong_password(v):
+            raise ValueError(MSG.VALIDATION["PASSWORD_TOO_WEAK"])
+        return v
diff --git a/backend/app/services/auth_service.py
b/backend/app/services/auth_service.py
new file mode 100644
index 0000000000..be39cd4d59
--- /dev/null
+++ b/backend/app/services/auth_service.py
@@ -0,0 +1,143 @@
+
+import jwt
+from datetime import timedelta
+from typing import Any
+
+from sqlmodel import Session, select
+
+from app.core.config import settings
+from app.core.db import engine
+from app.core import security
+from app.models.user import User
+from app.utils_helper.messages import MSG
+
+
+# Module-level helper for legacy tests that expect `app.services.auth_service.create_user`.
+# Defined before the class so it does not interrupt the class body: a module-level
+# `def` placed between methods would end the class definition early and leave the
+# remaining methods outside AuthService.
+def create_user(session, user_create) -> User:
+    """Create a user using a DB session and a `UserCreate` like object.
+
+    This helper mirrors the behavior expected by older tests that import
+    `app.services.auth_service as crud` and call `crud.create_user(...)`.
+    """
+    if not getattr(user_create, "email", None) or not getattr(user_create, "password", None):
+        raise ValueError(MSG.AUTH["ERROR"]["EMAIL_AND_PASSWORD_REQUIRED"])
+
+    hashed = security.get_password_hash(user_create.password)
+    user = User(
+        email=user_create.email,
+        hashed_password=hashed,
+        first_name=getattr(user_create, "first_name", None),
+        last_name=getattr(user_create, "last_name", None),
+        phone_number=getattr(user_create, "phone_number", None),
+    )
+    session.add(user)
+    session.commit()
+    session.refresh(user)
+    return user
+
+
+class AuthService:
+    async def get_user_by_email(self, email: str) -> User | None:
+        with Session(engine) as session:
+            statement = select(User).where(User.email == email)
+            result = session.exec(statement).first()
+            return result
+
+    async def create_user(
+        self,
+        email: str,
+        password: str,
+        first_name: str | None = None,
+        last_name: str | None = None,
+        phone_number: str | None = None,
+    ) -> User:
+        if not email or not password:
+            raise ValueError(MSG.AUTH["ERROR"]["EMAIL_AND_PASSWORD_REQUIRED"])
+
+        hashed = security.get_password_hash(password)
+        user = User(email=email, hashed_password=hashed, first_name=first_name, last_name=last_name, phone_number=phone_number)
+
+        with Session(engine) as session:
+            session.add(user)
+            session.commit()
+            session.refresh(user)
+            return user
+
+    async def authenticate_user(self, email: str, password: str) -> User | None:
+        user = await self.get_user_by_email(email)
+        if not user:
+            return None
+        if not security.verify_password(password, user.hashed_password):
+            return None
+        return user
+
+    async def login(self, email: str, password: str) -> dict[str, Any]:
+        user = await self.authenticate_user(email, password)
+        if not user:
+            raise ValueError(MSG.AUTH["ERROR"]["INVALID_CREDENTIALS"])
+
+        expires = timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
+        access_token = security.create_access_token(subject=str(user.id), expires_delta=expires)
+
+        user_data = {
+            "id": str(user.id),
+            "email": str(user.email),
+            "first_name": user.first_name,
+            "last_name": user.last_name,
+            "role": str(user.role) if hasattr(user, "role") else None,
+        }
+
+        return {"access_token": access_token, "token_type": "bearer", "user": user_data}
+
+    async def register(self, email: str, password: str, first_name: str | None = None, last_name: str | None = None, phone_number: str | None = None) -> dict[str, Any]:
+        existing = await self.get_user_by_email(email)
+        if existing:
+            raise ValueError(MSG.AUTH["ERROR"]["USER_EXISTS"])
+
+        user = await self.create_user(email=email, password=password, first_name=first_name, last_name=last_name, phone_number=phone_number)
+        return {"message": MSG.AUTH["SUCCESS"]["USER_REGISTERED"], "user": {"id": str(user.id), "email": str(user.email)}}
+
+    async def verify(self, token: str | None = None) -> dict[str, Any]:
+        if not token:
+            raise ValueError(MSG.AUTH["ERROR"]["TOKEN_REQUIRED"])
+        return {"message": "Email verified"}
+
+    async def forgot_password(self, email: str) -> dict[str, Any]:
+        if not email:
+            raise
ValueError(MSG.AUTH["ERROR"]["EMAIL_REQUIRED"]) + + user = await self.get_user_by_email(email) + if not user: + return {"message": MSG.AUTH["SUCCESS"]["PASSWORD_RESET_EMAIL_SENT"]} + + expires = timedelta(hours=settings.EMAIL_RESET_TOKEN_EXPIRE_HOURS) + reset_token = security.create_access_token(subject=str(user.id), expires_delta=expires) + + return {"message": MSG.AUTH["SUCCESS"]["PASSWORD_RESET_EMAIL_SENT"], "reset_token": reset_token} + + async def reset_password(self, token: str, new_password: str) -> dict[str, Any]: + if not token or not new_password: + raise ValueError("Token and new password are required") + try: + + payload = jwt.decode(token, settings.SECRET_KEY, algorithms=[security.ALGORITHM]) + subject = payload.get("sub") + if not subject: + raise ValueError(MSG.AUTH["ERROR"]["INVALID_TOKEN"]) + + with Session(engine) as session: + statement = select(User).where(User.id == subject) + user = session.exec(statement).first() + if not user: + raise ValueError(MSG.AUTH["ERROR"]["INVALID_TOKEN_SUBJECT"]) + + user.hashed_password = security.get_password_hash(new_password) + session.add(user) + session.commit() + session.refresh(user) + + return {"message": MSG.AUTH["SUCCESS"]["PASSWORD_HAS_BEEN_RESET"]} + except jwt.ExpiredSignatureError: + raise ValueError(MSG.AUTH["ERROR"]["TOKEN_EXPIRED"]) + except jwt.PyJWTError: + raise ValueError(MSG.AUTH["ERROR"]["INVALID_TOKEN"]) + + async def resend_email(self, email: str) -> dict[str, Any]: + if not email: + raise ValueError(MSG.AUTH["ERROR"]["EMAIL_REQUIRED"]) + + user = await self.get_user_by_email(email) + if not user: + return {"message": MSG.AUTH["SUCCESS"]["VERIFICATION_EMAIL_RESENT"]} + + return {"message": MSG.AUTH["SUCCESS"]["VERIFICATION_EMAIL_RESENT"]} + + async def logout(self, user_id: str | None = None) -> dict[str, Any]: + return {"message": MSG.AUTH["SUCCESS"]["LOGGED_OUT"]} diff --git a/backend/app/services/webengage_email.py b/backend/app/services/webengage_email.py new file mode 100644 index 
0000000000..82d187f2cb
--- /dev/null
+++ b/backend/app/services/webengage_email.py
@@ -0,0 +1,42 @@
+from typing import Any, Dict
+import httpx
+from app.core.config import settings
+
+
+async def send_email(
+    to_email: str,
+    subject: str,
+    template_id: str | None = None,
+    variables: Dict[str, Any] | None = None,
+    from_email: str | None = None,
+    from_name: str | None = None,
+) -> Dict[str, Any]:
+    if not settings.webengage_enabled:
+        raise RuntimeError("WebEngage is not configured (WEBENGAGE_API_URL/KEY missing)")
+
+    url = str(settings.WEBENGAGE_API_URL)
+    headers = {
+        "Authorization": f"Bearer {settings.WEBENGAGE_API_KEY}",
+        "Content-Type": "application/json",
+    }
+
+    body: Dict[str, Any] = {
+        "to": {"email": to_email},
+        "subject": subject,
+        "personalization": variables or {},
+    }
+
+    if template_id:
+        body["template_id"] = template_id
+
+    if from_email or settings.EMAILS_FROM_EMAIL:
+        body["from"] = {
+            "email": from_email or str(settings.EMAILS_FROM_EMAIL),
+            "name": from_name or settings.EMAILS_FROM_NAME,
+        }
+
+    timeout = httpx.Timeout(10.0, connect=5.0)
+    async with httpx.AsyncClient(timeout=timeout) as client:
+        resp = await client.post(url, json=body, headers=headers)
+        resp.raise_for_status()
+        return resp.json()
diff --git a/backend/app/tasks/tasks.py b/backend/app/tasks/tasks.py
new file mode 100644
index 0000000000..d79faf3027
--- /dev/null
+++ b/backend/app/tasks/tasks.py
@@ -0,0 +1,5 @@
+from __future__ import annotations
+
+from app.workers import add, send_welcome_email
+
+__all__ = ["add", "send_welcome_email"]
diff --git a/backend/app/utils_helper/helpers.py b/backend/app/utils_helper/helpers.py
new file mode 100644
index 0000000000..29b6b5ec17
--- /dev/null
+++ b/backend/app/utils_helper/helpers.py
@@ -0,0 +1,27 @@
+import hashlib
+import uuid
+from datetime import datetime, timedelta, timezone
+
+
+def generate_uuid() -> str:
+    return str(uuid.uuid4())
+
+
+def generate_hash(data: str) -> str:
+    return hashlib.sha256(data.encode()).hexdigest()
+
+
+def get_current_timestamp() -> datetime:
+    return datetime.now(timezone.utc)
+
+
+def add_time(hours: int = 0, minutes: int = 0, days: int = 0) -> datetime:
+    return datetime.now(timezone.utc) + timedelta(hours=hours, minutes=minutes, days=days)
+
+
+def format_datetime(dt: datetime, fmt: str = "%Y-%m-%d %H:%M:%S") -> str:
+    return dt.strftime(fmt)
+
+
+def parse_datetime(dt_str: str, fmt: str = "%Y-%m-%d %H:%M:%S") -> datetime:
+    return datetime.strptime(dt_str, fmt)
diff --git a/backend/app/utils_helper/messages.py b/backend/app/utils_helper/messages.py
new file mode 100644
index 0000000000..444109451c
--- /dev/null
+++ b/backend/app/utils_helper/messages.py
@@ -0,0 +1,29 @@
+class Messages:
+    AUTH = {
+        "SUCCESS": {
+            "USER_REGISTERED": "User registered",
+            "USER_LOGGED_IN": "Logged in",
+            "EMAIL_VERIFIED": "Email verified",
+            "PASSWORD_RESET_EMAIL_SENT": "Password reset email sent",
+            "PASSWORD_HAS_BEEN_RESET": "Password has been reset",
+            "VERIFICATION_EMAIL_RESENT": "Verification email resent",
+            "LOGGED_OUT": "Logged out",
+        },
+        "ERROR": {
+            "EMAIL_AND_PASSWORD_REQUIRED": "Email and password are required",
+            "INVALID_CREDENTIALS": "Invalid credentials",
+            "USER_EXISTS": "A user with that email already exists",
+            "TOKEN_REQUIRED": "Token is required",
+            "EMAIL_REQUIRED": "Email is required",
+            "INVALID_TOKEN": "Invalid token",
+            "INVALID_TOKEN_SUBJECT": "Invalid token subject",
+            "TOKEN_EXPIRED": "Token expired",
+            "TOKEN_AND_PASSWORD_REQUIRED": "Token and new password are required",
+        },
+    }
+    VALIDATION = {
+        "PASSWORD_TOO_WEAK": "Password must be at least 8 characters long and include uppercase, lowercase, number, and special character",
+    }
+
+
+MSG = Messages()
diff --git a/backend/app/utils_helper/regex.py b/backend/app/utils_helper/regex.py
new file mode 100644
index 0000000000..c01efeb3cb
--- /dev/null
+++ b/backend/app/utils_helper/regex.py
@@ -0,0 +1,10 @@
+import re
+
+
+class RegexClass:
+    # Compile once at class definition time instead of on every call
+    _PASSWORD_PATTERN = re.compile(r'^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[^\w\s]).{8,}$')
+
+    @staticmethod
+    def is_strong_password(password: str) -> bool:
+        return bool(RegexClass._PASSWORD_PATTERN.match(password))
diff --git a/backend/app/utils_helper/threading.py b/backend/app/utils_helper/threading.py
new file mode 100644
index 0000000000..3b1ddb54e1
--- /dev/null
+++ b/backend/app/utils_helper/threading.py
@@ -0,0 +1,29 @@
+import asyncio
+import os
+from concurrent.futures import ThreadPoolExecutor
+from functools import wraps
+from typing import Any, Callable
+
+
+class ThreadingUtils:
+    # Dynamic sizing: base workers on CPU count, capped to reasonable limits
+    _cpu = os.cpu_count() or 1
+    _max_workers = min(32, max(2, _cpu * 5))
+    executor = ThreadPoolExecutor(max_workers=_max_workers)
+
+    @staticmethod
+    async def run_in_thread(func: Callable, *args, **kwargs) -> Any:
+        # get_running_loop() is safe here: this is always called from a coroutine
+        loop = asyncio.get_running_loop()
+        return await loop.run_in_executor(
+            ThreadingUtils.executor,
+            lambda: func(*args, **kwargs)
+        )
+
+    @staticmethod
+    def async_to_sync(func: Callable) -> Callable:
+        @wraps(func)
+        def wrapper(*args, **kwargs):
+            # asyncio.run creates and closes its own event loop
+            return asyncio.run(func(*args, **kwargs))
+        return wrapper
diff --git a/backend/app/workers/__init__.py b/backend/app/workers/__init__.py
new file mode 100644
index 0000000000..2cef3e8b56
--- /dev/null
+++ b/backend/app/workers/__init__.py
@@ -0,0 +1,6 @@
+from __future__ import annotations
+
+from .tasks import add, send_welcome_email
+from .celery_worker import main as worker_main
+
+__all__ = ["add", "send_welcome_email", "worker_main"]
diff --git a/backend/app/workers/celery_worker.py b/backend/app/workers/celery_worker.py
new file mode 100644
index 0000000000..a1e28abe7c
--- /dev/null
+++ b/backend/app/workers/celery_worker.py
@@ -0,0 +1,15 @@
+from __future__ import annotations
+
+from app.core.celery_app import celery_app
+
+
+def main() -> None:
+    argv = [
+        "worker",
+        "--loglevel=info",
+    ]
+    celery_app.worker_main(argv)
+
+
+if __name__ == "__main__": + main() diff --git a/backend/pyproject.toml b/backend/pyproject.toml index d72454c28a..9735c2d11e 100644 --- a/backend/pyproject.toml +++ b/backend/pyproject.toml @@ -21,6 +21,10 @@ dependencies = [ "pydantic-settings<3.0.0,>=2.2.1", "sentry-sdk[fastapi]<2.0.0,>=1.40.6", "pyjwt<3.0.0,>=2.8.0", + "redis<5.0.0,>=4.6.0", + "celery[redis]<6,>=5.3.0", + "boto3>=1.26", + "aioboto3>=10.5", ] [tool.uv] diff --git a/backend/scripts/prestart.sh b/backend/scripts/prestart.sh index 1b395d513f..56f0c7a868 100644 --- a/backend/scripts/prestart.sh +++ b/backend/scripts/prestart.sh @@ -10,4 +10,9 @@ python app/backend_pre_start.py alembic upgrade head # Create initial data in DB -python app/initial_data.py +# Only run initial data script if it exists and SKIP_INITIAL_DATA is not set to "true". +if [ -f "app/initial_data.py" ] && [ "${SKIP_INITIAL_DATA:-false}" != "true" ]; then + python app/initial_data.py +else + echo "Skipping initial data (file missing or SKIP_INITIAL_DATA=true)" +fi diff --git a/backend/tests/api/routes/auth.py b/backend/tests/api/routes/auth.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/backend/tests/api/routes/test_items.py b/backend/tests/api/routes/test_items.py deleted file mode 100644 index db950b4535..0000000000 --- a/backend/tests/api/routes/test_items.py +++ /dev/null @@ -1,164 +0,0 @@ -import uuid - -from fastapi.testclient import TestClient -from sqlmodel import Session - -from app.core.config import settings -from tests.utils.item import create_random_item - - -def test_create_item( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - data = {"title": "Foo", "description": "Fighters"} - response = client.post( - f"{settings.API_V1_STR}/items/", - headers=superuser_token_headers, - json=data, - ) - assert response.status_code == 200 - content = response.json() - assert content["title"] == data["title"] - assert content["description"] == data["description"] - assert "id" in 
content - assert "owner_id" in content - - -def test_read_item( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - item = create_random_item(db) - response = client.get( - f"{settings.API_V1_STR}/items/{item.id}", - headers=superuser_token_headers, - ) - assert response.status_code == 200 - content = response.json() - assert content["title"] == item.title - assert content["description"] == item.description - assert content["id"] == str(item.id) - assert content["owner_id"] == str(item.owner_id) - - -def test_read_item_not_found( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - response = client.get( - f"{settings.API_V1_STR}/items/{uuid.uuid4()}", - headers=superuser_token_headers, - ) - assert response.status_code == 404 - content = response.json() - assert content["detail"] == "Item not found" - - -def test_read_item_not_enough_permissions( - client: TestClient, normal_user_token_headers: dict[str, str], db: Session -) -> None: - item = create_random_item(db) - response = client.get( - f"{settings.API_V1_STR}/items/{item.id}", - headers=normal_user_token_headers, - ) - assert response.status_code == 400 - content = response.json() - assert content["detail"] == "Not enough permissions" - - -def test_read_items( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - create_random_item(db) - create_random_item(db) - response = client.get( - f"{settings.API_V1_STR}/items/", - headers=superuser_token_headers, - ) - assert response.status_code == 200 - content = response.json() - assert len(content["data"]) >= 2 - - -def test_update_item( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - item = create_random_item(db) - data = {"title": "Updated title", "description": "Updated description"} - response = client.put( - f"{settings.API_V1_STR}/items/{item.id}", - headers=superuser_token_headers, - json=data, - ) - assert 
response.status_code == 200 - content = response.json() - assert content["title"] == data["title"] - assert content["description"] == data["description"] - assert content["id"] == str(item.id) - assert content["owner_id"] == str(item.owner_id) - - -def test_update_item_not_found( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - data = {"title": "Updated title", "description": "Updated description"} - response = client.put( - f"{settings.API_V1_STR}/items/{uuid.uuid4()}", - headers=superuser_token_headers, - json=data, - ) - assert response.status_code == 404 - content = response.json() - assert content["detail"] == "Item not found" - - -def test_update_item_not_enough_permissions( - client: TestClient, normal_user_token_headers: dict[str, str], db: Session -) -> None: - item = create_random_item(db) - data = {"title": "Updated title", "description": "Updated description"} - response = client.put( - f"{settings.API_V1_STR}/items/{item.id}", - headers=normal_user_token_headers, - json=data, - ) - assert response.status_code == 400 - content = response.json() - assert content["detail"] == "Not enough permissions" - - -def test_delete_item( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - item = create_random_item(db) - response = client.delete( - f"{settings.API_V1_STR}/items/{item.id}", - headers=superuser_token_headers, - ) - assert response.status_code == 200 - content = response.json() - assert content["message"] == "Item deleted successfully" - - -def test_delete_item_not_found( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - response = client.delete( - f"{settings.API_V1_STR}/items/{uuid.uuid4()}", - headers=superuser_token_headers, - ) - assert response.status_code == 404 - content = response.json() - assert content["detail"] == "Item not found" - - -def test_delete_item_not_enough_permissions( - client: TestClient, normal_user_token_headers: dict[str, str], db: 
Session -) -> None: - item = create_random_item(db) - response = client.delete( - f"{settings.API_V1_STR}/items/{item.id}", - headers=normal_user_token_headers, - ) - assert response.status_code == 400 - content = response.json() - assert content["detail"] == "Not enough permissions" diff --git a/backend/tests/api/routes/test_login.py b/backend/tests/api/routes/test_login.py deleted file mode 100644 index ee166913bd..0000000000 --- a/backend/tests/api/routes/test_login.py +++ /dev/null @@ -1,118 +0,0 @@ -from unittest.mock import patch - -from fastapi.testclient import TestClient -from sqlmodel import Session - -from app.core.config import settings -from app.core.security import verify_password -from app.crud import create_user -from app.models import UserCreate -from app.utils import generate_password_reset_token -from tests.utils.user import user_authentication_headers -from tests.utils.utils import random_email, random_lower_string - - -def test_get_access_token(client: TestClient) -> None: - login_data = { - "username": settings.FIRST_SUPERUSER, - "password": settings.FIRST_SUPERUSER_PASSWORD, - } - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) - tokens = r.json() - assert r.status_code == 200 - assert "access_token" in tokens - assert tokens["access_token"] - - -def test_get_access_token_incorrect_password(client: TestClient) -> None: - login_data = { - "username": settings.FIRST_SUPERUSER, - "password": "incorrect", - } - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) - assert r.status_code == 400 - - -def test_use_access_token( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - r = client.post( - f"{settings.API_V1_STR}/login/test-token", - headers=superuser_token_headers, - ) - result = r.json() - assert r.status_code == 200 - assert "email" in result - - -def test_recovery_password( - client: TestClient, normal_user_token_headers: dict[str, str] -) -> None: - with ( - 
patch("app.core.config.settings.SMTP_HOST", "smtp.example.com"), - patch("app.core.config.settings.SMTP_USER", "admin@example.com"), - ): - email = "test@example.com" - r = client.post( - f"{settings.API_V1_STR}/password-recovery/{email}", - headers=normal_user_token_headers, - ) - assert r.status_code == 200 - assert r.json() == {"message": "Password recovery email sent"} - - -def test_recovery_password_user_not_exits( - client: TestClient, normal_user_token_headers: dict[str, str] -) -> None: - email = "jVgQr@example.com" - r = client.post( - f"{settings.API_V1_STR}/password-recovery/{email}", - headers=normal_user_token_headers, - ) - assert r.status_code == 404 - - -def test_reset_password(client: TestClient, db: Session) -> None: - email = random_email() - password = random_lower_string() - new_password = random_lower_string() - - user_create = UserCreate( - email=email, - full_name="Test User", - password=password, - is_active=True, - is_superuser=False, - ) - user = create_user(session=db, user_create=user_create) - token = generate_password_reset_token(email=email) - headers = user_authentication_headers(client=client, email=email, password=password) - data = {"new_password": new_password, "token": token} - - r = client.post( - f"{settings.API_V1_STR}/reset-password/", - headers=headers, - json=data, - ) - - assert r.status_code == 200 - assert r.json() == {"message": "Password updated successfully"} - - db.refresh(user) - assert verify_password(new_password, user.hashed_password) - - -def test_reset_password_invalid_token( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - data = {"new_password": "changethis", "token": "invalid"} - r = client.post( - f"{settings.API_V1_STR}/reset-password/", - headers=superuser_token_headers, - json=data, - ) - response = r.json() - - assert "detail" in response - assert r.status_code == 400 - assert response["detail"] == "Invalid token" diff --git a/backend/tests/api/routes/test_private.py 
b/backend/tests/api/routes/test_private.py deleted file mode 100644 index 1e1f985021..0000000000 --- a/backend/tests/api/routes/test_private.py +++ /dev/null @@ -1,26 +0,0 @@ -from fastapi.testclient import TestClient -from sqlmodel import Session, select - -from app.core.config import settings -from app.models import User - - -def test_create_user(client: TestClient, db: Session) -> None: - r = client.post( - f"{settings.API_V1_STR}/private/users/", - json={ - "email": "pollo@listo.com", - "password": "password123", - "full_name": "Pollo Listo", - }, - ) - - assert r.status_code == 200 - - data = r.json() - - user = db.exec(select(User).where(User.id == data["id"])).first() - - assert user - assert user.email == "pollo@listo.com" - assert user.full_name == "Pollo Listo" diff --git a/backend/tests/api/routes/test_users.py b/backend/tests/api/routes/test_users.py deleted file mode 100644 index 39e053e554..0000000000 --- a/backend/tests/api/routes/test_users.py +++ /dev/null @@ -1,486 +0,0 @@ -import uuid -from unittest.mock import patch - -from fastapi.testclient import TestClient -from sqlmodel import Session, select - -from app import crud -from app.core.config import settings -from app.core.security import verify_password -from app.models import User, UserCreate -from tests.utils.utils import random_email, random_lower_string - - -def test_get_users_superuser_me( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - r = client.get(f"{settings.API_V1_STR}/users/me", headers=superuser_token_headers) - current_user = r.json() - assert current_user - assert current_user["is_active"] is True - assert current_user["is_superuser"] - assert current_user["email"] == settings.FIRST_SUPERUSER - - -def test_get_users_normal_user_me( - client: TestClient, normal_user_token_headers: dict[str, str] -) -> None: - r = client.get(f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers) - current_user = r.json() - assert current_user - assert 
current_user["is_active"] is True - assert current_user["is_superuser"] is False - assert current_user["email"] == settings.EMAIL_TEST_USER - - -def test_create_user_new_email( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - with ( - patch("app.utils.send_email", return_value=None), - patch("app.core.config.settings.SMTP_HOST", "smtp.example.com"), - patch("app.core.config.settings.SMTP_USER", "admin@example.com"), - ): - username = random_email() - password = random_lower_string() - data = {"email": username, "password": password} - r = client.post( - f"{settings.API_V1_STR}/users/", - headers=superuser_token_headers, - json=data, - ) - assert 200 <= r.status_code < 300 - created_user = r.json() - user = crud.get_user_by_email(session=db, email=username) - assert user - assert user.email == created_user["email"] - - -def test_get_existing_user( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - user_id = user.id - r = client.get( - f"{settings.API_V1_STR}/users/{user_id}", - headers=superuser_token_headers, - ) - assert 200 <= r.status_code < 300 - api_user = r.json() - existing_user = crud.get_user_by_email(session=db, email=username) - assert existing_user - assert existing_user.email == api_user["email"] - - -def test_get_existing_user_current_user(client: TestClient, db: Session) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - user_id = user.id - - login_data = { - "username": username, - "password": password, - } - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) - tokens = r.json() - a_token = tokens["access_token"] - headers = 
{"Authorization": f"Bearer {a_token}"} - - r = client.get( - f"{settings.API_V1_STR}/users/{user_id}", - headers=headers, - ) - assert 200 <= r.status_code < 300 - api_user = r.json() - existing_user = crud.get_user_by_email(session=db, email=username) - assert existing_user - assert existing_user.email == api_user["email"] - - -def test_get_existing_user_permissions_error( - client: TestClient, normal_user_token_headers: dict[str, str] -) -> None: - r = client.get( - f"{settings.API_V1_STR}/users/{uuid.uuid4()}", - headers=normal_user_token_headers, - ) - assert r.status_code == 403 - assert r.json() == {"detail": "The user doesn't have enough privileges"} - - -def test_create_user_existing_username( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - # username = email - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - crud.create_user(session=db, user_create=user_in) - data = {"email": username, "password": password} - r = client.post( - f"{settings.API_V1_STR}/users/", - headers=superuser_token_headers, - json=data, - ) - created_user = r.json() - assert r.status_code == 400 - assert "_id" not in created_user - - -def test_create_user_by_normal_user( - client: TestClient, normal_user_token_headers: dict[str, str] -) -> None: - username = random_email() - password = random_lower_string() - data = {"email": username, "password": password} - r = client.post( - f"{settings.API_V1_STR}/users/", - headers=normal_user_token_headers, - json=data, - ) - assert r.status_code == 403 - - -def test_retrieve_users( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - crud.create_user(session=db, user_create=user_in) - - username2 = random_email() - password2 = random_lower_string() - user_in2 = 
UserCreate(email=username2, password=password2) - crud.create_user(session=db, user_create=user_in2) - - r = client.get(f"{settings.API_V1_STR}/users/", headers=superuser_token_headers) - all_users = r.json() - - assert len(all_users["data"]) > 1 - assert "count" in all_users - for item in all_users["data"]: - assert "email" in item - - -def test_update_user_me( - client: TestClient, normal_user_token_headers: dict[str, str], db: Session -) -> None: - full_name = "Updated Name" - email = random_email() - data = {"full_name": full_name, "email": email} - r = client.patch( - f"{settings.API_V1_STR}/users/me", - headers=normal_user_token_headers, - json=data, - ) - assert r.status_code == 200 - updated_user = r.json() - assert updated_user["email"] == email - assert updated_user["full_name"] == full_name - - user_query = select(User).where(User.email == email) - user_db = db.exec(user_query).first() - assert user_db - assert user_db.email == email - assert user_db.full_name == full_name - - -def test_update_password_me( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - new_password = random_lower_string() - data = { - "current_password": settings.FIRST_SUPERUSER_PASSWORD, - "new_password": new_password, - } - r = client.patch( - f"{settings.API_V1_STR}/users/me/password", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 200 - updated_user = r.json() - assert updated_user["message"] == "Password updated successfully" - - user_query = select(User).where(User.email == settings.FIRST_SUPERUSER) - user_db = db.exec(user_query).first() - assert user_db - assert user_db.email == settings.FIRST_SUPERUSER - assert verify_password(new_password, user_db.hashed_password) - - # Revert to the old password to keep consistency in test - old_data = { - "current_password": new_password, - "new_password": settings.FIRST_SUPERUSER_PASSWORD, - } - r = client.patch( - f"{settings.API_V1_STR}/users/me/password", - 
headers=superuser_token_headers, - json=old_data, - ) - db.refresh(user_db) - - assert r.status_code == 200 - assert verify_password(settings.FIRST_SUPERUSER_PASSWORD, user_db.hashed_password) - - -def test_update_password_me_incorrect_password( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - new_password = random_lower_string() - data = {"current_password": new_password, "new_password": new_password} - r = client.patch( - f"{settings.API_V1_STR}/users/me/password", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 400 - updated_user = r.json() - assert updated_user["detail"] == "Incorrect password" - - -def test_update_user_me_email_exists( - client: TestClient, normal_user_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - - data = {"email": user.email} - r = client.patch( - f"{settings.API_V1_STR}/users/me", - headers=normal_user_token_headers, - json=data, - ) - assert r.status_code == 409 - assert r.json()["detail"] == "User with this email already exists" - - -def test_update_password_me_same_password_error( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - data = { - "current_password": settings.FIRST_SUPERUSER_PASSWORD, - "new_password": settings.FIRST_SUPERUSER_PASSWORD, - } - r = client.patch( - f"{settings.API_V1_STR}/users/me/password", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 400 - updated_user = r.json() - assert ( - updated_user["detail"] == "New password cannot be the same as the current one" - ) - - -def test_register_user(client: TestClient, db: Session) -> None: - username = random_email() - password = random_lower_string() - full_name = random_lower_string() - data = {"email": username, "password": password, "full_name": full_name} - r = 
client.post( - f"{settings.API_V1_STR}/users/signup", - json=data, - ) - assert r.status_code == 200 - created_user = r.json() - assert created_user["email"] == username - assert created_user["full_name"] == full_name - - user_query = select(User).where(User.email == username) - user_db = db.exec(user_query).first() - assert user_db - assert user_db.email == username - assert user_db.full_name == full_name - assert verify_password(password, user_db.hashed_password) - - -def test_register_user_already_exists_error(client: TestClient) -> None: - password = random_lower_string() - full_name = random_lower_string() - data = { - "email": settings.FIRST_SUPERUSER, - "password": password, - "full_name": full_name, - } - r = client.post( - f"{settings.API_V1_STR}/users/signup", - json=data, - ) - assert r.status_code == 400 - assert r.json()["detail"] == "The user with this email already exists in the system" - - -def test_update_user( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - - data = {"full_name": "Updated_full_name"} - r = client.patch( - f"{settings.API_V1_STR}/users/{user.id}", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 200 - updated_user = r.json() - - assert updated_user["full_name"] == "Updated_full_name" - - user_query = select(User).where(User.email == username) - user_db = db.exec(user_query).first() - db.refresh(user_db) - assert user_db - assert user_db.full_name == "Updated_full_name" - - -def test_update_user_not_exists( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - data = {"full_name": "Updated_full_name"} - r = client.patch( - f"{settings.API_V1_STR}/users/{uuid.uuid4()}", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 404 - assert 
r.json()["detail"] == "The user with this id does not exist in the system" - - -def test_update_user_email_exists( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - - username2 = random_email() - password2 = random_lower_string() - user_in2 = UserCreate(email=username2, password=password2) - user2 = crud.create_user(session=db, user_create=user_in2) - - data = {"email": user2.email} - r = client.patch( - f"{settings.API_V1_STR}/users/{user.id}", - headers=superuser_token_headers, - json=data, - ) - assert r.status_code == 409 - assert r.json()["detail"] == "User with this email already exists" - - -def test_delete_user_me(client: TestClient, db: Session) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - user_id = user.id - - login_data = { - "username": username, - "password": password, - } - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) - tokens = r.json() - a_token = tokens["access_token"] - headers = {"Authorization": f"Bearer {a_token}"} - - r = client.delete( - f"{settings.API_V1_STR}/users/me", - headers=headers, - ) - assert r.status_code == 200 - deleted_user = r.json() - assert deleted_user["message"] == "User deleted successfully" - result = db.exec(select(User).where(User.id == user_id)).first() - assert result is None - - user_query = select(User).where(User.id == user_id) - user_db = db.execute(user_query).first() - assert user_db is None - - -def test_delete_user_me_as_superuser( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - r = client.delete( - f"{settings.API_V1_STR}/users/me", - headers=superuser_token_headers, - ) - assert 
r.status_code == 403 - response = r.json() - assert response["detail"] == "Super users are not allowed to delete themselves" - - -def test_delete_user_super_user( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - user_id = user.id - r = client.delete( - f"{settings.API_V1_STR}/users/{user_id}", - headers=superuser_token_headers, - ) - assert r.status_code == 200 - deleted_user = r.json() - assert deleted_user["message"] == "User deleted successfully" - result = db.exec(select(User).where(User.id == user_id)).first() - assert result is None - - -def test_delete_user_not_found( - client: TestClient, superuser_token_headers: dict[str, str] -) -> None: - r = client.delete( - f"{settings.API_V1_STR}/users/{uuid.uuid4()}", - headers=superuser_token_headers, - ) - assert r.status_code == 404 - assert r.json()["detail"] == "User not found" - - -def test_delete_user_current_super_user_error( - client: TestClient, superuser_token_headers: dict[str, str], db: Session -) -> None: - super_user = crud.get_user_by_email(session=db, email=settings.FIRST_SUPERUSER) - assert super_user - user_id = super_user.id - - r = client.delete( - f"{settings.API_V1_STR}/users/{user_id}", - headers=superuser_token_headers, - ) - assert r.status_code == 403 - assert r.json()["detail"] == "Super users are not allowed to delete themselves" - - -def test_delete_user_without_privileges( - client: TestClient, normal_user_token_headers: dict[str, str], db: Session -) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - - r = client.delete( - f"{settings.API_V1_STR}/users/{user.id}", - headers=normal_user_token_headers, - ) - assert r.status_code == 
403 - assert r.json()["detail"] == "The user doesn't have enough privileges" diff --git a/backend/tests/conftest.py b/backend/tests/conftest.py index 8ddab7b321..a4541bab71 100644 --- a/backend/tests/conftest.py +++ b/backend/tests/conftest.py @@ -2,22 +2,76 @@ import pytest from fastapi.testclient import TestClient -from sqlmodel import Session, delete +from sqlmodel import Session, SQLModel, delete from app.core.config import settings -from app.core.db import engine, init_db +from app.core.db import engine from app.main import app -from app.models import Item, User -from tests.utils.user import authentication_token_from_email -from tests.utils.utils import get_superuser_token_headers +from app.models.user import User +from app.models.otp import OTP +from app.core import security +from app.enums.user_enum import UserRole +from datetime import timedelta +from sqlmodel import select + + +# Helper utilities used by tests (keeps tests self-contained when utils/ is missing) +def authentication_token_from_email(client: TestClient, email: str, db: Session) -> dict[str, str]: + statement = select(User).where(User.email == email) + user = db.exec(statement).first() + if not user: + hashed = security.get_password_hash("password") + user = User(email=email, hashed_password=hashed, first_name="Test", last_name="User") + db.add(user) + db.commit() + db.refresh(user) + + token = security.create_access_token(subject=str(user.id), expires_delta=timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)) + return {"Authorization": f"Bearer {token}"} + + +def get_superuser_token_headers(client: TestClient, db: Session) -> dict[str, str]: + email = settings.FIRST_SUPERUSER if hasattr(settings, "FIRST_SUPERUSER") else "super@example.com" + password = settings.FIRST_SUPERUSER_PASSWORD if hasattr(settings, "FIRST_SUPERUSER_PASSWORD") else "password" + statement = select(User).where(User.email == email) + user = db.exec(statement).first() + if not user: + hashed = 
security.get_password_hash(password) + user = User(email=email, hashed_password=hashed, first_name="Super", last_name="User", role=UserRole.admin) + db.add(user) + db.commit() + db.refresh(user) + else: + # ensure role is admin + user.role = UserRole.admin + db.add(user) + db.commit() + db.refresh(user) + + token = security.create_access_token(subject=str(user.id), expires_delta=timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)) + return {"Authorization": f"Bearer {token}"} + + +# Small test utilities used across tests when `tests.utils` helpers are not present +def random_lower_string(length: int = 8) -> str: + import random + import string + + return "".join(random.choice(string.ascii_lowercase) for _ in range(length)) + + +def random_email() -> str: + return f"{random_lower_string()}@example.com" @pytest.fixture(scope="session", autouse=True) def db() -> Generator[Session, None, None]: + # Ensure all SQLModel models are registered/imported and tables created + SQLModel.metadata.create_all(engine) with Session(engine) as session: - init_db(session) yield session - statement = delete(Item) + # Clean up created rows after the test session + statement = delete(OTP) session.execute(statement) statement = delete(User) session.execute(statement) diff --git a/backend/tests/crud/test_user.py b/backend/tests/crud/test_user.py index 10bda25e25..4482b31f93 100644 --- a/backend/tests/crud/test_user.py +++ b/backend/tests/crud/test_user.py @@ -1,9 +1,8 @@ from fastapi.encoders import jsonable_encoder from sqlmodel import Session -from app import crud -from app.core.security import verify_password -from app.models import User, UserCreate, UserUpdate +from app.services import auth_service as crud +from app.models.user import User, UserCreate, UserUpdate from tests.utils.utils import random_email, random_lower_string @@ -14,78 +13,3 @@ def test_create_user(db: Session) -> None: user = crud.create_user(session=db, user_create=user_in) assert user.email == email assert 
hasattr(user, "hashed_password") - - -def test_authenticate_user(db: Session) -> None: - email = random_email() - password = random_lower_string() - user_in = UserCreate(email=email, password=password) - user = crud.create_user(session=db, user_create=user_in) - authenticated_user = crud.authenticate(session=db, email=email, password=password) - assert authenticated_user - assert user.email == authenticated_user.email - - -def test_not_authenticate_user(db: Session) -> None: - email = random_email() - password = random_lower_string() - user = crud.authenticate(session=db, email=email, password=password) - assert user is None - - -def test_check_if_user_is_active(db: Session) -> None: - email = random_email() - password = random_lower_string() - user_in = UserCreate(email=email, password=password) - user = crud.create_user(session=db, user_create=user_in) - assert user.is_active is True - - -def test_check_if_user_is_active_inactive(db: Session) -> None: - email = random_email() - password = random_lower_string() - user_in = UserCreate(email=email, password=password, disabled=True) - user = crud.create_user(session=db, user_create=user_in) - assert user.is_active - - -def test_check_if_user_is_superuser(db: Session) -> None: - email = random_email() - password = random_lower_string() - user_in = UserCreate(email=email, password=password, is_superuser=True) - user = crud.create_user(session=db, user_create=user_in) - assert user.is_superuser is True - - -def test_check_if_user_is_superuser_normal_user(db: Session) -> None: - username = random_email() - password = random_lower_string() - user_in = UserCreate(email=username, password=password) - user = crud.create_user(session=db, user_create=user_in) - assert user.is_superuser is False - - -def test_get_user(db: Session) -> None: - password = random_lower_string() - username = random_email() - user_in = UserCreate(email=username, password=password, is_superuser=True) - user = crud.create_user(session=db, 
user_create=user_in) - user_2 = db.get(User, user.id) - assert user_2 - assert user.email == user_2.email - assert jsonable_encoder(user) == jsonable_encoder(user_2) - - -def test_update_user(db: Session) -> None: - password = random_lower_string() - email = random_email() - user_in = UserCreate(email=email, password=password, is_superuser=True) - user = crud.create_user(session=db, user_create=user_in) - new_password = random_lower_string() - user_in_update = UserUpdate(password=new_password, is_superuser=True) - if user.id is not None: - crud.update_user(session=db, db_user=user, user_in=user_in_update) - user_2 = db.get(User, user.id) - assert user_2 - assert user.email == user_2.email - assert verify_password(new_password, user_2.hashed_password) diff --git a/backend/tests/e2e/test_websocket_flow.py b/backend/tests/e2e/test_websocket_flow.py new file mode 100644 index 0000000000..f9b537df01 --- /dev/null +++ b/backend/tests/e2e/test_websocket_flow.py @@ -0,0 +1,29 @@ +import pytest +import asyncio + +from app.api.websocket_manager import WebSocketManager + + +class FakeWebSocket: + def __init__(self): + self.sent = [] + + async def send_text(self, message): + self.sent.append(message) + + +@pytest.mark.asyncio +async def test_broadcast_to_local(): + fake_redis = type("R", (), {"publish": lambda *a, **k: asyncio.sleep(0)})() + mgr = WebSocketManager(fake_redis) + + ws1 = FakeWebSocket() + ws2 = FakeWebSocket() + + # directly insert connections + mgr.connections.setdefault("room1", set()).add(ws1) + mgr.connections.setdefault("room1", set()).add(ws2) + + await mgr._broadcast_to_local("room1", "hello") + assert "hello" in ws1.sent + assert "hello" in ws2.sent diff --git a/backend/tests/integration/test_celery_tasks.py b/backend/tests/integration/test_celery_tasks.py new file mode 100644 index 0000000000..e90342632f --- /dev/null +++ b/backend/tests/integration/test_celery_tasks.py @@ -0,0 +1,7 @@ +import pytest + +from app.core.celery_app import celery_app + + 
+def test_celery_app_exists(): + assert celery_app is not None diff --git a/backend/tests/integration/test_redis_client.py b/backend/tests/integration/test_redis_client.py new file mode 100644 index 0000000000..801edcdd90 --- /dev/null +++ b/backend/tests/integration/test_redis_client.py @@ -0,0 +1,49 @@ +import asyncio +import pytest + +import redis.asyncio as aioredis + +from app.core import redis as redis_module + + +class DummyPool: + pass + + +@pytest.mark.asyncio +async def test_get_redis_pool_is_singleton(monkeypatch): + called = 0 + + async def fake_from_url(url, **kwargs): + nonlocal called + called += 1 + return DummyPool() + + monkeypatch.setattr(aioredis.ConnectionPool, "from_url", staticmethod(fake_from_url)) + + p1 = await redis_module.get_redis_pool() + p2 = await redis_module.get_redis_pool() + assert p1 is p2 + assert called == 1 + + +@pytest.mark.asyncio +async def test_get_redis_generator(monkeypatch): + class FakeRedis: + async def close(self): + pass + + async def fake_from_url(url, **kwargs): + return DummyPool() + + monkeypatch.setattr(aioredis.ConnectionPool, "from_url", staticmethod(fake_from_url)) + + # ensure generator yields a redis client and closes without error + gen = redis_module.get_redis() + client = await gen.__anext__() + assert hasattr(client, "close") + # finalize generator + try: + await gen.__anext__() + except StopAsyncIteration: + pass diff --git a/backend/tests/integration/test_websocket_manager.py b/backend/tests/integration/test_websocket_manager.py new file mode 100644 index 0000000000..2312477523 --- /dev/null +++ b/backend/tests/integration/test_websocket_manager.py @@ -0,0 +1,50 @@ +import asyncio +import pytest + +from app.api.websocket_manager import WebSocketManager + + +class FakePubSub: + def __init__(self, messages): + self._messages = messages + + async def psubscribe(self, pattern): + return None + + async def listen(self): + for m in self._messages: + yield m + + async def close(self): + pass + + +class 
FakeRedis: + def __init__(self, messages=None): + self._messages = messages or [] + + def pubsub(self): + return FakePubSub(self._messages) + + async def publish(self, channel, message): + # pretend to publish + return 1 + + +@pytest.mark.asyncio +async def test_websocket_manager_start_stop(): + fake_redis = FakeRedis(messages=[{"type":"pmessage","pattern":"ws:*","channel":"ws:room1","data":"hello"}]) + mgr = WebSocketManager(fake_redis) + + await mgr.start() + assert mgr._listen_task is not None + + await mgr.stop() + assert mgr._listen_task is None or mgr._listen_task.cancelled() + + +@pytest.mark.asyncio +async def test_publish_no_error(): + fake_redis = FakeRedis() + mgr = WebSocketManager(fake_redis) + await mgr.publish("room1", "msg") diff --git a/backend/tests/unit/test_cache_service.py b/backend/tests/unit/test_cache_service.py new file mode 100644 index 0000000000..8e09ca2e52 --- /dev/null +++ b/backend/tests/unit/test_cache_service.py @@ -0,0 +1,37 @@ +import pytest +import asyncio + +from app.core.redis import CacheService + + +class FakeRedis: + def __init__(self): + self.store = {} + + async def get(self, key): + return self.store.get(key) + + async def set(self, key, value, ex=None): + self.store[key] = value + + async def delete(self, key): + self.store.pop(key, None) + + async def exists(self, key): + return 1 if key in self.store else 0 + + +@pytest.mark.asyncio +async def test_cache_set_get_delete_exists(): + redis = FakeRedis() + cache = CacheService(redis) + + await cache.set("k", {"a": 1}) + v = await cache.get("k") + assert v == {"a": 1} + + assert await cache.exists("k") is True + + await cache.delete("k") + assert await cache.get("k") is None + assert await cache.exists("k") is False diff --git a/backend/tests/unit/test_middlewares.py b/backend/tests/unit/test_middlewares.py new file mode 100644 index 0000000000..5af0a09881 --- /dev/null +++ b/backend/tests/unit/test_middlewares.py @@ -0,0 +1,21 @@ +import pytest +from fastapi import FastAPI 
+from starlette.requests import Request
+
+from app.middlewares.logger import RequestLoggerMiddleware
+
+
+@pytest.mark.asyncio
+async def test_request_logger_middleware_sets_headers():
+    app = FastAPI()
+    mw = RequestLoggerMiddleware(app)
+
+    async def call_next(req):
+        class Resp:
+            status_code = 200
+            headers = {}
+        return Resp()
+
+    resp = await mw.dispatch(Request(scope={"type": "http", "method": "GET", "path": "/", "headers": []}), call_next)
+    assert "X-Process-Time" in resp.headers
+    assert "X-Request-ID" in resp.headers
diff --git a/backend/tests/unit/test_rate_limiter.py b/backend/tests/unit/test_rate_limiter.py
new file mode 100644
index 0000000000..fe725f62c0
--- /dev/null
+++ b/backend/tests/unit/test_rate_limiter.py
@@ -0,0 +1,49 @@
+import asyncio
+import pytest
+import time
+
+from app.middlewares.rate_limiter import RateLimiterMiddleware
+from starlette.requests import Request
+from fastapi import FastAPI
+
+
+@pytest.mark.asyncio
+async def test_rate_limiter_allows_under_limit():
+    app = FastAPI()
+    mw = RateLimiterMiddleware(app, requests_per_minute=5, window_seconds=1)
+
+    scope = {"type": "http", "method": "GET", "path": "/",
+             "headers": [], "client": ("127.0.0.1", 1234)}
+
+    async def call_next(req):
+        class Resp:
+            headers = {}
+        return Resp()
+
+    # send 3 requests quickly
+    for _ in range(3):
+        allowed_resp = await mw.dispatch(Request(scope=scope), call_next)
+        assert allowed_resp.headers["X-RateLimit-Limit"] == "5"
+
+
+@pytest.mark.asyncio
+async def test_rate_limiter_blocks_over_limit():
+    app = FastAPI()
+    mw = RateLimiterMiddleware(app, requests_per_minute=2, window_seconds=1)
+
+    async def call_next(req):
+        class Resp:
+            headers = {}
+        return Resp()
+
+    # a valid HTTP scope that carries the client address
+    scope = {"type": "http", "method": "GET", "path": "/",
+             "headers": [], "client": ("127.0.0.1", 1234)}
+
+    # first two allowed
+    await mw.dispatch(Request(scope=scope), call_next)
+    await mw.dispatch(Request(scope=scope), call_next)
+
+    # third should raise HTTPException
+    with pytest.raises(Exception):
+        await
mw.dispatch(Request(scope={"type":"http"}), call_next) diff --git a/backend/tests/unit/test_webengage_email.py b/backend/tests/unit/test_webengage_email.py new file mode 100644 index 0000000000..ecc5c9fea9 --- /dev/null +++ b/backend/tests/unit/test_webengage_email.py @@ -0,0 +1,60 @@ +import pytest + +from app.services.webengage_email import send_email +from app.core import config + + +class DummyResponse: + def raise_for_status(self): + return None + + def json(self): + return {"status": "sent"} + + +class DummyAsyncClient: + def __init__(self, *args, **kwargs): + self.called = None + + async def __aenter__(self): + return self + + async def __aexit__(self, exc_type, exc, tb): + return False + + async def post(self, url, json=None, headers=None): + self.called = {"url": url, "json": json, "headers": headers} + return DummyResponse() + + +@pytest.mark.asyncio +async def test_send_email_success(monkeypatch): + # Enable webengage settings + monkeypatch.setattr(config.settings, "WEBENGAGE_API_URL", "https://api.webengage.test") + monkeypatch.setattr(config.settings, "WEBENGAGE_API_KEY", "fake-key") + + # Patch httpx.AsyncClient used by the module + import httpx as _httpx + + monkeypatch.setattr(_httpx, "AsyncClient", DummyAsyncClient) + + result = await send_email( + to_email="to@example.com", + subject="Hi", + template_id=None, + variables={"name": "Alice"}, + from_email="from@example.com", + from_name="Sender", + ) + + assert result == {"status": "sent"} + + +@pytest.mark.asyncio +async def test_send_email_not_configured(monkeypatch): + # Ensure webengage disabled + monkeypatch.setattr(config.settings, "WEBENGAGE_API_URL", None) + monkeypatch.setattr(config.settings, "WEBENGAGE_API_KEY", None) + + with pytest.raises(RuntimeError): + await send_email("a@b.com", "s") diff --git a/backend/tests/utils/item.py b/backend/tests/utils/item.py deleted file mode 100644 index ee51b351a6..0000000000 --- a/backend/tests/utils/item.py +++ /dev/null @@ -1,16 +0,0 @@ -from 
sqlmodel import Session - -from app import crud -from app.models import Item, ItemCreate -from tests.utils.user import create_random_user -from tests.utils.utils import random_lower_string - - -def create_random_item(db: Session) -> Item: - user = create_random_user(db) - owner_id = user.id - assert owner_id is not None - title = random_lower_string() - description = random_lower_string() - item_in = ItemCreate(title=title, description=description) - return crud.create_item(session=db, item_in=item_in, owner_id=owner_id) diff --git a/backend/tests/utils/user.py b/backend/tests/utils/user.py deleted file mode 100644 index 5867431ed8..0000000000 --- a/backend/tests/utils/user.py +++ /dev/null @@ -1,49 +0,0 @@ -from fastapi.testclient import TestClient -from sqlmodel import Session - -from app import crud -from app.core.config import settings -from app.models import User, UserCreate, UserUpdate -from tests.utils.utils import random_email, random_lower_string - - -def user_authentication_headers( - *, client: TestClient, email: str, password: str -) -> dict[str, str]: - data = {"username": email, "password": password} - - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=data) - response = r.json() - auth_token = response["access_token"] - headers = {"Authorization": f"Bearer {auth_token}"} - return headers - - -def create_random_user(db: Session) -> User: - email = random_email() - password = random_lower_string() - user_in = UserCreate(email=email, password=password) - user = crud.create_user(session=db, user_create=user_in) - return user - - -def authentication_token_from_email( - *, client: TestClient, email: str, db: Session -) -> dict[str, str]: - """ - Return a valid token for the user with given email. - - If the user doesn't exist it is created first. 
- """ - password = random_lower_string() - user = crud.get_user_by_email(session=db, email=email) - if not user: - user_in_create = UserCreate(email=email, password=password) - user = crud.create_user(session=db, user_create=user_in_create) - else: - user_in_update = UserUpdate(password=password) - if not user.id: - raise Exception("User id not set") - user = crud.update_user(session=db, db_user=user, user_in=user_in_update) - - return user_authentication_headers(client=client, email=email, password=password) diff --git a/backend/tests/utils/utils.py b/backend/tests/utils/utils.py index 184bac44d9..091d576717 100644 --- a/backend/tests/utils/utils.py +++ b/backend/tests/utils/utils.py @@ -1,26 +1,10 @@ import random import string -from fastapi.testclient import TestClient -from app.core.config import settings - - -def random_lower_string() -> str: - return "".join(random.choices(string.ascii_lowercase, k=32)) +def random_lower_string(length: int = 8) -> str: + return "".join(random.choice(string.ascii_lowercase) for _ in range(length)) def random_email() -> str: - return f"{random_lower_string()}@{random_lower_string()}.com" - - -def get_superuser_token_headers(client: TestClient) -> dict[str, str]: - login_data = { - "username": settings.FIRST_SUPERUSER, - "password": settings.FIRST_SUPERUSER_PASSWORD, - } - r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) - tokens = r.json() - a_token = tokens["access_token"] - headers = {"Authorization": f"Bearer {a_token}"} - return headers + return f"{random_lower_string()}@example.com" diff --git a/copier.yml b/copier.yml deleted file mode 100644 index f98e3fc861..0000000000 --- a/copier.yml +++ /dev/null @@ -1,100 +0,0 @@ -project_name: - type: str - help: The name of the project, shown to API users (in .env) - default: FastAPI Project - -stack_name: - type: str - help: The name of the stack used for Docker Compose labels (no spaces) (in .env) - default: fastapi-project - -secret_key: - type: str 
- help: | - 'The secret key for the project, used for security, - stored in .env, you can generate one with: - python -c "import secrets; print(secrets.token_urlsafe(32))"' - default: changethis - -first_superuser: - type: str - help: The email of the first superuser (in .env) - default: admin@example.com - -first_superuser_password: - type: str - help: The password of the first superuser (in .env) - default: changethis - -smtp_host: - type: str - help: The SMTP server host to send emails, you can set it later in .env - default: "" - -smtp_user: - type: str - help: The SMTP server user to send emails, you can set it later in .env - default: "" - -smtp_password: - type: str - help: The SMTP server password to send emails, you can set it later in .env - default: "" - -emails_from_email: - type: str - help: The email account to send emails from, you can set it later in .env - default: info@example.com - -postgres_password: - type: str - help: | - 'The password for the PostgreSQL database, stored in .env, - you can generate one with: - python -c "import secrets; print(secrets.token_urlsafe(32))"' - default: changethis - -sentry_dsn: - type: str - help: The DSN for Sentry, if you are using it, you can set it later in .env - default: "" - -_exclude: - # Global - - .vscode - - .mypy_cache - # Python - - __pycache__ - - app.egg-info - - "*.pyc" - - .mypy_cache - - .coverage - - htmlcov - - .cache - - .venv - # Frontend - # Logs - - logs - - "*.log" - - npm-debug.log* - - yarn-debug.log* - - yarn-error.log* - - pnpm-debug.log* - - lerna-debug.log* - - node_modules - - dist - - dist-ssr - - "*.local" - # Editor directories and files - - .idea - - .DS_Store - - "*.suo" - - "*.ntvs*" - - "*.njsproj" - - "*.sln" - - "*.sw?" 
- -_answers_file: .copier/.copier-answers.yml - -_tasks: - - ["{{ _copier_python }}", .copier/update_dotenv.py] diff --git a/docker-compose.override.test-dev.yml b/docker-compose.override.test-dev.yml new file mode 100644 index 0000000000..d2dad60fc3 --- /dev/null +++ b/docker-compose.override.test-dev.yml @@ -0,0 +1,97 @@ +services: + + # Local services are available on their ports, but also available on: + # http://api.localhost.tiangolo.com: backend + # http://dashboard.localhost.tiangolo.com: frontend + # etc. To enable it, update .env, set: + # DOMAIN=localhost.tiangolo.com + + db: + restart: "no" + ports: + - "5432:5432" + + backend: + restart: "no" + ports: + - "8000:8000" + build: + context: ./backend + command: + - fastapi + - run + - --reload + - "app/main.py" + develop: + watch: + - path: ./backend + action: sync + target: /app + ignore: + - ./backend/.venv + - .venv + - path: ./backend/pyproject.toml + action: rebuild + volumes: + - ./backend/htmlcov:/app/htmlcov + environment: + SMTP_HOST: "mailcatcher" + SMTP_PORT: "1025" + SMTP_TLS: "false" + EMAILS_FROM_EMAIL: "noreply@example.com" + + mailcatcher: + image: schickling/mailcatcher + ports: + - "1080:1080" + - "1025:1025" + + redis: + image: redis:7-alpine + restart: "no" + ports: + - "6379:6379" + + networks: + - default + volumes: + - redis-data:/data + + frontend: + restart: "no" + ports: + - "5173:80" + build: + context: ./frontend + args: + - VITE_API_URL=http://localhost:8000 + - NODE_ENV=development + + playwright: + build: + context: ./frontend + dockerfile: Dockerfile.playwright + args: + - VITE_API_URL=http://backend:8000 + - NODE_ENV=production + ipc: host + depends_on: + - backend + - mailcatcher + env_file: + - .env + environment: + - VITE_API_URL=http://backend:8000 + - MAILCATCHER_HOST=http://mailcatcher:1080 + - PLAYWRIGHT_HTML_HOST=0.0.0.0 + - CI=${CI} + volumes: + - ./frontend/blob-report:/app/blob-report + - ./frontend/test-results:/app/test-results + ports: + - 9323:9323 + +# 
Traefik network removed for local dev + +volumes: + redis-data: diff --git a/docker-compose.override.yml b/docker-compose.override.yml index 0751abe901..c10788bcc1 100644 --- a/docker-compose.override.yml +++ b/docker-compose.override.yml @@ -50,11 +50,6 @@ services: ports: - "5432:5432" - adminer: - restart: "no" - ports: - - "8080:8080" - backend: restart: "no" ports: diff --git a/docker-compose.test-dev.yml b/docker-compose.test-dev.yml new file mode 100644 index 0000000000..d29a2ff2dc --- /dev/null +++ b/docker-compose.test-dev.yml @@ -0,0 +1,124 @@ +services: + db: + image: postgres:17 + restart: always + healthcheck: + test: [ "CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}" ] + interval: 10s + retries: 5 + start_period: 30s + timeout: 10s + volumes: + - app-db-data:/var/lib/postgresql/data/pgdata + env_file: + - .env + environment: + - PGDATA=/var/lib/postgresql/data/pgdata + - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not set} + - POSTGRES_USER=${POSTGRES_USER?Variable not set} + - POSTGRES_DB=${POSTGRES_DB?Variable not set} + + prestart: + image: '${DOCKER_IMAGE_BACKEND?Variable not set}:${TAG-latest}' + build: + context: ./backend + networks: + - default + depends_on: + db: + condition: service_healthy + restart: true + command: bash scripts/prestart.sh + env_file: + - .env + environment: + - DOMAIN=${DOMAIN} + - FRONTEND_HOST=${FRONTEND_HOST?Variable not set} + - ENVIRONMENT=${ENVIRONMENT} + - BACKEND_CORS_ORIGINS=${BACKEND_CORS_ORIGINS} + - SECRET_KEY=${SECRET_KEY?Variable not set} + - FIRST_SUPERUSER=${FIRST_SUPERUSER?Variable not set} + - FIRST_SUPERUSER_PASSWORD=${FIRST_SUPERUSER_PASSWORD?Variable not set} + - SMTP_HOST=${SMTP_HOST} + - SMTP_USER=${SMTP_USER} + - SMTP_PASSWORD=${SMTP_PASSWORD} + - EMAILS_FROM_EMAIL=${EMAILS_FROM_EMAIL} + - POSTGRES_SERVER=db + - POSTGRES_PORT=${POSTGRES_PORT} + - POSTGRES_DB=${POSTGRES_DB} + - POSTGRES_USER=${POSTGRES_USER?Variable not set} + - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not 
set} + - SENTRY_DSN=${SENTRY_DSN} + + backend: + image: '${DOCKER_IMAGE_BACKEND?Variable not set}:${TAG-latest}' + restart: always + networks: + - default + depends_on: + db: + condition: service_healthy + restart: true + prestart: + condition: service_completed_successfully + env_file: + - .env + environment: + - DOMAIN=${DOMAIN} + - FRONTEND_HOST=${FRONTEND_HOST?Variable not set} + - ENVIRONMENT=${ENVIRONMENT} + - BACKEND_CORS_ORIGINS=${BACKEND_CORS_ORIGINS} + - SECRET_KEY=${SECRET_KEY?Variable not set} + - FIRST_SUPERUSER=${FIRST_SUPERUSER?Variable not set} + - FIRST_SUPERUSER_PASSWORD=${FIRST_SUPERUSER_PASSWORD?Variable not set} + - SMTP_HOST=${SMTP_HOST} + - SMTP_USER=${SMTP_USER} + - SMTP_PASSWORD=${SMTP_PASSWORD} + - EMAILS_FROM_EMAIL=${EMAILS_FROM_EMAIL} + - POSTGRES_SERVER=db + - POSTGRES_PORT=${POSTGRES_PORT} + - POSTGRES_DB=${POSTGRES_DB} + - POSTGRES_USER=${POSTGRES_USER?Variable not set} + - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not set} + - SENTRY_DSN=${SENTRY_DSN} + + healthcheck: + test: [ "CMD", "curl", "-f", "http://localhost:8000/api/v1/utils/health-check/" ] + interval: 10s + timeout: 5s + retries: 5 + + build: + context: ./backend + # Traefik labels removed for dev environment + + frontend: + image: '${DOCKER_IMAGE_FRONTEND?Variable not set}:${TAG-latest}' + restart: always + networks: + - default + build: + context: ./frontend + args: + - VITE_API_URL=https://api.${DOMAIN?Variable not set} + - NODE_ENV=production + # Traefik labels removed for dev environment + + # Redis service added for dev/test usage + redis: + image: redis:7-alpine + restart: always + ports: + - "6379:6379" + networks: + - default + + volumes: + - redis-data:/data + +volumes: + app-db-data: + redis-data: + + +networks: # Traefik network removed for dev/test environment diff --git a/docker-compose.yml b/docker-compose.yml index b1aa17ed43..bb2feb790a 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -1,10 +1,12 @@ services: db: - image: postgres:17 + 
+    build:
+      context: ./docker/postgres-pgvector
+    image: organyz/postgres-pgvector:18
     restart: always
     healthcheck:
-      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
+      test: [ "CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}" ]
       interval: 10s
       retries: 5
       start_period: 30s
@@ -19,6 +21,17 @@ services:
       - POSTGRES_USER=${POSTGRES_USER?Variable not set}
       - POSTGRES_DB=${POSTGRES_DB?Variable not set}
 
+  redis:
+    image: redis:7
+    restart: always
+    volumes:
+      - redis-data:/data
+    healthcheck:
+      test: [ "CMD", "redis-cli", "ping" ]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+
   adminer:
     image: adminer
     restart: always
@@ -53,6 +66,9 @@ services:
       db:
         condition: service_healthy
         restart: true
+      redis:
+        condition: service_healthy
+        restart: true
     command: bash scripts/prestart.sh
     env_file:
       - .env
@@ -85,6 +101,9 @@ services:
       db:
         condition: service_healthy
         restart: true
+      redis:
+        condition: service_healthy
+        restart: true
       prestart:
         condition: service_completed_successfully
     env_file:
       - .env
@@ -109,7 +128,7 @@ services:
       - SENTRY_DSN=${SENTRY_DSN}
 
     healthcheck:
-      test: ["CMD", "curl", "-f", "http://localhost:8000/api/v1/utils/health-check/"]
+      test: [ "CMD", "curl", "-f", "http://localhost:8000/api/v1/utils/health-check/" ]
       interval: 10s
       timeout: 5s
       retries: 5
@@ -164,6 +183,8 @@ services:
       - traefik.http.routers.${STACK_NAME?Variable not set}-frontend-http.middlewares=https-redirect
 
 volumes:
   app-db-data:
+  redis-data:
+
 networks:
   traefik-public:
diff --git a/docker/postgres-pgvector/Dockerfile b/docker/postgres-pgvector/Dockerfile
new file mode 100644
index 0000000000..df32d9682a
--- /dev/null
+++ b/docker/postgres-pgvector/Dockerfile
@@ -0,0 +1,32 @@
+FROM postgres:18
+
+ENV PGVECTOR_VERSION=
+
+RUN apt-get update \
+    && apt-get install -y --no-install-recommends \
+        ca-certificates \
+        build-essential \
+        git \
+        gcc \
+        make \
+        wget \
+        libssl-dev \
+        postgresql-server-dev-18 \
+    && rm -rf /var/lib/apt/lists/*
+
+RUN if [ -z "${PGVECTOR_VERSION}" ]; then \
+        git clone --depth 1 https://github.com/pgvector/pgvector.git /tmp/pgvector; \
+    else \
+        git clone --depth 1 --branch ${PGVECTOR_VERSION} https://github.com/pgvector/pgvector.git /tmp/pgvector; \
+    fi \
+    && cd /tmp/pgvector \
+    && make \
+    && make install \
+    && cd / \
+    && rm -rf /tmp/pgvector
+
+# Copy initialization SQL scripts into the image so they run on first init
+COPY initdb /docker-entrypoint-initdb.d/
+RUN chmod -R 755 /docker-entrypoint-initdb.d
+
+# Keep the default entrypoint from postgres image
diff --git a/docker/postgres-pgvector/initdb/01-enable-pgvector.sql b/docker/postgres-pgvector/initdb/01-enable-pgvector.sql
new file mode 100644
index 0000000000..ae36c0b39a
--- /dev/null
+++ b/docker/postgres-pgvector/initdb/01-enable-pgvector.sql
@@ -0,0 +1,2 @@
+-- Enable pgvector extension in the default database on initialization
+CREATE EXTENSION IF NOT EXISTS vector;
diff --git a/dockercompose-dev.yml b/dockercompose-dev.yml
new file mode 100644
index 0000000000..79841aee3d
--- /dev/null
+++ b/dockercompose-dev.yml
@@ -0,0 +1,111 @@
+services:
+
+  db:
+    build:
+      context: ./docker/postgres-pgvector
+    image: organyz/postgres-pgvector:18
+    restart: always
+    healthcheck:
+      test: [ "CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}" ]
+      interval: 10s
+      retries: 5
+      start_period: 30s
+      timeout: 10s
+    volumes:
+      - app-db-data:/var/lib/postgresql/data/pgdata
+    env_file:
+      - .env
+    environment:
+      - PGDATA=/var/lib/postgresql/data/pgdata
+      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not set}
+      - POSTGRES_USER=${POSTGRES_USER?Variable not set}
+      - POSTGRES_DB=${POSTGRES_DB?Variable not set}
+
+  # adminer service removed for dev environment
+
+  prestart:
+    image: '${DOCKER_IMAGE_BACKEND?Variable not set}:${TAG-latest}'
+    build:
+      context: ./backend
+    networks:
+      - default
+    depends_on:
+      db:
+        condition: service_healthy
+        restart: true
+    command: bash scripts/prestart.sh
+    env_file:
+      - .env
+    environment:
+      - DOMAIN=${DOMAIN}
+      - FRONTEND_HOST=${FRONTEND_HOST?Variable not set}
+      - ENVIRONMENT=${ENVIRONMENT}
+      - BACKEND_CORS_ORIGINS=${BACKEND_CORS_ORIGINS}
+      - SECRET_KEY=${SECRET_KEY?Variable not set}
+      - FIRST_SUPERUSER=${FIRST_SUPERUSER?Variable not set}
+      - FIRST_SUPERUSER_PASSWORD=${FIRST_SUPERUSER_PASSWORD?Variable not set}
+      - SMTP_HOST=${SMTP_HOST}
+      - SMTP_USER=${SMTP_USER}
+      - SMTP_PASSWORD=${SMTP_PASSWORD}
+      - EMAILS_FROM_EMAIL=${EMAILS_FROM_EMAIL}
+      - POSTGRES_SERVER=db
+      - POSTGRES_PORT=${POSTGRES_PORT}
+      - POSTGRES_DB=${POSTGRES_DB}
+      - POSTGRES_USER=${POSTGRES_USER?Variable not set}
+      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not set}
+      - SENTRY_DSN=${SENTRY_DSN}
+
+  backend:
+    image: '${DOCKER_IMAGE_BACKEND?Variable not set}:${TAG-latest}'
+    restart: always
+    networks:
+      - default
+    depends_on:
+      db:
+        condition: service_healthy
+        restart: true
+      prestart:
+        condition: service_completed_successfully
+    env_file:
+      - .env
+    environment:
+      - DOMAIN=${DOMAIN}
+      - FRONTEND_HOST=${FRONTEND_HOST?Variable not set}
+      - ENVIRONMENT=${ENVIRONMENT}
+      - BACKEND_CORS_ORIGINS=${BACKEND_CORS_ORIGINS}
+      - SECRET_KEY=${SECRET_KEY?Variable not set}
+      - FIRST_SUPERUSER=${FIRST_SUPERUSER?Variable not set}
+      - FIRST_SUPERUSER_PASSWORD=${FIRST_SUPERUSER_PASSWORD?Variable not set}
+      - SMTP_HOST=${SMTP_HOST}
+      - SMTP_USER=${SMTP_USER}
+      - SMTP_PASSWORD=${SMTP_PASSWORD}
+      - EMAILS_FROM_EMAIL=${EMAILS_FROM_EMAIL}
+      - POSTGRES_SERVER=db
+      - POSTGRES_PORT=${POSTGRES_PORT}
+      - POSTGRES_DB=${POSTGRES_DB}
+      - POSTGRES_USER=${POSTGRES_USER?Variable not set}
+      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD?Variable not set}
+      - SENTRY_DSN=${SENTRY_DSN}
+
+    healthcheck:
+      test: [ "CMD", "curl", "-f", "http://localhost:8000/api/v1/utils/health-check/" ]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+
+    build:
+      context: ./backend
+
+  frontend:
+    image: '${DOCKER_IMAGE_FRONTEND?Variable not set}:${TAG-latest}'
+    restart: always
+    networks:
+      - default
+    build:
+      context: ./frontend
+      args:
+        - VITE_API_URL=https://api.${DOMAIN?Variable not set}
+        - NODE_ENV=production
+
+volumes:
+  app-db-data:
diff --git a/scripts/build-push.sh b/scripts/build-push.sh
index 3fa3aa7e6b..b8a5f2fab2 100644
--- a/scripts/build-push.sh
+++ b/scripts/build-push.sh
@@ -7,4 +7,4 @@
 TAG=${TAG?Variable not set} \
 FRONTEND_ENV=${FRONTEND_ENV-production} \
 sh ./scripts/build.sh
-docker-compose -f docker-compose.yml push
+docker-compose -f docker-compose.test-dev.yml -f docker-compose.override.test-dev.yml push
diff --git a/scripts/build.sh b/scripts/build.sh
index 21528c538e..a0c71bfb3d 100644
--- a/scripts/build.sh
+++ b/scripts/build.sh
@@ -3,8 +3,9 @@
 # Exit in case of error
 set -e
 
 TAG=${TAG?Variable not set} \
 FRONTEND_ENV=${FRONTEND_ENV-production} \
 docker-compose \
--f docker-compose.yml \
+-f docker-compose.test-dev.yml \
+-f docker-compose.override.test-dev.yml \
 build
diff --git a/scripts/docker-compose-dev.sh b/scripts/docker-compose-dev.sh
new file mode 100755
index 0000000000..ab60b4c403
--- /dev/null
+++ b/scripts/docker-compose-dev.sh
@@ -0,0 +1,13 @@
+#!/usr/bin/env zsh
+# Wrapper to run `docker compose` using the generated `dockercompose-dev.yml` file
+# Usage:
+#   ./scripts/docker-compose-dev.sh up -d
+#   ./scripts/docker-compose-dev.sh ps
+
+export COMPOSE_FILE="dockercompose-dev.yml"
+
+if [ "$#" -eq 0 ]; then
+  docker compose help
+else
+  docker compose "$@"
+fi
diff --git a/scripts/docker-compose-test-dev.sh b/scripts/docker-compose-test-dev.sh
new file mode 100755
index 0000000000..453c25f0cf
--- /dev/null
+++ b/scripts/docker-compose-test-dev.sh
@@ -0,0 +1,17 @@
+#!/usr/bin/env zsh
+# Wrapper to run `docker compose` using docker-compose.override.test-dev.yml
+# This avoids modifying the original compose files. Usage:
+#   ./scripts/docker-compose-test-dev.sh up -d
+#   ./scripts/docker-compose-test-dev.sh ps
+
+# Compose will read files in the order they are listed. We set COMPOSE_FILE
+# so `docker compose` uses `docker-compose.yml` together with
+# `docker-compose.override.test-dev.yml` instead of the default override file.
+export COMPOSE_FILE="docker-compose.yml:docker-compose.override.test-dev.yml"
+
+# Forward all args to docker compose. If no args provided, show help.
+if [ "$#" -eq 0 ]; then
+  docker compose help
+else
+  docker compose "$@"
+fi
diff --git a/scripts/test-local.sh b/scripts/test-local.sh
index 7f2fa9fbce..15bf13941a 100644
--- a/scripts/test-local.sh
+++ b/scripts/test-local.sh
@@ -3,13 +3,13 @@
 # Exit in case of error
 set -e
 
-docker-compose down -v --remove-orphans # Remove possibly previous broken stacks left hanging after an error
+docker-compose -f docker-compose.test-dev.yml -f docker-compose.override.test-dev.yml down -v --remove-orphans # Remove possibly previous broken stacks left hanging after an error
 
 if [ $(uname -s) = "Linux" ]; then
     echo "Remove __pycache__ files"
     sudo find . -type d -name __pycache__ -exec rm -r {} \+
 fi
 
-docker-compose build
-docker-compose up -d
-docker-compose exec -T backend bash scripts/tests-start.sh "$@"
+docker-compose -f docker-compose.test-dev.yml -f docker-compose.override.test-dev.yml build
+docker-compose -f docker-compose.test-dev.yml -f docker-compose.override.test-dev.yml up -d
+docker-compose -f docker-compose.test-dev.yml -f docker-compose.override.test-dev.yml exec -T backend bash scripts/tests-start.sh "$@"
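Once the init script has run `CREATE EXTENSION IF NOT EXISTS vector;`, columns of type `vector(n)` support distance operators such as `<->` (Euclidean/L2 distance). As a minimal sketch of what that operator computes — table and column names in the commented query are hypothetical, not part of this change:

```python
import math

# Illustration only: pgvector's `<->` operator returns the Euclidean (L2)
# distance between two vectors; this mirrors that computation in Python.
def l2_distance(a: list[float], b: list[float]) -> float:
    assert len(a) == len(b), "vectors must have the same dimension"
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Roughly corresponds to a nearest-neighbour query such as (hypothetical schema):
#   SELECT id FROM items ORDER BY embedding <-> '[1,2,3]' LIMIT 5;
print(l2_distance([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))  # 1.0
```

pgvector also provides `<=>` (cosine distance) and `<#>` (negative inner product) for other similarity metrics.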