Daily development workflow and best practices for CurriculumExtractor.
Last Updated: October 23, 2025
Environment Status: ✅ All services operational
# Start all services with hot-reload
docker compose watch
# Access:
# - Frontend: http://localhost:5173
# - API: http://localhost:8000
# - API Docs: http://localhost:8000/docs

Login: admin@curriculumextractor.com / kRZtEcmM3tRevtEh1CitNL6s_s5ciE7q
cd /Users/amostan/Repositories/CurriculumExtractor
# Option A: With hot-reload (recommended)
docker compose watch
# Option B: Standard mode
docker compose up -d
# Verify all services are healthy
docker compose ps

Expected Services:
- ✅ Backend (FastAPI) - http://localhost:8000
- ✅ Frontend (React) - http://localhost:5173
- ✅ Redis - localhost:6379
- ✅ Celery Worker - 4 processes
- ✅ Database (Supabase) - Session Mode
- ✅ Proxy (Traefik) - localhost:80
- ✅ MailCatcher - http://localhost:1080
Edit code in your IDE - changes auto-reload:
- Backend: FastAPI with `--reload` flag
- Frontend: Vite HMR (instant updates)
- Celery: Restart worker after task changes
# Backend unit tests
docker compose exec backend bash scripts/test.sh
# Frontend E2E tests (requires backend running)
cd frontend && npx playwright test
# Test Celery tasks
curl -X POST http://localhost:8000/api/v1/tasks/health-check

# Automated (runs on git commit)
git commit -m "feat: your change"
# Manual pre-commit check
uv run pre-commit run --all-files
# Individual checks
cd backend && uv run ruff check . && uv run mypy .
cd frontend && npm run lint

git add .
git commit -m "feat: add extraction model"
git push origin your-branch

# All services
docker compose logs -f
# Specific service
docker compose logs backend -f
docker compose logs celery-worker -f
docker compose logs frontend -f
# Filter for errors
docker compose logs backend | grep ERROR

# Restart specific service
docker compose restart backend
docker compose restart celery-worker
# Restart all
docker compose restart
# Rebuild after dependency changes
docker compose build backend
docker compose up -d

# Stop all services
docker compose down
# Stop and remove volumes (CAUTION: deletes Redis data)
docker compose down -v

- ✅ Auto-reload enabled via `--reload` flag
- Code changes trigger automatic restart
- Watch for restarts in the logs: `docker compose logs backend -f`
- ✅ Hot Module Replacement (HMR) enabled
- Changes appear instantly in browser
- No page refresh needed for most changes
⚠️ Manual restart required after task changes:
- Run: `docker compose restart celery-worker`
- Watch logs: `docker compose logs celery-worker -f`
- Python 3.10 (use pyenv if you have 3.13)
- Node.js v20+ (via fnm/nvm)
- Access to Supabase and Redis
# Terminal 1: Ensure Redis is running
docker compose up redis -d
# Terminal 2: Run backend locally
cd backend
source .venv/bin/activate
fastapi dev app/main.py
# Access: http://localhost:8000

cd frontend
npm run dev
# Access: http://localhost:5173

# Ensure Redis is running
docker compose up redis -d
cd backend
source .venv/bin/activate
celery -A app.worker worker --loglevel=info --concurrency=4

1. Create task file (e.g., `backend/app/tasks/extraction.py`):

from app.worker import celery_app
import logging

logger = logging.getLogger(__name__)

@celery_app.task(bind=True, name="app.tasks.extraction.process_pdf")
def process_pdf_task(self, extraction_id: str):
    logger.info(f"Processing: {extraction_id}")
    # Your task logic here
    return {"status": "completed", "extraction_id": extraction_id}
2. Import in `backend/app/tasks/__init__.py`:

from app.tasks.extraction import *  # noqa
3. Rebuild and restart:

docker compose restart celery-worker
4. Test the task:

# Via API
curl -X POST http://localhost:8000/api/v1/tasks/health-check

# Check status
curl http://localhost:8000/api/v1/tasks/status/{TASK_ID}
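You can also queue the new task directly from Python inside the backend container. A minimal smoke-test sketch, assuming the `process_pdf_task` defined in step 1:

```python
# Run inside: docker compose exec backend python3
from app.tasks.extraction import process_pdf_task

# Queue the task, then block until the worker returns its result.
result = process_pdf_task.delay("test-extraction-id")
print(f"Task queued: {result.id}")
print(result.get(timeout=30))  # {'status': 'completed', 'extraction_id': ...}
```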
# View worker logs
docker compose logs celery-worker -f
# Check registered tasks
docker compose exec celery-worker celery -A app.worker inspect registered
# Get worker stats
docker compose exec celery-worker celery -A app.worker inspect stats
# See active tasks
docker compose exec celery-worker celery -A app.worker inspect active

Add detailed logging:
import logging

from app.worker import celery_app

logger = logging.getLogger(__name__)

@celery_app.task(bind=True)
def my_task(self):
    logger.info(f"Task started: {self.request.id}")
    logger.debug(f"Task args: {self.request.args}")
    # ... your code
    logger.info(f"Task completed: {self.request.id}")

Watch logs in real-time:
docker compose logs celery-worker -f | grep "my_task"

For database operations, use the Supabase MCP server (Model Context Protocol):
# Project ID for all MCP commands
PROJECT_ID = "wijzypbstiigssjuiuvh"
# List all tables
mcp_supabase_list_tables(
project_id="wijzypbstiigssjuiuvh",
schemas=["public"]
)
# Execute SQL query
mcp_supabase_execute_sql(
project_id="wijzypbstiigssjuiuvh",
query="SELECT * FROM users LIMIT 10;"
)
# Apply database migration
mcp_supabase_apply_migration(
project_id="wijzypbstiigssjuiuvh",
name="add_new_column",
query="ALTER TABLE users ADD COLUMN IF NOT EXISTS role VARCHAR(50);"
)
# Check for security issues
mcp_supabase_get_advisors(
project_id="wijzypbstiigssjuiuvh",
type="security"
)

Create buckets (via Supabase Dashboard):
- Go to: https://app.supabase.com/project/wijzypbstiigssjuiuvh/storage
- Create bucket: `worksheets` (private)
- Create bucket: `extractions` (private)
Upload files (in backend code):
from supabase import create_client
from app.core.config import settings
supabase = create_client(settings.SUPABASE_URL, settings.SUPABASE_SERVICE_KEY)
# Upload file
with open(file_path, 'rb') as f:
result = supabase.storage.from_("worksheets").upload(
file=f,
path=f"uploads/{user_id}/{filename}",
file_options={"content-type": "application/pdf"}
)
# Generate signed URL (7-day expiry)
url = supabase.storage.from_("worksheets").create_signed_url(
path=f"uploads/{user_id}/{filename}",
expires_in=604800
)
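For completeness, reading a file back uses the same client. A sketch, assuming the upload path above (supabase-py returns the file contents as bytes):

```python
# Download a previously uploaded file from the private bucket.
data = supabase.storage.from_("worksheets").download(
    f"uploads/{user_id}/{filename}"
)
with open("/tmp/worksheet.pdf", "wb") as f:
    f.write(data)  # raw bytes of the stored PDF
```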
- ✅ Write tests FIRST before implementing features
- ✅ Run tests frequently during development
- ✅ Ensure all tests pass before committing
- ✅ Aim for ≥80% code coverage
# Run all tests with coverage
docker compose exec backend bash scripts/test.sh
# Run specific test file
docker compose exec backend pytest tests/api/routes/test_users.py -v
# Run specific test
docker compose exec backend pytest tests/api/routes/test_users.py::test_create_user -v
# Run with output
docker compose exec backend pytest -s

cd frontend
# Run all E2E tests
npx playwright test
# Run specific test
npx playwright test login.spec.ts
# Run in UI mode (interactive)
npx playwright test --ui
# Debug mode
npx playwright test --debug

# Test via API (recommended)
curl -X POST http://localhost:8000/api/v1/tasks/health-check
# Test via Python
docker compose exec backend python3 -c "
from app.tasks.default import health_check_task
result = health_check_task.delay()
print(result.get(timeout=10))
"
# Test with pytest (set CELERY_TASK_ALWAYS_EAGER=True in tests)
docker compose exec backend pytest tests/tasks/ -v
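A minimal sketch of such a pytest test: forcing eager mode makes `.delay()` run the task inline, so no Redis broker or worker is needed (setting it per-test like this is one option; your test settings may already set it globally):

```python
# tests/tasks/test_health_check.py
from app.tasks.default import health_check_task
from app.worker import celery_app


def test_health_check_runs_eagerly():
    # Execute tasks synchronously in-process instead of via the broker.
    celery_app.conf.task_always_eager = True
    result = health_check_task.delay()
    assert result.successful()
```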
Pre-commit hooks automatically run:
- Ruff (Python linting/formatting)
- Biome (TypeScript linting)
- YAML/TOML validation
Manual checks:
# Backend
cd backend
uv run ruff check .
uv run mypy .
# Frontend
cd frontend
npm run lint

View detailed logs:
# Application logs
docker compose logs backend -f
# Database query logs (set echo=True in db.py)
docker compose exec backend python3 -c "
from app.core.db import engine
engine.echo = True
# Run your code
"
# Check environment variables
docker compose exec backend env | grep -E "SUPABASE|DATABASE|REDIS"

Test database connection:
# Via MCP
mcp_supabase_get_project(id="wijzypbstiigssjuiuvh")
# Via Python
docker compose exec backend python3 -c "
from app.core.db import engine
conn = engine.connect()
print('✅ Connected!')
conn.close()
"Interactive Python shell:
docker compose exec backend python3
>>> from app.core.db import engine
>>> from app.models import User
>>> from sqlmodel import Session, select
>>> with Session(engine) as session:
... users = session.exec(select(User)).all()
... print(f"Users: {len(users)}")

Browser DevTools:
- React DevTools for component inspection
- TanStack Query DevTools (auto-enabled in dev)
- Network tab for API calls
Check for errors:
# Console logs in browser
# Or check Vite dev server output
docker compose logs frontend -f

TypeScript type checking:
cd frontend
npx tsc --noEmit

Network Request Debugging:
When API calls fail, check which server is handling the request:
1. Open browser DevTools → Network tab

2. Look at request URLs:
   - ✅ Backend API: http://localhost:8000/api/v1/...
   - ❌ Frontend nginx: http://localhost:5173/api/v1/...

3. Common issues:
# Issue: Requests hitting frontend (5173) instead of backend (8000)
# Cause: Using relative URLs in axios without OpenAPI.BASE
# Fix: Use `${OpenAPI.BASE}/api/v1/endpoint`

# Issue: 401 Unauthorized on authenticated endpoints
# Cause: Missing Authorization header in custom axios calls
# Fix: Include token from OpenAPI.TOKEN

# Issue: CORS errors
# Cause: Backend not configured for frontend origin
# Fix: Check BACKEND_CORS_ORIGINS in .env
4. Test the API directly:

# Health check (no auth)
curl http://localhost:8000/api/v1/utils/health-check/

# Authenticated endpoint
TOKEN="your-token-from-browser-localstorage"
curl -H "Authorization: Bearer $TOKEN" \
  http://localhost:8000/api/v1/users/me

# File upload
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@test.pdf" \
  http://localhost:8000/api/v1/ingestions
5. Check authentication state:

// In browser console
localStorage.getItem('access_token')  // Should return JWT token

// Decode token (without verification) to check expiry
JSON.parse(atob(localStorage.getItem('access_token').split('.')[1]))
Check task status:
# Via API
curl http://localhost:8000/api/v1/tasks/status/{TASK_ID}
# Via Python
docker compose exec celery-worker python3 -c "
from celery.result import AsyncResult
from app.worker import celery_app
result = AsyncResult('task-id', app=celery_app)
print(f'Status: {result.status}')
print(f'Result: {result.result if result.successful() else result.info}')
"Test task directly:
docker compose exec celery-worker python3 -c "
from app.tasks.default import health_check_task
result = health_check_task.delay()
print(f'Task queued: {result.id}')
# Wait and get result
print(f'Result: {result.get(timeout=10)}')
"# Check table structure
mcp_supabase_execute_sql(
project_id="wijzypbstiigssjuiuvh",
query="""
SELECT column_name, data_type, is_nullable
FROM information_schema.columns
WHERE table_name = 'user'
ORDER BY ordinal_position;
"""
)
# Count records
mcp_supabase_execute_sql(
project_id="wijzypbstiigssjuiuvh",
query="SELECT COUNT(*) FROM users;"
)
# View recent migrations
mcp_supabase_list_migrations(project_id="wijzypbstiigssjuiuvh")
# Check database logs (last 24 hours)
mcp_supabase_get_logs(
project_id="wijzypbstiigssjuiuvh",
service="postgres"
)

# Check for missing RLS policies
mcp_supabase_get_advisors(
project_id="wijzypbstiigssjuiuvh",
type="security"
)
# Check for performance issues
mcp_supabase_get_advisors(
project_id="wijzypbstiigssjuiuvh",
type="performance"
)

Complete workflow:
1. Update models in `backend/app/models.py`:

class Extraction(SQLModel, table=True):
    id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True)
    filename: str = Field(max_length=255)
    status: str = Field(default="DRAFT", max_length=50)
    created_at: datetime = Field(default_factory=datetime.utcnow)
    user_id: uuid.UUID = Field(foreign_key="user.id")
2. Generate migration:

docker compose exec backend alembic revision --autogenerate -m "Add Extraction model"
3. Review migration in `backend/app/alembic/versions/`:
   - Check the generated SQL
   - Add any custom logic needed
   - Verify foreign keys and constraints
4. Apply migration:

docker compose exec backend alembic upgrade head
Verify in Supabase:
# Use MCP to verify mcp_supabase_list_tables( project_id="wijzypbstiigssjuiuvh", schemas=["public"] ) # Check for security issues mcp_supabase_get_advisors( project_id="wijzypbstiigssjuiuvh", type="security" )
6. Commit migration files:

git add backend/app/alembic/versions/
git commit -m "feat: add extraction model"
For rapid iterations:
# 1. Apply change via MCP
mcp_supabase_apply_migration(
project_id="wijzypbstiigssjuiuvh",
name="add_extraction_table",
query="""
CREATE TABLE IF NOT EXISTS extractions (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
filename VARCHAR(255) NOT NULL,
status VARCHAR(50) DEFAULT 'DRAFT',
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
"""
)
# 2. Sync Alembic to match
docker compose exec backend alembic stamp head
# 3. Generate migration for version control
docker compose exec backend alembic revision --autogenerate -m "Sync extraction table"- ✅ Always review auto-generated migrations
- ✅ Test migrations on local/staging before production
- ✅ Use MCP to verify tables after migration
- ✅ Check for missing RLS policies with MCP advisors
- ✅ Commit migration files to git
- ❌ Don't edit applied migrations (create new ones)
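For reference, an auto-generated revision file typically looks like the sketch below (revision IDs and column details are illustrative, not from this repo). Confirm the `upgrade()` operations match your model change before applying:

```python
"""Add Extraction model"""
import sqlalchemy as sa
from alembic import op

# Revision identifiers, filled in by Alembic.
revision = "a1b2c3d4e5f6"
down_revision = "f6e5d4c3b2a1"


def upgrade():
    op.create_table(
        "extraction",
        sa.Column("id", sa.Uuid(), primary_key=True),
        sa.Column("filename", sa.String(length=255), nullable=False),
        sa.Column("status", sa.String(length=50), nullable=False),
        sa.Column("user_id", sa.Uuid(), sa.ForeignKey("user.id"), nullable=False),
    )


def downgrade():
    op.drop_table("extraction")
```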
1. Create route file (e.g., `backend/app/api/routes/extractions.py`):

from fastapi import APIRouter, HTTPException
from sqlmodel import select

from app.api.deps import CurrentUser, SessionDep
from app.models import Extraction, ExtractionCreate, ExtractionPublic

router = APIRouter(prefix="/extractions", tags=["extractions"])

@router.get("/", response_model=list[ExtractionPublic])
def list_extractions(session: SessionDep, current_user: CurrentUser):
    statement = select(Extraction).where(Extraction.user_id == current_user.id)
    extractions = session.exec(statement).all()
    return extractions

@router.post("/", response_model=ExtractionPublic)
def create_extraction(
    session: SessionDep, current_user: CurrentUser, extraction_in: ExtractionCreate
):
    extraction = Extraction.model_validate(
        extraction_in, update={"user_id": current_user.id}
    )
    session.add(extraction)
    session.commit()
    session.refresh(extraction)
    return extraction
2. Register route in `backend/app/api/main.py`:

from app.api.routes import extractions, login, users, utils, tasks

api_router.include_router(extractions.router)
3. Generate TypeScript client:

./scripts/generate-client.sh
4. Test in API docs: http://localhost:8000/docs (see also the test sketch after this list)
5. Use in frontend:

import { ExtractionsService } from '@/client'

const { data } = useQuery({
  queryKey: ['extractions'],
  queryFn: () => ExtractionsService.listExtractions(),
})
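To keep the new route covered by the backend suite, a test sketch (the `client` and `superuser_token_headers` fixture names follow common conventions in this template and are assumptions here):

```python
# tests/api/routes/test_extractions.py
def test_list_extractions(client, superuser_token_headers):
    response = client.get(
        "/api/v1/extractions/",
        headers=superuser_token_headers,
    )
    assert response.status_code == 200
    assert isinstance(response.json(), list)
```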
The generated OpenAPI client doesn't support upload progress callbacks. When you need progress tracking (e.g., for file uploads), use axios directly with proper authentication.
Key Requirements:
- Use `${OpenAPI.BASE}` for an absolute URL (not relative paths)
- Manually fetch and include the authentication token
- Handle `OpenAPI.TOKEN` as an async function
- Include the `Content-Type: multipart/form-data` header
Example: useFileUpload Hook
import axios, { type AxiosProgressEvent } from "axios"
import { OpenAPI, type IngestionPublic } from "@/client"
export function useFileUpload() {
const upload = async (file: File): Promise<UploadResult> => {
const formData = new FormData()
formData.append("file", file)
// Get auth token (OpenAPI.TOKEN is async function)
const token = typeof OpenAPI.TOKEN === "function"
? await (OpenAPI.TOKEN as () => Promise<string>)()
: OpenAPI.TOKEN
// Use absolute URL with OpenAPI.BASE
const result = await axios.post<IngestionPublic>(
`${OpenAPI.BASE}/api/v1/ingestions`, // ✅ Absolute URL
formData,
{
headers: {
"Content-Type": "multipart/form-data",
...(token && { Authorization: `Bearer ${token}` }) // ✅ Auth
},
onUploadProgress: (progressEvent: AxiosProgressEvent) => {
const percentCompleted = Math.round(
(progressEvent.loaded * 100) / (progressEvent.total || 1)
)
updateProgress(Math.min(percentCompleted, 100))
},
},
)
return { success: true, data: result.data }
}
}

Common Mistakes:
- ❌ Using relative URL: `/api/v1/ingestions` → request goes to frontend nginx (port 5173)
- ❌ Missing auth header → 401 Unauthorized
- ❌ Calling `OpenAPI.TOKEN()` without a type assertion → TypeScript error
- ❌ Not handling undefined `progressEvent.total` → NaN progress
Debugging Upload Issues:
1. Check the request URL in the browser DevTools Network tab:
   - ✅ Should see: http://localhost:8000/api/v1/ingestions/ (port 8000)
   - ❌ If you see: http://localhost:5173/api/v1/ingestions (port 5173), the URL is relative
2. Verify authentication:

// Check if token exists (in browser console)
localStorage.getItem('access_token')

// Should see the Authorization header in the Network tab:
// Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGc...
3. Common HTTP errors:
   - 413 Request Entity Too Large → request hitting nginx frontend instead of backend
   - 401 Unauthorized → missing or invalid auth token
   - 400 Bad Request → file validation failed (check MIME type, size)
4. Test upload manually:

# Get token from browser localStorage
TOKEN="your-token-here"

# Test upload
curl -X POST http://localhost:8000/api/v1/ingestions \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@test.pdf"
When changes don't appear after editing frontend code:
1. Docker-served frontend (port 5173 via nginx):
   - Vite dev server changes won't affect the Docker container
   - A full rebuild and restart is needed:

cd frontend && npm run build
docker compose build --no-cache frontend
docker compose restart frontend
2. Standalone dev server (npm run dev):
   - Changes apply via HMR automatically
   - Only needed for development without Docker
3. Verify the deployed bundle:

# Check files in the Docker container
docker compose exec frontend ls -la /usr/share/nginx/html/assets/

# Search for your code in the bundle
docker compose exec frontend grep -r "your-pattern" /usr/share/nginx/html/assets/
Cache Busting:
- Hard refresh: `Cmd+Shift+R` (Mac) or `Ctrl+Shift+R` (Windows)
- Clear the browser cache if route changes don't appear
- Vite automatically adds hashes to filenames (e.g., `index-CNYtKbML.js`)
Create new route:
# Create: frontend/src/routes/_layout/extractions.tsx
# URL becomes: http://localhost:5173/extractions

Route template:
import { createFileRoute } from '@tanstack/react-router'
import { useQuery } from '@tanstack/react-query'
import { ExtractionsService } from '@/client'
export const Route = createFileRoute('/_layout/extractions')({
component: ExtractionsPage,
})
function ExtractionsPage() {
const { data, isLoading } = useQuery({
queryKey: ['extractions'],
queryFn: () => ExtractionsService.listExtractions(),
})
if (isLoading) return <div>Loading...</div>
return (
<div>
<h1>Extractions</h1>
{/* Your UI */}
</div>
)
}

After creating routes:
- TanStack Router auto-generates `routeTree.gen.ts`
- No manual route registration needed
After any backend API changes:
# From project root
./scripts/generate-client.sh
# Or manually
cd frontend
npm run generate-client

This updates:
- frontend/src/client/schemas.gen.ts
- frontend/src/client/sdk.gen.ts
- frontend/src/client/types.gen.ts
Always commit generated client files!
# 1. Define in backend/app/models.py
# 2. Generate migration
docker compose exec backend alembic revision --autogenerate -m "Add model"
# 3. Review and apply
docker compose exec backend alembic upgrade head
# 4. Verify via MCP
mcp_supabase_list_tables(project_id="wijzypbstiigssjuiuvh", schemas=["public"])

cd frontend
npm install package-name@version
# Frontend auto-reloads, no rebuild needed
# Update imports in your code

# 1. Edit backend/pyproject.toml
# 2. Rebuild
docker compose build backend
# 3. Restart
docker compose up -d backend celery-worker

# After any backend API changes
./scripts/generate-client.sh
# Commit generated files
git add frontend/src/client/
git commit -m "chore: update API client"- Supabase Project: wijzypbstiigssjuiuvh
- Region: ap-south-1 (Mumbai, India)
- Database: PostgreSQL 17.6.1
- Connection: Session Mode (port 5432)
- Dashboard: https://app.supabase.com/project/wijzypbstiigssjuiuvh
- Admin Email: admin@curriculumextractor.com
- Admin Password: kRZtEcmM3tRevtEh1CitNL6s_s5ciE7q
- Database Password: Curriculumextractor1234!
| Service | URL | Notes |
|---|---|---|
| Frontend | http://localhost:5173 | React app |
| Backend API | http://localhost:8000 | FastAPI |
| API Docs | http://localhost:8000/docs | Swagger UI |
| MailCatcher | http://localhost:1080 | Email testing |
| Traefik Dashboard | http://localhost:8090 | Proxy stats |
| Supabase Dashboard | https://app.supabase.com/project/wijzypbstiigssjuiuvh | DB management |
# Start with hot-reload
docker compose watch
# Start normally
docker compose up -d
# Stop all
docker compose down
# View all logs
docker compose logs -f
# Specific service logs
docker compose logs backend -f
# Restart service
docker compose restart backend
# Rebuild after dependency changes
docker compose build backend
docker compose up -d
# Check service status
docker compose ps
# Execute command in service
docker compose exec backend bash

# View worker logs
docker compose logs celery-worker -f
# Restart worker
docker compose restart celery-worker
# Check registered tasks
docker compose exec celery-worker celery -A app.worker inspect registered
# Get worker stats
docker compose exec celery-worker celery -A app.worker inspect stats
# Purge all tasks
docker compose exec celery-worker celery -A app.worker purge

# Generate migration
docker compose exec backend alembic revision --autogenerate -m "Description"
# Apply migrations (forward-only)
docker compose exec backend alembic upgrade head
# Run migrations on demand (prestart one-shot job)
docker compose run --rm prestart
# Show current version
docker compose exec backend alembic current
# Show migration history
docker compose exec backend alembic history

# List tables
mcp_supabase_list_tables(project_id="wijzypbstiigssjuiuvh", schemas=["public"])
# Execute query
mcp_supabase_execute_sql(project_id="wijzypbstiigssjuiuvh", query="SELECT COUNT(*) FROM \"user\";")
# Apply migration
mcp_supabase_apply_migration(project_id="wijzypbstiigssjuiuvh", name="migration_name", query="SQL")
# Check advisories
mcp_supabase_get_advisors(project_id="wijzypbstiigssjuiuvh", type="security")

# Check logs
docker compose logs backend --tail=50
# Common issues:
# - Database connection failed → Check .env DATABASE_URL
# - Import error → Rebuild: docker compose build backend
# - Port in use → Check: lsof -i :8000

# Check if worker is running
docker compose ps celery-worker
# Check logs
docker compose logs celery-worker --tail=50
# Verify Redis connection
docker compose exec redis redis-cli -a 5WEQ47_uuNd-289-_ZnN79GmNY8LFWzy PING
# Check registered tasks
docker compose exec celery-worker celery -A app.worker inspect registered

# Check logs
docker compose logs frontend --tail=50
# Rebuild if needed
docker compose build frontend
docker compose up -d frontend
# Check if backend is accessible
curl http://localhost:8000/api/v1/utils/health-check/

# Test via MCP
mcp_supabase_get_project(id="wijzypbstiigssjuiuvh")
# Check database logs
mcp_supabase_get_logs(project_id="wijzypbstiigssjuiuvh", service="postgres")
# Test connection from backend
docker compose exec backend python3 -c "from app.core.db import engine; engine.connect(); print('✅ Connected')"Symptom: Upload fails with "413 Request Entity Too Large" or "Upload failed. Please try again."
Diagnosis:
# 1. Check browser Network tab - look at request URL
# If you see: http://localhost:5173/api/v1/ingestions
# → Request is hitting frontend nginx instead of backend
# 2. Check for 401 errors
# → Missing authentication token
# 3. Verify backend is accessible
curl http://localhost:8000/api/v1/utils/health-check/

Fix for 413 errors (Request hitting frontend):
// ❌ Wrong - relative URL
await axios.post('/api/v1/ingestions', formData)
// ✅ Correct - absolute URL with OpenAPI.BASE
import { OpenAPI } from '@/client'
await axios.post(`${OpenAPI.BASE}/api/v1/ingestions`, formData)

Fix for 401 errors (Missing auth):
// Get token from OpenAPI config
const token = typeof OpenAPI.TOKEN === "function"
? await (OpenAPI.TOKEN as () => Promise<string>)()
: OpenAPI.TOKEN
// Include in headers
await axios.post(url, formData, {
headers: {
"Content-Type": "multipart/form-data",
...(token && { Authorization: `Bearer ${token}` })
}
})

Verification:
# 1. Network tab should show:
# - URL: http://localhost:8000/api/v1/ingestions/ (port 8000)
# - Status: 201 Created
# - Headers include: Authorization: Bearer ...
# 2. Test manually
TOKEN="paste-token-from-browser-localstorage"
curl -X POST http://localhost:8000/api/v1/ingestions \
-H "Authorization: Bearer $TOKEN" \
-F "file=@test.pdf"- Use
Sessioncontext managers (auto-closes connections) - Implement pagination for list endpoints
- Use
select()with filters before fetching - Add database indexes for frequent queries
- Monitor connection pool usage
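A pagination sketch for the list endpoint from the API walkthrough above (the `skip`/`limit` parameter names are conventional choices, not mandated by this codebase):

```python
from fastapi import APIRouter
from sqlmodel import select

from app.api.deps import CurrentUser, SessionDep
from app.models import Extraction, ExtractionPublic

router = APIRouter(prefix="/extractions", tags=["extractions"])


@router.get("/", response_model=list[ExtractionPublic])
def list_extractions(
    session: SessionDep,
    current_user: CurrentUser,
    skip: int = 0,
    limit: int = 100,
):
    # Filter and page in the database instead of fetching every row.
    statement = (
        select(Extraction)
        .where(Extraction.user_id == current_user.id)
        .offset(skip)
        .limit(limit)
    )
    return session.exec(statement).all()
```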
- Set an appropriate `time_limit` for tasks
- Use `task_reject_on_worker_lost=True` for critical tasks
- Implement retry logic with exponential backoff (see the sketch after this list)
- Monitor task queue depth
- Use separate queues for different task types
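A retry sketch using Celery's built-in backoff options (the exception type and task body are placeholders, not from this repo):

```python
from app.worker import celery_app


@celery_app.task(
    bind=True,
    autoretry_for=(ConnectionError,),  # retry only transient failures
    max_retries=5,
    retry_backoff=True,      # exponential delays: 1s, 2s, 4s, ...
    retry_backoff_max=600,   # cap each delay at 10 minutes
    retry_jitter=True,       # randomize delays to avoid thundering herds
    time_limit=300,          # hard-kill a stuck task after 5 minutes
)
def fetch_remote_resource(self, url: str):
    ...  # task logic that may raise ConnectionError
```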
- Use TanStack Query for server state (automatic caching)
- Implement virtualization for long lists
- Lazy load routes with TanStack Router
- Optimize images before upload
- Use React.memo for expensive components
# Create feature branch
git checkout -b feature/extraction-model
# Make changes, commit frequently
git add .
git commit -m "feat: add extraction model"
# Push to remote
git push origin feature/extraction-model
# Create pull request on GitHub

All pull requests must have at least one type label to pass CI checks. The check-labels workflow validates this requirement.
Available labels (based on conventional commit types):
- `feature` - New feature implementation (feat commits)
- `bug` - Bug fixes (fix commits)
- `docs` - Documentation changes (docs commits)
- `refactor` - Code refactoring (refactor commits)
- `enhancement` - Performance improvements (perf commits)
- `internal` - Internal/maintenance changes (chore, ci, build, style commits)
- `breaking` - Breaking changes (feat!, fix! commits)
- `security` - Security-related changes
- `upgrade` - Dependency upgrades
Labeling methods:
- Manual: Add label via GitHub UI when creating PR
- Via Claude: Use the `pr-labeling` skill when creating PRs with Claude Code
- Via CLI: `gh pr edit <number> --add-label <label-name>`
Example:
# Create PR with gh CLI
gh pr create --title "feat: add extraction model" --body "Description"
# Add label
gh pr edit <number> --add-label feature

Follow conventional commits:
- `feat:` - New feature
- `fix:` - Bug fix
- `docs:` - Documentation
- `test:` - Tests
- `refactor:` - Code refactoring
- `chore:` - Maintenance
Examples:
git commit -m "feat: add PDF extraction Celery task"
git commit -m "fix: resolve database connection timeout"
git commit -m "docs: update CLAUDE.md with MCP commands"
git commit -m "test: add extraction model tests"Documentation:
- Setup Guide - Initial installation
- Supabase Setup - Database configuration
- Architecture Overview - System design
- PRD Overview - Product requirements
Status Documents:
- DEVELOPMENT_READY.md - Current environment status
- CELERY_SETUP_COMPLETE.md - Celery configuration
- ENVIRONMENT_RUNNING.md - Services overview
- CLAUDE.md - AI development guide
Now that your environment is running, start building features:
- ✅ Environment setup - COMPLETE
- ✅ Infrastructure (Celery + Redis) - COMPLETE
- Create core models (Extraction, Question, Ingestion)
- Set up Supabase Storage buckets
- Add PDF processing libraries
- Implement extraction pipeline
- Build review UI
See PRD Overview for complete feature requirements!
Happy developing! 🚀