A production-ready FastAPI backend system that automates the entire job application process using AI and machine learning.
- 4 Active Sources: RemoteOK, GitHub, Reed (UK), Adzuna (Global)
- Smart Deduplication: Prevents duplicate job listings
- Real-time Fetching: ~15 jobs per API call
- Error Handling: Graceful fallbacks for all sources
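The deduplication step can be illustrated with a small sketch: normalize a few identifying fields into a stable key and keep only the first occurrence across sources. The field names (`title`, `company`, `location`) are illustrative, not the project's actual schema:

```python
import hashlib


def job_key(title: str, company: str, location: str) -> str:
    """Build a stable dedup key from normalized identifying fields."""
    normalized = "|".join(
        " ".join(part.lower().split())  # lowercase, collapse whitespace
        for part in (title, company, location)
    )
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def deduplicate(jobs: list[dict]) -> list[dict]:
    """Keep the first occurrence of each unique job across all sources."""
    seen: set[str] = set()
    unique = []
    for job in jobs:
        key = job_key(job["title"], job["company"], job.get("location", ""))
        if key not in seen:
            seen.add(key)
            unique.append(job)
    return unique
```

Hashing the normalized key keeps it fixed-length, which also makes it a convenient database unique-index column.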
- ML Algorithm: TF-IDF similarity + keyword matching + technology alignment
- Smart Selection: Automatically selects top 3-5 most relevant projects
- Confidence Scoring: Detailed match explanations with percentages
- Performance: <1 second matching for 10 projects
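The TF-IDF half of the matching algorithm can be sketched in plain Python. The real service may well use scikit-learn; this dependency-free version (with sklearn-style smoothed IDF) just shows the idea of scoring a project description against a job description:

```python
import math
from collections import Counter


def tfidf_vectors(docs: list[str]) -> list[dict[str, float]]:
    """Compute TF-IDF weight vectors for a small corpus of texts."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # Document frequency: in how many docs does each term appear?
    df = Counter(term for tokens in tokenized for term in set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({
            # Smoothed IDF keeps weights positive even for ubiquitous terms.
            term: (count / len(tokens)) * (math.log((1 + n) / (1 + df[term])) + 1.0)
            for term, count in tf.items()
        })
    return vectors


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse weight vectors (0.0 to 1.0)."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Ranking projects by `cosine(job_vec, project_vec)` and combining that with keyword and technology bonuses yields the kind of percentage-based confidence score the match explanations report.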
- PDF Generation: LaTeX (primary) + ReportLab (fallback)
- Professional Templates: Custom LaTeX with Jinja2 placeholders
- Job-Specific: Automatically tailored for each application
- Fast: ~2 seconds generation time
- AI Integration: Groq (primary) + OpenAI (fallback)
- Template Fallback: Professional templates when AI unavailable
- Bulk Support: Generate for multiple jobs simultaneously
- Personalization: Uses job context and selected projects
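The primary-plus-fallback chain described above reduces to a small generic pattern. This sketch uses placeholder callables rather than the actual Groq/OpenAI client code:

```python
from collections.abc import Callable


def generate_with_fallback(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
    template_fallback: Callable[[str], str],
) -> tuple[str, str]:
    """Try each AI provider in order; fall back to a static template.

    Returns (provider_name, generated_text) so callers can record
    which path produced the document.
    """
    for name, provider in providers:
        try:
            return name, provider(prompt)
        except Exception:
            continue  # provider down or over quota; try the next one
    return "template", template_fallback(prompt)
```

In the real service the provider callables would wrap the Groq and OpenAI SDK calls, and the template fallback is what keeps generation working with no API keys configured at all.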
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│     FastAPI     │    │   PostgreSQL    │    │      Redis      │
│     Backend     │───▶│    Database     │    │      Cache      │
│   (Port 8000)   │    │   (Supabase)    │    │   (Port 6379)   │
└─────────────────┘    └─────────────────┘    └─────────────────┘
         │                      │                      │
         ▼                      ▼                      ▼
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│  Job Sources    │    │  ML Matching    │    │  AI Services    │
│ RemoteOK,GitHub │    │  TF-IDF + NLP   │    │  Groq, OpenAI   │
│  Reed, Adzuna   │    │     Caching     │    │    Templates    │
└─────────────────┘    └─────────────────┘    └─────────────────┘
```
- Python 3.11+
- PostgreSQL database (or Supabase)
- Redis (optional, for caching)
- Clone and Setup

  ```bash
  git clone <repository>
  cd job-application-system
  pip install -r requirements.txt
  ```

- Environment Configuration

  ```bash
  cp .env.example .env
  # Edit .env with your configuration
  ```

- Database Setup

  ```bash
  alembic upgrade head
  ```

- Start the Server

  ```bash
  python start_server_fixed.py
  ```

Alternatively, start all services with Docker:

```bash
docker-compose up --build

# Access:
# - API:   http://localhost:8000
# - Docs:  http://localhost:8000/docs
# - Redis: localhost:6379
```
Key environment variables (`.env`):

```bash
# Database
DATABASE_URL=postgresql://user:pass@host:port/db

# Security
SECRET_KEY=your-secret-key-here

# Job Sources
REED_API_KEY=your-reed-api-key       # UK jobs
ADZUNA_APP_ID=your-adzuna-app-id     # Global jobs
ADZUNA_APP_KEY=your-adzuna-app-key

# AI Services
GROQ_API_KEY=your-groq-api-key       # Fast AI generation
OPENAI_API_KEY=your-openai-api-key   # Fallback AI

# Caching
REDIS_URL=redis://localhost:6379/0   # Performance optimization
```
```bash
# Search jobs with filters
GET /api/v1/jobs?keywords=python&location=remote&limit=20

# Fetch new jobs from all sources
POST /api/v1/jobs/fetch
{
  "keywords": ["python", "react"],
  "limit_per_source": 10
}

# Get job details
GET /api/v1/jobs/{job_id}

# Manage job sources
GET /api/v1/jobs/sources
```
```bash
# Match projects to job
GET /api/v1/match/{job_id}?user_id=123&max_results=5

# Get detailed explanation
POST /api/v1/match/{job_id}/explain?project_id=456&user_id=123

# Cache management
GET /api/v1/match/cache/stats
DELETE /api/v1/match/cache/{user_id}
```
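Per-user cache eviction (the `DELETE` endpoint above) is straightforward when match results are namespaced by user id. An in-memory sketch, standing in for whatever Redis keying the service actually uses:

```python
import hashlib


class MatchCache:
    """In-memory stand-in for a match-result cache, keyed per user.

    Prefixing every key with the user_id is what makes a per-user
    DELETE a simple prefix eviction.
    """

    def __init__(self) -> None:
        self._store: dict[str, list[dict]] = {}

    @staticmethod
    def key(user_id: str, job_id: str, max_results: int) -> str:
        digest = hashlib.sha256(f"{job_id}:{max_results}".encode()).hexdigest()[:16]
        return f"match:{user_id}:{digest}"

    def get(self, user_id: str, job_id: str, max_results: int):
        return self._store.get(self.key(user_id, job_id, max_results))

    def put(self, user_id: str, job_id: str, max_results: int, results: list[dict]) -> None:
        self._store[self.key(user_id, job_id, max_results)] = results

    def clear_user(self, user_id: str) -> int:
        """Evict every cached match for one user; returns entries removed."""
        prefix = f"match:{user_id}:"
        doomed = [k for k in self._store if k.startswith(prefix)]
        for k in doomed:
            del self._store[k]
        return len(doomed)
```

With Redis the same idea maps to `SCAN match:{user_id}:*` followed by `DEL`.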
```bash
# Generate custom resume
POST /api/v1/resume/generate
{
  "name": "John Doe",
  "email": "[email protected]",
  "phone": "+1 (555) 123-4567",
  "location": "San Francisco, CA",
  "primary_skills": ["Python", "React", "AWS"],
  "experience": [...],
  "projects": [...],
  "job_id": "optional-for-customization"
}

# Download resume PDF
GET /api/v1/resume/download/{resume_id}
```
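Filling resume templates safely requires escaping LaTeX special characters in user-supplied data. A dependency-free sketch using the stdlib `string.Template` (the project's actual templates use Jinja2 placeholders; the names here are illustrative):

```python
from string import Template

# Characters LaTeX treats as markup, mapped to their escaped forms.
LATEX_SPECIALS = {
    "&": r"\&", "%": r"\%", "$": r"\$", "#": r"\#",
    "_": r"\_", "{": r"\{", "}": r"\}",
}


def latex_escape(text: str) -> str:
    """Escape characters that would break LaTeX compilation."""
    return "".join(LATEX_SPECIALS.get(ch, ch) for ch in text)


def render_resume(template: str, fields: dict[str, str]) -> str:
    """Fill ${name}-style placeholders after escaping each value."""
    return Template(template).substitute(
        {k: latex_escape(v) for k, v in fields.items()}
    )
```

Without the escaping step, a user whose company name contains `&` or `_` would make the PDF build fail, which is exactly when the ReportLab fallback would otherwise kick in.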
```bash
# Generate for specific job
POST /api/v1/cover-letters/{job_id}
{
  "user_id": "123e4567-e89b-12d3-a456-426614174000",
  "user_name": "John Doe",
  "user_email": "[email protected]",
  "primary_skills": ["Python", "React"],
  "selected_projects": [...]
}

# Bulk generation for multiple jobs
POST /api/v1/cover-letters/bulk
{
  "user_id": "123e4567-e89b-12d3-a456-426614174000",
  "job_ids": ["job1", "job2", "job3"],
  "user_name": "John Doe",
  "user_email": "[email protected]"
}

# Download cover letter
GET /api/v1/cover-letters/{cover_letter_id}/download

# Bulk download as ZIP
POST /api/v1/cover-letters/bulk/download?cover_letter_ids=id1,id2,id3
```
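The bulk ZIP download typically assembles the archive in memory before responding. A stdlib sketch of that core step (file names and layout are illustrative, not the service's actual naming scheme):

```python
import io
import zipfile


def bundle_cover_letters(letters: dict[str, bytes]) -> bytes:
    """Pack {filename: pdf_bytes} into an in-memory ZIP for one response."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        for filename, pdf_bytes in letters.items():
            archive.writestr(filename, pdf_bytes)
    return buffer.getvalue()
```

In a FastAPI endpoint the returned bytes would be sent with media type `application/zip`, avoiding any temporary files on disk.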
- Job Fetching: ~15 jobs per API call (4 sources)
- Project Matching: <1 second for 10 projects
- Resume Generation: ~2 seconds (ReportLab fallback)
- Cover Letter Generation: ~3 seconds (template mode)
- Cache Hit Rate: 85%+ for repeated operations
```bash
# Run integration tests
python test_integration_simple.py

# Test specific components
python -m pytest tests/

# Load test with sample data
python load_test_jobs.py
```
```
app/
├── api/v1/endpoints/             # REST API endpoints
│   ├── jobs.py                   # Job management
│   ├── resume.py                 # Resume generation
│   ├── cover_letters.py          # Cover letter generation
│   └── project_matching.py       # ML project matching
├── services/                     # Business logic
│   ├── job_fetcher.py            # Multi-source job fetching
│   ├── resume_generator.py       # PDF resume generation
│   ├── cover_letter_generator.py # AI cover letter generation
│   ├── job_service.py            # Job data management
│   └── matching/                 # ML matching algorithms
├── models/                       # Database models
├── schemas/                      # Pydantic schemas
├── templates/                    # LaTeX and text templates
└── generated/                    # Generated files storage
```
- Input validation with Pydantic models
- SQL injection prevention (SQLAlchemy ORM)
- File upload restrictions
- Environment-based configuration
- Health monitoring endpoints
- Set strong SECRET_KEY
- Configure production database
- Set up Redis for caching
- Configure API keys for job sources
- Set up monitoring and logging
- Configure CORS origins
- Set up SSL/TLS
- Use Redis for caching and session storage
- Implement rate limiting for API endpoints
- Consider horizontal scaling with load balancers
- Monitor API usage and performance metrics
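Rate limiting, recommended above, is commonly implemented as a token bucket per client. A minimal, framework-agnostic sketch (not the project's actual middleware):

```python
import time


class TokenBucket:
    """Allow bursts up to `capacity` requests, refilled at `rate` tokens/second."""

    def __init__(self, capacity: float, rate: float, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity       # start full so initial bursts succeed
        self.clock = clock           # injectable for deterministic testing
        self.last = clock()

    def allow(self) -> bool:
        """Spend one token if available; otherwise reject the request."""
        now = self.clock()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

In production one bucket would be kept per API key or client IP, usually in Redis so the limit holds across multiple backend instances.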
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Documentation: Check the `/docs` endpoint when the server is running
- Issues: Create an issue in the repository
- API Reference: Available at http://localhost:8000/docs
Built with FastAPI, PostgreSQL, Redis, and AI services for modern job application automation.
If you just installed PostgreSQL locally and want the fastest path to a running server + landing page:
Create a database and user (example):

```sql
CREATE DATABASE applybot;
CREATE USER applybot_user WITH PASSWORD 'applybot_pass';
GRANT ALL PRIVILEGES ON DATABASE applybot TO applybot_user;
```

From the ApplyBot directory (or the project root using the npm script):
```powershell
powershell -ExecutionPolicy Bypass -File .\setup_local.ps1
# or from root (package.json):
npm run backend:setup-run
```

This will:
- Create `.env` if missing
- Create & activate `.venv`
- Install dependencies
- Sanity-check the DB port
- Start the FastAPI server on http://localhost:8000
Run the standalone static site (does not require Python once built):

```bash
npm run serve:static
# open http://localhost:5173/
```

If the backend (port 8000) is also running, the static page will show the live version and health status.
| What | URL |
|---|---|
| Landing (FastAPI) | http://localhost:8000/ |
| Swagger Docs | http://localhost:8000/docs |
| ReDoc | http://localhost:8000/redoc |
| Health | http://localhost:8000/health |
| Static Variant | http://localhost:5173/ |
Created automatically if absent. Adjust `DATABASE_URL` in `.env` as needed:

```bash
DATABASE_URL=postgresql://applybot_user:applybot_pass@localhost:5432/applybot
```
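If the connection fails, it helps to check what the URL actually parses to before blaming the server. A small stdlib helper for that (illustrative only, not part of the project):

```python
from urllib.parse import urlsplit


def describe_db_url(url: str) -> dict:
    """Split DATABASE_URL into its components for troubleshooting (password masked)."""
    parts = urlsplit(url)
    return {
        "scheme": parts.scheme,
        "user": parts.username,
        "password": "***" if parts.password else None,
        "host": parts.hostname,
        "port": parts.port or 5432,  # Postgres default port
        "database": parts.path.lstrip("/"),
    }
```

A wrong port or a typo in the database name shows up immediately in the returned dict, which covers the most common "Cannot connect to Postgres" causes.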
| Issue | Fix |
|---|---|
| ValueError: DATABASE_URL required | Ensure .env created; re-run script |
| Cannot connect to Postgres | Confirm service running; correct port/credentials |
| LaTeX errors | Install MiKTeX or rely on ReportLab fallback |
| loguru not found | Virtual environment not activated; re-run script |
To tear down the local environment:

```powershell
deactivate
Remove-Item .venv -Recurse -Force

# Optional: drop the database
psql -U postgres -c "DROP DATABASE applybot;"
```