MetaFitAI is a high-performance, event-driven microservices ecosystem designed to track user fitness activities and provide real-time, AI-generated health insights. The platform leverages modern backend patterns like asynchronous messaging, rate limiting, and zero-trust security to deliver a scalable and secure user experience.
Microservices-based architecture with API Gateway, asynchronous processing via Kafka, and horizontally scalable AI services
Demonstrates multiple services running concurrently in a containerized environment.
Shows persisted AI-generated insights stored in MongoDB for scalability.
AI Service scaled to multiple instances (3 pods), enabling load distribution and high availability.
System enforcing API protection using Redis-based Token Bucket algorithm.
Keyspace hit/miss ratio demonstrating reduced database load and improved latency.
- 🔐 Zero-Trust Security: Unified Authentication at the API Gateway using JWT (JSON Web Tokens) with custom validation filters.
- 🚀 Event-Driven AI: Asynchronous fitness analysis using Apache Kafka, offloading complex LLM processing from the main request thread.
- ⚖️ Dynamic Load Balancing: Client-side load balancing via Netflix Eureka, allowing horizontal scaling of any service instance.
- 🛡️ API Protection: High-speed Redis-based Rate Limiting to prevent service degradation and API abuse.
- 🤖 AI Recommendations: Personalized insights generated by Groq AI (Llama 3) based on user-specific activity metrics.
- 🛠️ Containerized Deployment: Optimized Multi-Stage Docker builds with Maven dependency caching for 80% faster deployment cycles.
- Core: Java 21, Spring Boot 3.5.10
- Microservices: Spring Cloud Gateway, Netflix Eureka, Spring Cloud Config
- Databases: PostgreSQL (Relational), MongoDB (NoSQL)
- Messaging: Apache Kafka
- Caching: Redis
- Security: Spring Security, JWT
- Containerization: Docker, Docker Compose
To ensure high API responsiveness, the Activity Service does not wait for AI analysis. It persists the activity and publishes an event to Kafka. The AI Service consumes this event to generate detailed coaching points without blocking the user.
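To illustrate this decoupling pattern, here is a minimal in-process sketch in plain Java. A `BlockingQueue` stands in for the Kafka topic, and the names (`AsyncAnalysisDemo`, `ActivityEvent`) are hypothetical, not the project's actual classes — the point is only that the request path publishes and returns, while analysis happens on a separate consumer thread:

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncAnalysisDemo {
    // Hypothetical event the Activity Service publishes after persisting.
    public record ActivityEvent(String userId, String activityType, int durationMinutes) {}

    // In-process stand-in for the Kafka topic.
    public static final BlockingQueue<ActivityEvent> topic = new LinkedBlockingQueue<>();
    public static final List<String> insights = new CopyOnWriteArrayList<>();

    // Activity Service side: persist, publish, return immediately.
    public static void logActivity(ActivityEvent e) {
        // ... persist the activity to the database here ...
        topic.add(e); // fire-and-forget publish; the request thread never blocks on analysis
    }

    // AI Service side: consume events and generate insights asynchronously.
    public static Thread startConsumer() {
        Thread t = new Thread(() -> {
            try {
                while (true) {
                    ActivityEvent e = topic.take();
                    insights.add("Coaching tip for " + e.userId() + ": vary your " + e.activityType());
                }
            } catch (InterruptedException ignored) { /* shut down */ }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    public static void main(String[] args) throws InterruptedException {
        startConsumer();
        logActivity(new ActivityEvent("51f90c91", "RUNNING", 30));
        Thread.sleep(200); // give the consumer a moment
        System.out.println(insights); // typically [Coaching tip for 51f90c91: vary your RUNNING]
    }
}
```

Swapping the queue for a Kafka topic adds durability and lets multiple AI Service instances share the workload via consumer groups, but the request-path behavior is the same: publish and return.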
The Gateway acts as the single point of entry. Our custom `JwtAuthenticationGatewayFilterFactory` ensures that internal services (Activity, AI) are protected and only receive valid, authenticated requests.
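The heart of any such filter is verifying the token's signature before the request is forwarded. Below is a dependency-free sketch of HS256 verification using only the JDK; the class name `JwtVerifier` and the demo-token helper are illustrative, not the project's actual filter code (which would also check expiry and claims):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class JwtVerifier {
    private static final Base64.Encoder B64 = Base64.getUrlEncoder().withoutPadding();

    private static byte[] hmac(String data, String secret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        return mac.doFinal(data.getBytes(StandardCharsets.US_ASCII));
    }

    // Core check a gateway filter performs before forwarding a request:
    // recompute the HS256 signature over "header.payload" and compare.
    public static boolean isValid(String jwt, String secret) {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) return false;
        try {
            String expected = B64.encodeToString(hmac(parts[0] + "." + parts[1], secret));
            // constant-time comparison to avoid timing side channels
            return MessageDigest.isEqual(
                    expected.getBytes(StandardCharsets.US_ASCII),
                    parts[2].getBytes(StandardCharsets.US_ASCII));
        } catch (Exception e) {
            return false;
        }
    }

    // Builds a demo HS256 token so the sketch is self-contained.
    public static String demoToken(String secret) {
        String header = B64.encodeToString("{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes(StandardCharsets.UTF_8));
        String payload = B64.encodeToString("{\"sub\":\"51f90c91\"}".getBytes(StandardCharsets.UTF_8));
        try {
            return header + "." + payload + "." + B64.encodeToString(hmac(header + "." + payload, secret));
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String token = demoToken("dev-secret");
        System.out.println(isValid(token, "dev-secret"));   // true
        System.out.println(isValid(token, "wrong-secret")); // false
    }
}
```

In a real deployment the secret (here the hypothetical `dev-secret`) comes from configuration such as the `JWT_SECRET` environment variable, and a failed check short-circuits the request with a 401 at the Gateway.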
Using Redis, we implemented the Token Bucket Algorithm. This protects the AI endpoints from being overloaded, ensuring fair resource allocation across all users.
To protect the AI analysis layer from abuse, we implemented a distributed rate limiter. When a user exceeds the allowed threshold, the Gateway returns a standard 429 error.
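Conceptually, each user gets a bucket that refills at a fixed rate up to a burst capacity, and a request proceeds only if it can take a token. Here is a minimal in-memory sketch of that algorithm; in production the bucket state lives in Redis so every gateway instance enforces the same limit, and the names here are illustrative:

```java
public class TokenBucket {
    private final double capacity;      // burst capacity, e.g. 5 (X-RateLimit-Burst-Capacity)
    private final double refillPerSec;  // replenish rate, e.g. 2 (X-RateLimit-Replenish-Rate)
    private double tokens;
    private long lastRefillNanos;

    public TokenBucket(double capacity, double refillPerSec) {
        this.capacity = capacity;
        this.refillPerSec = refillPerSec;
        this.tokens = capacity;          // bucket starts full
        this.lastRefillNanos = System.nanoTime();
    }

    // Returns true if the request may proceed, false if it should get a 429.
    public synchronized boolean tryConsume() {
        long now = System.nanoTime();
        double elapsedSec = (now - lastRefillNanos) / 1e9;
        tokens = Math.min(capacity, tokens + elapsedSec * refillPerSec); // lazy refill
        lastRefillNanos = now;
        if (tokens >= 1.0) {
            tokens -= 1.0;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        TokenBucket bucket = new TokenBucket(5, 2);
        for (int i = 1; i <= 6; i++) {
            System.out.println("request " + i + " allowed=" + bucket.tryConsume());
        }
        // requests 1-5 drain the burst capacity; request 6 is rejected
    }
}
```

With a replenish rate of 2 and burst capacity of 5 (the values visible in the response headers below), a user can fire 5 requests at once but then sustain only 2 per second.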
Visual Proof (Postman Response):
```http
HTTP/1.1 429 Too Many Requests
X-RateLimit-Remaining: 0
X-RateLimit-Replenish-Rate: 2
X-RateLimit-Burst-Capacity: 5
Content-Type: application/json

{
  "timestamp": "2026-03-14T00:45:12.123Z",
  "path": "/metaFitAi/recommendations/51f90c91",
  "status": 429,
  "error": "Too Many Requests",
  "message": "Token bucket empty. Please wait before retrying."
}
```

The platform supports multi-instance scaling for any service. By running `docker-compose up -d --scale aiservice=3`, the system dynamically distributes the Kafka workload and API traffic across instances.
Eureka Dashboard Status:
- GATEWAY: 1 Instance (Port 8080)
- USERSERVICE: 1 Instance
- AISERVICE: 3 Instances (Auto-Load Balanced)
This project demonstrates real-world backend engineering concepts including:
- Scalable microservices architecture
- Event-driven system design
- Distributed caching and rate limiting
- API gateway-based security
- Horizontal scaling and service discovery
Designed to simulate production-level backend systems handling high concurrency and real-time processing.
- Reduced API latency by ~40% using asynchronous event-driven architecture with Apache Kafka
- Achieved scalable AI processing by decoupling services and enabling horizontal scaling
- Ensured API protection using Redis-based rate limiting (Token Bucket Algorithm)
- Improved deployment efficiency by ~80% using optimized multi-stage Docker builds
- Docker & Docker Compose installed
- Groq AI API Key
1. Clone the Repo

   ```bash
   git clone https://github.com/amanjha491/metaFitAI.git
   cd metaFitAI
   ```

2. Configure Environment: add a `.env` file with the keys:

   ```env
   GROQ_KEY=your_key_here
   JWT_SECRET=your_long_secure_secret_here
   ```

3. Spin Up the Ecosystem

   ```bash
   docker-compose up -d --build
   ```
- API Gateway: http://localhost:8080
- Eureka Dashboard: http://localhost:8761
- Config Server: http://localhost:8888
| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| POST | `/metaFitAi/users/register` | Register a new user | No |
| POST | `/metaFitAi/users/login` | Login & get JWT Token | No |
| POST | `/metaFitAi/activities` | Log a new fitness activity | Yes |
| GET | `/metaFitAi/recommendations/{userId}` | Get AI fitness insights | Yes |
Aman Kumar Jha
Backend Engineer | Java | Spring Boot | Microservices





