# QuickCheck

QuickCheck is a Flutter-based mobile application with a Python Flask backend that uses AI to analyze facial features and provide health assessments.
## Features

- AI Health Diagnosis: Capture images of eyes, teeth, and tongue for health analysis
- Multi-condition Detection:
  - Anemia (conjunctival pallor + glossitis)
  - Jaundice (scleral icterus using CIELAB + fuzzy logic)
  - Dental Health (gingivitis and caries)
  - Tongue Health (coating and texture analysis)
  - Conjunctivitis (pink eye detection)
  - Chikungunya (combined eye + tongue symptoms)
- AI Medical Chat: Chat with Groq-powered Llama 3 AI for medical guidance
- Multi-language Support: English and Hindi (हिंदी)
- Emergency Detection: Keyword-based emergency symptom detection
- Symptom Analysis: Voice and text input for symptom description
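The emergency detection feature listed above is described as keyword-based. A minimal sketch of how such matching might work — the keyword list and function name are illustrative assumptions, not taken from the app's source:

```python
# Illustrative sketch of keyword-based emergency detection.
# The keyword set below is an assumption, not the app's actual list.
EMERGENCY_KEYWORDS = {
    "chest pain", "can't breathe", "unconscious",
    "severe bleeding", "stroke", "seizure",
}

def detect_emergency(symptom_text: str) -> bool:
    """Return True if the symptom description mentions any emergency keyword."""
    text = symptom_text.lower()
    return any(keyword in text for keyword in EMERGENCY_KEYWORDS)
```

A real implementation would also need the Hindi keyword equivalents, since the app supports both languages.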
## Project Structure

```
JiloHealth/
├── mobile/                       # Flutter mobile app
│   ├── lib/
│   │   ├── screens/              # App screens
│   │   ├── services/             # API and business logic services
│   │   └── main.dart
│   ├── assets/
│   │   └── translations/         # Localization files (en.json, hi.json)
│   └── pubspec.yaml
│
├── server/                       # Python Flask backend
│   ├── app.py                    # Main Flask application
│   ├── calculate_probability.py  # Disease detection algorithms
│   ├── feature_extractor.py      # MediaPipe facial feature extraction
│   ├── requirements.txt
│   └── venv/                     # Python virtual environment
│
└── README.md
```
## Prerequisites

- Flutter SDK (>=3.0.0)
- Python (>=3.8)
- Firebase Account (for authentication)
- Groq API Key (for AI chat and diagnosis)
## Backend Setup

1. Navigate to the server directory:

   ```bash
   cd server
   ```

2. Create and activate a virtual environment:

   ```bash
   # Windows
   python -m venv venv
   venv\Scripts\activate

   # macOS/Linux
   python3 -m venv venv
   source venv/bin/activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up environment variables:

   ```bash
   # Copy the example file
   cp .env.example .env
   # Edit .env and add your API keys
   # GROQ_API_KEY=your-actual-groq-api-key
   ```

5. Run the Flask server:

   ```bash
   python app.py
   ```

   The server will start on http://localhost:5000.
## Mobile App Setup

1. Navigate to the mobile directory:

   ```bash
   cd mobile
   ```

2. Install dependencies:

   ```bash
   flutter pub get
   ```

3. Set up Firebase:
   - Create a Firebase project in the Firebase Console
   - Add Android/iOS apps to your Firebase project
   - Download and place the configuration files:
     - Android: `google-services.json` → `android/app/`
     - iOS: `GoogleService-Info.plist` → `ios/Runner/`

4. Configure API URLs:
   - Edit `lib/services/api_service.dart`
   - Update `baseUrl` and `flaskServerUrl` to match your backend

5. Run the app:

   ```bash
   # For Android
   flutter run

   # For iOS
   flutter run -d ios

   # For web (if enabled)
   flutter run -d chrome
   ```
## Environment Variables

Backend (`server/.env`):

```env
GROQ_API_KEY=your-groq-api-key
FLASK_ENV=development
FLASK_DEBUG=True
SECRET_KEY=your-secret-key
PORT=5000
```

Mobile app configuration:

- Backend API URL
- Flask Server URL
- Firebase configuration
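As a sketch of how the Flask backend might consume these environment variables with development-friendly defaults — the helper function itself is illustrative, not part of the repository:

```python
import os

def load_config(env=os.environ):
    """Read backend settings from environment variables, falling back
    to development defaults. Variable names match the list above;
    this helper is an illustrative assumption, not app code."""
    return {
        "groq_api_key": env.get("GROQ_API_KEY", ""),
        "flask_env": env.get("FLASK_ENV", "development"),
        "debug": env.get("FLASK_DEBUG", "True").lower() == "true",
        "secret_key": env.get("SECRET_KEY", "change-me"),
        "port": int(env.get("PORT", "5000")),
    }
```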
## API Endpoints

- `POST /api/detect` - Complete health analysis (extract + detect)
- `POST /api/diagnose` - Get AI diagnosis from health analysis
- `POST /api/chat` - Medical chat with AI
- `POST /api/extract` - Extract facial features from images
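A sketch of calling the chat endpoint from Python with only the standard library. The request field names (`message`, `language`) are assumptions — check the route handlers in `server/app.py` for the actual schema:

```python
import json
import urllib.request

BASE_URL = "http://localhost:5000"  # default Flask port from the setup above

def build_chat_payload(message: str, language: str = "en") -> dict:
    # Field names are assumed; verify against server/app.py.
    return {"message": message, "language": language}

def chat(message: str, language: str = "en") -> dict:
    """POST a chat message to the backend and return the parsed JSON reply."""
    data = json.dumps(build_chat_payload(message, language)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/api/chat",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```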
## Tech Stack

Frontend:

- Flutter - Cross-platform mobile framework
- Camera - Image capture
- Google ML Kit - Face detection
- Easy Localization - Multi-language support
- Provider - State management
- Flutter Markdown - Markdown rendering

Backend:

- Flask - Web framework
- MediaPipe - Facial feature extraction
- OpenCV - Image processing
- LangChain + Groq - AI integration with Llama 3
- NumPy - Numerical computations

AI & Algorithms:

- Groq Llama 3 (70B) - Medical diagnosis and chat
- CIELAB Color Space - Perceptually uniform color analysis
- Fuzzy Logic - Intelligent symptom combination
- MediaPipe Face Mesh - 468-point facial landmarks
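To make the CIELAB analysis concrete, here is a self-contained sRGB-to-CIELAB conversion for a single pixel using the standard D65 formulas. The app itself presumably uses OpenCV's `cv2.cvtColor`; note that OpenCV's 8-bit output stores a* and b* shifted by +128, which would explain thresholds like `a* > 135`:

```python
def srgb_to_lab(r: int, g: int, b: int) -> tuple:
    """Convert one 8-bit sRGB pixel to CIELAB (D65 reference white).

    Standard textbook formulas, shown for illustration; the app
    presumably relies on OpenCV rather than this helper.
    """
    def linearize(c):
        c /= 255.0  # undo sRGB gamma
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear RGB -> XYZ (sRGB matrix, D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # Normalize by the D65 reference white
    x, y, z = x / 0.95047, y / 1.0, z / 1.08883

    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x), f(y), f(z)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)  # L*, a*, b*
```

Redness (inflamed conjunctiva or sclera) pushes a* positive; yellowness (icterus) pushes b* positive, which is why those two channels drive the detectors below.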
## How Detection Works

Anemia:

- CIELAB a* channel analysis of conjunctiva
- Glossitis texture analysis (smooth/beefy red tongue)
- Weighted fusion: Eyes (70%) + Tongue (30%)

Jaundice:

- CIELAB b* channel analysis
- Fuzzy logic combining eye and face scores

Conjunctivitis:

- Advanced region segmentation (luminance + YCrCb)
- CIELAB a* channel for sclera redness
- Threshold: a* > 135 indicates inflammation

Chikungunya:

- Fuzzy logic combining conjunctivitis + white tongue coating
- Eye as primary indicator, tongue as secondary
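The score combinations above can be sketched as simple functions. The 70/30 anemia weighting is stated in the source; everything else (the 0–1 score scale, the chikungunya fuzzy operator and its coefficients) is an illustrative assumption:

```python
def fuse_anemia_scores(eye_score: float, tongue_score: float) -> float:
    """Weighted fusion of per-region anemia scores (0-1 scale assumed):
    eyes carry 70% of the weight, tongue 30%, per the description above."""
    return 0.7 * eye_score + 0.3 * tongue_score

def fuse_chikungunya(conjunctivitis_score: float, coating_score: float) -> float:
    """Illustrative fuzzy-style combination: the eye is the primary
    indicator, and a white tongue coating can only raise the score.
    The 0.8/0.4 coefficients are assumptions, not from the app."""
    return max(conjunctivitis_score,
               min(1.0, conjunctivitis_score * 0.8 + coating_score * 0.4))
```

The `max(...)` form guarantees the secondary (tongue) evidence never lowers a score driven by the primary (eye) indicator, matching the primary/secondary relationship described above.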
## App Screens

- Login/Signup - Firebase authentication
- Home - Start diagnosis or AI chat
- Symptom Input - Describe symptoms (text/voice)
- Camera Capture - 4-stage guided capture (face, eyes, teeth, tongue)
- Review & Analysis - View captured images and health analysis
- Medical Chat - AI-powered medical Q&A
## Supported Languages

- English (en)
- Hindi (हिंदी) (hi)
All UI elements and AI responses support both languages.
## Security Notes

- DO NOT commit `.env` files to git
- DO NOT commit API keys or secrets
- DO NOT commit Firebase configuration files
- Always use `.env.example` as a template
This project is for hackathon/educational purposes.
Built for a hackathon with ❤️
## Troubleshooting

Backend dependency issues:

```bash
# Make sure you're in the virtual environment
cd server
source venv/bin/activate  # or venv\Scripts\activate on Windows
pip install -r requirements.txt
```

Flutter build issues:

```bash
# Clean and rebuild
flutter clean
flutter pub get
flutter run
```

MediaPipe/NumPy version conflicts:

```bash
# Reinstall with correct versions
pip install mediapipe==0.10.11 numpy==1.24.4
```

For issues and questions, please check the code documentation or create an issue in the repository.
Medical Disclaimer: This app is for educational and informational purposes only. Always consult a qualified healthcare professional for medical advice, diagnosis, or treatment.