This project sets up an AI tutor that uses ChromaDB for vector storage and a large language model (LLM) to generate responses.
## Prerequisites

- Python 3.7+
- Required dependencies (install via `requirements.txt`)
- Hugging Face API key
- Ollama
- ChromaDB
- Flask
## Configuration

Edit `config.py` to change the local LLM, `MIN_QUESTIONS`, `MAX_QUESTIONS`, and other settings.
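As a rough sketch, `config.py` likely defines these as module-level constants. Only `MIN_QUESTIONS` and `MAX_QUESTIONS` are named in this README; the `LLM_MODEL` name and all values below are illustrative assumptions, so check the file itself for the real settings:

```python
# config.py -- illustrative sketch, not the project's actual file.
# MIN_QUESTIONS / MAX_QUESTIONS come from this README; LLM_MODEL and
# every value shown here are assumptions.
LLM_MODEL = "mistral"   # local Ollama model used to generate responses
MIN_QUESTIONS = 3       # fewest questions the tutor asks in a session
MAX_QUESTIONS = 10      # most questions the tutor asks in a session
```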
## Setup

First, make sure Ollama is running in the background with the Mistral model installed on your system.
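A quick way to verify this from a script is to ping Ollama's local HTTP endpoint; the default port 11434 below is an assumption that holds unless you have reconfigured Ollama:

```python
import urllib.request
import urllib.error

def ollama_running(url="http://127.0.0.1:11434", timeout=2):
    """Return True if an Ollama server answers at `url`, else False."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama up:", ollama_running())
```

If this prints `False`, start the server (e.g. with `ollama serve`) before continuing.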
- **Install Required Dependencies**

  Run the following command:

  ```
  pip3 install -r requirements.txt
  ```

  If you get `module not found` errors while running the application, install the missing modules with `pip3 install MODULE_NAME`.
- **Add Hugging Face Token**

  Create a `.env` file in the root directory and include your Hugging Face API key:

  ```
  HF_API_KEY = "your_huggingface_api_key"
  ```
- **Populate the Database**

  Run the following script to create a ChromaDB instance from `Resource Version 3.xlsx`:

  ```
  python3 populate_database.py
  ```
- **Run the Application**

  Start the Flask server by executing:

  ```
  python3 app.py
  ```
- **Access the Chatbot**

  Open your browser and go to `http://127.0.0.1:5000`. The chatbot should be up and running.
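Once the server is up, you can also talk to it from a script instead of the browser. The sketch below assumes a JSON chat endpoint at `/ask` that accepts a `question` field; both the path and the payload shape are guesses, so adjust them to match the routes defined in `app.py`:

```python
import json
import urllib.request
import urllib.error

def ask_tutor(question, url="http://127.0.0.1:5000/ask", timeout=5):
    """POST a question to the tutor and return the decoded JSON reply,
    or None if the server is unreachable or the response is not JSON.
    NOTE: the /ask path and {"question": ...} payload are assumptions."""
    payload = json.dumps({"question": question}).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (urllib.error.URLError, OSError, ValueError):
        return None

print(ask_tutor("What is a vector database?"))
```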
Happy AI Learning :)