Smart Agent for Automation, Reasoning and Task Handling
Saarth is a chatbot that I have been interested in developing.
It works through a Groq API key with the llama-3.1-8b-instant, llama-3.3-70b-versatile and openai/gpt-oss-20b models. So basically, it's an LLM that I have customized to my liking.
The dev repository can be a base for your own projects using Groq's API keys :)
I used the 8-billion- and 70-billion-parameter models to reduce costs and because they match my use case.
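Since Groq exposes an OpenAI-compatible API, a single chat turn boils down to one POST request. Here is a minimal sketch of that call (the `ask` and `build_payload` helpers are illustrative, not SAARTH's actual code), using the same `requests` + environment-variable pattern as the model-listing example below:

```python
import os
import requests

# Groq's OpenAI-compatible chat completions endpoint
GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_payload(model, messages):
    # Assemble an OpenAI-compatible chat request body
    return {"model": model, "messages": messages}

def ask(prompt, model="llama-3.1-8b-instant"):
    headers = {"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"}
    payload = build_payload(model, [{"role": "user", "content": prompt}])
    resp = requests.post(GROQ_CHAT_URL, headers=headers, json=payload, timeout=30)
    resp.raise_for_status()
    # The reply text lives in the first choice's message content
    return resp.json()["choices"][0]["message"]["content"]
```
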
A list of all the models you can use through Groq's free API can be fetched like this, according to the official docs:
```python
import requests
import os

api_key = os.environ.get("GROQ_API_KEY")
url = "https://api.groq.com/openai/v1/models"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

response = requests.get(url, headers=headers)
print(response.json())
```
Or you can simply go to the official documentation: https://console.groq.com/docs/overview.
```shell
git clone https://github.com/lxsh-S/SAARTH.git
cd SAARTH
```
Install the dependencies from the requirements.txt file:

```shell
pip install -r requirements.txt
```
Then:
- Make a `.env` file and paste your Groq API key:

```
GROQ_API_KEY=your_api_key_here
```
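A `.env` file of this shape can be loaded with a few lines of stdlib Python. This is a hedged sketch for illustration only; the project itself may rely on python-dotenv or a similar library instead:

```python
import os

def load_env(path=".env"):
    # Minimal .env loader: KEY=value lines; blank lines and '#' comments ignored.
    # (Illustrative only; python-dotenv is the usual choice for this.)
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                # Don't overwrite variables already set in the environment
                os.environ.setdefault(key.strip(), value.strip())
```
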
Simply run the file:

```shell
python main.py
```
To run local models:

- Download Ollama. On Arch Linux:

```shell
sudo pacman -S ollama
```

  Or visit the official site: https://ollama.com/

- Pull the qwen3:1.7b model:

```shell
ollama pull qwen3:1.7b
```

- Run Ollama:

```shell
sudo systemctl start ollama
```

  To run it automatically when the system starts:

```shell
sudo systemctl enable ollama
```
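Once the service is running, Ollama listens on http://localhost:11434 by default, so local mode only needs a plain HTTP call against its chat endpoint. A stdlib-only sketch (the helper names are illustrative, not SAARTH's actual code):

```python
import json
import urllib.request

# Ollama's default local chat endpoint
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_ollama_body(prompt, model="qwen3:1.7b"):
    # stream=False asks Ollama for a single JSON response instead of chunks
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_local(prompt, model="qwen3:1.7b"):
    body = json.dumps(build_ollama_body(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The reply text lives under message.content in the response JSON
        return json.loads(resp.read())["message"]["content"]
```
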
Now you can use flags to switch between models ('llama-3.3-70b-versatile', 'openai/gpt-oss-20b - on_demand', 'llama-3.1-8b-instant') or modes (local/cloud).
List of FLAGS:
- --clear # Clear the chat history with SAARTH (MAX=20)
- --local # Switch to Ollama (qwen3:1.7b)
- --cloud # Switch to Groq
- --exit # Quit saarth.py
- --quit # Quit saarth.py
- models # List all the models
- model n (1/2/3) # Switch between models
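One way the flags above could be dispatched inside the chat loop is a small string-matching router. This is a hypothetical sketch, not SAARTH's actual implementation; the `state` dictionary and return strings are assumptions:

```python
# Models in the order the "model n" flag selects them (assumed ordering)
MODELS = ["llama-3.3-70b-versatile", "openai/gpt-oss-20b", "llama-3.1-8b-instant"]

def handle_flag(user_input, state):
    """Return an action string if the input is a flag, else None (plain chat)."""
    cmd = user_input.strip()
    if cmd == "--clear":
        state["history"].clear()
        return "cleared"
    if cmd == "--local":
        state["mode"] = "local"   # route requests to Ollama
        return "mode:local"
    if cmd == "--cloud":
        state["mode"] = "cloud"   # route requests to Groq
        return "mode:cloud"
    if cmd in ("--exit", "--quit"):
        return "quit"
    if cmd == "models":
        return "list"
    if cmd.startswith("model "):
        idx = int(cmd.split()[1]) - 1  # "model 2" -> MODELS[1]
        state["model"] = MODELS[idx]
        return f"model:{state['model']}"
    return None  # not a flag; treat as a normal chat message
```
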
When the main.py file is run, you can access the TUI version of the project.
Online Search: this can be done by simply adding the "search" or "look up" keyword before the query.
Read files: add "read file" before the name of the file to be read and summarized.
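Keyword triggers like these can be detected with a simple prefix check before the query is sent to the model. A hedged sketch (the function name and intent labels are assumptions, not SAARTH's code):

```python
def detect_intent(query):
    # Route a user query based on its leading keyword
    q = query.lower().strip()
    if q.startswith(("search", "look up")):
        return "web_search"
    if q.startswith("read file"):
        return "read_file"
    return "chat"  # no keyword: plain conversation
```
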
NOTE that currently SAARTH only searches in these directories:
Projects/
Documents/
Desktop/
This can be changed by editing the directory names in files.py.
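Restricting the search to a whitelist of directories might look like the sketch below. The helper and the `base` parameter are illustrative assumptions; only the three directory names come from files.py:

```python
from pathlib import Path

# Directories SAARTH is allowed to search, as listed in files.py
ALLOWED_DIRS = ["Projects", "Documents", "Desktop"]

def find_file(name, base=None):
    """Return the first match for `name` under the allowed dirs, else None."""
    base = Path(base) if base else Path.home()
    for d in ALLOWED_DIRS:
        root = base / d
        if root.is_dir():
            for p in root.rglob(name):  # recursive search within the whitelist
                return p
    return None
```
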
- STAGE - 1 --> Core Chat
- STAGE - 2 --> File Access
- STAGE - 3 --> Search Online
- STAGE - 4 --> Brain
- STAGE - 5 --> Local LLM (llama) -- not tested due to hardware limitations
- STAGE - 6 --> CLI Polish
- STAGE - 7 --> Voice Inputs
- STAGE - 8 --> Voice Outputs
- STAGE - 9 --> Final Polish