A simple chat interface built with Gradio that supports multiple language models.
- Chat interface with message history
- Configurable settings (temperature, model selection)
- Support for multiple models (llama2, mistral, gpt4all)
- Local API compatibility with LM Studio (see the connection sketch below this list)
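The repository's actual `gradio_app.py` is not reproduced here, but the following is a minimal sketch of such an interface, assuming the app talks to LM Studio's OpenAI-compatible local server through the `openai` Python client. The model name, default temperature, and fallback values are illustrative assumptions, not the project's confirmed settings.

```python
# Minimal sketch (not the repository's actual gradio_app.py): a Gradio chat UI
# backed by LM Studio's OpenAI-compatible local server.
import os

import gradio as gr
from openai import OpenAI

client = OpenAI(
    base_url=os.getenv("LM_STUDIO_BASE_URL", "http://localhost:1234/v1"),
    api_key=os.getenv("LM_STUDIO_API_KEY", "lm-studio"),  # LM Studio does not verify the key locally
)

def respond(message, history):
    # With type="messages" (recent Gradio versions), history is already a list of
    # OpenAI-style {"role": ..., "content": ...} dicts, so it can be forwarded directly.
    messages = history + [{"role": "user", "content": message}]
    reply = client.chat.completions.create(
        model="mistral",      # assumed identifier; LM Studio serves whichever model is loaded
        messages=messages,
        temperature=0.7,      # assumed default
    )
    return reply.choices[0].message.content

demo = gr.ChatInterface(fn=respond, type="messages", title="Simple Chat Interface")

if __name__ == "__main__":
    demo.launch(server_port=7860)  # matches the address given in the usage notes below
```

Because LM Studio exposes an OpenAI-style endpoint, the same sketch works with any of the listed models by changing the `model` string to whatever is loaded in LM Studio.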
- Clone the repository:
git clone https://github.com/aiforhumans/simple-chat-interface.git
cd simple-chat-interface
- Install dependencies:
pip install -r requirements.txt
- Set up environment variables (optional):
export LM_STUDIO_API_KEY=your-key
export LM_STUDIO_BASE_URL=http://localhost:1234/v1
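These variables are optional, so the application presumably falls back to LM Studio's defaults when they are unset. A short sketch of how they might be read, assuming `os.getenv` with fallback values (the actual defaults in `gradio_app.py` may differ):

```python
import os

# Assumed fallbacks; the real defaults in gradio_app.py may differ.
base_url = os.getenv("LM_STUDIO_BASE_URL", "http://localhost:1234/v1")
api_key = os.getenv("LM_STUDIO_API_KEY", "lm-studio")  # LM Studio accepts any placeholder key locally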
- Run the application:
python gradio_app.py
The interface will be available at http://localhost:7860
- Temperature: Controls response randomness; lower values give more focused, deterministic replies (range 0-1)
- Model: Choose between the available models (llama2, mistral, gpt4all); a sketch of how these settings can be wired into the UI follows
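How these settings reach the model is not spelled out in this README. One plausible wiring, shown here as a sketch, uses `gr.ChatInterface`'s `additional_inputs`; the component labels, ranges, and defaults below are assumptions.

```python
# Sketch of exposing the temperature and model settings in the UI via
# additional_inputs; not necessarily how gradio_app.py actually does it.
import gradio as gr

def respond(message, history, temperature, model):
    # Placeholder reply; a real handler would pass temperature and model
    # into the chat completion request, as in the client sketch above.
    return f"[{model} @ temperature={temperature}] echo: {message}"

demo = gr.ChatInterface(
    fn=respond,
    additional_inputs=[
        gr.Slider(minimum=0.0, maximum=1.0, value=0.7, step=0.05, label="Temperature"),
        gr.Dropdown(choices=["llama2", "mistral", "gpt4all"], value="mistral", label="Model"),
    ],
)

if __name__ == "__main__":
    demo.launch()
```

With `additional_inputs`, Gradio passes the current slider and dropdown values as extra arguments to the chat function on every turn, so no separate settings form is needed.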
To run tests:
pytest
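The test layout is not described here. The example below is purely illustrative: it assumes a `build_messages` helper in `gradio_app.py` that converts chat history into OpenAI-style messages, which may not exist under that name in the repository.

```python
# tests/test_messages.py -- illustrative only; build_messages is an assumed helper name.
from gradio_app import build_messages

def test_build_messages_orders_history_before_new_message():
    history = [("hi", "hello!")]
    messages = build_messages(history, "how are you?")
    assert messages == [
        {"role": "user", "content": "hi"},
        {"role": "assistant", "content": "hello!"},
        {"role": "user", "content": "how are you?"},
    ]
```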
MIT