
Simple Chat Interface

A simple chat interface built with Gradio that supports multiple language models.

Features

  • Chat interface with message history
  • Configurable settings (temperature, model selection)
  • Support for multiple models (llama2, mistral, gpt4all)
  • Local API compatibility with LM Studio

Installation

  1. Clone the repository:

     git clone https://github.com/aiforhumans/simple-chat-interface.git
     cd simple-chat-interface

  2. Install dependencies:

     pip install -r requirements.txt

  3. Set up environment variables (optional):

     export LM_STUDIO_API_KEY=your-key
     export LM_STUDIO_BASE_URL=http://localhost:1234/v1
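Inside the app, these variables would typically be read with `os.getenv` and fall back to sensible defaults when unset. The variable names come from this README; the fallback values are assumptions (the URL matches LM Studio's standard local endpoint):

```python
import os

# Assumed defaults: LM Studio ignores the key for local use, and
# http://localhost:1234/v1 is its standard OpenAI-compatible endpoint.
LM_STUDIO_API_KEY = os.getenv("LM_STUDIO_API_KEY", "lm-studio")
LM_STUDIO_BASE_URL = os.getenv("LM_STUDIO_BASE_URL", "http://localhost:1234/v1")
```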

Usage

Run the application:

python gradio_app.py

The interface will be available at http://localhost:7860 (Gradio's default port).

Configuration

  • Temperature: Controls response randomness (0-1; lower values give more deterministic replies)
  • Model: Choose between the available models (llama2, mistral, gpt4all)
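These two settings map directly onto the fields of an OpenAI-style chat completion request, which is what an LM Studio backend expects. A hypothetical helper (not the repo's code) showing how they might be validated and forwarded:

```python
# Model list taken from this README; the helper itself is an assumption.
AVAILABLE_MODELS = ("llama2", "mistral", "gpt4all")

def build_request(message, model="llama2", temperature=0.7):
    """Build an OpenAI-style chat completion payload from the UI settings."""
    if model not in AVAILABLE_MODELS:
        raise ValueError(f"unknown model: {model}")
    # Clamp temperature to the 0-1 range exposed in the UI.
    temperature = max(0.0, min(1.0, temperature))
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": message}],
    }
```

The resulting dict could be POSTed to `{LM_STUDIO_BASE_URL}/chat/completions` or passed to any OpenAI-compatible client.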

Development

To run tests:

pytest
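Tests live in plain functions that pytest discovers by their `test_` prefix. An illustrative example (the `clamp_temperature` helper is hypothetical, shown only to demonstrate the style):

```python
def clamp_temperature(value):
    """Clamp a temperature setting into the 0-1 range the UI allows."""
    return max(0.0, min(1.0, value))

def test_temperature_is_clamped():
    assert clamp_temperature(1.5) == 1.0
    assert clamp_temperature(-0.2) == 0.0
    assert clamp_temperature(0.5) == 0.5
```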

License

MIT

Author

aiforhumans
