Offline AI Chatbot

A desktop application that provides a clean interface for interacting with Ollama's AI models locally. Chat with AI models without needing internet connectivity after initial setup.

Features

  • Fully offline AI chat capabilities
  • Multiple AI model support through Ollama (see the request sketch after this list)
  • Support for basic text files, CSVs, and JSON
  • Support for images with vision-capable models such as LLaVA and Llama 3.2 Vision
  • Clean, modern interface
  • Dark/Light mode support
  • Real-time responses
  • Local data storage
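
Under the hood, these chat features go through Ollama's local HTTP API, which listens on port 11434 by default. The following is a minimal sketch of a single non-streaming chat turn, not the app's actual code; it assumes the llama2 model from the Prerequisites below:

// Minimal sketch: one chat turn against a local Ollama server.
// Assumes Ollama's default port (11434) and a locally pulled "llama2" model.
async function chat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama2",
      messages: [{ role: "user", content: prompt }],
      stream: false, // the app streams for real-time responses; disabled here for brevity
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content; // non-streaming /api/chat returns { message: { role, content }, ... }
}

chat("Hello!").then(console.log).catch(console.error);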

Prerequisites

  1. Install Ollama:
    • Visit Ollama's website (https://ollama.com)
    • Download and install it for your system
    • Open a terminal and verify the installation (a programmatic check is also sketched after this list):
      ollama --version
    • Pull and run your first model:
      ollama pull llama2
      ollama run llama2
    • Check out my blog post for more information on how to get started with Ollama.
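
For the programmatic check mentioned above: once Ollama is installed and running, its server answers on http://localhost:11434, and listing the locally pulled models is a quick way to confirm this from code. A minimal TypeScript sketch:

// Health check sketch: list the models pulled into the local Ollama install.
// GET /api/tags returns { models: [{ name, ... }] }.
async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama not reachable (HTTP ${res.status})`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

listLocalModels().then((names) => console.log("Local models:", names));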

Setup

  1. Clone the repository

    git clone https://github.com/mrmendoza-dev/offline-chatbot.git
    cd offline-chatbot
  2. Install dependencies

    npm install
  3. Create a .env file in the root directory (see the sketch after these steps):

    VITE_PORT=3030
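
For reference, a common way for an Express backend to pick up this variable is the dotenv package. The sketch below assumes that pattern and uses illustrative names; it is not necessarily how the code in server/ is actually written.

// Sketch: an Express server reading VITE_PORT from .env via dotenv.
// Assumes the dotenv and express packages; the real server/ code may differ.
import "dotenv/config";
import express from "express";

const app = express();
const port = Number(process.env.VITE_PORT) || 3030; // falls back to the README default

app.get("/health", (_req, res) => {
  res.send("ok"); // hypothetical route, for illustration only
});

app.listen(port, () => {
  console.log(`Backend listening on http://localhost:${port}`);
});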

Running the Application

Start both frontend and backend servers:

npm start

This will run:

  • Frontend: http://localhost:5173
  • Backend: http://localhost:3030

Development

Run frontend only:

npm run dev

Run backend only:

npm run server
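
A common way for a single npm start to launch both servers is the concurrently package. This is a hedged sketch of what the relevant package.json scripts might look like; the server entry path and the exact commands are assumptions, and the repo's actual scripts may differ:

{
  "scripts": {
    "dev": "vite",
    "server": "node server/index.js",
    "start": "concurrently \"npm run dev\" \"npm run server\""
  }
}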

Project Structure

offline-chatbot/
├── src/               # Frontend source code
├── server/            # Backend server code
└── public/            # Static assets

Environment Variables

Variable    Description           Required
VITE_PORT   Backend server port   Yes
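
Because the name carries the VITE_ prefix, Vite also exposes the same value to frontend code at build time through import.meta.env, which is presumably how the client knows where the backend lives:

// In frontend code, Vite statically replaces import.meta.env.VITE_* at build time.
const backendPort = import.meta.env.VITE_PORT; // "3030" per the .env above
const backendUrl = `http://localhost:${backendPort}`;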

Tech Stack

  • React + Vite
  • Express.js
  • Ollama API
  • TailwindCSS
  • Node.js

License

MIT