
Ollama Search

An open-source, AI-powered search engine built on locally hosted Ollama models.

Demo video: 0102.1.mov

Features

  • ๐Ÿ” Generate search queries with langchain
  • ๐Ÿ’พ Search with Tavily
  • ๐Ÿš€ Answer questions with local installed ollama models (llama3, mistral, gemma)
  • ๐ŸŽจ Clean and intuitive user interface
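The three stages above (generate a query, search with Tavily, answer with Ollama) can be sketched roughly as follows. This is an illustrative sketch, not the repository's actual code: the names `SearchResult` and `buildAnswerRequest` are hypothetical, and only the shape of Ollama's local `/api/chat` request is taken from Ollama's documented API.

```typescript
// Hypothetical sketch of the answer stage: fold Tavily search results into a
// prompt for a locally running Ollama model.
interface SearchResult {
  title: string;
  url: string;
  content: string; // snippet text returned by the search provider
}

interface OllamaChatRequest {
  model: string;
  stream: boolean;
  messages: { role: "system" | "user"; content: string }[];
}

function buildAnswerRequest(
  model: string,
  question: string,
  results: SearchResult[],
): OllamaChatRequest {
  // Concatenate search snippets into a numbered context block the model can cite.
  const context = results
    .map((r, i) => `[${i + 1}] ${r.title} (${r.url})\n${r.content}`)
    .join("\n\n");
  return {
    model,
    stream: true,
    messages: [
      {
        role: "system",
        content: "Answer the question using only the search results below.\n\n" + context,
      },
      { role: "user", content: question },
    ],
  };
}

// The request body would then be POSTed to Ollama's default local endpoint:
//   fetch("http://localhost:11434/api/chat", {
//     method: "POST",
//     body: JSON.stringify(buildAnswerRequest("gemma", question, results)),
//   });
```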

Getting Started

Prerequisites

  • Ollama installed and running locally, with at least one model pulled from the library (e.g. ollama pull gemma)
  • Node.js (version 18 or higher)
  • Tavily API key

Installation

git clone https://github.com/felixiho/ollama-search
cd ollama-search
pnpm install
cp .env.example .env.local

Usage

Add your Tavily API key to .env.local.
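The expected variable names are defined in the repository's .env.example; as an illustration only (the exact key name here is an assumption, so check .env.example), the file might look like:

```
# Tavily API key used for web search.
# The variable name is illustrative; use the one from .env.example.
TAVILY_API_KEY=tvly-xxxxxxxxxxxxxxxx
```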

Start the application:

pnpm dev

Roadmap

  • Implement search functionality via Tavily
  • Support local LLMs via Ollama
  • Add support for chat history
  • Multiple model support
  • Add support for Google search
  • Add support for research mode via PDF search
  • Add image search support
  • Add support for cloud models
  • Add Docker support

Contributing

We welcome contributions! Please feel free to submit pull requests, create issues, or suggest new features.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request