# Installation Guide
Comprehensive installation instructions for all platforms and scenarios.
- System Requirements
- Platform-Specific Installation
- Dependency Installation
- Ollama Setup
- Environment Variables
- Docker Installation
- Verification

## System Requirements

### Minimum Requirements

| Component | Specification |
|---|---|
| CPU | Dual-core 2.0 GHz |
| RAM | 8 GB |
| Storage | 10 GB free (models + indexes) |
| OS | Windows 10, macOS 10.15, Ubuntu 20.04 |
| Internet | Only for initial setup |

### Recommended Requirements

| Component | Specification |
|---|---|
| CPU | Quad-core 3.0 GHz+ |
| RAM | 16 GB+ |
| GPU | NVIDIA GPU with 8GB+ VRAM (optional) |
| Storage | 50 GB SSD |
| OS | Windows 11, macOS 13+, Ubuntu 22.04 |

### GPU Support (Optional)

For faster LLM inference, CodeScope supports GPU acceleration through Ollama:
- NVIDIA GPUs: CUDA 11.8+ (automatic detection)
- Apple Silicon: Metal (automatic detection)
- AMD GPUs: ROCm support (experimental)
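
A quick way to confirm that acceleration is actually in use (a sketch: `nvidia-smi` is NVIDIA-only, `ollama ps` requires a recent Ollama release, and it assumes you have already pulled a model such as llama3, covered under Ollama Setup below):

```bash
# NVIDIA only: confirm the driver sees the GPU
nvidia-smi

# Load a model, then check which processor Ollama placed it on
ollama run llama3 "warm up" >/dev/null
ollama ps   # PROCESSOR should read "100% GPU" when acceleration is active
```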

## Platform-Specific Installation

### Windows

#### Python 3.10+

Download from python.org and ensure "Add Python to PATH" is checked.
Verify installation:
```bash
python --version
# Output: Python 3.10.x or higher
```

#### Node.js 18+

Download from nodejs.org (LTS version recommended).
Verify installation:
```bash
node --version   # Should be 18.x or higher
npm --version    # Should be 9.x or higher
```

#### Git

Download from git-scm.com or use GitHub Desktop.
#### Ollama

Download the installer from ollama.com and run it.
Verify installation:
```bash
ollama --version
```

### macOS

**Using Homebrew:**

```bash
# Install Homebrew (if not installed)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
# Install dependencies
brew install python@3.10 node git
# Install Ollama
brew install ollama
```

**Manual installation:**

- Python: Download from python.org
- Node.js: Download from nodejs.org
- Ollama: Download installer from ollama.com
Verify installations:
```bash
python3 --version
node --version
ollama --version
```

### Linux (Ubuntu/Debian)

```bash
# Update package list
sudo apt update
# Install Python 3.10+
sudo apt install python3.10 python3.10-venv python3-pip
# Install Node.js 18+ (via NodeSource)
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt install nodejs
# Install Git
sudo apt install git
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
```
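
On most Linux systems the install script registers Ollama as a systemd service and starts it automatically; a quick check (assuming systemd is in use):

```bash
# Confirm the Ollama service is running
systemctl status ollama

# If it is not, start the server manually in a separate terminal
ollama serve
```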
### Linux (Fedora/RHEL)

```bash
# Install Python (the venv module ships with python3 on Fedora)
sudo dnf install python3 python3-pip
# Install Node.js
sudo dnf install nodejs npm
# Install Git
sudo dnf install git
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
```

## Dependency Installation

### Clone the Repository

```bash
git clone https://github.com/Yigtwxx/CodeScope.git
cd CodeScope
```

### Backend Dependencies

**Windows:**

```bash
cd backend
python -m venv .venv
.venv\Scripts\activate
pip install --upgrade pip
pip install -r requirements.txt
```

**macOS/Linux:**

```bash
cd backend
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```

Dependencies Installed:
- FastAPI 0.115.12 - Web framework
- LangChain 0.3.20 - RAG orchestration
- ChromaDB 0.5.26 - Vector database
- Sentence-Transformers 3.4.1 - Embeddings
- Uvicorn 0.37.0 - ASGI server
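
To confirm the environment resolved correctly, you can spot-check a few of the pinned packages (a macOS/Linux sketch; run it inside the activated virtual environment):

```bash
# Versions should match the pins in requirements.txt
pip show fastapi langchain chromadb | grep -E "^(Name|Version):"
```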
### Frontend Dependencies

```bash
cd frontend
npm install
```

Dependencies Installed:
- Next.js 16.0.10 - React framework
- React 19.2.1 - UI library
- Tailwind CSS 4.0 - Styling
- Shadcn/UI - Component library
- React Markdown - Markdown rendering
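
With both dependency sets installed, you can start the two development servers. This sketch assumes the backend entry point is `main:app` (as used elsewhere in this guide) and the default ports:

```bash
# Terminal 1: backend (activate the virtual environment first)
cd backend
source .venv/bin/activate
uvicorn main:app --reload    # http://localhost:8000

# Terminal 2: frontend
cd frontend
npm run dev                  # http://localhost:3000
```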
## Ollama Setup

### Pull a Model

CodeScope works with any Ollama model. Choose based on your needs:

**General-purpose models:**

```bash
# Llama 3 (Recommended, 4.7GB)
ollama pull llama3
# Llama 3.1 (Latest, 8.5GB, higher quality)
ollama pull llama3.1
# Mistral (Fast, 4.1GB)
ollama pull mistral
```

**Code-specialized models:**

```bash
# CodeLlama (Python, JS, C++, 3.8GB)
ollama pull codellama
# DeepSeek Coder (Multi-language, 6.7GB)
ollama pull deepseek-coder
# StarCoder (Code generation, 15GB)
ollama pull starcoder
```

**Lightweight models (for low-RAM systems):**

```bash
# Phi-2 (2.7GB)
ollama pull phi
# TinyLlama (637MB)
ollama pull tinyllama
```

### Verify Models

```bash
# List installed models
ollama list
# Test a model
ollama run llama3 "Hello, test"CodeScope uses the model Ollama is currently serving. To change models:
# Option 1: Environment variable (temporary)
export OLLAMA_MODEL=codellama
uvicorn main:app --reload
# Option 2: Edit backend/app/core/config.py
# Change OLLAMA_MODEL setting
```

## Environment Variables

Create a .env file in the backend/ directory for custom configuration:

```env
# backend/.env
# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3
# Server Configuration
API_V1_STR=/api/v1
PROJECT_NAME=CodeScope
VERSION=0.1.0
# ChromaDB Configuration
CHROMA_DB_DIR=./chroma_db
# Ingestion Settings
CHUNK_SIZE=1000
CHUNK_OVERLAP=200
MAX_FILES_TO_PROCESS=10000
# Logging
LOG_LEVEL=INFO
```

Configuration Options:

| Variable | Description | Default |
|---|---|---|
| OLLAMA_BASE_URL | Ollama API endpoint | http://localhost:11434 |
| OLLAMA_MODEL | LLM model to use | llama3 |
| CHUNK_SIZE | Code chunk size (chars) | 1000 |
| CHUNK_OVERLAP | Overlap between chunks | 200 |
| LOG_LEVEL | Logging verbosity | INFO |
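
Variables exported in the shell typically take precedence over `.env` values (the usual behavior for pydantic-style settings; treat this as an assumption about CodeScope's config loader). For example, to index with larger chunks for a single session:

```bash
# One-off override: larger chunks mean fewer, coarser embeddings
export CHUNK_SIZE=2000
export CHUNK_OVERLAP=400
uvicorn main:app --reload
```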
## Docker Installation

For containerized deployment:
Create docker-compose.yml:

```yaml
version: '3.8'

services:
  backend:
    build: ./backend
    ports:
      - "8000:8000"
    volumes:
      - ./backend:/app
      - chroma-data:/app/chroma_db
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434

  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - backend
    environment:
      - NEXT_PUBLIC_API_URL=http://localhost:8000

volumes:
  chroma-data:
```

Run:
```bash
docker-compose up -d
```

Note: Ollama must run on the host machine (not containerized) for GPU access.
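
Once the stack is up, two quick checks (the service names match the docker-compose.yml above; on Linux, reaching host.docker.internal may additionally require adding `extra_hosts: ["host.docker.internal:host-gateway"]` to the backend service):

```bash
# Both containers should show an "Up" state
docker-compose ps

# Tail the backend logs to confirm it can reach Ollama on the host
docker-compose logs -f backend
```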
## Verification

### Backend Health Check

```bash
curl http://localhost:8000/health
```

Expected output:

```json
{
  "status": "healthy",
  "service": "CodeScope",
  "version": "0.1.0"
}
```

### Frontend Check

Visit http://localhost:3000 - you should see the CodeScope UI.
### Ollama Check

```bash
curl http://localhost:11434/api/tags
```

Should return a list of installed models.
### First Query Test

- Start backend and frontend
- Open Settings in UI
- Ingest a small test repository
- Ask: "What files are in this project?"
- Verify you receive an AI response
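
The service checks above can be rolled into a single script (a sketch using only the endpoints documented on this page):

```bash
#!/usr/bin/env bash
# Smoke-test backend, frontend, and Ollama in one pass
echo "Backend:  $(curl -s http://localhost:8000/health)"
echo "Frontend: HTTP $(curl -s -o /dev/null -w '%{http_code}' http://localhost:3000)"
echo "Ollama:   $(curl -s http://localhost:11434/api/tags | head -c 120)"
```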
## Troubleshooting

- **Problem:** `python: command not found`. **Solution:** Install Python and add it to PATH.
- **Problem:** Permission denied when installing packages. **Solution:** Use a virtual environment (recommended) or `pip install --user`.
- **Problem:** `npm ERR! ERESOLVE`. **Solution:** Run `npm install --legacy-peer-deps`.
- **Problem:** Port 3000 already in use. **Solution:** Run `npm run dev -- -p 3001` (use a different port).
- **Problem:** `ollama: command not found`. **Solution:** Restart your terminal after installation.
- **Problem:** Out of memory during model inference. **Solution:** Use a smaller model (phi, tinyllama).
- **Problem:** Slow responses. **Solution:** Enable GPU acceleration or reduce context size.
For more issues, see Troubleshooting.
## Next Steps

- Configure your installation: Configuration
- Learn the architecture: Architecture
- Start using CodeScope: User Guide
Installation complete! 🎉