
Yiğit ERDOĞAN edited this page Jan 11, 2026 · 1 revision

⚙️ Configuration Guide

CodeScope is highly configurable through environment variables and settings files.

Table of Contents

  1. Environment Variables
  2. Backend Configuration
  3. Frontend Configuration
  4. LLM Model Selection
  5. Ingestion Settings

Environment Variables

The most common way to configure CodeScope is via a .env file in the backend/ directory.

Global Settings

| Variable | Description | Default |
| --- | --- | --- |
| PROJECT_NAME | Name of the application | CodeScope |
| VERSION | Application version | 0.1.0 |
| LOG_LEVEL | Logging verbosity | INFO |

Ollama Connection

| Variable | Description | Default |
| --- | --- | --- |
| OLLAMA_BASE_URL | URL where Ollama is running | http://localhost:11434 |
| OLLAMA_MODEL | Default LLM model to use | llama3 |
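Putting these together, a `backend/.env` might look like the following sketch. The values shown are illustrative, not required; any variable you omit falls back to the default listed in the tables above.

```shell
# backend/.env — illustrative values only; omitted variables use their defaults
PROJECT_NAME=CodeScope
VERSION=0.1.0
LOG_LEVEL=DEBUG
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3
```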

LLM Model Selection

CodeScope's performance depends heavily on the selected model.

| Model | Size | Best For | RAM Required |
| --- | --- | --- | --- |
| Llama 3 | 8B | General reasoning | 8GB |
| CodeLlama | 7B | Coding tasks | 8GB |

To switch models in the backend:

export OLLAMA_MODEL=codellama
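The settings above can be read from the environment at startup. Below is a minimal sketch of how a Python backend might do this; `load_settings` and the field names are illustrative, not CodeScope's actual loader, but the variable names and defaults match the tables above.

```python
import os

def load_settings() -> dict:
    """Read configuration from environment variables, falling back to
    the documented defaults. Hypothetical helper for illustration."""
    return {
        "project_name": os.getenv("PROJECT_NAME", "CodeScope"),
        "version": os.getenv("VERSION", "0.1.0"),
        "log_level": os.getenv("LOG_LEVEL", "INFO"),
        "ollama_base_url": os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
        "ollama_model": os.getenv("OLLAMA_MODEL", "llama3"),
    }

settings = load_settings()
```

After `export OLLAMA_MODEL=codellama`, `settings["ollama_model"]` would resolve to `codellama`; with nothing set, the defaults apply.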

Next Steps
