# Configuration
CodeScope is highly configurable through environment variables and settings files.
- Environment Variables
- Backend Configuration
- Frontend Configuration
- LLM Model Selection
- Ingestion Settings
## Environment Variables

The most common way to configure CodeScope is via a .env file in the backend/ directory.
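For example, a minimal backend/.env might look like this (the values are illustrative; each variable is documented in the tables below):

```env
# Application metadata
PROJECT_NAME=CodeScope
VERSION=0.1.0
LOG_LEVEL=DEBUG

# Ollama / LLM settings
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3
```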
## Backend Configuration

| Variable | Description | Default |
|---|---|---|
| PROJECT_NAME | Name of the application | CodeScope |
| VERSION | Application version | 0.1.0 |
| LOG_LEVEL | Logging verbosity | INFO |
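If the backend follows the common FastAPI pattern of loading these variables with pydantic-settings (an assumption — this page does not show the actual source), the settings class would look roughly like the sketch below. The defaults mirror the tables on this page; everything beyond the variable names themselves is unconfirmed.

```python
# Sketch only: assumes pydantic-settings reads backend/.env.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Defaults mirror the tables on this page; environment
    # variables (or .env entries) override them.
    PROJECT_NAME: str = "CodeScope"
    VERSION: str = "0.1.0"
    LOG_LEVEL: str = "INFO"
    OLLAMA_BASE_URL: str = "http://localhost:11434"
    OLLAMA_MODEL: str = "llama3"


settings = Settings()
```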
## LLM Model Selection

| Variable | Description | Default |
|---|---|---|
| OLLAMA_BASE_URL | URL where Ollama is running | http://localhost:11434 |
| OLLAMA_MODEL | Default LLM model to use | llama3 |
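A quick way to confirm that CodeScope can reach Ollama is to query Ollama's /api/tags endpoint, which lists the models available locally. The sketch below uses the requests library; the endpoint is standard Ollama, but the check itself is not part of CodeScope:

```python
# Connectivity check against a running Ollama instance.
import os

import requests

base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
resp = requests.get(f"{base_url}/api/tags", timeout=5)
resp.raise_for_status()

# /api/tags returns {"models": [{"name": "llama3:latest", ...}, ...]}
names = [m["name"] for m in resp.json().get("models", [])]
print("Available models:", names)
```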
CodeScope's performance depends heavily on the selected model.
| Model | Size | Best For | RAM Required |
|---|---|---|---|
| Llama 3 | 8B | General reasoning | 8GB |
| CodeLlama | 7B | Coding tasks | 8GB |
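A model must be downloaded before Ollama can serve it. Assuming the standard Ollama CLI, pull the one you want first:

```bash
# Download the model locally so Ollama can serve it
ollama pull codellama
```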
To switch models in the backend:

```bash
export OLLAMA_MODEL=codellama
```

## Next Steps

- Understand the system: Architecture
- Start using features: User Guide