Add Ollama provider for self-hosted LLM inference #38
Conversation
Adds LLM_PROVIDER=ollama for fully local, zero-cost inference via Ollama's OpenAI-compatible API. No API key required. Configurable base URL via OLLAMA_BASE_URL env var.
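For context, a minimal sketch of what such a provider over Ollama's OpenAI-compatible endpoint can look like. Names such as `createOllamaProvider`, `buildRequestBody`, and `chat` are illustrative, not this PR's actual exports:

```javascript
// Minimal sketch of an Ollama provider over the OpenAI-compatible API.
// Function and option names here are illustrative, not the PR's real exports.

const DEFAULT_BASE_URL = "http://localhost:11434";
const DEFAULT_MODEL = "llama3.1:8b";

// Build an OpenAI-compatible request body for /v1/chat/completions.
function buildRequestBody(messages, model = DEFAULT_MODEL) {
  return { model, messages, stream: false };
}

// Extract the assistant text from an OpenAI-compatible response.
function parseResponse(json) {
  return json.choices?.[0]?.message?.content ?? "";
}

function createOllamaProvider({
  baseUrl = process.env.OLLAMA_BASE_URL || DEFAULT_BASE_URL,
  model = process.env.LLM_MODEL || DEFAULT_MODEL,
} = {}) {
  return {
    // messages: [{ role: "system" | "user" | "assistant", content: string }]
    async chat(messages) {
      const res = await fetch(`${baseUrl}/v1/chat/completions`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(buildRequestBody(messages, model)),
      });
      if (!res.ok) throw new Error(`Ollama request failed: HTTP ${res.status}`);
      return parseResponse(await res.json());
    },
  };
}
```

Because the endpoint mirrors OpenAI's chat schema, no SDK is required; native `fetch` (Node 18+) is enough.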
I haven't been able to get to this yet because work has been busy, but I definitely plan to review it over the weekend.
Include both Mistral and Ollama providers in factory, config, and env docs.
calesthio left a comment
Approved. The Ollama provider wiring looks sound and the main regression checks passed on an isolated review worktree. One non-blocking nit: test/llm-ollama-integration.test.mjs currently only checks whether the Ollama server is reachable, so it hard-fails if the default model isn't installed locally. It would be better to skip unless the requested model is actually present in /api/tags, since the PR description says the integration test should auto-skip when the local setup isn't ready.
Check /api/tags for the requested model before running, instead of only checking server reachability. Provides a descriptive skip reason listing available models.
Thanks for the review! Good catch on the integration test — I've pushed a fix (ca8f76c) that now checks /api/tags for the requested model before running, not just server reachability.
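The fix amounts to a pre-flight check against Ollama's `/api/tags` endpoint, which lists installed models. A rough sketch of the idea (helper names like `modelAvailable` and `checkOllamaModel` are hypothetical, not the PR's code):

```javascript
// Pre-flight check: query /api/tags and skip the integration test unless the
// requested model is installed. Helper names are illustrative only.

// /api/tags responds with { models: [{ name: "llama3.1:8b", ... }, ...] }
function modelAvailable(tagsJson, model) {
  const names = (tagsJson.models ?? []).map((m) => m.name);
  return { found: names.includes(model), names };
}

// Returns { skip, reason? } so the test file can pass reason to a skip call.
async function checkOllamaModel(baseUrl, model) {
  let json;
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    if (!res.ok) return { skip: true, reason: `Ollama returned HTTP ${res.status}` };
    json = await res.json();
  } catch {
    return { skip: true, reason: `Ollama not reachable at ${baseUrl}` };
  }
  const { found, names } = modelAvailable(json, model);
  return found
    ? { skip: false }
    : {
        skip: true,
        reason: `model "${model}" not installed; available: ${names.join(", ") || "(none)"}`,
      };
}
```

Skipping with a reason that lists the installed models makes a missing-model skip easy to distinguish from an unreachable server.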
Summary
- `LLM_PROVIDER=ollama` for fully local, zero-cost LLM inference via Ollama
- Uses the OpenAI-compatible `/v1/chat/completions` endpoint — no SDK, just native `fetch`
- `OLLAMA_BASE_URL` (defaults to `http://localhost:11434`)
- Default model `llama3.1:8b` (user-configurable via `LLM_MODEL`)

Changes
- `lib/llm/ollama.mjs` — new provider following existing pattern
- `lib/llm/index.mjs` — factory case + export
- `crucix.config.mjs` — `OLLAMA_BASE_URL` env var passthrough
- `.env.example` — updated docs with ollama option
- `test/llm-ollama.test.mjs` — 11 unit tests (defaults, request format, response parsing, errors, factory)
- `test/llm-ollama-integration.test.mjs` — integration test (auto-skips if Ollama unavailable)

Closes #30
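As a rough illustration of the factory case mentioned above (the real `lib/llm/index.mjs` wiring, naming, and what each branch constructs may differ):

```javascript
// Illustrative factory dispatch on LLM_PROVIDER; the actual lib/llm/index.mjs
// likely differs. The returned stubs stand in for real provider instances.
function createProvider(provider = process.env.LLM_PROVIDER) {
  switch (provider) {
    case "ollama":
      return { name: "ollama" }; // would construct the new Ollama provider
    case "mistral":
      return { name: "mistral" }; // existing provider path
    default:
      throw new Error(`Unknown LLM_PROVIDER: ${provider}`);
  }
}
```

Throwing on an unknown value surfaces a misconfigured `LLM_PROVIDER` at startup rather than at first request.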
Test plan
- `node --test test/llm-ollama.test.mjs` (11/11)
- `OLLAMA_MODEL=llama3.1:8b node --test test/llm-ollama-integration.test.mjs`
- `LLM_PROVIDER=ollama` generates trade ideas