fix: auto-install node-llama-cpp for local embeddings (memory_search)
OpenClaw's default memorySearch.provider is 'local', which requires
node-llama-cpp. This optional peer dependency was never auto-installed,
causing memory_search to silently fail for every EverClaw user.
Changes:
- diagnose.sh: Add check A10 (node-llama-cpp detection)
- setup.mjs: Add Stage 5 (auto-install with --skip-embeddings flag)
- install.sh: Add node-llama-cpp check on every install/upgrade
- Dockerfile: Install node-llama-cpp@3.18.1 in production image
- CHANGELOG.md, SKILL.md, README.md: Documentation updates
Detection uses NODE_PATH + require.resolve (CJS) since ESM import()
ignores NODE_PATH. All installs are non-blocking with graceful fallback.
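The detection step described above can be sketched as a small shell helper (a minimal sketch; the function name is an assumption, not EverClaw's actual code):

```shell
#!/bin/sh
# Sketch of the detection approach: resolve the module with CJS
# require.resolve(), pointing NODE_PATH at the global npm root.
# ESM import() ignores NODE_PATH, so require.resolve() is used instead.
detect_global_module() {
  NODE_PATH="$(npm root -g 2>/dev/null)" \
    node -e 'require.resolve(process.argv[1])' "$1" 2>/dev/null
}

if detect_global_module node-llama-cpp; then
  echo "node-llama-cpp: found"
else
  echo "node-llama-cpp: missing (local memory_search will fail)"
fi
```

The exit status of `node -e` carries the result: `require.resolve` throws (non-zero exit) when the module cannot be found.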
CHANGELOG.md — 17 additions & 0 deletions
```diff
@@ -2,6 +2,23 @@

 All notable changes to EverClaw are documented here.

+## [UNRELEASED] — Local Embeddings Fix (node-llama-cpp)
+
+### Fixed
+
+- **Silent `memory_search` failure on all fresh installs** — OpenClaw's default `memorySearch.provider` is `"local"`, which requires `node-llama-cpp`. This optional peer dependency was never auto-installed, causing `memory_search` to silently fail for every EverClaw user. Now detected and installed automatically across all install paths.
+
+### Added
+
+- **`diagnose.sh` check A10** — New diagnostic check detects whether `node-llama-cpp` is installed and reports clear fix instructions if missing.
+- **`--skip-embeddings` flag** — New `setup.mjs` flag to skip the node-llama-cpp install (for headless/minimal installs).
+- **`install.sh` node-llama-cpp check** — Runs on every install/upgrade, catching existing users who never re-run setup.
+- **Dockerfile node-llama-cpp install** — Production Docker image now includes `node-llama-cpp@3.18.1` for container users.
+
+### Technical Notes
+
+- Detection uses `NODE_PATH="$(npm root -g)"` with `require.resolve()` (CJS). ESM `import()` ignores `NODE_PATH` in Node.js — `require.resolve` is the reliable cross-platform approach for global module detection.
+- All installs are non-blocking: failure gracefully degrades to remote embedding providers or no memory search.
+- Version pinned to `node-llama-cpp@3.18.1` (latest stable, matches OpenClaw's peer dependency).
```
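The non-blocking behaviour noted above can be sketched as a wrapper that swallows install failures (a hypothetical helper; EverClaw's actual scripts may be structured differently):

```shell
#!/bin/sh
# Non-blocking install wrapper: run the given install command, but
# degrade gracefully on failure instead of aborting the installer,
# since memory_search can fall back to remote embedding providers.
try_install() {
  if "$@" >/dev/null 2>&1; then
    echo "node-llama-cpp: installed"
  else
    echo "node-llama-cpp: install failed; falling back to remote embeddings" >&2
  fi
  return 0  # never propagate failure to the caller
}
```

An installer would then invoke something like `try_install npm install -g node-llama-cpp@3.18.1`, and an install failure would warn rather than abort.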