
memory.backend = "builtin" with provider = "local" fails on NixOS — node-llama-cpp can't load or build #67

@kimptoc

Description

Summary (provided by Claude)

The builtin memory backend with local embeddings (provider = "local") is broken on NixOS.
node-llama-cpp fails both to load prebuilt binaries and to build from source.

Environment

Config

{
  "memory": { "backend": "builtin" },
  "agents": {
    "defaults": {
      "memorySearch": {
        "enabled": true,
        "provider": "local"
      }
    }
  }
}

Observed behavior

openclaw memory status shows the provider resolved correctly:

Provider: local (requested: local)
Model: hf:ggml-org/embeddinggemma-300m-qat-q8_0-GGUF/embeddinggemma-300m-qat-Q8_0.gguf
Vector: ready
Indexed: 0/2 files · 0 chunks

But openclaw memory index fails:

[node-llama-cpp] The prebuilt binaries cannot be used in this Linux distro, as glibc is not
detected
[node-llama-cpp] Failed to build llama.cpp with no GPU support. Error: [Error: ENOENT: no such file
or directory, mkdir '/nix/store/.../node-llama-cpp/llama/localBuilds']
Memory index failed (main): ENOENT: no such file or directory, mkdir
'/nix/store/.../node-llama-cpp/llama/localBuilds'

Root cause

Two issues:

  1. glibc detection fails — node-llama-cpp's prebuilt binaries check for glibc in standard FHS paths, but
    NixOS keeps it under /nix/store/. The prebuilt shared objects likely also need patchelf --set-rpath
    (and --set-interpreter, for any bundled executables) to find the NixOS glibc/libstdc++.
  2. Nix store is read-only — The fallback "build from source" path tries to mkdir inside
    /nix/store/... which is immutable. The build should happen at derivation time, not at runtime.

Expected fix

The Nix derivation for openclaw-gateway should either:

  • Patch the prebuilt node-llama-cpp binaries at build time using patchelf (similar to how other Nix
    packages handle prebuilt native Node addons), or
  • Pre-build llama.cpp during the derivation and place the resulting .so in the correct location

sqlite-vec already works correctly (its .so is at the expected path), so the pattern exists for
native Node addons in this package.
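A minimal sketch of the patchelf approach, using nixpkgs' autoPatchelfHook (the hook and attribute names are standard nixpkgs conventions, but the node_modules layout inside openclaw-gateway is an assumption — the actual addon paths may differ):

```nix
# Sketch only, inside the openclaw-gateway derivation.
{
  # autoPatchelfHook rewrites the rpath of every bundled ELF file at build time.
  nativeBuildInputs = [ autoPatchelfHook ];
  # Provides libstdc++ so the node-llama-cpp addon can resolve it.
  buildInputs = [ stdenv.cc.cc.lib ];

  # Alternatively, patch the prebuilt addons explicitly (glob is hypothetical):
  postFixup = ''
    find $out -name "*.node" -o -name "*.so" | while read -r f; do
      patchelf --set-rpath "${lib.makeLibraryPath [ stdenv.cc.cc.lib ]}" "$f" || true
    done
  '';
}
```

Either variant runs at derivation build time, which also sidesteps the second issue: nothing needs to mkdir inside the read-only store at runtime.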

Workaround

Currently none for provider = "local". The openai/gemini/voyage providers work if you have the
appropriate API keys, but they require external API access and incur usage costs.
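As a stopgap, switching the provider in the same config keeps builtin memory usable at the cost of an external API (openai shown below; gemini or voyage should work the same way, per the note above — API key setup is not shown here):

```json
{
  "memory": { "backend": "builtin" },
  "agents": {
    "defaults": {
      "memorySearch": {
        "enabled": true,
        "provider": "openai"
      }
    }
  }
}
```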
