Hardware

Software

  • LM Studio – run AI models locally and privately

OpenClaw

Install OpenClaw

  • Add a new openclaw user (useradd -m already creates and owns the home directory, so only the permissions need tightening): useradd -m -u 987 -s /bin/bash openclaw && chmod 700 /home/openclaw
  • Make sure that gateway.auth.token in /home/openclaw/.openclaw/openclaw.json matches the one in docker-compose.yml (openclaw-gateway.environment.OPENCLAW_GATEWAY_TOKEN).
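As a quick sanity check, the two tokens can be compared with a small helper (a sketch only: it uses plain sed extraction rather than real JSON/YAML parsing, and assumes the token appears as shown in this page's snippets):

```shell
# Sketch: compare the gateway token in openclaw.json with the one in
# docker-compose.yml. Naive sed extraction, not a real JSON/YAML parser.
check_tokens() {
  json_token=$(sed -n 's/.*"token"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' "$1" | head -n1)
  yml_token=$(sed -n 's/.*OPENCLAW_GATEWAY_TOKEN[:=][[:space:]]*//p' "$2" | head -n1)
  if [ "$json_token" = "$yml_token" ]; then
    echo "tokens match"
  else
    echo "token mismatch"
  fi
}

# usage:
# check_tokens /home/openclaw/.openclaw/openclaw.json docker-compose.yml
```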
  • Start/stop the container: docker compose up -d openclaw-gateway / docker compose down
  • To approve a device pairing request (coming from the OpenClaw UI):
    # docker compose exec openclaw-gateway bash
    
    $ /app/openclaw.mjs devices list
    Direct scope access failed; using local fallback.
    Pending (1)
    ┌──────────────────────────────────────┬───────────────────────────────────────────────────┬──────────┬───────────────┬──────────┬────────┐
    │ Request                              │ Device                                            │ Role     │ IP            │ Age      │ Flags  │
    ├──────────────────────────────────────┼───────────────────────────────────────────────────┼──────────┼───────────────┼──────────┼────────┤
    │ 4ec7a9d0-f5f2-4316-9c07-f9e4fa96620a │ f74fdd392ee3bca006e344a86554ef76b473d7e2bd8502fde │ operator │               │ just now │        │
    └──────────────────────────────────────┴───────────────────────────────────────────────────┴──────────┴───────────────┴──────────┴────────┘
    
    $ /app/openclaw.mjs devices approve 4ec7a9d0-f5f2-4316-9c07-f9e4fa96620a
    Direct scope access failed; using local fallback.
    Approved f74fdd392ee3bca006e344a86554ef76b473d7e2bd8502fde (4ec7a9d0-f5f2-4316-9c07-f9e4fa96620a)
  • If you get the error “Control UI requires gateway.controlUi.allowedOrigins (set explicit origins), or set gateway.controlUi.dangerouslyAllowHostHeaderOriginFallback=true to use Host-header origin fallback mode”, add the following to /home/openclaw/.openclaw/openclaw.json:
    "gateway": {
      "controlUi": {
        "allowedOrigins": [
          "http://localhost"
        ]
      }
    }

and force that origin in the Apache configuration:

RequestHeader set Origin "http://localhost:18789"
  • To work around the OpenClaw error “Proxy headers detected from untrusted address. Connection will not be treated as local.” add the following to the Apache configuration:
    ProxyAddHeaders Off
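Putting the two Apache directives together, the relevant part of the vhost might look like this (a sketch; the ServerName and the proxy target on port 18789 are assumptions based on the Origin header above):

```apache
<VirtualHost *:80>
    ServerName localhost
    # Force the Origin the Control UI expects
    RequestHeader set Origin "http://localhost:18789"
    # Do not forward X-Forwarded-* headers to the gateway
    ProxyAddHeaders Off
    ProxyPass        / http://127.0.0.1:18789/
    ProxyPassReverse / http://127.0.0.1:18789/
</VirtualHost>
```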
  • To add extras to the original Docker image:
    # cat - > Dockerfile
    FROM alpine/openclaw:2026.4.2
    
    USER root
    
    RUN apt-get update -q \
     && apt-get install -y -q --no-install-recommends \
      python3-bs4 \
      python3-dateutil \
      python3-lxml \
      python3-requests \
      python3-yaml \
      && rm -rf /var/lib/apt/lists/* /var/cache/apt/archives/*
    
    USER openclaw
    
    # docker build -t openclaw-custom:2026.4.2 .

    then use openclaw-custom:2026.4.2 as the image in docker-compose.yml.
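That swap is a one-line change in the compose file (a sketch; only the image key changes, the rest of the service definition stays as it is):

```yaml
services:
  openclaw-gateway:
    image: openclaw-custom:2026.4.2
```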

Install a local Llama model

  • Add to docker-compose.yml:
    services:
      ollama:
        image: ollama/ollama:latest
        user: "987:1001"
        network_mode: host
        volumes:
          - /home/openclaw/.ollama:/.ollama
  • Pull the LLM model (~3 GB) and run it:
    # docker compose up -d ollama
    # docker compose exec ollama ollama pull llama3.1:8b-instruct-q4_K_M
    # docker compose exec ollama ollama run llama3.1:8b-instruct-q4_K_M
  • Configure OpenClaw to use it, so that the configuration in .openclaw/openclaw.json looks like this:
    "models": {
      "providers": {
        "vllm": {
          "baseUrl": "http://127.0.0.1:11434/v1",
          "apiKey": "dummy-api-key",
          "api": "openai-completions",
          "models": [
            {
              "id": "ollama/llama3.1:8b-instruct-q4_K_M",
              "name": "ollama/llama3.1:8b-instruct-q4_K_M",
              "reasoning": false,
          ...
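To verify the endpoint answers before wiring it into OpenClaw, a minimal OpenAI-style chat request can be sent with curl (a sketch; the helper below only assembles the JSON body and does no quote escaping, and the model id is the one pulled above):

```shell
# Sketch: build a minimal OpenAI-compatible chat-completions payload.
# No escaping of quotes in the arguments, so keep the prompt simple.
chat_payload() {
  printf '{"model":"%s","messages":[{"role":"user","content":"%s"}]}' "$1" "$2"
}

# usage (against the Ollama endpoint configured above):
# curl -s http://127.0.0.1:11434/v1/chat/completions \
#   -H 'Content-Type: application/json' \
#   -d "$(chat_payload 'llama3.1:8b-instruct-q4_K_M' 'Say hello')"
```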

Install browser support

  • Add extra directory mappings:
        volumes:
          - /home/openclaw/.openclaw:/home/node/.openclaw
          - /home/openclaw/.openclaw/.agent-browser:/home/node/.agent-browser
          - /home/openclaw/.openclaw/.cache:/home/node/.cache
  • Run:
    ~/.openclaw$ ./bin/agent-browser install
  • Test:
    ~/.openclaw$ ./bin/agent-browser open google.com
  • To configure it, add the appropriate settings to .openclaw/openclaw.json.

Using OpenClaw

  • To re-run onboarding, run docker compose run --rm openclaw-cli onboard, or use the commands below.
  • To switch the model, run docker compose exec openclaw-gateway /app/openclaw.mjs config
  • To list all models, run
    # docker compose exec openclaw-gateway /app/openclaw.mjs models list
    
    Model                                      Input      Ctx      Local Auth  Tags
    openrouter/stepfun/step-3.5-flash:free     text       250k     no    yes   default,configured,alias:OpenRouter
    openai/gpt-5.1-codex                       text+image 391k     no    yes   fallback#1,configured,alias:GPT
  • When you get a pairing request from a bot:
    OpenClaw: access not configured.
    Your Telegram user id: 30528162
    Pairing code: Z5RQ92MX

    you need to run

    # docker compose exec openclaw-gateway /app/openclaw.mjs pairing approve telegram Z5RQ92MX
    Approved telegram sender 30528162.
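The approve command can be assembled straight from the bot's message (a sketch; it just pulls out whatever follows “Pairing code:”):

```shell
# Sketch: extract the pairing code from a bot message and print the
# matching approve command (same command as shown above).
pairing_cmd() {
  code=$(printf '%s\n' "$1" | sed -n 's/.*Pairing code:[[:space:]]*//p' | head -n1)
  echo "docker compose exec openclaw-gateway /app/openclaw.mjs pairing approve telegram $code"
}
```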
software/ai.txt · Last modified: 2026/04/05 16:32 (external edit)