
Docker

Docker is the simplest way to run AI Butler in production. The quickest path is running the image directly:

```sh
docker run -d \
  --name aibutler \
  -p 3377:3377 \
  -v aibutler-data:/data \
  -e AIBUTLER_ANTHROPIC_API_KEY=sk-ant-... \
  ghcr.io/lumabyteco/aibutler:latest
```

Open http://localhost:3377 — that’s it.

The repo ships with three compose files. Pick the one that matches your setup:

docker-compose.yml — AI Butler Only

Bring your own model (Claude, GPT, Gemini) via API key. Smallest footprint.

```sh
docker compose up -d
```

docker-compose.ollama.yml — AI Butler + Ollama

Ollama is bundled alongside AI Butler: no API keys, no network calls to external AI providers. Great for privacy-first setups.

```sh
docker compose -f docker-compose.ollama.yml up -d
```

First run pulls a default model (Llama 3.3 or Qwen). Change it with the OLLAMA_MODEL env var.
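Assuming the compose file interpolates `OLLAMA_MODEL` (as the note above suggests), the simplest way to make the override persistent is a `.env` file next to the compose file; the model tag below is illustrative, not a documented default:

```sh
# .env (next to docker-compose.ollama.yml; model tag is illustrative)
OLLAMA_MODEL=qwen2.5:7b
```

Compose reads `.env` automatically, so subsequent `docker compose -f docker-compose.ollama.yml up -d` runs pick up the new model without exporting anything in your shell.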

docker-compose.full.yml — Full Smart Home Stack


AI Butler + Ollama + Home Assistant + Mosquitto (MQTT) + Zigbee2MQTT — everything you need for a self-hosted smart home with a voice assistant.

```sh
docker compose -f docker-compose.full.yml up -d
```

All state lives in /data inside the container:

  • /data/aibutler.db — SQLite database (memory, sessions, plugins, etc.)
  • /data/plugins/ — installed WASM plugins
  • /data/backups/ — incremental backups
  • /data/vault.json — encrypted secret vault

Mount a named volume or a host path:

```yaml
services:
  aibutler:
    image: ghcr.io/lumabyteco/aibutler:latest
    volumes:
      - aibutler-data:/data
      # or: - /srv/aibutler:/data

volumes:
  aibutler-data:
```
| Variable | Purpose |
| --- | --- |
| `AIBUTLER_CONFIG` | Path to config file (default `/data/config.yaml`) |
| `AIBUTLER_DATA_DIR` | Override data directory (default `/data`) |
| `AIBUTLER_ANTHROPIC_API_KEY` | Anthropic API key (can also use vault) |
| `AIBUTLER_OPENAI_API_KEY` | OpenAI API key |
| `AIBUTLER_OLLAMA_BASE_URL` | Ollama endpoint (e.g. `http://ollama:11434`) |
| `AIBUTLER_WEBCHAT_PORT` | Web UI port (default 3377) |
| `AIBUTLER_LOG_LEVEL` | `debug`, `info`, `warn`, or `error` |

Any config-file value can also be set via the environment: prefix the key path with AIBUTLER_ and join nested keys with underscores.
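For example, a nested config key like `webchat.port` (the key name is an assumption for illustration; `AIBUTLER_WEBCHAT_PORT` itself appears in the table above) maps to its environment variable by uppercasing and swapping dots for underscores. The rule can be sketched in shell:

```sh
# Derive the env-var name for a config key path:
# prefix AIBUTLER_, dots -> underscores, then uppercase
key="webchat.port"   # hypothetical config key, for illustration
env_var="AIBUTLER_$(echo "$key" | tr '.' '_' | tr '[:lower:]' '[:upper:]')"
echo "$env_var"      # AIBUTLER_WEBCHAT_PORT
```

So `-e AIBUTLER_WEBCHAT_PORT=8080` on `docker run` would override that value without editing the config file.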

| Tag | Notes |
| --- | --- |
| `latest` | Most recent stable release |
| `vX.Y.Z` | Specific version |
| `vX.Y` | Latest patch of a minor version |
| `edge` | Built from `main` branch (pre-release) |

Multi-arch images support linux/amd64, linux/arm64, and linux/arm/v7 (Raspberry Pi 3+).

```sh
docker compose pull
docker compose up -d
```

Data in the named volume persists across image updates. Schema migrations run automatically on startup.

```sh
# Hot backup via built-in command
docker exec aibutler aibutler backup create /data/backups/manual.db

# Cold backup
docker compose stop
tar -czf aibutler-backup-$(date +%Y%m%d).tar.gz /var/lib/docker/volumes/aibutler-data
docker compose start
```
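Restoring a cold backup is the reverse: with the container stopped, unpack the archive back over the volume directory, then start again. A docker-free sketch of the same tar roundtrip on a scratch directory (paths are stand-ins for `/var/lib/docker/volumes/aibutler-data`):

```sh
# Scratch directory standing in for the real volume path
tmp=$(mktemp -d)
mkdir -p "$tmp/aibutler-data"
echo "state" > "$tmp/aibutler-data/aibutler.db"

# Back up with the same flags as the cold backup above
tar -czf "$tmp/backup.tar.gz" -C "$tmp" aibutler-data

# Simulate data loss, then restore
rm -rf "$tmp/aibutler-data"
tar -xzf "$tmp/backup.tar.gz" -C "$tmp"
cat "$tmp/aibutler-data/aibutler.db"   # prints: state
```

For the real volume, run the restore as root while the stack is stopped, and only then `docker compose start`.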

Typical resource footprint:

| Workload | RAM | CPU |
| --- | --- | --- |
| Idle | ~60 MB | <1% |
| Active chat | ~120 MB | 2–5% |
| With Ollama (7B model) | ~6 GB | 40% during inference |
| With 12 channels active | ~180 MB | 3–8% |