# Docker
Docker is the simplest way to run AI Butler in production. Three compose files ship in the repo for different scenarios.
## Quick Start

```sh
docker run -d \
  --name aibutler \
  -p 3377:3377 \
  -v aibutler-data:/data \
  -e AIBUTLER_ANTHROPIC_API_KEY=sk-ant-... \
  ghcr.io/lumabyteco/aibutler:latest
```

Open http://localhost:3377 — that’s it.
## Docker Compose

The repo ships with three compose files. Pick the one that matches your setup:
### docker-compose.yml — Standalone

AI Butler only. Bring your own model (Claude, GPT, Gemini) via API key. Smallest footprint.
```sh
docker compose up -d
```

### docker-compose.ollama.yml — Fully Local

AI Butler + Ollama bundled. No API keys, no network calls to external AI providers. Great for privacy-first setups.

```sh
docker compose -f docker-compose.ollama.yml up -d
```

First run pulls a default model (Llama 3.3 or Qwen). Change it with the `OLLAMA_MODEL` env var.
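For example, a compose override could pin the model. This is only a sketch — the model tag, the override-file approach, and which service reads `OLLAMA_MODEL` are assumptions, not taken from the shipped file:

```yaml
# docker-compose.override.yml — illustrative only
services:
  aibutler:
    environment:
      # Assumed: AI Butler reads OLLAMA_MODEL and asks Ollama to pull this tag
      - OLLAMA_MODEL=qwen2.5:7b
```

Compose merges an override file automatically when it sits next to the main compose file.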
### docker-compose.full.yml — Full Smart Home Stack

AI Butler + Ollama + Home Assistant + Mosquitto (MQTT) + Zigbee2MQTT — everything you need for a self-hosted smart home with a voice assistant.

```sh
docker compose -f docker-compose.full.yml up -d
```

## Persistent Data
Section titled “Persistent Data”All state lives in /data inside the container:
- `/data/aibutler.db` — SQLite database (memory, sessions, plugins, etc.)
- `/data/plugins/` — installed WASM plugins
- `/data/backups/` — incremental backups
- `/data/vault.json` — encrypted secret vault
Mount a named volume or a host path:
```yaml
services:
  aibutler:
    image: ghcr.io/lumabyteco/aibutler:latest
    volumes:
      - aibutler-data:/data
      # or a host path:
      # - /srv/aibutler:/data

volumes:
  aibutler-data:
```

## Environment Variables
Section titled “Environment Variables”| Variable | Purpose |
|---|---|
AIBUTLER_CONFIG | Path to config file (default /data/config.yaml) |
AIBUTLER_DATA_DIR | Override data directory (default /data) |
AIBUTLER_ANTHROPIC_API_KEY | Anthropic API key (can also use vault) |
AIBUTLER_OPENAI_API_KEY | OpenAI API key |
AIBUTLER_OLLAMA_BASE_URL | Ollama endpoint (e.g. http://ollama:11434) |
AIBUTLER_WEBCHAT_PORT | Web UI port (default 3377) |
AIBUTLER_LOG_LEVEL | debug, info, warn, error |
Any config-file value can also be set via an environment variable: take the `AIBUTLER_` prefix, then the config path with segments uppercased and joined by underscores.
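As an illustration of that mapping — the config key `webchat.port` is a plausible guess from the table above, not confirmed by the docs — uppercasing and joining the path yields the variable name:

```sh
# Hypothetical config key "webchat.port"; dots become underscores, letters uppercase
key="webchat.port"
echo "AIBUTLER_$(echo "$key" | tr '.a-z' '_A-Z')"
# → AIBUTLER_WEBCHAT_PORT
```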
## Image Tags

| Tag | Notes |
|---|---|
| `latest` | Most recent stable release |
| `vX.Y.Z` | Specific version |
| `vX.Y` | Latest patch of a minor version |
| `edge` | Built from the `main` branch — pre-release |
Multi-arch images support linux/amd64, linux/arm64, and linux/arm/v7 (Raspberry Pi 3+).
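For predictable upgrades, you can pin the image to one of the tags above in your compose file. The version number below is purely illustrative:

```yaml
services:
  aibutler:
    # Track the latest patch of a (hypothetical) v1.4 line instead of `latest`
    image: ghcr.io/lumabyteco/aibutler:v1.4
```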
## Updating

```sh
docker compose pull
docker compose up -d
```

Data in the named volume persists across image updates. Schema migrations run automatically on startup.
## Backup

```sh
# Hot backup via the built-in command
docker exec aibutler aibutler backup create /data/backups/manual.db

# Cold backup
docker compose stop
tar -czf aibutler-backup-$(date +%Y%m%d).tar.gz /var/lib/docker/volumes/aibutler-data
docker compose start
```
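Restoring a cold backup is the reverse of the steps above. A sketch, with an illustrative archive name: GNU tar strips the leading `/` when creating an archive, so extraction happens relative to the filesystem root.

```sh
docker compose stop
# Paths inside the archive are relative (leading "/" was stripped on create)
sudo tar -xzf aibutler-backup-20250101.tar.gz -C /
docker compose start
```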
## Resource Usage

Typical resource footprint:
| Workload | RAM | CPU |
|---|---|---|
| Idle | ~60 MB | <1% |
| Active chat | ~120 MB | 2–5% |
| With Ollama (7B model) | ~6 GB | 40% during inference |
| With 12 channels active | ~180 MB | 3–8% |
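Given those figures, the standalone container can be capped conservatively in compose. The limits below are illustrative, not upstream recommendations:

```yaml
services:
  aibutler:
    # Comfortable headroom over the ~180 MB active ceiling;
    # raise substantially (or drop the cap) if you bundle Ollama
    mem_limit: 512m
    cpus: "1.0"
```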