Frequently Asked Questions
The basics
What is AI Butler?
AI Butler is a self-hosted personal AI agent that runs on your own machine. It works across every channel you chat on (web, terminal, Telegram, Slack, Discord, WhatsApp, etc.), remembers everything you tell it across all those channels, connects to any AI model (Claude, GPT, Gemini, or fully local Ollama), and is extensible via WASM plugins. It’s a single Go binary with no external dependencies.
Think of it as the open-source alternative to commercial assistants like Claude Desktop, ChatGPT, or Microsoft Copilot — except your data stays on your machine and you own everything.
How is it different from ChatGPT, Claude.ai, or Gemini?
Those are all SaaS rentals. Your chat history, your memory, and (often) your data-for-training live on someone else’s servers. You can’t choose the model, you can’t self-host, and the moment you stop paying, it all disappears.
AI Butler is the opposite:
| Feature | AI Butler | ChatGPT / Claude.ai / Gemini |
|---|---|---|
| Self-hosted | ✅ | ❌ |
| Data on your machine | ✅ | ❌ |
| Choose your model | ✅ (Claude, GPT, Gemini, or local) | ❌ |
| Works offline | ✅ (with Ollama) | ❌ |
| Multi-channel (Telegram, Slack, etc.) | ✅ | ❌ |
| Knowledge graph + vector memory | ✅ | Limited / paid tiers only |
| Open source (Apache 2.0) | ✅ | ❌ |
| WASM plugin extensibility | ✅ | ❌ |
| Free | ✅ (you pay your model provider directly) | ❌ (monthly subscription) |
How is it different from LibreChat or Open WebUI?
Those are frontends for LLMs — great for chat, with file uploads and a model picker. AI Butler is a full agent runtime: memory with a knowledge graph + vector search, a scheduler, a multi-channel message router, an agent-to-agent protocol, a plugin sandbox, cost tracking, RBAC, the works.
If you want a nice chat UI for an LLM, use LibreChat or Open WebUI. If you want a personal assistant that runs across your life and remembers everything, use AI Butler. They can coexist.
How is it different from Aider, Continue, or Cline?
Those are coding assistants focused on code editing and pair programming. AI Butler has coding tools (file, shell, git, PR creation) but it’s much broader — personal assistant across many channels, long-term memory, scheduling, smart home, etc. And AI Butler can call Aider or Continue through a subprocess bridge, so they can work together.
Models and costs
Section titled “Models and costs”What AI models does AI Butler work with?
Production-ready:
- Anthropic Claude (direct API)
- Ollama (local, any model)
Beta (code ready, help us test):
- OpenAI GPT + Azure OpenAI
- Google Gemini
- xAI Grok
- LM Studio (local, any model)
- vLLM (self-hosted, any model)
- Groq (fast inference)
- DeepSeek
- Any other OpenAI-compatible endpoint
Can I run it fully offline?
Yes. Install Ollama on the same machine, pick a model like llama3.2 or qwen2.5, and AI Butler will auto-detect it on startup. Memory embeddings use a local model too (we default to nomic-embed-text via Ollama). Zero external API calls; fully air-gap capable.
How much does it cost to run?
AI Butler itself is free — Apache 2.0 licensed, no telemetry, no paywall. Your only cost is whatever your AI model provider charges:
- Claude Haiku: ~$0.001 per message — basically free
- Claude Sonnet: ~$0.01 per message
- GPT-4o: ~$0.02 per message
- Ollama / local: $0 (you pay your electric bill)
The built-in cost tracker shows you live spending per model in the web dashboard, and you can set a monthly budget with alerts.
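As a sanity check on the per-message figures above, here is the arithmetic as a quick sketch (the prices are approximate and real costs scale with token counts, so treat this as an estimate, not a quote):

```go
package main

import "fmt"

// monthlyCost estimates a month of spend from an approximate
// per-message price. Real costs depend on tokens per message.
func monthlyCost(perMessageUSD float64, messagesPerDay int) float64 {
	return perMessageUSD * float64(messagesPerDay) * 30
}

func main() {
	// 50 messages/day on Claude Sonnet at ~$0.01 per message:
	fmt.Printf("Sonnet: $%.2f/month\n", monthlyCost(0.01, 50)) // Sonnet: $15.00/month
	// The same volume on Claude Haiku at ~$0.001 per message:
	fmt.Printf("Haiku:  $%.2f/month\n", monthlyCost(0.001, 50)) // Haiku:  $1.50/month
}
```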
Does my data train anyone’s model?
No, not by default. AI Butler stores everything locally in SQLite. When it calls a cloud model (Claude, GPT, etc.), the model provider receives the request but — per their APIs’ data policies — API calls are not used for training unless you explicitly opt in with them. If you use Ollama locally, nothing leaves your machine at all.
Privacy and security
Section titled “Privacy and security”Does my data stay private?
Yes, when self-hosted. AI Butler:
- Stores everything in a local SQLite database on your machine
- Does not send telemetry unless you explicitly enable it (off by default)
- Only talks to the AI model provider you configure (or no one, if you use Ollama)
- Has no analytics, no phoning home, no hidden network calls
- Keeps its audit trail local too — you can see every tool call in the web dashboard
Is it secure?
AI Butler is built with a security-first architecture:
- 59 internal security audit passes, with 74 findings and 70+ fixed
- Capability engine with per-tool granular permissions
- RBAC with admin / user / viewer / agent roles
- OIDC SSO, FIDO2/WebAuthn, and TOTP 2FA (beta)
- SSRF protection blocks private/internal IP ranges
- WASM plugin sandbox (Extism) — plugins can’t touch filesystem or network without declared grants
- Shell command allowlisting with Linux `unshare` and macOS `sandbox-exec` isolation
- Webhook signature verification on every incoming channel message
- Rate limiting on all external-facing endpoints
- Zero known CVEs — `govulncheck` verified clean
An external third-party audit is planned for v1.0. Until then, all our audit passes are internal — we’re transparent about that.
How do I report a security vulnerability?
See SECURITY.md. Please do not open a public issue for security-sensitive reports.
Installation and deployment
Section titled “Installation and deployment”How do I install it?
Three commands:

```sh
git clone https://github.com/LumabyteCo/aibutler.git
cd aibutler && CGO_ENABLED=0 go build -o aibutler .
./aibutler vault set anthropic_api_key sk-ant-... && ./aibutler start
```

Then open http://localhost:3377. Full guide: Installation.
What platforms does it run on?
Any platform Go 1.26+ supports:
- Linux (x86_64, ARM64, ARMv7, RISC-V)
- macOS (Apple Silicon and Intel)
- Windows (x86_64, ARM64)
- Raspberry Pi (ARM64)
- FreeBSD, OpenBSD, NetBSD
- Docker and Kubernetes (Helm chart included)
Single Go binary, zero CGO, cross-compiles anywhere.
Can I run it on a Raspberry Pi?
Yes, and it’s a first-class use case. Low power, always-on, serves your whole household:

```sh
CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -o aibutler .
# Transfer to Pi and run
```

Can I run it in Docker or Kubernetes?
Yes — the repo ships a Dockerfile, three docker-compose variants, and a Helm chart. All three compose files build from the Dockerfile on first run:

```sh
# Clone, then from the repo root:
docker compose up -d

# Or with Ollama baked in (fully local):
docker compose -f docker-compose.ollama.yml up -d

# Or the full stack (AI Butler + Ollama + Home Assistant):
docker compose -f docker-compose.full.yml up -d

# Kubernetes
helm install aibutler deploy/helm/aibutler/
```

A pre-built image on GitHub Container Registry (ghcr.io/lumabyteco/aibutler) lands when we tag the first public release — tracked as part of the v0.1 launch checklist. Until then, the compose files build locally in one step (takes ~30s on a modern laptop).
Does it need a database server?
No. AI Butler uses embedded SQLite. No Postgres, no MySQL, no Redis, no Elasticsearch. Everything is in one .db file that you can back up with cp. The whole point is zero dependencies.
Can I use it on my phone?
Yes, in two ways:
- Through messaging channels — connect Telegram or WhatsApp and chat with AI Butler from your phone’s native messaging app. Same memory as your desktop web chat.
- Via the web UI — enable LAN mode and open `http://your-machine-ip:3377` in your phone’s browser. The UI is responsive and works great on mobile. (PWA install coming in v0.2.)
Features
How does its memory work?
AI Butler’s memory is built on three layers that work together:
- FTS5 full-text search — SQLite’s built-in BM25 keyword search across every saved note, conversation, and extracted fact
- Knowledge graph — entities (people, projects, places, decisions) and their relationships, stored in a relational table
- Vector embeddings — semantic similarity via an embedding model (OpenAI, Ollama, or any OpenAI-compatible provider)
A hybrid search combines all three using Reciprocal Rank Fusion (RRF), so you get the best of exact-match, graph traversal, and semantic similarity in one query. Ask it “what did I tell you about Sarah last month” and it’ll find the right thing even if you don’t remember the exact words.
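RRF itself is only a few lines of code: each retriever contributes 1/(k + rank) per document, and the sums are sorted. A minimal Go sketch (the constant k = 60 comes from the original RRF paper; AI Butler’s internals may differ):

```go
package main

import (
	"fmt"
	"sort"
)

// rrf merges several ranked result lists into one. Each document's
// score is the sum of 1/(k+rank) over every list it appears in, so
// documents ranked highly by multiple retrievers float to the top.
func rrf(k float64, rankings ...[]string) []string {
	scores := map[string]float64{}
	for _, ranking := range rankings {
		for rank, doc := range ranking {
			scores[doc] += 1.0 / (k + float64(rank+1))
		}
	}
	docs := make([]string, 0, len(scores))
	for doc := range scores {
		docs = append(docs, doc)
	}
	sort.Slice(docs, func(i, j int) bool { return scores[docs[i]] > scores[docs[j]] })
	return docs
}

func main() {
	// Hypothetical results from the three memory layers:
	keyword := []string{"note-sarah", "note-lunch", "note-project"}
	graph := []string{"note-project", "note-sarah"}
	vector := []string{"note-sarah", "note-project", "note-dog"}
	// "note-sarah" ranks well in all three lists, so it wins.
	fmt.Println(rrf(60, keyword, graph, vector)[0]) // note-sarah
}
```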
Can it control my smart home?
Beta for v0.1, full for v0.2. The IoT tool interface is complete — iot.sensor.read, iot.device.control, iot.safety.control, iot.device.list, iot.device.discover, all with PIN safety gating for destructive actions (locks, gas valves, water valves). The adapter that connects to Home Assistant is in final wiring for v0.2.
Today you can test the whole flow with the built-in stub adapter (it returns mock data). When the HA adapter ships, all your existing flows keep working — the tool surface doesn’t change.
Want to help finish the Home Assistant adapter? Open an issue — we’d love the help.
Can I schedule reminders and recurring tasks?
Yes, and this is a production-ready feature. Just talk to AI Butler:
You: every weekday at 8am, send me a briefing with weather, calendar, and overnight emails
Butler: Scheduled “morning-briefing” (cron: 0 8 * * 1-5) — delivered to webchat
The agent converts natural language to a cron expression (with an optional LLM fallback for edge cases) and persists the schedule in SQLite so it survives restarts.
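For reference, a five-field cron expression such as 0 8 * * 1-5 breaks down as minute, hour, day-of-month, month, day-of-week. A tiny stdlib-only sketch of that split (illustrative, not AI Butler’s actual parser):

```go
package main

import (
	"fmt"
	"strings"
)

// cronFields maps a standard five-field cron expression onto its
// field names: minute, hour, day-of-month, month, day-of-week.
func cronFields(expr string) (map[string]string, error) {
	parts := strings.Fields(expr)
	if len(parts) != 5 {
		return nil, fmt.Errorf("expected 5 fields, got %d", len(parts))
	}
	names := []string{"minute", "hour", "day-of-month", "month", "day-of-week"}
	fields := map[string]string{}
	for i, name := range names {
		fields[name] = parts[i]
	}
	return fields, nil
}

func main() {
	// "0 8 * * 1-5" = 08:00 on Monday through Friday.
	f, _ := cronFields("0 8 * * 1-5")
	fmt.Println(f["hour"], f["day-of-week"]) // 8 1-5
}
```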
Can it do voice input and output?
Partially, today. Fully, in v0.2.
- Voice input on messaging channels (Telegram, Discord, Slack, WhatsApp) — ready. Just send a voice message, it’s transcribed and processed.
- Voice input in the web chat — ready. Microphone button uploads audio through the browser MediaRecorder API.
- Voice output — Piper TTS (local, CPU-only) works in beta. ElevenLabs adapter is on the v0.2 roadmap.
- Voice TUI mode (`aibutler voice tui`) — not yet implemented. Coming in v0.2.
- Wake word (“Hey Butler”) — requires a native companion app (Porcupine/Picovoice). Not in v0.1.
What’s this “agent ecosystem hub” thing?
AI Butler implements three open protocols for agent interoperability:
- Google A2A v2 (Agent-to-Agent) — external agents can call AI Butler’s tools, and AI Butler can delegate to external agents. Full protocol compliance in beta.
- MCP (Model Context Protocol) — AI Butler is both a client (connects to external MCP servers like filesystem, memory, browser tools) and a server (exposes its own tools to MCP-compatible clients like Claude Desktop).
- Subprocess bridges — wrap any CLI tool (ffmpeg, Aider, Continue, custom scripts) as a first-class Butler tool with capability gating and safety.
This makes AI Butler a hub — it can coordinate work across every AI agent and tool you use, not just the ones built-in.
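Conceptually, a subprocess bridge is an allowlisted wrapper around os/exec. A minimal Go sketch (the allowlist contents and function names are invented for illustration, not the real bridge):

```go
package main

import (
	"fmt"
	"os/exec"
)

// allowed is the set of external binaries the bridge may invoke.
// Anything else is rejected before a process is ever spawned.
var allowed = map[string]bool{"echo": true, "ffmpeg": true}

// runBridged executes an allowlisted CLI tool and returns its output,
// mirroring how a bridge exposes a command as an agent tool.
func runBridged(name string, args ...string) (string, error) {
	if !allowed[name] {
		return "", fmt.Errorf("tool %q not in allowlist", name)
	}
	out, err := exec.Command(name, args...).Output()
	return string(out), err
}

func main() {
	out, _ := runBridged("echo", "hello from the bridge")
	fmt.Print(out) // hello from the bridge
	// A command outside the allowlist never runs at all.
	_, err := runBridged("rm", "-rf", "/")
	fmt.Println(err) // tool "rm" not in allowlist
}
```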
Can I write plugins?
Yes — in any language that compiles to WebAssembly. Plugins run in an Extism sandbox with zero filesystem/network access by default. You declare capabilities in a manifest, the host checks them, and the plugin can only do what you allowed.
The runtime is beta-ready today. Sample plugins (and a marketplace) coming in v0.2. Want to write the first community plugin? Start here.
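The declare-then-check flow can be sketched in a few lines (the manifest shape and capability strings here are hypothetical, purely to illustrate the model):

```go
package main

import "fmt"

// Manifest is a hypothetical plugin manifest: the capabilities a
// plugin declares up front. The host grants only what's declared.
type Manifest struct {
	Name         string
	Capabilities []string
}

// allowed checks a runtime request against the plugin's declared
// capabilities; undeclared actions are denied by default.
func allowed(m Manifest, capability string) bool {
	for _, c := range m.Capabilities {
		if c == capability {
			return true
		}
	}
	return false
}

func main() {
	weather := Manifest{
		Name:         "weather-plugin",
		Capabilities: []string{"http:api.weather.example"},
	}
	fmt.Println(allowed(weather, "http:api.weather.example")) // true
	fmt.Println(allowed(weather, "fs:read"))                  // false
}
```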
Contributing and community
Is this really open source?
Yes. Apache 2.0 licensed. Full source on GitHub. Includes an explicit patent grant and retaliation clause — appropriate for AI agents that may interact with patented APIs. You can use it commercially, fork it, modify it, redistribute it. No CLA required.
Who’s behind AI Butler?
AI Butler is developed by LumaByte Co and contributors. It’s a genuine open-source project, not a loss-leader for a SaaS. We want it to succeed as a community-owned tool.
Can I use it for commercial work?
Yes. Apache 2.0 is commercial-friendly. You can:
- Run AI Butler inside your company
- Build products that use AI Butler
- Offer AI Butler as a hosted service
- Sell commercial plugins
The only requirements are the standard Apache 2.0 conditions: preserve the copyright notice, note any changes, and include the license text.
How do I contribute?
We’re actively looking for community help. The highest-impact contributions right now:
- Test a beta channel (Telegram, Slack, Discord, WhatsApp…) with your own credentials
- Write a sample WASM plugin for the marketplace
- Wire up a real Home Assistant adapter (tool interface is ready)
- Verify A2A v2 interop with a third-party agent
- Translate the web UI to your language
- Build editor extensions (VS Code, JetBrains, Zed) against the dashboard API
- Record a demo video showing memory + scheduling + channels
- Fix a bug from the issues list
Full guide: CONTRIBUTING.md.
Is there a hosted version?
Not yet. A hosted playground (demo.aibutler.dev) is on the v0.1.1 roadmap — it’ll be a read-only demo instance with a rate-limited model key so people can try AI Butler without installing anything. Your AI Butler instance will always be self-hosted — we’re not building a SaaS.
Where can I ask questions?
- GitHub Discussions — general questions, ideas, show-and-tell
- GitHub Issues — bugs and specific feature requests
- Documentation — this site has guides for every feature
What about Windows / enterprise / compliance?
- Windows: builds and runs natively. Single Go binary, zero CGO, cross-compiles with one command.
- Enterprise: Apache 2.0 with patent grant, OIDC SSO, RBAC, capability audit trail, full configuration via YAML (GitOps-friendly).
- Compliance: audit log for every tool call, data classification tags on the roadmap, backup + encryption at rest supported.
If you’re evaluating AI Butler for a larger deployment, start a discussion — we’d love to hear your use case and help you get set up.