Production-ready local AI stack with systemd service management, Docker bridge networking, and desktop integration.
Quick Start • Installation • Architecture • Troubleshooting
Most guides say "docker run" and leave you with containers running 24/7. I wanted to run Open WebUI + Ollama on a laptop without draining the battery — with proper service management, not a hack.
Key features:
- Battery optimized — manual control saves ~25% daily battery
- systemd integration — proper lifecycle management, no background bloat
- Docker bridge networking — clean isolation, production-ready
- Desktop launchers — one-click start/stop from application menu
- GDPR compliant — all data processed locally, no external APIs
cd ~
git clone https://github.com/serg-markovich/openwebui-systemd-stack.git
cd openwebui-systemd-stack
# Configure Ollama (one-time setup)
sudo mkdir -p /etc/systemd/system/ollama.service.d/
echo -e '[Service]\nEnvironment="OLLAMA_HOST=0.0.0.0:11434"' | \
sudo tee /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl daemon-reload && sudo systemctl restart ollama
# Install systemd service and desktop launchers (portable)
make install
# This substitutes %%INSTALL_PATH%% with your actual project path
# Start
systemctl --user start openwebui

# Desktop launchers are installed automatically via:
make install

Full guide: Quick Start Documentation
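After the stack is up, it is worth confirming the two things that most often go wrong: that Ollama is reachable on the Docker bridge gateway (not just localhost) and that the user service is active. A quick sanity check, assuming the default `docker0` gateway address:

```shell
# Ollama must answer on the bridge gateway, or containers can't reach it.
# 172.17.0.1 is Docker's default bridge gateway; adjust if yours differs.
curl -s http://172.17.0.1:11434/api/tags

# Confirm the user service started cleanly
systemctl --user status openwebui --no-pager
```

If the `curl` call hangs or is refused, re-check the `OLLAMA_HOST=0.0.0.0:11434` override from the setup step above.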
- Quick Start Guide — get running in 5 minutes
- Installation Guide — detailed step-by-step setup
- Architecture — design decisions and trade-offs
- Troubleshooting — common issues and solutions
- Changelog — version history
| Component | Technology | Purpose |
|---|---|---|
| Container Runtime | Docker 24.0+ | Isolation, reproducibility |
| Service Manager | systemd 249+ | Lifecycle management |
| Orchestration | docker-compose v2 | Service definition |
| Web UI | Open WebUI | Chat interface |
| LLM Runtime | Ollama | Model serving |
| Desktop Integration | XDG Desktop Entry | GUI launchers |
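How the pieces connect can be sketched as a minimal `docker-compose.yml` (illustrative only, not the repo's actual file): Open WebUI runs in a container and reaches the host's Ollama through the Docker bridge gateway, while restarts are left to systemd rather than Docker.

```yaml
# Sketch of the wiring; the repo's real compose file may differ.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # 172.17.0.1 is the default docker0 bridge gateway on the host
      - OLLAMA_BASE_URL=http://172.17.0.1:11434
    volumes:
      - open-webui:/app/backend/data
    restart: "no"   # lifecycle belongs to systemd, not Docker

volumes:
  open-webui:
```

Setting `restart: "no"` is deliberate: if Docker auto-restarted the container, the battery-saving manual start/stop control would be defeated.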
Hardware: HP EliteBook 845 G8, Ubuntu 24.04 LTS
| Model | Size | Purpose |
|---|---|---|
| mistral | ~4.1 GB | General tasks, coding |
| qwen3:14b | ~9 GB | Complex reasoning |
| gemma3:3b | ~2 GB | Quick responses, saves battery |
| codellama:7b | ~3.8 GB | Code review, refactoring |
- `172.17.0.1` is the Docker bridge gateway (not obvious until you spend 2 hours debugging)
- `OLLAMA_HOST=0.0.0.0` is required; the default `127.0.0.1` doesn't work from inside containers
- `Type=oneshot` + `RemainAfterExit=yes` is the correct systemd pattern for docker compose services
- Manual service control vs auto-restart: measured ~25% daily battery savings
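The oneshot pattern above can be sketched as a user unit (paths and names are illustrative; the repo's `make install` generates the real one). `docker compose up -d` exits immediately after starting the containers, so without `RemainAfterExit=yes` systemd would consider the service dead and `systemctl --user stop` would never run `ExecStop`:

```ini
# ~/.config/systemd/user/openwebui.service (sketch)
[Unit]
Description=Open WebUI (docker compose)

[Service]
# oneshot + RemainAfterExit=yes keeps the unit "active" after
# `docker compose up -d` returns, so stop triggers ExecStop as expected
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=%h/openwebui-systemd-stack
ExecStart=/usr/bin/docker compose up -d
ExecStop=/usr/bin/docker compose down

[Install]
WantedBy=default.target
```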
Key decisions and reasoning: Architecture Documentation
- Prometheus monitoring for container metrics
- Ansible playbook for automated deployment
- Multi-distribution support (Fedora, Arch)
- systemd user service integration
- Docker bridge networking setup
- Desktop launchers (XDG standards)
- Battery-optimized manual control
- Makefile — unified entry point
- GitHub Actions CI/CD pipeline
- Backup/restore for chat history
- Comprehensive documentation
Issues and PRs welcome.
- Star the repo if it's useful
- Report bugs you encounter
- Improve documentation
- Suggest features
This project is built around the eigenstack philosophy — privacy-first, local-first infrastructure where every service runs on your own hardware, no cloud dependencies, no vendor lock-in.
Related projects:
- eigenstack — architecture overview
- local-whisper-obsidian — local voice transcription pipeline
- whisper-ollama-enricher — AI enrichment for voice notes
MIT — see LICENSE


