
Open WebUI + Ollama systemd Stack


Production-ready local AI stack with systemd service management, Docker bridge networking, and desktop integration.

Quick Start · Installation · Architecture · Troubleshooting


Screenshots

Open WebUI chat view · Model selection · Desktop launchers

Why This Project?

Most guides say "docker run" and leave you with containers running 24/7. I wanted to run Open WebUI + Ollama on a laptop without draining the battery — with proper service management, not a hack.

Key features:

  • Battery optimized — manual start/stop saves ~25% of daily battery drain
  • systemd integration — proper lifecycle management, no background bloat
  • Docker bridge networking — clean isolation, production-ready
  • Desktop launchers — one-click start/stop from application menu
  • GDPR compliant — all data processed locally, no external APIs

Quick Start

cd ~
git clone https://github.com/serg-markovich/openwebui-systemd-stack.git
cd openwebui-systemd-stack

# Configure Ollama (one-time setup)
sudo mkdir -p /etc/systemd/system/ollama.service.d/
echo -e '[Service]\nEnvironment="OLLAMA_HOST=0.0.0.0:11434"' | \
  sudo tee /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl daemon-reload && sudo systemctl restart ollama

# Install systemd service and desktop launchers (portable)
make install
# This substitutes %%INSTALL_PATH%% with your actual project path

# Start
systemctl --user start openwebui
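Once started, a quick health check confirms both endpoints respond. This is a sketch: it assumes Ollama on port 11434 (as configured above) and Open WebUI published on port 3000; the Open WebUI port is an assumption, so adjust it to match your docker-compose.yml.

```shell
#!/bin/sh
# Health-check sketch. Ollama port comes from the override above; the
# Open WebUI port (3000) is an assumption — check docker-compose.yml.
check() {
  if curl -fsS --max-time 2 "$1" >/dev/null 2>&1; then
    echo "OK   $1"
  else
    echo "DOWN $1"
  fi
}
check http://localhost:11434/api/tags  # Ollama model list endpoint
check http://localhost:3000            # Open WebUI front page
```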

Desktop Launchers (Optional)

# Desktop launchers are installed automatically via:
make install
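For reference, an XDG desktop entry for the stack looks roughly like this. This is a sketch: the file name, Exec command, and icon are illustrative, and the repo's installed launchers may differ.

```ini
# ~/.local/share/applications/openwebui-start.desktop (illustrative)
[Desktop Entry]
Type=Application
Name=Open WebUI (Start)
Comment=Start the Open WebUI + Ollama stack
Exec=systemctl --user start openwebui
Icon=utilities-terminal
Terminal=false
Categories=Development;Utility;
```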

Full guide: Quick Start Documentation
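The %%INSTALL_PATH%% substitution that `make install` performs can be sketched with `sed`. The template content below is illustrative, not the repo's actual file; see the Makefile for the real rule.

```shell
#!/bin/sh
# Conceptual sketch of the placeholder substitution done by `make install`.
# The template line is illustrative, not the repo's actual unit file.
tmpl=$(mktemp)
printf 'WorkingDirectory=%%%%INSTALL_PATH%%%%\n' > "$tmpl"
sed "s|%%INSTALL_PATH%%|$HOME/openwebui-systemd-stack|g" "$tmpl"
rm -f "$tmpl"
```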


Documentation


Tech Stack

| Component           | Technology        | Purpose                    |
| ------------------- | ----------------- | -------------------------- |
| Container Runtime   | Docker 24.0+      | Isolation, reproducibility |
| Service Manager     | systemd 249+      | Lifecycle management       |
| Orchestration       | docker-compose v2 | Service definition         |
| Web UI              | Open WebUI        | Chat interface             |
| LLM Runtime         | Ollama            | Model serving              |
| Desktop Integration | XDG Desktop Entry | GUI launchers              |

My Setup

Hardware: HP EliteBook 845 G8, Ubuntu 24.04 LTS

| Model        | Size    | Purpose                        |
| ------------ | ------- | ------------------------------ |
| mistral      | ~4.1 GB | General tasks, coding          |
| qwen3:14b    | ~9 GB   | Complex reasoning              |
| gemma3:3b    | ~2 GB   | Quick responses, saves battery |
| codellama:7b | ~3.8 GB | Code review, refactoring       |

What I Learned

  • 172.17.0.1 is the Docker bridge gateway — the address containers use to reach services on the host; not obvious until you've spent two hours debugging
  • OLLAMA_HOST=0.0.0.0 is required — the default 127.0.0.1 binding isn't reachable from inside containers
  • Type=oneshot + RemainAfterExit=yes is the correct systemd pattern for docker compose services
  • Manual service control vs auto-restart: measured ~25% daily battery savings
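The oneshot pattern above, written out as a user unit, looks roughly like this. This is a sketch under assumed paths; the repo's installed unit may differ.

```ini
# ~/.config/systemd/user/openwebui.service (illustrative sketch)
[Unit]
Description=Open WebUI + Ollama stack (docker compose)
After=network-online.target

[Service]
# oneshot + RemainAfterExit lets `docker compose up -d` return while
# systemd keeps tracking the stack as "active"
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=%h/openwebui-systemd-stack
ExecStart=/usr/bin/docker compose up -d
ExecStop=/usr/bin/docker compose down

[Install]
WantedBy=default.target
```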

Key decisions and reasoning: Architecture Documentation
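To confirm the bridge gateway on your own machine, Docker can report it directly; the command falls back to a message when Docker isn't available.

```shell
#!/bin/sh
# Print the host-side gateway of Docker's default bridge network
# (typically 172.17.0.1). Prints a fallback if Docker isn't running.
docker network inspect bridge \
  --format '{{ (index .IPAM.Config 0).Gateway }}' 2>/dev/null \
  || echo "docker not available"
```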


Roadmap

Planned

  • Prometheus monitoring for container metrics
  • Ansible playbook for automated deployment
  • Multi-distribution support (Fedora, Arch)

Completed

  • systemd user service integration
  • Docker bridge networking setup
  • Desktop launchers (XDG standards)
  • Battery-optimized manual control
  • Makefile — unified entry point
  • GitHub Actions CI/CD pipeline
  • Backup/restore for chat history
  • Comprehensive documentation

Contributing

Issues and PRs welcome.

  • Star the repo if it's useful
  • Report bugs you encounter
  • Improve documentation
  • Suggest features

Eigenstack

This project is built around the eigenstack philosophy — privacy-first, local-first infrastructure where every service runs on your own hardware, no cloud dependencies, no vendor lock-in.

License

MIT — see LICENSE
