Run the full TierFlow stack (router + ML classifier) with Docker Compose.
```bash
# Clone both repos
git clone <tierflow-repo> tierflow
git clone <llmrouter-service-repo> llmrouter-service

# Start everything
cd tierflow
docker compose up -d

# Check status
docker compose ps
docker compose logs -f
```

TierFlow will be available at http://localhost:18800.
The two containers share a Docker network:

```
┌──────────────┐      ┌────────────────┐
│   tierflow   │─────▶│   llmrouter    │
│   :18800     │      │   :18801       │
│   (Node.js)  │      │   (Python)    │
└──────────────┘      └────────────────┘
```
- tierflow waits for llmrouter to be healthy before starting
- They communicate via `http://llmrouter:18801/classify` (Docker DNS)
- Both bind to `127.0.0.1` on the host (localhost only)
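In compose terms, the wiring above might look like the following sketch. This is illustrative, not the project's actual compose file: the service names, image names, healthcheck command, and container-side config path are assumptions based on the behavior described here.

```yaml
# Hypothetical docker-compose.yml sketch; names and paths are assumed.
services:
  tierflow:
    image: tierflow
    ports:
      - "127.0.0.1:18800:18800"        # localhost-only binding on the host
    environment:
      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}
    volumes:
      - ./tierflow.config.json:/app/tierflow.config.json:ro
    depends_on:
      llmrouter:
        condition: service_healthy      # wait for the classifier to be healthy

  llmrouter:
    image: tierflow-llmrouter
    ports:
      - "127.0.0.1:18801:18801"
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:18801/health"]
      interval: 10s
      timeout: 3s
      retries: 5
```

Services on the same default compose network resolve each other by service name, which is why tierflow can reach the classifier at `http://llmrouter:18801/classify`.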
Pass API keys as environment variables:
```bash
# Option 1: .env file
echo "ANTHROPIC_API_KEY=sk-ant-..." > .env
docker compose up -d

# Option 2: inline
ANTHROPIC_API_KEY=sk-ant-... docker compose up -d
```

Mount your config file:
```bash
# Default: mounts ./tierflow.config.json
docker compose up -d

# Custom path:
TIERFLOW_CONFIG=/path/to/config.json docker compose up -d
```

If you only want the router (rule-based routing):
```bash
docker compose up tierflow -d
```

TierFlow falls back to the 14-dimension keyword scorer when the ML classifier is unavailable.
| Image | Size | Why |
|---|---|---|
| tierflow | ~180MB | Node.js slim + built JS |
| tierflow-llmrouter | ~600MB | Python + sentence-transformers model (80MB) baked in |
The ML model is baked into the image to avoid downloading at runtime.
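Baking the model in usually means downloading it during the image build, so the cached weights ship inside the image and no network fetch happens at container start. A hypothetical Dockerfile sketch; the base image, model name (`all-MiniLM-L6-v2`, roughly matching the ~80MB figure above), and entrypoint are assumptions, not the project's actual build:

```dockerfile
# Hypothetical build sketch; base image, model name, and entrypoint are assumed.
FROM python:3.11-slim

RUN pip install --no-cache-dir sentence-transformers

# Instantiating the model at build time downloads it into the image's
# cache, so runtime startup needs no network access.
RUN python -c "from sentence_transformers import SentenceTransformer; \
    SentenceTransformer('all-MiniLM-L6-v2')"

COPY . /app
WORKDIR /app
CMD ["python", "server.py"]   # hypothetical entrypoint
```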
```bash
# Router health
curl http://localhost:18800/health

# ML classifier health
curl http://localhost:18801/health

# Dashboard
open http://localhost:18800/dashboard
```

To rebuild the images from scratch:

```bash
docker compose build --no-cache
docker compose up -d
```