Merged
3 changes: 3 additions & 0 deletions .gitignore
@@ -7,3 +7,6 @@ logs/
# Python virtual environment
venv/
.dbt/

# Local secrets — copy from .env.example and fill in
demo/.env
5 changes: 5 additions & 0 deletions CLAUDE.md
@@ -64,6 +64,11 @@ Marts (tables, incremental)
- CTEs for readability
- meaningful CTE names

### Macros
- For cross-platform SQL compatibility, use `adapter.dispatch` instead of `{% if target.type == '...' %}` branches — it keeps each adapter's implementation separate and allows overrides by downstream projects.
- Check `dbt.` built-ins first (`dbt.type_string()`, `dbt.date_trunc()`, `dbt.dateadd()`, etc.) before writing a custom cross-platform macro.
- Use dispatch when SQL syntax genuinely differs across adapters (JSON, arrays, date math, regex, string aggregation). Skip it for logic that is identical everywhere.
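
A dispatch-style macro following these guidelines might look like the sketch below. The macro and adapter implementations here are illustrative examples, not macros from this project — `listagg` vs. `string_agg` is just a common case where syntax genuinely differs:

```sql
{# Illustrative sketch — macro names are hypothetical, not from this repo. #}
{% macro string_agg(expression, delimiter) %}
    {{ return(adapter.dispatch('string_agg')(expression, delimiter)) }}
{% endmacro %}

{# Default implementation, used by DuckDB, Postgres, BigQuery, ... #}
{% macro default__string_agg(expression, delimiter) %}
    string_agg({{ expression }}, '{{ delimiter }}')
{% endmacro %}

{# Snowflake spells it differently, so it gets its own implementation. #}
{% macro snowflake__string_agg(expression, delimiter) %}
    listagg({{ expression }}, '{{ delimiter }}')
{% endmacro %}
```

Because dispatch resolves by adapter prefix, a downstream project can override behaviour by defining its own `<adapter>__string_agg` without touching this package.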

### Naming
| Type | Convention |
|------|------------|
16 changes: 16 additions & 0 deletions README.md
@@ -59,6 +59,22 @@ Visits is successful when the last attempt of the visit is successful.

---

## Try it locally (demo)

The fastest way to explore kwwhat is the self-contained Docker demo — no cloud account needed.

```bash
cd demo
cp .env.example .env # add your Anthropic API key
./run-demo.sh
```

This spins up three services: a local DuckDB database loaded with sample OCPP logs, the full dbt pipeline, and an AI chat interface where you can ask plain-English questions about your EV charger data.

See [`demo/README.md`](demo/README.md) for details.

---

## Installation

```bash
4 changes: 4 additions & 0 deletions demo/.env.example
@@ -0,0 +1,4 @@
# Copy this file to .env and fill in your values before running the demo.

# Required: Anthropic API key for nao chat BI (powers the AI assistant)
ANTHROPIC_API_KEY=your_anthropic_api_key_here
116 changes: 116 additions & 0 deletions demo/README.md
@@ -0,0 +1,116 @@
# kwwhat demo

Run the full kwwhat analytics pipeline locally — no cloud account needed.

<img width="1427" height="770" alt="Screenshot 2026-04-12 at 3 11 46 PM" src="https://github.com/user-attachments/assets/95010112-11c3-4c51-b77d-8ea3dcc10053" />

---

## Quickstart

```bash
# 1. Copy the environment template and add your API key
cp .env.example .env
# edit .env and set ANTHROPIC_API_KEY=...

# 2. Run the demo
./run-demo.sh
```

That's it. The script will build the images, load the data, run the pipeline, and open the chat interface at **http://localhost:5005**.

---

## Prerequisites

- [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running
- An [Anthropic API key](https://console.anthropic.com/) (powers the chat interface)

This demo spins up three services via Docker Compose:

| Service | What it does |
|---------|-------------|
| `duckdb-init` | Loads the sample OCPP log data into a local database |
| `dbt` | Runs the kwwhat dbt pipeline, transforming raw logs into analytics tables |
| `chat-bi` | Opens an AI chat interface so you can ask questions about the data |

---

## What happens under the hood

```
seeds/                                   demo/
  ocpp_1_6_synthetic_logs_14d.csv  ──▶  duckdb-init ──▶ raw.duckdb
  ports.csv                                                 │
                                                        dbt build
                                                            │
                                                   analytics.duckdb
                                                    (fact_*, dim_*)
                                                            │
                                                        nao chat
```

1. **duckdb-init** reads the two CSV files in `seeds/` and loads them into `raw.duckdb` — a file that acts as the raw data warehouse.
2. **dbt** picks up from there, runs all the staging → intermediate → mart models, and writes the results to `analytics.duckdb`.
3. **chat-bi** connects to both the analytics database and the dbt service, so you can ask plain-English questions like:
- *"Report on reliability of my EV charging network"*
- *"What was the charge attempt success rate last week?"*
- *"Which chargers had the most downtime?"*
- *"Show me visit success by location."*

---

## Sample data

The `seeds/` folder contains synthetic OCPP 1.6 logs donated for this project:

| File | Contents |
|------|----------|
| `ocpp_1_6_synthetic_logs_14d.csv` | 14 days of synthetic EV charger messages (~2 MB) |
| `ports.csv` | Reference data: charger ports, connector types, and commission dates |

---

## Manual commands

If you prefer to run services individually:

```bash
# Build images
docker compose build

# Load seed data (runs once and exits)
docker compose up duckdb-init

# Run dbt pipeline
docker compose up dbt

# Open chat interface (interactive) — available at http://localhost:5005
docker compose run --rm -p 5005:5005 -p 8005:8005 chat-bi

# Stop everything
docker compose down

# Stop and wipe the database volume
docker compose down -v
```

---

## Project structure

```
demo/
├── README.md
├── docker-compose.yml
├── run-demo.sh
├── .env.example
├── seeds/ ← sample OCPP CSV data
├── duckdb-init/ ← Service 1: loads seeds into raw.duckdb
├── dbt/ ← Service 2: dbt build + MCP server
└── chat-bi/ ← Service 3: nao chat interface
```
13 changes: 13 additions & 0 deletions demo/chat-bi/Dockerfile
@@ -0,0 +1,13 @@
FROM python:3.12-slim

RUN pip install --no-cache-dir \
"nao-core[duckdb]==0.0.59"

WORKDIR /app

COPY nao_config.yaml .
COPY RULES.md .
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh

ENTRYPOINT ["/entrypoint.sh"]
21 changes: 21 additions & 0 deletions demo/chat-bi/RULES.md
@@ -0,0 +1,21 @@
# Rules

- Never run schema introspection queries (`information_schema`, `SHOW TABLES`, etc.). The available tables and their columns are already provided in your context.
- For metrics, measures, dimensions, and entities: use `repos/kwwhat/models/semantic/semantic_models.yml` as the authoritative source.
- For SQL: only query `fact_*` and `dim_*` tables. Always use the fully qualified path `analytics.ANALYTICS.<table>` (e.g. `analytics.ANALYTICS.fact_visits`). Column documentation is in `repos/kwwhat/models/marts/marts.yml`.
- Do not make up metrics. If a metric is not defined in the semantic model, say so.
- When reporting on metrics, always start with a "metrics at a glance" summary table: metric name, value, and status.
- Default time window is last 7 days unless the user specifies otherwise.
- Never use the term "session". Use "charge attempt", "transaction", or "visit" depending on context.
- Show rates and uptime as percentages (e.g. 94.2%). Always include period-over-period change in percentage points (pp), e.g. "+1.3 pp".
- Always use the brand colour palette for visualizations:
- Light Purple: #F3DDEE
- Purple: #C357AA
- Dark Purple: #7C2167
- Light Turquoise: #D2F3F3
- Turquoise: #6AD8D6
- Dark Turquoise: #165255
- Yellow (Accent): #FFD72E
- Bluish (Accent): #1F0D79
- White: #FFFFFF
- Grey-black: #2A2A2A
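
Put together, a query obeying these rules might look like the following. The `fact_visits` path comes from the rule above; the column names are illustrative assumptions, since the real ones live in `marts.yml`:

```sql
-- Illustrative only: column names are assumptions, not from marts.yml.
-- Fully qualified path and 7-day default window per the rules above.
select
    date_trunc('day', visit_started_at) as visit_day,
    avg(case when is_successful then 1 else 0 end) as visit_success_rate
from analytics.ANALYTICS.fact_visits
where visit_started_at >= current_date - interval 7 day
group by 1
order by 1;
```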
9 changes: 9 additions & 0 deletions demo/chat-bi/agent_instructions.md
@@ -0,0 +1,9 @@
# Agent instructions

## IMPORTANT: Data access rules

**Never** run `information_schema` queries, `SHOW TABLES`, or any other schema introspection SQL. You already have everything you need:

- **What metrics exist**: read `models/semantic/semantic_models.yml` — it contains all metrics, measures, dimensions, and entities.
- **What tables to query**: read `models/marts/marts.yml` — it describes every mart table with column-level documentation. Only `fact_*` and `dim_*` tables in the `ANALYTICS` schema are available.
- **No other tables exist** for your purposes. Do not look for them.
27 changes: 27 additions & 0 deletions demo/chat-bi/entrypoint.sh
@@ -0,0 +1,27 @@
#!/bin/bash
set -e

ANALYTICS_DB="/data/analytics.duckdb"

echo "=== kwwhat Chat BI ==="

# Wait for analytics.duckdb to exist (written by the dbt service)
echo "Waiting for analytics.duckdb to be ready..."
until [ -f "$ANALYTICS_DB" ]; do
sleep 2
done
echo "analytics.duckdb is ready."

echo ""
echo "==========================================="
echo " kwwhat demo is ready!"
echo " Ask me anything about your EV charger data."
echo "==========================================="
echo ""

cd /app
mkdir -p repos databases
echo "Syncing nao context (repos + database schemas)..."
nao sync
echo ""
exec nao chat
22 changes: 22 additions & 0 deletions demo/chat-bi/nao_config.yaml
@@ -0,0 +1,22 @@
project_name: kwwhat-demo

databases:
- name: duckdb_analytics
type: duckdb
path: /data/analytics.duckdb
schema_name: ANALYTICS
include:
- analytics.ANALYTICS.fact_*
- analytics.ANALYTICS.dim_*
accessors:
- columns
- description
- preview

repos:
- name: kwwhat
local_path: /kwwhat
include:
- models/semantic/semantic_models.yml
- models/marts/marts.yml
- dbt_project.yml
12 changes: 12 additions & 0 deletions demo/chat-bi/profiles.yml
@@ -0,0 +1,12 @@
kwwhat:
target: duckdb
outputs:
duckdb:
type: duckdb
path: /data/analytics.duckdb
attach:
- path: /data/raw.duckdb
alias: RAW
read_only: true
schema: ANALYTICS
threads: 4
20 changes: 20 additions & 0 deletions demo/dbt/Dockerfile
@@ -0,0 +1,20 @@
FROM python:3.12-slim

RUN apt-get update && apt-get install -y curl socat && rm -rf /var/lib/apt/lists/*

# dbt with DuckDB adapter, pinned to match the project's venv
RUN pip install --no-cache-dir \
"dbt-core==1.10.9" \
"dbt-duckdb>=1.9.0" \
"dbt-metricflow" \
"dbt-mcp"

WORKDIR /kwwhat

COPY profiles.yml /profiles/profiles.yml
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh

ENV DBT_PROFILES_DIR=/profiles

ENTRYPOINT ["/entrypoint.sh"]
26 changes: 26 additions & 0 deletions demo/dbt/entrypoint.sh
Original file line number Diff line number Diff line change
@@ -0,0 +1,26 @@
#!/bin/bash
set -e

RAW_DB="/data/raw.duckdb"

echo "=== dbt service starting ==="

# Wait for Service 1 to finish loading the seed data
echo "Waiting for raw.duckdb to be ready..."
until [ -f "$RAW_DB" ]; do
sleep 2
done
echo "raw.duckdb found."

cd /kwwhat

echo "Installing dbt packages..."
dbt deps --log-path /tmp/dbt-logs

echo "Running dbt run (staging → intermediate → marts)..."
dbt run --target duckdb --log-path /tmp/dbt-logs

echo "Running dbt tests (failures reported but do not block startup)..."
dbt test --target duckdb --log-path /tmp/dbt-logs --exclude "test_type:unit" || echo "Some tests failed — see logs."

echo "=== dbt build complete ==="
15 changes: 15 additions & 0 deletions demo/dbt/profiles.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,15 @@
kwwhat:
target: duckdb
outputs:
duckdb:
type: duckdb
# Analytics models are written here
path: /data/analytics.duckdb
# Attach the raw seed data (read-only) as the RAW catalog
attach:
- path: /data/raw.duckdb
alias: RAW
read_only: true
# Write transformed models into the ANALYTICS schema
schema: ANALYTICS
threads: 4
41 changes: 41 additions & 0 deletions demo/docker-compose.yml
@@ -0,0 +1,41 @@
services:

# ── Service 1 ──────────────────────────────────────────────────────────────
# Loads CSV seed data into raw.duckdb (RAW catalog, SEED schema).
# Runs once and exits — all other services wait for it to complete.
duckdb-init:
build: ./duckdb-init
volumes:
- duckdb-data:/data
- ./seeds:/seeds:ro # seed CSVs live here under demo/

# ── Service 2 ──────────────────────────────────────────────────────────────
# Runs dbt build: staging → intermediate → marts written to analytics.duckdb.
# Exits when done — chat-bi waits for completion before starting.
dbt:
build: ./dbt
depends_on:
duckdb-init:
condition: service_completed_successfully
volumes:
- duckdb-data:/data
- ../:/kwwhat # mount the kwwhat dbt project

# ── Service 3 ──────────────────────────────────────────────────────────────
# Interactive chat interface with dbt-mcp running as a local subprocess.
# Run with: docker compose run --rm chat-bi
chat-bi:
build: ./chat-bi
volumes:
- duckdb-data:/data # read analytics.duckdb
- ../:/kwwhat:ro # dbt-mcp needs the project for CLI tools
ports:
- "5005:5005" # nao chat server
- "8005:8005" # nao FastAPI server
stdin_open: true
tty: true
environment:
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}

volumes:
duckdb-data:
8 changes: 8 additions & 0 deletions demo/duckdb-init/Dockerfile
@@ -0,0 +1,8 @@
FROM python:3.12-slim

RUN pip install --no-cache-dir duckdb

WORKDIR /app
COPY init.py .

CMD ["python", "init.py"]