A webapp chat interface for Ollama LLMs, whose primary purpose is dogfooding Datadog.
- Install Docker Desktop.
- Install Ollama, and download at least one model of your choice (suggested: mistral).
  - Make sure the models you want to use are running on your Ollama instance (run `ollama ps` to see which models are running).
  - You may use the default values for the inference parameters in `.env/ollama.env`, and update them later if needed.
- Create a Datadog Org, and update the `.env/datadog.env` file:
  - Update `DD_SITE`. See the documentation for reference.
  - Get an API key and an APP key, and update `DD_API_KEY` and `DD_APP_KEY` accordingly.
  - Update `NOTIF_EMAIL` with an email address where Datadog notifications should be sent (you can use the email you used for your Datadog account).
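For reference, `.env/datadog.env` would then contain entries along these lines (placeholder values only — the variable names come from the list above, and `datadoghq.com` is just an example site):

```
# .env/datadog.env -- placeholder values, replace with your own
DD_SITE=datadoghq.com
DD_API_KEY=<your-api-key>
DD_APP_KEY=<your-app-key>
NOTIF_EMAIL=you@example.com
```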
- Run `./terraform.sh init` and then `./terraform.sh apply` from a terminal at the root of the `llm-2000` folder, to create all Datadog resources and update the environment variables in the `.env/` folder with a bunch of new IDs and secrets:
  - a RUM Application and a Client Token
  - a Synthetics Private Location
  - Dashboards, Monitors, etc.

  Note: On the first apply from a clean state, you need to create the Synthetics Private Location first, then apply the rest. This is due to a known limitation in the Datadog Terraform provider: validation of the synthetics test `locations` runs at plan time and cannot resolve computed private location IDs that don't exist yet. Subsequent applies work without this workaround.
```shell
$ ./terraform.sh init

Initializing the backend...
[...]
Terraform has been successfully initialized!

$ ./terraform.sh apply -target=datadog_synthetics_private_location.local
[...]
Apply complete! Resources: 1 added, 0 changed, 0 destroyed.

$ ./terraform.sh apply
[...]
Apply complete! Resources: 7 added, 0 changed, 0 destroyed.
```
- Run `docker compose up` from a terminal at the root of the `llm-2000` folder:
  - nginx proxies all incoming HTTP requests.
  - flask handles cookie-based authentication, web page template rendering, and chat functionality with an Ollama integration.
  - redis stores chat history for each user.
  - datadog is the Datadog agent.
  - datadog-op-worker is the Observability Pipelines Worker that sits between the Datadog agent and Datadog Cloud, allowing you to process, transform, and route telemetry data. To activate it, set `DD_OBSERVABILITY_PIPELINES_WORKER_LOGS_ENABLED=true` in `.env/datadog.env`.
  - synthetics is the private location that runs synthetic tests.
```shell
$ docker compose up
[+] Running 6/6
 ✔ Container nginx                      Started  0.3s
 ✔ Container datadog-op-worker          Started  0.3s
 ✔ Container datadog                    Started  0.3s
 ✔ Container redis                      Started  0.3s
 ✔ Container datadog-synthetics-worker  Started  0.3s
 ✔ Container flask                      Started  0.3s
```

From a web browser:
Connect to http://localhost:8000. You'll be logged in as a random user (e.g. `abcd1234@sandbox.com`). Alternatively, log in as any user by injecting their `user_id` in the URL (yay... security): http://localhost:8000/?user_id=john.doe. Your cookie expires when you close your browser.
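The `user_id` override above is easy to script; a minimal sketch (only the base URL and query parameter name come from the README, the rest is standard library):

```python
# Build the login-bypass URL described above: any user_id passed as a
# query parameter logs you in as that user (no real authentication).
from urllib.parse import urlencode

BASE_URL = "http://localhost:8000/"  # the app's local address

def login_url(user_id: str) -> str:
    """Return the URL that logs you in as the given user_id."""
    return BASE_URL + "?" + urlencode({"user_id": user_id})

# Example: login_url("john.doe") -> "http://localhost:8000/?user_id=john.doe"
```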
The application provides a programmatic API endpoint for direct chat interactions. No authentication, no streaming, no persistence. The system prompt is optional:
```shell
curl -X POST http://localhost:8000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "message": "What is your favourite colour?",
    "prompt": "YOU USE ONLY CAPITAL LETTERS IN YOUR RESPONSES. AND YOU HATE BLUE.",
    "model": "mistral:latest"
  }'
```

Terraform runs within a Docker container, with the working directory properly wired to the Terraform configuration (see the `--chdir` option).
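The `/api/chat` endpoint can also be called from Python; a minimal client sketch (the endpoint URL and JSON fields are taken from the curl example above, everything else is standard library):

```python
# Minimal client for the /api/chat endpoint. The "prompt" (system
# prompt) field is optional, so it is only included when supplied.
import json
import urllib.request

API_URL = "http://localhost:8000/api/chat"

def build_chat_payload(message, model="mistral:latest", prompt=None):
    """Build the JSON body for /api/chat; 'prompt' is optional."""
    payload = {"message": message, "model": model}
    if prompt is not None:
        payload["prompt"] = prompt
    return payload

def chat(message, model="mistral:latest", prompt=None, timeout=60):
    """POST a chat message and return the decoded JSON response."""
    body = json.dumps(build_chat_payload(message, model, prompt)).encode()
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Requires the stack to be running (docker compose up).
    print(chat("What is your favourite colour?"))
```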
```shell
$ ./terraform.sh init

Initializing the backend...
[...]
Terraform has been successfully initialized!

$ ./terraform.sh apply
[...]
Apply complete! Resources: 2 added, 0 changed, 0 destroyed.

$ ./terraform.sh output monitor_tags
[
  "env:sandbox",
  "owner:terraform"
]

$ ./terraform.sh destroy
[...]
Destroy complete! Resources: 2 destroyed.
```

... Dashboards, Monitors
MIT
