
Commit b16809c

frogmoses committed: Adding coffee-roasting and r1-eye projects
1 parent 5cf31f3 commit b16809c

2 files changed: 58 additions & 0 deletions

+++
title = "Coffee Roasting Analysis"
date = 2026-03-13
draft = false
+++
I roast coffee at home on a Hottop KN-8828B-2K+. This roaster can connect to [Artisan](https://artisan-scope.org/) for logging and roaster control. When I first bought the roaster I tried using it with Artisan but I fundamentally didn't understand the roasting process. It was too confusing to know what to change. The software felt very overwhelming. I like to learn on my own and I couldn't find any source of information that really helped me understand both the process and the software. I wound up defaulting to the automatic program on the Hottop.
This winter I got the idea that maybe AI could help me understand the process and improve my roasts systematically. So I threw the Hottop manual and the data collected from the automatic program into Claude's maw, then asked it for a five-roast improvement plan. At this point everything was done manually.
I was pretty impressed by the advice given by AI and started to glimpse the process through the smoke. After roasting several more batches, manually collecting the data and feeding it into AI for recommendations, I thought...wait, we can automate this and drive a pipeline to analyze the data and suggest next steps.
So this project was born.
I hooked up the roaster to my laptop, got Artisan running, and wrote some code to transfer both the `.alog` roasting logs and the accompanying `.png` exports of the graph back to my dev computer. There I wrote some more code to ingest the logs and images, compare them to targets based on the bean profile and my last batch, and get a recommendation of what to do differently for the next batch. The recommendations are tailored to action...not just "your drying phase was long" but "charge hotter — aim for a turning point around 145-150F to compress drying."
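The ingest side can be sketched roughly like this. A minimal sketch, assuming the `.alog` files are plain-text Python dict literals with `timex` (sample times in seconds) and `temp2` (bean temperature) keys; the function names are illustrative, not the actual pipeline's:

```python
import ast
from pathlib import Path


def load_alog(path):
    # Assumption: an Artisan .alog profile is a plain-text Python dict
    # literal, so ast.literal_eval can parse it safely.
    return ast.literal_eval(Path(path).read_text())


def turning_point(profile):
    # The turning point is where the bean temperature bottoms out after
    # charge and starts climbing again: the minimum of the BT curve.
    times, bt = profile["timex"], profile["temp2"]
    i = min(range(len(bt)), key=bt.__getitem__)
    return times[i], bt[i]
```

From numbers like the turning point, phase durations, and the previous batch's values, the comparison step can decide what to nudge next roast.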
So now, after each roast, I get a full report: summary, bean profile, target comparison, prioritized recommendations, and a "Next Roast" box with 2-4 concrete things to change. Over multiple roasts, it shows a trend table so I can see if I'm actually getting better.
As for the bean profile that informs the targets: I buy all my green coffee from [Sweet Maria's Coffee](https://www.sweetmarias.com). For each coffee, they provide an extensive set of cupping notes and bean attributes. I feed that data into the analysis pipeline to tell the AI what the bean's targets should look like. It gives the comparison an objective anchor.
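The shape of that data looks something like the following. Everything here is illustrative: the coffee name, field names, and target bands are made up to show the idea, not Sweet Maria's actual format or my real numbers:

```python
# Hypothetical bean profile distilled from a product page's cupping notes.
BEAN_PROFILE = {
    "name": "Example Washed Ethiopia",
    "process": "washed",
    "roast_target": "City+",
    "targets": {  # phase length as % of total roast time: (low, high)
        "drying_pct": (35, 45),
        "maillard_pct": (35, 45),
        "development_pct": (15, 22),
    },
}


def compare_to_targets(phases_pct, targets):
    """Flag each roast phase as 'low', 'ok', or 'high' against its band."""
    return {
        phase: ("low" if phases_pct[phase] < lo
                else "high" if phases_pct[phase] > hi
                else "ok")
        for phase, (lo, hi) in targets.items()
    }
```

A roast with a bloated drying phase comes back flagged "high" on `drying_pct`, which is what turns into advice like "charge hotter."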
Anyway, I've used the full pipeline several times now and my coffee roasts have definitely improved. I understand both the Hottop roaster and the process much better through this exploration.
There are several other tools that play into this ecosystem. I'll be describing them shortly and releasing the code for them as well.
[View the code on GitHub](https://github.com/frogmoses/coffee-roasting)

content/projects/r1-eye.md

+++
title = "R1-Eye, vision analysis of coffee bean roasts"
date = 2026-03-14
draft = false
+++
As I described in [my previous post](), I roast coffee at home on a Hottop KN-8828B-2K+. The Hottop has a viewport on the left side, facing away from the control panel. When roasting, all my focus is on the laptop, which sits to the right of the roaster. I can't really see the beans during the roast and can't visually assess the changes, a key indicator of the beans' development.
After jailbreaking the Rabbit R1, I was looking to use it for something real, and then inspiration struck. Since I could now do whatever I wanted with the R1, including live control over it, why not use it to watch the beans for me, capture point-in-time images of the roast, and have the images visually analyzed via AI to feed my larger coffee roasting analysis pipeline?
That's what this project does...and here is how it does it.
The R1 sits in front of the roaster viewport. When I start a roast in Artisan, the r1-eye (Claude named it "sentinel") automatically receives roast events via WebSocket. On each event, the sentinel:
1. Raises the R1's motorized camera via sysfs
2. Taps the camera UI programmatically to take a full 8MP photo
3. Pulls the photo over ADB, crops to the bean viewport (the bottom-right quadrant, where the beans are most visible), then applies white balance correction and sharpening
4. Sends it to Claude Vision with roast context (elapsed time, current phase) for color assessment and a 1-10 development score
5. Logs everything to a timestamped JSON file that feeds into my roast analysis pipeline
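The capture side of those steps can be sketched as below. Every device specific is an assumption for illustration: the sysfs node for the R1's pop-up camera, the shutter tap coordinates, and the on-device photo directory will all differ on real hardware, and the helper names are made up:

```python
import subprocess

CAMERA_SYSFS = "/sys/class/motor/camera/position"  # hypothetical sysfs node
SHUTTER_XY = (120, 260)                            # hypothetical tap point
PHOTO_DIR = "/sdcard/DCIM/Camera"                  # typical, but verify


def adb_shell(*args):
    # Run a command on the device over ADB and capture its output.
    return subprocess.run(["adb", "shell", *args],
                          capture_output=True, text=True)


def viewport_box(width, height):
    """Bottom-right quadrant of the frame, where the beans sit in my setup."""
    return (width // 2, height // 2, width, height)


def capture(dest):
    adb_shell("su", "-c", f"echo up > {CAMERA_SYSFS}")  # 1. raise the camera
    adb_shell("input", "tap", *map(str, SHUTTER_XY))    # 2. tap the shutter
    # 3. find the newest photo and pull it to the host for cropping.
    newest = adb_shell("ls", "-t", PHOTO_DIR).stdout.split()[0]
    subprocess.run(["adb", "pull", f"{PHOTO_DIR}/{newest}", dest])
```

`viewport_box` is the crop from step 3; a Pillow `Image.crop(viewport_box(w, h))` call would apply it before the white balance and sharpening passes.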
Captures are phase-adaptive — every 30s during drying (slow changes), every 20s during Maillard, and every 10s during development (when color shifts fast). Key events like first crack trigger an immediate capture regardless of timing.
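The scheduling logic is tiny. The interval values come straight from the description above; the event names are my guesses at what the WebSocket payload might call them, not Artisan's actual identifiers:

```python
INTERVALS_S = {"drying": 30, "maillard": 20, "development": 10}
KEY_EVENTS = {"first_crack", "second_crack", "drop"}  # hypothetical names


def next_capture_delay(phase, event=None):
    """Capture immediately on a key event; otherwise use the phase-adaptive
    interval, falling back to 30 s for unknown phases."""
    if event in KEY_EVENTS:
        return 0
    return INTERVALS_S.get(phase, 30)
```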
This started as "can I make the R1 do something useful?" and turned into a tool I actually use every roast. I was surprised by how quickly it came together. In the past I would have been daunted by the thought of having to learn all the intricacies of ADB, WebSockets, and the like. Working with AI became a joy because I could get it done by conducting, not grunt work. It unlocked all this possibility. Matt Webb said in a great blog series,
> "AI tools provide what I’ve previously called Universal Basic Agency and it is wonderful. When individuals are unblocked, we get an abundance of creativity in the world." [Matt Webb's post](https://interconnected.org/home/2026/03/12/nwh)
I've never felt more unblocked.
If you have an R1 collecting dust and a hobby that could use a dedicated camera with AI analysis, the code is generic enough to adapt.
I do intend to expand the R1 into a real AI device in the near future: something like a one-button push to activate an AI workflow, maybe like the *claw tooling, an agent trigger.
GitHub: [github.com/frogmoses/r1-eye](https://github.com/frogmoses/r1-eye)
