Goal: Live Preview for Android app development while the AI builds
Mobile is different from web in 4 ways. We want to stick to the architecture we already have, which uses Hive, Staklink, and even Staktrak, where live feedback comes from Hot Module Replacement plus an iframe pointing to a public URL, something like:
Pod -> Staklink Proxy -> PM2 -> dev server -> iframe in browser (on Hive)
Here are my findings, split into two possible approaches: In-House or using a SaaS (Appetize.io is the best I found)
1. Compute Layer: Where does the emulator run?
Mobile apps require emulating an entire operating system and simulating hardware, which needs special Linux kernel modules like KVM or binder... so our standard Docker containers cannot work in this case.
- In-House options:
- Use Redroid: a lightweight alternative to the standard Android emulator that runs in a Docker container on a --privileged Linux host. This solution requires kernel modules (binder and ashmem) to be enabled on the host machines running the pods. This means we can either:
- Dedicated Redroid node pool (infra work): create a pool of nodes configured with the --privileged flag or a specific security context granting the capabilities Redroid needs. This method is better because it confines privileged access to pods used for Android dev only, not everything (web)...
- Shared nodes with Redroid-capable hosts: we convert all the Hive nodes/pods to support Redroid (binder and ashmem modules loaded) so that every node can run both web and Android containers. This is simpler, but more vulnerable because of the --privileged access... security must be solid.
- SaaS options like Appetize.io already take care of this in their paid plans, but the app binary is uploaded to a third party (privacy concerns)
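As a sketch of the in-house path, launching Redroid looks roughly like this (the image tag, port, and data path are assumptions; the module names come from the Redroid docs):

```shell
# Load the binder kernel module Redroid needs on the host
# (ashmem is built in or replaced by memfd on newer kernels).
sudo modprobe binder_linux devices="binder,hwbinder,vndbinder"

# Run Redroid as a privileged container; 5555 is the standard ADB port.
docker run -itd --privileged \
  -v ~/redroid-data:/data \
  -p 5555:5555 \
  redroid/redroid:13.0.0-latest

# Connect ADB from the pod / Staklink side.
adb connect localhost:5555
```

In a Kubernetes setup, the same thing would be expressed as a pod securityContext plus a node selector pinning these pods to the dedicated Redroid pool.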
2. Transport Layer: How does the screen get to the browser?
Web apps render natively into the browser (we re-render it on Hive in an iframe), but mobile requires us to capture the emulator's screen buffer, encode it as video (H264), and stream it over a protocol like WebRTC.
- In-House:
- scrcpy-server.jar runs inside Redroid + a custom WebRTC Bridge:
Redroid <-[ADB socket]-> scrcpy-server.jar (screen capture + input) -[H264 frames]-> Custom WebRTC Bridge -[WebRTC: video + DataChannel]-> Hive (Canvas/Browser Artifact)
- We'll need to build the Custom WebRTC Bridge service
- SaaS takes care of this layer completely; we hook it into our <iframe>, and it works.
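For reference, the scrcpy-server side of that pipeline is bootstrapped over ADB roughly like this (the version string and option names are assumptions against a recent scrcpy release and must match the jar being pushed):

```shell
# Push the server jar into the emulator and start it via app_process.
adb push scrcpy-server.jar /data/local/tmp/scrcpy-server.jar
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar \
  app_process / com.genymobile.scrcpy.Server 2.4 \
  tunnel_forward=true audio=false control=true

# Expose the server's abstract socket; the custom WebRTC Bridge would
# read raw H264 frames from tcp:27183 and repackage them as WebRTC video.
adb forward tcp:27183 localabstract:scrcpy
```

The bridge is the only piece we would build from scratch; scrcpy already handles capture, encoding, and input injection on the device side.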
3. Interaction Layer: How do clicks and swipes get back to the emulator?
Mobile requires translating browser mouse clicks into hardware-level touch coordinates injected via ADB.
- In-House options:
- "Dumb Screen" approach: the browser canvas captures mouse (x,y) → sent over a WebRTC DataChannel → Staklink translates → ADB injects a native touch event into the Android kernel. Works for normal interactions but very fragile for recording user journeys (from Staktrak experience).
- Slower async operations for recording user journeys (this is the Staktrak part): we set up Appium to get the UI tree... Appium is to mobile what Playwright is to JS/web.
- SaaS takes care of this, but there are limitations (best in class is Appetize.io):
- They have a proprietary Test format
- Experimental UI tree in XML (they advise against using it for prod work)
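The "dumb screen" translation step is simple scaling math; a minimal sketch, where the canvas and framebuffer sizes are made-up values that would really come from the session handshake:

```shell
#!/usr/bin/env bash
# Map a click on the browser canvas to emulator framebuffer coordinates.
CANVAS_W=360;  CANVAS_H=780     # <canvas> size in the browser (assumed)
DEVICE_W=1080; DEVICE_H=2340    # emulator screen size (assumed)

translate_tap() {
  local x=$1 y=$2
  # Integer scaling; a real bridge would also handle device rotation.
  echo "input tap $(( x * DEVICE_W / CANVAS_W )) $(( y * DEVICE_H / CANVAS_H ))"
}

# Staklink would then run the result as: adb shell "$(translate_tap 180 390)"
translate_tap 180 390   # -> input tap 540 1170
```

This is exactly why the approach is fragile for Staktrak-style recording: the bridge only sees coordinates, never which UI element was tapped, hence the Appium/UI-tree path above.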
4. Execution Layer: How does new code from the Agent update the running app?
Web projects hot reload instantly, but native mobile apps require compiling heavy binaries and re-installing them on the device for every code change... Takes a long time.
- In-House options:
- For React Native/Flutter: Metro bundler or the Flutter engine's hot reload; the update appears in 1-3 seconds, no need to re-install
- Native Kotlin: incremental Gradle build + adb install -r; about 8-30 seconds for a small change (not instant, but works)
- Compose Hot Reload: experimental but promising, 2-5 seconds.
- Worst case is a full Gradle build and ADB install: 30 seconds to 2 minutes.
- SaaS:
- No hot reload: every code change requires uploading a new APK and re-installing
- App state is lost on re-install since the session restarts... significantly slower, even for React Native.
- They are fundamentally not meant for dev environments, so no hot reload... only binary uploads.
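The native Kotlin inner loop above, written out as commands (the module path, APK path, and package name are hypothetical placeholders for the real project):

```shell
# Incremental build: Gradle only recompiles what changed.
./gradlew :app:assembleDebug

# -r reinstalls in place, keeping the app's data and state.
adb install -r app/build/outputs/apk/debug/app-debug.apk

# Relaunch the app so the new code is picked up.
adb shell monkey -p com.example.preview 1
```

This is the loop the agent would drive on every code change; the 8-30 second range comes almost entirely from the Gradle step, which is why Compose Hot Reload is worth watching.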