Releases: UnlikeOtherAI/AppReveal

v0.8.0 — Cross-platform parity: tap_text, idSource, screen source

31 Mar 22:04

What's new

Cross-platform parity release. All platforms (iOS, macOS, Android, Flutter, React Native) now share the same enhanced tool surface.

New tool: tap_text

Tap elements by their visible text content. Finds matching text in the view hierarchy and walks up to the nearest tappable ancestor. Supports exact and contains match modes with occurrence disambiguation for multiple matches.
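
The match-then-climb resolution can be sketched as follows (TypeScript; the node shape and helper name are hypothetical, not AppReveal's actual internals):

```typescript
// Hypothetical flattened view-hierarchy node; the real model is richer.
interface UINode {
  text?: string;
  tappable: boolean;
  parent?: UINode;
}

type MatchMode = "exact" | "contains";

// Find the `occurrence`-th node whose text matches, then walk up to the
// nearest tappable ancestor (the matching node itself counts if tappable).
function resolveTapTarget(
  nodes: UINode[],
  query: string,
  mode: MatchMode = "exact",
  occurrence = 0
): UINode | undefined {
  const matches = nodes.filter(
    n =>
      n.text !== undefined &&
      (mode === "exact" ? n.text === query : n.text.includes(query))
  );
  let node: UINode | undefined = matches[occurrence];
  while (node && !node.tappable) node = node.parent;
  return node;
}
```

Matching the label first and then climbing to a tappable ancestor is what lets a tap on a button's inner text land on the button itself.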

Enhanced element identity: idSource

Every element in get_elements now includes an idSource field indicating how its ID was derived:

  • explicit — from accessibility identifier, view tag, or resource ID
  • text — normalized from visible text content
  • semantics — from accessibility label or content description
  • tooltip — from tooltip text
  • derived — auto-generated fallback

Enhanced screen identity: source and appBarTitle

get_screen now returns:

  • source — "explicit" (from ScreenIdentifiable/registerScreen) or "derived" (auto-detected)
  • appBarTitle — title extracted from navigation bar (iOS), window title (macOS), or toolbar/action bar (Android)

Smarter tap_element

tap_element now uses a 4-step fallback chain: accessibility ID → accessibility label → text-derived ID → exact text match with tappable ancestor. Error messages suggest tap_text or get_elements when an element isn't found.

ID deduplication

When multiple elements resolve to the same ID, they are automatically disambiguated with _1, _2 suffixes.
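
The suffixing scheme amounts to something like this sketch (the exact collision ordering is an assumption):

```typescript
// Append _1, _2, … to IDs that would otherwise collide.
// First occurrence keeps the bare ID; later ones get numbered suffixes.
function deduplicateIds(ids: string[]): string[] {
  const seen = new Map<string, number>();
  return ids.map(id => {
    const count = seen.get(id) ?? 0;
    seen.set(id, count + 1);
    return count === 0 ? id : `${id}_${count}`;
  });
}
```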

Platforms

All changes are implemented identically across iOS, macOS, Android, Flutter, and React Native.

v0.7.0 — Enhanced Flutter tap targeting

31 Mar 21:12

What's new

Flutter: Text-based element targeting

  • tap_text MCP tool — tap any visible text on screen without requiring a ValueKey. Supports match_mode (exact/contains) and occurrence for ambiguous matches.
  • Enhanced tap_element resolution — 5-priority fallback chain: ValueKey → Semantics label → derived text ID → exact visible text → normalized text. Elements that previously required tap_point with coordinate guessing now resolve by name.
  • Auto-discovery of 20+ widget types — get_elements now surfaces ListTile, SwitchListTile, CheckboxListTile, ExpansionTile, all button types (ElevatedButton, TextButton, OutlinedButton, FilledButton, IconButton, FloatingActionButton), PopupMenuButton, TextField, TextFormField, Checkbox, Switch, Radio, DropdownButton, GestureDetector, InkWell, and scrollable containers — even without ValueKey.
  • idSource field in element responses: explicit, text, semantics, tooltip, or derived — agents know how stable each ID is.

Screen identity improvements

  • 4-level fallback: explicit ScreenIdentifiable (1.0) → named route (0.8) → AppBar title (0.6) → inferred (0.3)
  • source and appBarTitle fields in get_screen response
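
The 4-level fallback maps naturally to a small resolver (field names here are hypothetical; the confidences mirror the list above):

```typescript
// Hypothetical inputs gathered from the running app.
interface ScreenFacts {
  explicitKey?: string; // from ScreenIdentifiable
  routeName?: string;   // named route
  appBarTitle?: string; // AppBar title
  widgetType: string;   // last-resort inference
}

// Highest-confidence identity wins; everything below "explicit" is "derived".
function resolveScreen(f: ScreenFacts): { key: string; source: string; confidence: number } {
  if (f.explicitKey) return { key: f.explicitKey, source: "explicit", confidence: 1.0 };
  if (f.routeName)   return { key: f.routeName,   source: "derived",  confidence: 0.8 };
  if (f.appBarTitle) return { key: f.appBarTitle, source: "derived",  confidence: 0.6 };
  return { key: f.widgetType, source: "derived", confidence: 0.3 };
}
```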

Interaction safety

  • Off-screen tap detection with clear error messages
  • Descriptive errors suggesting tap_text or get_elements when resolution fails

Tests

  • 25 new tests covering element discovery, resolution, text targeting, and ancestor walking

Files changed

  • element_inventory.dart — rewritten for 20+ widget type discovery
  • element_resolver.dart — new file for text-based resolution
  • interaction_engine.dart — resolver integration, off-screen detection
  • screen_resolver.dart — AppBar title fallback, confidence scoring
  • mcp_tools.dart — tap_text tool registration
  • element_discovery_test.dart — 25 tests
  • Flutter/README.md — updated documentation

Full Changelog: v0.6.0...v0.7.0

v0.6.0 — Flutter Overlay Visibility

31 Mar 18:44

Flutter Overlay Visibility

get_elements, get_view_tree, and findElement now walk Flutter Overlay entries, making drawers, dialogs, bottom sheets, and tooltips fully inspectable and tappable.

What's fixed

Flutter renders overlay-hosted widgets (Drawer, AlertDialog, ModalBottomSheet, Tooltip) in OverlayState entries that sit above the route tree. The standard visitChildren walk from renderViewElement never reached them, so these widgets were invisible to all inspection and interaction tools.

ElementInventory — after the normal depth-first walk, it explicitly collects OverlayState entry elements and traverses them. A visited set prevents double-processing when the standard walk already reaches an element.

InteractionEngine — _visitElements and _findScrollableState now delegate to ElementInventory.visitAll, so scroll, tab selection, and focused text field search also cover overlay-hosted widgets.

ScreenshotCapture — captureElement now finds elements inside overlays.
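
The visited-set pattern can be sketched generically (the node shape is hypothetical; Flutter's real walk goes through Element.visitChildren):

```typescript
// Minimal tree node for illustration.
interface TreeNode {
  id: string;
  children: TreeNode[];
}

// Walk the main tree depth-first, then each overlay entry
// (drawers, dialogs, bottom sheets, tooltips), skipping anything
// the main walk already visited.
function visitAll(root: TreeNode, overlayEntries: TreeNode[], visit: (n: TreeNode) => void): void {
  const visited = new Set<string>();
  const walk = (node: TreeNode): void => {
    if (visited.has(node.id)) return;
    visited.add(node.id);
    visit(node);
    node.children.forEach(walk);
  };
  walk(root);
  overlayEntries.forEach(walk);
}
```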

Affected tools

All Flutter tools that inspect or interact with the element tree:

  • get_elements — overlay widgets now appear in the flat element list
  • get_view_tree — overlay widget trees included in the full dump
  • tap / tap_point — element lookup finds overlay-hosted targets
  • scroll — scrollable containers inside overlays are discoverable
  • screenshot — element capture works for overlay-hosted widgets

Upgrade

No breaking changes. No API changes. Existing integrations get overlay visibility automatically.

v0.5.0 — macOS Support

31 Mar 10:33

macOS Support

AppReveal now works on macOS apps. The same Swift package compiles for both iOS 16+ and macOS 13+, sharing the MCP server, Bonjour discovery, and full tool surface.

What's new

macOS framework modules

  • MacOSWindowProvider — enumerates visible NSWindow instances
  • MacOSElementInventory — walks AppKit view hierarchy (NSButton, NSTextField, NSTableView, etc.)
  • MacOSInteractionEngine — tap, type, scroll via AppKit APIs
  • MacOSScreenResolver — controller hierarchy walking with NSTabViewController/NSSplitViewController support
  • MacOSScreenshotCapture — window capture via CGWindowListCreateImage
  • MacOSWebViewBridge — WKWebView discovery and DOM tools on macOS

3 new macOS-only tools

  • get_menu_bar — read the app menu bar hierarchy
  • click_menu_item — invoke a menu item by title path
  • focus_window — bring a window to the front

Multi-window support (all platforms)

  • list_windows tool returns stable window IDs
  • All 48 native UI and WebView tools accept optional window_id
  • WindowProvider abstraction with iOS and macOS implementations

CLI

  • --platform macos filter works across all commands
  • Help text and README updated

Example app

  • Full macOS AppKit example with sidebar navigation, 8 screens, 13+ accessibility identifiers, and 4 registered providers

Upgrade

No breaking changes. Existing iOS/Android/Flutter integrations are unaffected.

// Package.swift — same package, now builds for macOS too
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.5.0")

v0.4.0 — CLI and LLM control workflows

30 Mar 15:38

What's new

AppReveal CLI for LLM-driven control

This release adds a first-party appreveal CLI that discovers AppReveal MCP servers on the local network and gives LLMs a stable shell interface for inspecting and driving apps.

Core commands:

  • discover — browse _appreveal._tcp targets with device names, IPs, bundle IDs, and codes
  • inspect / tools / call / request — low-level MCP workflows
  • snapshot / find / tap / type — higher-level UI-control flows for agents

Fleet support:

  • --all to fan out across every discovered target
  • repeated --target selectors
  • --platform ios,android,flutter,reactnative filters
  • per-target success and failure reporting

npm package:

  • @unlikeotherai/appreveal

Reliability fixes

  • Fixed iOS numeric JSON decoding so integer coordinate payloads decode correctly for tools like tap_point
  • Mirrored the same numeric decoding fix into the React Native iOS copy
  • Increased the CLI's default discovery timeout to make mDNS lookups less flaky on real devices

Validation

Validated against a live iPhone target running ai.unlikeother.facesregistration:

  • discovery
  • inspect
  • tools/list
  • initialize
  • get_screen
  • get_elements
  • get_view_tree
  • screenshot
  • snapshot
  • find

The live test also surfaced a React Native screen where accessibility labels exist in get_view_tree but get_elements is empty, so the CLI now falls back to accessible view-tree nodes for find and tap.
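
That fallback amounts to roughly the following (sketched with hypothetical node shapes, not the CLI's actual types):

```typescript
// Minimal view-tree node for illustration.
interface ViewTreeNode {
  accessibilityLabel?: string;
  children?: ViewTreeNode[];
}

// Collect every labeled node from a get_view_tree dump.
function accessibleNodes(tree: ViewTreeNode): string[] {
  const out: string[] = [];
  const walk = (n: ViewTreeNode): void => {
    if (n.accessibilityLabel) out.push(n.accessibilityLabel);
    (n.children ?? []).forEach(walk);
  };
  walk(tree);
  return out;
}

// Prefer get_elements results; fall back to the view tree when empty.
function findTargets(elements: string[], tree: ViewTreeNode): string[] {
  return elements.length > 0 ? elements : accessibleNodes(tree);
}
```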

Install / upgrade

npm install -g @unlikeotherai/appreveal
appreveal --help
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.4.0")

v0.3.0 — React Native, device_info, and project icon

17 Mar 17:41

What's new

React Native support

Full React Native implementation (react-native-appreveal) with:

  • iOS native module (Swift + ObjC bridge) — same NWListener HTTP server, 44 MCP tools
  • Android native module (Kotlin) — same NanoHTTPD server, 44 MCP tools
  • TypeScript public API with `__DEV__` guard (no-op in production)
  • React Navigation v6/v7 integration via `createNavigationListener()`
  • `useAppRevealScreen(key, title)` hook for per-screen registration
  • `AppRevealFetchInterceptor.install()` — patches global fetch to capture network traffic
  • Example app: bare React Native, React Navigation v7, 8 screens
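
Patching global fetch for capture typically looks like the sketch below (hypothetical shape; the real AppRevealFetchInterceptor API and captured fields may differ):

```typescript
// Shape of one captured request (assumed for illustration).
type CapturedRequest = {
  method: string;
  url: string;
  status: number;
  durationMs: number;
};

// Wrap globalThis.fetch so every request/response pair is reported to a
// sink, then forwarded unchanged to the original implementation.
function installFetchInterceptor(sink: (r: CapturedRequest) => void): void {
  const original = globalThis.fetch;
  globalThis.fetch = (async (input: any, init?: any) => {
    const url = typeof input === "string" ? input : input.url;
    const method = init?.method ?? "GET";
    const start = Date.now();
    const response = await original(input, init);
    sink({ method, url, status: response.status, durationMs: Date.now() - start });
    return response;
  }) as typeof fetch;
}
```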

New `device_info` tool (all platforms)

Single call returning everything an agent needs to understand the runtime environment:

  • iOS/React Native (iOS): full Info.plist, UIDevice, ProcessInfo, UIScreen, locale, timezone, battery state/level, disk free/total, declared NS*UsageDescription permissions
  • Android/React Native (Android): PackageInfo + manifest metadata, Build fields (manufacturer, model, fingerprint, SDK, security patch, ABIs), display metrics, battery, RAM, disk, locale, timezone, declared permissions
  • Flutter: PackageInfo, Platform, window size/DPR, Dart runtime memory, locale, timezone, process environment (secrets filtered)

Project icon

Cute ghost-peeking-through-phone icon, shown in README header.

Tools reference doc

docs/tools.md — complete reference for all 44 tools with parameters and response shapes.

Bug fixes

  • Flutter: `mdns_advertiser.dart` updated for nsd 2.x API (`register`/`unregister` replacing `startRegistration`/`stopRegistration`)

Tool count

44 tools on all four platforms (up from 43).

Platforms

Platform                  Tools
iOS (Swift/SPM)             44
Android (Kotlin/Gradle)     44
Flutter (Dart)              44
React Native (npm)          44

v0.2.0 — WKWebView Support

15 Mar 23:31

What's new

WKWebView support — Playwright-style DOM access for web views

14 new MCP tools that give agents full visibility and control inside any WKWebView embedded in the app. No integration code needed — AppReveal auto-discovers web views and injects JavaScript to read and interact with the DOM.

Discovery:

  • get_webviews — list all web views with URL, title, loading state
  • get_dom_tree — full or partial DOM tree as JSON
  • get_dom_interactive — all inputs, buttons, links, selects with selectors and attributes
  • query_dom — CSS selector queries
  • find_dom_text — find elements by text content

Interaction:

  • web_click — click by CSS selector
  • web_type — type into inputs/textareas (React/Vue/Angular compatible)
  • web_select — select dropdown options
  • web_toggle — check/uncheck checkboxes and radios
  • web_scroll_to — scroll to element
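
Framework-compatible typing (the React/Vue/Angular note on web_type) usually relies on calling the native value setter and dispatching an input event, since controlled inputs ignore plain el.value assignments. The sketch below builds the JavaScript string a bridge might inject; it is an assumption, not necessarily AppReveal's actual script:

```typescript
// Build the JavaScript to evaluate inside the WKWebView. Frameworks like
// React track input values through the native setter, so the injected code
// calls that setter directly and then fires a bubbling "input" event.
function buildWebTypeScript(selector: string, text: string): string {
  return `
    (() => {
      const el = document.querySelector(${JSON.stringify(selector)});
      if (!el) return false;
      const proto = el instanceof HTMLTextAreaElement
        ? HTMLTextAreaElement.prototype
        : HTMLInputElement.prototype;
      const setter = Object.getOwnPropertyDescriptor(proto, "value").set;
      setter.call(el, ${JSON.stringify(text)});
      el.dispatchEvent(new Event("input", { bubbles: true }));
      return true;
    })();
  `;
}
```

Using JSON.stringify for the selector and text keeps quotes and special characters safely escaped inside the injected source.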

Navigation & advanced:

  • web_navigate — load a URL
  • web_back / web_forward — browser history
  • web_evaluate — run arbitrary JavaScript

Other improvements

  • Auto-derived screen identity — no ScreenIdentifiable protocol needed. Screen key and title derived from class name automatically.
  • get_view_tree tool — full native view hierarchy dump with all properties
  • 35 total MCP tools (21 native + 14 web)

Upgrade

.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.2.0")

No code changes needed — existing AppReveal.start() picks up all new tools automatically.

v0.1.0 — Initial Release

15 Mar 22:58

AppReveal v0.1.0

Debug-only in-app MCP server for iOS. Gives LLM agents structured access to native app UI, state, navigation, and diagnostics over the local network.

What's included

  • 20 MCP tools — screen identity, element inventory, tap, type, scroll, screenshot, state, navigation, feature flags, network traffic, logs, errors, and more
  • Zero-config discovery — Bonjour/mDNS advertising on _appreveal._tcp
  • Streamable HTTP transport — standard MCP protocol over NWListener
  • #if DEBUG everywhere — zero production footprint
  • Example app — 10 screens, 60+ identified elements, all framework features integrated

Quick start

// Package.swift dependency
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.1.0")
#if DEBUG
import AppReveal
AppReveal.start()
#endif

Requirements

  • iOS 16.0+
  • Swift 5.9+
  • Xcode 15+