Releases: UnlikeOtherAI/AppReveal
v0.8.0 — Cross-platform parity: tap_text, idSource, screen source
What's new
Cross-platform parity release. All platforms (iOS, macOS, Android, Flutter, React Native) now share the same enhanced tool surface.
New tool: tap_text
Tap elements by their visible text content. Finds matching text in the view hierarchy and walks up to the nearest tappable ancestor. Supports exact and contains match modes with occurrence disambiguation for multiple matches.
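The match-and-walk behavior can be sketched as follows. `Node` and its fields are illustrative stand-ins for the platform view-hierarchy types, not the actual implementation:

```typescript
// Hypothetical view-hierarchy node; real platform nodes differ.
interface Node {
  text?: string;
  tappable: boolean;
  parent?: Node;
  children: Node[];
}

// Collect nodes whose text matches, depth-first, in document order.
function findMatches(root: Node, query: string, mode: "exact" | "contains"): Node[] {
  const out: Node[] = [];
  const walk = (n: Node) => {
    if (n.text !== undefined &&
        (mode === "exact" ? n.text === query : n.text.includes(query))) {
      out.push(n);
    }
    n.children.forEach(walk);
  };
  walk(root);
  return out;
}

// Resolve tap_text: pick the occurrence-th match, then walk up
// to the nearest tappable ancestor (falling back to the node itself).
function resolveTapText(root: Node, query: string,
                        mode: "exact" | "contains" = "exact",
                        occurrence = 0): Node | undefined {
  const match = findMatches(root, query, mode)[occurrence];
  for (let n: Node | undefined = match; n; n = n.parent) {
    if (n.tappable) return n;
  }
  return match;
}
```

Occurrence indexing lets an agent disambiguate when the same label appears more than once on screen.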
Enhanced element identity: idSource
Every element in get_elements now includes an idSource field indicating how its ID was derived:
- `explicit` — from accessibility identifier, view tag, or resource ID
- `text` — normalized from visible text content
- `semantics` — from accessibility label or content description
- `tooltip` — from tooltip text
- `derived` — auto-generated fallback
Enhanced screen identity: source and appBarTitle
get_screen now returns:
- `source` — `"explicit"` (from `ScreenIdentifiable`/`registerScreen`) or `"derived"` (auto-detected)
- `appBarTitle` — title extracted from the navigation bar (iOS), window title (macOS), or toolbar/action bar (Android)
Smarter tap_element
tap_element now uses a 4-step fallback chain: accessibility ID → accessibility label → text-derived ID → exact text match with tappable ancestor. Error messages suggest tap_text or get_elements when an element isn't found.
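As a sketch, the fallback chain amounts to trying each identity in priority order and taking the first hit. The `El` record and its field names are hypothetical:

```typescript
// Hypothetical element record; field names are illustrative only.
interface El {
  accessibilityId?: string;
  accessibilityLabel?: string;
  textDerivedId?: string;
  text?: string;
  tappable: boolean;
}

// Try each identity in priority order; return the first element that matches.
function resolveTapElement(elements: El[], query: string): El | undefined {
  const steps: ((e: El) => boolean)[] = [
    (e) => e.accessibilityId === query,     // 1. accessibility ID
    (e) => e.accessibilityLabel === query,  // 2. accessibility label
    (e) => e.textDerivedId === query,       // 3. text-derived ID
    (e) => e.text === query && e.tappable,  // 4. exact text with tappable element
  ];
  for (const matches of steps) {
    const hit = elements.find(matches);
    if (hit) return hit;
  }
  return undefined; // caller surfaces an error suggesting tap_text or get_elements
}
```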
ID deduplication
When multiple elements resolve to the same ID, they are automatically disambiguated with _1, _2 suffixes.
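One plausible suffixing scheme (the release notes do not specify whether the first occurrence keeps its bare ID; this sketch assumes it does):

```typescript
// Append _1, _2, ... to repeated IDs so every element ID is unique.
function dedupeIds(ids: string[]): string[] {
  const seen = new Map<string, number>();
  return ids.map((id) => {
    const n = seen.get(id) ?? 0;
    seen.set(id, n + 1);
    return n === 0 ? id : `${id}_${n}`;
  });
}
```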
Platforms
All changes are implemented identically across iOS, macOS, Android, Flutter, and React Native.
v0.7.0 — Enhanced Flutter tap targeting
What's new
Flutter: Text-based element targeting
- `tap_text` MCP tool — tap any visible text on screen without requiring a `ValueKey`. Supports `match_mode` (exact/contains) and `occurrence` for ambiguous matches.
- Enhanced `tap_element` resolution — 5-priority fallback chain: ValueKey → Semantics label → derived text ID → exact visible text → normalized text. Elements that previously required `tap_point` with coordinate guessing now resolve by name.
- Auto-discovery of 20+ widget types — `get_elements` now surfaces `ListTile`, `SwitchListTile`, `CheckboxListTile`, `ExpansionTile`, all button types (`ElevatedButton`, `TextButton`, `OutlinedButton`, `FilledButton`, `IconButton`, `FloatingActionButton`), `PopupMenuButton`, `TextField`, `TextFormField`, `Checkbox`, `Switch`, `Radio`, `DropdownButton`, `GestureDetector`, `InkWell`, and scrollable containers — even without a `ValueKey`.
- `idSource` field in element responses: `explicit`, `text`, `semantics`, `tooltip`, or `derived` — agents know how stable each ID is.
Screen identity improvements
- 4-level fallback: explicit `ScreenIdentifiable` (1.0) → named route (0.8) → AppBar title (0.6) → inferred (0.3)
- `source` and `appBarTitle` fields in the `get_screen` response
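The fallback and its confidence scores can be sketched as follows; the `ScreenSignals` shape is illustrative, while the confidence values come from the list above:

```typescript
// Hypothetical screen signals; confidence values from the release notes.
interface ScreenSignals {
  explicitKey?: string;  // ScreenIdentifiable / registered key
  routeName?: string;    // named route
  appBarTitle?: string;  // AppBar title
  inferredKey: string;   // always-available fallback
}

// Resolve the screen key from the strongest available signal.
function resolveScreen(s: ScreenSignals): { key: string; confidence: number } {
  if (s.explicitKey) return { key: s.explicitKey, confidence: 1.0 };
  if (s.routeName) return { key: s.routeName, confidence: 0.8 };
  if (s.appBarTitle) return { key: s.appBarTitle, confidence: 0.6 };
  return { key: s.inferredKey, confidence: 0.3 };
}
```

Agents can use the confidence score to decide how much to trust a screen identity when planning navigation.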
Interaction safety
- Off-screen tap detection with clear error messages
- Descriptive errors suggesting `tap_text` or `get_elements` when resolution fails
Tests
- 25 new tests covering element discovery, resolution, text targeting, and ancestor walking
Files changed
- `element_inventory.dart` — rewritten for 20+ widget type discovery
- `element_resolver.dart` — new file for text-based resolution
- `interaction_engine.dart` — resolver integration, off-screen detection
- `screen_resolver.dart` — AppBar title fallback, confidence scoring
- `mcp_tools.dart` — `tap_text` tool registration
- `element_discovery_test.dart` — 25 tests
- `Flutter/README.md` — updated documentation
Full Changelog: v0.6.0...v0.7.0
v0.6.0 — Flutter Overlay Visibility
Flutter Overlay Visibility
get_elements, get_view_tree, and findElement now walk Flutter Overlay entries, making drawers, dialogs, bottom sheets, and tooltips fully inspectable and tappable.
What's fixed
Flutter renders overlay-hosted widgets (Drawer, AlertDialog, ModalBottomSheet, Tooltip) in OverlayState entries that sit above the route tree. The standard visitChildren walk from renderViewElement never reached them, so these widgets were invisible to all inspection and interaction tools.
ElementInventory — After the normal depth-first walk, explicitly collects OverlayState entry elements and traverses them. A visited set prevents double-processing when the standard walk already reaches an element.
InteractionEngine — _visitElements and _findScrollableState now delegate to ElementInventory.visitAll, so scroll, tab selection, and focused text field search also cover overlay-hosted widgets.
ScreenshotCapture — captureElement now finds elements inside overlays.
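In TypeScript terms, the visited-set walk looks roughly like this; the node and entry types are illustrative stand-ins for Flutter's element and overlay-entry types:

```typescript
// Hypothetical element node; overlay entries act as extra roots
// that the normal depth-first walk never reaches.
interface Elem { id: string; children: Elem[]; }

// Walk the main tree, then each overlay entry, skipping visited nodes.
function visitAll(root: Elem, overlayEntries: Elem[], visit: (e: Elem) => void): void {
  const seen = new Set<Elem>();
  const walk = (e: Elem) => {
    if (seen.has(e)) return; // prevents double-processing
    seen.add(e);
    visit(e);
    e.children.forEach(walk);
  };
  walk(root);
  overlayEntries.forEach(walk);
}
```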
Affected tools
All Flutter tools that inspect or interact with the element tree:
- `get_elements` — overlay widgets now appear in the flat element list
- `get_view_tree` — overlay widget trees included in the full dump
- `tap` / `tap_point` — element lookup finds overlay-hosted targets
- `scroll` — scrollable containers inside overlays are discoverable
- `screenshot` — element capture works for overlay-hosted widgets
Upgrade
No breaking changes. No API changes. Existing integrations get overlay visibility automatically.
v0.5.0 — macOS Support
macOS Support
AppReveal now works on macOS apps. The same Swift package compiles for both iOS 16+ and macOS 13+, sharing the MCP server, Bonjour discovery, and full tool surface.
What's new
macOS framework modules
- `MacOSWindowProvider` — enumerates visible `NSWindow` instances
- `MacOSElementInventory` — walks the AppKit view hierarchy (`NSButton`, `NSTextField`, `NSTableView`, etc.)
- `MacOSInteractionEngine` — tap, type, scroll via AppKit APIs
- `MacOSScreenResolver` — controller hierarchy walking with `NSTabViewController`/`NSSplitViewController` support
- `MacOSScreenshotCapture` — window capture via `CGWindowListCreateImage`
- `MacOSWebViewBridge` — `WKWebView` discovery and DOM tools on macOS
3 new macOS-only tools
- `get_menu_bar` — read the app menu bar hierarchy
- `click_menu_item` — invoke a menu item by title path
- `focus_window` — bring a window to the front
Multi-window support (all platforms)
- `list_windows` tool returns stable window IDs
- All 48 native UI and WebView tools accept an optional `window_id`
- `WindowProvider` abstraction with iOS and macOS implementations
CLI
- `--platform macos` filter works across all commands
- Help text and README updated
Example app
- Full macOS AppKit example with sidebar navigation, 8 screens, 13+ accessibility identifiers, and 4 registered providers
Upgrade
No breaking changes. Existing iOS/Android/Flutter integrations are unaffected.
// Package.swift — same package, now builds for macOS too
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.5.0")

v0.4.0 — CLI and LLM control workflows
What's new
AppReveal CLI for LLM-driven control
This release adds a first-party appreveal CLI that discovers AppReveal MCP servers on the local network and gives LLMs a stable shell interface for inspecting and driving apps.
Core commands:
- `discover` — browse `_appreveal._tcp` targets with device names, IPs, bundle IDs, and codes
- `inspect` / `tools` / `call` / `request` — low-level MCP workflows
- `snapshot` / `find` / `tap` / `type` — higher-level UI-control flows for agents
Fleet support:
- `--all` to fan out across every discovered target
- repeated `--target` selectors
- `--platform ios,android,flutter,reactnative` filters
- per-target success and failure reporting
npm package: `@unlikeotherai/appreveal`
Reliability fixes
- Fixed iOS numeric JSON decoding so integer coordinate payloads can be read correctly by tools like `tap_point`
- Mirrored the same numeric decoding fix into the React Native iOS copy
- Increased the CLI's default discovery timeout to make mDNS lookups less flaky on real devices
Validation
Validated against a live iPhone target running ai.unlikeother.facesregistration:
- discovery
- inspect
- tools/list
- initialize
- get_screen
- get_elements
- get_view_tree
- screenshot
- snapshot
- find
The live test also confirmed a React Native screen where accessibility labels exist in get_view_tree but get_elements is empty, so the CLI now falls back to accessible view-tree nodes for find and tap.
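A minimal sketch of that fallback, with illustrative node shapes (the real CLI types are not shown here):

```typescript
// Hypothetical view-tree node as returned by get_view_tree.
interface TreeNode { accessibilityLabel?: string; children: TreeNode[]; }

// Flatten the accessible nodes out of the view tree.
function accessibleNodes(root: TreeNode): TreeNode[] {
  const out: TreeNode[] = [];
  const walk = (n: TreeNode) => {
    if (n.accessibilityLabel) out.push(n);
    n.children.forEach(walk);
  };
  walk(root);
  return out;
}

// Prefer get_elements results; when empty (as seen on some React Native
// screens), fall back to accessible view-tree nodes for find and tap.
function findTargets(elements: { id: string }[], tree: TreeNode): unknown[] {
  return elements.length > 0 ? elements : accessibleNodes(tree);
}
```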
Install / upgrade
npm install -g @unlikeotherai/appreveal
appreveal --help

.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.4.0")

v0.3.0 — React Native, device_info, and project icon
What's new
React Native support
Full React Native implementation (react-native-appreveal) with:
- iOS native module (Swift + ObjC bridge) — same NWListener HTTP server, 44 MCP tools
- Android native module (Kotlin) — same NanoHTTPD server, 44 MCP tools
- TypeScript public API with `__DEV__` guard (no-op in production)
- React Navigation v6/v7 integration via `createNavigationListener()`
- `useAppRevealScreen(key, title)` hook for per-screen registration
- `AppRevealFetchInterceptor.install()` — patches global fetch to capture network traffic
- Example app: bare React Native, React Navigation v7, 8 screens
New `device_info` tool (all platforms)
Single call returning everything an agent needs to understand the runtime environment:
- iOS/React Native (iOS): full Info.plist, UIDevice, ProcessInfo, UIScreen, locale, timezone, battery state/level, disk free/total, declared NS*UsageDescription permissions
- Android/React Native (Android): PackageInfo + manifest metadata, Build fields (manufacturer, model, fingerprint, SDK, security patch, ABIs), display metrics, battery, RAM, disk, locale, timezone, declared permissions
- Flutter: PackageInfo, Platform, window size/DPR, Dart runtime memory, locale, timezone, process environment (secrets filtered)
Project icon
Cute ghost-peeking-through-phone icon, shown in README header.
Tools reference doc
docs/tools.md — complete reference for all 44 tools with parameters and response shapes.
Bug fixes
- Flutter: `mdns_advertiser.dart` updated for nsd 2.x API (`register`/`unregister` replacing `startRegistration`/`stopRegistration`)
Tool count
44 tools on all four platforms (up from 43).
Platforms
| Platform | Tools |
|---|---|
| iOS (Swift/SPM) | 44 |
| Android (Kotlin/Gradle) | 44 |
| Flutter (Dart) | 44 |
| React Native (npm) | 44 |
v0.2.0 — WKWebView Support
What's new
WKWebView support — Playwright-style DOM access for web views
14 new MCP tools that give agents full visibility and control inside any WKWebView embedded in the app. No integration code needed — AppReveal auto-discovers web views and injects JavaScript to read and interact with the DOM.
Discovery:
- `get_webviews` — list all web views with URL, title, loading state
- `get_dom_tree` — full or partial DOM tree as JSON
- `get_dom_interactive` — all inputs, buttons, links, selects with selectors and attributes
- `query_dom` — CSS selector queries
- `find_dom_text` — find elements by text content
Interaction:
- `web_click` — click by CSS selector
- `web_type` — type into inputs/textareas (React/Vue/Angular compatible)
- `web_select` — select dropdown options
- `web_toggle` — check/uncheck checkboxes and radios
- `web_scroll_to` — scroll to element
Navigation & advanced:
- `web_navigate` — load a URL
- `web_back` / `web_forward` — browser history
- `web_evaluate` — run arbitrary JavaScript
Other improvements
- Auto-derived screen identity — no `ScreenIdentifiable` protocol needed. Screen key and title are derived from the class name automatically.
- `get_view_tree` tool — full native view hierarchy dump with all properties
- 35 total MCP tools (21 native + 14 web)
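One plausible derivation from a class name, with illustrative helper names (not the actual implementation): strip common view-controller suffixes, then split camel case into a key and a title.

```typescript
// Hypothetical sketch: "UserProfileViewController" -> key "user_profile",
// title "User Profile". Suffix list and separators are assumptions.
function deriveScreenIdentity(className: string): { key: string; title: string } {
  const base = className.replace(/(ViewController|Controller|Screen)$/, "");
  const words = base.split(/(?=[A-Z])/).filter((w) => w.length > 0);
  return {
    key: words.join("_").toLowerCase(),
    title: words.join(" "),
  };
}
```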
Upgrade
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.2.0")

No code changes needed — existing `AppReveal.start()` picks up all new tools automatically.
v0.1.0 — Initial Release
AppReveal v0.1.0
Debug-only in-app MCP server for iOS. Gives LLM agents structured access to native app UI, state, navigation, and diagnostics over the local network.
What's included
- 20 MCP tools — screen identity, element inventory, tap, type, scroll, screenshot, state, navigation, feature flags, network traffic, logs, errors, and more
- Zero-config discovery — Bonjour/mDNS advertising on `_appreveal._tcp`
- Streamable HTTP transport — standard MCP protocol over NWListener
- `#if DEBUG` everywhere — zero production footprint
- Example app — 10 screens, 60+ identified elements, all framework features integrated
Quick start
// Package.swift dependency
.package(url: "https://github.com/UnlikeOtherAI/AppReveal.git", from: "0.1.0")

#if DEBUG
import AppReveal
AppReveal.start()
#endif

Requirements
- iOS 16.0+
- Swift 5.9+
- Xcode 15+