RevVision delivers human-centered sports analytics for bowling, applying computer vision to footage from Meta smart glasses. The system extracts performance information from the first-person wearable perspective to help users understand their game.
- Processes first-person video from Meta smart glasses and produces visualizations and performance metrics in real time (when deployed on a GPU-backed processing environment).
- Segments the bowling lane to isolate relevant playing surfaces and applies geometric transformations, including perspective correction and spatial warping, to normalize the view.
- Detects the ball and tracks its trajectory through each shot, extracting metrics such as speed and curvature.
- Displays lane visualizations and metrics on the Meta glasses' heads-up display (HUD).
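The perspective-correction step above amounts to estimating a homography that maps the trapezoidal lane (as seen from the glasses) onto a top-down rectangle. The repository presumably uses OpenCV for this; the sketch below shows only the underlying math in plain NumPy, and the corner pixel coordinates and target rectangle size are illustrative values, not taken from the codebase.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve for the 3x3 homography H mapping 4 src points to 4 dst points (DLT)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply homography H to a single pixel coordinate (homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical lane corners in image space (a trapezoid under perspective):
lane_corners = [(410, 620), (870, 620), (1200, 1060), (80, 1060)]
# Illustrative top-down target: a 400 x 2000 pixel rectangle.
top_down = [(0, 0), (400, 0), (400, 2000), (0, 2000)]

H = homography_from_corners(lane_corners, top_down)
print(warp_point(H, 410, 620))  # maps the first lane corner to ≈ (0.0, 0.0)
```

With `H` in hand, every detected ball position can be warped into the normalized top-down frame, so distances along the lane become directly comparable across shots.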
Note: Until the Meta Wearables SDK provides native access, streaming is coordinated through a video call. See: https://github.com/josephletobar/glass-sync
side_by_side.mp4
Note: Actual device does not allow direct recording; this is a reconstructed approximation.
demo.mp4
RevVision is built to operate on first-person video captured from Meta smart glasses via the Meta Wearables Device Access Toolkit. In the full system, video is streamed from the glasses to a mobile device, forwarded into the RevVision vision pipeline, and processed to produce visualizations and metrics.
This repository exposes the same processing pipeline, but uses recorded video files as the input interface for simplicity and reproducibility. This allows users to run the exact vision and analysis stages without requiring live glasses streaming or mobile setup.
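Because the pipeline only consumes frames, it can be driven equally by a recorded file or a live stream, which is what makes the file-based interface described above possible. The sketch below illustrates that shape with a minimal, hypothetical `analyze_frames` function (not the repo's actual API): the ball "detector" is a stand-in that picks the brightest pixel, and the fps and pixels-per-meter scale are made-up values.

```python
import numpy as np

def analyze_frames(frames, fps=30.0, px_per_meter=200.0):
    """Run a minimal analysis pass over any iterable of frames.

    `frames` may come from a recorded file or a live stream; the pipeline
    itself only ever sees numpy arrays. The ball 'detector' here is a
    stand-in that returns the brightest pixel in each frame.
    """
    centers = []
    for frame in frames:
        y, x = np.unravel_index(np.argmax(frame), frame.shape)
        centers.append((x, y))
    centers = np.asarray(centers, float)
    # Speed: mean per-frame displacement, scaled from pixels to meters/second.
    steps = np.linalg.norm(np.diff(centers, axis=0), axis=1)
    speed = steps.mean() * fps / px_per_meter
    return {"trajectory": centers, "speed_mps": float(speed)}

# Synthetic stand-in for a recorded session: a bright spot moving 10 px/frame.
frames = []
for i in range(5):
    f = np.zeros((100, 100))
    f[50, 10 + 10 * i] = 1.0
    frames.append(f)

result = analyze_frames(frames, fps=30.0, px_per_meter=200.0)
print(result["speed_mps"])  # 10 px/frame * 30 fps / 200 px/m = 1.5 m/s
```

Swapping the synthetic list for frames decoded from a `test_videos/` file (or a live stream) changes nothing downstream, which is the reproducibility property this repository relies on.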
- Setup

```shell
git clone https://github.com/yourusername/rev-vision.git
cd rev-vision
pip install -r requirements.txt
```

- Run
Set the path to a recorded bowling session video (for example, from test_videos/) inside launcher.py, then run:
```shell
python launcher.py
```