MediaPipe Hands: On-device Real-time Hand Tracking

06/18/2020
by Fan Zhang, et al.

We present a real-time on-device hand tracking pipeline that predicts a hand skeleton from a single RGB camera for AR/VR applications. The pipeline consists of two models: 1) a palm detector and 2) a hand landmark model. It is implemented via MediaPipe, a framework for building cross-platform ML solutions. The proposed model and pipeline architecture demonstrate real-time inference speed on mobile GPUs with high prediction quality. MediaPipe Hands is open sourced at https://mediapipe.dev.
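As a brief usage sketch (not part of the paper itself), the open-sourced pipeline can be driven from the MediaPipe Python solution API together with OpenCV. The capture loop, confidence thresholds, and window handling below are illustrative defaults assumed for the example, not values prescribed by the authors.

    # Illustrative sketch: running the two-stage pipeline (palm detector +
    # hand landmark model) via the MediaPipe Hands Python solution API.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    mp_drawing = mp.solutions.drawing_utils

    with mp_hands.Hands(
            static_image_mode=False,       # video mode: the palm detector runs only when tracking is lost
            max_num_hands=2,
            min_detection_confidence=0.5,  # illustrative thresholds
            min_tracking_confidence=0.5) as hands:
        cap = cv2.VideoCapture(0)
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand_landmarks in results.multi_hand_landmarks:
                    # 21 landmarks per detected hand, with normalized x, y coordinates.
                    mp_drawing.draw_landmarks(frame, hand_landmarks,
                                              mp_hands.HAND_CONNECTIONS)
            cv2.imshow('MediaPipe Hands', frame)
            if cv2.waitKey(1) & 0xFF == 27:  # ESC to quit
                break
        cap.release()
        cv2.destroyAllWindows()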

Related research

07/15/2019  Real-time Hair Segmentation and Recoloring on Mobile GPUs
07/15/2019  Real-time Facial Surface Geometry from Monocular Video on Mobile GPUs
10/29/2021  On-device Real-time Hand Gesture Recognition
10/04/2015  Efficient Hand Articulations Tracking using Adaptive Hand Model and Depth map
11/04/2022  HoloLens 2 Sensor Streaming
05/24/2023  Generative Adversarial Shaders for Real-Time Realism Enhancement
07/27/2022  On-Device CPU Scheduling for Sense-React Systems
