Real-time Pupil Tracking from Monocular Video for Digital Puppetry

06/19/2020
by Artsiom Ablavatski et al.

We present a simple, real-time approach for pupil tracking from live video on mobile devices. Our method extends a state-of-the-art face mesh detector with two new components: a tiny neural network that predicts the 2D positions of the pupils, and a displacement-based estimation of the pupil blend shape coefficients. Our technique can be used to accurately control the pupil movements of a virtual puppet, lending it liveliness and energy. The proposed approach runs at over 50 FPS on modern phones, enabling its use in any real-time puppeteering pipeline.
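The abstract does not include code, but the displacement-based idea lends itself to a short illustration. Below is a minimal Python sketch of how 2D pupil positions might be converted into blend shape coefficients by normalizing the pupil's offset from the eye center against the eye's width and height. The function, landmark inputs, and blend shape names (lookLeft, lookRight, lookUp, lookDown) are hypothetical, not the authors' actual pipeline or API.

```python
import numpy as np

def pupil_blendshapes(eye_inner, eye_outer, eye_top, eye_bottom, pupil):
    """Hypothetical displacement-based pupil blend shape estimate.

    Each argument is a 2D point (np.ndarray of shape (2,)) in image
    coordinates, e.g. eye landmarks from a face mesh plus a pupil
    center from a small pupil-localization network.
    Returns coefficients in [0, 1] for four illustrative blend shapes.
    """
    # An eye-local frame makes the estimate invariant to face scale.
    center = (eye_inner + eye_outer) / 2.0
    half_width = np.linalg.norm(eye_outer - eye_inner) / 2.0
    half_height = np.linalg.norm(eye_top - eye_bottom) / 2.0

    # Pupil displacement from the eye center, normalized to roughly [-1, 1].
    dx = (pupil[0] - center[0]) / half_width
    dy = (pupil[1] - center[1]) / half_height

    # Split each signed displacement into a pair of one-sided
    # coefficients and clamp to the valid blend shape range.
    return {
        "lookRight": float(np.clip(dx, 0.0, 1.0)),
        "lookLeft": float(np.clip(-dx, 0.0, 1.0)),
        "lookDown": float(np.clip(dy, 0.0, 1.0)),  # image y grows downward
        "lookUp": float(np.clip(-dy, 0.0, 1.0)),
    }
```

Normalizing by eye width and height keeps the coefficients stable as the face moves toward or away from the camera, which is one plausible reason a displacement-based formulation suits a lightweight mobile pipeline.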

Related research

09/11/2023 · Blendshapes GHUM: Real-time Monocular Facial Blendshape Prediction
We present Blendshapes GHUM, an on-device ML pipeline that predicts 52 f...

09/21/2016 · Production-Level Facial Performance Capture Using Deep Convolutional Neural Networks
We present a real-time deep learning framework for video-based facial pe...

10/07/2020 · A Novel Face-tracking Mouth Controller and its Application to Interacting with Bioacoustic Models
We describe a simple, computationally light, real-time system for tracki...

12/19/2017 · Real-time deep hair matting on mobile devices
Augmented reality is an emerging technology in many application domains....

10/08/2019 · Real-time processing of high resolution video and 3D model-based tracking in remote tower operations
High quality video data is a core component in emerging remote tower ope...

03/06/2019 · Clear Skies Ahead: Towards Real-Time Automatic Sky Replacement in Video
Digital videos such as those captured by a smartphone often exhibit expo...

12/09/2011 · Real-time face swapping as a tool for understanding infant self-recognition
To study the preference of infants for contingency of movements and fami...
