Measuring and modeling the perception of natural and unconstrained gaze in humans and machines

11/29/2016
by Daniel Harari, et al.

Humans are remarkably adept at interpreting the gaze direction of other individuals in their surroundings. This skill is at the core of the ability to engage in joint visual attention, which is essential for establishing social interactions. How accurate are humans in determining the gaze direction of others in lifelike scenes, when they can move their heads and eyes freely, and what are the sources of information for the underlying perceptual processes? These questions pose a challenge from both empirical and computational perspectives, due to the complexity of the visual input in real-life situations. Here we measure empirically human accuracy in perceiving the gaze direction of others in lifelike scenes, and study computationally the sources of information and representations underlying this cognitive capacity. We show that humans perform better in face-to-face conditions than in recorded conditions, and that this advantage is not due to the availability of input dynamics. We further show that humans still perform well when only the eye region is visible, rather than the whole face. We develop a computational model that replicates the pattern of human performance, including the finding that the eye region on its own contains the information required for estimating both head orientation and direction of gaze. Consistent with neurophysiological findings on task-specific face regions in the brain, the learned computational representations reproduce perceptual effects such as the Wollaston illusion when trained to estimate direction of gaze, but not when trained to recognize objects or faces.
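The paper's model is not reproduced here, but as a loose illustration of the kind of geometric cue available in the eye region alone, a toy estimator might map the pupil's horizontal offset between the two eye corners to a gaze angle. The function name, the coordinate convention, and the linear mapping below are all hypothetical simplifications, not the authors' method:

```python
def estimate_gaze_angle(eye_corners, pupil_center, max_angle_deg=45.0):
    """Toy gaze estimator (illustration only): maps the pupil's horizontal
    offset within the eye region to a gaze angle in degrees.

    eye_corners: ((x_left, y_left), (x_right, y_right)) image coordinates
    pupil_center: (x, y) image coordinates of the pupil
    """
    (x_left, _), (x_right, _) = eye_corners
    eye_center_x = (x_left + x_right) / 2.0
    half_width = (x_right - x_left) / 2.0
    # Normalised offset in [-1, 1]: -1 = pupil at left corner, +1 = right corner
    offset = (pupil_center[0] - eye_center_x) / half_width
    offset = max(-1.0, min(1.0, offset))
    # Hypothetical linear mapping from normalised offset to gaze angle
    return offset * max_angle_deg

# Pupil centred between the corners -> gaze straight ahead
print(estimate_gaze_angle(((10.0, 5.0), (30.0, 5.0)), (20.0, 5.0)))  # prints 0.0
```

A learned model such as the one described in the abstract would replace this hand-built geometry with representations trained directly from eye-region images, which is what allows it to also capture head orientation and effects like the Wollaston illusion.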


Related research

- Gaze Perception in Humans and CNN-Based Model (04/17/2021)
  Making accurate inferences about other individuals' locus of attention i...

- When Computer Vision Gazes at Cognition (12/08/2014)
  Joint attention is a core, early-developing form of social interaction. ...

- From simple innate biases to complex visual concepts (09/01/2021)
  Early in development, infants learn to solve visual problems that are hi...

- Discovery and usage of joint attention in images (04/10/2018)
  Joint visual attention is characterized by two or more individuals looki...

- HREyes: Design, Development, and Evaluation of a Novel Method for AUVs to Communicate Information and Gaze Direction (11/05/2022)
  We present the design, development, and evaluation of HREyes: biomimetic...

- Estimation of Driver's Gaze Region from Head Position and Orientation using Probabilistic Confidence Regions (12/23/2020)
  A smart vehicle should be able to understand human behavior and predict ...

- Active Gaze Control for Foveal Scene Exploration (08/24/2022)
  Active perception and foveal vision are the foundations of the human vis...
