Influence of visual cues on head and eye movements during listening tasks in multi-talker audiovisual environments with animated characters

11/16/2018
by Maartje M. E. Hendrikse, et al.

Recent studies of hearing aid benefits indicate that head-movement behavior influences performance. To assess these effects systematically, movement behavior must be measured in realistic communication conditions. For this purpose, the use of virtual audiovisual environments with animated characters as visual stimuli has been proposed. It is unclear, however, how these animations influence the head- and eye-movement behavior of subjects. Here, two listening tasks were carried out with a group of 14 young normal-hearing subjects to investigate the influence of visual cues on head- and eye-movement behavior, on combined localization and speech-intelligibility task performance, and on perceived speech intelligibility, perceived listening effort, and the general impression of the audiovisual environments. Animated characters with different lip-syncing and gaze patterns were compared to an audio-only condition and to a video of real persons. Results show that movement behavior, task performance, and perception were all influenced by visual cues. The movement behavior in animation conditions with lip-syncing was similar to that in the video condition. These results in young normal-hearing listeners are a first step towards using animated characters to assess the influence of head-movement behavior on hearing aid performance.
