Facial gesture interfaces for expression and communication

by Michael J. Lyons, et al.

Considerable effort has been devoted to the automatic extraction of information about facial actions from image sequences. Within the context of human-computer interaction (HCI), we may distinguish systems that allow expression from those that aim at recognition. Most work in facial action processing has been directed at automatically recognizing affect from facial actions. By contrast, facial gesture interfaces, which respond to deliberate facial actions, have received comparatively little attention. This paper reviews several projects on vision-based interfaces that rely on facial action for intentional HCI. Applications to several domains are introduced, including text entry, artistic and musical expression, and assistive technology for motor-impaired users.




Facial Gesture Recognition Using Correlation And Mahalanobis Distance

Augmenting human computer interaction with automated analysis and synthe...

Sonification of Facial Actions for Musical Expression

The central role of the face in social interaction and non-verbal commun...

Input Devices for Musical Expression: Borrowing Tools from HCI

This paper reviews the existing literature on input device evaluation an...

A High-Fidelity Open Embodied Avatar with Lip Syncing and Expression Capabilities

Embodied avatars as virtual agents have many applications and provide be...

Deep Multi-Facial patches Aggregation Network for Expression Classification from Face Images

Emotional Intelligence in Human-Computer Interaction has attracted incre...

Using Virtual Humans to Understand Real Ones

Human interactions are characterized by explicit as well as implicit cha...

Designing, Playing, and Performing with a Vision-based Mouth Interface

The role of the face and mouth in speech production as well as non-verbal...