Sonification of Facial Actions for Musical Expression

10/07/2020
by Mathias Funk, et al.

The central role of the face in social interaction and non-verbal communication suggests we explore facial action as a means of musical expression. This paper presents the design, implementation, and preliminary studies of a novel system utilizing face detection and optic flow algorithms to associate facial movements with sound synthesis in a topographically specific fashion. We report on our experience with various gesture-to-sound mappings and applications, and describe our preliminary experiments at musical performance using the system.
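The abstract describes mapping regional facial movement, extracted via optic flow within a detected face, to sound synthesis parameters. The paper does not give its actual mapping here, so the following is a minimal illustrative sketch, assuming a per-region mean-flow-magnitude measure and hypothetical region boxes and parameter ranges (`brow` to pitch, `eyes` to filter cutoff, `mouth` to amplitude); the real system's topography and synthesis targets may differ.

```python
import numpy as np

# Hypothetical "topographically specific" mapping: mean optic-flow
# magnitude in each facial region drives one synthesis parameter.
# Region boxes and parameter ranges below are illustrative assumptions.

REGIONS = {            # (y0, y1, x0, x1) as fractions of the face box
    "brow":  (0.15, 0.35, 0.1, 0.9),
    "eyes":  (0.35, 0.50, 0.1, 0.9),
    "mouth": (0.65, 0.90, 0.2, 0.8),
}

def regional_flow(flow, face_box):
    """Mean optic-flow magnitude per facial region.

    flow:     (H, W, 2) array of per-pixel (dx, dy) displacements,
              e.g. from a dense optic-flow algorithm
    face_box: (x, y, w, h) face bounding box from a face detector
    """
    x, y, w, h = face_box
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow speed
    out = {}
    for name, (fy0, fy1, fx0, fx1) in REGIONS.items():
        r = mag[y + int(fy0 * h): y + int(fy1 * h),
                x + int(fx0 * w): x + int(fx1 * w)]
        out[name] = float(r.mean()) if r.size else 0.0
    return out

def flow_to_synth(activity, gain=50.0):
    """Map regional activity to illustrative synthesis parameters."""
    return {
        "pitch_hz":  220.0 + gain * 10 * activity["brow"],    # brow raises pitch
        "cutoff_hz": 500.0 + gain * 40 * activity["eyes"],    # eye motion opens filter
        "amplitude": min(1.0, gain * activity["mouth"] / 5),  # mouth drives loudness
    }

if __name__ == "__main__":
    flow = np.zeros((240, 320, 2))
    flow[150:200, 80:240] = (0.0, 2.0)   # simulate a downward mouth movement
    act = regional_flow(flow, (60, 40, 200, 180))
    print(flow_to_synth(act))
```

In a live setting, `flow` would come from consecutive camera frames (e.g. a dense optic-flow routine) and `face_box` from a face detector, with the resulting parameters sent each frame to a synthesis engine.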


