Psychoacoustic Sonification as User Interface for Human-Machine Interaction

12/18/2019 ∙ by Tim Ziemer, et al.

When operating a machine, the operator needs to know some spatial relations, like the relative location of the target or the nearest obstacle. Often, sensors are used to derive this spatial information, and visual displays are deployed as interfaces to communicate this information to the operator. In this paper, we present psychoacoustic sonification as an alternative interface for human-machine interaction. Instead of visualizations, an interactive sound guides the operator to the desired target location, or helps her avoid obstacles in space. By considering psychoacoustics — i.e., the relationship between the physical and the perceptual attributes of sound — in the audio signal processing, we can communicate precisely and unambiguously interpretable direction and distance cues along three orthogonal axes to a user. We present exemplary use cases from various application areas where users can benefit from psychoacoustic sonification.

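To make the idea of direction and distance cues along three orthogonal axes more concrete, the following is a minimal sketch of how a 3D target offset could be mapped to audible parameters. The specific axis-to-parameter assignments used here (pitch, beating rate, brightness), the function name sonify_offset, and all numeric choices are illustrative assumptions for demonstration only, not the psychoacoustic mapping developed in the paper.

```python
import numpy as np

SAMPLE_RATE = 44100  # Hz

def sonify_offset(dx, dy, dz, duration=0.5):
    """Illustrative sonification of a 3D target offset.

    dx, dy, dz are operator-to-target offsets in [-1, 1] along three
    orthogonal axes. The mappings below are hypothetical examples.
    """
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE

    # Axis 1 (dx): pitch. Map the offset to a frequency around 440 Hz,
    # one octave up or down at the extremes.
    f0 = 440.0 * 2.0 ** dx
    carrier = np.sin(2 * np.pi * f0 * t)

    # Axis 2 (dy): beating (amplitude modulation) rate. Faster beating
    # signals a larger offset; no beating means "on target" on this axis.
    beat_rate = 8.0 * abs(dy)  # Hz
    am = 1.0 - 0.5 * abs(dy) * (1 + np.sin(2 * np.pi * beat_rate * t)) / 2

    # Axis 3 (dz): brightness. Add a second harmonic whose level grows
    # with the offset, so the tone gets sharper the further away we are.
    harmonic = abs(dz) * np.sin(2 * np.pi * 2 * f0 * t)

    signal = am * (carrier + harmonic)
    return signal / np.max(np.abs(signal))  # normalize to [-1, 1]

# Example: target is well to the left, slightly off on the second axis,
# aligned on the third. The buffer can be written to a WAV file or
# streamed to a sound device (e.g. with the soundfile or sounddevice
# packages) and regenerated as the operator moves.
buffer = sonify_offset(dx=-0.8, dy=0.2, dz=0.0)
```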