Kinetic surface friction rendering for interactive sonification: an initial exploration

07/29/2021
by   Staas de Jong, et al.

Inspired by the role sound and friction play in interactions with everyday objects, this work aims to identify some of the ways in which kinetic surface friction rendering can complement interactive sonification controlled by movable objects. In order to do this, a tactile system is presented which implements a movable physical object with programmable friction. Important aspects of this system include the capacity to display high-resolution kinetic friction patterns, the ability to algorithmically define interactions directly in terms of physical units, and the complete integration of audio and tactile synthesis. A prototype interaction spatially mapping arbitrary 1D signal data on a surface and directly converting these to sound and friction during movements across the surface is described. The results of a pilot evaluation of this interaction indicate how kinetic surface friction rendering can be a means for giving dynamically created virtual objects for sonification a tangible presence. Some specific possible roles for movement input and friction output are identified, as well as issues to be considered when applying and further developing this type of haptic feedback in the context of interactive sonification.
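
To illustrate the kind of mapping the abstract describes, the sketch below is a minimal, hedged Python example, not the authors' implementation: it assumes a fixed surface length, a shared audio/tactile synthesis rate, and a simple speed-scaled audio and signal-modulated friction mapping, all of which are illustrative choices only. It lays arbitrary 1D signal data out along the surface and, for a given object position and velocity, derives one audio sample and one kinetic friction force per synthesis step.

```python
# Hedged sketch (illustrative assumptions throughout, not the authors' method):
# map 1D signal data onto a surface span and derive audio + kinetic friction
# values during movement across that surface.

import numpy as np

SURFACE_LENGTH_M = 0.20   # assumed horizontal extent of the surface (m)
AUDIO_RATE_HZ = 44100     # assumed shared audio/tactile synthesis rate

def make_signal(n=2048):
    """Arbitrary 1D signal data to be laid out along the surface."""
    t = np.linspace(0.0, 1.0, n)
    return np.sin(2 * np.pi * 5 * t) * np.hanning(n)

def sample_signal(signal, position_m):
    """Look up the signal value at a physical position along the surface."""
    frac = np.clip(position_m / SURFACE_LENGTH_M, 0.0, 1.0)
    return signal[int(frac * (len(signal) - 1))]

def render_step(signal, position_m, velocity_mps,
                mu_base=0.2, mu_span=0.4, normal_force_n=1.0):
    """Return (audio_sample, friction_force_n) for one synthesis step.

    Audio: the local signal value, scaled by movement speed so the surface
    is silent when the object is at rest.
    Friction: a kinetic friction force opposing the direction of motion,
    with its coefficient modulated by the local signal value.
    """
    s = sample_signal(signal, position_m)
    audio = s * min(abs(velocity_mps), 1.0)
    mu = mu_base + mu_span * abs(s)
    friction = -np.sign(velocity_mps) * mu * normal_force_n
    return audio, friction

if __name__ == "__main__":
    signal = make_signal()
    v = 0.05                   # constant stroke velocity (m/s), for illustration
    dt = 1.0 / AUDIO_RATE_HZ
    x = 0.0
    audio_out = []
    for _ in range(int(0.5 * AUDIO_RATE_HZ)):   # half a second of movement
        a, f = render_step(signal, x, v)
        audio_out.append(a)
        x += v * dt
    print(f"rendered {len(audio_out)} audio/friction samples, "
          f"final position {x:.3f} m")
```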

