Concurrent Crossmodal Feedback Assists Target-searching: Displaying Distance Information Through Visual, Auditory and Haptic Modalities

02/16/2020
by Feng Feng, et al.

Humans' sense of distance depends on the integration of multisensory cues. Incoming visual luminance, auditory pitch and tactile vibration can all contribute to the ability to judge distance. This ability can be enhanced when the multimodal cues are associated in a congruent manner, a phenomenon referred to as crossmodal correspondence. In the context of multisensory interaction, whether and how such correspondences influence information processing under continuous motor engagement, particularly during target-searching activities, has rarely been investigated. This paper presents an experimental user study to address this question. We built a target-searching application on a tabletop display, presented unimodal and crossmodal distance cues concurrently in response to people's searching movements, and measured task performance through kinematic evaluation. We found that the crossmodal display and the audio display led to improved searching efficiency and accuracy. More interestingly, this improvement is confirmed by the kinematic analysis, which also revealed the underlying movement features that could account for it. We discuss how these findings could shed light on the design of assistive technology and of other multisensory interactions.
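As a rough illustration of the kind of concurrent distance-to-cue display described above, the sketch below maps a normalized target distance to congruent visual, auditory and haptic cue intensities. The function name, parameter ranges and mapping shape are assumptions made for illustration only; the paper does not specify its actual encoding in this abstract.

```python
# Hypothetical sketch: map a normalized target distance (0 = on target,
# 1 = far away) to congruent cue intensities for visual, auditory and
# haptic channels, so that all three cues intensify together as the
# searching hand approaches the target.

def crossmodal_cues(distance: float) -> dict:
    """Return congruent cue parameters for a normalized distance in [0, 1]."""
    d = min(max(distance, 0.0), 1.0)            # clamp to the valid range
    closeness = 1.0 - d                         # cues grow stronger near the target
    return {
        "luminance": closeness,                 # visual: brighter when closer (0..1)
        "pitch_hz": 220.0 + 660.0 * closeness,  # auditory: higher pitch when closer (220-880 Hz)
        "vibration_amp": closeness,             # haptic: stronger vibration when closer (0..1)
    }

if __name__ == "__main__":
    for d in (1.0, 0.5, 0.1):
        print(f"distance={d}: {crossmodal_cues(d)}")
```

In an interactive setup of this kind, such a mapping would be evaluated continuously as the user's hand moves, so the cues update concurrently with the searching movement rather than at discrete checkpoints.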
