Concurrent Crossmodal Feedback Assists Target-searching: Displaying Distance Information Through Visual, Auditory and Haptic Modalities

02/16/2020
by Feng Feng, et al.

Humans' sense of distance depends on the integration of multisensory cues. Incoming visual luminance, auditory pitch and tactile vibration can all contribute to distance judgement, and this ability is enhanced when multimodal cues are associated in a congruent manner, a phenomenon referred to as crossmodal correspondence. In the context of multisensory interaction, whether and how such correspondences influence information processing under continuous motor engagement, particularly during target-searching activities, has rarely been investigated. This paper presents an experimental user study to address this question. We built a target-searching application on a tabletop display, presented unimodal and crossmodal distance cues concurrently in response to people's searching movements, and measured task performance through kinematic evaluation. We found that the crossmodal display and the audio display led to improved searching efficiency and accuracy. More interestingly, this improvement was confirmed by kinematic analysis, which also revealed the underlying movement features that could account for it. We discuss how these findings could shed light on the design of assistive technology and other multisensory interactions.
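The abstract does not describe the implementation itself; purely as an illustrative sketch, the Python snippet below shows one plausible way a concurrent distance display of this kind could be driven, mapping hand-to-target proximity onto congruent cue levels for the visual, auditory and haptic channels. The function names, the 220–880 Hz pitch span and the 50 cm search radius are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): map a participant's
# hand-to-target distance onto congruent crossmodal cues, i.e. visual luminance,
# auditory pitch and vibrotactile intensity that all rise as the hand approaches
# the hidden target. Cue ranges and the search radius are assumed values.

def normalized_proximity(hand_xy, target_xy, max_dist=0.5):
    """Return 1.0 at the target and 0.0 at or beyond max_dist (metres)."""
    dx = hand_xy[0] - target_xy[0]
    dy = hand_xy[1] - target_xy[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return max(0.0, 1.0 - dist / max_dist)

def crossmodal_cues(proximity):
    """Map proximity in [0, 1] to congruent cue levels for each modality."""
    luminance = proximity                            # brighter when closer (0..1)
    pitch_hz = 220.0 + proximity * (880.0 - 220.0)   # low to high pitch when closer
    vibration = proximity                            # stronger vibration when closer (0..1)
    return luminance, pitch_hz, vibration

# Example: hand roughly 9 cm from the target within an assumed 50 cm search radius
lum, pitch, vib = crossmodal_cues(normalized_proximity((0.30, 0.20), (0.35, 0.28)))
print(f"luminance={lum:.2f}, pitch={pitch:.0f} Hz, vibration={vib:.2f}")
```

In a sketch like this, the same proximity value drives all three channels at once, which is what keeps the cues congruent in the sense the abstract describes; a unimodal condition would simply output one of the three levels.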

