Multisensory cues facilitate coordination of stepping movements with a virtual reality avatar

06/24/2019 ∙ by Omar Khan, et al.

The effectiveness of simple sensory cues for retraining gait has been demonstrated, yet the feasibility of humanoid avatars for entrainment has yet to be investigated. Here, we describe the development of a novel method of visually cued training, in the form of a virtual partner, and investigate its ability to provide movement guidance in the form of stepping. Real stepping movements were mapped onto an avatar using motion capture data. The trajectory of one of the avatar step cycles was then accelerated or decelerated by 15% to create a perturbation. Healthy participants were motion captured while instructed to step in time to the avatar's movements, as viewed through a virtual reality headset. Step onset times were used to measure the timing errors (asynchronies) between participant and avatar. Participants completed either a Visual-Only condition, or an Auditory-Visual condition with footstep sounds included. Participants' asynchronies exhibited slow drift in the Visual-Only condition, but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in the Auditory-Visual condition. We conclude that an avatar's movements can be used to influence a person's own gait, but should include relevant auditory cues congruent with the movement to ensure suitable accuracy is achieved.
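The two core manipulations described above can be sketched in code: (1) time-warping a single avatar step cycle by ±15% to create a phase perturbation, and (2) computing signed step-onset asynchronies between participant and avatar. This is a minimal illustrative sketch under assumed data shapes (lists of onset times in seconds); the function names and numbers are not the authors' actual analysis code.

```python
def perturb_cycle(onsets, cycle_index, scale=1.15):
    """Stretch (scale > 1) or compress (scale < 1) one step cycle.

    `onsets` is an ascending list of avatar step-onset times (s).
    The chosen cycle's duration is rescaled by `scale`, and all
    later onsets are shifted so the sequence stays monotonic.
    """
    onsets = list(onsets)
    old_duration = onsets[cycle_index + 1] - onsets[cycle_index]
    shift = old_duration * (scale - 1.0)
    for i in range(cycle_index + 1, len(onsets)):
        onsets[i] += shift
    return onsets

def asynchronies(participant_onsets, avatar_onsets):
    """Signed timing error of each participant step relative to the
    nearest avatar step (negative = participant stepped early)."""
    return [p - min(avatar_onsets, key=lambda a: abs(a - p))
            for p in participant_onsets]

# Regular 0.6 s cadence; slow the third cycle by 15%.
avatar = [0.0, 0.6, 1.2, 1.8, 2.4]
perturbed = perturb_cycle(avatar, 2)        # later onsets shift by 0.09 s
errors = asynchronies([0.05, 0.58, 1.25], avatar)
```

The drift reported in the Visual-Only condition would appear in such data as asynchronies growing steadily across steps, whereas a corrective response shows up as asynchronies returning toward zero in the cycles following the perturbation.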
