The Spatial Selective Auditory Attention of Cochlear Implant Users in Different Conversational Sound Levels

03/03/2021
by Sara Akbarzadeh, et al.

In multi-speaker environments, cochlear implant (CI) users may attend to a target sound source differently from normal hearing (NH) individuals during a conversation. This study investigated how conversational sound levels affect the mechanisms of selective auditory attention adopted by CI and NH listeners, and in turn their everyday conversation. Nine CI users (five bilateral, three unilateral, and one bimodal) and eight NH listeners participated in the study. Behavioral speech recognition scores were collected with a matrix sentence test, and neural tracking of the speech envelope was recorded with electroencephalography (EEG). Speech stimuli were presented at three levels (75, 65, and 55 dB SPL) in the presence of two maskers from three spatially separated loudspeakers. Different combinations of assisted/impaired hearing modes were evaluated for the CI users, and the outcomes were analyzed in three categories: electric hearing only, acoustic hearing only, and electric+acoustic hearing. Our results showed that increasing the conversational sound level degraded selective auditory attention in electric hearing, whereas it improved selective auditory attention in acoustic hearing. In NH listeners, increasing the sound level did not significantly change auditory attention. These results imply that the effect of sound level on selective auditory attention depends on the hearing mode, and that loudness control is necessary for CI users to attend to a conversation with ease.
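Neural tracking of the speech envelope, as used in this study, is commonly quantified with a backward (stimulus-reconstruction) model: a linear decoder maps time-lagged EEG channels onto the envelope of the attended speech, and the correlation between the reconstructed and actual envelopes serves as the tracking or attention score. The sketch below illustrates that general approach only; it is not the authors' pipeline, and the sampling rates, lag window, ridge penalty, and synthetic signals are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' pipeline): quantifying neural tracking of the
# speech envelope with a backward (stimulus-reconstruction) model.
# All signals below are synthetic placeholders; the sampling rates, lag window,
# and ridge penalty are assumed values for illustration only.
import numpy as np
from scipy.signal import hilbert, resample_poly

fs_audio, fs_eeg = 16000, 64          # assumed sampling rates (Hz)
rng = np.random.default_rng(0)

def speech_envelope(audio, fs_in, fs_out):
    """Broadband envelope via Hilbert magnitude, downsampled to the EEG rate."""
    env = np.abs(hilbert(audio))
    return resample_poly(env, fs_out, fs_in)

def lagged_design(eeg, max_lag):
    """Stack copies of each EEG channel at lags 0..max_lag samples, so the
    envelope at time t is modeled from EEG at t..t+max_lag (EEG lags the stimulus)."""
    n_samples, n_ch = eeg.shape
    cols = [np.roll(eeg[:, c], -lag) for lag in range(max_lag + 1) for c in range(n_ch)]
    X = np.stack(cols, axis=1)
    if max_lag:
        X[-max_lag:] = 0.0            # discard samples that wrapped around
    return X

def reconstruction_score(eeg, envelope, max_lag=16, ridge=1e2):
    """Fit a ridge decoder EEG -> envelope; return the Pearson correlation
    between the reconstructed and actual envelopes (the tracking score)."""
    X = lagged_design(eeg, max_lag)
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ envelope)
    return np.corrcoef(X @ w, envelope)[0, 1]

# Synthetic 30 s trial: random "speech" audio and 32-channel EEG that weakly
# follows the attended envelope (stand-ins for the real recordings).
audio = rng.standard_normal(30 * fs_audio)
env = speech_envelope(audio, fs_audio, fs_eeg)
eeg = 0.1 * env[:, None] + rng.standard_normal((env.size, 32))

print(f"envelope-tracking score: {reconstruction_score(eeg, env):.3f}")
```

In an attention paradigm like the one described above, this score would typically be computed separately against the target and the masker envelopes, with stronger tracking of the target taken as evidence of successful selective attention.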


