Investigating EEG-Based Functional Connectivity Patterns for Multimodal Emotion Recognition

04/04/2020
by Xun Wu, et al.

Compared with the rich body of studies on motor brain-computer interfaces (BCIs), the recently emerging affective BCI presents distinct challenges, since the brain functional connectivity networks underlying emotion are not well investigated. Previous studies on emotion recognition based on electroencephalography (EEG) signals mainly rely on single-channel-based feature extraction methods. In this paper, we propose a novel emotion-relevant critical subnetwork selection algorithm and investigate three EEG functional connectivity network features: strength, clustering coefficient, and eigenvector centrality. The discrimination ability of the EEG connectivity features in emotion recognition is evaluated on three public emotion EEG datasets: SEED, SEED-V, and DEAP. The strength feature achieves the best classification performance and outperforms the state-of-the-art differential entropy feature based on single-channel analysis. The experimental results reveal that distinct functional connectivity patterns are exhibited for the five emotions of disgust, fear, sadness, happiness, and neutrality. Furthermore, we construct a multimodal emotion recognition model by combining the functional connectivity features from EEG with features from eye movements or physiological signals using deep canonical correlation analysis. The classification accuracies of multimodal emotion recognition are 95.08±6.42% on the SEED dataset, 84.51±5.11% on the SEED-V dataset, and 86.61±3.76% on the DEAP dataset. These results demonstrate the complementary representation properties of the EEG connectivity features and eye movement data. In addition, we find that brain networks constructed with 18 channels achieve performance comparable to that of the 62-channel networks in multimodal emotion recognition, enabling easier setups for BCI systems in real-world scenarios.
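To make the graph-theoretic features named in the abstract concrete, the sketch below computes per-channel strength, clustering coefficient, and eigenvector centrality from a weighted EEG connectivity matrix. It is a minimal illustration, not the authors' implementation: the connectivity measure (absolute Pearson correlation between channels), the 200 Hz sampling rate, and the 4 s segment length are assumptions, and the paper's emotion-relevant critical subnetwork selection step is omitted.

```python
# Minimal sketch (assumptions noted above): strength, clustering coefficient,
# and eigenvector centrality of a weighted EEG functional connectivity network.
import numpy as np
import networkx as nx


def connectivity_matrix(eeg):
    """eeg: (n_channels, n_samples) array -> symmetric |Pearson r| matrix."""
    conn = np.abs(np.corrcoef(eeg))
    np.fill_diagonal(conn, 0.0)  # discard self-connections
    return conn


def network_features(conn):
    """Per-channel strength, clustering coefficient, and eigenvector centrality."""
    g = nx.from_numpy_array(conn)            # weighted, undirected graph
    strength = conn.sum(axis=1)              # weighted degree (node strength)
    clust = nx.clustering(g, weight="weight")
    cent = nx.eigenvector_centrality(g, weight="weight", max_iter=1000)
    n = conn.shape[0]
    clustering = np.array([clust[i] for i in range(n)])
    centrality = np.array([cent[i] for i in range(n)])
    return np.concatenate([strength, clustering, centrality])


# Illustrative 62-channel segment: random data standing in for band-filtered EEG.
rng = np.random.default_rng(0)
segment = rng.standard_normal((62, 4 * 200))
features = network_features(connectivity_matrix(segment))
print(features.shape)  # (186,) = 3 features x 62 channels
```

In the paper, such connectivity features are further restricted to an emotion-relevant critical subnetwork before classification, and the fusion with eye movement or physiological features via deep canonical correlation analysis is a separate step not shown here.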


