Self-supervised Contrastive Learning for EEG-based Sleep Staging

09/16/2021
by Xue Jiang et al.

EEG signals are usually easy to acquire but expensive to label. Although supervised learning has been widely used in EEG signal analysis, its generalization performance is limited by the amount of annotated data. Self-supervised learning (SSL), a popular learning paradigm in computer vision (CV) and natural language processing (NLP), can exploit unlabeled data to compensate for this shortage. In this paper, we propose a self-supervised contrastive learning method for EEG-based sleep stage classification. During training, we set up a pretext task in which the network must match the correct pairs of transformations generated from the same EEG signal. In this way, the network learns general features of EEG signals and improves its representation ability. The network also becomes more robust to diverse data, that is, it extracts invariant features from changing inputs. The network's performance depends on the choice of transformations and on the amount of unlabeled data used during self-supervised training. Empirical evaluations on the Sleep-edf dataset demonstrate the competitive performance of our method on sleep staging (88.16%) and the effectiveness of the SSL strategy for EEG signal analysis in limited-labeled-data regimes. All code is provided publicly online.
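The contrastive pretext task described above can be sketched in a few lines of NumPy. The transformations (amplitude scaling, additive noise), the toy linear encoder, and the NT-Xent loss are illustrative assumptions, not the paper's exact components: the idea is only that two augmented views of the same EEG epoch should embed close together while mismatching every other view in the batch.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, rng):
    """Two simple signal transformations (illustrative choices,
    not the paper's exact augmentation set): random amplitude
    scaling plus additive Gaussian noise."""
    scale = rng.uniform(0.5, 1.5)
    noise = rng.normal(0.0, 0.1, size=x.shape)
    return scale * x + noise

def encode(x, w):
    """Toy linear encoder standing in for the paper's network."""
    return np.tanh(x @ w)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss over a batch of positive pairs:
    view i in z1 should match view i in z2 and mismatch all
    other 2n - 2 views in the batch."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / tau
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                     # drop self-similarity
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(2 * n), targets]))

# Batch of 8 single-channel EEG epochs, 256 samples each (synthetic data)
x = rng.normal(size=(8, 256))
w = rng.normal(size=(256, 32)) * 0.1
v1 = np.stack([augment(xi, rng) for xi in x])
v2 = np.stack([augment(xi, rng) for xi in x])
loss = nt_xent(encode(v1, w), encode(v2, w))
print(loss)
```

In a real pipeline the loss would be minimized by gradient descent over the encoder's weights; here a single forward pass on random data just shows how the positive and negative pairs enter the objective.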

Related research:

07/31/2020  Uncovering the structure of clinical EEG signals with self-supervised learning
    Objective. Supervised learning paradigms are often limited by the amount...

10/27/2021  Self-supervised EEG Representation Learning for Automatic Sleep Staging
    Objective: In this paper, we aim to learn robust vector representations ...

08/15/2022  Self-Supervised Learning for Anomalous Channel Detection in EEG Graphs: Application to Seizure Analysis
    Electroencephalogram (EEG) signals are effective tools towards seizure a...

10/21/2016  Deep Models for Engagement Assessment With Scarce Label Information
    Task engagement is defined as loadings on energetic arousal (affect), ta...

12/07/2022  Self-Supervised PPG Representation Learning Shows High Inter-Subject Variability
    With the progress of sensor technology in wearables, the collection and ...

01/28/2021  BENDR: using transformers and a contrastive self-supervised learning task to learn from massive amounts of EEG data
    Deep neural networks (DNNs) used for brain-computer-interface (BCI) clas...

07/13/2023  Multi-view self-supervised learning for multivariate variable-channel time series
    Labeling of multivariate biomedical time series data is a laborious and ...
