Self-supervised EEG Representation Learning for Automatic Sleep Staging

10/27/2021
by Chaoqi Yang, et al.

Objective: In this paper, we aim to learn robust vector representations from massive unlabeled electroencephalogram (EEG) signals, such that the learned representations (1) are expressive enough to replace the raw signals in the sleep staging task, and (2) provide better predictive performance than supervised models in scenarios with fewer labels and noisy samples.

Materials and Methods: We propose a self-supervised model, named Contrast with the World Representation (ContraWR), for EEG signal representation learning, which uses global statistics from the dataset to distinguish signals associated with different sleep stages. The ContraWR model is evaluated on three real-world EEG datasets that include both at-home and in-lab recording settings.

Results: ContraWR outperforms recent self-supervised learning methods (MoCo, SimCLR, BYOL, SimSiam) on the sleep staging task across the three datasets. ContraWR also beats supervised learning when fewer training labels are available (e.g., a 4% accuracy improvement when less than 2% of the data is labeled). Moreover, ContraWR provides informative representations in 2D projection.

Discussion: The proposed model can be generalized to other unsupervised physiological signal learning tasks. Future directions include exploring task-specific data augmentations and combining self-supervised methods with supervised ones, building on the initial success of self-supervised learning reported in this paper.

Conclusions: We show that ContraWR is robust to noise and can provide high-quality EEG representations for downstream prediction tasks. In low-label scenarios (e.g., only 2% of the data are labeled), ContraWR still shows stronger predictive power (e.g., a 4% accuracy improvement) than supervised baselines.
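To make the "contrast with the world" idea concrete, the sketch below illustrates the general shape of such an objective: each signal's embedding is pulled toward the embedding of its own augmented view and pushed away from a "world" representation computed as a global statistic (here, the batch mean). This is a minimal illustration of the concept, not the authors' exact loss; the function name, temperature value, and use of the plain batch mean are assumptions for the example.

```python
import numpy as np

def contrast_with_world_loss(anchor, positive, batch, tau=0.5):
    """Illustrative contrast-with-world objective (not the paper's exact loss).

    Pulls `anchor` toward `positive` (an augmented view of the same signal)
    and away from the 'world' representation, taken here as the mean
    embedding over the batch (a global statistic of the dataset sample).
    """
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    world = batch.mean(axis=0)            # global statistic: mean embedding
    pos_sim = cos(anchor, positive) / tau  # similarity to the positive view
    world_sim = cos(anchor, world) / tau   # similarity to the "world"
    # Softmax cross-entropy over {positive, world}: equals
    # log(1 + exp(world_sim - pos_sim)), minimized when the anchor is far
    # more similar to its positive view than to the world representation.
    return np.logaddexp(pos_sim, world_sim) - pos_sim

# Toy usage with random "embeddings" standing in for encoded EEG epochs.
rng = np.random.default_rng(0)
batch = rng.normal(size=(32, 16))
anchor = batch[0]
positive = anchor + 0.01 * rng.normal(size=16)  # mimics a mild augmentation
loss = contrast_with_world_loss(anchor, positive, batch)
```

Contrasting against a single aggregate "world" vector, rather than against every other sample as in SimCLR-style losses, is what lets this family of objectives use dataset-level statistics instead of large numbers of explicit negatives.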

