Uncovering the structure of clinical EEG signals with self-supervised learning

07/31/2020
by Hubert Banville, et al.

Objective. Supervised learning paradigms are often limited by the amount of labeled data available. This is particularly problematic for clinically relevant data such as electroencephalography (EEG), where labeling can be costly in terms of specialized expertise and human processing time. As a result, deep learning architectures designed for EEG data have remained relatively shallow and have performed, at best, on par with traditional feature-based approaches. In most situations, however, unlabeled data is available in abundance; by extracting information from this unlabeled data, it might be possible to reach competitive performance with deep neural networks despite limited access to labels.

Approach. We investigated self-supervised learning (SSL), a promising technique for discovering structure in unlabeled data, to learn representations of EEG signals. Specifically, we explored two tasks based on temporal context prediction, as well as contrastive predictive coding, on two clinically relevant problems: EEG-based sleep staging and pathology detection. We conducted experiments on two large public datasets with thousands of recordings and performed baseline comparisons with purely supervised and hand-engineered approaches.

Main results. Linear classifiers trained on SSL-learned features consistently outperformed purely supervised deep neural networks in low-labeled-data regimes, while reaching competitive performance when all labels were available. Additionally, the embeddings learned with each method revealed clear latent structure related to physiological and clinical phenomena, such as age effects.

Significance. We demonstrate the benefit of self-supervised learning approaches on EEG data. Our results suggest that SSL may pave the way to a wider use of deep learning models on EEG data.
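To illustrate the temporal-context-prediction idea mentioned in the abstract, a relative-positioning-style pretext task can be sketched as a pair sampler over fixed-length EEG windows: pairs of windows that are temporally close are labeled as sharing context, and distant pairs are labeled as not. The sketch below is a minimal, assumption-laden illustration, not the authors' implementation; the function name, the `tau_pos`/`tau_neg` thresholds, and the pure-NumPy pairing logic are all hypothetical choices for exposition.

```python
import numpy as np

def sample_rp_pairs(n_windows, n_pairs, tau_pos, tau_neg, rng):
    """Sample index pairs for a relative-positioning-style pretext task.

    Windows whose indices differ by at most `tau_pos` are labeled 1
    (same temporal context); windows farther apart than `tau_neg` are
    labeled 0. Pairs in the ambiguous gap between the two thresholds
    are discarded, as are identical-index pairs.
    """
    anchors, others, labels = [], [], []
    while len(labels) < n_pairs:
        i = int(rng.integers(0, n_windows))
        j = int(rng.integers(0, n_windows))
        d = abs(i - j)
        if d == 0:
            continue  # skip a window paired with itself
        if d <= tau_pos:
            y = 1     # temporally close: positive pair
        elif d >= tau_neg:
            y = 0     # temporally distant: negative pair
        else:
            continue  # ambiguous distance: discard
        anchors.append(i)
        others.append(j)
        labels.append(y)
    return np.array(anchors), np.array(others), np.array(labels)

rng = np.random.default_rng(0)
a, b, y = sample_rp_pairs(n_windows=1000, n_pairs=256,
                          tau_pos=2, tau_neg=10, rng=rng)
```

In a full pipeline, each sampled index pair would be mapped to the corresponding EEG windows, embedded by a shared feature extractor, and a small classifier would be trained to predict the binary context label; the frozen embeddings can then be evaluated with a linear classifier on the downstream task (e.g., sleep staging), as the abstract describes.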


Related research

09/16/2021 · Self-supervised Contrastive Learning for EEG-based Sleep Staging
EEG signals are usually simple to obtain but expensive to label. Althoug...

11/13/2019 · Self-supervised representation learning from electroencephalography signals
The supervised learning paradigm is limited by the cost - and sometimes ...

10/21/2016 · Deep Models for Engagement Assessment With Scarce Label Information
Task engagement is defined as loadings on energetic arousal (affect), ta...

10/27/2021 · Self-supervised EEG Representation Learning for Automatic Sleep Staging
Objective: In this paper, we aim to learn robust vector representations ...

12/07/2022 · Self-Supervised PPG Representation Learning Shows High Inter-Subject Variability
With the progress of sensor technology in wearables, the collection and ...

08/24/2022 · Self-Supervised Endoscopic Image Key-Points Matching
Feature matching and finding correspondences between endoscopic images i...

08/15/2022 · Self-Supervised Learning for Anomalous Channel Detection in EEG Graphs: Application to Seizure Analysis
Electroencephalogram (EEG) signals are effective tools towards seizure a...
