Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations

11/29/2021
by Semih Günel, et al.

A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior. For example, the ability to extract behavioral intentions from neural data, or neural decoding, is critical for developing effective brain-machine interfaces. Although simple linear models have been applied to this challenge, they cannot identify important non-linear relationships. Thus, a self-supervised means of identifying non-linear relationships between neural dynamics and behavior, in order to compute neural representations, remains an important open problem. To address this challenge, we generated a new multimodal dataset consisting of the spontaneous behaviors generated by fruit flies, Drosophila melanogaster – a popular model organism in neuroscience research. The dataset includes 3D markerless motion capture data from six camera views of the animal generating spontaneous actions, as well as synchronously acquired two-photon microscope images capturing the activity of descending neuron populations that are thought to drive actions. Standard contrastive learning and unsupervised domain adaptation techniques struggle to learn neural action representations (embeddings computed from the neural data describing action labels) due to large inter-animal differences in both neural and behavioral modalities. To overcome this deficiency, we developed simple yet effective augmentations that close the inter-animal domain gap, allowing us to extract behaviorally relevant, yet domain-agnostic, information from neural data. This multimodal dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
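To give a rough sense of the contrastive setup the abstract describes – not the authors' actual implementation – the sketch below pairs each neural embedding with its synchronously recorded behavioral embedding using an InfoNCE-style loss, where matching time windows are positives and all other windows in the batch are negatives. The function name, shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(neural_emb, behav_emb, temperature=0.1):
    """InfoNCE-style contrastive loss over paired modalities.

    neural_emb, behav_emb: (batch, dim) arrays. Row i of each array is
    assumed to come from the same time window of the same animal (a
    positive pair); every other row in the batch serves as a negative.
    """
    # L2-normalize so dot products become cosine similarities.
    n = neural_emb / np.linalg.norm(neural_emb, axis=1, keepdims=True)
    b = behav_emb / np.linalg.norm(behav_emb, axis=1, keepdims=True)

    # (batch, batch) similarity matrix, sharpened by the temperature.
    logits = n @ b.T / temperature

    # Softmax cross-entropy with the diagonal (matched pairs) as the
    # correct class; subtract the row max for numerical stability.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

In this framing, the augmentations the paper proposes would be applied to the neural and behavioral inputs before embedding, so that representations of the same action from different animals are pulled together rather than separated by animal identity.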
