Pre-training and Fine-tuning Transformers for fMRI Prediction Tasks

12/10/2021
by Itzik Malkiel, et al.

We present TFF, a Transformer framework for the analysis of functional Magnetic Resonance Imaging (fMRI) data. TFF employs a transformer-based architecture and a two-phase training approach. First, self-supervised training is applied to a collection of fMRI scans, where the model is trained to reconstruct 3D volume data. Second, the pre-trained model is fine-tuned on specific downstream tasks using ground-truth labels. Our results show state-of-the-art performance on a variety of fMRI tasks, including age and gender prediction, as well as schizophrenia recognition.
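The abstract describes a two-phase pipeline: self-supervised reconstruction pretraining, then supervised fine-tuning on a labeled task. A minimal sketch of that scheme is below, using plain NumPy linear layers as a stand-in for the actual transformer encoder; all data shapes, learning rates, and the toy "gender" labels are illustrative assumptions, not details of TFF itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for fMRI volumes: (n_scans, n_voxels) instead of real 4D data.
X = rng.normal(size=(32, 64))

# Phase 1: self-supervised pretraining via reconstruction.
# A linear encoder/decoder pair stands in for the transformer encoder + head.
W_enc = rng.normal(scale=0.1, size=(64, 16))
W_dec = rng.normal(scale=0.1, size=(16, 64))
lr = 1e-2
for _ in range(200):
    Z = X @ W_enc                      # latent representation
    err = Z @ W_dec - X                # reconstruction error
    W_dec -= lr * Z.T @ err / len(X)   # gradient step on decoder
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)  # gradient step on encoder

recon_loss = np.mean((X @ W_enc @ W_dec - X) ** 2)

# Phase 2: fine-tuning. Discard the reconstruction head and train a
# classification head on top of the pretrained encoder, using labels
# (here a synthetic binary target standing in for e.g. gender).
y = (X[:, 0] > 0).astype(float)
w_cls = np.zeros(16)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ W_enc) @ w_cls))      # sigmoid prediction
    w_cls -= 0.1 * (X @ W_enc).T @ (p - y) / len(X)  # logistic-loss gradient

acc = np.mean(((1 / (1 + np.exp(-(X @ W_enc) @ w_cls))) > 0.5) == y)
```

The design point the sketch captures is that the encoder weights learned without labels in phase 1 are reused in phase 2, where only the task head (and optionally the encoder) is updated with supervision.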


Related research

- 05/22/2023: Sequential Transfer Learning to Decode Heard and Imagined Timbre from fMRI Data
- 11/29/2022: Self-Supervised Mental Disorder Classifiers via Time Reversal
- 08/29/2021: Variational voxelwise rs-fMRI representation learning: Evaluation of sex, age, and neuropsychiatric signatures
- 07/12/2023: SwiFT: Swin 4D fMRI Transformer
- 03/17/2022: GATE: Graph CCA for Temporal SElf-supervised Learning for Label-efficient fMRI Analysis
- 04/20/2022: Disentangling Spatial-Temporal Functional Brain Networks via Twin-Transformers
- 05/23/2022: BolT: Fused Window Transformers for fMRI Time Series Analysis
