Conditional Recurrent Flow: Conditional Generation of Longitudinal Samples with Applications to Neuroimaging

11/24/2018
by Seong Jae Hwang, et al.

Generative models based on neural networks have opened the door to large-scale studies in various application domains, especially studies that lack enough real samples to obtain statistically robust inferences. Typically, these generative models train on existing data to learn the underlying distribution of the measurements (e.g., images) in a latent space conditioned on covariates (e.g., image labels), and then generate independent samples that are identically distributed in that latent space. Such models may work for cross-sectional studies, but they are not suitable for generating data for longitudinal studies, which focus on "progressive" behavior across a sequence of data. In practice, this is quite a common setting in neuroimaging studies whose goal is to characterize the trajectory of pathologies of a specific disease, even from early stages; this can be too ambitious when the sample size is small (e.g., up to a few hundred). Motivated by this setup, we develop a conditional generative model for longitudinal data generation by designing an invertible neural network. Inspired by the recurrent nature of longitudinal data, we propose a novel neural network that incorporates a recurrent subnetwork and context gating to produce smooth transitions in a sequence of generated data. Our model is validated on a video sequence dataset and a longitudinal Alzheimer's disease (AD) dataset under various experimental settings, with qualitative and quantitative evaluations of the generated samples. The results on the AD dataset capture AD-specific group differences in the generated longitudinal samples, consistent with the existing literature, which suggests great potential for application to other disease studies.
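To make the architecture concrete, below is a minimal sketch (not the authors' code) of the idea described above: an invertible affine coupling layer, as used in normalizing flows, whose scale and shift are conditioned on a recurrent hidden state passed through a simple context gate. All module names, dimensions, and the choice of a GRU cell are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a flow-style coupling block with recurrent,
# context-gated conditioning. Names and sizes are assumptions.
import torch
import torch.nn as nn

class RecurrentCoupling(nn.Module):
    """Invertible affine coupling layer; parameters depend on a recurrent hidden state."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.half = dim // 2
        # Maps (x_a, gated context) -> per-dimension log-scale and shift for x_b.
        self.net = nn.Sequential(
            nn.Linear(self.half + hidden_dim, 128), nn.ReLU(),
            nn.Linear(128, 2 * (dim - self.half)),
        )
        # Context gate: controls how strongly the recurrent context modulates this step.
        self.gate = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.Sigmoid())

    def forward(self, x, h):
        x_a, x_b = x[:, :self.half], x[:, self.half:]
        ctx = self.gate(h) * h                      # gated recurrent context
        log_s, t = self.net(torch.cat([x_a, ctx], dim=1)).chunk(2, dim=1)
        log_s = torch.tanh(log_s)                   # keep scales bounded
        z_b = x_b * torch.exp(log_s) + t            # affine transform (invertible)
        log_det = log_s.sum(dim=1)                  # log |det Jacobian| for the flow objective
        return torch.cat([x_a, z_b], dim=1), log_det

    def inverse(self, z, h):
        z_a, z_b = z[:, :self.half], z[:, self.half:]
        ctx = self.gate(h) * h
        log_s, t = self.net(torch.cat([z_a, ctx], dim=1)).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x_b = (z_b - t) * torch.exp(-log_s)
        return torch.cat([z_a, x_b], dim=1)


# Usage: a GRU cell carries longitudinal context across time steps, so each
# observation x_t is transformed conditioned on the history so far.
dim, hidden_dim, T, B = 8, 16, 5, 4
coupling = RecurrentCoupling(dim, hidden_dim)
gru = nn.GRUCell(dim, hidden_dim)
h = torch.zeros(B, hidden_dim)
for t in range(T):
    x_t = torch.randn(B, dim)                       # stand-in for a measurement at time t
    z_t, log_det = coupling(x_t, h)
    h = gru(x_t, h)                                 # update recurrent context
```

Running the transform in the inverse direction from sampled latents, with the same recurrent context, is what would yield a generated longitudinal sequence in this style of model.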
