Standardization of multivariate Gaussian mixture models and background adjustment of PET images in brain oncology

10/23/2017
by   Meng Li, et al.

Given observations from a multivariate Gaussian mixture model plus outliers, this paper addresses the question of how to standardize the mixture to a standard multivariate normal distribution, so that the outliers can be detected using a statistical test. This question is motivated by an image analysis problem in brain oncology of detecting changes between pre- and post-treatment Positron Emission Tomography (PET) scans, where background adjustment is necessary to reduce confounding by tissue-dependent changes not related to the disease. When modeling the voxel intensities for the two scans as a bivariate Gaussian mixture, background adjustment translates into standardizing the mixture at each voxel, while tumor lesions present themselves as outliers to be detected. For general multivariate Gaussian mixtures, we show theoretically and numerically that the tail distribution of the standardized scores is favorably close to standard normal in a wide range of scenarios while being conservative at the tails, validating voxelwise hypothesis testing based on standardized scores. To address standardization in spatially heterogeneous data, we propose a spatial and robust multivariate expectation-maximization (EM) algorithm, in which prior class membership probabilities are provided by transformation of spatial probability template maps and the estimates of the class means and covariances are robust to outliers. Simulations in both univariate and bivariate cases suggest that standardized scores with soft assignment have tail probabilities that are either very close to or more conservative than standard normal. The proposed methods are applied to a real data set from a PET phantom experiment, yet they are generic and can be used in other contexts.
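To illustrate the idea of standardization with soft assignment, the following is a minimal univariate sketch, not the paper's exact procedure: each observation is standardized against every mixture component, and the per-component z-scores are combined using the posterior class probabilities. The mixture parameters here are illustrative assumptions, and in practice they would be estimated, e.g. by the robust EM algorithm described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-component univariate Gaussian mixture (assumed known here).
weights = np.array([0.6, 0.4])
means = np.array([0.0, 3.0])
sds = np.array([1.0, 0.5])

def gauss_pdf(x, mu, sd):
    """Normal density, vectorized over components via broadcasting."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def standardize_soft(x):
    """Posterior-weighted (soft-assignment) standardized scores.

    For each observation, compute the z-score under every component,
    then average them with the posterior membership probabilities."""
    dens = weights * gauss_pdf(x[:, None], means, sds)   # shape (n, K)
    post = dens / dens.sum(axis=1, keepdims=True)        # posterior gamma_k(x)
    z_per_comp = (x[:, None] - means) / sds              # per-component z-score
    return (post * z_per_comp).sum(axis=1)

# Draw a sample from the mixture and standardize it.
n = 100_000
comp = rng.choice(2, size=n, p=weights)
x = rng.normal(means[comp], sds[comp])
z = standardize_soft(x)

# Per the abstract, the tail probability of z should be close to,
# or more conservative than, that of a standard normal (1 - Phi(2) ~ 0.023).
print(np.mean(z > 2.0))
```

In this toy setting, an outlier test simply flags observations whose standardized score exceeds a standard-normal quantile; the conservativeness claimed in the abstract means such a test does not over-reject.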


