Contrastive inverse regression for dimension reduction

05/20/2023
by Sam Hawke et al.

Supervised dimension reduction (SDR) has been a topic of growing interest in data science, as it reduces high-dimensional covariates while preserving their functional relationship with certain response variables of interest. However, existing SDR methods are not suitable for analyzing datasets collected from case-control studies. In this setting, the goal is to learn and exploit the low-dimensional structure unique to or enriched in the case group, also known as the foreground group. While some unsupervised techniques, such as the contrastive latent variable model and its variants, have been developed for this purpose, they fail to preserve the functional relationship between the dimension-reduced covariates and the response variable. In this paper, we propose a supervised dimension reduction method called contrastive inverse regression (CIR), specifically designed for the contrastive setting. CIR introduces an optimization problem defined on the Stiefel manifold with a non-standard loss function. We prove that a gradient descent-based algorithm converges to a local optimum of CIR, and our numerical study demonstrates improved performance over competing methods on high-dimensional data.
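The CIR loss itself is defined in the full paper, but the manifold machinery the abstract mentions can be illustrated generically. Below is a minimal, hypothetical sketch of Riemannian gradient descent on the Stiefel manifold St(p, d) = {V : VᵀV = I}, using a simple placeholder objective f(V) = −tr(VᵀAV) (whose minimizers span the top eigenspace of A) in place of the CIR loss; the projection-and-retraction steps are standard, not taken from the paper.

```python
import numpy as np

def stiefel_gradient_step(V, egrad, lr=0.01):
    """One Riemannian gradient step on the Stiefel manifold:
    project the Euclidean gradient onto the tangent space at V,
    take a step, then retract back onto the manifold via QR."""
    # Tangent-space projection: G - V * sym(V^T G)
    VtG = V.T @ egrad
    rgrad = egrad - V @ (VtG + VtG.T) / 2
    # QR retraction of the updated point
    Q, R = np.linalg.qr(V - lr * rgrad)
    # Fix column signs so the retraction is well defined
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

rng = np.random.default_rng(0)
p, d = 10, 2
B = rng.standard_normal((p, p))
A = B @ B.T                          # symmetric PSD matrix
V, _ = np.linalg.qr(rng.standard_normal((p, d)))  # random start on St(p, d)

for _ in range(500):
    egrad = -2 * A @ V               # Euclidean gradient of f(V) = -tr(V^T A V)
    V = stiefel_gradient_step(V, egrad)

# The iterate remains exactly on the manifold after every retraction
print(np.allclose(V.T @ V, np.eye(d), atol=1e-8))
```

Swapping the placeholder gradient for the gradient of the CIR loss would turn this skeleton into the kind of algorithm the paper analyzes; the tangent projection and QR retraction are unchanged regardless of the objective.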


