Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction

08/05/2015
by   Voot Tangkaratt, et al.

A typical goal of supervised dimension reduction is to find a low-dimensional subspace of the input space such that the projected input variables preserve maximal information about the output variables. The dependence maximization approach solves the supervised dimension reduction problem by maximizing a statistical dependence measure between the projected input variables and the output variables. A well-known dependence measure is mutual information (MI), which is based on the Kullback-Leibler (KL) divergence. However, the KL divergence is known to be sensitive to outliers. Quadratic MI (QMI) is a variant of MI based on the L_2 distance, which is more robust against outliers than the KL divergence, and a computationally efficient method for estimating QMI from data, called least-squares QMI (LSQMI), has recently been proposed. For these reasons, developing a supervised dimension reduction method based on LSQMI seems promising. However, subspace search in supervised dimension reduction requires not QMI itself but the derivative of QMI, and the derivative of an accurate QMI estimator is not necessarily a good estimator of the derivative of QMI. In this paper, we propose to directly estimate the derivative of QMI without estimating QMI itself. We show that this direct estimation is more accurate than differentiating an estimate of QMI. Finally, we develop a supervised dimension reduction algorithm that efficiently uses the proposed derivative estimator, and we demonstrate through experiments that the proposed method is more robust against outliers than existing methods.
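For intuition, QMI is the squared L_2 distance between the joint density and the product of the marginals, QMI(X,Y) = ∬ (p(x,y) - p(x)p(y))^2 dx dy, whereas ordinary MI is the KL divergence between the same two densities. The sketch below is a naive Parzen-window plug-in estimate of this quantity, not the paper's LSQMI estimator; the function name `qmi_plugin` and the fixed bandwidth `sigma` are illustrative choices.

```python
import numpy as np

def gauss_gram(z, sigma):
    # Pairwise Gaussian kernel matrix: exp(-||z_i - z_j||^2 / (2 sigma^2)).
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def qmi_plugin(x, y, sigma=1.0):
    """Plug-in QMI estimate: integral of (p_hat(x,y) - p_hat(x)p_hat(y))^2,
    with all densities modeled by Parzen windows. Up to a positive constant,
    the integral reduces to sums over the two Gram matrices, so the result
    is nonnegative by construction."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    Kx = gauss_gram(x, sigma)
    Ky = gauss_gram(y, sigma)
    v_joint = (Kx * Ky).sum() / n**2                   # from the joint density squared
    v_cross = (Kx.mean(1) * Ky.mean(1)).sum() / n      # joint times product of marginals
    v_marg = (Kx.sum() / n**2) * (Ky.sum() / n**2)     # product of marginals squared
    return v_joint - 2.0 * v_cross + v_marg
```

Differentiating such a plug-in estimate with respect to a projection matrix is exactly the step the paper argues against: the derivative of a good QMI estimate need not be a good estimate of the QMI derivative, which motivates estimating the derivative directly.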


