Spectral Non-Convex Optimization for Dimension Reduction with Hilbert-Schmidt Independence Criterion

09/06/2019
by   Chieh Wu, et al.

The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel dependence measure with applications across many areas of machine learning. Conveniently, the objectives of different dimensionality reduction applications using HSIC often reduce to the same optimization problem. However, the non-convexity of the objective function arising from non-linear kernels poses a serious challenge to optimization efficiency and limits the potential of HSIC-based formulations. As a result, only linear kernels have been computationally tractable in practice. This paper proposes a spectral-based optimization algorithm that extends beyond the linear kernel. The algorithm identifies a family of suitable kernels and provides first- and second-order local guarantees when a fixed point is reached. Furthermore, we propose a principled initialization strategy, removing the need to rerun the algorithm from random initialization points. Compared to state-of-the-art optimization algorithms, our empirical results on real data show a run-time improvement by as much as a factor of 10^5 while consistently achieving lower cost and classification/clustering errors. The implementation source code is publicly available at https://github.com/endsley.
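To make the quantity being optimized concrete, the following is a minimal sketch of the standard empirical (biased) HSIC estimator, tr(KHLH)/(n-1)^2 with Gaussian kernels and the centering matrix H = I - 11^T/n. The kernel bandwidth `sigma` and the toy data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise squared Euclidean distances, then the RBF kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    # Empirical HSIC: tr(K H L H) / (n - 1)^2, where H centers the kernels.
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1)**2

# Toy illustration: HSIC is larger for dependent data than for independent data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
Y_dep = X + 0.1 * rng.normal(size=(100, 2))   # strongly dependent on X
Y_ind = rng.normal(size=(100, 2))             # independent of X
print(hsic(X, Y_dep), hsic(X, Y_ind))
```

In HSIC-based dimension reduction, a statistic of this form is maximized over a projection of the data; the non-linear kernel makes that objective non-convex, which is the difficulty the paper's spectral algorithm addresses.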


