Solving Interpretable Kernel Dimension Reduction

09/06/2019
by Chieh Wu, et al.

Kernel dimensionality reduction (KDR) algorithms find a low-dimensional representation of the original data by optimizing kernel dependency measures that are capable of capturing nonlinear relationships. The standard strategy is to first map the data into a high-dimensional feature space using kernels, and then project it onto a low-dimensional space. While KDR methods can be solved easily by keeping the most dominant eigenvectors of the kernel matrix, the resulting features are no longer easy to interpret. Interpretable KDR (IKDR) differs in that it projects onto a subspace before the kernel feature mapping; the projection matrix can therefore indicate how the original features linearly combine to form the new features. Unfortunately, the IKDR objective requires a non-convex manifold optimization that is difficult to solve and no longer admits a solution by a single eigendecomposition. Recently, an efficient iterative spectral method (ISM), which solves this objective through a sequence of eigendecompositions, was proposed in the context of alternative clustering. However, ISM provides theoretical guarantees only for the Gaussian kernel, which greatly constrains its usage: any kernel method using ISM is limited to a single kernel. This work extends the theoretical guarantees of ISM to an entire family of kernels, thereby empowering ISM to solve any kernel method with the same objective. In identifying this family, we prove that each kernel within it admits a surrogate Φ matrix whose most dominant eigenvectors form the optimal projection. With this extension, we establish how a wide range of IKDR applications across different learning paradigms can be solved by ISM. To support reproducible results, the source code is publicly available at <https://github.com/chieh-neu/ISM_supervised_DR>.
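To make the objective concrete: for supervised dimension reduction with the Hilbert-Schmidt Independence Criterion, IKDR maximizes Tr(Γ K_XW) over projections W with WᵀW = I, where K_XW is the kernel matrix of the projected data XW and Γ is a centered label kernel. Below is a minimal sketch of the kind of ISM-style fixed-point iteration the abstract describes, assuming a Gaussian kernel. The function name ism_gaussian, the exact construction and sign convention of the surrogate Φ matrix (derived here from the gradient of the Gaussian kernel), and all parameter defaults are illustrative assumptions, not the paper's reference implementation; see the linked repository for that.

```python
import numpy as np

def ism_gaussian(X, Gamma, q, sigma=1.0, n_iter=50, tol=1e-5):
    """Illustrative ISM-style fixed-point solver (not the authors' code)
    for:  max_W  Tr(Gamma @ K_XW)  s.t.  W.T @ W = I,
    where K_XW is the Gaussian kernel matrix of the projected data X @ W."""
    def top_eigvecs(Phi):
        # np.linalg.eigh returns eigenvalues in ascending order,
        # so the q most dominant eigenvectors are the last q columns.
        return np.linalg.eigh(Phi)[1][:, -q:]

    # Initialize from a kernel-free (linear) surrogate of the objective.
    W = top_eigvecs(X.T @ Gamma @ X)

    for _ in range(n_iter):
        XW = X @ W
        # Gaussian kernel on the projected data.
        sq = np.sum(XW ** 2, axis=1)
        K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * XW @ XW.T)
                   / (2.0 * sigma ** 2))
        Psi = Gamma * K                    # Hadamard product
        D = np.diag(Psi.sum(axis=1))
        # Surrogate Phi matrix; the sign is chosen (assumption) so that
        # its most dominant eigenvectors maximize the objective.
        Phi = -(X.T @ (D - Psi) @ X)
        W_new = top_eigvecs(Phi)
        # Stop once the spanned subspace no longer changes.
        if np.linalg.norm(W_new @ W_new.T - W @ W.T) < tol:
            return W_new
        W = W_new
    return W

if __name__ == "__main__":
    # Toy supervised use: Gamma = H K_y H, the doubly centered label kernel.
    rng = np.random.default_rng(0)
    n = 100
    X = rng.standard_normal((n, 10))
    y = rng.integers(0, 2, size=n)
    K_y = (y[:, None] == y[None, :]).astype(float)
    H = np.eye(n) - np.ones((n, n)) / n
    W = ism_gaussian(X, H @ K_y @ H, q=2)
    print(W.shape)  # (10, 2)
```

Each iteration solves only a small d-by-d eigenproblem rather than an optimization over the full manifold, which is the appeal of the spectral fixed-point approach; comparing the subspace projectors W Wᵀ makes the stopping test insensitive to eigenvector sign flips.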

Related research

Spectral Non-Convex Optimization for Dimension Reduction with Hilbert-Schmidt Independence Criterion (09/06/2019)
The Hilbert Schmidt Independence Criterion (HSIC) is a kernel dependence...

ExClus: Explainable Clustering on Low-dimensional Data Representations (11/04/2021)
Dimensionality reduction and clustering techniques are frequently used t...

Non-Linear Subspace Clustering with Learned Low-Rank Kernels (07/17/2017)
In this paper, we present a kernel subspace clustering method that can h...

Bounded Manifold Completion (12/19/2019)
Nonlinear dimensionality reduction or, equivalently, the approximation o...

Iterative Spectral Method for Alternative Clustering (09/08/2019)
Given a dataset and an existing clustering as input, alternative cluster...

Simultaneously Learning Neighborship and Projection Matrix for Supervised Dimensionality Reduction (09/09/2017)
Explicitly or implicitly, most of dimensionality reduction methods need ...

Kernelized Multiview Projection (08/03/2015)
Conventional vision algorithms adopt a single type of feature or a simpl...
