Simultaneously Learning Neighborship and Projection Matrix for Supervised Dimensionality Reduction

09/09/2017
by   Yanwei Pang, et al.

Explicitly or implicitly, most dimensionality reduction methods need to determine which samples are neighbors and what the similarity between those neighbors is in the original high-dimensional space. The projection matrix is then learned on the assumption that the neighborhood information (e.g., the similarity) is known and fixed prior to learning. However, it is difficult to precisely measure the intrinsic similarity of samples in high-dimensional space because of the curse of dimensionality. Consequently, neither the neighbors selected according to such similarity nor the projection matrix obtained from those similarities and neighbors is optimal in the sense of classification and generalization. To overcome these drawbacks, in this paper we propose to treat the similarity and the neighbors as variables and to model them in the low-dimensional space. Both the optimal similarity and the projection matrix are obtained by minimizing a unified objective function. Nonnegative and sum-to-one constraints on the similarity are adopted. Instead of empirically setting the regularization parameter, we treat it as a variable to be optimized. Interestingly, the optimal regularization parameter adapts to the neighbors in the low-dimensional space and has an intuitive meaning. Experimental results on the YALE B, COIL-100, and MNIST datasets demonstrate the effectiveness of the proposed method.
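The abstract describes alternating between two updates: with the projection fixed, re-estimate each sample's similarity row under nonnegative and sum-to-one constraints using distances in the low-dimensional space; with the similarities fixed, re-solve for the projection matrix. The following is a minimal sketch of one plausible scheme of this kind, not the paper's exact formulation: the similarity step is a Euclidean projection onto the probability simplex, the projection step is an eigen-decomposition of a graph-Laplacian-weighted scatter matrix, and the objective weights and the fixed `gamma` regularizer are assumptions (the paper optimizes the regularization parameter rather than fixing it).

```python
import numpy as np

def simplex_projection(v):
    # Euclidean projection of v onto {s : s >= 0, sum(s) = 1},
    # enforcing the nonnegative and sum-to-one constraints on a similarity row.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def learn_similarity_and_projection(X, d, gamma=1.0, n_iter=10, seed=0):
    # X: (n_features, n_samples) data matrix; d: target dimensionality.
    # gamma: illustrative fixed regularizer on ||S||_F^2 (an assumption).
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    # Random orthonormal initialization of the projection matrix W.
    W = np.linalg.qr(rng.standard_normal((X.shape[0], d)))[0]
    for _ in range(n_iter):
        # Similarity step: distances measured in the LOW-dimensional space,
        # each row of S projected onto the simplex.
        Y = W.T @ X
        D = ((Y[:, :, None] - Y[:, None, :]) ** 2).sum(axis=0)
        S = np.stack([simplex_projection(-D[i] / (2.0 * gamma)) for i in range(n)])
        # Projection step: minimize tr(W^T X L X^T W) subject to W^T W = I,
        # solved by the eigenvectors of the smallest eigenvalues.
        S_sym = (S + S.T) / 2.0
        L = np.diag(S_sym.sum(axis=1)) - S_sym
        _, eigvecs = np.linalg.eigh(X @ L @ X.T)
        W = eigvecs[:, :d]
    return S, W
```

In this sketch the simplex projection naturally yields sparse similarity rows, so the neighbors (nonzero entries of `S`) and the projection are determined jointly rather than fixed in advance from high-dimensional distances.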


