Low-rank features based double transformation matrices learning for image classification

01/28/2022
by   Yu-Hong Cai, et al.

Linear regression is a supervised method that has been widely used in classification tasks. To apply linear regression to classification, techniques that relax the regression targets have been proposed. However, methods based on this technique ignore the pressure placed on a single transformation matrix by the complex information contained in the data. A single transformation matrix is too rigid to provide a flexible projection, so relaxation should also be applied to the transformation matrix itself. This paper proposes a double transformation matrices learning method based on latent low-rank feature extraction. The core idea is to use two transformation matrices for relaxation and to jointly project the principal and salient features, learned from two directions, into the label space, which shares the burden otherwise carried by a single transformation matrix. First, low-rank features are learned by the latent low-rank representation (LatLRR) method, which processes the original data from two directions; in this process, sparse noise is also separated out, which alleviates its interference with projection learning to some extent. Then, two transformation matrices are introduced to process the two kinds of features separately and to extract the information useful for classification. Finally, the two transformation matrices can be obtained easily by an alternating optimization method. With this treatment, our method can obtain projections that are easy to classify even when the samples contain a large amount of redundant information. Experiments on multiple data sets demonstrate the effectiveness of our approach for classification, especially in complex scenarios.
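
The following is a minimal sketch of the double-transformation-matrices idea described in the abstract, not the authors' implementation. It makes two simplifying assumptions: LatLRR is replaced by a truncated-SVD surrogate that merely splits the data into a "principal" and a "salient" part (the real method solves a nuclear-norm problem and also isolates sparse noise), and the alternating optimization is taken to be alternating ridge regression of the two matrices against one-hot label targets. The helper names (split_features, fit_double_transforms) are hypothetical.

import numpy as np

def split_features(X, rank=10):
    # Crude stand-in for LatLRR: take a rank-`rank` SVD of X (features x samples)
    # and split its components into a "principal" part and a "salient" part.
    # LatLRR itself solves X = X Z + L X + E from two directions; this surrogate
    # only reproduces the two-view flavour for illustration.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = rank // 2
    principal = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    salient = U[:, k:rank] @ np.diag(s[k:rank]) @ Vt[k:rank, :]
    return principal, salient

def fit_double_transforms(P, S, Y, lam=1e-2, n_iter=20):
    # Alternating least squares for two transformation matrices Q1, Q2 that
    # jointly project the principal (P) and salient (S) features into the
    # label space Y:  min ||Y - Q1 P - Q2 S||_F^2 + lam (||Q1||^2 + ||Q2||^2).
    d = P.shape[0]
    c = Y.shape[0]
    Q1 = np.zeros((c, d))
    Q2 = np.zeros((c, d))
    reg = lam * np.eye(d)
    for _ in range(n_iter):
        Q1 = (Y - Q2 @ S) @ P.T @ np.linalg.inv(P @ P.T + reg)  # fix Q2, solve Q1
        Q2 = (Y - Q1 @ P) @ S.T @ np.linalg.inv(S @ S.T + reg)  # fix Q1, solve Q2
    return Q1, Q2

# Tiny usage example with random data (d features, n samples, c classes).
rng = np.random.default_rng(0)
d, n, c = 50, 200, 5
X = rng.standard_normal((d, n))
labels = rng.integers(0, c, size=n)
Y = np.eye(c)[:, labels]                    # one-hot targets, shape (c, n)
P, S = split_features(X, rank=10)
Q1, Q2 = fit_double_transforms(P, S, Y)
pred = np.argmax(Q1 @ P + Q2 @ S, axis=0)   # classify by largest response

The point of the sketch is the shape of the objective: two matrices share the projection work, and each update is a closed-form ridge solve given the other, which is why alternating optimization converges quickly here.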

