Projecting "better than randomly": How to reduce the dimensionality of very large datasets in a way that outperforms random projections

01/03/2019
by Michael Wojnowicz, et al.

For very large datasets, random projections (RP) have become the tool of choice for dimensionality reduction, largely because exact principal component analysis (PCA) is computationally prohibitive at that scale. However, the recent development of randomized principal component analysis (RPCA) has opened up the possibility of obtaining approximate principal components on very large datasets. In this paper, we compare the performance of RPCA and RP as dimensionality reduction techniques for supervised learning. In Experiment 1, we study a malware classification task on a dataset with over 10 million samples, almost 100,000 features, and over 25 billion non-zero values, with the goal of reducing the dimensionality to a compressed representation of 5,000 features. To apply RPCA to this dataset, we develop a new algorithm called large sample RPCA (LS-RPCA), which extends the RPCA algorithm to work on datasets with arbitrarily many samples. We find that classification performance is much higher when using LS-RPCA for dimensionality reduction than when using random projections. In particular, across a range of target dimensionalities, using LS-RPCA reduces classification error by at least 37%, and we generalize this phenomenon to multiple datasets, feature representations, and classifiers. These findings have implications for the many research projects in which random projections were used as a preprocessing step for dimensionality reduction: whenever accuracy is at a premium and the target dimensionality is sufficiently less than the numeric rank of the dataset, randomized PCA may be the superior choice. Moreover, if the dataset has a large number of samples, LS-RPCA provides a method for obtaining the approximate principal components.
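
To make the comparison concrete, below is a minimal sketch of the two techniques the abstract contrasts. It is not the authors' LS-RPCA implementation; it only assumes, as the abstract suggests, that approximate principal components can be computed by a randomized method (in the spirit of randomized SVD) that visits the samples in row blocks, so arbitrarily many samples can be streamed without materializing the full matrix. The function name `blocked_randomized_pca`, the block count, the oversampling parameter, and the toy low-rank data are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def blocked_randomized_pca(row_blocks, n_features, k, oversample=10, seed=0):
    """Approximate the top-k principal directions of a (centered) tall
    matrix X, supplied as an iterable of row blocks, in two passes.
    A sketch of the blocked-RPCA idea, not the paper's LS-RPCA."""
    rng = np.random.default_rng(seed)
    l = k + oversample
    omega = rng.standard_normal((n_features, l))

    # Pass 1: accumulate H = X^T (X @ omega), one row block at a time.
    blocks = list(row_blocks)            # keep blocks for the second pass
    H = np.zeros((n_features, l))
    for Xb in blocks:
        H += Xb.T @ (Xb @ omega)
    Q, _ = np.linalg.qr(H)               # orthonormal basis near range(X^T)

    # Pass 2: eigendecompose the small projected covariance Q^T X^T X Q.
    M = np.zeros((l, l))
    for Xb in blocks:
        Yb = Xb @ Q
        M += Yb.T @ Yb
    _, W = np.linalg.eigh(M)             # eigenvalues in ascending order
    return Q @ W[:, ::-1][:, :k]         # top-k approximate directions

# --- Illustrative comparison against a Gaussian random projection ---
rng = np.random.default_rng(1)
n, d, k = 20_000, 500, 20
X = rng.standard_normal((n, 30)) @ rng.standard_normal((30, d))  # low rank
X += 0.1 * rng.standard_normal((n, d))                           # plus noise
X -= X.mean(axis=0)                                              # center for PCA

V = blocked_randomized_pca(np.array_split(X, 10), d, k)          # RPCA basis
R = rng.standard_normal((d, k)) / np.sqrt(k)                     # RP matrix

def captured_variance(B):
    # Fraction of total variance retained after projecting onto span(B);
    # orthonormalize first so the two bases are compared fairly.
    Qb, _ = np.linalg.qr(B)
    return np.linalg.norm(X @ Qb) ** 2 / np.linalg.norm(X) ** 2

print(f"randomized PCA:    {captured_variance(V):.3f}")
print(f"random projection: {captured_variance(R):.3f}")
```

On low-rank-plus-noise data like this, the randomized PCA basis retains most of the variance, whereas a uniformly random k-dimensional subspace captures only k/d of the total variance in expectation. That gap is the intuition behind the abstract's claim that RPCA-style reduction outperforms random projections when the target dimensionality is well below the numeric rank of the data.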

READ FULL TEXT
research
09/25/2020

Improved Dimensionality Reduction of various Datasets using Novel Multiplicative Factoring Principal Component Analysis (MPCA)

Principal Component Analysis (PCA) is known to be the most widely applie...
research
09/21/2017

Lazy stochastic principal component analysis

Stochastic principal component analysis (SPCA) has become a popular dime...
research
10/11/2017

Dimensionality Reduction Ensembles

Ensemble learning has had many successes in supervised learning, but it ...
research
05/17/2017

Maximum Margin Principal Components

Principal Component Analysis (PCA) is a very successful dimensionality r...
research
08/28/2022

AutoQML: Automatic Generation and Training of Robust Quantum-Inspired Classifiers by Using Genetic Algorithms on Grayscale Images

We propose a new hybrid system for automatically generating and training...
research
08/09/2018

Fast computation of the principal components of genotype matrices in Julia

Finding the largest few principal components of a matrix of genetic data...
research
11/17/2016

"Influence Sketching": Finding Influential Samples In Large-Scale Regressions

There is an especially strong need in modern large-scale data analysis t...
