Random Projections and Dimension Reduction

08/11/2020
by Rishi Advani, et al.

This paper, broadly speaking, covers the use of randomness in two main areas: low-rank approximation and kernel methods. Low-rank approximation is central to numerical linear algebra: many applications depend on matrix decomposition algorithms that provide accurate low-rank representations of data. In modern problems, however, the scale of the data (among other factors) makes such decompositions hard to compute directly. One solution is the use of random projections: instead of computing the factorization of the matrix itself, we randomly project the matrix onto a lower-dimensional subspace and then compute the factorization there, often without significant loss of accuracy. We describe how randomization can be used to create more efficient algorithms for low-rank matrix approximation, and we introduce a novel randomized algorithm for matrix decomposition. Compared to standard approaches, randomized algorithms are often faster and more robust, and they make the analysis of massive data sets tractable.

Kernel methods take an approach almost diametrically opposed to low-rank approximation. The idea is to project low-dimensional data into a higher-dimensional 'feature space' in which the data become linearly separable, which enables the model to learn a nonlinear separation of the original data. As before, with large data matrices, computing the kernel matrix can be expensive, so we use randomized methods to approximate it. In addition, we propose an extension of the random Fourier features kernel in which hyperparameter values are randomly sampled from an interval or Borel set. The experiments discussed in this paper can be found on our website at https://rishi1999.github.io/random-projections.
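
To make the project-then-factor idea concrete, the following is a minimal sketch of a randomized low-rank approximation in the style of the randomized SVD of Halko, Martinsson, and Tropp. It is an illustration under stated assumptions, not the paper's specific algorithm; the function name, the Gaussian test matrix, and the oversampling parameter are all illustrative choices.

    import numpy as np

    def randomized_svd(A, k, oversample=10):
        """Approximate rank-k SVD of A via a random projection (sketch)."""
        m, n = A.shape
        # Gaussian test matrix: sketch the range of A in k + p dimensions
        Omega = np.random.randn(n, k + oversample)
        Y = A @ Omega
        # Orthonormal basis Q for the sketched range
        Q, _ = np.linalg.qr(Y)
        # Factor the small projected matrix, then lift the factors back
        B = Q.T @ A
        U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
        U = Q @ U_small
        return U[:, :k], s[:k], Vt[:k, :]

The expensive dense SVD is applied only to the small (k + p) x n matrix B rather than to A itself, which is what makes this kind of approach tractable for massive data sets.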

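The kernel side can be sketched in the same spirit with random Fourier features (Rahimi and Recht) for the Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2). The per-feature sampling of gamma from an interval below is only a hedged guess at the proposed extension; the helper name and the default interval are hypothetical, and the paper's exact sampling scheme may differ.

    import numpy as np

    def rff_features(X, D, gamma_interval=(0.5, 2.0), seed=0):
        """Random Fourier features with a bandwidth sampled per feature."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # Hypothetical extension: one bandwidth per feature, drawn from an interval
        gammas = rng.uniform(*gamma_interval, size=D)
        # Row i of W ~ N(0, 2 * gamma_i * I), the spectral density of the RBF kernel
        W = rng.standard_normal((D, d)) * np.sqrt(2.0 * gammas)[:, None]
        b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        # z(x) = sqrt(2 / D) * cos(W x + b), so that z(x) . z(y) ~ k(x, y)
        return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

Here Z = rff_features(X, D) is an n x D feature matrix, and Z @ Z.T approximates the kernel matrix (averaged over the sampled bandwidths); fixing gammas to a single value recovers standard random Fourier features.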
