Sign Stable Random Projections for Large-Scale Learning

04/27/2015
by Ping Li, et al.

We study the use of "sign α-stable random projections" (where 0 < α ≤ 2) for building basic data processing tools in the context of large-scale machine learning applications (e.g., classification, regression, clustering, and near-neighbor search). After processing by sign stable random projections, the inner products of the processed data approximate various types of nonlinear kernels depending on the value of α. This approach thus provides an effective strategy for approximating nonlinear learning algorithms at essentially the cost of linear learning. When α = 2, the corresponding nonlinear kernel is known to be the arc-cosine kernel. When α = 1, the procedure approximates the arc-cos-χ² kernel (under certain conditions). When α → 0+, it corresponds to the resemblance kernel. From a practitioner's perspective, the method of sign α-stable random projections is ready to be tested in large-scale learning applications, where α can simply be viewed as a tuning parameter. What is missing from the literature is an extensive empirical study demonstrating the effectiveness of sign stable random projections, especially for α ≠ 2 or 1. This paper supplies such a study on a wide variety of classification datasets. In particular, we compare, side by side, sign stable random projections with the recently proposed "0-bit consistent weighted sampling (CWS)" (Li, 2015).
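To make the procedure concrete, here is a minimal sketch (not the authors' code) of sign α-stable random projections. It draws symmetric α-stable variates via the standard Chambers–Mallows–Stuck sampler, projects the data, and keeps only the signs; the function names and the choice of k are illustrative assumptions.

```python
import numpy as np

def stable_rvs(alpha, size, rng):
    """Symmetric alpha-stable variates via the Chambers-Mallows-Stuck method."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    W = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
            * (np.cos(U - alpha * U) / W) ** ((1 - alpha) / alpha))

def sign_stable_projections(X, k, alpha, seed=0):
    """Project the rows of X onto k alpha-stable directions and keep the signs.

    Returns an (n, k) matrix of +/-1 signatures; the fraction of matching
    signs between two rows estimates the alpha-dependent nonlinear kernel.
    """
    rng = np.random.default_rng(seed)
    R = stable_rvs(alpha, (X.shape[1], k), rng)    # random projection matrix
    return np.sign(X @ R)

# Example: 1-bit signatures with alpha treated as a tuning parameter
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 20))
S = sign_stable_projections(X, k=64, alpha=1.0)
```

For α = 2 the stable draws reduce to (scaled) Gaussians and the sign-collision probability recovers the classic SimHash relation, consistent with the arc-cosine kernel mentioned above; smaller α interpolates toward the other kernels discussed in the abstract.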


