A Sparse Johnson-Lindenstrauss Transform using Fast Hashing

05/04/2023
by Jakob Bæk Tejs Houen, et al.

The Sparse Johnson-Lindenstrauss Transform of Kane and Nelson (SODA 2012) provides a linear dimensionality-reducing map A ∈ ℝ^{m × u} in ℓ_2 that preserves distances up to a distortion of 1 + ε with probability 1 - δ, where m = O(ε^{-2} log 1/δ) and each column of A has O(ε m) non-zero entries. Previous analyses of the Sparse Johnson-Lindenstrauss Transform all assumed access to an Ω(log 1/δ)-wise independent hash function. The main contribution of this paper is a more general analysis of the Sparse Johnson-Lindenstrauss Transform with fewer assumptions on the hash function. We also show that the Mixed Tabulation hash function of Dahlgaard, Knudsen, Rotenberg, and Thorup (FOCS 2015) satisfies the conditions of our analysis, thus giving the first analysis of a Sparse Johnson-Lindenstrauss Transform that works with a practical hash function.
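To make the construction concrete, the following is a minimal sketch of a sparse JL embedding with the parameters from the abstract: m = O(ε^{-2} log 1/δ) rows and exactly s = O(ε m) non-zero entries of value ±1/√s per column. For simplicity this sketch draws the non-zero positions and signs fully at random with numpy; the paper's contribution is precisely that weaker, practical hash functions (such as Mixed Tabulation) can replace this full randomness, which this illustration does not implement.

```python
import numpy as np


def sparse_jl_matrix(u, m, s, rng):
    """Build a sparse JL matrix A in R^{m x u}.

    Each column has exactly s non-zero entries, each equal to +-1/sqrt(s),
    at uniformly random distinct rows. (Illustrative: full randomness
    stands in for the hash functions analyzed in the paper.)
    """
    A = np.zeros((m, u))
    for j in range(u):
        rows = rng.choice(m, size=s, replace=False)  # s distinct rows
        signs = rng.choice([-1.0, 1.0], size=s)      # random signs
        A[rows, j] = signs / np.sqrt(s)
    return A


# Choose parameters as in the abstract (constants set to 1 for illustration).
u, eps, delta = 1000, 0.25, 0.01
m = int(np.ceil(eps**-2 * np.log(1 / delta)))  # m = O(eps^-2 log 1/delta)
s = max(1, int(np.ceil(eps * m)))              # column sparsity O(eps m)

rng = np.random.default_rng(42)
A = sparse_jl_matrix(u, m, s, rng)

# Embedding a vector should preserve its l2 norm up to roughly 1 +- eps.
x = np.random.default_rng(0).standard_normal(u)
ratio = np.linalg.norm(A @ x) / np.linalg.norm(x)
```

Because each column has only s non-zeros rather than m, applying A to a vector with k non-zero coordinates costs O(k s) time instead of O(k m), which is the point of sparsifying the transform.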


