When Are Learned Models Better Than Hash Functions?

07/03/2021
by Ibrahim Sabek, et al.

In this work, we aim to study when learned models are better hash functions, particularly for hash-maps. We use lightweight piece-wise linear models to replace the hash functions, as they have small inference times and are sufficiently general to capture complex distributions. We analyze the learned models in terms of two factors: model inference time and the number of collisions. Surprisingly, we found that learned models are not much slower to compute than hash functions if optimized correctly. However, it turns out that learned models can only reduce the number of collisions (i.e., the number of times different keys have the same hash value) if the model is able to over-fit to the data; otherwise, it cannot be better than an ordinary hash function. Hence, how much better a learned model is at avoiding collisions depends heavily on the data and on the model's ability to over-fit. To evaluate the effectiveness of learned models, we used them as hash functions in bucket chaining and Cuckoo hash tables. For the bucket chaining hash table, we found that learned models can achieve improvements of around 30%. For Cuckoo hash tables, in some datasets, learned models can increase the ratio of keys stored in their primary locations by around 10%. In summary, learned models can indeed outperform hash functions, but only for certain data distributions and with a limited margin.
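To make the setup concrete, here is a minimal sketch of the mechanism the abstract describes: a piece-wise linear approximation of the key distribution's CDF used in place of a hash function to assign keys to buckets, compared against an ordinary multiplicative hash by counting collisions. The names (PiecewiseLinearHash, MultiplicativeHash, count_collisions) and parameters (e.g., 256 segments) are illustrative assumptions, not the authors' implementation, which uses more elaborate learned models and also measures inference time.

```cpp
// Sketch only: a piece-wise linear CDF model used as the "hash function"
// of a bucket-chaining table, versus a conventional multiplicative hash.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// Piece-wise linear model: sample evenly spaced anchor keys from the sorted
// key set and linearly interpolate between them to estimate a key's CDF.
struct PiecewiseLinearHash {
    std::vector<uint64_t> anchors;  // anchor keys (sorted)
    std::vector<double> cdf;        // estimated CDF value at each anchor
    std::size_t num_buckets;

    PiecewiseLinearHash(std::vector<uint64_t> keys, std::size_t segments,
                        std::size_t buckets)
        : num_buckets(buckets) {
        std::sort(keys.begin(), keys.end());
        for (std::size_t s = 0; s <= segments; ++s) {
            std::size_t idx = (keys.size() - 1) * s / segments;
            anchors.push_back(keys[idx]);
            cdf.push_back(static_cast<double>(idx) / (keys.size() - 1));
        }
    }

    // Map a key to a bucket via its interpolated CDF position.
    std::size_t operator()(uint64_t key) const {
        auto it = std::upper_bound(anchors.begin(), anchors.end(), key);
        if (it == anchors.begin()) return 0;
        if (it == anchors.end()) return num_buckets - 1;
        std::size_t hi = it - anchors.begin(), lo = hi - 1;
        double span = static_cast<double>(anchors[hi] - anchors[lo]);
        double frac = span > 0 ? (key - anchors[lo]) / span : 0.0;
        double pos = cdf[lo] + frac * (cdf[hi] - cdf[lo]);
        return std::min(static_cast<std::size_t>(pos * num_buckets),
                        num_buckets - 1);
    }
};

// An "ordinary" hash function for comparison (Fibonacci multiplicative hash).
struct MultiplicativeHash {
    std::size_t operator()(uint64_t key) const {
        return static_cast<std::size_t>((key * 0x9E3779B97F4A7C15ULL) >> 32);
    }
};

// Collision count: keys that land in an already-occupied bucket.
template <typename HashFn>
std::size_t count_collisions(const std::vector<uint64_t>& keys,
                             std::size_t buckets, const HashFn& h) {
    std::vector<std::size_t> load(buckets, 0);
    std::size_t collisions = 0;
    for (uint64_t k : keys)
        if (load[h(k) % buckets]++ > 0) ++collisions;
    return collisions;
}

int main() {
    // Skewed but smooth key distribution that a small model can fit closely.
    std::vector<uint64_t> keys;
    for (uint64_t i = 0; i < 100000; ++i) keys.push_back(i * i);

    std::size_t buckets = keys.size();
    PiecewiseLinearHash learned(keys, /*segments=*/256, buckets);
    MultiplicativeHash classic;

    std::cout << "learned-model collisions:  "
              << count_collisions(keys, buckets, learned) << "\n"
              << "multiplicative collisions: "
              << count_collisions(keys, buckets, classic) << "\n";
}
```

On this kind of smooth, learnable distribution the CDF model places keys almost by rank and collides far less often than the generic hash; on data the model cannot over-fit, the two approaches converge, which is exactly the trade-off the abstract describes.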


