
Meta-Learning Neural Bloom Filters

06/10/2019
by Jack W. Rae, et al.

There has been a recent trend in training neural networks to replace hand-crafted data structures, with an aim for faster execution, better accuracy, or greater compression. In this setting, a neural data structure is instantiated by training a network over many epochs of its inputs until convergence. In applications where inputs arrive at high throughput, or are ephemeral, training a network from scratch is not practical. This motivates the need for few-shot neural data structures. In this paper we explore the learning of approximate set membership over a set of data in one shot via meta-learning. We propose a novel memory architecture, the Neural Bloom Filter, which is able to achieve significant compression gains over classical Bloom Filters and existing memory-augmented neural networks.
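For context, the classical Bloom filter that the paper uses as its compression baseline can be sketched as follows. This is a standard textbook construction, not code from the paper; the class name and the double-hashing scheme are illustrative choices, while the sizing formulas (m = -n ln p / (ln 2)^2 bits, k = (m/n) ln 2 hash functions) are the standard optima for n items at false-positive rate p.

```python
import hashlib
import math

class BloomFilter:
    """Classical Bloom filter: k hash functions indexing into an m-bit array."""

    def __init__(self, n_items, fp_rate):
        # Optimal sizing for n items at target false-positive rate p:
        # m = -n ln(p) / (ln 2)^2 bits, k = (m / n) ln 2 hash functions.
        self.m = max(1, math.ceil(-n_items * math.log(fp_rate) / math.log(2) ** 2))
        self.k = max(1, round(self.m / n_items * math.log(2)))
        self.bits = bytearray((self.m + 7) // 8)

    def _indices(self, item):
        # Derive k indices from one digest via double hashing: h1 + i*h2 mod m.
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big") | 1  # force odd step
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for idx in self._indices(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, item):
        # May return True for items never added (a false positive),
        # but never returns False for an item that was added.
        return all(self.bits[idx // 8] >> (idx % 8) & 1
                   for idx in self._indices(item))
```

For example, `BloomFilter(1000, 0.01)` allocates roughly 9.6 bits per stored item regardless of item size; the compression gains claimed for the Neural Bloom Filter are measured against this kind of fixed per-item budget.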

02/19/2019

In oder Aus ("In or Out")

Bloom filters are data structures used to determine set membership of el...
06/08/2023

EMO: Episodic Memory Optimization for Few-Shot Meta-Learning

Few-shot meta-learning presents a challenge for gradient descent optimiz...
05/31/2022

Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks

Few-shot learning for neural networks (NNs) is an important problem that...
08/05/2022

Compressing (Multidimensional) Learned Bloom Filters

Bloom filters are widely used data structures that compactly represent s...
07/05/2017

Labeled Memory Networks for Online Model Adaptation

Augmenting a neural network with memory that can grow without growing th...
06/27/2020

Optimizing Cuckoo Filter for high burst tolerance, low latency, and high throughput

In this paper, we present an implementation of a cuckoo filter for membe...
08/08/2022

NeuralVDB: High-resolution Sparse Volume Representation using Hierarchical Neural Networks

We introduce NeuralVDB, which improves on an existing industry standard ...