Constant Memory Attentive Neural Processes

05/23/2023

by Leo Feng et al.

Neural Processes (NPs) are efficient methods for estimating predictive uncertainties. NPs comprise a conditioning phase, in which a context dataset is encoded; a querying phase, in which the model makes predictions using the context-dataset encoding; and an updating phase, in which the model updates its encoding with newly received datapoints. However, state-of-the-art methods require additional memory that scales linearly or quadratically with the size of the dataset, limiting their applications, particularly in low-resource settings. In this work, we propose Constant Memory Attentive Neural Processes (CMANPs), an NP variant that requires only constant memory for the conditioning, querying, and updating phases. In building CMANPs, we propose the Constant Memory Attention Block (CMAB), a novel general-purpose attention block that can compute its output in constant memory and perform updates in constant computation. Empirically, we show CMANPs achieve state-of-the-art results on meta-regression and image completion tasks while being (1) significantly more memory efficient than prior methods and (2) more scalable to harder settings.
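To make the "constant memory" claim concrete, the sketch below shows one way such a property can arise: a fixed set of K latent vectors cross-attends to the context points, and the attention is accumulated with a streaming (online) softmax, so the stored state is O(K·D) no matter how many context points have been absorbed, and each update costs only compute proportional to the new points. This is an illustrative sketch of the general idea, not the paper's actual CMAB mechanism; the class name and its fields are hypothetical.

```python
import numpy as np

class StreamingLatentAttention:
    """Cross-attend K fixed latents to context points using a streaming
    softmax. State is O(K*D) regardless of dataset size; absorbing n new
    points costs O(n*K*D) compute. A sketch of the constant-memory idea,
    not the paper's CMAB."""

    def __init__(self, latents):
        self.latents = latents                # (K, D), fixed (learned in practice)
        K = latents.shape[0]
        self.m = np.full(K, -np.inf)          # running max score per latent (stability)
        self.z = np.zeros(K)                  # running softmax normalizer per latent
        self.s = np.zeros_like(latents)       # running weighted sum of values

    def update(self, x):
        """Absorb context points x of shape (n, D) incrementally."""
        scores = self.latents @ x.T / np.sqrt(x.shape[1])   # (K, n)
        m_new = np.maximum(self.m, scores.max(axis=1))
        scale = np.exp(self.m - m_new)        # rescale old accumulators to new max
        w = np.exp(scores - m_new[:, None])   # stabilized attention weights
        self.z = self.z * scale + w.sum(axis=1)
        self.s = self.s * scale[:, None] + w @ x
        self.m = m_new

    def encoding(self):
        """Constant-size (K, D) summary of everything absorbed so far."""
        return self.s / self.z[:, None]
```

Because the online softmax is exact, absorbing the context in chunks yields the same encoding as processing it all at once, which is what makes cheap incremental updates possible.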


