Contrastive Conditional Neural Processes

03/08/2022
by Zesheng Ye, et al.

Conditional Neural Processes (CNPs) bridge neural networks with probabilistic inference to approximate functions of stochastic processes under meta-learning settings. Given a batch of non-i.i.d. function instantiations, CNPs are jointly optimized for in-instantiation observation prediction and cross-instantiation meta-representation adaptation within a generative reconstruction pipeline. Tying these two targets together becomes challenging when the distribution of function observations scales to high-dimensional, noisy spaces. Noise contrastive estimation may instead yield more robust representations, as its distributional-matching objectives sidestep this inherent limitation of generative models. In light of this, we propose to equip CNPs by 1) aligning the prediction with the encoded ground-truth observation, and 2) decoupling meta-representation adaptation from generative reconstruction. Specifically, two auxiliary contrastive branches are set up hierarchically: in-instantiation temporal contrastive learning (TCL) and cross-instantiation function contrastive learning (FCL), which facilitate local predictive alignment and global function consistency, respectively. We empirically show that TCL captures a high-level abstraction of observations, whereas FCL helps identify the underlying functions, which in turn yields more efficient representations. Our model outperforms other CNP variants on function-distribution reconstruction and parameter identification across 1D, 2D, and high-dimensional time-series tasks.
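The abstract does not spell out the exact form of the two auxiliary branches, so the following is a minimal sketch of how they could be wired up, assuming a standard InfoNCE objective applied at both levels. All tensor names (pred_emb, obs_emb, func_view1, func_view2), the dummy shapes, and the loss weights are hypothetical placeholders for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def info_nce(queries: torch.Tensor, keys: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE loss: each query's positive key shares its row index;
    every other key in the batch serves as a negative."""
    q = F.normalize(queries, dim=-1)
    k = F.normalize(keys, dim=-1)
    logits = q @ k.t() / temperature                   # (N, N) similarities
    labels = torch.arange(q.size(0), device=q.device)  # positives on diagonal
    return F.cross_entropy(logits, labels)

# Hypothetical shapes: 8 function instantiations, 16 targets each, 32-dim embeddings.
num_funcs, num_targets, dim = 8, 16, 32

# In-instantiation TCL: align each decoded prediction with the encoding of
# the ground-truth observation at the same target location.
pred_emb = torch.randn(num_funcs * num_targets, dim)   # stand-in decoder outputs
obs_emb = torch.randn(num_funcs * num_targets, dim)    # stand-in observation encodings
tcl_loss = info_nce(pred_emb, obs_emb)

# Cross-instantiation FCL: two context subsets drawn from the same function
# form a positive pair; other functions' representations act as negatives.
func_view1 = torch.randn(num_funcs, dim)               # stand-in function embeddings
func_view2 = torch.randn(num_funcs, dim)
fcl_loss = info_nce(func_view1, func_view2)

# Assumed total objective: generative reconstruction plus both auxiliary branches.
recon_nll = torch.tensor(0.0)                          # placeholder for CNP likelihood term
loss = recon_nll + 0.1 * tcl_loss + 0.1 * fcl_loss
```

Under this reading, both branches reduce to the same contrastive form at different granularities: TCL contrasts prediction/observation pairs within one function instantiation, while FCL contrasts whole-function representations across instantiations, which is what decouples meta-representation adaptation from the generative reconstruction term.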

Related research

06/18/2021 - On Contrastive Representations of Stochastic Processes
Learning representations of stochastic processes is an emerging problem ...

03/23/2023 - Adversarially Contrastive Estimation of Conditional Neural Processes
Conditional Neural Processes (CNPs) formulate distributions over functio...

11/10/2021 - Conditional Alignment and Uniformity for Contrastive Learning with Continuous Proxy Labels
Contrastive Learning has shown impressive results on natural and medical...

06/05/2019 - Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings
Current meta-learning approaches focus on learning functional representa...

01/30/2023 - Contrastive Meta-Learning for Partially Observable Few-Shot Learning
Many contrastive and meta-learning approaches learn representations by i...

06/04/2023 - ContraBAR: Contrastive Bayes-Adaptive Deep RL
In meta reinforcement learning (meta RL), an agent seeks a Bayes-optimal...

05/31/2022 - SOM-CPC: Unsupervised Contrastive Learning with Self-Organizing Maps for Structured Representations of High-Rate Time Series
Continuous monitoring with an ever-increasing number of sensors has beco...
