Learning about an exponential amount of conditional distributions

We introduce the Neural Conditioner (NC), a self-supervised machine able to learn about all of the conditional distributions of a random vector X. The NC is a function NC(x · a, a, r), where a and r are binary masks indicating the available and requested variables, that leverages adversarial training to match each conditional distribution P(X_r | X_a = x_a). After training, the NC generalizes to sample from conditional distributions never seen during training, including the joint distribution. The NC can also auto-encode examples, providing data representations useful for downstream classification tasks. In sum, the NC integrates different self-supervised learning tasks (each the estimation of one conditional distribution) and different levels of supervision (partially observed data) seamlessly into a single learning experience.
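As a concrete illustration of how such a conditioner can be trained, the sketch below pairs a mask-conditioned generator with a discriminator in a standard GAN loop. It is a minimal sketch only: the MLP architectures, Gaussian stand-in data, Bernoulli(0.5) masks, and all sizes and hyperparameters are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal NC-style training sketch (assumptions: MLP generator/discriminator,
# Gaussian toy data, Bernoulli(0.5) masks, non-saturating GAN loss; all sizes
# and hyperparameters are illustrative, not taken from the paper).
import torch
import torch.nn as nn

d, noise_dim, batch = 8, 16, 64                      # data dim, noise dim, batch size

# Generator NC(x*a, a, r, z): proposes a sample of X_r given the observed values x*a.
nc = nn.Sequential(nn.Linear(3 * d + noise_dim, 128), nn.ReLU(),
                   nn.Linear(128, d))
# Discriminator scores (candidate x_r, observed x*a, a, r) tuples.
disc = nn.Sequential(nn.Linear(4 * d, 128), nn.ReLU(),
                     nn.Linear(128, 1))
opt_g = torch.optim.Adam(nc.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    x = torch.randn(batch, d)                        # stand-in for real data
    a = torch.bernoulli(torch.full((batch, d), 0.5)) # mask of available variables
    r = torch.bernoulli(torch.full((batch, d), 0.5)) # mask of requested variables
    z = torch.randn(batch, noise_dim)

    fake_r = nc(torch.cat([x * a, a, r, z], dim=1)) * r   # generated sample of X_r

    # Discriminator step: real x*r vs. generated values, both conditioned on (x*a, a, r).
    d_real = disc(torch.cat([x * r, x * a, a, r], dim=1))
    d_fake = disc(torch.cat([fake_r.detach(), x * a, a, r], dim=1))
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: make the generated conditional samples indistinguishable from real ones.
    d_fake = disc(torch.cat([fake_r, x * a, a, r], dim=1))
    loss_g = bce(d_fake, torch.ones_like(d_fake))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Because the masks a and r are resampled at every step, a single generator is asked to cover many different conditionals, which is what lets it generalize to mask pairs never seen during training.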


