Probabilistic Meta-Representations Of Neural Networks

10/01/2018
by Theofanis Karaletsos, et al.

Existing Bayesian treatments of neural networks are typically characterized by weak prior and approximate posterior distributions according to which all the weights are drawn independently. Here, we consider a richer prior distribution in which units in the network are represented by latent variables, and the weights between units are drawn conditionally on the values of the collection of those variables. This allows rich correlations between related weights, and can be seen as realizing a function prior with a Bayesian complexity regularizer ensuring simple solutions. We illustrate the resulting meta-representations and representations, elucidating the power of this prior.
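To make the idea concrete, here is a minimal sketch of such a unit-level latent-variable prior. Each unit gets a latent code, and every weight is drawn conditionally on the codes of the two units it connects, so weights sharing a unit become correlated. The bilinear mean `z_in @ z_out.T` and the function name `sample_layer_weights` are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_layer_weights(n_in, n_out, latent_dim=2, sigma_w=0.1):
    """Sample a weight matrix under a unit-level latent-variable prior.

    Each unit i has a latent code z_i; the weight between units i and j
    is drawn around a conditional mean f(z_i, z_j). Here f is a simple
    bilinear form, chosen purely for illustration.
    """
    z_in = rng.standard_normal((n_in, latent_dim))    # latents for input units
    z_out = rng.standard_normal((n_out, latent_dim))  # latents for output units
    mean = z_in @ z_out.T                             # conditional mean of each weight
    w = mean + sigma_w * rng.standard_normal((n_in, n_out))
    return w

W = sample_layer_weights(4, 3)
print(W.shape)  # (4, 3)
```

Because all weights touching a given unit depend on that unit's latent code, a low-dimensional code acts as a complexity bottleneck, which is one way to read the "Bayesian complexity regularizer" mentioned above.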
