DISCO Nets: DISsimilarity COefficient Networks

06/08/2016
by Diane Bouchacourt, et al.

We present a new type of probabilistic model which we call DISsimilarity COefficient Networks (DISCO Nets). DISCO Nets allow us to efficiently sample from a posterior distribution parametrised by a neural network. During training, DISCO Nets are learned by minimising the dissimilarity coefficient between the true distribution and the estimated distribution. This allows us to tailor the training to the loss related to the task at hand. We empirically show that (i) by modeling uncertainty on the output value, DISCO Nets outperform equivalent non-probabilistic predictive networks and (ii) DISCO Nets accurately model the uncertainty of the output, outperforming existing probabilistic models based on deep neural networks.

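The abstract does not spell out the training objective in detail, so the following is a minimal sketch of how a dissimilarity-coefficient loss of this kind could be implemented in PyTorch. The names DISCONet and disco_loss, the network architecture, and the choice of Euclidean distance as the task loss are illustrative assumptions, not the paper's actual implementation; the key idea shown is that K outputs are sampled per input (via random noise) and the loss rewards closeness to the ground truth while keeping the samples diverse.

```python
import torch
import torch.nn as nn

class DISCONet(nn.Module):
    """Illustrative network mapping an input x and a noise sample z to an output y."""
    def __init__(self, x_dim, z_dim, y_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, y_dim),
        )

    def forward(self, x, z):
        # Concatenate the input with the noise sample and predict an output sample.
        return self.net(torch.cat([x, z], dim=-1))


def disco_loss(y_true, y_samples, gamma=0.5):
    """Empirical dissimilarity-coefficient loss for one training example (sketch).

    y_true:    (y_dim,) ground-truth output
    y_samples: (K, y_dim) outputs sampled from the network for the same input
    gamma:     trade-off between matching the ground truth and sample diversity
    Euclidean distance stands in for the task loss here; the approach allows
    any task-specific loss.
    """
    K = y_samples.shape[0]
    # Average distance between the sampled outputs and the ground truth.
    data_term = torch.linalg.norm(y_samples - y_true.unsqueeze(0), dim=-1).mean()
    # Average pairwise distance between distinct samples (diversity term).
    pairwise = torch.cdist(y_samples, y_samples)      # (K, K), zero diagonal
    diversity = pairwise.sum() / (K * (K - 1))
    return data_term - gamma * diversity


# Usage sketch: draw K noise samples per input, compute the loss, backpropagate.
x = torch.randn(10)                       # hypothetical input
y = torch.randn(3)                        # hypothetical ground-truth output
model = DISCONet(x_dim=10, z_dim=4, y_dim=3)
K = 8
z = torch.rand(K, 4)                      # K noise samples
y_hat = model(x.expand(K, -1), z)         # (K, 3) sampled outputs
loss = disco_loss(y, y_hat)
loss.backward()
```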