Unifying supervised learning and VAEs – automating statistical inference in high-energy physics

08/13/2020
by Thorsten Glüsenkamp, et al.

A KL-divergence objective on the joint distribution of data and labels unifies supervised learning, VAEs, and semi-supervised learning under one umbrella of variational inference. This viewpoint has several advantages. For VAEs, it clarifies the interpretation of the encoder and decoder parts. For supervised learning, it reiterates that the training procedure approximates the true posterior over labels and can always be viewed as approximate likelihood-free inference. This is typically not discussed, even though the derivation is well known in the literature. In the context of semi-supervised learning, it motivates an extended supervised scheme that makes it possible to calculate a goodness-of-fit p-value using posterior predictive simulations. Flow-based networks with a standard normal base distribution are crucial: we discuss how they allow coverage to be rigorously defined for arbitrary joint posteriors on ℝ^n × 𝒮^m, which encompasses posteriors over directions. Finally, systematic uncertainties are naturally included in the variational viewpoint. With the three ingredients of (1) systematics, (2) coverage, and (3) goodness-of-fit, flow-based neural networks have the potential to replace a large part of the statistical toolbox of the contemporary high-energy physicist.
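The coverage idea for flows with a standard normal base distribution can be sketched in a few lines. The sketch below is a hypothetical illustration, not code from the paper: for a calibrated flow, the true label mapped into base space is standard normal, so the α-credible highest-density region corresponds to the ball ‖z‖² ≤ χ²_n(α), and empirical coverage can be read off as the fraction of mapped labels inside that ball (the function name `expected_coverage` and the toy data are assumptions for this example).

```python
import numpy as np
from scipy.stats import chi2

def expected_coverage(z_true, alpha):
    """Fraction of true labels, mapped into the flow's base space,
    that fall inside the alpha-credible chi-square ball.
    For a well-calibrated posterior this should be close to alpha."""
    n_dim = z_true.shape[1]
    r2 = np.sum(z_true**2, axis=1)          # squared radius in base space
    return np.mean(r2 <= chi2.ppf(alpha, df=n_dim))

# Toy check: a perfectly calibrated flow maps true labels to standard
# normal samples, so empirical coverage matches the nominal level.
rng = np.random.default_rng(0)
z = rng.standard_normal((100_000, 2))       # stand-in for flow-mapped labels
print(expected_coverage(z, 0.68))           # close to 0.68
print(expected_coverage(z, 0.95))           # close to 0.95
```

Deviations of the empirical fraction from α would flag miscalibration of the learned posterior; the same base-space construction extends to the joint ℝ^n × 𝒮^m posteriors discussed in the abstract.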

