Non-Generative Energy Based Models

04/03/2023
by Jacob Piland, et al.

Energy-based models (EBMs) have become increasingly popular in computer vision. EBMs bring a probabilistic approach to training deep neural networks (DNNs) and have been shown to improve calibration, out-of-distribution detection, and adversarial robustness. However, these advantages come at the cost of estimating the probability of the input data, usually with a Langevin-based method such as Stochastic Gradient Langevin Dynamics (SGLD), which adds computational overhead, requires careful parameterization and caching schemes for efficiency, and can suffer from stability and scaling issues. EBMs use dynamical methods to draw samples from the probability density function (PDF) defined by the current state of the network and compare them to the training data, learning the correct PDF by maximizing log-likelihood. We propose a non-generative training approach, the Non-Generative EBM (NG-EBM), which uses the Approximate Mass identified by Grathwohl et al. as a loss term to direct training. We show that our NG-EBM training strategy retains many of the benefits of EBMs in calibration, out-of-distribution detection, and adversarial robustness, but without the computational complexity and overhead of the traditional approaches. In particular, the NG-EBM approach improves the Expected Calibration Error by a factor of 2.5 for CIFAR10 and 7.5 for CIFAR100 when compared to conventionally trained models.
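The SGLD sampling step that NG-EBM avoids can be sketched as follows. This is a hypothetical toy illustration, not the paper's code: the "model" is a fixed quadratic energy E(x) = 0.5 * x^2 standing in for a network's energy function, so the Langevin chain should converge to samples from a standard normal distribution.

```python
import numpy as np

def grad_energy(x):
    # Gradient of the toy energy E(x) = 0.5 * x^2 with respect to x.
    # In a real EBM this would be a backward pass through the network.
    return x

def sgld_sample(x0, step=0.05, n_steps=2000, seed=0):
    """Stochastic Gradient Langevin Dynamics:
    x <- x - (step / 2) * dE/dx + sqrt(step) * gaussian_noise.
    Each chain step needs a fresh gradient, which is the per-iteration
    cost (and stability risk) that a non-generative loss sidesteps."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - 0.5 * step * grad_energy(x) + np.sqrt(step) * noise
    return x

# Start 5000 chains far from the mode; they should mix toward N(0, 1).
samples = sgld_sample(np.full(5000, 4.0))
print(samples.mean(), samples.std())
```

Because the stationary density of the chain is proportional to exp(-E(x)), the sample mean should land near 0 and the standard deviation near 1 (up to discretization error from the finite step size).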
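The Expected Calibration Error (ECE) cited in the result can be computed by binning predictions by confidence and averaging the accuracy-confidence gap, weighted by bin size. A minimal sketch with made-up toy data (the function and variable names here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: partition predictions into confidence bins, then sum
    (bin weight) * |bin accuracy - bin mean confidence| over bins."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap  # mask.mean() is the bin weight
    return ece

# Toy perfectly calibrated case: 80% confidence, 8 of 10 correct.
conf = np.full(10, 0.8)
hit = np.array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0])
ece = expected_calibration_error(conf, hit)
print(ece)
```

For this toy case the reported accuracy matches the stated confidence in its bin, so the ECE is zero; a poorly calibrated model (e.g. 99% confidence with 80% accuracy) would score 0.19.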

Related research

- Adversarial Training Improves Joint Energy-Based Generative Modelling (07/18/2022)
- Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One (12/06/2019)
- Learning to Draw Samples: With Application to Amortized MLE for Generative Adversarial Learning (11/06/2016)
- Morse Neural Networks for Uncertainty Quantification (07/02/2023)
- Training Energy-Based Normalizing Flow with Score-Matching Objectives (05/24/2023)
- Gradient-based Causal Structure Learning with Normalizing Flow (10/07/2020)
- Energy-efficient and Robust Cumulative Training with Net2Net Transformation (03/02/2020)
