Evolutionary Variational Optimization of Generative Models

12/22/2020
by Jakob Drefs, et al.

We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms. The combination is realized for generative models with discrete latents by using truncated posteriors as the family of variational distributions. The variational parameters of truncated posteriors are sets of latent states. By interpreting these states as genomes of individuals and by using the variational lower bound to define a fitness, we can apply evolutionary algorithms to realize the variational loop. The variational distributions used are very flexible, and we show that evolutionary algorithms can effectively and efficiently optimize the variational bound. Furthermore, the variational loop is generally applicable ("black box"), requiring no analytical derivations. To show general applicability, we apply the approach to three generative models (noisy-OR Bayes Nets, Binary Sparse Coding, and Spike-and-Slab Sparse Coding). To demonstrate the effectiveness and efficiency of the novel variational approach, we use the standard competitive benchmarks of image denoising and inpainting. The benchmarks allow quantitative comparisons to a wide range of methods, including probabilistic approaches, deep deterministic and generative networks, and non-local image processing methods. In the category of "zero-shot" learning (when only the corrupted image is used for training), we observed the evolutionary variational algorithm to significantly improve the state-of-the-art in many benchmark settings. For one well-known inpainting benchmark, we also observed state-of-the-art performance across all categories of algorithms, even though we train only on the corrupted image. In general, our investigations highlight the importance of research on optimization methods for generative models to achieve performance improvements.
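The core idea of the abstract can be sketched in code: a truncated posterior is parameterized by a set of latent states, each state's "fitness" is its contribution to the variational lower bound (here, the log joint), and an evolutionary loop mutates states and keeps the fittest. The sketch below is a minimal illustration with an invented toy linear model and Bernoulli latents; the model, parameter names, and single-bit mutation operator are assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(s, y, W, sigma=1.0, pi=0.2):
    # Log p(y, s) for a toy linear-Gaussian model with Bernoulli latents
    # (an illustrative stand-in for one of the paper's generative models).
    mean = W @ s
    log_lik = -0.5 * np.sum((y - mean) ** 2) / sigma**2
    log_prior = np.sum(s * np.log(pi) + (1 - s) * np.log(1 - pi))
    return log_lik + log_prior

def evolve_states(K, y, W, n_gen=50, n_children=3):
    # K is the set of binary latent states that parameterizes the truncated
    # posterior; each state is treated as a genome. Fitness is the log joint,
    # so elitist selection can only tighten the variational bound.
    H = W.shape[1]
    for _ in range(n_gen):
        pool = list(K)
        for s in K:
            for _ in range(n_children):
                child = s.copy()
                flip = rng.integers(H)        # single-bit mutation
                child[flip] = 1 - child[flip]
                pool.append(child)
        # Deduplicate, then keep the |K| fittest states.
        uniq = {tuple(s): s for s in pool}.values()
        scored = sorted(uniq, key=lambda s: log_joint(s, y, W), reverse=True)
        K = scored[: len(K)]
    return K
```

Because the parents stay in the candidate pool and selection is elitist, the fitness of the retained set is non-decreasing over generations, mirroring the monotone improvement of the truncated variational bound described above.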


Related research

12/21/2017: Truncated Variational Sampling for "Black Box" Optimization of Generative Models. "We investigate the optimization of two generative models with binary hid..."

11/27/2020: Direct Evolutionary Optimization of Variational Autoencoders With Binary Latents. "Discrete latent variables are considered important for real world data, ..."

11/15/2012: A Truncated EM Approach for Spike-and-Slab Sparse Coding. "We study inference and learning based on a sparse coding model with `spi..."

09/07/2022: On the Convergence of the ELBO to Entropy Sums. "The variational lower bound (a.k.a. ELBO or free energy) is the central ..."

03/07/2022: Learning to Bound: A Generative Cramér-Rao Bound. "The Cramér-Rao bound (CRB), a well-known lower bound on the performance ..."

01/31/2019: Improving Evolutionary Strategies with Generative Neural Networks. "Evolutionary Strategies (ES) are a popular family of black-box zeroth-or..."

03/10/2017: Evolutionary Image Composition Using Feature Covariance Matrices. "Evolutionary algorithms have recently been used to create a wide range o..."
