Quantum Variational Autoencoder

02/15/2018
by Amir Khoshaman, et al.

Variational autoencoders (VAEs) are powerful generative models with the salient ability to perform inference. Here, we introduce a quantum variational autoencoder (QVAE): a VAE whose latent generative process is implemented as a quantum Boltzmann machine (QBM). We show that our model can be trained end-to-end by maximizing a well-defined loss function: a "quantum" lower bound to a variational approximation of the log-likelihood. We use quantum Monte Carlo (QMC) simulations to train and evaluate the performance of QVAEs. To achieve the best performance, we first build a VAE platform with a discrete latent space generated by a restricted Boltzmann machine (RBM). This model achieves state-of-the-art performance on the MNIST dataset among similar approaches that use only discrete variables in the generative process. Because QMC simulations are computationally expensive, we then consider QVAEs with a smaller number of latent units. We show that, despite training via the quantum bound, QVAEs can be trained effectively in regimes where quantum effects are relevant. Our findings open the way to the use of quantum computers to train QVAEs with competitive performance as generative models: placing a QBM in the latent space of a VAE leverages the full potential of current and next-generation quantum computers as sampling devices.
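To make the training objective concrete, the sketch below estimates the variational lower bound E_q[log p(x|z) + log p(z) - log q(z|x)] in the classical limit of the model: an RBM prior over binary latents (no transverse field), a factorized Bernoulli encoder, and a toy linear Bernoulli decoder. All sizes, weights, and helper names (`rbm_energy`, `elbo`, `dec_W`) are illustrative assumptions, not the authors' implementation; the partition function is enumerated exactly here, which is only feasible for a handful of units, whereas the paper relies on the quantum bound and QMC sampling instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only): a tiny RBM prior over binary
# latents z = (v, h), small enough to enumerate exactly.
n_v, n_h, n_x = 3, 3, 4
W = rng.normal(scale=0.1, size=(n_v, n_h))            # RBM couplings
a = np.zeros(n_v)                                     # visible biases
b = np.zeros(n_h)                                     # hidden biases
dec_W = rng.normal(scale=0.1, size=(n_x, n_v + n_h))  # toy linear decoder

def rbm_energy(v, h):
    """Classical RBM energy E(v, h) = -a.v - b.h - v^T W h."""
    return -(v @ a + h @ b + v @ W @ h)

def log_partition():
    # Exact log Z by brute-force enumeration; a real QVAE avoids this
    # via the (quantum) variational bound and QMC / annealer sampling.
    states = [np.array(s, dtype=float)
              for s in np.ndindex(*(2,) * (n_v + n_h))]
    return np.log(sum(np.exp(-rbm_energy(z[:n_v], z[n_v:])) for z in states))

def elbo(x, q_logits, n_samples=200):
    """Monte Carlo estimate of E_q[log p(x|z) + log p(z) - log q(z|x)]
    with a factorized Bernoulli encoder q(z|x)."""
    q = 1.0 / (1.0 + np.exp(-q_logits))               # q(z_i = 1 | x)
    lZ = log_partition()
    total = 0.0
    for _ in range(n_samples):
        z = (rng.random(n_v + n_h) < q).astype(float)
        log_prior = -rbm_energy(z[:n_v], z[n_v:]) - lZ
        log_q = np.sum(z * np.log(q) + (1.0 - z) * np.log1p(-q))
        p_x = 1.0 / (1.0 + np.exp(-(dec_W @ z)))      # Bernoulli decoder
        log_lik = np.sum(x * np.log(p_x) + (1.0 - x) * np.log1p(-p_x))
        total += log_lik + log_prior - log_q
    return total / n_samples

# Usage: evaluate the bound for one binary "image" and fixed encoder logits.
x = np.array([1.0, 0.0, 1.0, 1.0])
bound = elbo(x, q_logits=np.zeros(n_v + n_h))
print(f"ELBO estimate: {bound:.3f}")   # a lower bound on log p(x)
```

In the full QVAE the RBM energy is replaced by a QBM Hamiltonian with a transverse field, which makes log p(z) itself intractable; the paper's quantum bound sidesteps this by bounding the quantum log-probability with quantities estimable from (simulated or hardware) samples.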


Related research

- 12/04/2019: A Path Towards Quantum Advantage in Training Deep Generative Models with Quantum Annealers
  "The development of quantum-classical hybrid (QCH) algorithms is critical..."
- 08/26/2021: Training a discrete variational autoencoder for generative chemistry and drug design on a quantum annealer
  "Deep generative chemistry models emerge as powerful tools to expedite dr..."
- 08/06/2021: GLASS: Geometric Latent Augmentation for Shape Spaces
  "We investigate the problem of training generative models on a very spars..."
- 07/09/2021: Lifelong Mixture of Variational Autoencoders
  "In this paper, we propose an end-to-end lifelong learning mixture of exp..."
- 04/23/2019: Generated Loss, Augmented Training, and Multiscale VAE
  "The variational autoencoder (VAE) framework remains a popular option for..."
- 06/13/2020: High-Dimensional Similarity Search with Quantum-Assisted Variational Autoencoder
  "Recent progress in quantum algorithms and hardware indicates the potenti..."
- 12/22/2018: Can VAEs Generate Novel Examples?
  "An implicit goal in works on deep generative models is that such models ..."
