
Benchmarking Quantum Hardware for Training of Fully Visible Boltzmann Machines

by Dmytro Korenkevych, et al.

Quantum annealing (QA) is a hardware-based heuristic optimization and sampling method applicable to discrete undirected graphical models. While similar to simulated annealing, QA relies on quantum, rather than thermal, effects to explore complex search spaces. For many classes of problems, QA is known to offer computational advantages over simulated annealing. Here we report on the ability of recent QA hardware to accelerate training of fully visible Boltzmann machines. We characterize the sampling distribution of QA hardware, and show that in many cases the quantum distributions differ significantly from classical Boltzmann distributions. In spite of this difference, training (which seeks to match data and model statistics) using standard classical gradient updates is still effective. We investigate the use of QA for seeding Markov chains as an alternative to contrastive divergence (CD) and persistent contrastive divergence (PCD). Using k=50 Gibbs steps, we show that for problems with high energy barriers between modes, QA-based seeds can improve upon chains with CD and PCD initializations. For these hard problems, QA gradient estimates are more accurate and allow for faster learning. Interestingly, even raw QA samples (that is, k=0) achieve similar improvements. We argue this is because, in this case, we are effectively training a quantum rather than a classical Boltzmann distribution. The learned parameters give rise to hardware QA distributions that closely approximate classical Boltzmann distributions which are hard to train with CD/PCD.
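The training procedure the abstract describes — matching data and model statistics, with the negative-phase Gibbs chains seeded either from the data (CD), from persisted chain states (PCD), or from an external sampler such as QA hardware — can be sketched for a fully visible Boltzmann machine as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names (`gibbs_step`, `train_fvbm`), the {0,1} spin convention, and all hyperparameter values are assumptions made for the example.

```python
import numpy as np

def gibbs_step(states, W, b, rng):
    """One full sweep of Gibbs updates over all units; states are in {0,1}."""
    n = states.shape[1]
    for i in range(n):
        # Conditional activation of unit i given all other units
        field = states @ W[:, i] + b[i]
        p = 1.0 / (1.0 + np.exp(-field))
        states[:, i] = (rng.random(states.shape[0]) < p).astype(float)
    return states

def train_fvbm(data, seeds, k=50, lr=0.01, epochs=100, seed=0):
    """Train a fully visible Boltzmann machine by gradient ascent on the
    log-likelihood. Negative-phase chains start from `seeds`: copies of the
    data for CD, persisted states for PCD, or samples from an external
    device (the QA-seeded variant, with k=0 meaning raw samples)."""
    rng = np.random.default_rng(seed)
    n = data.shape[1]
    W = np.zeros((n, n))  # symmetric couplings, zero diagonal
    b = np.zeros(n)
    for _ in range(epochs):
        neg = seeds.copy()
        for _ in range(k):
            neg = gibbs_step(neg, W, b, rng)
        # Gradient of log-likelihood: data statistics minus model statistics
        pos_W = data.T @ data / len(data)
        neg_W = neg.T @ neg / len(neg)
        gW = pos_W - neg_W
        np.fill_diagonal(gW, 0.0)    # no self-couplings
        W += lr * (gW + gW.T) / 2.0  # keep W symmetric
        b += lr * (data.mean(axis=0) - neg.mean(axis=0))
    return W, b
```

In this sketch the only thing that distinguishes CD, PCD, and QA-based training is where `seeds` comes from; the gradient update itself is the standard classical one, consistent with the abstract's observation that standard gradient updates remain effective even when the seeding distribution is not Boltzmann.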
