Training Neural Networks with Universal Adiabatic Quantum Computing

08/24/2023
by   Steve Abel, et al.

The training of neural networks (NNs) is a computationally intensive task requiring significant time and resources. This paper presents a novel approach to NN training using Adiabatic Quantum Computing (AQC), a paradigm that leverages the principles of adiabatic evolution to solve optimisation problems. We propose a universal AQC method that can be implemented on gate quantum computers, allowing for a broad range of Hamiltonians and thus enabling the training of expressive neural networks. We apply this approach to various neural networks with continuous, discrete, and binary weights. Our results indicate that AQC can efficiently find the global minimum of the loss function, offering a promising alternative to classical training methods.
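The core idea above can be illustrated with a minimal classical simulation of adiabatic evolution. The sketch below is not the paper's method: it interpolates between a transverse-field mixer Hamiltonian and a toy diagonal "loss" Hamiltonian over three binary weights, and tracks the instantaneous ground state with exact diagonalisation. The specific loss function and sweep schedule are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): adiabatic
# interpolation H(s) = (1 - s) * H0 + s * H1, where the ground state of
# H1 encodes the minimum of a toy loss over binary weights.

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

n = 3  # three binary weights (toy example)

# Mixer H0 = -sum_i X_i: its ground state is the uniform superposition
# over all 2^n weight configurations.
H0 = -sum(kron_all([X if j == i else I for j in range(n)]) for i in range(n))

# Toy problem Hamiltonian: diagonal "loss" over configurations, here
# loss(w) = (sum_i w_i - 2)^2, minimised by any config with two bits set.
loss = np.array([(bin(b).count("1") - 2) ** 2 for b in range(2 ** n)], dtype=float)
H1 = np.diag(loss)

# Discretised adiabatic sweep from the mixer to the problem Hamiltonian;
# a slow enough sweep keeps the system in the instantaneous ground state.
for s in np.linspace(0.0, 1.0, 11):
    H = (1.0 - s) * H0 + s * H1
    evals, evecs = np.linalg.eigh(H)
    ground = evecs[:, 0]

# At s = 1 the ground state is supported only on minimal-loss configs.
probs = ground ** 2
best = int(np.argmax(probs))
print(f"most probable weight config: {best:0{n}b}, loss = {loss[best]}")
```

Measuring the final state returns a global minimiser of the loss; scaling this beyond a handful of weights is exactly where quantum hardware, rather than exact diagonalisation, is needed.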


research
05/27/2019

Defining Quantum Neural Networks via Quantum Time Evolution

This work presents a novel fundamental algorithm for defining and tr...
research
04/29/2020

Insights on Training Neural Networks for QUBO Tasks

Current hardware limitations restrict the potential when solving quadrat...
research
02/27/2019

Efficient Learning for Deep Quantum Neural Networks

Neural networks enjoy widespread success in both research and industry a...
research
09/26/2019

Information Scrambling in Quantum Neural Networks

Quantum neural networks are one of the promising applications for near-t...
research
02/24/2023

Generative Invertible Quantum Neural Networks

Invertible Neural Networks (INN) have become established tools for the s...
research
10/23/2022

Accelerating the training of single-layer binary neural networks using the HHL quantum algorithm

Binary Neural Networks are a promising technique for implementing effici...
research
02/23/2022

Completely Quantum Neural Networks

Artificial neural networks are at the heart of modern deep learning algo...
