
The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning

by Roberto Bondesan et al.

In this work we develop a quantum field theory formalism for deep learning, in which input signals are encoded in Gaussian states, a generalization of Gaussian processes that encodes the agent's uncertainty about the input signal. We show how to represent both linear and non-linear layers as unitary quantum gates, and we interpret the fundamental excitations of the quantum model as particles, dubbed "Hintons". Beyond offering a new perspective and new techniques for studying neural networks, the quantum formulation is well suited to optical quantum computing, and it yields quantum deformations of neural networks that can be run efficiently on such devices. Finally, we discuss a semi-classical limit of the quantum deformed models which is amenable to classical simulation.
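The claim that a linear layer can be represented as a unitary quantum gate rests on a standard fact: exponentiating a Hermitian generator produces a unitary, hence norm-preserving, map on the state space. The following is a minimal numpy sketch of that fact alone, not the paper's actual construction; the dimension, the random generator, and the state are illustrative assumptions.

```python
import numpy as np

# Toy "unitary layer": build U = exp(iH) from a random Hermitian generator H.
rng = np.random.default_rng(0)
d = 4
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2                      # Hermitian generator of the layer

# Matrix exponential via eigendecomposition: H = V diag(w) V^dagger,
# so U = V diag(exp(iw)) V^dagger is exactly unitary.
w, V = np.linalg.eigh(H)
U = (V * np.exp(1j * w)) @ V.conj().T

# A normalized input state; applying the layer preserves its norm.
psi = rng.standard_normal(d) + 1j * rng.standard_normal(d)
psi /= np.linalg.norm(psi)
out = U @ psi
print(np.linalg.norm(out))                    # stays at 1 up to rounding
```

Norm preservation is what distinguishes such a gate from a generic weight matrix: information in the state is rotated rather than contracted or amplified.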




Related papers:

- Baryons from Mesons: A Machine Learning Perspective
- Quantum Deformed Neural Networks
- Quantum simulation from the bottom up: the case of rebits
- Quantum computing and the brain: quantum nets, dessins d'enfants and neural networks
- Continuous-variable quantum neural networks
- Bayesian Deep Learning on a Quantum Computer