Quantum activation functions for quantum neural networks

01/10/2022
by Marco Maronese, et al.

The field of artificial neural networks is expected to benefit strongly from recent developments in quantum computers. In particular, quantum machine learning, a class of quantum algorithms that exploit qubits to create trainable neural networks, will provide more power to solve problems such as pattern recognition, clustering, and machine learning in general. The building block of feed-forward neural networks consists of one layer of neurons connected to an output neuron that fires according to an arbitrary activation function. The corresponding learning algorithm goes under the name of the Rosenblatt perceptron. Quantum perceptrons with specific activation functions are known, but a general method to realize arbitrary activation functions on a quantum computer is still lacking. Here we fill this gap with a quantum algorithm capable of approximating any analytic activation function to any given order of its power series. Unlike previous proposals, which provide irreversible, measurement-based or simplified activation functions, here we show how to approximate any analytic function to any required accuracy without the need to measure the states encoding the information. Thanks to the generality of this construction, any feed-forward neural network may acquire the universal approximation properties according to Hornik's theorem. Our results recast the science of artificial neural networks in the architecture of gate-model quantum computers.
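The mathematical core of the abstract's claim, approximating an analytic activation function by a truncated power series, can be illustrated classically. The sketch below (an assumption for illustration only, not the paper's quantum circuit) builds a sigmoid from the truncated Maclaurin series of the exponential and compares it to the exact value; the paper's contribution is encoding such a series reversibly on qubits.

```python
import math

def taylor_exp(x, order):
    """Truncated Maclaurin series of e^x up to the given order:
    sum_{k=0}^{order} x^k / k!"""
    return sum(x**k / math.factorial(k) for k in range(order + 1))

def sigmoid_approx(x, order):
    """Sigmoid 1 / (1 + e^(-x)) built from the truncated series.
    Purely classical illustration of power-series approximation."""
    return 1.0 / (1.0 + taylor_exp(-x, order))

# The truncation error shrinks rapidly with the series order.
exact = 1.0 / (1.0 + math.exp(-0.5))
approx = sigmoid_approx(0.5, order=8)
error = abs(exact - approx)
```

Raising `order` tightens the approximation at the cost of more terms, mirroring the accuracy/resource trade-off the abstract describes for the quantum construction.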

Related research

01/25/2021
Activation Functions in Artificial Neural Networks: A Systematic Overview
Activation functions shape the outputs of artificial neurons and, theref...

12/11/2014
Simulating a perceptron on a quantum computer
Perceptrons are the basic computational unit of artificial neural networ...

11/30/2017
Quantum Neuron: an elementary building block for machine learning on quantum computers
Even the most sophisticated artificial neural networks are built by aggr...

02/21/2023
On the Behaviour of Pulsed Qubits and their Application to Feed Forward Networks
In the last two decades, the combination of machine learning and quantum...

03/18/2021
Neural tensor contractions and the expressive power of deep neural quantum states
We establish a direct connection between general tensor networks and dee...

08/17/2001
Artificial Neurons with Arbitrarily Complex Internal Structures
Artificial neurons with arbitrarily complex internal structure are intro...

11/18/2016
Spikes as regularizers
We present a confidence-based single-layer feed-forward learning algorit...
