Learning to Utilize Correlated Auxiliary Classical or Quantum Noise

06/08/2020
by   Aida Ahmadzadegan, et al.

This paper has two messages. First, we demonstrate that neural networks can learn to exploit correlations between noisy data and suitable auxiliary noise. In effect, the network learns to use the correlated auxiliary noise as an approximate key to decipher its noisy input data. Second, we show that the scaling behavior with increasing noise is such that future quantum machines should possess an advantage. As a concrete example, we reduce the image-classification performance of convolutional neural networks (CNNs) by adding noise of varying amount and kind to the input images. We then demonstrate that the CNNs can partly recover their performance if, along with each noisy image, they are given auxiliary noise that is correlated with the image noise. We analyze the scaling of a CNN's ability to learn and utilize these noise correlations as the level, dimensionality, or complexity of the noise is increased. We thereby find numerical and theoretical indications that quantum machines, owing to their efficiency in representing complex correlations, could possess a significant advantage over classical machines.
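The experimental setup described in the abstract can be sketched in a few lines: corrupt an image with noise, and hand the network an auxiliary channel that is correlated with, but not identical to, that noise. The sketch below is illustrative only, not the authors' code; the function name `make_noisy_pair` and the parameters `noise_level` and `aux_corruption` are hypothetical, and the correlation structure (shared Gaussian noise plus a small independent perturbation in the auxiliary channel) is one simple way to realize "correlated auxiliary noise".

```python
import numpy as np

rng = np.random.default_rng(0)

def make_noisy_pair(image, noise_level=1.0, aux_corruption=0.1):
    """Return a noisy version of `image` together with an auxiliary
    noise channel that is correlated with the image noise.

    `noise_level` scales the Gaussian noise added to the image;
    `aux_corruption` scales the extra independent noise that makes the
    auxiliary channel an approximate key rather than an exact copy.
    (Both names and defaults are illustrative assumptions.)
    """
    noise = rng.normal(0.0, noise_level, size=image.shape)
    noisy_image = image + noise
    # The auxiliary channel shares the image noise but is itself
    # perturbed, so the network must learn to exploit the correlation.
    aux_noise = noise + rng.normal(0.0, aux_corruption, size=image.shape)
    return noisy_image, aux_noise

image = rng.random((28, 28))            # stand-in for an input image
noisy, aux = make_noisy_pair(image)

# Correlation between the actual image noise and the auxiliary channel:
r = np.corrcoef((noisy - image).ravel(), aux.ravel())[0, 1]
```

In an experiment along these lines, the pair `(noisy, aux)` would be fed to the CNN (e.g. as two input channels), while a baseline CNN receives `noisy` alone; the performance gap then measures how well the network exploits the correlation.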

