Experimentally realized in situ backpropagation for deep learning in nanophotonic neural networks

05/17/2022
by Sunil Pai, et al.

Neural networks are widely deployed models across many scientific disciplines and commercial endeavors ranging from edge computing and sensing to large-scale signal processing in data centers. The most efficient and well-entrenched method to train such networks is backpropagation, or reverse-mode automatic differentiation. To counter an exponentially increasing energy budget in the artificial intelligence sector, there has been recent interest in analog implementations of neural networks, specifically nanophotonic neural networks, for which no analog backpropagation demonstration exists. We design mass-manufacturable silicon photonic neural networks that alternately cascade our custom-designed "photonic mesh" accelerator with digitally implemented nonlinearities. These reconfigurable photonic meshes program computationally intensive arbitrary matrix multiplication by setting physical voltages that tune the interference of optically encoded input data propagating through integrated Mach-Zehnder interferometer networks. Here, using our packaged photonic chip, we demonstrate in situ backpropagation for the first time to solve classification tasks and evaluate a new protocol to keep the entire gradient measurement and update of physical device voltages in the analog domain, improving on past theoretical proposals. Our method is made possible by introducing three changes to typical photonic meshes: (1) measurements at optical "grating tap" monitors, (2) bidirectional optical signal propagation automated by a fiber switch, and (3) universal generation and readout of optical amplitude and phase. After training, our classification achieves accuracies similar to digital equivalents even in the presence of systematic error. Our findings suggest a new training paradigm for photonics-accelerated artificial intelligence based entirely on a physical analog of the popular backpropagation technique.
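The analog gradient rule at the heart of such a protocol can be illustrated numerically. The sketch below is a minimal NumPy model, not the chip protocol itself: it stands in for the passive MZI lattice with random unitaries, places one tunable phase shifter per layer, and uses a squared-error loss (all illustrative assumptions). It verifies that the gradient with respect to each phase setting equals an interference term, -2 Im(conj(a) * e^{i theta} * f), between the forward field f tapped at the phase shifter and the adjoint field a obtained by propagating the output error backward through the mesh; this is the quantity that grating-tap monitors and bidirectional propagation make measurable on-chip.

```python
import numpy as np

rng = np.random.default_rng(0)
n_modes, n_layers = 4, 3

def random_unitary(n):
    # Random unitary standing in for a passive block of the MZI mesh.
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

# Fixed passive blocks U[0..n_layers], with one tunable phase shifter (on mode 0) between them.
U = [random_unitary(n_modes) for _ in range(n_layers + 1)]
theta = rng.uniform(0, 2 * np.pi, size=n_layers)

x = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)  # input field
t = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)  # target field

def forward(theta):
    """Propagate the input and record the field tapped at each phase shifter."""
    f = U[0] @ x
    taps = []
    for k in range(n_layers):
        taps.append(f[0])                    # "grating tap" reading of the forward field
        f = f.copy()
        f[0] *= np.exp(1j * theta[k])        # tunable phase shift on mode 0
        f = U[k + 1] @ f
    return f, taps

def loss(theta):
    y, _ = forward(theta)
    return float(np.sum(np.abs(y - t) ** 2))

def adjoint_gradient(theta):
    """Gradient from a backward (adjoint) pass, mimicking bidirectional propagation."""
    y, taps = forward(theta)
    a = y - t                                # error field launched backward from the output
    grad = np.zeros(n_layers)
    for k in reversed(range(n_layers)):
        a = U[k + 1].conj().T @ a            # back-propagate through the passive block
        # dL/dtheta_k = -2 Im( conj(adjoint) * exp(i*theta_k) * forward tap )
        grad[k] = -2.0 * np.imag(np.conj(a[0]) * np.exp(1j * theta[k]) * taps[k])
        a = a.copy()
        a[0] *= np.exp(-1j * theta[k])       # undo the phase before the next block
    return grad

# Sanity check: adjoint-field gradients agree with central finite differences.
eps = 1e-6
fd = np.array([(loss(theta + eps * np.eye(n_layers)[k])
                - loss(theta - eps * np.eye(n_layers)[k])) / (2 * eps)
               for k in range(n_layers)])
print(np.allclose(adjoint_gradient(theta), fd, atol=1e-5))  # expect True
```

The final check compares the adjoint-field gradients against brute-force perturbation of each phase, which is the numerical analogue of validating an in situ gradient measurement by dithering each phase shifter individually.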


