Backpropagation-free Training of Deep Physical Neural Networks

04/20/2023
by Ali Momeni, et al.

Recent years have witnessed the outstanding success of deep learning in fields such as vision and natural language processing. This success is largely owed to the massive size of deep learning models, which is expected to keep growing. This growth is accompanied by considerable energy consumption, both during the training and inference phases, as well as scalability issues. Although a number of works based on unconventional physical systems have been proposed that address energy efficiency in the inference phase, efficient training of deep learning models has remained unaddressed. So far, training of digital deep learning models relies mainly on backpropagation, which is unsuitable for physical implementation because it requires perfect knowledge of the computation performed in the so-called forward pass of the neural network. Here, we tackle this issue by proposing a simple deep neural network architecture augmented with a biologically plausible learning algorithm, referred to as "model-free forward-forward training". The proposed architecture enables training deep physical neural networks consisting of layers of physical nonlinear systems, without requiring detailed knowledge of the nonlinear physical layers' properties. We show that our method outperforms state-of-the-art hardware-aware training methods by improving training speed, decreasing digital computations, and reducing power consumption in physical systems. We demonstrate the adaptability of the proposed method even in systems exposed to dynamic or unpredictable external perturbations. To showcase the universality of our approach, we experimentally train diverse wave-based physical neural networks, varying in the underlying wave phenomenon and the type of nonlinearity they use, to perform vowel and image classification tasks.
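The abstract does not spell out the training procedure, but the "forward-forward" family of algorithms it builds on trains each layer with a purely local objective: positive (real) inputs should produce high "goodness" (e.g., the sum of squared activations), while negative (contrastive) inputs should produce low goodness, so no gradients ever flow backward through the stack. The sketch below is an illustrative NumPy toy of that generic idea, not the paper's model-free variant; the layer sizes, learning rate, threshold `theta`, and the toy positive/negative data are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # Goodness of a layer output: sum of squared activations per sample.
    return (h ** 2).sum(axis=1)

def normalize(x):
    # Length-normalize inputs so goodness from the previous layer
    # cannot leak through trivially.
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

class FFLayer:
    """One locally trained ReLU layer; no gradients cross layer boundaries."""
    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr, self.theta = lr, theta

    def forward(self, x):
        return np.maximum(normalize(x) @ self.W, 0.0)

    def train_step(self, x_pos, x_neg):
        # Local objective: push goodness of positive data above theta
        # and goodness of negative data below theta.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = normalize(x)
            h = np.maximum(xn @ self.W, 0.0)
            z = np.clip(sign * (goodness(h) - self.theta), -50.0, 50.0)
            p = 1.0 / (1.0 + np.exp(-z))  # P(correct side of threshold)
            # Hand-derived gradient of -log p w.r.t. W (local to this layer).
            grad_h = -sign * (1.0 - p)[:, None] * 2.0 * h
            self.W -= self.lr * xn.T @ grad_h / len(x)
        # Pass detached activations to the next layer.
        return self.forward(x_pos), self.forward(x_neg)

# Toy data: positive samples share a structure, negatives are noise.
x_pos = rng.normal(1.0, 0.2, (256, 16))
x_neg = rng.normal(0.0, 1.0, (256, 16))

layers = [FFLayer(16, 32), FFLayer(32, 32)]
for epoch in range(200):
    hp, hn = x_pos, x_neg
    for layer in layers:
        hp, hn = layer.train_step(hp, hn)

g_pos = goodness(layers[1].forward(layers[0].forward(x_pos))).mean()
g_neg = goodness(layers[1].forward(layers[0].forward(x_neg))).mean()
print(g_pos, g_neg)  # goodness should be higher for positive data
```

Because each layer only needs its own forward activations and a scalar goodness signal, a physical system whose internal transfer function is unknown can in principle supply those activations directly, which is the property the paper exploits.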


