Tensor-Compressed Back-Propagation-Free Training for (Physics-Informed) Neural Networks

08/18/2023
by   Yequan Zhao, et al.

Backward propagation (BP) is widely used to compute gradients in neural network training. However, BP is hard to implement on edge devices because they lack the hardware and software resources to support automatic differentiation, which has greatly increased the design complexity and time-to-market of on-device training accelerators. This paper presents a completely BP-free framework that requires only forward propagation to train realistic neural networks. Our technical contributions are three-fold. First, we present a tensor-compressed variance-reduction approach that greatly improves the scalability of zeroth-order (ZO) optimization, making it feasible to handle network sizes beyond the reach of previous ZO approaches. Second, we present a hybrid gradient-evaluation approach to improve the efficiency of ZO training. Finally, we extend our BP-free training framework to physics-informed neural networks (PINNs) by proposing a sparse-grid approach that estimates the derivatives in the loss function without using BP. Our BP-free training loses only a little accuracy on the MNIST dataset compared with standard first-order training. We also demonstrate successful results in training a PINN to solve a 20-dimensional Hamilton-Jacobi-Bellman PDE. This memory-efficient, BP-free approach may serve as a foundation for near-future on-device training on many resource-constrained platforms (e.g., FPGAs, ASICs, microcontrollers, and photonic chips).
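To illustrate the core idea of forward-only training, the sketch below implements a generic zeroth-order gradient estimate via randomized central differences, which needs only loss evaluations and no automatic differentiation. This is a minimal toy illustration of ZO optimization in general, not the paper's tensor-compressed or hybrid method; the function names and the quadratic loss are our own illustrative choices.

```python
import numpy as np

def zo_gradient(loss_fn, theta, num_dirs=10, mu=1e-3, seed=None):
    """Estimate the gradient of loss_fn at theta using only forward
    evaluations: perturb theta along random Gaussian directions and
    take central finite differences (2 * num_dirs loss calls)."""
    rng = np.random.default_rng(seed)
    grad = np.zeros_like(theta)
    for _ in range(num_dirs):
        u = rng.standard_normal(theta.shape)              # random direction
        diff = loss_fn(theta + mu * u) - loss_fn(theta - mu * u)
        grad += (diff / (2.0 * mu)) * u                   # directional slope times direction
    return grad / num_dirs                                # average over directions

# Toy usage: minimize ||theta - target||^2 with forward passes only.
target = np.array([1.0, -2.0, 0.5])
loss = lambda th: float(np.sum((th - target) ** 2))

theta = np.zeros(3)
for step in range(200):
    theta -= 0.05 * zo_gradient(loss, theta, num_dirs=10, seed=step)

print(theta)  # approaches target
```

The variance of this estimator grows with the number of trainable parameters, which is why naive ZO training does not scale to realistic networks; the paper's tensor-compressed parameterization shrinks the effective dimension that must be perturbed.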


