Efficient Real Time Recurrent Learning through combined activity and parameter sparsity

03/10/2023
by Anand Subramoney, et al.

Backpropagation through time (BPTT) is the standard algorithm for training recurrent neural networks (RNNs). It requires separate simulation phases for inference (the forward pass) and learning (the backward pass), and it must store the complete history of network states between the two phases, so its memory consumption grows in proportion to the input sequence length. This makes BPTT unsuited for online learning and a challenge to implement on low-resource real-time systems. Real-Time Recurrent Learning (RTRL) allows online learning, and its memory requirements are independent of the sequence length. However, RTRL suffers from exceptionally high computational costs that grow with the fourth power of the state size, making it intractable for all but the smallest networks. In this work, we show that recurrent networks exhibiting high activity sparsity can reduce the computational cost of RTRL. Moreover, combining activity and parameter sparsity yields savings in computation and memory large enough to make RTRL practical. Unlike previous work, this improvement in the efficiency of RTRL is achieved without any approximation of the learning process.
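
To make the cost argument concrete, below is a minimal NumPy sketch of exact RTRL for a plain tanh RNN, with sensitivities tracked only for the nonzero recurrent weights. All names (rtrl_step, W_rec, P) are illustrative, not from the paper; the paper itself uses event-based GRU-like units whose activations are exactly zero between events, whereas tanh is used here only to keep the sketch small, so the activity-sparsity test marks where the savings would come from rather than producing them for tanh itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_in, density = 8, 4, 0.2           # tiny network; dense RTRL is O(n^4)

W_in = rng.normal(0.0, 0.3, (n, n_in))
W_rec = rng.normal(0.0, 0.3, (n, n)) * (rng.random((n, n)) < density)

# Parameter sparsity: track sensitivities only for the nonzero weights.
idx = np.argwhere(W_rec != 0)          # (nnz, 2) list of (i, k) index pairs
P = np.zeros((n, len(idx)))            # dh/dW_rec, one column per live weight
h = np.zeros(n)

def rtrl_step(x, h, P):
    h_new = np.tanh(W_rec @ h + W_in @ x)
    D = 1.0 - h_new ** 2               # tanh'(pre-activation), elementwise
    # Propagate old sensitivities. Densely this product is the O(n^4) term;
    # with nnz = d*n^2 tracked weights it is O(d * n^4), and a sparse matmul
    # over W_rec would shave off roughly another factor of d.
    P_new = W_rec @ P
    # Immediate term: d(pre_i)/dW_rec[i, k] = h[k]. Under activity sparsity
    # most h[k] are exactly zero, so most columns are skipped here.
    for p, (i, k) in enumerate(idx):
        if h[k] != 0.0:
            P_new[i, p] += h[k]
    return h_new, D[:, None] * P_new

for _ in range(5):
    h, P = rtrl_step(rng.normal(size=n_in), h, P)

# Online gradient at the current step: dL/dW_rec[i, k] = dL/dh . P[:, p(i,k)]
dLdh = rng.normal(size=n)              # stand-in for a real task loss gradient
grad = np.zeros_like(W_rec)
grad[idx[:, 0], idx[:, 1]] = dLdh @ P
```

The sketch also shows why memory stays bounded: a dense sensitivity matrix P has n^3 entries regardless of sequence length, and parameter density d shrinks it to d * n^3, which is the combined saving the abstract refers to.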


Related research:

- 06/12/2020: A Practical Sparse Approximation for Real Time Recurrent Learning
  Current methods for training recurrent neural networks are based on back...

- 02/11/2019: Optimal Kronecker-Sum Approximation of Real Time Recurrent Learning
  One of the central goals of Recurrent Neural Networks (RNNs) is to learn...

- 05/28/2018: Approximating Real-Time Recurrent Learning with Random Kronecker Factors
  Despite all the impressive advances of recurrent neural networks, sequen...

- 06/10/2016: Memory-Efficient Backpropagation Through Time
  We propose a novel approach to reduce memory consumption of the backprop...

- 05/30/2023: Exploring the Promise and Limits of Real-Time Recurrent Learning
  Real-time recurrent learning (RTRL) for sequence-processing recurrent ne...

- 06/13/2022: EGRU: Event-based GRU for activity-sparse inference and learning
  The scalability of recurrent neural networks (RNNs) is hindered by the s...

- 02/14/2020: Interleaved Sequence RNNs for Fraud Detection
  Payment card fraud causes multibillion dollar losses for banks and merch...
