Learning to Adaptively Scale Recurrent Neural Networks

02/15/2019
by Hao Hu et al.

Recent advances in recurrent neural network (RNN) research have demonstrated the benefits of multiscale structures for learning temporal representations of time series. Currently, most multiscale RNNs use fixed scales, which does not reflect the dynamical nature of temporal patterns in sequences. In this paper, we propose Adaptively Scaled Recurrent Neural Networks (ASRNNs), a simple but efficient way to address this problem. Instead of using predefined scales, ASRNNs learn and adjust scales based on the temporal context, making them more flexible in modeling multiscale patterns. Compared with other multiscale RNNs, ASRNNs achieve dynamical scaling with much simpler structures and are easy to integrate with various RNN cells. Experiments on multiple sequence modeling tasks indicate that ASRNNs can efficiently adapt scales to different sequence contexts and yield better performance than baselines without dynamical scaling abilities.
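To make the idea concrete, below is a minimal PyTorch sketch of one way such input-dependent scaling could be wired into a recurrent cell. It is an illustrative assumption, not the paper's exact formulation: the AdaptiveScaleRNNCell class, its scale_gate layer, and the gating formula are hypothetical, and the paper's own scaling mechanism may differ.

    import torch
    import torch.nn as nn

    class AdaptiveScaleRNNCell(nn.Module):
        """Illustrative sketch: wrap a GRU cell with a learned, per-step
        scaling gate computed from the current input and previous hidden
        state. The gate interpolates between keeping the old state
        (coarse / slow scale) and fully updating it (fine / fast scale)."""

        def __init__(self, input_size, hidden_size):
            super().__init__()
            self.cell = nn.GRUCell(input_size, hidden_size)
            # Hypothetical scale predictor: maps [x_t, h_{t-1}] to gates in (0, 1).
            self.scale_gate = nn.Linear(input_size + hidden_size, hidden_size)

        def forward(self, x_t, h_prev):
            candidate = self.cell(x_t, h_prev)  # fully updated state
            alpha = torch.sigmoid(self.scale_gate(torch.cat([x_t, h_prev], dim=-1)))
            # alpha near 0 -> state changes slowly (coarse scale);
            # alpha near 1 -> state tracks the new input (fine scale).
            return (1.0 - alpha) * h_prev + alpha * candidate

    # Usage: unroll the cell over a (batch, time, features) sequence.
    cell = AdaptiveScaleRNNCell(input_size=8, hidden_size=32)
    x = torch.randn(4, 50, 8)
    h = torch.zeros(4, 32)
    for t in range(x.size(1)):
        h = cell(x[:, t, :], h)

Because the gate is computed from the current context rather than fixed in advance, each hidden unit can effectively operate at its own, time-varying scale, which is the flavor of adaptivity the abstract describes.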


