Riemannian metrics for neural networks II: recurrent networks and learning symbolic data sequences

06/03/2013
by Yann Ollivier

Recurrent neural networks are powerful models for sequential data, able to represent complex dependencies in the sequence that simpler models such as hidden Markov models cannot handle. Yet they are notoriously hard to train. Here we introduce a training procedure using a gradient ascent in a Riemannian metric: this produces an algorithm independent of design choices such as the encoding of parameters and unit activities. This metric gradient ascent is designed to have an algorithmic cost close to that of backpropagation through time for sparsely connected networks. We use this procedure on gated leaky neural networks (GLNNs), a variant of recurrent neural networks with an architecture inspired by finite automata and an evolution equation inspired by continuous-time networks. GLNNs trained with a Riemannian gradient are demonstrated to effectively capture a variety of structures in synthetic problems: basic block nesting as in context-free grammars (an important feature of natural languages, but difficult to learn), intersections of multiple independent Markov-type relations, and long-distance relationships such as the distant-XOR problem. This method does not require adjusting the network structure or initial parameters: the network used is a sparse random graph, and the initialization is identical for all problems considered.
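As a concrete illustration of the two ingredients in the abstract, the sketch below implements (a) a leaky, continuous-time-style recurrent update on a sparse random graph and (b) a unitwise, Fisher-like preconditioning of the weight gradient that mimics the reparametrization invariance of a Riemannian update. This is a minimal sketch under stated assumptions, not the paper's method: the names (evolve, metric_step) and the leak constant tau are illustrative, the multiplicative gating that gives GLNNs their name is omitted, and the paper's quasi-diagonal metric is a cheaper approximation than the dense matrix inverse used here. The quantity maximized in the paper is the log-likelihood of the symbol sequence (hence "gradient ascent"); the sketch abstracts that away behind a stand-in gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse random connection graph, as in the experiments described above.
n_units = 8
mask = (rng.random((n_units, n_units)) < 0.3).astype(float)
W = rng.normal(scale=0.1, size=(n_units, n_units)) * mask
tau = 2.0  # leak time constant (assumed value, not from the paper)

def evolve(V, ext):
    """Leaky, continuous-time-style update of unit potentials V.

    GLNNs additionally gate each connection by the current input symbol;
    that gating is omitted here for brevity."""
    a = 1.0 / (1.0 + np.exp(-V))        # sigmoid unit activities
    return V + (-V + W @ a + ext) / tau

def metric_step(W, grad_W, acts, mask, lr=0.1, eps=1e-3):
    """Precondition each unit's incoming-weight gradient by a Fisher-like
    metric M = E[a a^T] built from the activities it receives.

    This makes the update invariant under invertible linear re-encodings
    of those activities, which is the kind of invariance the Riemannian
    gradient provides; the paper's quasi-diagonal metric avoids the full
    matrix inversion done here."""
    M = acts.T @ acts / len(acts) + eps * np.eye(W.shape[1])
    # Rows of grad_W are per-unit gradients; M is symmetric, so
    # right-multiplying by inv(M) applies M^{-1} to each row.
    return (W - lr * grad_W @ np.linalg.inv(M)) * mask  # keep the graph sparse

# Tiny shape-level demo: run the network on noise, then take one
# preconditioned step with a stand-in gradient (a real implementation
# would obtain grad_W from backpropagation through time).
V = np.zeros(n_units)
acts_hist = []
for _ in range(20):
    V = evolve(V, rng.normal(size=n_units))
    acts_hist.append(1.0 / (1.0 + np.exp(-V)))
acts = np.stack(acts_hist)
fake_grad = rng.normal(size=W.shape) * mask
W = metric_step(W, fake_grad, acts, mask)
```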


Related research:

- Gated Graph Recurrent Neural Networks (02/03/2020): Graph processes exhibit a temporal structure determined by the sequence ...
- Simplified Minimal Gated Unit Variations for Recurrent Neural Networks (01/12/2017): Recurrent neural networks with various types of hidden units have been u...
- Gated Graph Convolutional Recurrent Neural Networks (03/05/2019): Graph processes model a number of important problems such as identifying...
- Hidden Markov models are recurrent neural networks: A disease progression modeling application (06/04/2020): Hidden Markov models (HMMs) are commonly used for sequential data modeli...
- Learning higher-order sequential structure with cloned HMMs (05/01/2019): Variable order sequence modeling is an important problem in artificial a...
- The Recurrent Temporal Discriminative Restricted Boltzmann Machines (10/06/2017): The recurrent temporal restricted Boltzmann machine (RTRBM) has been suc...
- Training recurrent networks online without backtracking (07/28/2015): We introduce the "NoBackTrack" algorithm to train the parameters of dyna...
