Inductive Visual Localisation: Factorised Training for Superior Generalisation

07/21/2018
by   Ankush Gupta, et al.

End-to-end trained Recurrent Neural Networks (RNNs) have been successfully applied to numerous problems that require processing sequences, such as image captioning, machine translation, and text recognition. However, RNNs often struggle to generalise to sequences longer than the ones encountered during training. In this work, we propose to optimise neural networks explicitly for induction. The idea is to first decompose the problem into a sequence of inductive steps and then to explicitly train the RNN to reproduce such steps. Generalisation is achieved because the RNN is not allowed to learn an arbitrary internal state; instead, it is tasked with mimicking the evolution of a valid state. In particular, the state is restricted to a spatial memory map that tracks the parts of the input image which have been accounted for in previous steps. The RNN is trained on single inductive steps, in which it produces updates to the memory in addition to the desired output. We evaluate our method on two different visual recognition problems involving visual sequences: (1) text spotting, i.e. joint localisation and reading of text in images containing multiple lines (or a block) of text, and (2) sequential counting of objects in aerial images. We show that inductive training of recurrent models enhances their generalisation ability on challenging image datasets.
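The single-inductive-step idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names (`inductive_step`, `W_out`, `W_mem`) and the specific gating arithmetic are assumptions chosen to show the general pattern of reading image features masked by a spatial memory map, emitting an output for the current element, and proposing a memory update that marks attended regions as accounted for.

```python
import numpy as np

def inductive_step(features, memory, W_out, W_mem):
    """One hypothetical inductive step.

    features: (N, D) array of per-location image features.
    memory:   (N, 1) spatial memory map in [0, 1]; 1 means "already accounted for".
    W_out, W_mem: illustrative weight matrices (not from the paper).
    Returns the step output and the updated memory map.
    """
    # Attend only to locations not yet covered by the memory map.
    visible = features * (1.0 - memory)
    # Output for this step (e.g. a score per output class).
    output = np.tanh(visible @ W_out)
    # Proposed additive memory update, squashed to (0, 1) by a sigmoid.
    update = 1.0 / (1.0 + np.exp(-(visible @ W_mem)))
    # The memory map only grows: covered regions stay covered.
    new_memory = np.clip(memory + update, 0.0, 1.0)
    return output, new_memory
```

Under this sketch, training would supervise each step independently: the network is fed a ground-truth memory map as input and penalised both on the step output and on the proposed memory update, so at test time it can be unrolled for more steps than were ever seen during training.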


