Investigating Efficient Learning and Compositionality in Generative LSTM Networks

04/16/2020
by Sarah Fabi, et al.

When comparing human and artificial intelligence, one major difference is apparent: humans can generalize very broadly from sparse data sets because they are able to recombine and reintegrate data components in compositional ways. To investigate differences in learning efficiency, Joshua B. Tenenbaum and colleagues developed the character challenge: first, an algorithm is trained to generate handwritten characters. Next, a single example of a new character type is presented. An efficient learning algorithm is expected to be able to re-generate this new character, to identify similar versions of it, to generate new variants of it, and to create completely new character types. In the past, the character challenge was met only by complex algorithms that were provided with stochastic primitives. Here, we tackle the challenge without providing primitives. We apply a minimal recurrent neural network (RNN) model with one feedforward layer and one LSTM layer and train it to generate sequential handwritten character trajectories from one-hot encoded inputs. To manage the re-generation of untrained characters from only a single presented example, we introduce a one-shot inference mechanism: the gradient signal is backpropagated to the feedforward layer weights only, leaving the LSTM layer untouched. We show that our model is able to meet the character challenge by recombining previously learned dynamic substructures, which are visible in the hidden LSTM states. Making use of the compositional abilities of RNNs in this way might be an important step towards bridging the gap between human and artificial intelligence.
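
The abstract describes a two-layer generative network and a one-shot adaptation scheme in which only the feedforward weights are updated. Below is a minimal sketch in PyTorch of how such a setup could look; the layer sizes, the 2-D trajectory-point output, the readout layer, and the names CharGenerator and one_shot_adapt are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CharGenerator(nn.Module):
    """Minimal sketch (not the authors' code): a feedforward layer embeds a
    one-hot character code, a single LSTM layer unrolls the trajectory.
    Hidden size and the 2-D per-step output are illustrative choices."""

    def __init__(self, n_chars, hidden_size=128):
        super().__init__()
        self.ff = nn.Linear(n_chars, hidden_size)        # feedforward layer on one-hot input
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, 2)              # one trajectory point per time step (assumed readout)

    def forward(self, one_hot, seq_len):
        # Feed the embedded character code to the LSTM at every time step.
        x = self.ff(one_hot).unsqueeze(1).repeat(1, seq_len, 1)
        h, _ = self.lstm(x)
        return self.out(h)

def one_shot_adapt(model, one_hot, target_traj, steps=200, lr=1e-2):
    """One-shot inference sketch: given a single example of a new character,
    optimize only the feedforward weights; the LSTM and readout stay frozen."""
    optimizer = torch.optim.Adam(model.ff.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        optimizer.zero_grad()
        pred = model(one_hot, target_traj.shape[1])
        loss = loss_fn(pred, target_traj)
        loss.backward()       # gradients reach all layers, but only the ff weights are updated
        optimizer.step()
    return model

# Example usage (shapes only): 26 trained characters, one new-character trajectory of 60 points.
model = CharGenerator(n_chars=26)
new_char = torch.zeros(1, 26); new_char[0, 0] = 1.0
example_traj = torch.randn(1, 60, 2)
one_shot_adapt(model, new_char, example_traj)
```

Restricting the optimizer to the feedforward parameters is one straightforward way to realize "backpropagating to the feedforward layer weights only" while leaving the recurrent dynamics intact.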

Related research

06/30/2010 - A novel approach for handwritten Devnagari character recognition
In this paper a method for recognition of handwritten devanagari charact...

08/30/2009 - Handwritten Farsi Character Recognition using Artificial Neural Network
Neural Networks are being used for character recognition from last many ...

08/11/2019 - PCGAN-CHAR: Progressively Trained Classifier Generative Adversarial Networks for Classification of Noisy Handwritten Bangla Characters
Due to the sparsity of features, noise has proven to be a great inhibito...

07/24/2019 - RNN-based Online Handwritten Character Recognition Using Accelerometer and Gyroscope Data
This abstract explores an RNN-based approach to online handwritten recog...

06/21/2018 - Pixel-level Reconstruction and Classification for Noisy Handwritten Bangla Characters
Classification techniques for images of handwritten characters are susce...

05/11/2021 - Restoring Hebrew Diacritics Without a Dictionary
We demonstrate that it is feasible to diacritize Hebrew script without a...

06/23/2018 - Deductron - A Recurrent Neural Network
The current paper is a study in Recurrent Neural Networks (RNN), motivat...
