Numerical Sequence Prediction using Bayesian Concept Learning

01/13/2020
by Mohith Damarapati, et al.

When people learn mathematical patterns or sequences, they are able to identify the concepts (or rules) underlying those patterns. Having learned the underlying concepts, humans can also generalize them to other numbers, and even recognize previously unseen combinations of those rules. Current state-of-the-art RNN architectures such as LSTMs perform well at predicting successive elements of sequential data, but require vast amounts of training examples. Even with extensive data, these models struggle to generalize the underlying concepts. From our behavioral study, we also found that humans are able to disregard noise and identify the rules generating corrupted sequences. We therefore propose a Bayesian model that captures these human-like learning capabilities and predicts the next number in a given sequence better than traditional LSTMs.
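To make the idea concrete, below is a minimal sketch of Bayesian concept learning for next-number prediction. The hand-built hypothesis space (arithmetic and geometric progressions), the uniform prior, and the noise probability EPS are illustrative assumptions for this sketch, not the authors' actual model: each hypothesis is scored by prior times likelihood of the observed sequence, with mismatched elements treated as noise, and the next number is read off the most probable rule.

```python
# Minimal sketch of Bayesian concept learning for next-number prediction.
# Hypothesis space, prior, and noise model are illustrative assumptions.

def arithmetic(start, step):
    return lambda n: start + step * n

def geometric(start, ratio):
    return lambda n: start * ratio ** n

# Hypothesis space: (name, rule) pairs over small parameter ranges.
HYPOTHESES = []
for start in range(1, 11):
    for step in range(1, 11):
        HYPOTHESES.append((f"arith({start},+{step})", arithmetic(start, step)))
    for ratio in (2, 3):
        HYPOTHESES.append((f"geom({start},x{ratio})", geometric(start, ratio)))

EPS = 1e-6  # probability that any single element is noise (corrupted)

def posterior(sequence):
    """Posterior over hypotheses: uniform prior times per-element likelihood."""
    prior = 1.0 / len(HYPOTHESES)
    scores = {}
    for name, rule in HYPOTHESES:
        likelihood = 1.0
        for i, x in enumerate(sequence):
            # Each observed element either matches the rule or is noise.
            likelihood *= (1.0 - EPS) if rule(i) == x else EPS
        scores[name] = prior * likelihood
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()} if total > 0 else scores

def predict_next(sequence):
    """Predict the next element under the most probable hypothesis."""
    post = posterior(sequence)
    best = max(post, key=post.get)
    rule = dict(HYPOTHESES)[best]
    return rule(len(sequence)), best

if __name__ == "__main__":
    print(predict_next([2, 4, 6, 8]))    # arithmetic rule, predicts 10
    print(predict_next([3, 6, 12, 24]))  # geometric rule, predicts 48
    print(predict_next([2, 4, 7, 8]))    # noisy arithmetic, still predicts 10
```

The last example illustrates the noise-tolerance described in the abstract: the corrupted element 7 only multiplies the arithmetic hypothesis's likelihood by EPS once, so that hypothesis still dominates and the prediction is unaffected.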

Related research

Neural Execution Engines: Learning to Execute Subroutines (06/15/2020)
A significant effort has been made to train neural networks that replica...

Do LSTMs See Gender? Probing the Ability of LSTMs to Learn Abstract Syntactic Rules (10/31/2022)
LSTMs trained on next-word prediction can accurately perform linguistic ...

Sample-efficient Linguistic Generalizations through Program Synthesis: Experiments with Phonology Problems (06/11/2021)
Neural models excel at extracting statistical patterns from large amount...

Learning to Predict Novel Noun-Noun Compounds (06/09/2019)
We introduce temporally and contextually-aware models for the novel task...

Flexible Compositional Learning of Structured Visual Concepts (05/20/2021)
Humans are highly efficient learners, with the ability to grasp the mean...

Learning is Compiling: Experience Shapes Concept Learning by Combining Primitives in a Language of Thought (05/17/2018)
Recent approaches to human concept learning have successfully combined t...

Modeling German Verb Argument Structures: LSTMs vs. Humans (11/30/2019)
LSTMs have proven very successful at language modeling. However, it rema...
