Sample-efficient Linguistic Generalizations through Program Synthesis: Experiments with Phonology Problems

06/11/2021
by Saujas Vaduguru, et al.

Neural models excel at extracting statistical patterns from large amounts of data, but struggle to learn patterns or reason about language from only a few examples. In this paper, we ask: Can we learn explicit rules that generalize well from only a few examples? We explore this question using program synthesis. We develop a synthesis model to learn phonology rules as programs in a domain-specific language. We test the ability of our models to generalize from few training examples using our new dataset of problems from the Linguistics Olympiad, a challenging set of tasks that require strong linguistic reasoning ability. In addition to being highly sample-efficient, our approach generates human-readable programs, and allows control over the generalizability of the learnt programs.
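To make the high-level idea concrete, the following is a minimal sketch of what synthesizing a phonological rule from a few examples can look like. It is not the paper's DSL, model, or data: the rule shape (rewrite a target segment when it follows a given segment class), the vowel/consonant classes, the word pairs, and the plain enumerative search are all hypothetical stand-ins for the learned synthesis model and domain-specific language described above.

# Toy sketch: enumerative synthesis of a single rewrite rule from a few
# input-output word pairs. All names and data here are illustrative only.
from itertools import product

VOWELS = set("aeiou")

def apply_rule(word, target, replacement, after):
    # Rewrite `target` as `replacement` when the preceding segment matches
    # `after`: "V" = vowel, "C" = consonant, "" = no context restriction.
    out = []
    for i, ch in enumerate(word):
        prev = word[i - 1] if i > 0 else ""
        in_context = (
            after == ""
            or (after == "V" and prev in VOWELS)
            or (after == "C" and prev != "" and prev not in VOWELS)
        )
        out.append(replacement if ch == target and in_context else ch)
    return "".join(out)

def synthesize(examples, alphabet):
    # Enumerate all candidate rules and keep those consistent with every example.
    consistent = []
    for target, replacement, after in product(alphabet, alphabet, ["", "V", "C"]):
        if target == replacement:
            continue
        if all(apply_rule(src, target, replacement, after) == dst
               for src, dst in examples):
            consistent.append((target, replacement, after))
    return consistent

# Hypothetical few-shot data: 't' surfaces as 'd' after a vowel.
examples = [("ata", "ada"), ("tata", "tada"), ("ta", "ta")]
print(synthesize(examples, alphabet="adt"))  # -> [('t', 'd', 'V')]

Even this toy search reflects the properties the abstract emphasizes: the output is a human-readable rule rather than opaque weights, it is found from only three example pairs, and how broadly it generalizes is controlled by which contexts the rule language admits.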

