Meta-Learning to Compositionally Generalize

06/08/2021
by Henry Conklin, et al.

Natural language is compositional; the meaning of a sentence is a function of the meanings of its parts. This property allows humans to create and interpret novel sentences, generalizing robustly outside their prior experience. Neural networks have been shown to struggle with this kind of generalization, in particular performing poorly on tasks designed to assess compositional generalization (i.e., tasks where training and testing distributions differ in ways that would be trivial for a compositional strategy to resolve). Their poor performance on these tasks may in part be due to the nature of supervised learning, which assumes that training and testing data are drawn from the same distribution. We implement a meta-learning augmented version of supervised learning whose objective directly optimizes for out-of-distribution generalization. We construct pairs of tasks for meta-learning by sub-sampling existing training data. Each pair of tasks is constructed to contain relevant examples, as determined by a similarity metric, in an effort to inhibit models from memorizing their input. Experimental results on the COGS and SCAN datasets show that our similarity-driven meta-learning can improve generalization performance.
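The task-pair construction described above can be sketched in a few lines. The abstract does not specify the similarity metric used, so the sketch below stands in a token-overlap Jaccard similarity as a hypothetical choice; the function name `make_task_pair` and all sizes are illustrative assumptions, not the authors' implementation. The idea is that the query (meta-test) set is filled with the examples most similar to the support (meta-train) set, so a model that merely memorizes its support examples is penalized by the meta-objective.

```python
import random


def jaccard(a, b):
    """Token-overlap similarity between two input sentences."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)


def make_task_pair(train_data, support_size=4, query_size=4, seed=0):
    """Sub-sample a (support, query) task pair from existing training data.

    train_data: list of (input_sentence, output_sequence) pairs.
    The query set is chosen to be maximally similar (here: Jaccard, as a
    stand-in metric) to the support set, mimicking the paper's
    similarity-driven pairing at a high level.
    """
    rng = random.Random(seed)
    support = rng.sample(train_data, support_size)
    remaining = [ex for ex in train_data if ex not in support]
    # Score each candidate by its best similarity to any support input,
    # then keep the most relevant examples as the query task.
    scored = sorted(
        remaining,
        key=lambda ex: max(jaccard(ex[0], s[0]) for s in support),
        reverse=True,
    )
    query = scored[:query_size]
    return support, query
```

In a MAML-style outer loop, the model would take an inner gradient step on the support task and be evaluated (and meta-updated) on the paired query task, so the objective rewards generalization across the pair rather than fit to either set alone.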


