Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers

09/11/2021
by Zhuang Li, et al.

This paper investigates continual learning for semantic parsing, a setting in which a neural semantic parser learns tasks sequentially without access to the full training data of previous tasks. Directly applying state-of-the-art (SOTA) continual learning algorithms to this problem fails to match the performance of re-training the model on all seen tasks, because those algorithms do not account for the special properties of the structured outputs produced by semantic parsers. We therefore propose TotalRecall, a continual learning method designed for neural semantic parsers from two aspects: i) a sampling method for memory replay that diversifies logical form templates and balances the distribution of parse actions in the memory; ii) a two-stage training method that significantly improves the parsers' generalization capability across tasks. We conduct extensive experiments on the research problems involved in continual semantic parsing and show that a neural semantic parser trained with TotalRecall outperforms one trained directly with the SOTA continual learning algorithms, while achieving a 3-6 times speedup over re-training from scratch. Code and datasets are available at: https://github.com/zhuang-li/cl_nsp.
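
To make aspect (i) concrete, below is a minimal Python sketch of diversity- and balance-aware replay sampling. The names `template_of` and `sample_memory`, the token-masking template extractor, and the greedy balancing heuristic are illustrative assumptions, not the paper's exact formulation; see the linked repository for the actual algorithm.

```python
import random
from collections import Counter, defaultdict

def template_of(logical_form):
    # Hypothetical template extractor: mask non-alphabetic tokens so logical
    # forms that differ only in entities/values map to the same template.
    # A real parser would derive templates from its grammar instead.
    return " ".join(t if t.isalpha() else "<V>" for t in logical_form.split())

def sample_memory(examples, memory_size, seed=0):
    """Pick a replay memory that first covers diverse logical-form templates,
    then tops up with examples whose parse actions are under-represented.
    Each example: {"utterance": str, "lf": str, "actions": [str, ...]}."""
    rng = random.Random(seed)

    by_template = defaultdict(list)
    for ex in examples:
        by_template[template_of(ex["lf"])].append(ex)

    memory, action_counts = [], Counter()

    # Stage 1: round-robin over templates so every template is represented
    # before any template contributes a second example.
    buckets = [rng.sample(group, len(group)) for group in by_template.values()]
    while buckets and len(memory) < memory_size:
        for bucket in list(buckets):
            if len(memory) >= memory_size:
                break
            memory.append(bucket.pop())
            action_counts.update(memory[-1]["actions"])
            if not bucket:
                buckets.remove(bucket)

    # Stage 2: greedily add examples whose actions are currently rare in the
    # memory -- a simple heuristic for balancing the parse-action distribution.
    remaining = [ex for ex in examples if ex not in memory]
    while remaining and len(memory) < memory_size:
        ex = min(remaining, key=lambda e: sum(action_counts[a] for a in e["actions"]))
        remaining.remove(ex)
        memory.append(ex)
        action_counts.update(ex["actions"])

    return memory
```

The two stages mirror the two goals stated in the abstract: template coverage guards against forgetting rare logical-form structures, while action balancing keeps frequent parse actions from dominating the replayed gradient signal.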
