Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings

10/10/2020
by Prafull Prakash et al.

The current state-of-the-art task-oriented semantic parsing models use BERT or RoBERTa as pretrained encoders; these models have huge memory footprints. This poses a challenge to their deployment for voice assistants such as Amazon Alexa and Google Assistant on edge devices with limited memory budgets. We propose to learn compositional code embeddings to greatly reduce the sizes of BERT-base and RoBERTa-base. We also apply the technique to DistilBERT, ALBERT-base, and ALBERT-large, three already compressed BERT variants that attain similar state-of-the-art semantic parsing performance with much smaller model sizes. We observe 95.15% embedding compression and 20.47% encoder compression while preserving semantic parsing performance. We provide the recipe for training and analyze the trade-off between code embedding sizes and downstream performance.
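To make the compression idea concrete: compositional code embeddings replace each word's dense vector with a handful of discrete codes into a small set of shared codebooks, and recover the embedding as the sum of the selected basis vectors. Below is a minimal numpy sketch of the reconstruction and the storage arithmetic; all sizes (V, D, M, K) and the random codebooks are hypothetical stand-ins for learned parameters, not the paper's actual configuration.

```python
import numpy as np

# Hypothetical sizes: vocabulary V, embedding dim D,
# M codebooks each holding K basis vectors of dim D.
V, D, M, K = 30000, 768, 8, 16

rng = np.random.default_rng(0)

# In practice these are learned end-to-end; random stand-ins here.
codebooks = rng.normal(size=(M, K, D))    # M*K*D floats, shared by all words
codes = rng.integers(0, K, size=(V, M))   # M small integer codes per word

def embed(word_id: int) -> np.ndarray:
    """Reconstruct a word embedding as the sum of one basis vector
    picked from each of the M codebooks by that word's codes."""
    return codebooks[np.arange(M), codes[word_id]].sum(axis=0)

e = embed(42)
assert e.shape == (D,)

# Storage comparison: a dense embedding table stores V*D floats
# (~23M here); the coded version stores only M*K*D codebook floats
# (~98k here) plus V*M codes of log2(K)=4 bits each (~0.12 MB),
# which is where the >95% embedding compression comes from.
dense_params = V * D
codebook_params = M * K * D
```

The trade-off the paper analyzes corresponds to the choice of M and K: larger codebooks recover embeddings more faithfully but shrink the compression rate.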


Related research

- 08/14/2019, Establishing Strong Baselines for the New Decade: Sequence Tagging, Syntactic and Semantic Parsing with BERT: This paper presents new state-of-the-art models for three tasks, part-of...
- 06/27/2019, Compositional Semantic Parsing Across Graphbanks: Most semantic parsers that map sentences to graph-based meaning represen...
- 09/28/2020, Conversational Semantic Parsing: The structured representation for semantic parsing in task-oriented assi...
- 08/09/2021, Making Transformers Solve Compositional Tasks: Several studies have reported the inability of Transformer models to gen...
- 08/07/2021, Multilingual Compositional Wikidata Questions: Semantic parsing allows humans to leverage vast knowledge resources thro...
- 03/22/2023, Open-source Frame Semantic Parsing: While the state-of-the-art for frame semantic parsing has progressed dra...
- 08/07/2021, Tiny Neural Models for Seq2Seq: Semantic parsing models with applications in task oriented dialog system...
