
mcBERT: Momentum Contrastive Learning with BERT for Zero-Shot Slot Filling

03/24/2022 · Seong-Hwan Heo, et al. (POSTECH)

Zero-shot slot filling has received considerable attention as a way to cope with the limited data available for a target domain. A key factor in zero-shot learning is enabling the model to learn generalized and reliable representations. To this end, we present mcBERT, which stands for momentum contrastive learning with BERT, to develop a robust zero-shot slot-filling model. mcBERT initializes two encoders, a query encoder and a key encoder, from BERT and trains them with momentum contrastive learning. Our experimental results on the SNIPS benchmark show that mcBERT substantially outperforms previous models, setting a new state of the art. We also show that each component of mcBERT contributes to the performance improvement.
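The abstract describes a two-encoder setup trained with momentum contrastive learning: both encoders start from the same BERT weights, the query encoder is updated by gradient descent, and the key encoder follows it as a slowly moving average. A minimal sketch of that momentum update, with plain lists of floats standing in for the BERT parameters (the coefficient `MOMENTUM` and the toy weight values are illustrative assumptions, not taken from the paper):

```python
# Minimal sketch of a MoCo-style momentum update (toy parameters, not real BERT weights).

MOMENTUM = 0.999  # assumed momentum coefficient; a common choice in momentum contrastive learning


def momentum_update(key_params, query_params, m=MOMENTUM):
    """Move key-encoder parameters toward the query encoder's parameters
    as an exponential moving average; no gradient flows to the key encoder."""
    return [m * k + (1.0 - m) * q for k, q in zip(key_params, query_params)]


# Both encoders start from the same (BERT-initialized) weights.
query = [0.5, -1.2, 3.0]
key = list(query)

# After an optimizer step the query encoder's weights change...
query = [0.6, -1.0, 2.8]
# ...and the key encoder drifts toward them slowly, keeping the
# contrastive keys consistent across training steps.
key = momentum_update(key, query)
```

Because the key encoder changes only slightly per step, the representations it produces for contrastive keys stay consistent over time, which is the usual motivation for the momentum design.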

