Using dependency parsing for few-shot learning in distributional semantics

05/12/2022
by Stefania Preda, et al.

In this work, we explore the novel idea of employing dependency parsing information in the context of few-shot learning: the task of learning the meaning of a rare word from a limited number of context sentences. First, we use dependency-based word embedding models as background spaces for few-shot learning. Second, we introduce two few-shot learning methods that enhance the additive baseline model by using dependencies.
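To make the setup concrete, here is a minimal sketch of the additive baseline and one plausible dependency-aware variant, under stated assumptions. The toy embedding table, the example sentence, the hand-written parse arcs, and the filtering rule in `dependency_filtered` are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Toy background embedding space. In practice this would be a
# pre-trained model, e.g. a dependency-based skip-gram space.
rng = np.random.default_rng(0)
vocab = ["the", "chef", "seasoned", "stew", "with", "fresh", "wug"]
emb = {w: rng.standard_normal(50) for w in vocab}

def additive_baseline(context_words, emb):
    """Additive baseline: estimate the rare word's vector as the
    mean of the vectors of all known words in its context."""
    vecs = [emb[w] for w in context_words if w in emb]
    return np.mean(vecs, axis=0)

def dependency_filtered(target, arcs, emb):
    """One hypothetical dependency-enhanced variant: average only
    words standing in a direct dependency relation with the target.

    `arcs` is a list of (head, dependent) pairs from a parser; this
    particular filtering rule is an assumption for illustration.
    """
    neighbours = ({h for h, d in arcs if d == target}
                  | {d for h, d in arcs if h == target})
    vecs = [emb[w] for w in neighbours if w in emb]
    return np.mean(vecs, axis=0)

# Context sentence: "The chef seasoned the stew with fresh wug."
sentence = ["the", "chef", "seasoned", "the", "stew", "with", "fresh", "wug"]
# Hand-written parse arcs for the example (head, dependent).
arcs = [("seasoned", "chef"), ("seasoned", "stew"),
        ("seasoned", "wug"), ("wug", "fresh")]

v_add = additive_baseline([w for w in sentence if w != "wug"], emb)
v_dep = dependency_filtered("wug", arcs, emb)
```

The dependency-filtered estimate uses only "seasoned" and "fresh" rather than every word in the sentence; restricting or reweighting contexts by their syntactic relation to the target is one way a dependency-aware method could refine the additive baseline, though the paper's two methods may differ in detail.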

