A Dynamic Window Neural Network for CCG Supertagging

10/10/2016
by Huijia Wu, et al.

Combinatory Categorial Grammar (CCG) supertagging is the task of assigning a lexical category to each word in a sentence. Almost all previous methods use a fixed context window size as the input feature, yet different tags typically depend on contexts of different sizes. This motivates us to build a supertagger with a dynamic window approach, which can be viewed as an attention mechanism over the local context. Applying dropout to the dynamic filters amounts to dropping words directly, which outperforms regular dropout on word embeddings. Using this approach, we achieve state-of-the-art CCG supertagging performance on the standard test set.
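The abstract does not spell out the scoring function, so the following is a minimal sketch of the core idea under stated assumptions: for each word, attention weights are computed over a local context window (here with a simple dot-product score, an assumption), and dropout applied to those per-word attention weights zeroes out entire context words rather than individual embedding dimensions. All function and parameter names are hypothetical.

```python
import numpy as np

def dynamic_window_context(embeddings, max_window=3, filter_dropout=0.0, rng=None):
    """Attention-weighted local context vectors (illustrative sketch).

    embeddings:     (n, d) array of word embeddings for one sentence.
    max_window:     half-width of the local context window.
    filter_dropout: probability of dropping a context word's attention
                    weight (dropout on the dynamic "filters").
    """
    n, d = embeddings.shape
    contexts = np.zeros_like(embeddings)
    rng = rng or np.random.default_rng(0)
    for i in range(n):
        lo, hi = max(0, i - max_window), min(n, i + max_window + 1)
        window = embeddings[lo:hi]              # (w, d) local context
        scores = window @ embeddings[i]         # dot-product scoring (assumed)
        weights = np.exp(scores - scores.max()) # softmax over the window
        weights /= weights.sum()
        if filter_dropout > 0.0:
            # Zeroing an attention weight removes that word from the
            # context entirely -- dropout on words, not on embedding dims.
            mask = rng.random(weights.shape) >= filter_dropout
            weights = weights * mask
            if weights.sum() > 0:
                weights /= weights.sum()
        contexts[i] = weights @ window          # weighted context vector
    return contexts
```

Because the weights form a convex combination over the window, each output row stays in the span of the local context; a downstream tagger would consume these context vectors alongside the word's own embedding.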


