Globally Normalized Transition-Based Neural Networks

03/19/2016
by Daniel Andor, et al.

We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. We discuss the importance of global as opposed to local normalization: a key insight is that the label bias problem implies that globally normalized models can be strictly more expressive than locally normalized models.
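
As an illustration of the contrast the abstract draws, the following minimal sketch compares a locally normalized model (a softmax over candidate transitions at each step) with a globally normalized one (a single, CRF-style partition function over complete transition sequences). It is a toy, not the paper's implementation: the two-step transition system and its scores are invented for illustration, and the partition function is computed by exhaustive enumeration, whereas the paper approximates it with beam search over a task-specific transition system.

    import numpy as np

    # Toy 2-step transition system with 2 candidate transitions per step.
    # Scores depend on the history (prefix) of decisions taken so far;
    # the numbers are made up purely to illustrate the label bias problem.
    SCORES = {
        (): np.array([0.0, 0.0]),    # step 1: both transitions look equally good
        (0,): np.array([5.0, 0.0]),  # step 2 after transition 0: strong evidence for 0
        (1,): np.array([0.0, 0.0]),  # step 2 after transition 1: uninformative
    }
    PATHS = [(a, b) for a in (0, 1) for b in (0, 1)]  # all complete paths

    def local_log_prob(path):
        """Locally normalized: a softmax over candidates at every step,
        so per-step probabilities must sum to one regardless of the prefix."""
        lp = 0.0
        for i, t in enumerate(path):
            s = SCORES[path[:i]]
            lp += s[t] - np.logaddexp.reduce(s)  # per-step log-softmax
        return lp

    def path_score(path):
        """Raw (unnormalized) score of a complete transition sequence."""
        return sum(SCORES[path[:i]][t] for i, t in enumerate(path))

    def global_log_prob(path):
        """Globally normalized (CRF-style): one partition function over
        the scores of all complete paths."""
        log_Z = np.logaddexp.reduce([path_score(p) for p in PATHS])
        return path_score(path) - log_Z

    for p in PATHS:
        print(p,
              round(float(np.exp(local_log_prob(p))), 3),
              round(float(np.exp(global_log_prob(p))), 3))

In the printed output, the locally normalized model can assign path (0, 0) at most probability 0.5: step 1 must split its probability mass before the decisive step-2 evidence is seen, which is exactly the label bias problem. The globally normalized model assigns the same path probability of about 0.98, because the strong step-2 score is allowed to reweight the entire sequence.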

Related research

A Globally Normalized Neural Model for Semantic Parsing (06/07/2021)
In this paper, we propose a globally normalized model for context-free g...

Joint POS Tagging and Dependency Parsing with Transition-based Neural Networks (04/25/2017)
While part-of-speech (POS) tagging and dependency parsing are observed t...

Towards JointUD: Part-of-speech Tagging and Lemmatization using Recurrent Neural Networks (09/10/2018)
This paper describes our submission to CoNLL 2018 UD Shared Task. We hav...

Revisiting the Design Issues of Local Models for Japanese Predicate-Argument Structure Analysis (10/12/2017)
The research trend in Japanese predicate-argument structure (PAS) analys...

Global Normalization for Streaming Speech Recognition in a Modular Framework (05/26/2022)
We introduce the Globally Normalized Autoregressive Transducer (GNAT) fo...

Batch-normalized Recurrent Highway Networks (09/26/2018)
Gradient control plays an important role in feed-forward networks applie...
