Web-scale Surface and Syntactic n-gram Features for Dependency Parsing

02/25/2015
by Dominick Ng, et al.

We develop novel first- and second-order features for dependency parsing based on the Google Syntactic Ngrams corpus, a collection of subtree counts of parsed sentences from scanned books. We also extend previous work on surface n-gram features from Web1T to the Google Books corpus and from first-order to second-order, comparing and analysing performance over newswire and web treebanks. Surface and syntactic n-grams both produce substantial and complementary gains in parsing accuracy across domains. Our best system combines the two feature sets, achieving gains of up to 0.8 and 1.4 in parsing accuracy on the newswire and web treebanks respectively.
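To make the idea concrete, the sketch below shows one plausible way count-based n-gram features of this kind could be attached to a graph-based dependency parser: raw corpus counts are looked up for a candidate arc (first-order) or arc pair (second-order) and mapped onto coarse log-scale bins, which then become discrete feature strings. The count tables, templates, bin function, and example values here are illustrative assumptions, not the paper's exact feature set.

```python
from math import floor, log10

# Hypothetical count tables. In practice these would be extracted offline
# from the Web1T / Google Books surface n-grams and the Google Syntactic
# Ngrams corpus; here they are plain dicts with made-up counts.
surface_bigrams = {("ate", "pizza"): 120_000}            # adjacent word pair
surface_trigrams = {("ate", "pizza", "quickly"): 800}    # word triple
syntactic_arcs = {("ate", "pizza", "dobj"): 45_000}      # head, modifier, relation

def count_bin(count):
    """Map a raw corpus count onto a small set of log-scale bins so the
    parser sees discrete feature values rather than raw frequencies."""
    return 0 if count == 0 else min(10, 1 + floor(log10(count)))

def first_order_features(head, mod, rel):
    """Count-binned features for a single head -> modifier arc."""
    surf = surface_bigrams.get((head, mod), 0)
    syn = syntactic_arcs.get((head, mod, rel), 0)
    return [f"SURF1:{count_bin(surf)}", f"SYN1:{rel}:{count_bin(syn)}"]

def second_order_features(head, mod, sibling):
    """Count-binned feature over a (head, modifier, sibling) triple,
    analogous to a second-order factor in a graph-based parser."""
    surf = surface_trigrams.get((head, mod, sibling), 0)
    return [f"SURF2:{count_bin(surf)}"]

# Example usage on a toy arc and arc pair.
print(first_order_features("ate", "pizza", "dobj"))   # ['SURF1:6', 'SYN1:dobj:5']
print(second_order_features("ate", "pizza", "quickly"))  # ['SURF2:3']
```

Binning by order of magnitude is a common way to keep web-scale counts usable as sparse indicator features; the specific bin boundaries and templates above are assumptions for illustration.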


