LSTM Easy-first Dependency Parsing with Pre-trained Word Embeddings and Character-level Word Embeddings in Vietnamese

10/30/2019
by Binh Duc Nguyen, et al.

Several methods have been proposed for Vietnamese dependency parsing, and parsers based on deep neural network models have been reported to achieve state-of-the-art results. In this paper, we propose a new method that applies LSTM easy-first dependency parsing with pre-trained word embeddings and character-level word embeddings. Our method achieves an accuracy of 80.91 on the Vietnamese Dependency Treebank (VnDT).
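To make the input representation concrete, the sketch below shows one common way such token vectors can be built: a pre-trained word embedding concatenated with a character-level word embedding produced by a bidirectional character LSTM. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the class name, dimensions, and vocabulary sizes are assumptions.

# Minimal sketch (assumptions, not the paper's code): combine a pre-trained
# word embedding with a character-level word embedding from a character LSTM.
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    def __init__(self, word_vocab_size, char_vocab_size,
                 word_dim=100, char_dim=50, char_hidden=50):
        super().__init__()
        # In practice this table would be initialised from pre-trained
        # Vietnamese word vectors rather than trained from scratch.
        self.word_emb = nn.Embedding(word_vocab_size, word_dim)
        self.char_emb = nn.Embedding(char_vocab_size, char_dim)
        # Bidirectional character LSTM; its final hidden states form the
        # character-level word embedding.
        self.char_lstm = nn.LSTM(char_dim, char_hidden,
                                 batch_first=True, bidirectional=True)

    def forward(self, word_ids, char_ids):
        # word_ids: (seq_len,) word indices for one sentence
        # char_ids: list of (n_chars,) tensors, one per word
        word_vecs = self.word_emb(word_ids)               # (seq_len, word_dim)
        char_vecs = []
        for chars in char_ids:
            _, (h, _) = self.char_lstm(self.char_emb(chars).unsqueeze(0))
            # Concatenate the last forward and backward hidden states.
            char_vecs.append(torch.cat([h[0, 0], h[1, 0]], dim=-1))
        char_vecs = torch.stack(char_vecs)                # (seq_len, 2*char_hidden)
        # Final token representation fed to the parser's sentence-level LSTM.
        return torch.cat([word_vecs, char_vecs], dim=-1)

# Toy usage with hypothetical vocabulary sizes and a 3-word sentence.
encoder = CharWordEncoder(word_vocab_size=10000, char_vocab_size=200)
words = torch.tensor([5, 42, 7])
chars = [torch.tensor([3, 9]), torch.tensor([1, 4, 6]), torch.tensor([2])]
tokens = encoder(words, chars)
print(tokens.shape)  # torch.Size([3, 200])

In an easy-first parser, token vectors of this kind would typically feed a sentence-level BiLSTM whose states are used to score candidate head attachments, with the highest-scoring attachment applied first at each step.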


Related research

05/30/2017
Character Composition Model with Convolutional Neural Networks for Dependency Parsing on Morphologically Rich Languages
We present a transition-based dependency parser that uses a convolutiona...

01/13/2020
Visual Storytelling via Predicting Anchor Word Embeddings in the Stories
We propose a learning model for the task of visual storytelling. The mai...

05/03/2018
Binarizer at SemEval-2018 Task 3: Parsing dependency and deep learning for irony detection
In this paper, we describe the system submitted for the SemEval 2018 Tas...

11/29/2020
Improved Semantic Role Labeling using Parameterized Neighborhood Memory Adaptation
Deep neural models achieve some of the best results for semantic role la...

10/01/2019
Specializing Word Embeddings (for Parsing) by Information Bottleneck
Pre-trained word embeddings like ELMo and BERT contain rich syntactic an...

04/16/2018
A Deeper Look into Dependency-Based Word Embeddings
We investigate the effect of various dependency-based word embeddings on...

09/20/2023
Assessment of Pre-Trained Models Across Languages and Grammars
We present an approach for assessing how multilingual large language mod...
