Semi-supervised Autoencoding Projective Dependency Parsing

11/02/2020
by Xiao Zhang, et al.

We describe two end-to-end autoencoding models for semi-supervised graph-based projective dependency parsing. The first model is a Locally Autoencoding Parser (LAP) that encodes the input sequentially using continuous latent variables; the second model is a Globally Autoencoding Parser (GAP) that encodes the input into dependency trees as latent variables, with exact inference. Both models consist of two parts: an encoder, enhanced by deep neural networks (DNNs), that uses contextual information to encode the input into latent variables, and a decoder, a generative model that reconstructs the input. Both LAP and GAP admit a unified structure with shared parameters and different loss functions for labeled and unlabeled data. We conducted experiments on the WSJ and UD dependency parsing datasets, showing that our models can exploit unlabeled data to improve performance when only a limited amount of labeled data is available, and that they outperform a previously proposed semi-supervised model.
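The abstract's unified structure can be illustrated with a minimal sketch (not the authors' code): a single encoder whose parameters are shared by two loss branches, a supervised arc loss for labeled sentences and a reconstruction loss for unlabeled ones. All module and variable names below are illustrative assumptions.

```python
# Minimal sketch of the shared-parameter semi-supervised setup described in
# the abstract. Hypothetical names; not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemiSupervisedParser(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder shared by the labeled and unlabeled objectives
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Bilinear arc scorer: score(head representation, dependent representation)
        self.arc_scorer = nn.Bilinear(2 * hidden_dim, 2 * hidden_dim, 1)
        # Decoder that reconstructs the observed tokens from the latent states
        self.reconstruct = nn.Linear(2 * hidden_dim, vocab_size)

    def _encode_and_score(self, tokens):
        h, _ = self.encoder(self.embed(tokens))            # (B, T, 2H)
        n = h.size(1)
        heads = h.unsqueeze(2).expand(-1, -1, n, -1).contiguous()
        deps = h.unsqueeze(1).expand(-1, n, -1, -1).contiguous()
        scores = self.arc_scorer(heads, deps).squeeze(-1)  # (B, T_head, T_dep)
        return scores, h

    def labeled_loss(self, tokens, gold_heads):
        # Supervised branch: cross-entropy over candidate heads per dependent
        scores, _ = self._encode_and_score(tokens)
        return F.cross_entropy(
            scores.transpose(1, 2).reshape(-1, scores.size(1)),
            gold_heads.reshape(-1))

    def unlabeled_loss(self, tokens):
        # Unsupervised branch: reconstruct the input tokens from the encoding
        _, h = self._encode_and_score(tokens)
        logits = self.reconstruct(h)                        # (B, T, V)
        return F.cross_entropy(
            logits.reshape(-1, logits.size(-1)), tokens.reshape(-1))
```

Because the encoder parameters are shared, mixed batches of labeled and unlabeled sentences can be trained jointly by summing the two losses, which is the general pattern the abstract describes for both LAP and GAP.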


