A Structured Variational Autoencoder for Contextual Morphological Inflection

06/10/2018
by   Lawrence Wolf-Sonkin, et al.

Statistical morphological inflectors are typically trained on fully supervised, type-level data. One remaining open research question is the following: How can we effectively exploit raw, token-level data to improve their performance? To this end, we introduce a novel generative latent-variable model for the semi-supervised learning of inflection generation. To enable posterior inference over the latent variables, we derive an efficient variational inference procedure based on the wake-sleep algorithm. We experiment on 23 languages, using the Universal Dependencies corpora in a simulated low-resource setting, and find improvements of over 10 points of accuracy in some cases.
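The paper's model is a structured VAE over inflected word forms in context, with wake-sleep updates alternating between the generative and recognition networks. The sketch below is a hypothetical toy, not the paper's architecture: a single binary latent "tag" z with Bernoulli emissions stands in for the latent morphological variables, and a logistic-regression recognition model stands in for the inference network. The wake phase fits the generative parameters on observed data paired with latents sampled from the recognition model; the sleep phase fits the recognition model on "dreamed" samples from the generative model.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # toy binary feature dimension (hypothetical, stands in for word forms)

# Generative parameters: p(z=1) = pi, p(x_d=1 | z=k) = theta[k, d].
pi = 0.5
theta = rng.uniform(0.3, 0.7, size=(2, D))

# Recognition parameters: q(z=1 | x) = sigmoid(x @ w + b).
w = np.zeros(D)
b = 0.0

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Ground-truth data-generating process, used only to simulate raw data.
true_theta = np.stack([np.full(D, 0.2), np.full(D, 0.8)])

def sample_data(n):
    z = rng.integers(0, 2, size=n)
    return (rng.random((n, D)) < true_theta[z]).astype(float)

lr = 0.1
for step in range(500):
    # --- wake phase: update p on (x, z) with x observed, z ~ q(z|x) ---
    x = sample_data(32)
    qz = sigmoid(x @ w + b)
    z = (rng.random(32) < qz).astype(int)
    for k in (0, 1):
        mask = z == k
        if mask.any():
            # move emission probabilities toward the empirical mean
            theta[k] += lr * (x[mask].mean(0) - theta[k])
    pi += lr * (z.mean() - pi)

    # --- sleep phase: update q on dreamed pairs (z, x) ~ p(z) p(x|z) ---
    zd = (rng.random(32) < pi).astype(int)
    xd = (rng.random((32, D)) < theta[zd]).astype(float)
    pred = sigmoid(xd @ w + b)
    grad = zd - pred  # logistic-regression log-likelihood gradient
    w += lr * (xd * grad[:, None]).mean(0)
    b += lr * grad.mean()
```

Because the sleep phase trains the recognition model only on the generative model's own samples, it avoids backpropagating through discrete latent variables, which is the property the paper exploits for its structured (non-differentiable) latent space.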

Related research

10/13/2020
Controlling the Interaction Between Generation and Inference in Semi-Supervised Variational Autoencoders Using Importance Weighting
Even though Variational Autoencoders (VAEs) are widely used for semi-sup...

09/21/2023
Semi-Supervised Variational Inference over Nonlinear Channels
Deep learning methods for communications over unknown nonlinear channels...

04/06/2017
Multi-space Variational Encoder-Decoders for Semi-supervised Labeled Sequence Transduction
Labeled sequence transduction is a task of transforming one sequence int...

12/18/2018
A Novel Variational Autoencoder with Applications to Generative Modelling, Classification, and Ordinal Regression
We develop a novel probabilistic generative model based on the variation...

06/07/2019
Semi-supervised Stochastic Multi-Domain Learning using Variational Inference
Supervised models of NLP rely on large collections of text which closely...

11/10/2018
Dual Latent Variable Model for Low-Resource Natural Language Generation in Dialogue Systems
Recent deep learning models have shown improving results to natural lang...

06/30/2020
Semi-supervised Sequential Generative Models
We introduce a novel objective for training deep generative time-series ...
