Deep Semi-Supervised Learning with Linguistically Motivated Sequence Labeling Task Hierarchies

12/29/2016
by   Jonathan Godwin, et al.

In this paper we present a novel neural network algorithm for semi-supervised learning on sequence labeling tasks arranged in a linguistically motivated hierarchy. This hierarchical relationship is exploited to regularise the representations of the supervised tasks by backpropagating the error of an unsupervised task through them. We introduce a neural network in which lower layers are supervised by junior downstream tasks and the final layer carries an auxiliary unsupervised task. The architecture shows improvements of up to two percentage points in F1 for chunking over a plausible baseline.
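The layered supervision scheme described above can be sketched as a forward pass through a stack of layers, each paired with a task head. This is a minimal NumPy illustration, not the paper's implementation: the layer sizes, the tanh projections standing in for recurrent layers, and the choice of POS tagging as the junior task are all assumptions (the abstract names only chunking and an unsupervised auxiliary task).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- the paper does not specify these.
seq_len, emb_dim, hidden = 5, 16, 32
n_pos_tags, n_chunk_tags, vocab = 10, 8, 100

# Token embeddings for one sentence of length seq_len.
x = rng.normal(size=(seq_len, emb_dim))

# Layer 1: supervised by a junior downstream task (assumed here: POS tagging).
w1 = rng.normal(size=(emb_dim, hidden)) * 0.1
h1 = np.tanh(x @ w1)
pos_logits = h1 @ (rng.normal(size=(hidden, n_pos_tags)) * 0.1)

# Layer 2: supervised by the more senior task (chunking), built on top of h1.
w2 = rng.normal(size=(hidden, hidden)) * 0.1
h2 = np.tanh(h1 @ w2)
chunk_logits = h2 @ (rng.normal(size=(hidden, n_chunk_tags)) * 0.1)

# Final layer: the unsupervised auxiliary task (e.g. next-word prediction).
# In training, its error would backpropagate through h2 and h1, regularising
# the representations learned for the supervised tasks below it.
w3 = rng.normal(size=(hidden, hidden)) * 0.1
h3 = np.tanh(h2 @ w3)
lm_logits = h3 @ (rng.normal(size=(hidden, vocab)) * 0.1)

print(pos_logits.shape, chunk_logits.shape, lm_logits.shape)
```

Each head reads from its own depth in the stack, so a gradient from the top (unsupervised) head necessarily flows through every supervised layer beneath it, which is the regularisation mechanism the abstract describes.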


