
Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure

by Ozan Irsoy, et al.
Cornell University

Recently, deep architectures such as recurrent and recursive neural networks have been successfully applied to various natural language processing tasks. Inspired by bidirectional recurrent neural networks, which use representations that summarize the past and future around an instance, we propose a novel architecture that aims to capture the structural information around an input and use it to label instances. We apply our method to the task of opinion expression extraction, employing the binary parse tree of a sentence as the structure and word vector representations as the initial representation of each token. We conduct preliminary experiments to investigate its performance and compare it to the sequential approach.
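The architecture sketched in the abstract can be illustrated with a toy implementation: an upward (bottom-up) pass composes children into parent representations over a binary parse tree, a downward pass propagates context from the root, and each leaf (token) is labeled from the concatenation of its upward and downward states. This is a minimal sketch, not the authors' implementation; all parameter names, the hidden size, the tanh composition, and the B/I/O-style label set are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8   # hidden size (assumed for illustration)
C = 3   # number of token labels (e.g. B/I/O for opinion expressions)

# Hypothetical parameters for a bidirectional recursive net over a binary tree.
W_L = rng.normal(0, 0.1, (D, D))        # upward pass: left-child weights
W_R = rng.normal(0, 0.1, (D, D))        # upward pass: right-child weights
U_P = rng.normal(0, 0.1, (D, D))        # downward pass: parent's downward state
U_U = rng.normal(0, 0.1, (D, D))        # downward pass: node's own upward state
W_out = rng.normal(0, 0.1, (C, 2 * D))  # classifier over [up; down]

def upward(tree):
    """Bottom-up pass: each node gets an 'up' state summarizing its subtree."""
    if isinstance(tree, np.ndarray):      # leaf: the word vector is its up state
        return {"up": tree}
    left, right = upward(tree[0]), upward(tree[1])
    up = np.tanh(W_L @ left["up"] + W_R @ right["up"])
    return {"up": up, "left": left, "right": right}

def downward(node, parent_down=None):
    """Top-down pass: each node's 'down' state summarizes the rest of the tree."""
    if parent_down is None:
        node["down"] = np.zeros(D)        # the root sees no outside context
    else:
        node["down"] = np.tanh(U_P @ parent_down + U_U @ node["up"])
    if "left" in node:
        downward(node["left"], node["down"])
        downward(node["right"], node["down"])

def token_labels(node, out):
    """Collect softmax label distributions at the leaves, left to right."""
    if "left" not in node:                # leaf: classify from [up; down]
        z = W_out @ np.concatenate([node["up"], node["down"]])
        e = np.exp(z - z.max())
        out.append(e / e.sum())
        return
    token_labels(node["left"], out)
    token_labels(node["right"], out)

# Toy sentence ((w0 w1) (w2 w3)); random vectors stand in for word embeddings.
words = [rng.normal(0, 1, D) for _ in range(4)]
tree = ((words[0], words[1]), (words[2], words[3]))
root = upward(tree)
downward(root)
probs = []
token_labels(root, probs)
print(len(probs), probs[0].shape)
```

The key design point mirrored here is that, unlike a bidirectional *recurrent* net whose two passes run over the linear token order, both passes here follow the parse-tree structure, so a token's label can depend on syntactic context rather than only on its sequential neighbors.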

