
Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure

12/02/2013
by Ozan Irsoy, et al.
Cornell University

Recently, deep architectures such as recurrent and recursive neural networks have been successfully applied to various natural language processing tasks. Inspired by bidirectional recurrent neural networks, which use representations that summarize both the past and the future around an instance, we propose a novel architecture that captures the structural information around an input and uses it to label instances. We apply our method to the task of opinion expression extraction, where we employ the binary parse tree of a sentence as the structure and word vector representations as the initial representation of each token. We conduct preliminary experiments to investigate the architecture's performance and compare it to the sequential approach.
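The idea described in the abstract can be sketched in a few dozen lines: an upward pass over the binary parse tree summarizes each subtree, a downward pass propagates context from the root back to the leaves, and each token is labeled from the concatenation of its two representations. The code below is a minimal illustration under assumed dimensions, random (untrained) weights, and hypothetical helper names; it is not the paper's implementation.

```python
import numpy as np

# Toy bidirectional recursive network over a binary parse tree.
# All sizes and weight names are illustrative assumptions.
rng = np.random.default_rng(0)
d = 4          # hidden dimension (assumption)
V = 8          # toy vocabulary size
n_labels = 2   # e.g. inside / outside an opinion expression

E = rng.normal(0, 0.1, (V, d))        # word vectors: initial leaf representations
W_l = rng.normal(0, 0.1, (d, d))      # upward pass: left-child weights
W_r = rng.normal(0, 0.1, (d, d))      # upward pass: right-child weights
W_down = rng.normal(0, 0.1, (d, d))   # downward pass: parent-to-child weights
W_out = rng.normal(0, 0.1, (n_labels, 2 * d))  # token classifier

def upward(tree):
    """Bottom-up pass: compute a representation h for every node.
    A tree is either an int (word id, a leaf) or a (left, right) pair."""
    if isinstance(tree, int):
        return {"h": E[tree], "word": tree}
    left, right = upward(tree[0]), upward(tree[1])
    h = np.tanh(W_l @ left["h"] + W_r @ right["h"])
    return {"h": h, "left": left, "right": right}

def downward(node, parent_g=None):
    """Top-down pass: propagate a context vector g from the root to the leaves."""
    if parent_g is None:
        parent_g = np.zeros(d)        # the root has no parent context
    node["g"] = np.tanh(W_down @ parent_g + node["h"])
    if "word" not in node:
        downward(node["left"], node["g"])
        downward(node["right"], node["g"])

def label_tokens(node, out):
    """Label each leaf from the concatenation of its upward and downward vectors."""
    if "word" in node:
        logits = W_out @ np.concatenate([node["h"], node["g"]])
        out.append(int(np.argmax(logits)))
    else:
        label_tokens(node["left"], out)
        label_tokens(node["right"], out)

# Binary parse of a four-token sentence: ((0 1) (2 3))
root = upward(((0, 1), (2, 3)))
downward(root)
labels = []
label_tokens(root, labels)
print(labels)  # one label per token, in left-to-right order
```

With trained weights, the downward vectors would let each token's label depend on the structure of the whole sentence, which is the contrast with the purely sequential (bidirectional recurrent) approach the paper compares against.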

