Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization

06/14/2016
by Akanksha et al.

This paper describes the Georgia Tech team's approach to the CoNLL-2016 supplementary evaluation on discourse relation sense classification. We use long short-term memory (LSTM) networks to induce distributed representations of each argument, and then combine these representations with surface features in a neural network. The architecture of the neural network is determined by Bayesian hyperparameter search.
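The architecture described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: each argument is run through a shared LSTM, the final hidden states are concatenated with a surface-feature vector, and a softmax layer scores the relation senses. All names (`lstm_encode`, `classify`), dimensions, and the single-layer, final-state design are illustrative assumptions; the paper's actual model and hyperparameters (chosen by Bayesian search) may differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_encode(x_seq, Wx, Wh, b):
    """Run a single-layer LSTM over a sequence of word vectors and
    return the final hidden state as the argument representation.
    (Illustrative sketch; the paper's encoder may be configured differently.)"""
    d = Wh.shape[1]                      # hidden size
    h = np.zeros(d)
    c = np.zeros(d)
    for x in x_seq:
        z = Wx @ x + Wh @ h + b          # (4d,) stacked gate pre-activations
        i = sigmoid(z[:d])               # input gate
        f = sigmoid(z[d:2 * d])          # forget gate
        o = sigmoid(z[2 * d:3 * d])      # output gate
        g = np.tanh(z[3 * d:])           # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

def classify(arg1_seq, arg2_seq, surface_feats, params):
    """Encode both discourse arguments, concatenate the two argument
    representations with surface features, and score the senses
    with a softmax output layer."""
    h1 = lstm_encode(arg1_seq, *params["lstm"])
    h2 = lstm_encode(arg2_seq, *params["lstm"])
    z = np.concatenate([h1, h2, surface_feats])
    logits = params["W_out"] @ z + params["b_out"]
    e = np.exp(logits - logits.max())    # stable softmax
    return e / e.sum()
```

In the full system, choices such as the hidden size, the depth of the combining network, and whether the two arguments share LSTM weights would be the kind of hyperparameters explored by the Bayesian search the abstract mentions.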


