A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations

04/26/2017
by Samuel Rönnqvist et al.

We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word-order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence.
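The abstract's core idea, encoding the two discourse arguments as one joint token sequence with a Bi-LSTM and pooling the hidden states with attention before classifying the relation, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, hyperparameters, and additive-attention scorer are all assumptions.

```python
import torch
import torch.nn as nn

class AttentiveBiLSTM(nn.Module):
    """Hypothetical sketch: Bi-LSTM over the joint argument sequence,
    additive attention pooling, then a linear relation classifier.
    Hyperparameter defaults are illustrative, not the paper's settings."""

    def __init__(self, vocab_size, embed_dim=64, hidden_dim=64, num_relations=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # additive attention scorer
        self.out = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))      # (B, T, 2H)
        scores = self.attn(torch.tanh(h)).squeeze(-1)  # (B, T)
        weights = torch.softmax(scores, dim=-1)        # attention over positions
        context = (weights.unsqueeze(-1) * h).sum(1)   # weighted sum of states
        return self.out(context), weights

# Both arguments are concatenated into one sequence (joint modeling);
# the token ids below are random placeholders.
model = AttentiveBiLSTM(vocab_size=100)
logits, attn = model(torch.randint(1, 100, (2, 12)))
```

The returned attention weights are what the paper's visualizations inspect: a distribution over input positions showing which tokens the classifier attended to.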

Related research

Extending Implicit Discourse Relation Recognition to the PDTB-3 (10/13/2020)
The PDTB-3 contains many more implicit discourse relations than the prev...

A Latent Variable Recurrent Neural Network for Discourse Relation Language Models (03/07/2016)
This paper presents a novel latent variable recurrent neural network arc...

Semantic Graph Convolutional Network for Implicit Discourse Relation Classification (10/21/2019)
Implicit discourse relation classification is of great importance for di...

Survey on the attention based RNN model and its applications in computer vision (01/25/2016)
The recurrent neural networks (RNN) can be used to solve the sequence to...

Neural Network Models for Implicit Discourse Relation Classification in English and Chinese without Surface Features (06/07/2016)
Inferring implicit discourse relations in natural language text is the m...

Joint Modeling of Content and Discourse Relations in Dialogues (05/14/2017)
We present a joint modeling approach to identify salient discussion poin...

Incorporating Sememes into Chinese Definition Modeling (05/16/2019)
Chinese definition modeling is a challenging task that generates a dicti...
