Neural Discourse Relation Recognition with Semantic Memory

03/12/2016
by Biao Zhang, et al.

Humans comprehend the meanings and relations of discourses by relying heavily on their semantic memory, which encodes general knowledge about concepts and facts. Inspired by this, we propose a neural recognizer for implicit discourse relation analysis that builds upon a semantic memory storing knowledge in a distributed fashion. We refer to this recognizer as SeMDER. Starting from the word embeddings of discourse arguments, SeMDER employs a shallow encoder to generate a distributed surface representation for a discourse. A semantic encoder with attention over the semantic memory matrix is then built on top of the surface representation; it retrieves a deep semantic meaning representation for the discourse from the memory. Taking the surface and semantic representations as input, SeMDER finally predicts implicit discourse relations via a neural recognizer. Experiments on the benchmark dataset show that SeMDER benefits from the semantic memory and achieves substantial improvements of 2.56% F1-score on average over current state-of-the-art baselines.
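To make the described pipeline concrete, the sketch below shows one way the surface encoder, the attention over a semantic memory matrix, and the final recognizer could fit together. It is a minimal illustration in PyTorch, not the authors' implementation: the class name `SeMDERSketch`, the dimensions, the use of mean pooling as the shallow encoder, and the number of relation classes are all assumptions.

```python
# Minimal sketch of a SeMDER-style recognizer (assumed PyTorch design,
# not the paper's exact architecture or hyperparameters).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeMDERSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, memory_slots=256, num_relations=11):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Semantic memory: a trainable matrix whose rows act as distributed knowledge slots.
        self.memory = nn.Parameter(torch.randn(memory_slots, embed_dim) * 0.01)
        # Recognizer over the concatenated surface and semantic representations.
        self.classifier = nn.Sequential(
            nn.Linear(2 * embed_dim, embed_dim),
            nn.Tanh(),
            nn.Linear(embed_dim, num_relations),
        )

    def forward(self, arg1_ids, arg2_ids):
        # Shallow encoder: pool the word embeddings of both discourse arguments
        # into a single surface representation (mean pooling as a stand-in).
        words = torch.cat([self.embed(arg1_ids), self.embed(arg2_ids)], dim=1)
        surface = words.mean(dim=1)                      # (batch, embed_dim)
        # Semantic encoder: attend over the memory matrix with the surface
        # representation as query, then read out a semantic representation.
        scores = surface @ self.memory.t()               # (batch, memory_slots)
        attn = F.softmax(scores, dim=-1)
        semantic = attn @ self.memory                    # (batch, embed_dim)
        # Predict the implicit discourse relation from both representations.
        return self.classifier(torch.cat([surface, semantic], dim=-1))

# Example usage with toy token-id tensors:
# model = SeMDERSketch(vocab_size=30000)
# logits = model(torch.randint(0, 30000, (4, 20)), torch.randint(0, 30000, (4, 25)))
```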
