Compositional Sentence Representation from Character within Large Context Text

05/02/2016
by Geonmin Kim, et al.

This paper describes a Hierarchical Composition Recurrent Network (HCRN) consisting of a 3-level hierarchy of compositional models: character, word and sentence. The model is designed to overcome two problems of representing a sentence from its constituent word sequence: data sparsity in word embedding, and the lack of inter-sentence dependency. In the HCRN, word representations are built from characters, resolving the data-sparsity problem, and inter-sentence dependency is embedded into sentence representations at the level of sentence composition. We adopt a hierarchy-wise learning scheme to alleviate the optimization difficulties of training a deep hierarchical recurrent network end-to-end. The HCRN was evaluated quantitatively and qualitatively on a dialogue act classification task. In particular, sentence representations with inter-sentence dependency are able to capture both implicit and explicit semantics of a sentence, significantly improving performance. In the end, the HCRN achieved state-of-the-art performance with a test error rate of 22.7%.
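To make the 3-level composition concrete, here is a minimal sketch of an HCRN-style model in PyTorch: a character-level GRU composes each word vector, a word-level GRU composes each sentence vector, and a sentence-level GRU threads inter-sentence dependency through the dialogue before a linear layer predicts dialogue acts. The choice of GRU cells, the layer sizes, and the classifier head are illustrative assumptions, not details taken from the paper (which also trains hierarchy-wise rather than purely end-to-end).

```python
# Minimal sketch of a 3-level hierarchical composition RNN (character -> word -> sentence),
# assuming GRU cells and illustrative layer sizes; not the paper's exact configuration.
import torch
import torch.nn as nn

class HCRN(nn.Module):
    def __init__(self, n_chars, char_dim=16, word_dim=64, sent_dim=128, n_acts=10):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.char_rnn = nn.GRU(char_dim, word_dim, batch_first=True)   # chars -> word vector
        self.word_rnn = nn.GRU(word_dim, sent_dim, batch_first=True)   # words -> sentence vector
        self.sent_rnn = nn.GRU(sent_dim, sent_dim, batch_first=True)   # sentences -> dialogue context
        self.classifier = nn.Linear(sent_dim, n_acts)                  # dialogue-act prediction

    def forward(self, dialogue):
        # dialogue: list of sentences; each sentence is a list of LongTensors of character ids
        sent_vecs = []
        for sentence in dialogue:
            word_vecs = []
            for word_chars in sentence:                        # compose each word from its characters
                _, h = self.char_rnn(self.char_emb(word_chars).unsqueeze(0))
                word_vecs.append(h.squeeze(0))                 # (1, word_dim)
            words = torch.cat(word_vecs, dim=0).unsqueeze(0)   # (1, n_words, word_dim)
            _, h = self.word_rnn(words)                        # compose the sentence from its words
            sent_vecs.append(h.squeeze(0))                     # (1, sent_dim)
        sents = torch.cat(sent_vecs, dim=0).unsqueeze(0)       # (1, n_sents, sent_dim)
        ctx, _ = self.sent_rnn(sents)                          # inject inter-sentence dependency
        return self.classifier(ctx.squeeze(0))                 # one act logit vector per sentence

# Usage: dialogue-act logits for a two-sentence dialogue given as character-id tensors.
model = HCRN(n_chars=100)
dialogue = [[torch.tensor([5, 8, 2]), torch.tensor([7, 3])],
            [torch.tensor([9, 1, 4, 6])]]
print(model(dialogue).shape)  # torch.Size([2, 10])
```

Because every word vector is composed from characters, the model needs no word-level vocabulary, which is how the data-sparsity problem is avoided; the topmost GRU is what lets a sentence's representation depend on the sentences that precede it.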

Related research

07/10/2016
Charagram: Embedding Words and Sentences via Character n-grams
We present Charagram embeddings, a simple approach for learning characte...

08/10/2015
Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning
Deep compositional models of meaning acting on distributional representa...

05/29/2017
Character-Based Text Classification using Top Down Semantic Model for Sentence Representation
Despite the success of deep learning on many fronts especially image and...

04/11/2019
Gating Mechanisms for Combining Character and Word-level Word Representations: An Empirical Study
In this paper we study how different ways of combining character and wor...

07/05/2017
A Deep Network with Visual Text Composition Behavior
While natural languages are compositional, how state-of-the-art neural m...

01/21/2023
Syntax-guided Neural Module Distillation to Probe Compositionality in Sentence Embeddings
Past work probing compositionality in sentence embedding models faces is...

06/13/2019
A Comparison of Word-based and Context-based Representations for Classification Problems in Health Informatics
Distributed representations of text can be used as features when trainin...