
Semantic Hilbert Space for Text Representation Learning

by Benyou Wang et al.

Capturing the meaning of sentences has long been a challenging task. Current models tend to apply linear combinations of word features to compose the semantics of larger-granularity units, e.g. phrases, sentences, and documents. However, semantic linearity does not always hold in human language. For instance, the meaning of the phrase `ivory tower' cannot be deduced by linearly combining the meanings of `ivory' and `tower'. To address this issue, we propose a new framework that models different levels of semantic units (e.g. sememes, words, sentences, and semantic abstractions) on a single Semantic Hilbert Space, which naturally admits non-linear semantic composition by means of complex-valued word representations. An end-to-end neural network is proposed to implement the framework for the text classification task, and evaluation results on six benchmark text classification datasets demonstrate the effectiveness, robustness, and self-explanation power of the proposed model. Furthermore, intuitive case studies are conducted to help end users understand how the framework works.
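The core idea, representing words as complex-valued vectors so that composition becomes sensitive to phase, can be sketched in a few lines. This is a minimal illustration under assumed conventions, not the paper's actual network: the function names, the unit-normalization, and the fixed mixture weights are all choices made here for clarity.

```python
import numpy as np

def word_state(amplitudes, phases):
    # A word as a complex vector r * exp(i*theta): amplitudes weight the
    # basis dimensions (sememes), phases carry additional semantic content.
    vec = np.asarray(amplitudes) * np.exp(1j * np.asarray(phases))
    return vec / np.linalg.norm(vec)  # unit-normalize, like a quantum state

def compose(word_states, weights):
    # A sentence as a weighted superposition of word states. Because the
    # phases can interfere, the result is non-linear in the real-valued
    # amplitude features alone.
    s = sum(w * v for w, v in zip(weights, word_states))
    return s / np.linalg.norm(s)

def measure(state, basis_vec):
    # Probability of a measurement outcome: squared modulus of the
    # projection onto a (unit) basis vector.
    return float(np.abs(np.vdot(basis_vec, state)) ** 2)
```

Two words with identical amplitudes but different phases compose into different sentence states, which is one way to see why this representation is richer than a real-valued linear combination.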

