Recurrent Neural Networks with Mixed Hierarchical Structures for Natural Language Processing

06/04/2021
by Zhaoxin Luo, et al.

Hierarchical structures exist both in linguistics and in Natural Language Processing (NLP) tasks, and how to design RNNs that learn hierarchical representations of natural language remains a long-standing challenge. In this paper, we define two types of boundaries, referred to as static and dynamic boundaries, and use them to construct a multi-layer hierarchical structure for document classification. In particular, we focus on a three-layer hierarchy with static word and sentence layers and a dynamic phrase layer. LSTM cells and two boundary detectors are used to implement the proposed structure, and the resulting network is called the Recurrent Neural Network with Mixed Hierarchical Structures (MHS-RNN). We further add three layers of attention mechanisms to the MHS-RNN model. Incorporating attention allows the model to build document representations from the more salient content and improves its performance on document classification. Experiments on five different datasets show that the proposed architecture outperforms previous methods on all five tasks.
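To make the described architecture concrete, the PyTorch sketch below approximates the word, phrase, and sentence levels with attention pooling at each level. The class names, dimensions, and the soft sigmoid gate that stands in for the paper's two boundary detectors are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class Attention(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                   # h: (batch, seq, dim)
        scores = self.context(torch.tanh(self.proj(h)))     # (batch, seq, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * h).sum(dim=1)                      # (batch, dim)


class MHSRNNSketch(nn.Module):
    """Word -> (dynamic) phrase -> sentence hierarchy with attention at each level."""

    def __init__(self, vocab_size, emb_dim=100, hid_dim=100, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Stand-in for the paper's boundary detectors: a per-word score in [0, 1]
        # indicating how likely a phrase boundary falls at this position.
        self.boundary = nn.Linear(hid_dim, 1)
        self.phrase_lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.sent_lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.word_attn = Attention(hid_dim)
        self.phrase_attn = Attention(hid_dim)
        self.sent_attn = Attention(hid_dim)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def encode_sentence(self, word_ids):                     # word_ids: (batch, words)
        h, _ = self.word_lstm(self.embed(word_ids))          # (batch, words, hid)
        b = torch.sigmoid(self.boundary(h))                  # soft phrase boundaries
        p, _ = self.phrase_lstm(b * h)                        # gated "phrase-level" stream
        # Pool word- and phrase-level states into a single sentence vector.
        return self.word_attn(h) + self.phrase_attn(p)       # (batch, hid)

    def forward(self, doc):                                  # doc: (batch, sents, words)
        sent_vecs = torch.stack(
            [self.encode_sentence(doc[:, i]) for i in range(doc.size(1))], dim=1)
        s, _ = self.sent_lstm(sent_vecs)                      # (batch, sents, hid)
        return self.classifier(self.sent_attn(s))             # document-level logits


# Toy usage: 2 documents, 3 sentences of 6 word ids each, vocabulary of 1,000.
model = MHSRNNSketch(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 3, 6)))
print(logits.shape)  # torch.Size([2, 5])
```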
