Cutting Recursive Autoencoder Trees

01/13/2013
by Christian Scheible, et al.

Deep Learning models enjoy considerable success in Natural Language Processing. While deep architectures produce useful representations that lead to improvements in various tasks, they are often difficult to interpret, which makes analyzing the learned structures particularly hard. In this paper, we rely on empirical tests to see whether a particular structure makes sense. We present an analysis of the Semi-Supervised Recursive Autoencoder, a well-known model that produces structural representations of text. We show that for certain tasks, the structure of the autoencoder can be significantly reduced without loss of classification accuracy, and we evaluate the produced structures using human judgment.
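To make the kind of structure under discussion concrete, here is a minimal sketch of how a recursive autoencoder builds a tree over a sequence of word vectors: adjacent children are composed into a parent vector, and pairs are merged greedily by lowest reconstruction error. This follows the general RAE scheme of Socher et al. that the Semi-Supervised RAE extends; the dimensions, random weights, and helper names are illustrative assumptions, not the paper's implementation.

```python
import math
import random

random.seed(0)

DIM = 4  # toy embedding dimension (assumption, not from the paper)

def rand_matrix(rows, cols):
    # Small random weights stand in for trained parameters.
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
            for _ in range(rows)]

W_enc = rand_matrix(DIM, 2 * DIM)   # encoder: [c1; c2] -> parent
W_dec = rand_matrix(2 * DIM, DIM)   # decoder: parent -> [c1'; c2']

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def encode(c1, c2):
    """Compose two child vectors into one parent vector."""
    return [math.tanh(x) for x in matvec(W_enc, c1 + c2)]

def reconstruction_error(c1, c2):
    """Squared error between the children and their reconstruction."""
    parent = encode(c1, c2)
    rec = matvec(W_dec, parent)
    target = c1 + c2
    return sum((a - b) ** 2 for a, b in zip(rec, target))

def build_tree(leaves):
    """Greedily merge the adjacent pair with the lowest
    reconstruction error until one root vector remains."""
    nodes = list(leaves)
    while len(nodes) > 1:
        errs = [reconstruction_error(nodes[i], nodes[i + 1])
                for i in range(len(nodes) - 1)]
        i = errs.index(min(errs))
        nodes[i:i + 2] = [encode(nodes[i], nodes[i + 1])]
    return nodes[0]
```

The root vector (and, in the semi-supervised variant, the inner-node vectors) feeds a classifier; "cutting" the tree amounts to limiting how much of this recursive structure is actually built or used before classification.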


