SCAT: Second Chance Autoencoder for Textual Data

05/11/2020
by Somaieh Goudarzvand, et al.

We present a k-competitive learning approach for textual autoencoders named Second Chance Autoencoder (SCAT). SCAT selects the k largest and the k smallest positive activations as the winner neurons; during training, the winners absorb the activation values of the loser neurons, which drives the network toward well-representative features for topics. Our experiments show that SCAT outperforms LDA, K-Sparse, NVCTM, and KATE on classification, topic modeling, and document visualization.
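
The abstract describes the competition step only at a high level, so the NumPy sketch below illustrates one plausible reading of it: the k largest and the k smallest positive activations win, the losers are zeroed, and the losers' activation mass is reallocated to the winners. The function name scat_competition and the alpha amplification factor (borrowed from the analogous hyperparameter in KATE's k-competitive layer) are illustrative assumptions, not the authors' code.

    import numpy as np

    def scat_competition(z, k, alpha=1.0):
        """Sketch of SCAT's second-chance k-competition (assumed reading).

        z     : 1-D array of encoder activations (e.g. post-sigmoid)
        k     : winners taken from EACH end of the positive activations,
                so up to 2k winners in total
        alpha : hypothetical amplification factor, analogous to KATE's
        """
        pos = np.flatnonzero(z > 0)              # only positive activations compete
        order = pos[np.argsort(z[pos])]          # positive indices, ascending by value
        # "second chance": the k smallest positives win alongside the k largest
        winners = np.unique(np.concatenate([order[:k], order[-k:]]))
        losers = np.setdiff1d(pos, winners)

        out = np.zeros_like(z)                   # losers are silenced...
        if winners.size > 0:
            bonus = alpha * z[losers].sum() / winners.size
            out[winners] = z[winners] + bonus    # ...and winners absorb their energy
        return out

In a full model this step would sit between the encoder and decoder, like KATE's competition layer, which is typically active only during training; at test time the raw encoder activations would serve as the document representation.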

Related research:

- 05/04/2017: KATE: K-Competitive Autoencoder for Text. Autoencoders have been successful in learning meaningful representations...
- 10/18/2021: Uncertainty-aware Topic Modeling Visualization. Topic modeling is a state-of-the-art technique for analyzing text corpor...
- 02/02/2021: Deep Autoencoder-based Fuzzy C-Means for Topic Detection. Topic detection is a process for determining topics from a collection of...
- 02/17/2022: When, where, and how to add new neurons to ANNs. Neurogenesis in ANNs is an understudied and difficult problem, even comp...
- 04/05/2022: Complex-Valued Autoencoders for Object Discovery. Object-centric representations form the basis of human perception and en...
- 04/11/2012: Concept Modeling with Superwords. In information retrieval, a fundamental goal is to transform a document ...
- 08/29/2022: A Missing Value Filling Model Based on Feature Fusion Enhanced Autoencoder. With the advent of the big data era, the data quality problem is becomin...
