SCAT: Second Chance Autoencoder for Textual Data

05/11/2020
by   Somaieh Goudarzvand, et al.

We present Second Chance Autoencoder (SCAT), a k-competitive learning approach for textual autoencoders. SCAT selects the k largest and the k smallest positive activations as the winner neurons; during learning, the winners absorb the activation values of the loser neurons, which drives the model toward well-representative topic features. Our experiments show that SCAT outperforms LDA, K-Sparse, NVCTM, and KATE on classification, topic modeling, and document visualization.
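The competition step described above can be sketched as follows. This is a hedged illustration, not the paper's implementation: the function name `scat_competition`, the amplification parameter `alpha`, and the exact redistribution rule (splitting the positive losers' energy equally among winners, in the spirit of KATE's k-competitive layer) are assumptions for the sake of the example.

```python
import numpy as np

def scat_competition(h, k, alpha=1.0):
    """Hypothetical sketch of SCAT's "second chance" k-competition.

    Among the positive activations, the k largest AND the k smallest
    positive values are kept as winners (the small positive winners are
    the "second chance"); all other activations are zeroed, and the
    positive energy they carried is redistributed to the winners,
    scaled by an assumed hyperparameter `alpha`.
    """
    h = np.asarray(h, dtype=float)
    out = np.zeros_like(h)
    pos_idx = np.where(h > 0)[0]
    if pos_idx.size == 0:
        return out  # no positive activations: nothing to compete over

    # Sort positive neurons by activation, ascending.
    order = pos_idx[np.argsort(h[pos_idx])]
    # Winners: k smallest positive and k largest positive activations.
    winners = np.unique(np.concatenate([order[:k], order[-k:]]))
    losers = np.setdiff1d(np.arange(h.size), winners)

    # Energy from positive losers is split equally among the winners.
    loser_energy = h[losers][h[losers] > 0].sum()
    out[winners] = h[winners] + alpha * loser_energy / winners.size
    return out

# Example: with k=1, the largest (2.0) and smallest (0.05) positive
# activations win; the other positives (0.1 + 0.5 + 0.3 = 0.9) are
# zeroed and their energy is shared between the two winners.
h = [0.1, 0.5, 2.0, -1.0, 0.05, 0.3]
print(scat_competition(h, k=1))  # → [0. 0. 2.45 0. 0.5 0.]
```

In a full autoencoder this competition would sit between the encoder and decoder during training, so gradients only flow through the winner neurons; at test time the layer is typically bypassed.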
