Do Neural Topic Models Really Need Dropout? Analysis of the Effect of Dropout in Topic Modeling

03/28/2023
by Suman Adhya, et al.

Dropout is a widely used regularization technique for mitigating overfitting in large feedforward neural networks that are trained on small datasets and would otherwise perform poorly on the held-out test set. Although the effectiveness of this technique has been studied extensively for convolutional neural networks, such analysis is lacking for unsupervised models and, in particular, for VAE-based neural topic models. In this paper, we analyze the consequences of applying dropout in the encoder as well as in the decoder of the VAE architecture for three widely used neural topic models, namely the contextualized topic model (CTM), ProdLDA, and the embedded topic model (ETM), using four publicly available datasets. We characterize the effect of dropout on these models in terms of the quality and predictive performance of the generated topics.
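
As a rough illustration of the setting studied here, the sketch below shows a ProdLDA-style VAE topic model in PyTorch with separate dropout layers in the encoder and in the decoder. This is not the authors' implementation; the class name (ProdLDAStyleVAE), layer sizes, and dropout rates (encoder_dropout, decoder_dropout) are illustrative assumptions.

    # Minimal sketch of a VAE-based neural topic model with dropout
    # applied in the encoder and in the decoder (illustrative only).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ProdLDAStyleVAE(nn.Module):
        def __init__(self, vocab_size, num_topics=50, hidden=100,
                     encoder_dropout=0.2, decoder_dropout=0.2):
            super().__init__()
            # Encoder: bag-of-words -> variational parameters of the topic proportions.
            self.encoder = nn.Sequential(
                nn.Linear(vocab_size, hidden), nn.Softplus(),
                nn.Linear(hidden, hidden), nn.Softplus(),
                nn.Dropout(encoder_dropout),          # dropout in the encoder
            )
            self.mu = nn.Linear(hidden, num_topics)
            self.logvar = nn.Linear(hidden, num_topics)
            # Decoder: topic proportions -> reconstructed word distribution.
            self.decoder_dropout = nn.Dropout(decoder_dropout)  # dropout in the decoder
            self.beta = nn.Linear(num_topics, vocab_size, bias=False)  # topic-word weights

        def forward(self, bow):
            h = self.encoder(bow)
            mu, logvar = self.mu(h), self.logvar(h)
            # Reparameterization trick.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            theta = F.softmax(z, dim=-1)              # document-topic proportions
            theta = self.decoder_dropout(theta)       # dropout in the decoder
            recon = F.log_softmax(self.beta(theta), dim=-1)
            # Negative ELBO: reconstruction term plus KL to a standard normal prior.
            rec_loss = -(bow * recon).sum(-1)
            kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
            return (rec_loss + kl).mean()

Setting encoder_dropout or decoder_dropout to 0.0 disables the corresponding layer, which mirrors the kind of encoder- versus decoder-side comparison analyzed in the paper.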
