Centroid estimation based on symmetric KL divergence for Multinomial text classification problem

08/29/2018
by   Jiangning Chen, et al.

We define a new method to estimate class centroids for text classification, based on the symmetric KL divergence between the word distributions of training documents and their class centroids. Experiments on several standard data sets indicate that the new method achieves substantial improvements over traditional classifiers.
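To illustrate the general idea, the sketch below builds simple frequency-based class centroids from a document-term count matrix and assigns each document to the class whose centroid minimizes the symmetric KL divergence. This is a minimal illustration only; the paper's actual centroid estimator is more refined than the plain frequency average used here.

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric KL divergence D(p||q) + D(q||p) between two word distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

def fit_centroids(X, y):
    """Estimate one word-frequency centroid per class from document-term counts X."""
    centroids = {}
    for c in np.unique(y):
        counts = X[y == c].sum(axis=0).astype(float)
        centroids[c] = counts / counts.sum()
    return centroids

def predict(X, centroids):
    """Assign each document to the class whose centroid minimizes symmetric KL."""
    preds = []
    for row in X:
        doc_dist = row / max(row.sum(), 1)
        best = min(centroids, key=lambda c: symmetric_kl(doc_dist, centroids[c]))
        preds.append(best)
    return np.array(preds)

# Toy example: 4 documents over a 3-word vocabulary, 2 classes (hypothetical data).
X = np.array([[5, 1, 0],
              [4, 2, 1],
              [0, 1, 6],
              [1, 0, 5]])
y = np.array([0, 0, 1, 1])
centroids = fit_centroids(X, y)
print(predict(X, centroids))  # expected: [0 0 1 1]
```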


Related research

09/23/2010
A hybrid learning algorithm for text classification
Text classification is the process of classifying documents into predefi...

09/23/2022
A Jensen-Shannon Divergence Based Loss Function for Bayesian Neural Networks
Kullback-Leibler (KL) divergence is widely used for variational inferenc...

05/08/2019
Naive Bayes with Correlation Factor for Text Classification Problem
Naive Bayes estimator is widely used in text classification problems. Ho...

07/06/2016
Bag of Tricks for Efficient Text Classification
This paper explores a simple and efficient baseline for text classificat...

06/11/2020
Symmetric-Approximation Energy-Based Estimation of Distribution (SEED): A Continuous Optimization Algorithm
Estimation of Distribution Algorithms (EDAs) maintain and iteratively up...

06/03/2020
Exploiting Class Labels to Boost Performance on Embedding-based Text Classification
Text classification is one of the most frequent tasks for processing tex...

03/16/2018
Corpus Statistics in Text Classification of Online Data
Transformation of Machine Learning (ML) from a boutique science to a gen...
