Sparse Topical Coding

02/14/2012
by Jun Zhu, et al.

We present sparse topical coding (STC), a non-probabilistic formulation of topic models for discovering latent representations of large collections of data. Unlike probabilistic topic models, STC relaxes the normalization constraint on admixture proportions and the requirement of defining a normalized likelihood function. These relaxations make it possible to: 1) directly control the sparsity of the inferred representations with sparsity-inducing regularizers; 2) seamlessly integrate STC with a convex error function (e.g., the SVM hinge loss) for supervised learning; and 3) learn the model efficiently with a simply structured coordinate descent algorithm. Our results demonstrate the advantages of STC and its supervised extension, MedSTC, in identifying the topical meanings of words and in improving classification accuracy and time efficiency.
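
As a rough illustration of the unsupervised coding step, the sketch below infers a sparse, non-negative code for one document's word counts against a fixed topical dictionary by cyclic coordinate descent with an l1 penalty. It is only a sketch of the idea, not the authors' implementation: it substitutes a squared-error loss for the paper's log-Poisson word-count loss, the dictionary is random, and all names and parameter values below are hypothetical.

# Minimal STC-style sketch: sparse non-negative coding by coordinate
# descent. Assumptions (not from the paper): squared-error loss instead
# of the log-Poisson loss, a fixed random dictionary, hypothetical names.
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator, the closed-form l1 proximal step."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def stc_code(w, beta, lam=0.1, n_iters=50):
    """Infer a sparse, non-negative code s for one document.

    w    : (V,) word-count vector for the document
    beta : (K, V) topical dictionary (rows are topics)
    lam  : l1 regularization weight controlling sparsity

    Minimizes ||w - s @ beta||^2 + lam * ||s||_1 subject to s >= 0
    by cyclic coordinate descent over the K code entries.
    """
    K, V = beta.shape
    s = np.zeros(K)
    for _ in range(n_iters):
        for k in range(K):
            # Residual with topic k's current contribution removed.
            r = w - s @ beta + s[k] * beta[k]
            # Exact coordinate minimizer of the quadratic term, followed
            # by l1 shrinkage and projection onto the non-negative orthant.
            denom = beta[k] @ beta[k]
            s[k] = max(soft_threshold(r @ beta[k], lam / 2) / denom, 0.0)
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K, V = 10, 200
    beta = rng.random((K, V))                    # hypothetical dictionary
    w = rng.poisson(2.0, size=V).astype(float)   # toy word counts
    s = stc_code(w, beta)
    print("nonzero code entries:", np.count_nonzero(s), "of", K)

The soft-thresholding step is what produces sparsity: raising lam drives more code entries exactly to zero, which is the direct sparsity control via regularizers that the abstract refers to.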

Related research

- Managing sparsity, time, and quality of inference in topic models (10/26/2012)
- Learning Sparsity of Representations with Discrete Latent Variables (04/03/2023)
- Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization (05/09/2019)
- Gibbs Max-margin Topic Models with Data Augmentation (10/10/2013)
- Optimization with Sparsity-Inducing Penalties (08/03/2011)
- When are Overcomplete Topic Models Identifiable? Uniqueness of Tensor Tucker Decompositions with Structured Sparsity (08/13/2013)
- Transfer learning extensions for the probabilistic classification vector machine (07/11/2020)
