Regularized K-means through hard-thresholding

10/02/2020
by Jakob Raymaekers, et al.

We study a framework of regularized K-means methods based on direct penalization of the size of the cluster centers. Different penalization strategies are considered and compared through simulation and theoretical analysis. Based on the results, we propose HT K-means, which uses an ℓ_0 penalty to induce sparsity in the variables. Different techniques for selecting the tuning parameter are discussed and compared. The proposed method compares favorably with the most popular regularized K-means methods in an extensive simulation study. Finally, HT K-means is applied to several real data examples, together with graphical displays that provide additional insight into these datasets.
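To make the idea of penalizing cluster centers concrete, here is a minimal sketch of an ℓ_0-regularized K-means iteration. It assumes standardized data and a Lloyd-style alternation in which each center coordinate is hard-thresholded after the usual mean update; the function name ht_kmeans_sketch, the penalty parameter lam, and the per-coordinate thresholding rule are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch of L0-penalized ("hard-thresholding") K-means.
# Assumptions: X is standardized (mean 0, unit scale), and the penalty
# lam * (number of nonzero center coordinates) is handled by hard-
# thresholding each center coordinate after the mean update. The exact
# penalty and update used by HT K-means may differ; this is illustrative.
import numpy as np

def ht_kmeans_sketch(X, k, lam, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    centers = X[rng.choice(n, size=k, replace=False)].astype(float).copy()

    for _ in range(n_iter):
        # Assignment step: nearest center in squared Euclidean distance.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)

        # Update step: cluster means followed by hard-thresholding.
        for j in range(k):
            members = X[labels == j]
            if len(members) == 0:
                continue  # keep the previous center for an empty cluster
            m = members.mean(axis=0)
            # Setting coordinate v of this center to zero increases the
            # within-cluster sum of squares by n_j * m[v]**2; keep it
            # nonzero only if that increase exceeds the penalty lam.
            keep = len(members) * m ** 2 > lam
            centers[j] = np.where(keep, m, 0.0)

    # A variable is effectively removed when all k centers are zero in it.
    active_variables = np.any(centers != 0.0, axis=0)
    return centers, labels, active_variables
```

With lam = 0 this reduces to ordinary Lloyd iterations; increasing lam zeroes out more center coordinates, and a variable whose coordinate is zero in every center no longer influences the assignments, which is how an ℓ_0 penalty on the centers can induce sparsity in the variables.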
