Spectral information criterion for automatic elbow detection

08/17/2023
by L. Martino, et al.

We introduce a generalized information criterion that contains other well-known information criteria, such as the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC), as special cases. Furthermore, the proposed Spectral Information Criterion (SIC) is more general than these other criteria since, for instance, knowledge of a likelihood function is not strictly required. SIC extracts geometric features of the error curve and, as a consequence, can be regarded as an automatic elbow detector. SIC returns a subset of all possible models, with a cardinality that is often much smaller than the total number of candidates; the elements of this subset are the elbows of the error curve. A practical rule for selecting a unique model within the set of elbows is also suggested. Theoretical invariance properties of SIC are analyzed. Moreover, we test SIC in ideal scenarios, where it always provides the optimal expected result. We also test SIC in several numerical experiments: some involving synthetic data, and two involving real datasets. These experiments cover real-world applications such as clustering, variable selection, and polynomial order selection, to name a few. The results show the benefits of the proposed scheme. Matlab code related to the experiments is also provided. Possible future research lines are discussed at the end.
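The abstract frames SIC as an automatic elbow detector over the error-versus-complexity curve. As a purely illustrative aid (not the authors' SIC construction, whose details are in the paper and its Matlab code), the hypothetical Python sketch below shows one generic way of scoring elbows geometrically: each point of a normalized error curve is ranked by its distance to the chord joining the curve's endpoints, and the top-ranked points form a small candidate subset of models. All names here (`elbow_candidates`, the example errors) are assumptions for illustration only.

```python
# Generic elbow detection on an error curve: a minimal, hypothetical sketch.
# This is NOT the SIC procedure from the paper; it only illustrates the idea
# of returning the model indices where the error curve bends most sharply.
import numpy as np

def elbow_candidates(errors, top=3):
    """Rank models by the perpendicular distance of each point of the
    normalized error curve to the chord joining its endpoints; the largest
    distances correspond to the most pronounced elbows."""
    e = np.asarray(errors, dtype=float)
    k = np.arange(len(e), dtype=float)
    # Normalize both axes so the geometry is scale-invariant
    # (assumes a decreasing error curve: first error maps to 1, last to 0).
    k = (k - k[0]) / (k[-1] - k[0])
    e = (e - e[-1]) / (e[0] - e[-1])
    # Distance of each point (k, e) to the line through the endpoints.
    dx, dy = k[-1] - k[0], e[-1] - e[0]
    d = np.abs(dx * (e - e[0]) - dy * (k - k[0])) / np.hypot(dx, dy)
    # Indices of the strongest elbows, largest distance first.
    return np.argsort(d)[::-1][:top]

# Example: a decreasing error curve with a clear elbow at index 3.
errs = [10.0, 6.0, 3.5, 2.0, 1.8, 1.7, 1.65]
print(elbow_candidates(errs))  # index 3 is ranked first
```

In this toy example the error stops improving noticeably after the fourth model, so index 3 is ranked first; a practical rule (as the paper suggests for SIC) would then pick a single model from this short list of candidates.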


