Sample-efficient L0-L2 constrained structure learning of sparse Ising models

12/03/2020
by Antoine Dedieu, et al.

We consider the problem of learning the underlying graph of a sparse Ising model with p nodes from n i.i.d. samples. The most recent and best-performing approaches combine an empirical loss (the logistic regression loss or the interaction screening loss) with a regularizer (an L1 penalty or an L1 constraint). This yields a convex problem that can be solved separately for each node of the graph. In this work, we instead leverage an L0 cardinality constraint, which is known to directly induce sparsity, and combine it with an L2 norm to better model the non-zero coefficients. We show that our proposed estimators achieve an improved sample complexity, both (a) theoretically, by reaching new state-of-the-art upper bounds for recovery guarantees, and (b) empirically, by showing sharper phase transitions between poor and full recovery for graph topologies studied in the literature, when compared to their L1-based counterparts.
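The node-wise estimation scheme described above can be sketched in code. The following is an illustrative implementation, not the authors' code: it solves an L0-constrained, L2-regularized logistic regression for a single node by iterative hard thresholding (projected gradient descent), which is one standard way to handle an L0 cardinality constraint. The function name `l0l2_logistic_iht`, the step size, and the solver choice are all assumptions made for this sketch.

```python
import numpy as np

def l0l2_logistic_iht(X, y, k, lam2=0.01, step=0.1, n_iter=500):
    """Sketch of one node-wise subproblem: logistic regression loss
    plus an L2 (ridge) term, subject to an L0 constraint ||w||_0 <= k,
    solved by iterative hard thresholding (projected gradient descent)."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ w)))        # sigmoid predictions
        grad = X.T @ (prob - y) / n + lam2 * w        # logistic + ridge gradient
        w = w - step * grad                           # gradient step
        # L0 projection: zero out all but the k largest-magnitude coefficients
        w[np.argsort(np.abs(w))[: p - k]] = 0.0
    return w

# Synthetic demo: regress one node's spin on the other nodes' spins.
rng = np.random.default_rng(0)
n, p, k = 2000, 10, 2
w_true = np.zeros(p)
w_true[0], w_true[1] = 3.0, -3.0                      # two true neighbors
X = rng.choice([-1.0, 1.0], size=(n, p))              # i.i.d. +/-1 spins
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ w_true)))).astype(float)
w_hat = l0l2_logistic_iht(X, y, k)
```

Running the estimator once per node and taking the nonzero coefficients as that node's neighborhood recovers the graph; here the returned `w_hat` has at most k nonzeros by construction.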


