Independently Interpretable Lasso: A New Regularizer for Sparse Regression with Uncorrelated Variables

11/06/2017
by Masaaki Takada, et al.

Sparse regularization such as ℓ_1 regularization is a powerful and widely used strategy for high-dimensional learning problems. The effectiveness of sparse regularization has been supported both practically and theoretically by several studies. However, one of the biggest issues in sparse regularization is that its performance is quite sensitive to correlations between features. Ordinary ℓ_1 regularization often selects variables that are correlated with each other, which deteriorates not only the generalization error but also the interpretability. In this paper, we propose a new regularization method, "Independently Interpretable Lasso" (IILasso for short). Our proposed regularizer suppresses the selection of correlated variables, so that each active variable affects the response independently in the model. Hence, we can interpret the regression coefficients intuitively and also improve performance by avoiding overfitting. We analyze the theoretical properties of IILasso and show that the proposed method is advantageous for sign recovery and achieves an almost minimax optimal convergence rate. Synthetic and real data analyses also indicate the effectiveness of IILasso.
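To make the idea concrete, here is a minimal coordinate-descent sketch of a regularizer in this spirit: the usual lasso objective plus a term that penalizes the product |β_j||β_k| weighted by the absolute correlation between features j and k, so that correlated variables resist being active simultaneously. The choice of weight matrix R (absolute sample correlations) and the parameter names `lam`/`alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def iilasso(X, y, lam=0.1, alpha=1.0, n_iter=200):
    """Coordinate-descent sketch of a correlation-suppressing lasso.

    Minimizes (1/2n)||y - X b||^2 + lam * ||b||_1
              + (lam * alpha / 2) * sum_{j != k} R[j, k] |b_j| |b_k|,
    where R[j, k] = |corr(X_j, X_k)| (an assumed choice of weights).
    With other coordinates fixed, the penalty on |b_j| is linear, so
    each update is a soft-thresholding step whose threshold grows with
    the correlated coordinates that are already active.
    """
    n, p = X.shape
    # Standardize columns so correlations are inner products / n.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    R = np.abs(Xs.T @ Xs) / n
    np.fill_diagonal(R, 0.0)        # no self-penalty
    col_sq = (Xs ** 2).sum(axis=0) / n
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = y - Xs @ b + Xs[:, j] * b[j]
            rho = Xs[:, j] @ r / n
            # Effective threshold: base lam plus correlation penalty.
            thr = lam * (1.0 + alpha * (R[j] @ np.abs(b)))
            b[j] = np.sign(rho) * max(abs(rho) - thr, 0.0) / col_sq[j]
    return b
```

Setting `alpha=0` recovers ordinary lasso coordinate descent; increasing `alpha` raises the price of keeping two correlated features active at once, which is the mechanism the abstract describes.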

