Elastic Gradient Descent and Elastic Gradient Flow: LARS Like Algorithms Approximating the Solution Paths of the Elastic Net

02/04/2022
by   Oskar Allerbo, et al.

The elastic net combines lasso and ridge regression, fusing the sparsity property of lasso with the grouping property of ridge regression. Connections between ridge regression and gradient descent, and between lasso and forward stagewise regression, have previously been established. Here we combine gradient descent and forward stagewise regression into elastic gradient descent. We investigate the case of infinitesimal step size, obtaining a piecewise analytical solution similar to the LARS algorithm. We also compare elastic gradient descent and the elastic net on real and simulated data, showing that they provide similar solution paths. Compared to the elastic net, elastic gradient descent is more robust to the choice of penalization with respect to which parameters are included in the model. Elastic gradient descent also provides significantly better sensitivity and mean squared prediction error, without sacrificing specificity.
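To make the idea concrete, here is a minimal sketch of one update step that blends the two ingredients the abstract names: a full gradient-descent step (the ridge-like component) and a forward-stagewise step that moves only the coordinate with the largest absolute gradient (the lasso-like component). The function name `elastic_gradient_step`, the mixing parameter `alpha`, and the specific way the two components are combined are illustrative assumptions, not the paper's exact formulation; the paper's algorithm is derived for infinitesimal step size, whereas this sketch uses a fixed finite step.

```python
import numpy as np

def elastic_gradient_step(beta, X, y, step=1e-2, alpha=0.5):
    """One hypothetical elastic-gradient-descent-style update for
    least-squares regression. alpha=1 gives plain gradient descent
    (ridge-like); alpha=0 gives forward stagewise regression
    (lasso-like); intermediate alpha blends the two."""
    # Negative gradient of 0.5 * ||y - X beta||^2 is X^T (y - X beta)
    g = X.T @ (y - X @ beta)

    # Forward-stagewise component: keep only the coordinate of the
    # gradient with the largest magnitude, zero out the rest
    fs = np.zeros_like(beta)
    j = int(np.argmax(np.abs(g)))
    fs[j] = g[j]

    # Blend the full-gradient and stagewise directions
    return beta + step * (alpha * g + (1 - alpha) * fs)
```

For small enough `step`, the blended direction is a descent direction for any `alpha` in [0, 1], since both components have nonnegative inner product with the gradient; intermediate values of `alpha` update all coordinates while still favoring the currently most correlated one.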

