Analysis of Generalized Bregman Surrogate Algorithms for Nonsmooth Nonconvex Statistical Learning

12/16/2021
by Yiyuan She, et al.

Modern statistical applications often involve minimizing an objective function that may be nonsmooth and/or nonconvex. This paper focuses on a broad Bregman-surrogate algorithm framework that includes local linear approximation, mirror descent, iterative thresholding, DC programming, and many others as particular instances. The recharacterization via generalized Bregman functions enables us to construct suitable error measures and establish global convergence rates for nonconvex and nonsmooth objectives in possibly high dimensions. For sparse learning problems with a composite objective, under some regularity conditions, the obtained estimators, as the surrogate's fixed points, though not necessarily local minimizers, enjoy provable statistical guarantees, and the sequence of iterates can be shown to approach the statistical truth within the desired accuracy geometrically fast. The paper also studies how to design adaptive momentum-based accelerations without assuming convexity or smoothness by carefully controlling step size and relaxation parameters.
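As one concrete instance of the surrogate viewpoint (a minimal sketch for illustration, not the paper's exact algorithm), the snippet below implements iterative soft-thresholding for an l1-penalized least-squares problem: each step minimizes the surrogate obtained by linearizing the smooth loss at the current iterate and adding a quadratic proximity term. The design matrix X, response y, penalty level lam, and the Lipschitz-based step size are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    """Iterative soft-thresholding for min_b 0.5 * ||y - X b||^2 + lam * ||b||_1.

    Each iteration minimizes a surrogate: the smooth loss linearized at the
    current iterate plus a quadratic proximity term, so the output is a
    fixed point of the surrogate map rather than a certified local minimizer.
    """
    n, p = X.shape
    # Step size 1/L, with L the Lipschitz constant of the smooth-loss gradient.
    L = np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)                    # gradient of the smooth loss at b
        b = soft_threshold(b - grad / L, lam / L)   # surrogate-minimization (prox) step
    return b

# Toy usage example with synthetic sparse data (assumed for illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta_hat = ista(X, y, lam=5.0)
```

Swapping the quadratic proximity term for other generalized Bregman functions recovers further instances named in the abstract, such as mirror descent or local linear approximation for nonconvex penalties.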


Related research

05/10/2013 · Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
We provide novel theoretical results regarding local optima of regulariz...

06/19/2017 · On Quadratic Convergence of DC Proximal Newton Algorithm for Nonconvex Sparse Learning in High Dimensions
We propose a DC proximal Newton algorithm for solving nonconvex regulari...

06/20/2013 · Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
We provide theoretical analysis of the statistical and computational pro...

09/09/2021 · Coordinate Descent Methods for DC Minimization
Difference-of-Convex (DC) minimization, referring to the problem of mini...

05/11/2018 · Fast Rates of ERM and Stochastic Approximation: Adaptive to Error Bound Conditions
Error bound conditions (EBC) are properties that characterize the growth...

09/22/2020 · Improving Convergence for Nonconvex Composite Programming
High-dimensional nonconvex problems are popular in today's machine learn...

03/11/2015 · Optimal prediction for sparse linear models? Lower bounds for coordinate-separable M-estimators
For the problem of high-dimensional sparse linear regression, it is know...
