Sensitivity Analysis for Mirror-Stratifiable Convex Functions

by Jalal Fadili et al.

This paper provides a set of sensitivity analysis and activity identification results for a class of convex functions with a strong geometric structure, which we call "mirror-stratifiable". These functions are such that there is a bijection between a primal and a dual stratification of the space into partitioning sets, called strata. This pairing is crucial for tracking the strata that are identifiable by solutions of parametrized optimization problems or by iterates of optimization algorithms. This class of functions encompasses all regularizers routinely used in signal and image processing, machine learning, and statistics. We show that this "mirror-stratifiable" structure enjoys a nice sensitivity theory, allowing us to study the stability of solutions of optimization problems under small perturbations, as well as activity identification by first-order proximal splitting-type algorithms. Existing results in the literature typically assume that, under a non-degeneracy condition, the active set associated with a minimizer is stable under small perturbations and is identified in finite time by optimization schemes. In contrast, our results do not require any non-degeneracy assumption: as a consequence, the optimal active set is no longer necessarily stable, but we are able to track precisely the set of identifiable strata. We show that these results have crucial implications when solving challenging ill-posed inverse problems via regularization, a typical scenario in which the non-degeneracy condition is not fulfilled. Our theoretical results, illustrated by numerical simulations, allow us to characterize the instability behaviour of the regularized solutions by locating the set of all low-dimensional strata that can potentially be identified by these solutions.
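The phenomenon of activity identification mentioned in the abstract can be illustrated on a toy problem. The sketch below (not taken from the paper; all names, problem sizes, and parameter values are illustrative assumptions) runs proximal gradient descent (ISTA) on a small Lasso problem, where the strata of the l1 norm correspond to sign/support patterns, and records the support of the iterates. After finitely many iterations the support stops changing: the iterates have identified a low-dimensional stratum, even though they continue to converge within it.

```python
# Illustrative sketch (not the paper's code): support identification of
# proximal gradient (ISTA) iterates on a Lasso problem. The support of an
# iterate plays the role of the active stratum of the l1 regularizer.
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 8
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[:2] = [1.5, -2.0]                 # sparse ground truth: a 2-dim stratum
b = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.5                                # regularization strength (assumed)
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
step = 1.0 / L

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(p)
supports = []
for _ in range(500):
    grad = A.T @ (A @ x - b)             # gradient of the data-fidelity term
    x = soft_threshold(x - step * grad, step * lam)
    supports.append(tuple(np.flatnonzero(np.abs(x) > 1e-12)))

# The support (identified stratum) stabilizes after finitely many steps.
print("final support:", supports[-1])
print("iterations sharing the final support:",
      sum(s == supports[-1] for s in supports))
```

In the non-degenerate case the identified support coincides with that of the true minimizer; the paper's point is that, even without non-degeneracy, the set of strata that such iterates can end up in can still be precisely characterized.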




