Efficient Proximal Mapping Computation for Unitarily Invariant Low-Rank Inducing Norms

by Christian Grussler, et al.

Low-rank inducing unitarily invariant norms have been introduced to convexify problems with low-rank/sparsity constraints. They are the convex envelopes of a unitarily invariant norm and the indicator function of an upper-bounding rank constraint. The best-known member of this family is the nuclear norm. To solve optimization problems involving such norms with proximal splitting methods, efficient ways of evaluating the proximal mapping of the low-rank inducing norms are needed. This is known for the nuclear norm, but not for most other members of the low-rank inducing family. This work supplies a framework that reduces the proximal mapping evaluation to a nested binary search, in which each iteration requires the solution of a much simpler problem. This simpler problem can often be solved analytically, as demonstrated for the low-rank inducing Frobenius and spectral norms. Moreover, the framework allows us to compute the proximal mappings of compositions of these norms with increasing convex functions, as well as the projections onto their epigraphs. As an additional advantage, compositions of increasing convex functions and low-rank inducing norms can thus also be handled within proximal splitting methods.
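For the nuclear norm, the case the abstract cites as already solved, the proximal mapping has the well-known closed form of singular-value soft-thresholding. A minimal NumPy sketch (the function name `prox_nuclear` is our own; this illustrates the known baseline, not the paper's nested binary-search framework):

```python
import numpy as np

def prox_nuclear(Y, t):
    """Proximal mapping of t * (nuclear norm) at Y.

    Computed by soft-thresholding the singular values of Y:
    shrink each singular value by t and clip at zero.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - t, 0.0)   # soft-threshold the spectrum
    return (U * s_shrunk) @ Vt          # rebuild the (typically lower-rank) matrix
```

Because small singular values are clipped to zero, the output tends to have lower rank than the input, which is what makes this prox useful inside splitting methods such as Douglas-Rachford or ADMM.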








Code Repositories

Low-Rank Inducing Norms in Python
Low-Rank Inducing Norms in MATLAB