The Proxy Step-size Technique for Regularized Optimization on the Sphere Manifold

09/05/2022
by Fang Bai et al.

We give an effective solution to the regularized optimization problem g(x) + h(x), where x is constrained to the unit sphere ‖x‖_2 = 1. Here g(·) is a smooth cost with Lipschitz continuous gradient within the unit ball {x : ‖x‖_2 ≤ 1}, whereas h(·) is typically non-smooth but convex and absolutely homogeneous, e.g., norm regularizers and their combinations. Our solution is based on the Riemannian proximal gradient, using an idea we call the proxy step-size: a scalar variable that we prove is monotone with respect to the actual step-size within an interval. The proxy step-size exists ubiquitously for convex and absolutely homogeneous h(·), and determines the actual step-size and the tangent update in closed form, and hence the complete proximal gradient iteration. Based on these insights, we design a Riemannian proximal gradient method using the proxy step-size. We prove that our method converges to a critical point, guided by a line-search technique based on the cost g(·) alone. The proposed method can be implemented in a couple of lines of code. We show its usefulness by applying nuclear norm, ℓ_1 norm, and nuclear-spectral norm regularization to three classical computer vision problems. The improvements are consistent, as demonstrated by numerical experiments.
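The abstract does not spell out the closed-form proxy step-size, so the sketch below is not the authors' method: it is a generic projected proximal gradient iteration on the sphere for an ℓ_1 regularizer h(x) = λ‖x‖_1, using soft-thresholding in the ambient space, a normalization retraction, and a backtracking line search on the smooth cost g(·) only, echoing the abstract's g-only line-search idea. All function names, tolerances, and constants are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_sphere_l1(g, grad_g, lam, x0, t0=1.0, beta=0.5,
                        max_iter=500, tol=1e-10):
    """Generic sketch: minimize g(x) + lam*||x||_1 subject to ||x||_2 = 1."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        eg = grad_g(x)
        rg = eg - (x @ eg) * x            # Riemannian gradient: project onto tangent space
        t = t0
        while True:                        # backtracking on the smooth cost g(.) only
            p = soft_threshold(x - t * rg, t * lam)
            n = np.linalg.norm(p)
            if n > 1e-12:
                x_new = p / n              # retraction: renormalize back onto the sphere
                if g(x_new) <= g(x) - 1e-4 * t * (rg @ rg):
                    break                  # sufficient decrease in g achieved
            t *= beta
            if t < 1e-14:                  # step-size collapsed: treat x as stationary
                return x
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Usage (hypothetical problem): sparse smallest-eigenvector, g(x) = 0.5 x^T A x.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20)); A = A @ A.T
g = lambda x: 0.5 * x @ A @ x
grad_g = lambda x: A @ x
x_star = prox_grad_sphere_l1(g, grad_g, lam=0.1, x0=rng.standard_normal(20))
print(g(x_star) + 0.1 * np.abs(x_star).sum())
```

The soft-thresholding plus renormalization above merely stands in for the closed-form tangent update the paper derives via the proxy step-size; only the g-only backtracking is taken directly from the abstract.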
