Alternating minimization and alternating descent over nonconvex sets

09/13/2017
by Wooseok Ha, et al.

We analyze the performance of alternating minimization for loss functions optimized over two variables, where each variable may be restricted to lie in some potentially nonconvex constraint set. This type of setting arises naturally in high-dimensional statistics and signal processing, where the variables often reflect different structures or components within the signals being considered. Our analysis relies strongly on the notion of local concavity coefficients, recently proposed by Barber and Ha (2017) to measure and quantify the concavity of a general nonconvex set. Our results further reveal important distinctions between alternating and non-alternating methods. Since computing the exact alternating minimization steps may not be tractable for some problems, we also consider an inexact version of the algorithm and provide a set of sufficient conditions that ensure fast convergence of the inexact algorithm. We demonstrate our framework on several examples, including low rank + sparse decomposition and multitask regression, and provide numerical experiments to validate our theoretical results.
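To make the setting concrete, here is a minimal sketch (not the authors' code) of exact alternating minimization for the low rank + sparse decomposition example: the observation Y is modeled as a low-rank component plus a sparse component, and each block update exactly minimizes the squared-error loss over its nonconvex constraint set (a truncated SVD for the rank constraint, hard thresholding for the sparsity constraint). The rank r, sparsity level s, and all variable names below are illustrative assumptions, not part of the paper.

    # Sketch of alternating minimization for low rank + sparse decomposition,
    # assuming the target rank r and sparsity level s are known.
    import numpy as np

    def project_rank(M, r):
        """Best rank-r approximation of M (Eckart-Young, via truncated SVD)."""
        U, d, Vt = np.linalg.svd(M, full_matrices=False)
        return (U[:, :r] * d[:r]) @ Vt[:r]

    def project_sparse(M, s):
        """Keep the s largest-magnitude entries of M, zero out the rest."""
        out = np.zeros_like(M)
        idx = np.unravel_index(np.argsort(np.abs(M), axis=None)[-s:], M.shape)
        out[idx] = M[idx]
        return out

    def alternating_minimization(Y, r, s, n_iters=50):
        """Alternate exact minimization of 0.5*||Y - L - S||_F^2 over each block."""
        L = np.zeros_like(Y)
        S = np.zeros_like(Y)
        for _ in range(n_iters):
            # With S fixed, the minimizer over rank-r matrices is the truncated SVD of Y - S.
            L = project_rank(Y - S, r)
            # With L fixed, the minimizer over s-sparse matrices is hard thresholding of Y - L.
            S = project_sparse(Y - L, s)
        return L, S

    if __name__ == "__main__":
        # Illustrative usage: recover a planted low rank + sparse structure.
        rng = np.random.default_rng(0)
        L_true = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))
        S_true = project_sparse(10 * rng.standard_normal((50, 50)), 100)
        L_hat, S_hat = alternating_minimization(L_true + S_true, r=5, s=100, n_iters=100)
        print(np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true))

Because the squared-error loss is minimized exactly in each block, both updates reduce to projections; the inexact variant discussed in the abstract would replace these exact block minimizations with approximate ones.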
