Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
This work presents a general framework for solving low-rank and/or sparse matrix minimization problems, which may involve multiple non-smooth terms. The Iteratively Reweighted Least Squares (IRLS) method is a fast solver that smooths the objective function and minimizes it by alternately updating the variables and their weights. However, traditional IRLS can only solve a sparse-only or low-rank-only minimization problem with a squared loss or an affine constraint. This work generalizes IRLS to solve joint/mixed low-rank and sparse minimization problems, which are essential formulations for many tasks. As a concrete example, we solve the Schatten-p norm and ℓ_2,q-norm regularized Low-Rank Representation (LRR) problem by IRLS, and theoretically prove that the derived solution is a stationary point (globally optimal if p,q≥1). Our convergence proof of IRLS is more general than previous ones, which depend on the special properties of the Schatten-p norm and ℓ_2,q-norm. Extensive experiments on both synthetic and real data sets demonstrate that our IRLS is much more efficient.
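To illustrate the alternating "update weights, then solve a weighted least squares problem" scheme the abstract describes, here is a minimal sketch for the sparse-only case: a smoothed ℓ_p-regularized least squares problem. The function name, the smoothing parameter mu, and all numerical settings are illustrative assumptions, not taken from the paper; the paper's joint LRR solver additionally reweights singular values for the Schatten-p term.

```python
import numpy as np

def irls_sparse(A, b, lam=0.1, p=1.0, mu=1e-2, n_iter=100, tol=1e-6):
    """Hedged IRLS sketch for the smoothed problem
        min_z 0.5*||A z - b||^2 + lam * sum_i (z_i^2 + mu)^(p/2).
    Each iteration reweights the penalty and solves a weighted ridge
    regression in closed form.
    """
    AtA, Atb = A.T @ A, A.T @ b
    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # weight update from the current iterate: w_i = p*(z_i^2 + mu)^(p/2 - 1)
        w = p * (z ** 2 + mu) ** (p / 2.0 - 1.0)
        # variable update: solve (A^T A + lam*diag(w)) z = A^T b
        z_new = np.linalg.solve(AtA + lam * np.diag(w), Atb)
        if np.linalg.norm(z_new - z) < tol * max(1.0, np.linalg.norm(z)):
            z = z_new
            break
        z = z_new
    return z

# toy usage: recover a sparse vector from noisy linear measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100))
z_true = np.zeros(100)
z_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
b = A @ z_true + 0.01 * rng.standard_normal(60)
z_hat = irls_sparse(A, b, lam=0.05, p=0.8)
```

Each subproblem here is an unconstrained quadratic, which is why IRLS iterations are cheap; the smoothing term mu keeps the weights bounded when entries of z approach zero.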