Efficient Alternating Minimization with Applications to Weighted Low Rank Approximation

06/07/2023
by Zhao Song, et al.

Weighted low rank approximation is a fundamental problem in numerical linear algebra, and it has many applications in machine learning. Given a matrix M ∈ ℝ^{n × n}, a weight matrix W ∈ ℝ_{≥ 0}^{n × n}, and a parameter k, the goal is to output two matrices U, V ∈ ℝ^{n × k} such that ‖W ∘ (M - U V^⊤)‖_F is minimized, where ∘ denotes the Hadamard product. This problem is known to be NP-hard and even hard to approximate [RSW16]. Meanwhile, alternating minimization is a good heuristic for approximating weighted low rank approximation. The work [LLR16] shows that, under mild assumptions, alternating minimization does provide provable guarantees. In this work, we develop an efficient and robust framework for alternating minimization. For weighted low rank approximation, this improves the runtime of [LLR16] from n^2 k^2 to n^2 k. At the heart of our framework is a high-accuracy multiple response regression solver together with a robust analysis of alternating minimization.
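To make the objective concrete, here is a minimal sketch of the classical alternating minimization heuristic the abstract refers to: with U fixed, each row of V solves a small weighted least-squares problem, and vice versa. This is an illustrative baseline in NumPy, not the paper's accelerated high-accuracy solver; all function and variable names below are our own.

```python
import numpy as np

def weighted_alt_min(M, W, k, iters=20, seed=0):
    """Alternating minimization heuristic for min ||W o (M - U V^T)||_F.

    With U fixed, row j of V minimizes sum_i W[i,j] * (M[i,j] - U[i] . V[j])^2,
    a k-dimensional weighted least-squares problem (and symmetrically for U).
    """
    rng = np.random.default_rng(seed)
    n, m = M.shape
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((m, k))
    eye = 1e-10 * np.eye(k)  # tiny regularizer to keep the normal equations solvable
    for _ in range(iters):
        # Update each row of V: solve (U^T diag(w) U) v = U^T diag(w) M[:, j].
        for j in range(m):
            w = W[:, j]
            G = U.T @ (w[:, None] * U)
            b = U.T @ (w * M[:, j])
            V[j] = np.linalg.solve(G + eye, b)
        # Update each row of U symmetrically with V fixed.
        for i in range(n):
            w = W[i, :]
            G = V.T @ (w[:, None] * V)
            b = V.T @ (w * M[i, :])
            U[i] = np.linalg.solve(G + eye, b)
    return U, V
```

Each inner solve costs O(n k^2), so a full sweep is O(n^2 k^2) when done naively; the paper's contribution is a multiple response regression solver that brings the per-iteration cost down to O(n^2 k).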
