Fast and Sample-Efficient Federated Low Rank Matrix Recovery from Column-wise Linear and Quadratic Projections
This work studies the following problem and its magnitude-only extension: develop a federated solution to recover an n × q rank-r matrix, X^* = [x^*_1, x^*_2, ..., x^*_q], from m independent linear projections of each of its columns, i.e., from y_k := A_k x^*_k, k ∈ [q], where each y_k is an m-length vector. Even though low-rank recovery problems have been extensively studied over the last decade, this particular problem has received surprisingly little attention. Only two provable solutions with a reasonable sample complexity exist, and both are slow, have sub-optimal sample complexity, and cannot be federated efficiently. We introduce a novel gradient descent (GD) based solution, called GD-min, that needs only Ω((n + q) r^2 log(1/ϵ)) samples and O(mq nr log(1/ϵ)) time to obtain an ϵ-accurate estimate. Based on comparisons with other well-studied problems, this is the best achievable sample complexity guarantee for a non-convex solution to the above problem. The time complexity is nearly linear in the problem size and cannot be improved significantly either. Finally, in a federated setting, our solution has low communication cost and preserves the privacy of each node's data and of the corresponding column estimates.
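To make the measurement model and the style of algorithm concrete, the following is a minimal NumPy sketch of the setup y_k = A_k x^*_k and of an alternating GD iteration in the spirit described above (exact least-squares updates for the column coefficients, a gradient step with re-orthonormalization for the shared column span). All dimensions, the step size, and the random initialization are illustrative assumptions, not the paper's exact algorithm, which uses a spectral initialization and specific constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper).
n, q, r, m = 20, 30, 2, 60

# Ground-truth rank-r matrix X* = U* B* with orthonormal U*.
U_star = np.linalg.qr(rng.standard_normal((n, r)))[0]
B_star = rng.standard_normal((r, q))
X_star = U_star @ B_star

# Column-wise linear projections: y_k = A_k x*_k, one A_k per column.
A = [rng.standard_normal((m, n)) for _ in range(q)]
Y = np.column_stack([A[k] @ X_star[:, k] for k in range(q)])

# Alternating GD sketch: given the current span estimate U, each node k
# solves a small r-dimensional least-squares problem for b_k; the center
# then takes one gradient step on U and re-orthonormalizes via QR.
U = np.linalg.qr(rng.standard_normal((n, r)))[0]  # random init (illustrative)
for _ in range(300):
    # Per-column least squares: b_k = argmin_b ||A_k U b - y_k||_2.
    B = np.column_stack(
        [np.linalg.lstsq(A[k] @ U, Y[:, k], rcond=None)[0] for k in range(q)]
    )
    # Gradient of sum_k ||A_k U b_k - y_k||^2 with respect to U.
    grad = np.zeros((n, r))
    for k in range(q):
        resid = A[k] @ (U @ B[:, k]) - Y[:, k]
        grad += np.outer(A[k].T @ resid, B[:, k])
    # Conservative step size scaled by a smoothness estimate (assumption).
    eta = 0.4 / (m * np.linalg.norm(B, 2) ** 2)
    U = np.linalg.qr(U - eta * grad)[0]  # step, then project back to orthonormal

X_hat = U @ B
err = np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star)
```

In a federated deployment, the per-column least-squares solves and the per-column gradient terms stay at the nodes; only the n × r aggregate gradient and the shared U travel to and from the center, which is what keeps the communication cost low and the column estimates b_k private.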