On Approximation Guarantees for Greedy Low Rank Optimization

03/08/2017
by Rajiv Khanna, et al.

We provide new approximation guarantees for greedy low rank matrix estimation under the standard assumptions of restricted strong convexity and smoothness. Our analysis also uncovers previously unknown connections between low rank estimation and combinatorial optimization: our bounds are reminiscent of corresponding approximation bounds in submodular maximization. Additionally, we provide statistical recovery guarantees. Finally, we present an empirical comparison of greedy estimation with established baselines on two important real-world problems.
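The abstract does not spell out the estimation procedure, but a common greedy low rank scheme adds one rank-one component per iteration, chosen as the top singular pair of the negative gradient, followed by a fully corrective refit of the component weights. The sketch below assumes a squared loss over observed entries; the function greedy_low_rank, the mask convention, and all variable names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def greedy_low_rank(Y, mask, rank):
    """Greedy rank-one pursuit sketch (illustrative, not the paper's exact method).

    Minimizes 0.5 * ||mask * (X - Y)||_F^2 by adding, at each step, the
    rank-one direction given by the top singular pair of the negative
    gradient, then refitting the weights of all selected components.
    """
    m, n = Y.shape
    X = np.zeros((m, n))
    atoms = []  # selected rank-one components u v^T

    for _ in range(rank):
        grad = mask * (X - Y)                       # gradient of the squared loss
        U, _, Vt = np.linalg.svd(-grad, full_matrices=False)
        atoms.append(np.outer(U[:, 0], Vt[0, :]))   # steepest rank-one direction

        # Fully corrective step: least-squares refit of the component
        # weights on the observed entries only.
        A = np.stack([a[mask] for a in atoms], axis=1)
        w, *_ = np.linalg.lstsq(A, Y[mask], rcond=None)
        X = sum(wi * ai for wi, ai in zip(w, atoms))

    return X

# Example usage on a synthetic rank-5 matrix with ~30% observed entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 40))
mask = rng.random(M.shape) < 0.3
X_hat = greedy_low_rank(M, mask, rank=5)
```

The fully corrective refit is one of several possible inner steps; a cheaper variant simply adds each rank-one atom with its singular value as the weight.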


