Robust Matrix Decomposition with Outliers

11/05/2010
by Daniel Hsu, et al.

Suppose a given observation matrix can be decomposed as the sum of a low-rank matrix and a sparse matrix (outliers), and the goal is to recover these individual components from the observed sum. Such additive decompositions have applications in a variety of numerical problems including system identification, latent variable graphical modeling, and principal components analysis. We study conditions under which recovering such a decomposition is possible via a combination of ℓ_1 norm and trace norm minimization. We are specifically interested in the question of how many outliers are allowed so that convex programming can still achieve accurate recovery, and we obtain stronger recovery guarantees than previous studies. Moreover, we do not assume that the spatial pattern of outliers is random, which stands in contrast to related analyses under such assumptions via matrix completion.

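The convex program described above, minimizing a weighted sum of the trace norm of the low-rank component and the ℓ_1 norm of the sparse component subject to the two summing to the observation, can be sketched with a generic alternating solver. The sketch below is an illustrative ADMM-style implementation of that style of program, not the paper's algorithm or analysis; the function name robust_decompose is invented here, and the defaults lam = 1/sqrt(max(m, n)) and the step size mu are common heuristics from the robust PCA literature rather than values prescribed by this paper.

```python
import numpy as np

def robust_decompose(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Sketch: solve  min ||L||_* + lam * ||S||_1  s.t.  L + S = M  via ADMM.

    L is the low-rank estimate, S the sparse (outlier) estimate.
    The lam and mu defaults are common heuristics, not values from the paper.
    """
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))       # standard robust-PCA heuristic
    if mu is None:
        mu = 0.25 * m * n / np.abs(M).sum()  # augmented-Lagrangian step size
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                     # dual variable for L + S = M
    for _ in range(max_iter):
        # L-update: singular value thresholding of M - S + Y/mu
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # S-update: entrywise soft thresholding (prox of lam * ||.||_1)
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # dual ascent on the equality constraint
        Y += mu * (M - L - S)
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S

if __name__ == "__main__":
    # Toy check: rank-2 matrix corrupted by ~5% large sparse outliers
    rng = np.random.default_rng(0)
    L0 = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 100))
    S0 = np.zeros((100, 100))
    mask = rng.random((100, 100)) < 0.05
    S0[mask] = 10.0 * rng.standard_normal(mask.sum())
    L_hat, S_hat = robust_decompose(L0 + S0)
    print(np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
```

Note that the paper's contribution concerns when the minimizer of such a program provably recovers the true low-rank/sparse pair (in particular, how many outliers are tolerable without assuming a random outlier pattern); a solver like the one above only computes a minimizer.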

Related research:

- Analysis of Nuclear Norm Regularization for Full-rank Matrix Completion (04/26/2015)
- Universal Matrix Completion (02/10/2014)
- Low-Rank Matrix Recovery from Noisy via an MDL Framework-based Atomic Norm (09/17/2020)
- Sparse plus low-rank identification for dynamical latent-variable graphical AR models (07/21/2023)
- Incoherence-Optimal Matrix Completion (10/01/2013)
- Pseudo-Bayesian Robust PCA: Algorithms and Analyses (12/07/2015)
- Representation Learning and Recovery in the ReLU Model (03/12/2018)
