Dual Principal Component Pursuit

10/15/2015
by Manolis C. Tsakiris, et al.

We consider the problem of outlier rejection in single subspace learning. Classical approaches work with a direct representation of the subspace and are thus efficient when the subspace dimension is small. Our approach works with a dual representation of the subspace and hence aims to find its orthogonal complement; as such, it is particularly suitable for subspaces whose dimension is very close to the ambient dimension (subspaces of high relative dimension). We pose the problem of computing normal vectors to the subspace as a non-convex ℓ_1 minimization problem on the sphere, which we call the Dual Principal Component Pursuit (DPCP) problem. We provide theoretical guarantees under which every global solution of DPCP is a vector in the orthogonal complement of the inlier subspace. Moreover, we relax the non-convex DPCP problem to a recursion of linear programming problems, which, as we show, converges in a finite number of steps to a vector orthogonal to the subspace. In particular, when the inlier subspace is a hyperplane, the linear programming recursion converges in a finite number of steps to the global minimum of the non-convex DPCP problem. We also propose algorithms based on alternating minimization and Iteratively Reweighted Least Squares that are suitable for dealing with large-scale data. Extensive experiments on synthetic data show that the proposed methods can handle more outliers and higher relative dimensions than state-of-the-art methods, while experiments with real face and object images show that our DPCP-based methods are competitive with the state of the art.
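The DPCP problem described above seeks a unit vector b minimizing ‖X^⊤b‖_1, where the columns of X are the data points; when the inliers span a hyperplane, such a b is the hyperplane's normal. The following is a minimal sketch of an Iteratively Reweighted Least Squares approach to this objective (the large-scale variant mentioned in the abstract). The function name, initialization, stopping rule, and parameter values are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def dpcp_irls(X, max_iter=100, delta=1e-9, tol=1e-12):
    """IRLS sketch for min_{||b||=1} ||X^T b||_1.

    X: D x N matrix whose columns are unit-normalized data points.
    Returns a unit vector b intended to be normal to the inlier subspace.
    Note: names and defaults here are illustrative, not from the paper.
    """
    # Initialize with the direction of least variance (last left singular vector).
    U, _, _ = np.linalg.svd(X, full_matrices=True)
    b = U[:, -1]
    for _ in range(max_iter):
        # Reweight: points nearly orthogonal to b get large weight.
        w = 1.0 / np.maximum(np.abs(X.T @ b), delta)
        # Weighted scatter matrix sum_j w_j x_j x_j^T (X * w scales columns).
        C = (X * w) @ X.T
        # The weighted least-squares step: smallest eigenvector of C.
        _, V = np.linalg.eigh(C)
        b_new = V[:, 0]
        if 1.0 - abs(b_new @ b) < tol:  # converged up to sign
            return b_new
        b = b_new
    return b

# Toy demo: inliers on the hyperplane x_D = 0 in R^4, plus outliers.
rng = np.random.default_rng(0)
D, n_in, n_out = 4, 200, 30
normal = np.zeros(D)
normal[-1] = 1.0                       # ground-truth normal e_D
inliers = rng.standard_normal((D, n_in))
inliers[-1] = 0.0                      # force inliers onto the hyperplane
outliers = rng.standard_normal((D, n_out))
X = np.concatenate([inliers, outliers], axis=1)
X /= np.linalg.norm(X, axis=0)         # normalize columns to the sphere
b = dpcp_irls(X)
print(abs(b @ normal))                 # near 1: b aligns with the true normal
```

Each IRLS iteration replaces the ℓ_1 objective by a weighted ℓ_2 one, whose minimizer on the sphere is the smallest eigenvector of the weighted scatter matrix; inliers, being nearly orthogonal to a good b, receive large weights and pin down the normal direction.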

