An Algebraic-Geometric Approach to Shuffled Linear Regression

10/12/2018, by Manolis C. Tsakiris, et al.

Shuffled linear regression is the problem of performing a linear regression fit to a dataset for which the correspondences between the independent samples and the observations are unknown. Such a problem arises in diverse domains such as computer vision, communications and biology. In its simplest form, it is tantamount to solving a linear system of equations in which the entries of the right-hand-side vector have been permuted. This type of data corruption renders the linear regression task considerably harder, even in the absence of other corruptions such as noise, outliers or missing entries. Existing methods are either applicable only to noiseless data or are very sensitive to initialization and work only for partially shuffled data. In this paper we address both of these issues via an algebraic-geometric approach, which uses symmetric polynomials to extract permutation-invariant constraints that the parameters x∈R^n of the linear regression model must satisfy. This naturally leads to a polynomial system of n equations in n unknowns, which contains x in its root locus. Using the machinery of algebraic geometry we prove that, as long as the independent samples are generic, this polynomial system is always consistent with at most n! complex roots, regardless of any type of corruption inflicted on the observations. The algorithmic implication of this fact is that one can always solve this polynomial system and use its most suitable root as initialization to the Expectation-Maximization algorithm. To the best of our knowledge, the resulting method is, for small values of n, the first working solution able to handle thousands of fully shuffled noisy observations in milliseconds.
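The key observation can be illustrated with a short sketch. Power sums are symmetric polynomials, so for noiseless data the quantities Σᵢ(aᵢᵀx)ᵏ are unchanged by any permutation of the observations; equating them with Σᵢyᵢᵏ for k = 1, …, n yields the n-equation polynomial system described above. The snippet below (a minimal illustration, not the paper's full pipeline; the data and variable names are invented for the example) verifies that the true parameter vector satisfies these constraints even when y is fully shuffled:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 2
A = rng.standard_normal((m, n))        # independent samples (rows a_i)
x_true = rng.standard_normal(n)        # ground-truth regression parameters
y = A @ x_true                          # noiseless observations
y_shuffled = rng.permutation(y)         # correspondences destroyed

def residuals(x, A, y, n):
    """Permutation-invariant constraints via power sums:
    sum_i (a_i^T x)^k - sum_i y_i^k for k = 1..n."""
    z = A @ x
    return np.array([np.sum(z**k) - np.sum(y**k) for k in range(1, n + 1)])

# x_true is a root of the polynomial system despite the shuffling.
print(np.allclose(residuals(x_true, A, y_shuffled, n), 0.0))
```

In practice one would solve this n-by-n polynomial system (e.g. with a numerical algebraic geometry or homotopy continuation solver) and, as the abstract describes, use the most suitable of its at most n! roots to initialize Expectation-Maximization on the noisy data.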

Related research

Stochastic EM for Shuffled Linear Regression (04/02/2018)
We consider the problem of inference in a linear regression model in whi...

High Dimensional Linear Regression using Lattice Basis Reduction (03/18/2018)
We consider a high dimensional linear regression problem where the goal ...

A spectral algorithm for robust regression with subgaussian rates (07/12/2020)
We study a new linear up to quadratic time algorithm for linear regressi...

Unlabeled Principal Component Analysis (01/23/2021)
We consider the problem of principal component analysis from a data matr...

Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection (10/24/2019)
We focus on the high-dimensional linear regression problem, where the al...

Linear Regression without Correspondences via Concave Minimization (03/17/2020)
Linear regression without correspondences concerns the recovery of a sig...

A Pseudo-Likelihood Approach to Linear Regression with Partially Shuffled Data (10/03/2019)
Recently, there has been significant interest in linear regression in th...
