A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates

12/25/2013
by Yudong Chen, et al.

We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors in both the arbitrary-noise and stochastic-noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions our algorithm is information-theoretically optimal. Ours is the first tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.
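To make the setup concrete: in mixed regression with two components, each response is generated by one of two unknown regressors, with the component label hidden. The sketch below generates such data and fits it with a simple alternating-minimization baseline; this is an illustration of the problem only, not the paper's convex formulation, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the paper).
n, d = 400, 5

# Two ground-truth regressors; each sample comes from one of them.
beta1 = rng.standard_normal(d)
beta2 = rng.standard_normal(d)
betas = np.stack([beta1, beta2])

X = rng.standard_normal((n, d))
z = rng.integers(0, 2, size=n)                     # hidden component labels
y = np.einsum("ij,ij->i", X, betas[z]) + 0.01 * rng.standard_normal(n)

# Baseline recovery by alternating minimization (NOT the convex method):
# assign each sample to the regressor with the smaller residual, then refit.
b1, b2 = rng.standard_normal(d), rng.standard_normal(d)
for _ in range(50):
    m = (y - X @ b1) ** 2 <= (y - X @ b2) ** 2     # current assignment
    if m.sum() >= d:
        b1 = np.linalg.lstsq(X[m], y[m], rcond=None)[0]
    if (~m).sum() >= d:
        b2 = np.linalg.lstsq(X[~m], y[~m], rcond=None)[0]

# Recovery error, measured up to the inherent label swap.
err = min(
    np.linalg.norm(b1 - beta1) + np.linalg.norm(b2 - beta2),
    np.linalg.norm(b1 - beta2) + np.linalg.norm(b2 - beta1),
)
print(f"recovery error (up to label swap): {err:.3f}")
```

Unlike this nonconvex baseline, which can get stuck in bad local assignments depending on initialization, the paper's contribution is a convex program whose solution provably recovers the true pair with tight error bounds.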


Related research

09/03/2010  Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization
Relative to the large literature on upper bounds on complexity of convex...

03/19/2023  Lower Generalization Bounds for GD and SGD in Smooth Stochastic Convex Optimization
Recent progress was made in characterizing the generalization error of g...

01/21/2015  Minimax Optimal Sparse Signal Recovery with Poisson Statistics
We are motivated by problems that arise in a number of applications such...

06/25/2015  Minimax Structured Normal Means Inference
We provide a unified treatment of a broad class of noisy structure recov...

06/27/2021  Support Recovery for Orthogonal Matching Pursuit: Upper and Lower bounds
This paper studies the problem of sparse regression where the goal is to...

05/24/2016  Local Minimax Complexity of Stochastic Convex Optimization
We extend the traditional worst-case, minimax analysis of stochastic con...

02/07/2021  Lower Bounds and Accelerated Algorithms for Bilevel Optimization
Bilevel optimization has recently attracted growing interests due to its...
