Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise

06/28/2023
by Ilias Diakonikolas, et al.

We study the problem of PAC learning γ-margin halfspaces with Random Classification Noise. We establish an information-computation tradeoff suggesting an inherent gap between the sample complexity of the problem and the sample complexity of computationally efficient algorithms. Concretely, the sample complexity of the problem is Θ(1/(γ^2 ϵ)). We start by giving a simple efficient algorithm with sample complexity O(1/(γ^2 ϵ^2)). Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on 1/ϵ in the sample complexity is inherent for computationally efficient algorithms. Specifically, our results imply a lower bound of Ω(1/(γ^{1/2} ϵ^2)) on the sample complexity of any efficient SQ learner or low-degree test.
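The abstract does not spell out the simple efficient algorithm, but a standard approach for margin halfspaces under Random Classification Noise is one-pass averaged SGD on a convex surrogate loss, whose minimizer remains correlated with the target under symmetric label noise. The sketch below is an illustration under assumed choices (logistic surrogate, uniform-on-sphere examples, hypothetical parameters `d`, `gamma`, `eta`), not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: unit-norm target w*, examples on the unit sphere kept
# only if they satisfy the gamma-margin condition, and each label flipped
# independently with probability eta (Random Classification Noise).
d, n, gamma, eta = 20, 5000, 0.1, 0.2
w_star = np.zeros(d)
w_star[0] = 1.0

X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
margins = X @ w_star
keep = np.abs(margins) >= gamma          # enforce the gamma-margin condition
X, margins = X[keep], margins[keep]
y = np.sign(margins)
flip = rng.random(len(y)) < eta          # RCN: flip each label w.p. eta
y[flip] *= -1

# One-pass averaged SGD on the logistic surrogate log(1 + exp(-y w.x)).
w = np.zeros(d)
w_avg = np.zeros(d)
for t, (x, yi) in enumerate(zip(X, y), start=1):
    lr = 1.0 / np.sqrt(t)
    grad = -yi * x / (1.0 + np.exp(yi * (w @ x)))  # gradient of logistic loss
    w -= lr * grad
    w_avg += (w - w_avg) / t             # running average of the iterates

# Evaluate against the clean (noiseless) labels.
clean_err = np.mean(np.sign(X @ w_avg) != np.sign(margins))
print(f"clean error of learned halfspace: {clean_err:.3f}")
```

With roughly n ≈ 1/(γ²ϵ²) samples (up to constants), this style of analysis gives ϵ excess error despite the flipped labels, matching the upper bound quoted in the abstract; the lower bound then suggests the 1/ϵ² dependence cannot be improved by efficient SQ or low-degree algorithms.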


