Agnostic Learnability of Halfspaces via Logistic Loss

01/31/2022
by Ziwei Ji, et al.

We investigate approximation guarantees provided by logistic regression for the fundamental problem of agnostic learning of homogeneous halfspaces. Previously, for a certain broad class of "well-behaved" distributions on the examples, Diakonikolas et al. (2020) proved an Ω̃(OPT) lower bound, while Frei et al. (2021) proved an Õ(√(OPT)) upper bound, where OPT denotes the best zero-one/misclassification risk of a homogeneous halfspace. In this paper, we close this gap by constructing a well-behaved distribution such that the global minimizer of the logistic risk over this distribution only achieves Ω(√(OPT)) misclassification risk, matching the upper bound in (Frei et al., 2021). On the other hand, we also show that if we impose a radial-Lipschitzness condition in addition to well-behavedness on the distribution, logistic regression on a ball of bounded radius attains Õ(OPT) misclassification risk. Our techniques also show that for any well-behaved distribution, regardless of radial Lipschitzness, we can overcome the Ω(√(OPT)) lower bound for the logistic loss simply at the cost of one additional convex optimization step involving the hinge loss, and attain Õ(OPT) misclassification risk. This two-step convex optimization algorithm is simpler than previous methods obtaining this guarantee, all of which require solving O(log(1/OPT)) minimization problems.
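The two-step procedure can be made concrete with a small numerical sketch. The code below is a hypothetical instantiation, not the paper's exact algorithm: it runs projected gradient descent on the empirical logistic risk over a ball of radius R, then performs one additional convex optimization step on the hinge risk, warm-started from the logistic solution. The radius R, step sizes, iteration counts, and the synthetic noisy data are all illustrative assumptions.

```python
# A minimal, hypothetical sketch of a two-step convex-optimization procedure of the
# kind described above: (1) logistic regression constrained to a ball of radius R,
# (2) one additional convex optimization step using the hinge loss. The coupling
# between the two steps in the paper may differ; R, step sizes, and the synthetic
# data below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def project_ball(w, R):
    """Project w onto the Euclidean ball of radius R."""
    norm = np.linalg.norm(w)
    return w if norm <= R else w * (R / norm)

def logistic_risk_grad(w, X, y):
    """Gradient of the empirical logistic risk (1/n) sum_i log(1 + exp(-y_i <w, x_i>))."""
    margins = np.clip(y * (X @ w), -30.0, 30.0)   # clip for numerical stability
    weights = -y / (1.0 + np.exp(margins))
    return (X * weights[:, None]).mean(axis=0)

def hinge_risk_subgrad(w, X, y):
    """Subgradient of the empirical hinge risk (1/n) sum_i max(0, 1 - y_i <w, x_i>)."""
    active = (y * (X @ w) < 1.0).astype(float)
    return (X * (-y * active)[:, None]).mean(axis=0)

def two_step_halfspace(X, y, R=10.0, lr=0.1, iters=2000):
    d = X.shape[1]
    # Step 1: projected gradient descent on the logistic risk over a ball of radius R.
    w = np.zeros(d)
    for _ in range(iters):
        w = project_ball(w - lr * logistic_risk_grad(w, X, y), R)
    # Step 2: projected subgradient descent on the hinge risk, warm-started at w
    # (an illustrative way to realize the "one additional convex optimization step").
    v = w.copy()
    for t in range(1, iters + 1):
        v = project_ball(v - (lr / np.sqrt(t)) * hinge_risk_subgrad(v, X, y), R)
    return v / (np.linalg.norm(v) + 1e-12)   # homogeneous halfspace: only the direction matters

# Synthetic agnostic data: labels of a true halfspace with a fraction flipped (roughly OPT).
d, n = 5, 4000
w_star = np.ones(d) / np.sqrt(d)
X = rng.normal(size=(n, d))
y = np.sign(X @ w_star)
y[rng.random(n) < 0.05] *= -1
w_hat = two_step_halfspace(X, y)
print("empirical misclassification risk:", np.mean(np.sign(X @ w_hat) != y))
```

Constraining both steps to the same ball keeps each step a bounded convex problem; in practice, any off-the-shelf convex solver could replace the hand-rolled (sub)gradient loops.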


Related research

Efficient improper learning for online logistic regression (03/18/2020)
We consider the setting of online logistic regression and consider the r...

A Minimax Lower Bound for Low-Rank Matrix-Variate Logistic Regression (05/31/2021)
This paper considers the problem of matrix-variate logistic regression. ...

Generalization of ERM in Stochastic Convex Optimization: The Dimension Strikes Back (08/15/2016)
In stochastic convex optimization the goal is to minimize a convex funct...

Unifying Width-Reduced Methods for Quasi-Self-Concordant Optimization (07/06/2021)
We provide several algorithms for constrained optimization of a large cl...

Improved Bounds for Metric Capacitated Covering Problems (06/22/2020)
In the Metric Capacitated Covering (MCC) problem, given a set of balls ℬ...

Localization, Convexity, and Star Aggregation (05/19/2021)
Offset Rademacher complexities have been shown to imply sharp, data-depe...

Recursive Optimization of Convex Risk Measures: Mean-Semideviation Models (04/02/2018)
We develop and analyze stochastic subgradient methods for optimizing a n...
