'Local' vs. 'global' parameters -- breaking the gaussian complexity barrier

04/09/2015
by Shahar Mendelson et al.

We show that if F is an L-subgaussian convex class of functions, then the error rate of learning problems generated by independent noise is equivalent to a fixed point determined by 'local' covering estimates of the class, rather than by its Gaussian averages. To that end, we establish new sharp upper and lower estimates on the error rate for such problems.
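
For orientation, here is a minimal LaTeX sketch of the two kinds of parameter the abstract contrasts. It follows standard conventions from the empirical-processes literature rather than the paper's exact formulation; the localization by a ball of radius r, the noise level \(\sigma\), and the constants \(\eta\) and \(c\) below are illustrative assumptions. The 'global' parameter is the Gaussian average of the class,

\[
\ell^*(F) \;=\; \mathbb{E} \sup_{f \in F} \frac{1}{\sqrt{N}} \sum_{i=1}^{N} g_i f(X_i),
\qquad g_1, \dots, g_N \ \text{i.i.d. standard Gaussian},
\]

while a 'local' fixed point determined by covering estimates typically has the schematic form

\[
r^* \;=\; \inf\left\{ r > 0 \;:\; \sigma \sqrt{\frac{\log N\!\left(F \cap rD,\; \eta r\right)}{N}} \;\le\; c\, r^2 \right\},
\]

where \(N(H, \varepsilon)\) denotes the \(\varepsilon\)-covering number of \(H\) in \(L_2\), \(D\) is the unit \(L_2\) ball, and \(N\) is the sample size. The paper's claim, in these terms, is that for L-subgaussian convex classes the error rate is governed by a fixed point of this local kind rather than by \(\ell^*(F)\).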


Related research

10/13/2014 - Learning without Concentration for General Loss Functions
  We study prediction and estimation problems using empirical risk minimiz...

04/04/2019 - Optimal Rate-Exponent Region for a Class of Hypothesis Testing Against Conditional Independence Problems
  We study a class of distributed hypothesis testing against conditional i...

11/19/2022 - Upper and Lower Bounds on Bit-Error Rate for Convolutional Codes
  In this paper, we provide a new approach to the analytical estimation of...

12/29/2018 - Non-Asymptotic Chernoff Lower Bound and Its Application to Community Detection in Stochastic Block Model
  Chernoff coefficient is an upper bound of Bayes error probability in cla...

02/25/2015 - On aggregation for heavy-tailed classes
  We introduce an alternative to the notion of 'fast rate' in Learning The...

03/31/2022 - Learning from many trajectories
  We initiate a study of supervised learning from many independent sequenc...

01/13/2022 - Sharp estimates on random hyperplane tessellations
  We study the problem of generating a hyperplane tessellation of an arbit...
