S-Concave Distributions: Towards Broader Distributions for Noise-Tolerant and Sample-Efficient Learning Algorithms

03/22/2017
by   Maria-Florina Balcan, et al.

We provide new results concerning noise-tolerant and sample-efficient learning algorithms under s-concave distributions over R^n for -1/(2n+3) ≤ s ≤ 0. The new class of s-concave distributions is a broad and natural generalization of log-concavity and includes many important additional distributions, e.g., the Pareto and t-distributions. This class has been studied in the context of efficient sampling, integration, and optimization, but much remains unknown concerning the geometry of these distributions and their applications to learning. The challenge is that, unlike the distributions commonly used in learning (uniform or, more generally, log-concave distributions), this broader class is not closed under marginalization, and many of these distributions are fat-tailed. In this work, we introduce new convex geometry tools to study the properties of s-concave distributions and use them to bound quantities of interest to learning, including the probability of disagreement between two halfspaces, the disagreement outside a band, and the disagreement coefficient. We use these results to significantly generalize prior results for margin-based active learning, disagreement-based active learning, and passive learning of intersections of halfspaces. Our analysis of the geometric properties of s-concave distributions may be of independent interest to optimization more broadly.
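
For context, the notion being generalized can be stated as follows (a standard definition from the s-concavity literature; the abstract itself does not spell it out): a density f : R^n -> [0, ∞) is s-concave for s < 0 if, for all x, y in R^n and all λ in [0, 1],

    f(λx + (1−λ)y) ≥ (λ f(x)^s + (1−λ) f(y)^s)^{1/s}.

In the limit s -> 0 this condition becomes f(λx + (1−λ)y) ≥ f(x)^λ f(y)^{1−λ}, i.e., log-concavity, so every log-concave density is s-concave for every s ≤ 0, while more negative values of s permit heavier tails such as those of the Pareto and t-distributions.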
