On Communication Complexity of Classification Problems

11/16/2017
by Daniel M. Kane, et al.

This work introduces a model of distributed learning in the spirit of Yao's communication complexity model. We consider a two-party setting where each player gets a list of labelled examples and the players communicate in order to jointly perform some learning task. To fit naturally into the framework of learning theory, we allow the players to send each other labelled examples, where each example costs one unit of communication. This model can also be thought of as a distributed version of sample compression schemes. We study several fundamental questions in this model. For example, we define analogues of the complexity classes P, NP, and coNP, and show that in this model P equals the intersection of NP and coNP. The proof does not seem to follow from the analogous statement in classical communication complexity; in particular, it uses different techniques, including boosting and metric properties of VC classes. The framework also allows us to prove, in the context of distributed learning, unconditional separations between various learning settings, such as realizable versus agnostic learning and proper versus improper learning. These proofs are based on standard ideas from communication complexity, together with tools from learning theory and geometric constructions in Euclidean space. As a corollary, we obtain lower bounds that match the performance of algorithms from previous works on distributed classification.
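To make the model concrete, here is a minimal sketch (ours, not from the paper) of a two-party protocol for the class of one-dimensional thresholds f_t(x) = 1 iff x >= t, where the only messages allowed are labelled examples and each example sent costs one unit of communication. The function names `party_summary` and `run_protocol` are illustrative assumptions, not anything defined in the paper.

```python
# Illustrative sketch of the communication model: parties exchange
# *labelled examples*, each costing one unit. Assumes the sample is
# realizable by some threshold f_t(x) = 1 iff x >= t.

def party_summary(examples):
    """Compress a party's sample to at most two labelled examples:
    its largest negative point and its smallest positive point."""
    negs = [x for x, y in examples if y == 0]
    poss = [x for x, y in examples if y == 1]
    msg = []
    if negs:
        msg.append((max(negs), 0))
    if poss:
        msg.append((min(poss), 1))
    return msg  # at most 2 units of communication


def run_protocol(alice, bob):
    """Each party sends its compressed summary; both sides then agree on
    a threshold consistent with the union of the two samples."""
    transcript = party_summary(alice) + party_summary(bob)
    cost = len(transcript)  # communication measured in examples sent
    negs = [x for x, y in transcript if y == 0]
    poss = [x for x, y in transcript if y == 1]
    lo = max(negs) if negs else float("-inf")
    hi = min(poss) if poss else float("inf")
    # Cutting at the least positive example (or just above all negatives
    # if there are no positives) is consistent with every example held
    # by either party, not only with the transcript.
    threshold = hi if poss else lo + 1.0
    return threshold, cost


if __name__ == "__main__":
    # Both samples labelled by a hidden threshold t = 5.0
    alice = [(1.0, 0), (3.5, 0), (7.0, 1)]
    bob = [(4.9, 0), (5.2, 1), (9.0, 1)]
    t_hat, cost = run_protocol(alice, bob)
    print(f"agreed threshold ~ {t_hat}, communication = {cost} examples")
```

For thresholds, each party can compress its entire sample to at most two boundary examples, so the total cost is at most four examples regardless of sample size; this mirrors the sample-compression view of the model described above.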


Related research

07/08/2019 · Proof compression and NP versus PSPACE. Part 2
We upgrade [1] to a complete proof of the conjecture NP = PSPACE. [1]:...

09/08/2019 · Convex Set Disjointness, Distributed Learning of Halfspaces, and LP Feasibility
We study the Convex Set Disjointness (CSD) problem, where two players ha...

12/04/2020 · On proof theory in computer science
The subject logic in computer science should entail proof theoretic appl...

03/26/2018 · Local verification of global proofs
In this work we study the cost of local and global proofs on distributed...

02/05/2019 · The Hardest Halfspace
We study the approximation of halfspaces h:{0,1}^n→{0,1} in the infinity...

03/30/2017 · On Fundamental Limits of Robust Learning
We consider the problems of robust PAC learning from distributed and str...

08/24/2016 · AIDE: Fast and Communication Efficient Distributed Optimization
In this paper, we present two new communication-efficient methods for di...
