Learning without Interaction Requires Separation

09/24/2018
by Amit Daniely, et al.

One of the key resources in large-scale learning systems is the number of rounds of communication between the server and the clients holding the data points. We study this resource for systems with two types of constraints on the communication from each client: local differential privacy and a limited number of communicated bits. For both models, the number of rounds of communication is captured by the number of rounds of interaction needed to solve the learning problem in the statistical query (SQ) model. For many learning problems, known efficient algorithms require many rounds of interaction, yet little is known about whether this is actually necessary. In the context of classification in the PAC learning model, Kasiviswanathan et al. (2008) constructed an artificial class of functions that is PAC learnable with respect to a fixed distribution but cannot be learned by an efficient non-interactive (or one-round) SQ algorithm. Here we show that a similar separation holds for learning linear separators and decision lists without assumptions on the distribution. To prove this separation, we show that non-interactive SQ algorithms can learn only function classes of low margin complexity, that is, classes of functions that can be represented as large-margin linear separators.
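For context, the two notions the abstract relies on have standard formulations. The LaTeX sketch below records the usual definition of an SQ oracle (following Kearns, 1998) and of margin complexity (in the Linial-Shraibman style); these are background definitions assumed here, not text taken from the paper itself.

% Background definitions (standard formulations, not excerpted from the paper).
% SQ oracle: for a distribution D over X x {-1,+1} and tolerance tau > 0,
% a query is a function phi : X x {-1,+1} -> [-1,1], and the oracle
% STAT(tau) may return any value v satisfying
\[
  \bigl|\, v - \mathbb{E}_{(x,y)\sim D}\bigl[\phi(x,y)\bigr] \,\bigr| \;\le\; \tau .
\]
% A non-interactive (one-round) SQ algorithm must submit all of its
% queries before seeing any of the answers.
%
% Margin complexity of a class C of functions f : X -> {-1,+1}:
% the inverse of the best margin achievable by a single embedding psi
% of X into the unit ball of a Hilbert space that realizes every f in C
% as a linear separator:
\[
  \mathrm{mc}(C) \;=\; \Bigl( \sup_{\psi}\; \inf_{f \in C}\; \sup_{\|w\|\le 1}\; \inf_{x \in X} f(x)\,\langle w, \psi(x) \rangle \Bigr)^{-1} .
\]

In this vocabulary, the abstract's argument is that any class learnable by an efficient non-interactive SQ algorithm must have low mc(C), whereas linear separators and decision lists (without distributional assumptions) do not.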


Related research

09/17/2022
On PAC Learning Halfspaces in Non-interactive Local Privacy Model with Public Unlabeled Data
In this paper, we study the problem of PAC learning halfspaces in the no...

08/10/2021
FedPAGE: A Fast Local Stochastic Gradient Method for Communication-Efficient Federated Learning
Federated Averaging (FedAvg, also known as Local-SGD) (McMahan et al., 2...

08/10/2020
Improved Bounds for Distributed Load Balancing
In the load balancing problem, the input is an n-vertex bipartite graph ...

09/01/2019
Round Complexity of Common Randomness Generation: The Amortized Setting
We study the effect of rounds of interaction on the common randomness ge...

11/11/2019
Interaction is necessary for distributed learning with privacy or communication constraints
Local differential privacy (LDP) is a model where users send privatized ...

08/27/2018
Communication-Rounds Tradeoffs for Common Randomness and Secret Key Generation
We study the role of interaction in the Common Randomness Generation (CR...

07/25/2023
Federated Heavy Hitter Recovery under Linear Sketching
Motivated by real-life deployments of multi-round federated analytics wi...
