Communication Efficient Distributed Agnostic Boosting

06/21/2015
by Shang-Tse Chen, et al.

We consider the problem of learning from distributed data in the agnostic setting, i.e., in the presence of arbitrary forms of noise. Our main contribution is a general distributed boosting-based procedure for learning an arbitrary concept space that is simultaneously noise tolerant, communication efficient, and computationally efficient. This improves significantly over prior works that were either communication efficient only in noise-free scenarios or computationally prohibitive. Empirical results on large synthetic and real-world datasets demonstrate the effectiveness and scalability of the proposed approach.
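
The abstract describes the procedure only at a high level. Purely as an illustration of the communication pattern such a scheme targets, the sketch below shows a generic distributed AdaBoost-style round in Python: a coordinator broadcasts one weak hypothesis per round and every party replies with constant-size statistics. The Party class, the decision-stump weak learner, and the choice to train on a single shard are assumptions made here for brevity; this is not the paper's agnostic boosting procedure.

```python
import numpy as np

class Party:
    """One data holder; keeps a local shard and local example weights."""

    def __init__(self, X, y):
        self.X, self.y = X, y                      # labels y in {-1, +1}
        self.w = np.ones(len(y)) / len(y)          # uniform initial weights

    def weighted_error(self, h):
        """Report (local weighted error mass, local total weight): two scalars."""
        preds = h(self.X)
        return float(np.sum(self.w * (preds != self.y))), float(np.sum(self.w))

    def reweight(self, h, alpha):
        """Standard exponential AdaBoost-style update, applied locally."""
        self.w *= np.exp(-alpha * self.y * h(self.X))


def decision_stump(X, y, w):
    """Toy weak learner: best signed single-feature sign predictor."""
    best = None
    for j in range(X.shape[1]):
        for s in (+1.0, -1.0):
            preds = s * np.sign(X[:, j])
            err = float(np.sum(w * (preds != y)))
            if best is None or err < best[0]:
                best = (err, j, s)
    _, j, s = best
    return lambda Z, j=j, s=s: s * np.sign(Z[:, j])


def distributed_boost(parties, rounds=10):
    """Coordinator loop; per round: one hypothesis broadcast + 2 scalars/party."""
    ensemble = []
    for _ in range(rounds):
        # Simplification: the weak learner is trained on one shard held by
        # the coordinator. The paper's actual procedure differs; this step
        # only serves to produce some weak hypothesis.
        p0 = parties[0]
        h = decision_stump(p0.X, p0.y, p0.w)
        stats = [p.weighted_error(h) for p in parties]     # O(k) numbers total
        err = sum(e for e, _ in stats) / sum(t for _, t in stats)
        err = min(max(err, 1e-10), 1.0 - 1e-10)            # guard against log(0)
        alpha = 0.5 * np.log((1.0 - err) / err)
        for p in parties:
            p.reweight(h, alpha)                           # local weight update
        ensemble.append((alpha, h))
    return lambda Z: np.sign(sum(a * h(Z) for a, h in ensemble))
```

Under these assumptions, each round costs one hypothesis broadcast plus two scalars per party, i.e., O(k) communication for k parties per round; this is the kind of budget the abstract's "communication efficient" claim refers to, independent of the local dataset sizes.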

Related research

06/14/2021 · Boosting in the Presence of Massart Noise
We study the problem of boosting the accuracy of a weak learner in the (...

06/09/2022 · A Resilient Distributed Boosting Algorithm
Given a learning task where the data is distributed among several partie...

02/27/2012 · Protocols for Learning Classifiers on Distributed Data
We consider the problem of learning classifiers for labeled data that ha...

05/15/2020 · Efficiently Learning Adversarially Robust Halfspaces with Noise
We study the problem of learning adversarially robust halfspaces in the ...

05/25/2016 · Efficient Distributed Learning with Sparsity
We propose a novel, efficient approach for distributed sparse learning i...

07/30/2020 · The Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise
We study the computational complexity of adversarially robust proper lea...

05/29/2019 · Fast and Robust Rank Aggregation against Model Misspecification
In rank aggregation, preferences from different users are summarized int...
