A Resilient Distributed Boosting Algorithm

06/09/2022
by Yuval Filmus, et al.

Given a learning task in which the data is distributed among several parties, communication is one of the fundamental resources that the parties would like to minimize. We present a distributed boosting algorithm that is resilient to a limited amount of noise. Our algorithm is similar to classical boosting algorithms, but it is equipped with a new component, inspired by Impagliazzo's hard-core lemma [Impagliazzo95], that adds robustness to the algorithm. We complement this result by showing that resilience to any asymptotically larger amount of noise cannot be achieved by a communication-efficient algorithm.
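To make the setting concrete, below is a minimal sketch of a classical distributed AdaBoost-style protocol of the kind the abstract compares against: each party trains a weak hypothesis on its reweighted local data, only hypotheses and scalar error summaries cross the network, and a coordinator broadcasts the chosen hypothesis each round. This is not the paper's algorithm; the names (train_stump, distributed_adaboost) are illustrative, and the noise-resilient component inspired by the hard-core lemma is deliberately omitted.

```python
import numpy as np

def train_stump(X, y, w):
    """Weighted decision stump: pick (feature, threshold, sign) minimizing weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (j, t, s)
    j, t, s = best
    return lambda Z, j=j, t=t, s=s: np.where(Z[:, j] <= t, s, -s)

def distributed_adaboost(parties, rounds=10):
    """parties: list of (X, y) pairs with y in {-1, +1}, one pair per party.
    Each round, every party proposes a weak hypothesis trained on its
    reweighted local data; the coordinator keeps the proposal with the
    lowest (averaged) weighted error and broadcasts it.  Only hypotheses
    and scalar errors are communicated."""
    weights = [np.full(len(y), 1.0 / len(y)) for _, y in parties]
    ensemble = []
    for _ in range(rounds):
        # Each party trains locally and sends its candidate hypothesis.
        candidates = [train_stump(X, y, w) for (X, y), w in zip(parties, weights)]
        # Coordinator aggregates each candidate's weighted error across parties
        # (in a real protocol the parties would report these errors as scalars).
        errs = []
        for h in candidates:
            num = sum(np.sum(w * (h(X) != y)) for (X, y), w in zip(parties, weights))
            den = sum(np.sum(w) for w in weights)
            errs.append(num / den)
        best = int(np.argmin(errs))
        eps = max(errs[best], 1e-12)
        alpha = 0.5 * np.log((1 - eps) / eps)
        h = candidates[best]
        ensemble.append((alpha, h))
        # Each party updates its local weights, exactly as in AdaBoost.
        for i, (X, y) in enumerate(parties):
            weights[i] *= np.exp(-alpha * y * h(X))
            weights[i] /= weights[i].sum()
    def predict(Z):
        return np.sign(sum(a * h(Z) for a, h in ensemble))
    return predict
```

The paper's contribution, as described in the abstract, is an additional component layered on top of such a classical scheme that tolerates a limited amount of noise while keeping communication low; that component is not reproduced in this sketch.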

Related research

09/26/2013  Boosting in the presence of label noise
Boosting is known to be sensitive to label noise. We studied two approac...

06/21/2015  Communication Efficient Distributed Agnostic Boosting
We consider the problem of learning from distributed data in the agnosti...

11/13/2014  SelfieBoost: A Boosting Algorithm for Deep Learning
We describe and analyze a new boosting algorithm for deep learning calle...

09/09/2022  Resilient Consensus via Voronoi Communication Graphs
Consensus algorithms form the foundation for many distributed algorithms...

06/15/2018  Straggler-Resilient and Communication-Efficient Distributed Iterative Linear Solver
We propose a novel distributed iterative linear inverse solver method. O...

11/10/2020  Distributed Learning with Low Communication Cost via Gradient Boosting Untrained Neural Network
For high-dimensional data, there are huge communication costs for distri...

10/18/2021  Noise-Resilient Ensemble Learning using Evidence Accumulation Clustering
Ensemble Learning methods combine multiple algorithms performing the sam...
