A Huber Loss Minimization Approach to Byzantine Robust Federated Learning

08/24/2023
by Puning Zhao, et al.

Federated learning systems are susceptible to adversarial attacks. To combat this, we introduce a novel aggregator based on Huber loss minimization and provide a comprehensive theoretical analysis. Under the independent and identically distributed (i.i.d.) assumption, our approach has several advantages over existing methods. First, it has optimal dependence on ϵ, the fraction of attacked clients. Second, it does not require precise knowledge of ϵ. Third, it allows different clients to have unequal data sizes. We then broaden the analysis to non-i.i.d. data, in which clients have slightly different distributions.
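As a rough illustration of the kind of aggregator the abstract describes, the sketch below minimizes a weighted sum of Huber losses of the distances between a candidate aggregate and the client updates, solved here by plain gradient descent. The function names, the solver, and all parameters (`delta`, `lr`, `n_iters`) are illustrative assumptions, not the paper's actual algorithm; the weights stand in for unequal client data sizes.

```python
import numpy as np

def huber_aggregate(updates, weights=None, delta=1.0, lr=0.1, n_iters=200):
    """Hypothetical sketch of a Huber-loss-minimization aggregator:
        min_z  sum_i w_i * Huber_delta(||x_i - z||_2)
    where x_i are client updates and w_i reflect client data sizes.
    Not the paper's algorithm; a simple gradient-descent illustration.
    """
    updates = np.asarray(updates, dtype=float)            # shape (n_clients, dim)
    n = len(updates)
    w = np.ones(n) / n if weights is None else np.asarray(weights, dtype=float)
    w = w / w.sum()

    z = updates.mean(axis=0)                              # initialize at the weighted-free mean
    for _ in range(n_iters):
        diffs = z - updates                                # (n, dim)
        dists = np.linalg.norm(diffs, axis=1)              # ||z - x_i||
        # Gradient of Huber_delta(||z - x_i||) w.r.t. z:
        #   (z - x_i)                     if ||z - x_i|| <= delta  (quadratic region)
        #   delta * (z - x_i) / ||z-x_i||  otherwise               (linear region)
        scale = np.where(dists <= delta, 1.0, delta / np.maximum(dists, 1e-12))
        grad = (w[:, None] * scale[:, None] * diffs).sum(axis=0)
        z = z - lr * grad
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    honest = rng.normal(0.0, 1.0, size=(18, 5))            # honest client updates near 0
    byzantine = rng.normal(50.0, 1.0, size=(2, 5))         # attacked clients far away
    agg = huber_aggregate(np.vstack([honest, byzantine]), delta=2.0)
    print(agg)                                              # stays close to the honest mean
```

With a small `delta` the estimate behaves like a geometric median and largely ignores the outlying updates; with a large `delta` it approaches the weighted mean, which is the usual bias-robustness trade-off of the Huber loss.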
