Communication-Efficient Federated Learning over Capacity-Limited Wireless Networks

07/20/2023
by   Jaewon Yun, et al.

In this paper, a communication-efficient federated learning (FL) framework is proposed to improve the convergence rate of FL under a limited uplink capacity. The central idea of the proposed framework is to transmit only the values and positions of the top-S entries of a local model update on the uplink. A lossless encoding technique is employed for transmitting the positions of these entries, while a linear transformation followed by Lloyd-Max scalar quantization is used for transmitting their values. For accurate reconstruction of the top-S values, a linear minimum mean squared error (LMMSE) method is developed based on the Bussgang decomposition. Moreover, an error feedback strategy is introduced to compensate for both compression and reconstruction errors. The convergence rate of the proposed framework is analyzed for a non-convex loss function, taking the compression and reconstruction errors into account. Based on this analysis, the key parameters of the framework are optimized to maximize the convergence rate for a given uplink capacity. Simulation results on the MNIST and CIFAR-10 datasets demonstrate that the proposed framework outperforms state-of-the-art FL frameworks in classification accuracy under limited uplink capacity.
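The uplink compression pipeline described above can be illustrated with a minimal sketch, not the authors' implementation: a client applies error feedback, keeps the top-S entries of its model update (values and positions), and quantizes the selected values with a Lloyd-Max codebook. The function names (`lloyd_max_quantizer`, `compress_update`), the sparsity level S, and the bit width are illustrative assumptions; the paper's linear transformation and Bussgang-based LMMSE reconstruction step are omitted for brevity.

```python
# Minimal sketch (assumptions noted above) of top-S sparsification,
# Lloyd-Max scalar quantization, and error feedback for an FL uplink.
import numpy as np


def lloyd_max_quantizer(values, num_levels, num_iters=50):
    """Fit a Lloyd-Max codebook (1-D k-means style iteration) to the samples."""
    # Initialize codewords on the empirical quantiles of the data.
    codebook = np.quantile(values, np.linspace(0.0, 1.0, num_levels))
    for _ in range(num_iters):
        # Nearest-codeword assignment.
        idx = np.argmin(np.abs(values[:, None] - codebook[None, :]), axis=1)
        # Conditional-mean (centroid) update for each occupied codeword.
        for k in range(num_levels):
            if np.any(idx == k):
                codebook[k] = values[idx == k].mean()
    return codebook


def compress_update(update, error_buffer, sparsity_s, bits=3):
    """Top-S sparsification + Lloyd-Max quantization with error feedback."""
    # Error feedback: add the residual carried over from the previous round.
    corrected = update + error_buffer

    # Keep the S largest-magnitude entries (their values and positions).
    positions = np.argsort(np.abs(corrected))[-sparsity_s:]
    values = corrected[positions]

    # Quantize the selected values with a 2^bits-level Lloyd-Max codebook.
    codebook = lloyd_max_quantizer(values, num_levels=2 ** bits)
    quantized = codebook[
        np.argmin(np.abs(values[:, None] - codebook[None, :]), axis=1)
    ]

    # Server-side view: a sparse reconstruction of the transmitted update.
    reconstructed = np.zeros_like(update)
    reconstructed[positions] = quantized

    # Everything not transmitted (plus quantization error) is fed back next round.
    new_error_buffer = corrected - reconstructed
    return positions, quantized, reconstructed, new_error_buffer


# Toy usage: compress a 10,000-dimensional local update, keeping S = 500 entries.
rng = np.random.default_rng(0)
update = rng.standard_normal(10_000)
error_buffer = np.zeros_like(update)
positions, quantized, reconstructed, error_buffer = compress_update(
    update, error_buffer, sparsity_s=500, bits=3
)
print("compression error (MSE):", np.mean((update - reconstructed) ** 2))
```

In this sketch the error buffer plays the role of the paper's error feedback strategy: the part of the corrected update that is not transmitted in the current round is retained and added to the next local update, so compression and quantization errors are not silently discarded.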
