Federated Learning via Indirect Server-Client Communications

02/14/2023
by Jieming Bian, et al.
Federated Learning (FL) is a communication-efficient and privacy-preserving distributed machine learning framework that has attracted significant research attention recently. Despite the many forms of FL algorithms (e.g., synchronous FL, asynchronous FL) and underlying optimization methods, nearly all existing works implicitly assume the existence of a communication infrastructure that facilitates direct communication between the server and the clients for model exchange. This assumption, however, does not hold in many real-world applications that could benefit from distributed learning but lack a proper communication infrastructure (e.g., smart sensing in remote areas). In this paper, we propose a novel FL framework, named FedEx (short for FL via Model Express Delivery), that utilizes mobile transporters (e.g., Unmanned Aerial Vehicles) to establish indirect communication channels between the server and the clients. Two algorithms, called FedEx-Sync and FedEx-Async, are developed depending on whether the transporters adopt a synchronized or an asynchronous schedule. Even though the indirect communications introduce heterogeneous delays to the clients in both global model dissemination and local model collection, we prove the convergence of both versions of FedEx. The convergence analysis subsequently sheds light on how to assign clients to the different transporters and how to design the routes among the clients. The performance of FedEx is evaluated through experiments in a simulated network on two public datasets.
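To make the transporter-based indirect communication concrete, below is a minimal, self-contained sketch of a FedEx-Sync-style round. It is not the paper's implementation; the client objectives, routes, learning rate, and all names (`routes`, `local_update`, `targets`) are hypothetical, and travel delays are abstracted away as route ordering. Each transporter carries a copy of the global model along its fixed route, every client on the route trains on the delivered copy, and the server averages the collected local models only after all transporters return (the synchronized schedule).

```python
# Toy sketch of indirect server-client communication via transporters
# (all setup values are illustrative assumptions, not from the paper).
import numpy as np

rng = np.random.default_rng(0)

DIM = 5          # toy model dimension
LOCAL_STEPS = 3  # local gradient steps per client visit
LR = 0.1         # local learning rate

# Hypothetical clients: client i minimizes f_i(w) = 0.5 * ||w - target_i||^2,
# so the global optimum is the mean of the targets.
targets = rng.normal(size=(6, DIM))

# Hypothetical assignment: two transporters, each with a fixed client route.
routes = [[0, 1, 2], [3, 4, 5]]

def local_update(w, target):
    """Client-side training on the model copy delivered by the transporter."""
    w = w.copy()
    for _ in range(LOCAL_STEPS):
        w -= LR * (w - target)  # gradient of 0.5 * ||w - target||^2
    return w

global_model = np.zeros(DIM)

for rnd in range(20):
    collected = []
    for route in routes:
        # The transporter loads the current global model and visits its route.
        carried = global_model.copy()
        for client in route:
            # Each client trains on the delivered copy; the transporter
            # collects the resulting local model for the return trip.
            collected.append(local_update(carried, targets[client]))
    # Synchronized schedule: the server waits for every transporter to
    # return, then averages all collected local models.
    global_model = np.mean(collected, axis=0)

print("final model:", np.round(global_model, 3))
print("mean target:", np.round(targets.mean(axis=0), 3))
```

An asynchronous variant in the spirit of FedEx-Async would instead let the server merge each transporter's collected models as soon as that transporter returns, at the cost of aggregating models computed from stale copies of the global model.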
