An Energy and Carbon Footprint Analysis of Distributed and Federated Learning
Classical, centralized Artificial Intelligence (AI) methods require moving data from producers (sensors, machines) to energy-hungry data centers, raising environmental concerns due to the computational and communication resource demands, and compromising privacy. Emerging alternatives aim to mitigate these high energy costs by efficiently distributing, or federating, the learning tasks across devices, which are typically low-power. This paper proposes a novel framework for the analysis of energy and carbon footprints in distributed and federated learning (FL). The proposed framework quantifies both the energy footprint and the carbon-equivalent emissions of vanilla FL methods and of consensus-based, fully decentralized approaches. We discuss optimal bounds and operational points that support green FL designs and underpin their sustainability assessment. Two case studies from emerging 5G industry verticals are analyzed, quantifying the environmental footprints of continual and reinforcement learning setups in which the training process is repeated periodically for continuous improvement. In all cases, the sustainability of distributed learning hinges on meeting specific requirements on communication efficiency and learner population size. Energy and test accuracy should also be traded off, taking into account the model and data footprints of the targeted industrial applications.
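To illustrate the kind of accounting such a framework involves, the following minimal sketch contrasts a first-order energy and carbon estimate for vanilla FL against centralized training. It is not the paper's actual model: all function names, parameters, and numeric values (per-round device energy, upload energy, grid carbon intensity) are illustrative assumptions chosen only to show how communication cost and learner population size enter the footprint.

```python
# Illustrative sketch (not the paper's model): first-order energy/carbon
# estimates for vanilla FL versus centralized training.
# All names and values are hypothetical assumptions.

def fl_energy_kwh(num_devices, num_rounds,
                  compute_kwh_per_device_round,
                  upload_kwh_per_device_round):
    """Vanilla FL: local computation plus model uploads to the parameter
    server, summed over all devices and training rounds."""
    per_round = num_devices * (compute_kwh_per_device_round
                               + upload_kwh_per_device_round)
    return num_rounds * per_round


def centralized_energy_kwh(dataset_upload_kwh, datacenter_training_kwh):
    """Centralized training: moving raw data to the data center plus the
    data-center training itself."""
    return dataset_upload_kwh + datacenter_training_kwh


def carbon_kg(energy_kwh, carbon_intensity_kg_per_kwh=0.475):
    """Convert energy to CO2-equivalent emissions; 0.475 kgCO2/kWh is a
    rough average grid intensity used here only as a placeholder."""
    return energy_kwh * carbon_intensity_kg_per_kwh


if __name__ == "__main__":
    # Hypothetical numbers for a small industrial IoT deployment.
    fl = fl_energy_kwh(num_devices=50, num_rounds=200,
                       compute_kwh_per_device_round=0.002,
                       upload_kwh_per_device_round=0.0005)
    cen = centralized_energy_kwh(dataset_upload_kwh=15.0,
                                 datacenter_training_kwh=40.0)
    print(f"FL:          {fl:6.1f} kWh, {carbon_kg(fl):5.1f} kgCO2eq")
    print(f"Centralized: {cen:6.1f} kWh, {carbon_kg(cen):5.1f} kgCO2eq")
```

Under these assumed numbers, the FL footprint scales with both the number of learners and the per-round communication cost, which is consistent with the abstract's point that sustainability depends on communication efficiency and learner population size.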