On the Tradeoff between Energy, Precision, and Accuracy in Federated Quantized Neural Networks

11/15/2021
by Minsu Kim, et al.

Deploying federated learning (FL) over wireless networks with resource-constrained devices requires balancing accuracy, energy efficiency, and precision. Prior art on FL often requires devices to train deep neural networks (DNNs) using a 32-bit precision level for data representation to improve accuracy. However, such algorithms are impractical for resource-constrained devices, since DNNs can require the execution of millions of operations. Thus, training DNNs at a high precision level incurs a high energy cost for FL. In this paper, a quantized FL framework that represents data with a finite level of precision in both local training and uplink transmission is proposed. Here, the finite level of precision is captured through the use of quantized neural networks (QNNs) that quantize weights and activations in a fixed-precision format. In the considered FL model, each device trains its QNN and transmits a quantized training result to the base station. Energy models for the local training and for the transmission with quantization are rigorously derived. An energy minimization problem is formulated with respect to the level of precision while ensuring convergence. To solve the problem, we first analytically derive the FL convergence rate and then apply a line search method. Simulation results show that our FL framework can reduce energy consumption by up to 53% compared to a standard FL baseline, and they shed light on the tradeoff between precision, energy, and accuracy in FL over wireless networks.
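
To make the notion of a finite precision level concrete, the sketch below quantizes a tensor of weights or activations onto a uniform grid defined by a configurable number of bits, using stochastic rounding so the quantizer is unbiased in expectation. This is a minimal illustration only: the function name, the clipping range, and the choice of stochastic rounding are assumptions, and the paper's exact quantizer and energy model may differ.

```python
import numpy as np

def stochastic_fixed_point_quantize(x, n_bits, clip=1.0):
    """Stochastically round `x` onto a uniform fixed-precision grid.

    Values are clipped to [-clip, clip] and mapped to 2**n_bits uniformly
    spaced levels, rounding up or down at random so the quantizer is
    unbiased in expectation. Illustrative sketch only; not the paper's
    exact quantizer.
    """
    x = np.clip(x, -clip, clip)
    num_steps = 2 ** n_bits - 1              # number of grid intervals
    step = 2 * clip / num_steps
    scaled = (x + clip) / step               # position on the level grid
    lower = np.floor(scaled)
    prob_up = scaled - lower                 # distance to the lower level
    rounded = lower + (np.random.rand(*x.shape) < prob_up)
    return rounded * step - clip

# Example: quantize a layer's weights to 4-bit precision before
# local training or uplink transmission.
weights = np.random.randn(8, 8) * 0.1
q_weights = stochastic_fixed_point_quantize(weights, n_bits=4)
print(np.abs(weights - q_weights).max())     # max elementwise quantization error
```

Raising n_bits shrinks the quantization error but increases the energy spent on computation and transmission, which is exactly the precision-energy-accuracy tradeoff the paper optimizes.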

Related research

Green, Quantized Federated Learning over Wireless Networks: An Energy-Efficient Design (07/19/2022)
In this paper, a green, quantized FL framework, which represents data wi...

Energy Efficient Federated Learning Over Wireless Communication Networks (11/06/2019)
In this paper, the problem of energy efficient transmission and computat...

Quantization Robust Federated Learning for Efficient Inference on Heterogeneous Devices (06/22/2022)
Federated Learning (FL) is a machine learning paradigm to distributively...

Performance Optimization for Variable Bitwidth Federated Learning in Wireless Networks (09/21/2022)
This paper considers improving wireless communication and computation ef...

A Bargaining Game for Personalized, Energy Efficient Split Learning over Wireless Networks (12/12/2022)
Split learning (SL) is an emergent distributed learning framework which ...

Design and Analysis of Uplink and Downlink Communications for Federated Learning (12/07/2020)
Communication has been known to be one of the primary bottlenecks of fed...

Energy-Efficient Massive MIMO for Federated Learning: Transmission Designs and Resource Allocations (12/22/2021)
Future wireless networks require the integration of machine learning wit...
