Hierarchical Training of Deep Neural Networks Using Early Exiting

03/04/2023
by   Yamin Sepehri, et al.

Deep neural networks provide state-of-the-art accuracy for vision tasks, but they require significant resources for training. Thus, they are trained on cloud servers far from the edge devices that acquire the data, which increases communication cost and runtime and raises privacy concerns. In this study, a novel hierarchical training method for deep neural networks is proposed that uses early exits in an architecture divided between edge and cloud workers to reduce communication cost, training runtime and privacy concerns. The method introduces a new use case for early exits: separating the backward pass of the network between the edge and the cloud during the training phase. We address a shortcoming of most available methods, which, due to the sequential nature of training, cannot train the levels of the hierarchy simultaneously, or do so at the cost of compromising privacy. In contrast, our method uses the edge and cloud workers simultaneously, does not share the raw input data with the cloud and does not require communication during the backward pass. Several simulations and on-device experiments with different neural network architectures demonstrate the effectiveness of this method. The proposed method reduces the training runtime by 29% when communication with the cloud takes place over a low-bit-rate channel. This runtime gain is achieved while the accuracy drop is negligible. The method is advantageous for online learning of high-accuracy deep neural networks on low-resource devices such as mobile phones or robots as part of an edge-cloud system, making them more flexible in facing new tasks and classes of data.
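The idea of splitting the backward pass can be illustrated with a minimal sketch, not the authors' implementation: the split point, layer sizes and optimizer settings below are hypothetical assumptions, written in PyTorch. The edge worker trains its layers with a local early-exit loss, and the cloud worker trains the remaining layers on detached features, so no raw inputs and no gradients cross the edge-cloud boundary during the backward pass.

```python
# Minimal sketch of early-exit-based hierarchical training (assumed architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeModel(nn.Module):
    """First layers of the network plus an early-exit classifier (hypothetical split)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.early_exit = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes)
        )

    def forward(self, x):
        feats = self.features(x)
        return feats, self.early_exit(feats)

class CloudModel(nn.Module):
    """Remaining layers and the final classifier, running on the cloud worker."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, feats):
        return self.classifier(self.features(feats))

edge, cloud = EdgeModel(), CloudModel()
opt_edge = torch.optim.SGD(edge.parameters(), lr=0.01)
opt_cloud = torch.optim.SGD(cloud.parameters(), lr=0.01)

def train_step(x, y):
    # Edge worker: forward pass, early-exit loss, and a purely local backward pass.
    feats, exit_logits = edge(x)
    loss_edge = F.cross_entropy(exit_logits, y)
    opt_edge.zero_grad()
    loss_edge.backward()
    opt_edge.step()

    # Only detached intermediate features (not raw inputs, not gradients) go to the cloud.
    feats_sent = feats.detach()

    # Cloud worker: forward and backward confined to its own layers.
    logits = cloud(feats_sent)
    loss_cloud = F.cross_entropy(logits, y)
    opt_cloud.zero_grad()
    loss_cloud.backward()
    opt_cloud.step()
    return loss_edge.item(), loss_cloud.item()

# Toy usage with random data.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
print(train_step(x, y))
```

Because the features are detached before being sent, both workers can run their backward passes independently and in parallel, which is the property the abstract attributes to the proposed method.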

