Guardians of the Deep Fog: Failure-Resilient DNN Inference from Edge to Cloud

09/03/2019
by Ashkan Yousefpour, et al.

Partitioning and distributing deep neural networks (DNNs) over physical nodes, such as edge, fog, or cloud nodes, can enhance sensor fusion and reduce bandwidth and inference latency. However, when a DNN is distributed over physical nodes, failure of a physical node causes the failure of the DNN units placed on that node. If the distributed DNN is not specifically designed and properly trained for failures, the performance of the inference task becomes unpredictable and, most likely, poor. Motivated by this, we introduce deepFogGuard, a method for making the distributed DNN inference task failure-resilient. To articulate deepFogGuard, we introduce the elements of, and a model for, the resiliency of distributed DNN inference. Inspired by the concept of residual connections in DNNs, we introduce skip hyperconnections in distributed DNNs, which are the basis of deepFogGuard's design for providing resiliency. Finally, extensive experiments on two existing datasets for state-of-the-art sensing and vision applications confirm the ability of deepFogGuard to provide resiliency for distributed DNNs in edge-cloud networks.
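To make the idea of skip hyperconnections concrete, the sketch below is a minimal, hypothetical illustration (not the paper's implementation): a DNN split into edge, fog, and cloud partitions, where a skip hyperconnection forwards the edge partition's output directly to the cloud partition, so inference can continue, in a degraded mode, if the fog node fails. All class and argument names (Block, DistributedDNNWithSkips, fog_alive) are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn


class Block(nn.Module):
    """One DNN partition, placed on a single physical node."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.layer = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())

    def forward(self, x):
        return self.layer(x)


class DistributedDNNWithSkips(nn.Module):
    """Edge -> fog -> cloud pipeline with a skip hyperconnection around the fog node."""

    def __init__(self, dim=64, num_classes=10):
        super().__init__()
        self.edge = Block(dim, dim)                # hosted on an edge node
        self.fog = Block(dim, dim)                 # hosted on a fog node
        self.cloud = nn.Linear(dim, num_classes)   # hosted on the cloud

    def forward(self, x, fog_alive=True):
        h_edge = self.edge(x)
        if fog_alive:
            # Normal operation: the cloud receives the fog output plus the
            # edge output carried over the skip hyperconnection.
            h = self.fog(h_edge) + h_edge
        else:
            # Fog node failed: the skip hyperconnection alone feeds the cloud,
            # so inference degrades gracefully instead of failing outright.
            h = h_edge
        return self.cloud(h)


if __name__ == "__main__":
    model = DistributedDNNWithSkips()
    x = torch.randn(4, 64)
    print(model(x, fog_alive=True).shape)   # healthy network
    print(model(x, fog_alive=False).shape)  # inference survives a fog-node failure
```

In this toy setup, training with occasional simulated node failures (randomly setting fog_alive=False) would be one way to teach the surviving path to carry useful information, in the spirit of the failure-aware training the abstract describes.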


