Modeling of Deep Neural Network (DNN) Placement and Inference in Edge Computing

01/19/2020
by   Mounir Bensalem, et al.

With edge computing becoming an increasingly adopted concept in system architectures, its utilization is expected to grow further when combined with deep learning (DL) techniques. The integration of demanding processing algorithms, such as Deep Neural Networks (DNNs), into Internet of Things (IoT) and edge devices has benefited in large measure from the development of edge computing hardware, as well as from adapting the algorithms for use on resource-constrained IoT devices. Surprisingly, there are no models yet for optimally placing and using machine learning in edge computing. In this paper, we propose the first model of optimal DNN placement and inference in edge computing. We present a mathematical formulation of the DNN Model Variant Selection and Placement (MVSP) problem, considering the inference latency of different model variants, the communication latency between nodes, and the utilization cost of edge computing nodes. We evaluate our model numerically and show that increasing model co-location decreases the average latency by 33% per request for low load, and by 21% for high load.
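The abstract's MVSP objective can be illustrated with a toy example. The sketch below is not the paper's formulation; it is a minimal brute-force illustration, assuming hypothetical model variants ("small", "large"), edge nodes ("node_a", "node_b"), and made-up latency and cost numbers, of the trade-off MVSP optimizes: inference latency of a chosen model variant, communication latency to the node hosting it, and the node's utilization cost.

```python
from itertools import product

# Illustrative toy instance of the Model Variant Selection and Placement
# (MVSP) idea: choose one model variant and one edge node to serve a
# request, minimizing combined latency and cost. All numbers are invented.

# Inference latency (ms) of each model variant on each edge node.
inference_ms = {
    ("small", "node_a"): 12.0, ("small", "node_b"): 9.0,
    ("large", "node_a"): 30.0, ("large", "node_b"): 22.0,
}
# Communication latency (ms) from the requesting device to each node.
comm_ms = {"node_a": 2.0, "node_b": 8.0}
# Utilization cost of running a variant on a node (arbitrary units).
util_cost = {
    ("small", "node_a"): 1.0, ("small", "node_b"): 1.5,
    ("large", "node_a"): 4.0, ("large", "node_b"): 3.0,
}
# Hypothetical accuracy penalty: smaller variants are faster but less accurate.
accuracy_penalty = {"small": 5.0, "large": 0.0}

def mvsp_brute_force(variants, nodes, cost_weight=1.0):
    """Enumerate all (variant, node) placements and return the tuple
    (objective, variant, node) with the lowest combined objective."""
    best = None
    for variant, node in product(variants, nodes):
        objective = (inference_ms[(variant, node)]
                     + comm_ms[node]
                     + cost_weight * util_cost[(variant, node)]
                     + accuracy_penalty[variant])
        if best is None or objective < best[0]:
            best = (objective, variant, node)
    return best

print(mvsp_brute_force(["small", "large"], ["node_a", "node_b"]))
# -> (20.0, 'small', 'node_a')
```

The paper formulates this as a mathematical optimization over many requests and nodes at once; brute-force enumeration is shown here only because the toy search space is tiny.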

Related research

10/22/2019 · Deep Learning at the Edge
The ever-increasing number of Internet of Things (IoT) devices has creat...

06/01/2020 · Understanding Uncertainty of Edge Computing: New Principle and Design Approach
Due to the edge's position between the cloud and the users, and the rece...

12/22/2022 · Mind Your Heart: Stealthy Backdoor Attack on Dynamic Deep Neural Network in Edge Computing
Transforming off-the-shelf deep neural network (DNN) models into dynamic...

10/10/2021 · SplitPlace: Intelligent Placement of Split Neural Nets in Mobile Edge Environments
In recent years, deep learning models have become ubiquitous in industry...

05/28/2021 · Optimal Model Placement and Online Model Splitting for Device-Edge Co-Inference
Device-edge co-inference opens up new possibilities for resource-constra...

10/15/2019 · Alleviating Bottlenecks for DNN Execution on GPUs via Opportunistic Computing
Edge computing and IoT applications are severely constrained by limited ...

03/23/2022 · Verifying Outsourced Computation in an Edge Computing Marketplace
An edge computing marketplace could enable IoT devices (Outsourcers) to ...
