Inference Time Optimization Using BranchyNet Partitioning

by Roberto G. Pacheco, et al.

Deep Neural Network (DNN) applications with edge computing present a trade-off between responsiveness and computational resources. On one hand, edge computing can provide high responsiveness by deploying computational resources close to end devices, which may be prohibitive for the majority of cloud computing services. On the other hand, DNN inference requires computational power that may not be available on edge devices but that a cloud server can provide. To address this trade-off, we partition a DNN between the edge device and the cloud server: the first DNN layers are processed at the edge and the remaining layers at the cloud. This paper proposes an optimal DNN partitioning that accounts for network bandwidth, the computational resources of the edge and the cloud, and parameters inherent to the data. Our proposal aims to minimize inference time, enabling highly responsive applications. To this end, we show the equivalence between the DNN partitioning problem and the shortest path problem, and find an optimal solution using Dijkstra's algorithm.
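The shortest-path formulation can be sketched as a layered graph: one node per layer for the edge tier and one per layer for the cloud tier, with edge weights given by per-layer processing times and by the transmission time of the intermediate tensor at the cut point. The following is a minimal illustrative sketch, not the paper's actual model — the layer timings, byte counts, node naming, and the simplification of ignoring the return transmission of the result are all assumptions made here for demonstration.

```python
import heapq

def optimal_partition(edge_times, cloud_times, transfer_bytes, bandwidth):
    """Illustrative sketch: pick the DNN cut point minimizing inference time.

    Nodes ('e', i) / ('c', i) mean "layers 0..i-1 done on edge / cloud".
    Staying on a tier costs that tier's processing time for the next layer;
    crossing edge -> cloud before layer i costs transfer_bytes[i] / bandwidth
    (transfer_bytes[i] = size of layer i's input tensor, an assumed quantity).
    Return transmission of the final result is ignored for simplicity.
    """
    n = len(edge_times)
    graph = {}
    def add(u, v, w):
        graph.setdefault(u, []).append((v, w))
    for i in range(n):
        add(('e', i), ('e', i + 1), edge_times[i])               # layer i on edge
        add(('c', i), ('c', i + 1), cloud_times[i])              # layer i on cloud
        add(('e', i), ('c', i), transfer_bytes[i] / bandwidth)   # offload before layer i
    sink = ('done', n)
    add(('e', n), sink, 0.0)   # full-edge execution
    add(('c', n), sink, 0.0)   # execution finished in the cloud

    # Standard Dijkstra from the edge-side start node to the sink.
    src = ('e', 0)
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == sink:
            break
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))

    # Reconstruct the optimal path; the first ('c', i) node marks the cut layer.
    path, node = [sink], sink
    while node in prev:
        node = prev[node]
        path.append(node)
    return dist[sink], path[::-1]
```

On a toy 3-layer network (fast cloud, slow edge, large raw input), the minimum-time path crosses to the cloud after layer 0, i.e. the first layer runs at the edge and the small intermediate tensor is offloaded. Since the graph is a DAG, a topological-order sweep would also work; Dijkstra is used here to mirror the formulation in the abstract.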



