Enabling Incremental Knowledge Transfer for Object Detection at the Edge

Object detection using deep neural networks (DNNs) involves a huge amount of computation, which impedes its deployment on resource- and energy-constrained user-end devices. The success of DNNs stems from their knowledge of all the different domains of observed environments. At inference time, however, only limited knowledge of the observed environment is needed, and this can be learned by a shallow neural network (SHNN). In this paper, a system-level design is proposed to reduce the energy consumption of object detection on the user-end device. An SHNN is deployed on the user-end device to detect objects in the observed environment. In addition, a knowledge transfer mechanism updates the SHNN model with DNN knowledge whenever the object domain changes. The DNN knowledge is obtained from a powerful edge device connected to the user-end device through LAN or Wi-Fi. Experiments demonstrate that the energy consumption of the user-end device and the inference time can be improved by 78% compared with running the DNN directly on the user-end device.
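
The abstract describes a device/edge split: a shallow model (SHNN) runs on the user-end device, and updated knowledge is pulled from the full DNN on the edge server when the object domain shifts. The Python sketch below illustrates one way such a device-side loop could look; the HTTP endpoint, the confidence-based domain-change trigger, and all identifiers (EDGE_URL, maybe_update_shnn, the dict-based detection format) are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of the device-side loop described in the abstract, assuming a
# PyTorch-style SHNN and an HTTP endpoint on the edge server. All names here
# are illustrative assumptions, not taken from the paper.
import io
from collections import deque

import requests
import torch

EDGE_URL = "http://edge-server.local:8080/shnn_weights"  # assumed endpoint
CONF_THRESHOLD = 0.4  # assumed proxy for "the object domain has changed"
WINDOW = 50           # number of recent frames used to estimate confidence


def mean_confidence(detections):
    """Average confidence of one frame's detections (assumed dict format)."""
    scores = [d["score"] for d in detections]
    return sum(scores) / len(scores) if scores else 0.0


def maybe_update_shnn(shnn, history):
    """Pull freshly distilled SHNN weights from the edge DNN when confidence collapses."""
    if sum(history) / len(history) < CONF_THRESHOLD:
        resp = requests.get(EDGE_URL, timeout=5)
        state_dict = torch.load(io.BytesIO(resp.content), map_location="cpu")
        shnn.load_state_dict(state_dict)  # incremental knowledge transfer step
        history.clear()


def detection_loop(shnn, frame_source):
    """Run cheap SHNN inference on-device; refresh the model on domain change."""
    history = deque(maxlen=WINDOW)
    for frame in frame_source:
        detections = shnn(frame)  # on-device inference with the shallow model
        history.append(mean_confidence(detections))
        if len(history) == WINDOW:
            maybe_update_shnn(shnn, history)
```

In this sketch the "domain change" signal is simply a drop in average detection confidence over a sliding window; the paper's actual trigger and transfer protocol may differ.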
