AttendNets: Tiny Deep Image Recognition Neural Networks for the Edge via Visual Attention Condensers

09/30/2020
by   Alexander Wong, et al.

While significant advances in deep learning have resulted in state-of-the-art performance across a large number of complex visual perception tasks, the widespread deployment of deep neural networks for TinyML applications involving on-device, low-power image recognition remains a major challenge given the complexity of deep neural networks. In this study, we introduce AttendNets, low-precision, highly compact deep neural networks tailored for on-device image recognition. More specifically, AttendNets possess deep self-attention architectures based on visual attention condensers, which extend the recently introduced stand-alone attention condensers to improve spatial-channel selective attention. Furthermore, AttendNets have unique machine-designed macroarchitectures and microarchitectures achieved via a machine-driven design exploration strategy. Experimental results on the ImageNet_50 benchmark dataset for the task of on-device image recognition showed that AttendNets have significantly lower architectural and computational complexity than several deep neural networks in the research literature designed for efficiency, while achieving the highest accuracy (with the smallest AttendNet requiring ∼7.2× fewer operations, ∼4.17× fewer parameters, and ∼16.7× lower weight memory than MobileNet-V1). Based on these promising results, AttendNets illustrate the effectiveness of visual attention condensers as building blocks for enabling various on-device visual perception tasks for TinyML applications.
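
The abstract names visual attention condensers as the core building block: a lightweight self-attention module that condenses its input, learns a joint spatial-channel attention embedding, expands it back, and uses it to selectively modulate the input activations. Below is a minimal sketch of that general condense-embed-expand-attend pattern, assuming PyTorch; the specific layer choices (max-pool condensation, grouped-convolution embedding, bilinear expansion, sigmoid gating) and the VisualAttentionCondenser name are illustrative assumptions, not the machine-designed configuration reported in the paper.

```python
# Minimal sketch of a visual attention condenser block (assumes PyTorch).
# Layer choices here are illustrative assumptions; AttendNets' actual
# micro/macroarchitectures are found via machine-driven design exploration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VisualAttentionCondenser(nn.Module):
    def __init__(self, channels: int, embed_channels: int, groups: int = 2):
        super().__init__()
        # Condensation: reduce spatial resolution so the attention
        # embedding is computed cheaply at a lower dimensionality.
        self.condense = nn.MaxPool2d(kernel_size=2, stride=2)
        # Embedding: lightweight grouped convolutions that model joint
        # local spatial and cross-channel activation relationships.
        self.embed = nn.Sequential(
            nn.Conv2d(channels, embed_channels, 3, padding=1, groups=groups),
            nn.ReLU(inplace=True),
            nn.Conv2d(embed_channels, channels, 3, padding=1, groups=groups),
        )
        # Learned per-channel scale applied before the attention gate.
        self.scale = nn.Parameter(torch.ones(1, channels, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Condense -> embed -> expand back to the input resolution.
        a = self.embed(self.condense(x))
        a = F.interpolate(a, size=x.shape[-2:], mode="bilinear",
                          align_corners=False)
        # Selective attention: modulate the input with the expanded
        # attention map (sigmoid gating is an assumption of this sketch).
        return x * torch.sigmoid(self.scale * a)


if __name__ == "__main__":
    block = VisualAttentionCondenser(channels=32, embed_channels=16)
    y = block(torch.randn(1, 32, 56, 56))
    print(y.shape)  # torch.Size([1, 32, 56, 56])
```

The key efficiency idea, as described in the abstract, is that the attention embedding is computed on a condensed representation, which keeps the self-attention cost low enough for low-power, on-device TinyML targets.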

Related research:

- TinySpeech: Attention Condensers for Deep Speech Recognition Neural Networks on Edge Devices (08/10/2020)
- AttendSeg: A Tiny Attention Condenser Neural Network for Semantic Segmentation on the Edge (04/29/2021)
- Dream Formulations and Deep Neural Networks: Humanistic Themes in the Iconology of the Machine-Learned Image (02/05/2018)
- AttoNets: Compact and Efficient Deep Neural Networks for the Edge via Human-Machine Collaborative Design (03/18/2019)
- Progressive Neural Networks for Image Classification (04/25/2018)
- Where Does Trust Break Down? A Quantitative Trust Analysis of Deep Neural Networks via Trust Matrix and Conditional Trust Densities (09/30/2020)
