Enabling Capsule Networks at the Edge through Approximate Softmax and Squash Operations

06/21/2022
by Alberto Marchisio et al.

Complex Deep Neural Networks such as Capsule Networks (CapsNets) offer high learning capability at the cost of compute-intensive operations. To enable their deployment on edge devices, we propose to leverage approximate computing to design approximate variants of their complex operations, namely softmax and squash. In our experiments, we evaluate the tradeoffs between the area, power consumption, and critical-path delay of the designs implemented with an ASIC design flow, and the accuracy of quantized CapsNets using the approximate operations, compared to their exact counterparts.
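To make the idea concrete, below is a minimal NumPy sketch of one softmax approximation and one squash approximation in the hardware-friendly spirit the abstract describes. The specific choices here are illustrative assumptions, not the paper's exact designs: the softmax replaces e^x with a power of two (a bit shift in hardware), and the squash replaces the exact reciprocal square root with a base-2 estimate refined by one Newton-Raphson step.

```python
import numpy as np

def softmax_exact(x):
    """Reference softmax with the usual max-subtraction for stability."""
    e = np.exp(x - x.max())
    return e / e.sum()

def softmax_pow2(x):
    """Approximate softmax: e^x ~= 2^round(x / ln 2).

    Rounding the exponent to an integer turns each exponential into a
    bit shift in hardware. Illustrative approximation, not necessarily
    the circuit used in the paper.
    """
    k = np.round((x - x.max()) / np.log(2.0))
    p = np.exp2(k)
    return p / p.sum()

def squash_exact(s, eps=1e-9):
    """Reference squash: (|s|^2 / (1 + |s|^2)) * s / |s|."""
    n2 = np.dot(s, s)
    return (n2 / (1.0 + n2)) * s / np.sqrt(n2 + eps)

def squash_approx(s):
    """Approximate squash: cheap reciprocal square root of |s|^2.

    The initial guess 2^(-round(log2 |s|^2)/2) corresponds to a simple
    exponent manipulation in hardware; one Newton-Raphson step refines
    it. Again an assumed variant for illustration.
    """
    n2 = np.dot(s, s)
    y = np.exp2(-0.5 * np.round(np.log2(n2)))   # coarse 1/sqrt(n2)
    y = y * (1.5 - 0.5 * n2 * y * y)            # one Newton step
    return (n2 / (1.0 + n2)) * s * y
```

The appeal of such designs is that the exponentials, divisions, and square roots of the exact functions, which dominate area and critical-path delay in an ASIC implementation, collapse into shifts, adds, and multiplies, while the output stays close enough to preserve the quantized network's accuracy.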


Related research

01/20/2022
HEAM: High-Efficiency Approximate Multiplier Optimization for Deep Neural Networks
We propose an optimization method for the automatic design of approximat...

12/02/2019
ReD-CaNe: A Systematic Methodology for Resilience Analysis and Design of Capsule Networks under Approximations
Recent advances in Capsule Networks (CapsNets) have shown their superior...

04/15/2020
Q-CapsNets: A Specialized Framework for Quantizing Capsule Networks
Capsule Networks (CapsNets), recently proposed by the Google Brain team,...

08/19/2022
Towards Efficient Capsule Networks
From the moment Neural Networks dominated the scene for image processing...

10/06/2021
Shifting Capsule Networks from the Cloud to the Deep Edge
Capsule networks (CapsNets) are an emerging trend in image processing. I...

03/07/2019
A Capsule-unified Framework of Deep Neural Networks for Graphical Programming
Recently, the growth of deep learning has produced a large number of dee...

12/23/2022
Approximate Scan Flip-flop to Reduce Functional Path Delay and Power Consumption
The scan-based testing has been widely used as a Design-for-Test (DfT) m...
