
Enabling Capsule Networks at the Edge through Approximate Softmax and Squash Operations

by   Alberto Marchisio, et al.
New York University (NYU)
Politecnico di Torino
TU Wien

Complex Deep Neural Networks such as Capsule Networks (CapsNets) exhibit high learning capabilities at the cost of compute-intensive operations. To enable their deployment on edge devices, we propose leveraging approximate computing to design approximate variants of complex operations such as softmax and squash. In our experiments, we evaluate the tradeoffs between the area, power consumption, and critical path delay of the designs, implemented with an ASIC design flow, and the accuracy of the quantized CapsNets, compared against the exact functions.
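To make the two operations concrete, the sketch below implements the exact softmax and squash functions used in CapsNets, plus one illustrative approximation: a base-2 softmax that replaces e^x with 2^x, a common hardware-friendly substitution since powers of two map to shifts. This is a minimal NumPy sketch of the general idea, not the specific approximate designs evaluated in the paper; `softmax_pow2` is a hypothetical stand-in.

```python
import numpy as np

def squash(s, eps=1e-8):
    # Exact squash: shrinks the vector's norm into [0, 1)
    # while preserving its direction.
    norm_sq = np.sum(s * s)
    norm = np.sqrt(norm_sq)
    return (norm_sq / (1.0 + norm_sq)) * (s / (norm + eps))

def softmax(x):
    # Exact, numerically stable softmax (subtract the max before exp).
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

def softmax_pow2(x):
    # Illustrative approximation (not the paper's design): use 2^x
    # instead of e^x, so hardware can replace the exponential unit
    # with shift-based logic. Normalization keeps the outputs a
    # valid probability distribution.
    p = np.exp2(x - np.max(x))
    return p / np.sum(p)
```

Note that the base-2 variant still sums to 1 and preserves the argmax (2^x is monotonic), so routing decisions driven by the largest coefficient are unchanged; only the relative softness of the distribution differs from the exact function.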




Related Research

HEAM: High-Efficiency Approximate Multiplier Optimization for Deep Neural Networks

ReD-CaNe: A Systematic Methodology for Resilience Analysis and Design of Capsule Networks under Approximations

Q-CapsNets: A Specialized Framework for Quantizing Capsule Networks

Towards Efficient Capsule Networks

Shifting Capsule Networks from the Cloud to the Deep Edge

A Capsule-unified Framework of Deep Neural Networks for Graphical Programming

Approximate Scan Flip-flop to Reduce Functional Path Delay and Power Consumption