Efficiency and Scalability of Multi-Lane Capsule Networks (MLCN)

08/11/2019
by Vanderson M. do Rosario, et al.

Some Deep Neural Networks (DNNs) have what we call lanes, or can be reorganized to have them. Lanes are data-independent paths through the network that typically learn different features or add resilience to the network. Given their data-independence, lanes are amenable to parallel processing. The Multi-Lane CapsNet (MLCN) is a proposed reorganization of the Capsule Network that achieves better accuracy while introducing highly parallel lanes. However, the efficiency and scalability of MLCN had not been systematically examined. In this work, we study the MLCN network on multiple GPUs, finding that it is 2x more efficient than the original CapsNet when using model parallelism. Further, we present the load-balancing problem of distributing heterogeneous lanes across homogeneous or heterogeneous accelerators, and show that a simple greedy heuristic can be almost 50% better than a random approach.
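The greedy heuristic mentioned in the abstract is not spelled out here; one plausible minimal sketch is longest-processing-time (LPT) scheduling, where the heaviest lanes are placed first on whichever accelerator would finish them earliest. The lane costs and device speeds below are hypothetical illustration values, not figures from the paper.

```python
def greedy_assign(lane_costs, device_speeds):
    """Greedily assign each lane to the device that finishes it earliest.

    lane_costs: estimated work per lane (e.g., relative FLOPs).
    device_speeds: relative throughput of each accelerator (1.0 = baseline).
    Returns (assignment, makespan): lane index -> device index, and the
    finish time of the most loaded device.
    """
    finish = [0.0] * len(device_speeds)      # accumulated time per device
    assignment = [None] * len(lane_costs)
    # Place the heaviest lanes first (LPT) so small lanes fill the gaps.
    for lane in sorted(range(len(lane_costs)), key=lambda i: -lane_costs[i]):
        # Pick the device with the earliest finish time after taking this lane.
        dev = min(range(len(device_speeds)),
                  key=lambda d: finish[d] + lane_costs[lane] / device_speeds[d])
        finish[dev] += lane_costs[lane] / device_speeds[dev]
        assignment[lane] = dev
    return assignment, max(finish)

# Example: four heterogeneous lanes on two GPUs, one twice as fast.
assignment, makespan = greedy_assign([4.0, 3.0, 2.0, 1.0], [2.0, 1.0])
```

Because lanes are data-independent, any such assignment is valid; the heuristic only affects how evenly the per-device finish times balance out.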

