Learning Compositional Structures for Deep Learning: Why Routing-by-agreement is Necessary

10/04/2020
by Sai Raam Venkatraman, et al.

A formal description of the compositionality of neural networks is directly tied to the formal grammar of the objects they seek to represent. This grammar specifies the kinds of components that make up an object and the configurations those components are allowed to be in. In other words, an object can be described as a parse tree of its components, a structure that is a natural candidate for building connection patterns among neurons in neural networks. We present a formal grammar description of convolutional neural networks and capsule networks that shows how capsule networks can enforce such parse-tree structures, while CNNs cannot. Specifically, we show that the entropy of the routing coefficients in the dynamic routing algorithm controls this ability. We therefore introduce the entropy of the routing weights as a loss function for better compositionality among capsules. Through experiments on data with compositional structure, we show that this loss enables capsule networks to better detect changes in compositionality, and that as the entropy of the routing weights increases, this ability degrades. Without routing, capsule networks perform similarly to convolutional neural networks: both models detect changes in compositionality poorly. Our results indicate that routing is an essential part of capsule networks, effectively answering recent work that has questioned its necessity. Finally, experiments on SmallNORB, CIFAR-10, and FashionMNIST show that this loss keeps the accuracy of capsule network models comparable to models that do not use it.
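The entropy loss described above can be sketched in a few lines. The code below is a minimal NumPy illustration, not the paper's implementation: it assumes routing coefficients are obtained by a softmax over raw routing logits (as in standard dynamic routing), and the function name `routing_entropy` and the array shapes are our own choices for illustration. Low entropy corresponds to each lower-level capsule coupling sharply to one parent, i.e. a parse-tree-like assignment; high entropy corresponds to diffuse routing.

```python
import numpy as np

def routing_entropy(logits, axis=-1, eps=1e-12):
    """Mean entropy of routing coefficients derived from raw routing logits.

    `logits` holds the raw routing scores (e.g. b_ij in dynamic routing),
    with `axis` indexing the candidate parent capsules. The softmax over
    that axis gives coupling coefficients c_ij; we return the entropy of
    each coupling distribution, averaged over all lower-level capsules.
    """
    z = logits - logits.max(axis=axis, keepdims=True)  # numerical stability
    c = np.exp(z)
    c /= c.sum(axis=axis, keepdims=True)               # coupling coefficients
    return -(c * np.log(c + eps)).sum(axis=axis).mean()

# Uniform routing has maximal entropy log(K); near-one-hot routing has
# entropy close to zero.
uniform = np.zeros((4, 10))   # softmax over 10 parents -> uniform coupling
peaked = np.eye(10) * 50.0    # softmax -> nearly one-hot coupling

h_uniform = routing_entropy(uniform)  # approx. log(10) ~= 2.303
h_peaked = routing_entropy(peaked)    # approx. 0
```

In training, such a term would be scaled by a weight and added to the usual classification loss, penalizing high-entropy (diffuse) routing; the exact weighting and placement in the paper's training objective may differ from this sketch.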


