Capsule networks for low-data transfer learning

04/26/2018
by Andrew Gritsevskiy et al.

We propose a capsule network-based architecture for generalizing learning to new data from only a few examples. Using both generative and non-generative capsule networks with intermediate routing, we are able to generalize to new information over 25 times faster than a similar convolutional neural network. We train the networks on the multiMNIST dataset with one digit class withheld. After the networks reach their maximum accuracy, we inject 1-100 examples of the missing digit into the training set and measure the number of batches needed to return to a comparable level of accuracy. We then discuss the improvement in low-data transfer learning that capsule networks bring, and propose future directions for capsule research.
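
To make the protocol concrete, below is a minimal PyTorch sketch of the injection experiment, simplified to single-label classification (multiMNIST is, strictly, a multi-digit task). The model, the dataset objects, the batch sizes, and the 0.95 recovery threshold are illustrative assumptions, not details from the paper.

import torch
from torch.utils.data import DataLoader, ConcatDataset, Subset

N_INJECTED = 100         # examples of the missing digit to inject (the paper sweeps 1-100)
TARGET_ACCURACY = 0.95   # stand-in for "a comparable level of accuracy" (assumption)

@torch.no_grad()
def evaluate(model, test_set):
    # Top-1 accuracy of the model on the held-out test set.
    model.eval()
    correct = total = 0
    for images, labels in DataLoader(test_set, batch_size=256):
        correct += (model(images).argmax(dim=1) == labels).sum().item()
        total += labels.numel()
    return correct / total

def batches_to_recover(model, base_train, missing_digit_pool, test_set):
    # Inject N_INJECTED examples of the withheld digit into the training
    # set, resume training, and count batches until accuracy recovers.
    # `missing_digit_pool` is assumed to contain only the withheld digit.
    injected = Subset(missing_digit_pool, range(N_INJECTED))
    loader = DataLoader(ConcatDataset([base_train, injected]),
                        batch_size=128, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters())
    criterion = torch.nn.CrossEntropyLoss()
    n_batches = 0
    while True:  # loop over epochs until the threshold is reached
        for images, labels in loader:
            model.train()
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            n_batches += 1
            # Evaluating every batch is wasteful; check every k batches in practice.
            if evaluate(model, test_set) >= TARGET_ACCURACY:
                return n_batches

Running batches_to_recover once with a pretrained capsule network and again with a similarly sized CNN, while sweeping N_INJECTED from 1 to 100, yields the batch-count comparison behind the reported 25x speedup.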

Related research

11/12/2019 · Grouping Capsules Based Different Types
Capsule network was introduced as a new architecture of neural networks,...

07/30/2020 · An Improvement for Capsule Networks using Depthwise Separable Convolution
Capsule Networks face a critical problem in computer vision in the sense...

02/13/2019 · Improving performance and inference on audio classification tasks using capsule networks
Classification of audio samples is an important part of many auditory sy...

07/22/2020 · Wasserstein Routed Capsule Networks
Capsule networks offer interesting properties and provide an alternative...

06/07/2019 · Kernelized Capsule Networks
Capsule Networks attempt to represent patterns in images in a way that p...

08/22/2018 · Capsule Networks for Protein Structure Classification and Prediction
Capsule Networks have great potential to tackle problems in structural b...

07/13/2019 · Using dynamic routing to extract intermediate features for developing scalable capsule networks
Capsule networks have gained a lot of popularity in short time due to it...
