SplineNets: Continuous Neural Decision Graphs

10/31/2018
by Cem Keskin, et al.

We present SplineNets, a practical and novel approach for using conditioning in convolutional neural networks (CNNs). SplineNets are continuous generalizations of neural decision graphs, and they can dramatically reduce the runtime complexity and computation cost of CNNs while maintaining or even increasing accuracy. The functions of SplineNets are both dynamic (i.e., conditioned on the input) and hierarchical (i.e., conditioned on the computational path). SplineNets employ a unified loss function with a desired level of smoothness over both the network and decision parameters, while allowing for sparse activation of a subset of nodes for individual samples. In particular, we embed infinitely many function weights (e.g., filters) on smooth, low-dimensional manifolds parameterized by compact B-splines, which are indexed by a position parameter. Instead of sampling from a categorical distribution to pick a branch, samples choose a continuous position to pick a function weight. We further show that by maximizing the mutual information between spline positions and class labels, the network can be optimally utilized and specialized for classification tasks. Experiments show that our approach can significantly increase the accuracy of ResNets at negligible cost in speed, matching the precision of a 110-layer ResNet with a 32-layer SplineNet.
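
To make the mechanism concrete, below is a minimal PyTorch sketch of a single spline-parameterized convolution. This is not the authors' code: `SplineConv2d`, `bspline_basis`, the pooling-based decision head, and all hyperparameters are illustrative assumptions of ours; it processes one sample at a time for readability, assumes num_ctrl > degree, and omits the paper's hierarchical conditioning and mutual-information regularizer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def bspline_basis(t, num_ctrl, degree=3):
    """Cox-de Boor recursion on an open-uniform knot vector.
    Returns the (num_ctrl,) B-spline basis weights at scalar position t in [0, 1]."""
    n, p = num_ctrl, degree
    interior = torch.linspace(0.0, 1.0, n - p + 1, device=t.device, dtype=t.dtype)
    knots = torch.cat([torch.zeros(p, device=t.device, dtype=t.dtype),
                       interior,
                       torch.ones(p, device=t.device, dtype=t.dtype)])
    t = t.clamp(0.0, 1.0 - 1e-6)                # keep t inside the last knot span
    # Degree-0 basis: indicator of the knot span containing t.
    N = ((knots[:-1] <= t) & (t < knots[1:])).to(t.dtype)
    for d in range(1, p + 1):                   # raise the degree step by step
        # Zero-width spans (repeated knots) have zero basis, so the clamped
        # denominators never contribute spurious terms.
        left = (t - knots[:-(d + 1)]) / (knots[d:-1] - knots[:-(d + 1)]).clamp(min=1e-8)
        right = (knots[d + 1:] - t) / (knots[d + 1:] - knots[1:-d]).clamp(min=1e-8)
        N = left * N[:-1] + right * N[1:]
    return N                                    # sums to 1; at most degree+1 nonzero

class SplineConv2d(nn.Module):
    """Sketch of one SplineNet-style layer: conv filters live on a 1-D manifold
    spanned by B-spline control points, and a small decision head picks a
    continuous position on that manifold per sample."""
    def __init__(self, in_ch, out_ch, k=3, num_ctrl=8, degree=3):
        super().__init__()
        self.k, self.num_ctrl, self.degree = k, num_ctrl, degree
        # Control-point filter banks: the spline's control points.
        self.ctrl = nn.Parameter(0.05 * torch.randn(num_ctrl, out_ch, in_ch, k, k))
        # Decision head: global pool -> linear -> sigmoid gives a position in [0, 1].
        self.pos = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                 nn.Linear(in_ch, 1), nn.Sigmoid())

    def forward(self, x):
        outs = []
        for i in range(x.size(0)):              # per-sample filters; looped for clarity
            t = self.pos(x[i:i + 1]).squeeze()
            basis = bspline_basis(t, self.num_ctrl, self.degree)
            # Blend control filters into the single filter at position t.
            w = torch.einsum('c,coihw->oihw', basis, self.ctrl)
            outs.append(F.conv2d(x[i:i + 1], w, padding=self.k // 2))
        return torch.cat(outs, dim=0)

# Usage: positions, and hence filters, vary smoothly with the input.
layer = SplineConv2d(16, 32)
y = layer(torch.randn(4, 16, 28, 28))           # -> (4, 32, 28, 28)
```

Because the position t enters the filter through smooth B-spline basis functions, the whole selection mechanism trains with ordinary backpropagation, in contrast to sampling a discrete branch. And since compact B-splines have local support, at most degree + 1 control filters are active at any position, which reflects the sparse activation of a subset of nodes described in the abstract.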

Related research

- Continuous approximation by convolutional neural networks with a sigmoidal function (09/27/2022)
- An Empirical Study on Position of the Batch Normalization Layer in Convolutional Neural Networks (12/09/2019)
- Filter Pruning using Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks (11/04/2020)
- Modulated binary cliquenet (02/27/2019)
- AdjointBackMap: Reconstructing Effective Decision Hypersurfaces from CNN Layers Using Adjoint Operators (12/16/2020)
- Decision Forests, Convolutional Networks and the Models in-Between (03/03/2016)
- How Much Position Information Do Convolutional Neural Networks Encode? (01/22/2020)
