Channel redundancy and overlap in convolutional neural networks with channel-wise NNK graphs

10/18/2021
by David Bonet, et al.

Feature spaces in the deep layers of convolutional neural networks (CNNs) are often very high-dimensional and difficult to interpret. However, convolutional layers consist of multiple channels that are activated by different types of inputs, which suggests that more insight may be gained by studying the channels and how they relate to each other. In this paper, we first theoretically analyze channel-wise non-negative kernel (CW-NNK) regression graphs, which allow us to quantify the overlap between channels and, indirectly, the intrinsic dimension of the data representation manifold. We find that redundancy between channels is significant and varies with layer depth and the level of regularization used during training. Additionally, we observe that channel overlap in the last convolutional layer correlates with generalization performance. Our experimental results demonstrate that these techniques can lead to a better understanding of deep representations.
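To make the channel-wise construction concrete, below is a minimal sketch of how NNK-style neighborhoods per channel and an overlap score between two channels could be computed. It assumes a Gaussian kernel, treats each channel's flattened spatial activation map as that channel's per-image feature, solves the non-negative kernel regression at each node with a Cholesky factorization plus non-negative least squares, and uses the mean Jaccard similarity of per-node neighborhoods as an overlap proxy. These are illustrative choices, not necessarily the exact definitions used in the paper.

```python
import numpy as np
from scipy.optimize import nnls


def gaussian_kernel(X, sigma=1.0):
    """Pairwise Gaussian kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))


def nnk_neighborhoods(X, k=10, sigma=1.0):
    """For each point, return the candidate indices that get a nonzero NNK weight.

    The weights solve min_{theta >= 0} theta^T K_S theta - 2 theta^T k_i over the
    k most similar candidates; here the quadratic program is rewritten via a
    Cholesky factor of K_S and solved with non-negative least squares.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    neighborhoods = []
    for i in range(n):
        order = np.argsort(-K[i])
        cand = order[order != i][:k]               # k most similar points, self excluded
        K_S = K[np.ix_(cand, cand)]
        k_i = K[cand, i]
        L = np.linalg.cholesky(K_S + 1e-8 * np.eye(len(cand)))
        theta, _ = nnls(L.T, np.linalg.solve(L, k_i))
        neighborhoods.append(set(cand[theta > 1e-8]))
    return neighborhoods


def channel_overlap(nbrs_a, nbrs_b):
    """Mean Jaccard similarity of per-point NNK neighborhoods of two channels (a simple proxy)."""
    scores = []
    for sa, sb in zip(nbrs_a, nbrs_b):
        union = sa | sb
        scores.append(len(sa & sb) / len(union) if union else 1.0)
    return float(np.mean(scores))


# Toy example: activations of one conv layer on a probe set, shaped
# (num_images, channels, H, W); each channel is summarized by flattening
# its spatial map (an illustrative choice).
acts = np.random.rand(200, 8, 4, 4)
chan_feats = [acts[:, c].reshape(acts.shape[0], -1) for c in range(acts.shape[1])]
nbrs = [nnk_neighborhoods(f, k=10, sigma=1.0) for f in chan_feats]
print(channel_overlap(nbrs[0], nbrs[1]))           # overlap proxy for channels 0 and 1
```

In this sketch, a high overlap score between two channels means their NNK graphs connect the same images to similar neighborhoods, i.e. the channels induce redundant views of the data.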

Related research

07/27/2021 · Channel-Wise Early Stopping without a Validation Set via NNK Polytope Interpolation
State-of-the-art neural network architectures continue to scale in size ...

06/10/2019 · Unit Impulse Response as an Explainer of Redundancy in a Deep Convolutional Neural Network
Convolutional neural networks (CNN) are generally designed with a heuris...

07/03/2018 · Stochastic Channel Decorrelation Network and Its Application to Visual Tracking
Deep convolutional neural networks (CNNs) have dominated many computer v...

07/02/2020 · Channel Compression: Rethinking Information Redundancy among Channels in CNN Architecture
Model compression and acceleration are attracting increasing attentions ...

02/10/2021 · CIFS: Improving Adversarial Robustness of CNNs via Channel-wise Importance-based Feature Selection
We investigate the adversarial robustness of CNNs from the perspective o...

10/31/2022 · ωGNNs: Deep Graph Neural Networks Enhanced by Multiple Propagation Operators
Graph Neural Networks (GNNs) are limited in their propagation operators....

02/29/2020 · Channel Equilibrium Networks for Learning Deep Representation
Convolutional Neural Networks (CNNs) are typically constructed by stacki...
