
Channel redundancy and overlap in convolutional neural networks with channel-wise NNK graphs

by David Bonet, et al.

Feature spaces in the deep layers of convolutional neural networks (CNNs) are often very high-dimensional and difficult to interpret. However, convolutional layers consist of multiple channels that are activated by different types of inputs, which suggests that more insight can be gained by studying the channels and how they relate to each other. In this paper, we first theoretically analyze channel-wise non-negative kernel (CW-NNK) regression graphs, which allow us to quantify the overlap between channels and, indirectly, the intrinsic dimension of the data representation manifold. We find that redundancy between channels is significant and varies with layer depth and with the level of regularization applied during training. Additionally, we observe a correlation between channel overlap in the last convolutional layer and generalization performance. Our experimental results demonstrate that these techniques can lead to a better understanding of deep representations.
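To make the idea concrete, here is a minimal sketch of an NNK-style neighborhood computation applied to channels. It assumes each channel is represented by a flattened activation vector and solves, for each channel, a non-negative least squares fit against the other channels; a large weight on another channel is a rough proxy for overlap/redundancy. This is an illustrative simplification, not the paper's exact CW-NNK construction (which operates on kernel similarities and k-nearest-neighbor candidate sets).

```python
import numpy as np
from scipy.optimize import nnls

def nnk_weights(x_i, candidates):
    """Solve min_{theta >= 0} ||x_i - candidates.T @ theta||^2 via NNLS.

    candidates: (k, d) array of neighbor representations.
    Returns a sparse non-negative weight vector; zero entries
    correspond to neighbors pruned by the non-negativity constraint.
    """
    theta, _residual = nnls(candidates.T, x_i)
    return theta

# Toy example: 5 "channels", each represented by an 8-dim activation vector.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
# Make channel 3 nearly duplicate channel 0, so we expect high overlap.
X[3] = X[0] + 0.01 * rng.standard_normal(8)

weights = np.zeros((5, 5))
for i in range(5):
    idx = [j for j in range(5) if j != i]
    weights[i, idx] = nnk_weights(X[i], X[idx])

# Redundancy proxy: channel 3's weight mass concentrates on channel 0.
print(weights[3].round(2))
```

Under this setup, the row for the duplicated channel assigns nearly all of its weight to its near-copy, which is the kind of signal the paper aggregates across channels to quantify redundancy.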


Related papers:

- Channel-Wise Early Stopping without a Validation Set via NNK Polytope Interpolation
- Unit Impulse Response as an Explainer of Redundancy in a Deep Convolutional Neural Network
- Stochastic Channel Decorrelation Network and Its Application to Visual Tracking
- Channel Compression: Rethinking Information Redundancy among Channels in CNN Architecture
- CIFS: Improving Adversarial Robustness of CNNs via Channel-wise Importance-based Feature Selection
- ωGNNs: Deep Graph Neural Networks Enhanced by Multiple Propagation Operators
- Channel Equilibrium Networks for Learning Deep Representation