Why do CNNs Learn Consistent Representations in their First Layer Independent of Labels and Architecture?

06/06/2022
by   Rhea Chowers, et al.

It has previously been observed that the filters learned in the first layer of a CNN are qualitatively similar for different networks and tasks. We extend this finding and show a high quantitative similarity between filters learned by different networks. We consider the CNN filters as a filter bank and measure the sensitivity of the filter bank to different frequencies. We show that the sensitivity profile of different networks is almost identical, yet far from initialization. Remarkably, we show that it remains the same even when the network is trained with random labels. To understand this effect, we derive an analytic formula for the sensitivity of the filters in the first layer of a linear CNN. We prove that when the average patch in images of the two classes is identical, the sensitivity profile of the filters in the first layer will be identical in expectation when using the true labels or random labels, and will depend only on the second-order statistics of image patches. We empirically demonstrate that the average-patch assumption holds for realistic datasets. Finally, we show that the energy profile of filters in nonlinear CNNs is highly correlated with the energy profile of linear CNNs, and that our analysis of linear networks allows us to predict when representations learned by state-of-the-art networks trained on benchmark classification tasks will depend on the labels.
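The frequency-sensitivity measurement described above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: it assumes first-layer filters stored as an array of shape (num_filters, channels, k, k) (the PyTorch convention) and defines the energy profile as the summed squared magnitude of each filter's 2D DFT at every spatial frequency, normalized to sum to one.

```python
import numpy as np

def energy_profile(filters):
    """Frequency-sensitivity (energy) profile of a first-layer filter bank.

    filters: array of shape (num_filters, channels, k, k).
    Returns a (k, k) array giving, for each 2D spatial frequency, the
    total squared DFT magnitude across all filters and input channels,
    normalized to sum to 1.
    """
    # 2D DFT over the spatial dimensions of every filter/channel
    spectra = np.fft.fft2(filters, axes=(-2, -1))
    # Aggregate squared magnitudes over filters and channels
    energy = (np.abs(spectra) ** 2).sum(axis=(0, 1))
    return energy / energy.sum()

# Example: a random "initialization" bank of 64 filters, 3 channels, 5x5 kernels.
# Comparing this profile before and after training (or across two trained
# networks) is the kind of comparison the abstract refers to.
rng = np.random.default_rng(0)
bank = rng.standard_normal((64, 3, 5, 5))
profile = energy_profile(bank)
```

Two trained networks can then be compared by correlating their normalized profiles, which is how a "high quantitative similarity" between filter banks would show up numerically.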


