Investigating Learning in Deep Neural Networks using Layer-Wise Weight Change

11/13/2020
by Ayush Manish Agrawal, et al.

Understanding the per-layer learning dynamics of deep neural networks is of significant interest, as it may provide insight into how neural networks learn and point toward better training regimens. We investigate learning in deep convolutional neural networks (CNNs) by measuring the relative weight change of each layer while training. Several interesting trends emerge across a variety of CNN architectures and computer vision classification tasks, including an overall increase in the relative weight change of later layers compared to earlier ones.
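The metric described in the abstract, per-layer relative weight change measured during training, can be sketched in a few lines of PyTorch. The snippet below is an illustrative sketch rather than the paper's reference implementation: the choice of the L1 norm and the snapshot-per-step granularity are assumptions, and the toy model and function names are made up for the example.

```python
import torch
import torch.nn as nn

def snapshot_weights(model):
    """Copy the current weight tensors so a later delta can be measured."""
    return {name: p.detach().clone()
            for name, p in model.named_parameters() if "weight" in name}

def relative_weight_change(model, prev):
    """Per-layer relative weight change between two points in training.

    Computed here as ||W_l - W_l_prev||_1 / ||W_l_prev||_1 for each layer l;
    the specific norm is an assumption, not necessarily the paper's exact definition.
    """
    return {name: ((p.detach() - prev[name]).abs().sum() / prev[name].abs().sum()).item()
            for name, p in model.named_parameters() if "weight" in name}

# Minimal usage sketch: a toy CNN, one optimisation step, then the per-layer deltas.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Flatten(),
                      nn.Linear(8 * 30 * 30, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

before = snapshot_weights(model)
x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()

for layer, delta in relative_weight_change(model, before).items():
    print(f"{layer}: {delta:.4f}")
```

Tracking these values per layer and per epoch is what allows the comparison between earlier and later layers that the abstract reports.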
