
Using UNet and PSPNet to explore the reusability principle of CNN parameters

08/08/2020
by Wei Wang, et al.

How to reduce the required training dataset size is a hot topic in the deep learning community. One straightforward way is to reuse pre-trained parameters. Previous work such as deep transfer learning reuses the model parameters trained for a first task as the starting point for a second task, and semi-supervised learning trains on a combination of labeled and unlabeled data. However, the fundamental reason for the success of these methods is unclear. In this paper, the reusability of the parameters in each layer of a deep convolutional neural network is experimentally quantified by using the same network for a segmentation task and an auto-encoder task. This paper proves that network parameters can be reused for two reasons: first, the network features are general; second, there is little difference between the pre-trained parameters and the ideal network parameters. Through parameter replacement and comparison, we demonstrate that reusability differs between BN (Batch Normalization) [7] layers and convolutional layers, and make the following observations: (1) the running mean and running variance play a more important role than the weight and bias in BN layers; (2) the weight and bias can be reused in BN layers; (3) the network is very sensitive to the weights of convolutional layers; (4) the biases in convolutional layers are not sensitive and can be reused directly.
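The parameter-replacement experiments suggest a simple reuse recipe: copy the BN running statistics, BN affine parameters, and convolutional biases from a pre-trained network, while leaving the convolutional weights to be learned from scratch. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that recipe, assuming both networks share the same architecture (e.g., a UNet), and the helper name reuse_parameters is hypothetical.

```python
import torch
import torch.nn as nn

def reuse_parameters(pretrained: nn.Module, target: nn.Module) -> None:
    """Copy the parameter groups the paper found reusable from a
    pre-trained network into a freshly initialized one of the same
    architecture. Convolutional weights are deliberately left at
    their fresh initialization, since the network is sensitive to them."""
    with torch.no_grad():
        for (_, src), (_, dst) in zip(pretrained.named_modules(),
                                      target.named_modules()):
            if isinstance(src, nn.BatchNorm2d):
                # Running mean/variance matter most in BN layers.
                dst.running_mean.copy_(src.running_mean)
                dst.running_var.copy_(src.running_var)
                # The BN weight and bias can also be reused.
                dst.weight.copy_(src.weight)
                dst.bias.copy_(src.bias)
            elif isinstance(src, nn.Conv2d) and src.bias is not None:
                # Conv biases are insensitive and reusable directly.
                dst.bias.copy_(src.bias)
```

After calling reuse_parameters, the target network can be fine-tuned on the second task with only the convolutional weights starting from random initialization.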

