Why Layer-Wise Learning is Hard to Scale-up and a Possible Solution via Accelerated Downsampling

10/15/2020
by Wenchi Ma, et al.

Layer-wise learning, as an alternative to global back-propagation, is easy to interpret and analyze, and it is memory efficient. Recent studies demonstrate that layer-wise learning can achieve state-of-the-art performance in image classification on various datasets. However, previous studies of layer-wise learning are limited to networks with simple hierarchical structures, and performance degrades severely for deeper networks such as ResNet. This paper, for the first time, reveals that the fundamental obstacle to scaling up layer-wise learning is the relatively poor separability of the feature space in shallow layers. This argument is empirically verified by controlling the intensity of the convolution operation in local layers. We discover that the poorly-separable features from shallow layers are mismatched with the strong supervision constraint applied throughout the entire network, making layer-wise learning sensitive to network depth. The paper further proposes a downsampling acceleration approach that weakens the poor learning in shallow layers and thereby shifts the learning emphasis to the deep feature space, where separability better matches the supervision constraint. Extensive experiments have been conducted to verify the new finding and to demonstrate the advantages of the proposed downsampling acceleration in improving the performance of layer-wise learning.
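To make the two ideas in the abstract concrete, below is a minimal PyTorch sketch of layer-wise (greedy local) training combined with accelerated downsampling. It is not the authors' implementation: the block structure, the stride schedule, and the auxiliary classifier heads are hypothetical illustrations. The sketch shows the two essential mechanisms: each stage is updated only by its own local loss (the input to each stage is detached, so no gradient crosses stage boundaries), and downsampling is applied aggressively in the shallow stages so that the strong supervision signal falls mainly on deeper, more separable feature spaces.

```python
import torch
import torch.nn as nn

class LocalBlock(nn.Module):
    """One locally-trained stage: a conv block plus its own auxiliary classifier.

    The auxiliary loss updates only this block; the block's input is detached
    upstream, so no gradient flows back into earlier stages (layer-wise learning).
    """
    def __init__(self, in_ch, out_ch, stride, num_classes):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        # Hypothetical local head: pooled features -> class logits.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(out_ch, num_classes)
        )

    def forward(self, x):
        feat = self.body(x)
        return feat, self.head(feat)

# Hypothetical stride schedule illustrating accelerated downsampling:
# strides (2, 2, 1, 1) shrink the feature map early, so the shallow,
# poorly-separable stages carry less of the supervised learning burden;
# a conventional schedule would delay the stride-2 convolutions.
channels = [3, 64, 128, 256, 256]
strides = [2, 2, 1, 1]
blocks = nn.ModuleList(
    LocalBlock(channels[i], channels[i + 1], strides[i], num_classes=10)
    for i in range(4)
)
optims = [torch.optim.SGD(b.parameters(), lr=0.1) for b in blocks]
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    x = images
    for block, opt in zip(blocks, optims):
        x = x.detach()      # cut the graph: updates stay purely local
        feat, logits = block(x)
        loss = criterion(logits, labels)
        opt.zero_grad()
        loss.backward()     # gradients reach only this block's parameters
        opt.step()
        x = feat            # pass features forward, detached at the next stage
```

In this sketch, moving the stride-2 convolutions earlier is the only change needed to "accelerate" downsampling; the local-loss training loop itself is unchanged, which is consistent with the paper's framing of the approach as rebalancing where the supervision constraint does its work rather than altering the layer-wise learning rule.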

