Deeply-supervised Knowledge Synergy

06/03/2019
by Dawei Sun, et al.

Convolutional Neural Networks (CNNs) have become deeper and more complicated than the pioneering AlexNet. However, the prevailing training scheme still follows the conventional practice of adding supervision only to the last layer of the network and propagating the error information to earlier layers layer by layer. In this paper, we propose Deeply-supervised Knowledge Synergy (DKS), a new method that trains CNNs with improved generalization ability for image classification without introducing extra computational cost at inference time. Inspired by the deeply-supervised learning scheme, we first append auxiliary supervision branches on top of certain intermediate network layers. While properly placed auxiliary supervision can improve model accuracy to some degree, we go one step further and explore the probabilistic knowledge dynamically learned by the classifiers connected to the backbone network as a new regularization to improve training. A novel synergy loss, which considers pairwise knowledge matching among all supervision branches, is presented. Intriguingly, it enables dense pairwise knowledge-matching operations in both top-down and bottom-up directions at each training iteration, resembling a dynamic synergy process for the same task. We evaluate DKS on image classification datasets using state-of-the-art CNN architectures and show that models trained with it consistently outperform their counterparts. For instance, on the ImageNet classification benchmark, our ResNet-152 model outperforms its baseline by a 1.47% margin in Top-1 accuracy. Code is available at https://github.com/sundw2014/DKS.
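As a rough illustration of the idea, and not the authors' exact formulation (see the repository above for that), the sketch below combines per-branch cross-entropy supervision with dense pairwise KL-based knowledge matching among all classifier heads. It assumes PyTorch, a list of branch logits, and hypothetical hyperparameters T (softmax temperature) and alpha (synergy-loss weight):

```python
import torch
import torch.nn.functional as F

def dks_loss(branch_logits, targets, T=2.0, alpha=1.0):
    """Sketch of a deeply-supervised loss with pairwise knowledge synergy.

    branch_logits: list of [N, C] logit tensors, one per classifier
                   (the final head plus each auxiliary branch).
    targets:       [N] tensor of ground-truth class indices.
    T, alpha:      hypothetical temperature / weighting hyperparameters.
    """
    # Deep supervision: every classifier receives the ordinary CE loss.
    ce = sum(F.cross_entropy(z, targets) for z in branch_logits)

    # Synergy: dense pairwise knowledge matching in both directions
    # (top-down and bottom-up). Each classifier's softened prediction is
    # treated as a fixed target (detach) that every other classifier is
    # pulled towards via KL divergence.
    kl = 0.0
    for i, z_i in enumerate(branch_logits):
        for j, z_j in enumerate(branch_logits):
            if i == j:
                continue
            p_teacher = F.softmax(z_j.detach() / T, dim=1)
            log_p_student = F.log_softmax(z_i / T, dim=1)
            kl += F.kl_div(log_p_student, p_teacher,
                           reduction="batchmean") * (T * T)

    return ce + alpha * kl
```

Because the auxiliary branches and the synergy loss are used only during training, only the backbone's final classifier is kept at test time, which is why the method adds no inference cost.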

Related research

03/24/2020 · Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives
While the depth of modern Convolutional Neural Networks (CNNs) surpasses...

07/12/2022 · Contrastive Deep Supervision
The success of deep learning is usually accompanied by the growth in neu...

05/11/2015 · Training Deeper Convolutional Networks with Deep Supervision
One of the most promising ways of improving the performance of deep conv...

09/28/2022 · Deeply Supervised Layer Selective Attention Network: Towards Label-Efficient Learning for Medical Image Classification
Labeling medical images depends on professional knowledge, making it dif...

01/24/2017 · Training Group Orthogonal Neural Networks with Privileged Information
Learning rich and diverse representations is critical for the performanc...

12/29/2022 · BiMLP: Compact Binary Architectures for Vision Multi-Layer Perceptrons
This paper studies the problem of designing compact binary architectures...

11/19/2019 · Inter-layer Collision Networks
Deeper neural networks are hard to train. Inspired by the elastic collis...
