Distilling Critical Paths in Convolutional Neural Networks

10/28/2018
by Fuxun Yu et al.

Neural network compression and acceleration are in wide demand due to the resource constraints of most deployment targets. In this paper, by analyzing filter activations and gradients and by visualizing the filters' functionality in convolutional neural networks, we show that filters in higher layers learn extremely task-specific features, which are exclusive to only a small subset of the overall task, or even a single class. Based on these findings, we reveal the critical paths of information flow for different classes, and by exploiting their intrinsic exclusiveness, we propose a critical-path distillation method that effectively customizes convolutional neural networks into compact models with much smaller size and less computation.
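To make the critical-path idea concrete, the minimal PyTorch sketch below scores the filters of a single convolutional layer by their mean absolute activation over images from one class and keeps the top-k as that class's critical filters in that layer. The scoring rule and the helper critical_filters are illustrative assumptions for this sketch, not the paper's exact procedure.

import torch
import torch.nn as nn

def critical_filters(model, layer, images, k=16):
    """Return indices of the k filters in `layer` with the highest mean
    absolute activation over `images` (all drawn from a single class).
    Illustrative scoring rule only; not the authors' exact method."""
    acts = []

    def hook(_module, _inp, out):
        # out: (batch, channels, H, W) -> one mean |activation| per filter
        acts.append(out.abs().mean(dim=(0, 2, 3)))

    handle = layer.register_forward_hook(hook)
    with torch.no_grad():
        model(images)
    handle.remove()

    scores = torch.stack(acts).mean(dim=0)
    return torch.topk(scores, k).indices

# Usage with a toy model and random stand-in data for one class:
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 10),
)
class_images = torch.randn(8, 3, 32, 32)  # stand-in for one class's inputs
idx = critical_filters(model, model[2], class_images, k=16)
print("critical filters in layer 2:", idx.tolist())

Repeating this per layer yields a class-specific chain of filters; distilling along such chains is how the paper shrinks the network for a customized subset of classes.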

Related research:

06/15/2015  Multi-path Convolutional Neural Networks for Complex Image Classification
  Convolutional Neural Networks demonstrate high performance on ImageNet L...

10/12/2018  Interpretable Convolutional Filter Pruning
  The sophisticated structure of Convolutional Neural Network (CNN) allows...

01/23/2018  Stacked Filters Stationary Flow For Hardware-Oriented Acceleration Of Deep Convolutional Neural Networks
  To address memory and computation resource limitations for hardware-orie...

10/29/2018  Demystifying Neural Network Filter Pruning
  Based on filter magnitude ranking (e.g. L1 norm), conventional filter pr...

12/17/2014  Flattened Convolutional Neural Networks for Feedforward Acceleration
  We present flattened convolutional neural networks that are designed for...

09/20/2021  Learning Versatile Convolution Filters for Efficient Visual Recognition
  This paper introduces versatile filters to construct efficient convoluti...

06/13/2017  Deep Control - a simple automatic gain control for memory efficient and high performance training of deep convolutional neural networks
  Training a deep convolutional neural net typically starts with a random ...
