Graph Modularity: Towards Understanding the Cross-Layer Transition of Feature Representations in Deep Neural Networks

11/24/2021
by Yao Lu, et al.

There are good arguments to support the claim that feature representations eventually transition from general to specific in deep neural networks (DNNs), but this transition remains relatively underexplored. In this work, we take a small step towards understanding this transition. We first characterize it by analyzing class separation in intermediate layers, and then model the process of class separation as community evolution in dynamic graphs. We introduce modularity, a common metric in graph theory, to quantify the evolution of these communities. We find that modularity tends to rise as layers go deeper, but descends or plateaus at particular layers. Through an asymptotic analysis, we show that modularity provides a quantitative account of the transition of feature representations. Building on this insight, we demonstrate that modularity can also be used to identify redundant layers in DNNs, which offers theoretical guidance for layer pruning. Based on this finding, we propose a modularity-based layer-wise pruning method. Further experiments show that our method prunes redundant layers with minimal impact on performance. The code is available at https://github.com/yaolu-zjut/Dynamic-Graphs-Construction.
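The paper's full pipeline lives in the repository linked above; as a rough illustration of the idea only, here is a minimal sketch, assuming a k-nearest-neighbor graph built from cosine similarities over one layer's feature vectors, with the ground-truth classes taken as communities. The function names, the choice of k, the similarity measure, and the plateau threshold eps are illustrative assumptions, not necessarily the authors' exact construction.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import modularity

def layer_modularity(features, labels, k=10):
    """Score class separation in one layer's feature space.

    features: (n_samples, n_dims) array of activations from one layer
    labels:   (n_samples,) array of ground-truth class ids
    """
    # Cosine similarity between every pair of feature vectors.
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)  # forbid self-loops

    # k-nearest-neighbor graph: link each sample to its k most
    # similar samples in this layer's representation.
    graph = nx.Graph()
    graph.add_nodes_from(range(len(features)))
    for i, row in enumerate(sim):
        for j in np.argpartition(row, -k)[-k:]:
            graph.add_edge(i, int(j))

    # One community per class; Newman modularity:
    # Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)
    communities = [
        {i for i, c in enumerate(labels) if c == cls}
        for cls in np.unique(labels)
    ]
    return modularity(graph, communities)

def plateau_layers(per_layer_q, eps=0.01):
    """Flag layers whose modularity gain over the previous layer is
    at most eps; such layers are candidates for pruning."""
    return [l for l in range(1, len(per_layer_q))
            if per_layer_q[l] - per_layer_q[l - 1] <= eps]
```

Calling layer_modularity on each layer's activations for a fixed evaluation batch traces the modularity curve described above; plateau_layers then marks the flat stretches that a modularity-based pruning method would treat as redundant.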


Related research

05/11/2021 · Leveraging Sparse Linear Layers for Debuggable Deep Networks
We show how fitting sparse linear models over learned deep feature repre...

01/09/2019 · How Compact?: Assessing Compactness of Representations through Layer-Wise Pruning
Various forms of representations may arise in the many layers embedded i...

08/21/2023 · Efficient Joint Optimization of Layer-Adaptive Weight Pruning in Deep Neural Networks
In this paper, we propose a novel layer-adaptive weight-pruning approach...

02/24/2023 · Defending Against Backdoor Attacks by Layer-wise Feature Analysis
Training deep neural networks (DNNs) usually requires massive training d...

03/30/2020 · Architecture Disentanglement for Deep Neural Networks
Deep Neural Networks (DNNs) are central to deep learning, and understand...

09/21/2023 · CoMFLP: Correlation Measure based Fast Search on ASR Layer Pruning
Transformer-based speech recognition (ASR) model with deep layers exhibi...

07/07/2020 · Hierarchical nucleation in deep neural networks
Deep convolutional networks (DCNs) learn meaningful representations wher...
