Convolutional Neural Network Dynamics: A Graph Perspective

11/09/2021
by Fatemeh Vahedian, et al.

The success of neural networks (NNs) in a wide range of applications has led to increased interest in understanding the underlying learning dynamics of these models. In this paper, we go beyond mere descriptions of the learning dynamics by taking a graph perspective and investigating the relationship between the graph structure of NNs and their performance. Specifically, we propose (1) representing the neural network learning process as a time-evolving graph (i.e., a series of static graph snapshots over epochs), (2) capturing the structural changes of the NN during the training phase in a simple temporal summary, and (3) leveraging the structural summary to predict the accuracy of the underlying NN in a classification or regression task. For the dynamic graph representation of NNs, we explore structural representations for fully-connected and convolutional layers, which are key components of powerful NN models. Our analysis shows that a simple summary of graph statistics, such as weighted degree and eigenvector centrality, over just a few epochs can be used to accurately predict the performance of NNs. For example, a weighted degree-based summary of the time-evolving graph constructed from 5 training epochs of the LeNet architecture achieves classification accuracy of over 93%. Our analysis covers several architectures, including LeNet, VGG, AlexNet, and ResNet.
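To make the construction concrete, the sketch below shows one plausible way to carry out the pipeline described above: map a fully-connected layer's weight matrix to a weighted graph snapshot, summarize each snapshot with weighted degree and eigenvector centrality, and concatenate the per-epoch summaries into a temporal feature vector that could feed an accuracy-prediction model. This is not the authors' code; the graph mapping, the histogram-based summary, and the networkx helpers are our own assumptions for illustration.

```python
# Hedged sketch (assumed construction, not the paper's exact method):
# build a weighted graph from a fully-connected layer, summarize it with
# weighted degree and eigenvector centrality, and stack summaries over epochs.
import numpy as np
import networkx as nx

def layer_to_graph(weights):
    """Map an (in_dim, out_dim) weight matrix to a weighted bipartite graph.

    Input units i0..i_{in-1} and output units o0..o_{out-1} are nodes;
    edge weights are absolute connection strengths.
    """
    in_dim, out_dim = weights.shape
    g = nx.Graph()
    for i in range(in_dim):
        for j in range(out_dim):
            g.add_edge(f"i{i}", f"o{j}", weight=abs(weights[i, j]))
    return g

def snapshot_summary(g, bins=10):
    """Summarize one graph snapshot as histograms of two node statistics."""
    wdeg = np.array([d for _, d in g.degree(weight="weight")])
    eig = np.array(list(nx.eigenvector_centrality_numpy(g, weight="weight").values()))
    deg_hist, _ = np.histogram(wdeg, bins=bins, density=True)
    eig_hist, _ = np.histogram(eig, bins=bins, density=True)
    return np.concatenate([deg_hist, eig_hist])

# weights_per_epoch stands in for a layer's weights saved after each of the
# first 5 training epochs (random matrices here, purely for illustration).
rng = np.random.default_rng(0)
weights_per_epoch = [rng.normal(size=(16, 8)) for _ in range(5)]

# Temporal summary of the time-evolving graph: concatenate per-epoch summaries.
temporal_summary = np.concatenate(
    [snapshot_summary(layer_to_graph(w)) for w in weights_per_epoch]
)
print(temporal_summary.shape)  # feature vector for a performance-prediction model
```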

Related research

05/26/2020 - A Protection against the Extraction of Neural Network Models
Given oracle access to a Neural Network (NN), it is possible to extract ...

11/06/2020 - Deep learning architectures for inference of AC-OPF solutions
We present a systematic comparison between neural network (NN) architect...

01/04/2022 - Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network
Static graph neural networks have been widely used in modeling and repre...

08/22/2023 - Development of a Novel Quantum Pre-processing Filter to Improve Image Classification Accuracy of Neural Network Models
This paper proposes a novel quantum pre-processing filter (QPF) to impro...

07/29/2021 - Structure and Performance of Fully Connected Neural Networks: Emerging Complex Network Properties
Understanding the behavior of Artificial Neural Networks is one of the m...

08/29/2022 - On Time and Space: An Experimental Study on Graph Structural and Temporal Encodings
Dynamic networks reflect temporal changes occurring to the graph's struc...

06/22/2020 - The GCE in a New Light: Disentangling the γ-ray Sky with Bayesian Graph Convolutional Neural Networks
A fundamental question regarding the Galactic Center Excess (GCE) is whe...
