How Neural Architectures Affect Deep Learning for Communication Networks?

11/03/2021
by Yifei Shen, et al.

In recent years, there has been a surge in applying deep learning to various challenging design problems in communication networks. Early attempts adopted neural architectures inherited from other applications, such as computer vision; these suffer from poor generalization and scalability, and lack interpretability. To tackle these issues, domain knowledge has been integrated into neural architecture design, achieving near-optimal performance in large-scale networks and generalizing well across different system settings. This paper endeavors to theoretically validate the importance and effects of neural architectures when applying deep learning to the design of communication networks. We prove that by exploiting permutation invariance, a common property in communication networks, graph neural networks (GNNs) converge faster and generalize better than fully connected multi-layer perceptrons (MLPs), especially when the number of nodes (e.g., users, base stations, or antennas) is large. Specifically, we prove that under common assumptions, for a communication network with n nodes, GNNs converge O(n log n) times faster and their generalization error is O(n) times lower, compared with MLPs.
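The permutation property the abstract refers to can be seen concretely in a toy comparison. Below is a minimal sketch (hypothetical layer sizes and weights, not the paper's model): a single mean-aggregation GNN layer with shared weights is permutation equivariant, i.e., relabeling the nodes simply permutes its output, whereas a fully connected layer that flattens the node dimension ties its weights to node identities and generally is not.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4  # n nodes, d features per node (illustrative sizes)

# One message-passing (GNN) layer: each node combines its own features
# with the mean of its neighbours' features, using weights shared by all nodes.
W_self = rng.standard_normal((d, d))
W_agg = rng.standard_normal((d, d))

def gnn_layer(X, A):
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ X) / np.maximum(deg, 1)  # mean over neighbours
    return np.tanh(X @ W_self + agg @ W_agg)

# A fully connected (MLP) layer, by contrast, flattens the node dimension,
# so each weight is attached to a specific node index.
W_mlp = rng.standard_normal((n * d, n * d))

def mlp_layer(X):
    return np.tanh(X.flatten() @ W_mlp).reshape(n, d)

# Random node features and a symmetric adjacency matrix without self-loops.
X = rng.standard_normal((n, d))
A = np.triu(rng.integers(0, 2, size=(n, n)), 1)
A = A + A.T

# A random permutation of the node labels, as a permutation matrix.
P = np.eye(n)[rng.permutation(n)]

# GNN: relabelling nodes permutes the output in the same way.
assert np.allclose(gnn_layer(P @ X, P @ A @ P.T), P @ gnn_layer(X, A))

# MLP: with generic weights, the same relabelling changes the output.
assert not np.allclose(mlp_layer(P @ X), P @ mlp_layer(X))
```

Because the GNN's weights are shared across nodes, its effective hypothesis space is much smaller than the MLP's, which is the intuition behind the faster convergence and lower generalization error quantified in the paper.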


research · 03/21/2022
Graph Neural Networks for Wireless Communications: From Theory to Practice
Deep learning-based approaches have been developed to solve challenging ...

research · 07/15/2020
Graph Neural Networks for Scalable Radio Resource Management: Architecture Design and Theoretical Analysis
Deep learning has recently emerged as a disruptive technology to solve c...

research · 01/17/2020
Node Masking: Making Graph Neural Networks Generalize and Scale Better
Graph Neural Networks (GNNs) have received a lot of interest in the rece...

research · 12/18/2022
Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs
Graph neural networks (GNNs), as the de-facto model class for representa...

research · 09/07/2020
GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training
Normalization plays an important role in the optimization of deep neural...

research · 07/06/2021
Discrete-Valued Neural Communication
Deep learning has advanced from fully connected architectures to structu...

research · 02/02/2021
Customizing Graph500 for Tianhe Pre-exacale system
BFS (Breadth-First Search) is a typical graph algorithm used as a key co...
