Gradient scarcity with Bilevel Optimization for Graph Learning

03/24/2023
by Hashem Ghanem, et al.

A common issue in graph learning under the semi-supervised setting is referred to as gradient scarcity: learning a graph by minimizing a loss on a subset of nodes causes edges between unlabelled nodes that are far from labelled ones to receive zero gradients. The phenomenon was first described when optimizing the graph and the weights of a Graph Convolutional Network (GCN) with a joint optimization algorithm. In this work, we give a precise mathematical characterization of this phenomenon, and prove that it also emerges in bilevel optimization, where additional dependency exists between the parameters of the problem. While for GCNs gradient scarcity occurs due to their finite receptive field, we show that it also occurs with the Laplacian regularization model, in the sense that gradient amplitude decreases exponentially with distance to labelled nodes. To alleviate this issue, we study several solutions: latent graph learning using a Graph-to-Graph model (G2G), graph regularization to impose a prior structure on the graph, or optimization over a larger graph than the original one with a reduced diameter. Our experiments on synthetic and real datasets validate our analysis and demonstrate the effectiveness of the proposed solutions.
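The receptive-field mechanism behind gradient scarcity can be illustrated numerically. Below is a minimal sketch (not the paper's actual model): on a path graph with learnable edge weights, a 2-layer message-passing scheme is applied, the supervised loss is placed on a single labelled node, and finite differences approximate the gradient with respect to each edge weight. Edges more than 2 hops from the labelled node receive exactly zero gradient. All function names and the toy setup here are illustrative assumptions, not from the paper.

```python
import numpy as np

def propagate(x, w, K):
    """K rounds of message passing on a path graph.
    w[i] is the learnable weight of edge (i, i+1); each node
    keeps its own value (self-loop) and adds weighted neighbours."""
    h = x.copy()
    n = len(x)
    for _ in range(K):
        new = h.copy()
        for i in range(n - 1):
            new[i] += w[i] * h[i + 1]
            new[i + 1] += w[i] * h[i]
        h = new
    return h

def loss(w, x, K):
    # Semi-supervised setting: the loss is computed on node 0 only.
    h = propagate(x, w, K)
    return (h[0] - 1.0) ** 2

n, K = 6, 2
x = np.arange(1.0, n + 1)          # toy node features
w = np.full(n - 1, 0.5)            # initial edge weights

# Finite-difference gradient of the loss w.r.t. each edge weight.
eps = 1e-6
grad = np.zeros(n - 1)
for e in range(n - 1):
    wp, wm = w.copy(), w.copy()
    wp[e] += eps
    wm[e] -= eps
    grad[e] = (loss(wp, x, K) - loss(wm, x, K)) / (2 * eps)

print(grad)  # only edges within K=2 hops of node 0 get nonzero gradient
```

With K = 2 layers, the output at the labelled node depends only on edges (0,1) and (1,2), so `grad[2:]` vanishes identically; this is the finite-receptive-field effect the abstract describes for GCNs.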

