GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation

06/28/2019
by   Marc Brockschmidt, et al.

This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM). Many GNN variants propagate information along the edges of a graph by computing "messages" based only on the representation of the source node of each edge. In GNN-FiLM, the representation of the target node of an edge is additionally used to compute a transformation that is applied to all incoming messages, allowing feature-wise modulation of the passed information. Experiments with GNN-FiLM, a number of baselines, and related extensions show that it outperforms the baseline methods while not being significantly slower.
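The modulation idea can be sketched in a few lines of numpy. This is an illustrative single-layer, single-edge-type sketch based on the abstract, not the authors' implementation: for each edge s→t, the target node computes a feature-wise scale (gamma) and shift (beta) from its own representation and applies them to the transformed source message; all names and the choice of ReLU are assumptions.

```python
import numpy as np

def gnn_film_layer(H, edges, W, W_gamma, W_beta):
    """One FiLM-style message-passing step (illustrative sketch).

    H: (n, d) node representations; edges: iterable of (source, target) pairs.
    W, W_gamma, W_beta: (d, d) weight matrices (single edge type assumed).
    For each edge s -> t, the message is
        m = gamma(h_t) * (W @ h_s) + beta(h_t),
    i.e. the target's representation modulates the incoming message
    feature-wise. Messages are summed per target node, then passed
    through a ReLU nonlinearity.
    """
    out = np.zeros_like(H)
    for s, t in edges:
        gamma = W_gamma @ H[t]  # feature-wise scale, computed from the target
        beta = W_beta @ H[t]    # feature-wise shift, computed from the target
        out[t] += gamma * (W @ H[s]) + beta
    return np.maximum(out, 0.0)
```

For example, with `W` the identity, `W_gamma` the identity, and `W_beta` zero, a single edge (0, 1) yields `out[1] = H[1] * H[0]`: the target's features scale the source's features element-wise.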


Related research

04/07/2022: Explicit Feature Interaction-aware Graph Neural Networks
Graph neural networks are powerful methods to handle graph-structured da...

07/11/2023: A Modal Logic for Explaining some Graph Neural Networks
In this paper, we propose a modal logic in which counting modalities app...

06/27/2022: Reduced Optimal Power Flow Using Graph Neural Network
OPF problems are formulated and solved for power system operations, espe...

07/31/2023: TFE-GNN: A Temporal Fusion Encoder Using Graph Neural Networks for Fine-grained Encrypted Traffic Classification
Encrypted traffic classification is receiving widespread attention from ...

03/02/2023: Technical report: Graph Neural Networks go Grammatical
This paper proposes a new GNN design strategy. This strategy relies on C...

01/27/2023: TrojanSAINT: Gate-Level Netlist Sampling-Based Inductive Learning for Hardware Trojan Detection
We propose TrojanSAINT, a graph neural network (GNN)-based hardware Troj...

04/07/2021: Optimizing Memory Efficiency of Graph Neural Networks on Edge Computing Platforms
Graph neural networks (GNN) have achieved state-of-the-art performance o...
