Graph Neural Processes: Towards Bayesian Graph Neural Networks

02/26/2019
by Andrew N. Carr, et al.

We introduce Graph Neural Processes (GNP), inspired by recent work on conditional and latent neural processes. A Graph Neural Process is defined as a Conditional Neural Process that operates on arbitrary graph data: it takes the features of sparsely observed context points as input and outputs a distribution over target points. We demonstrate Graph Neural Processes on edge imputation and discuss the benefits and drawbacks of the method for other application areas. One major benefit of GNPs is the ability to quantify uncertainty in deep learning on graph structures; another is the ability to extend graph neural networks to inputs of dynamically sized graphs.
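The description above maps onto the standard Conditional Neural Process recipe: encode each observed context point, aggregate into a permutation-invariant representation, and decode a predictive distribution for each target point. Below is a minimal PyTorch sketch of that structure applied to edge imputation; the class name, layer sizes, and the Gaussian output head are illustrative assumptions for clarity, not the authors' released implementation.

```python
# Minimal CNP-style sketch for graph edge imputation.
# Assumptions (not from the paper): module names, hidden sizes,
# mean aggregation, and a Gaussian predictive distribution.
import torch
import torch.nn as nn

class GraphNeuralProcess(nn.Module):
    def __init__(self, edge_feat_dim, hidden_dim=128, repr_dim=64):
        super().__init__()
        # Encoder: embeds each observed (context) edge's features and value.
        self.encoder = nn.Sequential(
            nn.Linear(edge_feat_dim + 1, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, repr_dim),
        )
        # Decoder: maps the aggregated context representation plus a
        # target edge's features to the parameters of a Gaussian.
        self.decoder = nn.Sequential(
            nn.Linear(repr_dim + edge_feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2),  # predictive mean and pre-softplus std
        )

    def forward(self, ctx_feats, ctx_values, tgt_feats):
        # ctx_feats:  (num_context, edge_feat_dim) features of observed edges
        # ctx_values: (num_context,)               observed edge labels/weights
        # tgt_feats:  (num_target, edge_feat_dim)  features of edges to impute
        h = self.encoder(torch.cat([ctx_feats, ctx_values.unsqueeze(-1)], dim=-1))
        r = h.mean(dim=0, keepdim=True)      # permutation-invariant aggregation
        r = r.expand(tgt_feats.size(0), -1)  # one copy per target edge
        out = self.decoder(torch.cat([r, tgt_feats], dim=-1))
        mu, raw_sigma = out.chunk(2, dim=-1)
        sigma = nn.functional.softplus(raw_sigma) + 1e-3  # keep std positive
        return torch.distributions.Normal(mu.squeeze(-1), sigma.squeeze(-1))

# Usage: train by minimizing negative log-likelihood of held-out target edges.
model = GraphNeuralProcess(edge_feat_dim=16)
dist = model(torch.randn(20, 16), torch.randn(20), torch.randn(5, 16))
loss = -dist.log_prob(torch.randn(5)).mean()
```

Returning a distribution rather than a point estimate is what gives the uncertainty quantification claimed above, and mean-pooling over however many context edges are observed is what lets the same model handle dynamically sized graphs.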
