DDGK: Learning Graph Representations for Deep Divergence Graph Kernels

04/21/2019
by   Rami Al-Rfou, et al.

Can neural networks learn to compare graphs without feature engineering? In this paper, we show that it is possible to learn representations for graph similarity with neither domain knowledge nor supervision (i.e. feature engineering or labeled graphs). We propose Deep Divergence Graph Kernels, an unsupervised method for learning representations over graphs that encodes a relaxed notion of graph isomorphism. Our method consists of three parts. First, we learn an encoder for each anchor graph to capture its structure. Second, for each pair of graphs, we train a cross-graph attention network which uses the node representations of an anchor graph to reconstruct another graph. This approach, which we call isomorphism attention, captures how well the representations of one graph can encode another. We use the attention-augmented encoder's predictions to define a divergence score for each pair of graphs. Finally, we construct an embedding space for all graphs using these pairwise divergence scores. Unlike previous work, much of which relies on 1) supervision, 2) domain-specific knowledge (e.g. a reliance on Weisfeiler-Lehman kernels), and 3) known node alignment, our unsupervised method jointly learns node representations, graph representations, and an attention-based alignment between graphs. Our experimental results show that Deep Divergence Graph Kernels can learn an unsupervised alignment between graphs, and that the learned representations achieve competitive results when used as features on a number of challenging graph classification tasks. Furthermore, we illustrate how the learned attention allows insight into the alignment of sub-structures across graphs.
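The three-part method above can be sketched in a deliberately simplified form. The toy below skips all learning: the "encoder" is just the anchor graph's row-normalized adjacency (a one-step neighbor predictor), and the cross-graph attention matrix is supplied rather than trained. The function names `neighbor_dist` and `divergence`, and this particular cross-entropy formulation, are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def neighbor_dist(adj):
    """Toy 'encoder': predict each node's neighbor distribution by
    row-normalizing the adjacency matrix."""
    deg = adj.sum(axis=1, keepdims=True)
    return adj / np.maximum(deg, 1.0)

def divergence(target_adj, anchor_adj, attn):
    """Score how well the anchor encoder reconstructs the target graph.

    attn is a |V_target| x |V_anchor| row-stochastic matrix standing in
    for learned isomorphism attention: it maps target nodes into the
    anchor, runs the anchor's neighbor predictor, and maps back.
    Returns an average cross-entropy over target nodes (lower = closer).
    """
    anchor_pred = neighbor_dist(anchor_adj)      # anchor encoder output
    recon = attn @ anchor_pred @ attn.T          # |V_t| x |V_t| reconstruction
    target_pred = neighbor_dist(target_adj)
    eps = 1e-9                                   # avoid log(0)
    return -np.sum(target_pred * np.log(recon + eps)) / len(target_adj)

# Example: a triangle diverges less from itself than from a 3-node path.
tri = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]], dtype=float)
path = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]], dtype=float)
identity_attn = np.eye(3)                        # fixed 1:1 node alignment
d_self = divergence(tri, tri, identity_attn)
d_cross = divergence(tri, path, identity_attn)
```

In the full method, both the encoder and the attention matrices are trained, and the resulting pairwise divergence scores (here `d_self` and `d_cross`) become coordinates in an embedding space over all graphs.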

