Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks

11/14/2020
by Franco Manessi, et al.

Self-supervised learning is currently gaining a lot of attention, as it allows neural networks to learn robust representations from large quantities of unlabeled data. Additionally, multi-task learning can further improve representation learning by training networks simultaneously on related tasks, leading to significant performance improvements. In this paper, we propose a general framework to improve graph-based neural network models by combining self-supervised auxiliary learning tasks in a multi-task fashion. Since Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points, we use them as a building block to achieve competitive results on standard semi-supervised graph classification tasks.
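To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' code) of the two ingredients the abstract combines: a single graph-convolution propagation step in the style of Kipf and Welling's GCN, and a multi-task loss that adds weighted self-supervised auxiliary objectives to the main semi-supervised objective. All function and parameter names are illustrative assumptions.

```python
def matmul(A, B):
    # Plain-Python matrix multiply for small dense matrices.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def gcn_layer(adj, X, W):
    """One GCN propagation step: H = D^{-1/2} (A + I) D^{-1/2} X W, then ReLU.

    adj: dense adjacency matrix (n x n), X: node features (n x f), W: weights (f x f').
    """
    n = len(adj)
    # Add self-loops so each node keeps its own features.
    A = [[adj[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in A]
    # Symmetric degree normalization.
    A_norm = [[A[i][j] / (deg[i] ** 0.5 * deg[j] ** 0.5) for j in range(n)]
              for i in range(n)]
    H = matmul(matmul(A_norm, X), W)
    return [[max(0.0, v) for v in row] for row in H]

def multitask_loss(supervised_loss, auxiliary_losses, weights):
    """Combine the main loss with weighted self-supervised auxiliary losses."""
    return supervised_loss + sum(w * l for w, l in zip(weights, auxiliary_losses))
```

In such a setup the shared GCN trunk is trained on the joint loss, so gradients from the auxiliary self-supervised tasks regularize the representation used by the main classification head; the per-task weights control how strongly each auxiliary task influences training.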
