Curvature-informed multi-task learning for graph networks

08/02/2022
by Alexander New, et al.

Properties of interest for crystals and molecules, such as band gap, elasticity, and solubility, are generally related to each other: they are governed by the same underlying laws of physics. However, when state-of-the-art graph neural networks attempt to predict multiple properties simultaneously (the multi-task learning (MTL) setting), they frequently underperform a suite of single-property predictors. This suggests graph networks may not be fully leveraging these underlying similarities. Here we investigate a potential explanation for this phenomenon: the curvature of each property's loss surface varies significantly, leading to inefficient learning. This difference in curvature can be assessed through spectral properties of the Hessian of each property's loss function, computed in a matrix-free manner via randomized numerical linear algebra. We evaluate our hypothesis on two benchmark datasets (Materials Project (MP) and QM8) and consider how these findings can inform the training of novel multi-task learning models.
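The abstract describes probing per-task loss-surface curvature through spectral properties of the Hessian, computed matrix-free. The paper's pipeline applies this to graph networks on MP and QM8 with randomized numerical linear algebra; the sketch below is only a minimal illustration of the underlying idea, assuming a PyTorch setting: it estimates the top Hessian eigenvalue of a toy regression loss by power iteration over Hessian-vector products from a random starting vector, never forming the Hessian explicitly. The model, data, and iteration count are placeholders, not the paper's setup.

# Minimal sketch (not the authors' implementation): matrix-free estimate of the
# largest Hessian eigenvalue of a per-task loss via power iteration on
# Hessian-vector products. Toy model and data are illustrative assumptions.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy regression model standing in for one property-prediction head.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)
x, y = torch.randn(64, 8), torch.randn(64, 1)
params = [p for p in model.parameters() if p.requires_grad]

def loss_fn():
    return F.mse_loss(model(x), y)

def hvp(vecs):
    # Hessian-vector product via double backprop; the Hessian is never materialized.
    grads = torch.autograd.grad(loss_fn(), params, create_graph=True)
    dot = sum((g * u).sum() for g, u in zip(grads, vecs))
    return torch.autograd.grad(dot, params)

# Power iteration on the HVP operator, started from a random direction.
v = [torch.randn_like(p) for p in params]
for _ in range(50):
    hv = hvp(v)
    norm = torch.sqrt(sum((h * h).sum() for h in hv))
    v = [h / (norm + 1e-12) for h in hv]

# Rayleigh quotient with the (unit-norm) converged direction.
top_eig = sum((h * u).sum() for h, u in zip(hvp(v), v)).item()
print(f"estimated top Hessian eigenvalue: {top_eig:.4f}")

Running the same estimate separately on each property's loss gives a rough, per-task picture of curvature, which is the kind of comparison the abstract argues can explain uneven multi-task performance.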

