Accelerating Molecular Graph Neural Networks via Knowledge Distillation

06/26/2023
by Filip Ekström Kelvinius, et al.

Recent advances in graph neural networks (GNNs) have allowed molecular simulations with accuracy on par with conventional gold-standard methods at a fraction of the computational cost. Nonetheless, as the field has been progressing to bigger and more complex architectures, state-of-the-art GNNs have become largely prohibitive for many large-scale applications. In this paper, we, for the first time, explore the utility of knowledge distillation (KD) for accelerating molecular GNNs. To this end, we devise KD strategies that facilitate the distillation of hidden representations in directional and equivariant GNNs and evaluate their performance on the regression task of energy and force prediction. We validate our protocols across different teacher-student configurations and demonstrate that they can boost the predictive accuracy of student models without altering their architecture. We also conduct comprehensive optimization of various components of our framework, and investigate the potential of data augmentation to further enhance performance. All in all, we manage to close as much as 59% of the gap in predictive accuracy between models like GemNet-OC and PaiNN, with zero additional cost at inference.
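The distillation of hidden representations mentioned above can be sketched as a feature-matching loss: the student's intermediate features are mapped into the teacher's feature space and penalized for deviating from the teacher's features, on top of the usual energy/force task loss. The sketch below is a minimal, hypothetical illustration of this idea in NumPy; the function name, the linear projection head, and the weighting factor `lam` are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def feature_distillation_loss(student_h, teacher_h, proj, task_loss, lam=0.1):
    """Combined training loss for feature-based knowledge distillation.

    student_h : (n_atoms, d_s) hidden features of the student GNN
    teacher_h : (n_atoms, d_t) hidden features of the teacher GNN
    proj      : (d_s, d_t) learnable projection from student to teacher space
    task_loss : scalar supervised loss (e.g. energy/force regression error)
    lam       : weight of the distillation term
    """
    projected = student_h @ proj                 # map student dim -> teacher dim
    kd_term = np.mean((projected - teacher_h) ** 2)  # MSE feature matching
    return task_loss + lam * kd_term

# Toy example: 4 atoms, student width 8, teacher width 16.
rng = np.random.default_rng(0)
student_h = rng.standard_normal((4, 8))
teacher_h = rng.standard_normal((4, 16))
proj = 0.1 * rng.standard_normal((8, 16))
loss = feature_distillation_loss(student_h, teacher_h, proj, task_loss=1.0)
print(loss)
```

In practice the projection head would be trained jointly with the student and discarded at inference, which is why this kind of distillation adds no inference-time cost.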


