Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules

07/22/2022
by Jong Youl Choi et al.

Graph Convolutional Neural Networks (GCNNs) are a popular class of deep learning (DL) models in materials science for predicting material properties from graph representations of molecular structures. Training an accurate and comprehensive GCNN surrogate for molecular design requires large-scale graph datasets and is usually a time-consuming process. Recent advances in GPUs and distributed computing open a path to effectively reduce the computational cost of GCNN training. However, efficient utilization of high performance computing (HPC) resources for training requires simultaneously optimizing large-scale data management and scalable stochastic batched optimization techniques. In this work, we focus on building GCNN models on HPC systems to predict material properties of millions of molecules. We use HydraGNN, our in-house library for large-scale GCNN training, which leverages distributed data parallelism in PyTorch, and ADIOS, a high-performance data management framework, for efficient storage and reading of large molecular graph data. We perform parallel training on two open-source large-scale graph datasets to build a GCNN predictor for an important quantum property known as the HOMO-LUMO gap. We measure the scalability, accuracy, and convergence of our approach on two DOE supercomputers: the Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) and the Perlmutter system at the National Energy Research Scientific Computing Center (NERSC). We present experimental results with HydraGNN showing i) a reduction of data loading time by up to 4.2 times compared with a conventional method and ii) linear scaling of training performance up to 1,024 GPUs on both Summit and Perlmutter.
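To illustrate the graph-convolution operation that underlies GCNN property predictors like the one described above, here is a minimal NumPy sketch. It is not HydraGNN's actual implementation (which is built on PyTorch); it shows one standard GCN layer, H' = ReLU(Â H W) with Â the self-loop-augmented, symmetrically normalized adjacency matrix, followed by mean-pooling of node embeddings into a single graph-level scalar standing in for a HOMO-LUMO gap estimate. All function names and the readout weights are illustrative.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: ReLU(A_hat @ H @ W),
    where A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])          # add self-loops
    d = A_tilde.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt # normalized adjacency
    return np.maximum(A_hat @ H @ W, 0.0)     # propagate + ReLU

def predict_gap(A, H, W1, w_out):
    """Toy graph-level readout: one GCN layer, then mean-pool the
    node embeddings and project to a scalar (the 'gap' estimate)."""
    Z = gcn_layer(A, H, W1)
    return float(Z.mean(axis=0) @ w_out)

# Example: a triangle molecule graph (3 fully connected atoms)
A = np.ones((3, 3)) - np.eye(3)   # adjacency without self-loops
H = np.eye(3)                     # one-hot node features
gap = predict_gap(A, H, np.eye(3), np.ones(3))
```

In a real HydraGNN run this forward pass would be wrapped in a PyTorch `DistributedDataParallel` model, so that each GPU processes a shard of the molecular graph dataset and gradients are all-reduced across ranks; the sketch above only captures the single-graph math.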


