Deep Learning for Molecular Graphs with Tiered Graph Autoencoders and Graph Classification

10/24/2019
by Daniel T. Chang, et al.

Tiered graph autoencoders provide the architecture and mechanisms for learning tiered latent representations and latent spaces for molecular graphs that explicitly represent and utilize groups (e.g., functional groups). This enables the use and exploration of tiered molecular latent spaces, either individually (the node/atom tier, the group tier, or the graph/molecule tier) or jointly, as well as navigation across the tiers. In this paper, we discuss the use of tiered graph autoencoders together with graph classification for molecular graphs. We describe the molecular-graph features used and the groups identified in molecular graphs for some sample molecules. We briefly review graph classification and the QM9 dataset as background, and discuss the use of tiered graph embeddings for graph classification, in particular weighted group pooling. We find that functional groups and ring groups effectively capture and represent the chemical essence of molecular graphs (structures). Further, tiered graph autoencoders and graph classification together provide effective, efficient, and interpretable deep learning for molecular graphs, with the former providing unsupervised, transferable learning and the latter providing supervised, task-optimized learning.
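
To illustrate the tiered (node -> group -> graph) idea and the weighted group pooling mentioned above, here is a minimal PyTorch sketch. The class and names (TieredPooling, group_assign) are hypothetical, and the choices of mean-pooling atoms into groups and softmax-weighting groups into a graph embedding are assumptions for illustration only, not the authors' implementation.

    import torch
    import torch.nn as nn

    class TieredPooling(nn.Module):
        """Hypothetical sketch of tiered pooling: node (atom) tier ->
        group tier -> graph (molecule) tier, with learned group weights."""
        def __init__(self, dim):
            super().__init__()
            self.node_to_group = nn.Linear(dim, dim)
            self.group_score = nn.Linear(dim, 1)    # scores for weighted group pooling
            self.group_to_graph = nn.Linear(dim, dim)

        def forward(self, node_emb, group_assign):
            # node_emb:     [num_nodes, dim]  node-tier (atom) embeddings
            # group_assign: [num_nodes]       group index for each atom
            num_groups = int(group_assign.max()) + 1
            dim = node_emb.size(1)
            # Group tier: mean-pool member atoms into each group embedding.
            group_sum = torch.zeros(num_groups, dim).index_add_(0, group_assign, node_emb)
            counts = torch.zeros(num_groups).index_add_(
                0, group_assign, torch.ones(node_emb.size(0))).clamp(min=1).unsqueeze(1)
            group_emb = torch.relu(self.node_to_group(group_sum / counts))
            # Graph tier: weighted group pooling with learned per-group weights.
            weights = torch.softmax(self.group_score(group_emb), dim=0)
            graph_emb = self.group_to_graph((weights * group_emb).sum(dim=0))
            return group_emb, graph_emb

    # Toy example: 5 atoms assigned to 2 groups (e.g., a ring group and a functional group).
    pool = TieredPooling(dim=16)
    atoms = torch.randn(5, 16)
    groups = torch.tensor([0, 0, 0, 1, 1])
    group_emb, graph_emb = pool(atoms, groups)
    print(group_emb.shape, graph_emb.shape)  # torch.Size([2, 16]) torch.Size([16])

In this sketch the group-tier embeddings remain available alongside the graph-tier embedding, which is what allows the tiers to be used individually or jointly as described in the abstract.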


