A Graph VAE and Graph Transformer Approach to Generating Molecular Graphs

04/09/2021
by Joshua Mitton, et al.

We propose a combination of a variational autoencoder and a transformer-based model which fully utilises graph convolutional and graph pooling layers to operate directly on graphs. The transformer model implements a novel node encoding layer, replacing the position encoding typically used in transformers, to create a transformer with no position information that operates on graphs, encoding adjacent node properties into the edge generation process. The proposed model builds on graph generative work operating on graphs with edge features, offering improved scalability with the number of nodes in a graph. In addition, our model is capable of learning a disentangled, interpretable latent space that represents graph properties through a mapping between latent variables and graph properties. In our experiments we chose molecular generation as a benchmark task, given the importance of both the generated node and edge features. Using the QM9 dataset, we demonstrate that our model performs strongly on the task of generating valid, unique and novel molecules. Finally, we demonstrate that the model is interpretable by generating molecules controlled by molecular properties, and we analyse and visualise the learned latent representation.
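
The authors' implementation is not reproduced here; the following minimal PyTorch sketch only illustrates the general idea described above of replacing positional encoding with a node encoding, where each node's representation is built from its own features plus an aggregate of its neighbours' features, so the attention block receives graph structure rather than node order. The module names, dimensions and the mean-over-neighbours aggregation are illustrative assumptions, not the paper's architecture.

```python
# Hypothetical sketch, not the authors' code: a position-free transformer
# block over graph nodes, preceded by a "node encoding" layer that injects
# neighbourhood information in place of positional encoding.
import torch
import torch.nn as nn


class NodeEncoding(nn.Module):
    """Encode each node from its own features and its neighbours' features."""

    def __init__(self, node_dim, hidden_dim):
        super().__init__()
        self.self_proj = nn.Linear(node_dim, hidden_dim)
        self.neigh_proj = nn.Linear(node_dim, hidden_dim)

    def forward(self, x, adj):
        # x:   (batch, num_nodes, node_dim) node feature matrix
        # adj: (batch, num_nodes, num_nodes) adjacency matrix
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neigh_mean = torch.bmm(adj, x) / deg  # mean of neighbour features
        return self.self_proj(x) + self.neigh_proj(neigh_mean)


class GraphTransformerBlock(nn.Module):
    """Self-attention block with no position information (permutation-equivariant)."""

    def __init__(self, hidden_dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(hidden_dim)
        self.norm2 = nn.LayerNorm(hidden_dim)
        self.ff = nn.Sequential(
            nn.Linear(hidden_dim, 2 * hidden_dim),
            nn.ReLU(),
            nn.Linear(2 * hidden_dim, hidden_dim),
        )

    def forward(self, h):
        a, _ = self.attn(h, h, h)
        h = self.norm1(h + a)
        return self.norm2(h + self.ff(h))


if __name__ == "__main__":
    # QM9-like sizes: molecules with up to 9 heavy atoms.
    batch, n_nodes, node_dim, hidden = 2, 9, 5, 64
    x = torch.randn(batch, n_nodes, node_dim)
    adj = torch.randint(0, 2, (batch, n_nodes, n_nodes)).float()
    adj = ((adj + adj.transpose(1, 2)) > 0).float()  # symmetrise

    h = NodeEncoding(node_dim, hidden)(x, adj)
    h = GraphTransformerBlock(hidden)(h)
    print(h.shape)  # torch.Size([2, 9, 64])
```

Because no positional encoding is added, attention here depends only on node features and the structural signal injected by NodeEncoding, which is what makes the block order-invariant over graph nodes.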
