ChemBERTa-2: Towards Chemical Foundation Models

09/05/2022
by Walid Ahmad, et al.

Large pretrained models such as GPT-3 have had tremendous impact on modern natural language processing by leveraging self-supervised learning to learn salient representations that can be readily finetuned for a wide variety of downstream tasks. We investigate the possibility of transferring such advances to molecular machine learning by building a chemical foundation model, ChemBERTa-2, using the language of SMILES. While labeled data for molecular prediction tasks is typically scarce, libraries of SMILES strings are readily available. In this work, we build upon ChemBERTa by optimizing the pretraining process. We compare multi-task and self-supervised pretraining by varying hyperparameters and pretraining dataset size, up to 77M compounds from PubChem. To our knowledge, the 77M set constitutes one of the largest datasets used for molecular pretraining to date. We find that with these pretraining improvements, ChemBERTa-2 is competitive with existing state-of-the-art architectures on the MoleculeNet benchmark suite, and we analyze the degree to which improvements in pretraining translate to improvements on downstream tasks.
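The paradigm the abstract describes, pretraining a transformer on unlabeled SMILES and then finetuning it on a small labeled property-prediction set, maps directly onto standard Hugging Face tooling. Below is a minimal sketch of that finetuning step; the checkpoint identifier, toy SMILES strings, and labels are illustrative assumptions, not artifacts from the paper.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Hub identifier for an MLM-pretrained ChemBERTa-2 checkpoint.
checkpoint = "DeepChem/ChemBERTa-77M-MLM"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# A fresh classification head is attached on top of the pretrained encoder.
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Toy data: SMILES strings with invented binary activity labels.
smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
labels = torch.tensor([0, 1, 1])

batch = tokenizer(smiles, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps, purely for illustration
    outputs = model(**batch, labels=labels)  # loss computed internally
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In practice, finetuning on a MoleculeNet task would use the benchmark's train/validation/test splits and proper evaluation; this sketch only shows the mechanics of adapting the pretrained encoder to a downstream task.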


Related research

11/03/2022 · MolE: a molecular foundation model for drug discovery
Models that accurately predict properties based on chemical structure ar...

12/09/2022 · Audiovisual Masked Autoencoders
Can we leverage the audiovisual information already present in video to ...

10/19/2020 · ChemBERTa: Large-Scale Self-Supervised Pretraining for Molecular Property Prediction
GNNs and chemical fingerprints are the predominant approaches to represe...

03/02/2023 · On the Provable Advantage of Unsupervised Pretraining
Unsupervised pretraining, which learns a useful representation using a l...

08/17/2022 · DPA-1: Pretraining of Attention-based Deep Potential Model for Molecular Simulation
Machine learning assisted modeling of the inter-atomic potential energy ...

09/29/2022 · Improving Molecular Pretraining with Complementary Featurizations
Molecular pretraining, which learns molecular representations over massi...

11/19/2022 · Molecular Structure-Property Co-Trained Foundation Model for In Silico Chemistry
Recently, deep learning approaches have been extensively studied for var...
