Exploiting Pretrained Biochemical Language Models for Targeted Drug Design

09/02/2022
by Gökçe Uludoğan et al.

Motivation: The development of novel compounds targeting proteins of interest is one of the most important tasks in the pharmaceutical industry. Deep generative models have been applied to targeted molecular design and have shown promising results. Recently, target-specific molecule generation has been viewed as a translation between the protein language and the chemical language. However, such a model is limited by the availability of interacting protein-ligand pairs. On the other hand, large amounts of unlabeled protein sequences and chemical compounds are available and have been used to train language models that learn useful representations. In this study, we propose exploiting pretrained biochemical language models to initialize (i.e., warm start) targeted molecule generation models. We investigate two warm-start strategies: (i) a one-stage strategy in which the initialized model is trained directly on targeted molecule generation, and (ii) a two-stage strategy consisting of pre-finetuning on general molecule generation followed by target-specific training. We also compare two decoding strategies for generating compounds: beam search and sampling.

Results: The warm-started models perform better than a baseline model trained from scratch. The two warm-start strategies achieve similar results on widely used benchmark metrics; however, docking evaluation of compounds generated for a number of novel proteins suggests that the one-stage strategy generalizes better than the two-stage strategy. Additionally, beam search outperforms sampling both in docking evaluation and on benchmark metrics of compound quality.

Availability and implementation: The source code is available at https://github.com/boun-tabi/biochemical-lms-for-drug-design and the materials are archived on Zenodo at https://doi.org/10.5281/zenodo.6832145
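As a rough sketch of the warm-start idea, the snippet below initializes a protein-to-SMILES encoder-decoder from pretrained checkpoints with Hugging Face Transformers and generates candidate compounds with both decoding strategies. The checkpoint names (Rostlab/prot_bert, seyonec/ChemBERTa-zinc-base-v1), the toy protein sequence, and the generation settings are illustrative assumptions rather than the paper's exact configuration, and the warm-started model would still need fine-tuning on interacting protein-ligand pairs before its outputs are meaningful.

# Hypothetical sketch: warm-start a protein-to-SMILES encoder-decoder from
# pretrained biochemical language models, then compare decoding strategies.
# Checkpoint names below are illustrative stand-ins, not taken from the paper.
from transformers import EncoderDecoderModel, BertTokenizer, RobertaTokenizer

# Warm start: encoder weights from a protein LM, decoder weights from a chemical LM.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "Rostlab/prot_bert",                # pretrained protein language model (assumed)
    "seyonec/ChemBERTa-zinc-base-v1",   # pretrained chemical language model (assumed)
)

prot_tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert")
smiles_tokenizer = RobertaTokenizer.from_pretrained("seyonec/ChemBERTa-zinc-base-v1")

# Tell the decoder how to start and pad generated SMILES sequences.
model.config.decoder_start_token_id = smiles_tokenizer.bos_token_id
model.config.eos_token_id = smiles_tokenizer.eos_token_id
model.config.pad_token_id = smiles_tokenizer.pad_token_id

# ProtBERT expects space-separated amino acids; this sequence is a toy example.
sequence = " ".join("MKTFFVAGVLAALSLA")
inputs = prot_tokenizer(sequence, return_tensors="pt")

# Decoding strategy 1: beam search (deterministic; favoured in the paper's evaluation).
beam_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    num_beams=5,
    num_return_sequences=5,
    max_length=128,
)

# Decoding strategy 2: sampling (stochastic; more diverse but weaker in this study).
sample_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    do_sample=True,
    top_k=50,
    num_return_sequences=5,
    max_length=128,
)

for ids in beam_ids:
    print(smiles_tokenizer.decode(ids, skip_special_tokens=True))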


