Tsinghua team proposes knowledge-guided graph Transformer pre-training framework: a method to improve molecular representation learning
Learning effective molecular feature representations is essential for molecular property prediction in drug discovery. Recently, researchers have addressed the challenge of data scarcity by pre-training graph neural networks (GNNs) with self-supervised learning techniques. However, current self-supervised methods suffer from two main problems: the lack of well-defined self-supervised learning strategies and the limited capacity of GNNs.
Recently, a research team from Tsinghua University, Westlake University and Zhijiang Laboratory proposed Knowledge-guided Pre-training of Graph Transformer (KPGT), a self-supervised learning framework that delivers improved, generalizable and robust molecular property prediction through significantly enhanced molecular representation learning. The KPGT framework integrates a graph Transformer designed specifically for molecular graphs with a knowledge-guided pre-training strategy to fully capture the structural and semantic knowledge of molecules.
Through extensive computational tests on 63 datasets, KPGT demonstrated superior performance in predicting molecular properties across various domains. Furthermore, its practical applicability in drug discovery was verified by identifying potential inhibitors of two anti-tumor targets. Overall, KPGT provides a powerful and useful tool for advancing the AI-assisted drug discovery process.
The research was titled "A knowledge-guided pre-training framework for improving molecular representation learning" and was published in "Nature Communications" on November 21, 2023.
Determining molecular properties experimentally requires significant time and resources, and identifying molecules with desired properties is one of the most significant challenges in drug discovery. In recent years, artificial-intelligence-based methods have played an increasingly important role in predicting molecular properties. One of their main challenges is how to represent molecules effectively.
In recent years, deep-learning-based methods have emerged as potentially useful tools for predicting molecular properties, mainly because of their excellent ability to automatically extract effective features from simple input data. Notably, various neural network architectures, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), and graph neural networks (GNNs), are adept at modeling molecular data in a range of formats, from the simplified molecular-input line-entry system (SMILES) to molecular images and molecular graphs. However, the limited availability of labeled molecules and the vastness of chemical space constrain their predictive performance, especially on out-of-distribution data samples.
With the remarkable achievements of self-supervised learning methods in natural language processing and computer vision, these techniques have been applied to pre-train GNNs and improve molecular representation learning, yielding substantial progress on downstream molecular property prediction tasks.
The researchers hypothesized that introducing additional knowledge that quantitatively describes molecular characteristics into a self-supervised learning framework could effectively address these challenges. Molecules have many quantitative characteristics, such as molecular descriptors and fingerprints, that can be easily computed with established tools. Integrating this additional knowledge introduces rich molecular semantic information into self-supervised learning, thereby greatly enhancing the acquisition of semantically rich molecular representations.
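To make the notion of a molecular fingerprint concrete: a fingerprint is a fixed-length bit vector in which each bit marks the presence of some substructure. In practice such fingerprints are computed with cheminformatics toolkits like RDKit (e.g. Morgan/ECFP fingerprints, which hash atom environments); the self-contained toy below hashes character n-grams of a SMILES string instead, purely to illustrate the idea, and is not the scheme KPGT uses.

```python
import hashlib

def toy_fingerprint(smiles: str, n_bits: int = 64, n: int = 2) -> list[int]:
    """Toy hashed fingerprint: set one bit per character n-gram of the
    SMILES string. Real fingerprints (e.g. RDKit's Morgan fingerprints)
    hash atom environments instead, but the idea is the same: map
    substructures to positions in a fixed-length bit vector."""
    bits = [0] * n_bits
    for i in range(len(smiles) - n + 1):
        gram = smiles[i:i + n]
        h = int(hashlib.md5(gram.encode()).hexdigest(), 16)  # deterministic hash
        bits[h % n_bits] = 1
    return bits

# Aspirin has far more distinct substructures than ethanol,
# so it sets more bits in the vector.
fp_aspirin = toy_fingerprint("CC(=O)OC1=CC=CC=C1C(=O)O")
fp_ethanol = toy_fingerprint("CCO")
print(sum(fp_aspirin), sum(fp_ethanol))
```

A vector like this (or a vector of continuous descriptors such as molecular weight and logP) is exactly the kind of cheaply computed "additional knowledge" the framework exploits as a pre-training signal.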
Existing self-supervised learning methods generally rely on GNNs as the core model. However, GNNs have limited model capacity and can struggle to capture long-range interactions between atoms. Transformer-based models, by contrast, have proven to be game-changing: their growing parameter counts and ability to capture long-range interactions offer a promising approach to comprehensively modeling the structural characteristics of molecules.
In this study, the researchers introduced a self-supervised learning framework called KPGT, which aims to enhance molecular representation learning so as to improve downstream molecular property prediction. The KPGT framework consists of two main components: a backbone model called Line Graph Transformer (LiGhT) and a knowledge-guided pre-training strategy. It combines the high-capacity LiGhT model, specifically designed to accurately model molecular graph structures, with a knowledge-guided pre-training strategy that captures both the structural and the semantic knowledge of molecules.
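As its name suggests, LiGhT operates on the line graph of the molecular graph, where each bond becomes a node and two nodes are connected when the corresponding bonds share an atom, which lets attention act directly on bond-level features. A minimal sketch of the line-graph construction (LiGhT's actual featurization and attention mechanism are more involved):

```python
def line_graph(edges):
    """Build the line graph of a molecular graph given its bonds as
    (atom, atom) pairs: each bond becomes a node, and two bonds are
    adjacent in the line graph iff they share an atom."""
    adj = {e: set() for e in edges}
    for i, (a1, b1) in enumerate(edges):
        for a2, b2 in edges[i + 1:]:
            if {a1, b1} & {a2, b2}:  # the two bonds share an atom
                adj[(a1, b1)].add((a2, b2))
                adj[(a2, b2)].add((a1, b1))
    return adj

# Ethanol C-C-O: atoms 0, 1, 2; bonds (0,1) and (1,2) share atom 1,
# so they are neighbors in the line graph.
lg = line_graph([(0, 1), (1, 2)])
print(lg)
```

Working on the line graph is a design choice that puts bonds, rather than atoms, at the center of the representation; graph libraries such as NetworkX provide equivalent constructions for larger graphs.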
The research team pre-trained LiGhT on roughly 2 million molecules using the knowledge-guided pre-training strategy.
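Conceptually, a knowledge-guided pre-training objective can combine two terms: reconstructing masked structural features of the graph, and regressing the precomputed molecular knowledge (descriptors or fingerprints) from the learned representation. The sketch below is an illustrative assumption about the shape of such an objective, not the paper's exact loss or weighting:

```python
def mse(pred, target):
    """Mean squared error between two equal-length vectors."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def pretraining_loss(masked_pred, masked_target,
                     knowledge_pred, knowledge_target, alpha=1.0):
    """Hedged sketch of a knowledge-guided objective: one term for
    reconstructing masked node features (structural knowledge), one for
    predicting precomputed descriptors/fingerprints (semantic knowledge).
    `alpha` balances the two; KPGT's actual losses may differ."""
    structure_term = mse(masked_pred, masked_target)
    knowledge_term = mse(knowledge_pred, knowledge_target)
    return structure_term + alpha * knowledge_term

# Perfect predictions on both tasks give zero loss.
print(pretraining_loss([1.0, 2.0], [1.0, 2.0], [0.5], [0.5]))
```

The key point is that the second term injects quantitative chemical knowledge as supervision that the raw graph alone does not provide.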
Figure: KPGT overview. (Source: paper)
KPGT outperforms baseline methods in molecular property prediction, achieving significant improvements over several baselines across the 63 datasets.
In addition, the practical applicability of KPGT was demonstrated by successfully using it to identify potential inhibitors of two anti-tumor targets: hematopoietic progenitor kinase 1 (HPK1) and fibroblast growth factor receptor 1 (FGFR1).
Despite the advantages of KPGT in effective molecular property prediction, there are still some limitations.
Overall, KPGT provides a powerful self-supervised learning framework for effective molecular representation learning, thereby advancing the field of artificial intelligence-assisted drug discovery.
Paper link: https://www.nature.com/articles/s41467-023-43214-1