Antiviral Medication Prediction Using a Deep Learning Model of Drug-Target Interaction for the Coronavirus SARS-CoV
DOI: https://doi.org/10.21512/emacsjournal.v6i2.11290
Keywords: Message-passing neural networks, Transformer, Drug-Target Interaction, SARS-CoV
Abstract
Graph convolutional neural networks (GCNs) have shown promising performance in modeling graph data, particularly for small molecules. Message-passing neural networks (MPNNs) are an important GCN variant: they excel at gathering and integrating information about molecules through multiple rounds of message passing, a capability that has driven major advances in molecular modeling and property prediction. Combining the self-attention mechanism with MPNNs offers the potential to improve molecular representation while leveraging the proven effectiveness of Transformers in other areas of artificial intelligence. This research introduces a transformer-based message-passing neural network (T-MPNN) designed to improve the embedding of molecular representations for property prediction. Our technique incorporates attention mechanisms into both the message-passing and readout phases of MPNNs, yielding seamlessly integrated molecular representations. Experimental results on three datasets show that T-MPNN outperforms or matches state-of-the-art baseline models on quantitative structure-property relationship tasks. Through case studies of SARS-CoV growth inhibitors, we demonstrate the model's ability to visualize attention at the atomic level, allowing us to pinpoint individual atoms or functional groups associated with desirable biological properties. The proposed model improves the interpretability of classic MPNNs and serves as a useful tool for investigating the effect of self-attention on chemical substructures and functional groups in molecular representation learning, leading to a better understanding of drug mechanisms of action.
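The architecture described above combines two ingredients: iterative message passing over the molecular graph, and an attention-based readout that pools atom embeddings into a single molecule vector. The following is a minimal, illustrative numpy sketch of that idea, not the authors' implementation; the specific update rule, the learned query vector `q`, and all dimensions are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def message_pass(h, adj, W):
    # One message-passing round: each atom sums its neighbors' embeddings
    # (via the adjacency matrix), then applies a shared linear map + ReLU.
    m = adj @ h
    return np.maximum(0.0, m @ W)

def attention_readout(h, q):
    # Self-attention pooling: score each atom against a learned query
    # vector, normalize the scores with softmax, and return the
    # attention-weighted sum as the molecule embedding. The weights
    # `alpha` indicate which atoms the model attends to, which is the
    # basis for atom-level interpretability.
    scores = h @ q / np.sqrt(h.shape[1])
    alpha = softmax(scores)
    return alpha @ h, alpha

rng = np.random.default_rng(0)
n_atoms, d = 5, 8
h = rng.standard_normal((n_atoms, d))             # initial atom features
adj = (rng.random((n_atoms, n_atoms)) < 0.4).astype(float)
adj = np.maximum(adj, adj.T)                      # symmetric molecular graph
np.fill_diagonal(adj, 0.0)                        # no self-loops
W = rng.standard_normal((d, d)) * 0.1             # shared message weights
q = rng.standard_normal(d)                        # readout query (assumed)

for _ in range(3):                                # three message-passing rounds
    h = message_pass(h, adj, W)
mol_vec, alpha = attention_readout(h, q)
print(mol_vec.shape, alpha)
```

In a trained model, `W` and `q` would be learned end-to-end against the property label, and `alpha` could be mapped back onto the molecular structure to highlight atoms or functional groups, as in the SARS-CoV case studies described in the abstract.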
License
Copyright (c) 2024 Engineering, MAthematics and Computer Science Journal (EMACS)
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.