Deep learning and attention mechanisms in RNA secondary structure prediction: A critical survey

Authors: Musaab Nabil Ali Askar 1, Azian Azamimi Abdullah 1, 2, *, Mohd Yusoff Mashor 1, Zeti-Azura Mohamed-Hussein 3, Zeehaida Mohamed 4, Wei Chern Ang 5, Shigehiko Kanaya 6

Affiliations:

1Faculty of Electronic Engineering and Technology, Universiti Malaysia Perlis, Arau, Malaysia
2Centre of Excellence for Advanced Sensor Technology (CEASTech), Universiti Malaysia Perlis (UniMAP), Arau, Malaysia
3Department of Applied Physics, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Malaysia
4Department of Medical Microbiology and Parasitology, School of Medical Sciences, Universiti Sains Malaysia, George Town, Malaysia
5Clinical Research Centre, Hospital Tuanku Fauziah, Ministry of Health Malaysia, Kangar, Perlis, Malaysia
6Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma, Nara, Japan

Abstract

The secondary structure of ribonucleic acid (RNA) is central to understanding gene regulation and cellular processes and to developing new treatments. Traditional thermodynamic methods, especially those based on Minimum Free Energy (MFE) algorithms, provide a reliable, physics-based approach to predicting RNA structures. Although these methods remain important, there is growing interest in deep learning models that can detect structural patterns, such as pseudoknots and long-range interactions, in large RNA datasets. Building on thermodynamic principles, these models aim to extend current knowledge and offer new ways to study RNA structure and function. In particular, attention-based transformer models are effective at capturing both short- and long-range dependencies, making them well suited to modeling complex RNA sequences. This review highlights recent advances in RNA secondary structure prediction using transformer-based approaches, focusing on key models such as E2EFold, ATTFold, RNAformer, and DEBFold. It also discusses current challenges, future research directions, and the impact of attention-based deep learning on RNA structural bioinformatics.
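To make the attention mechanism referred to above concrete, the short Python sketch below shows how single-head scaled dot-product self-attention over a one-hot encoded RNA sequence yields an L x L matrix of pairwise scores, the kind of all-against-all map that end-to-end predictors post-process into a base-pairing matrix. This is a minimal illustration only, not the architecture of any surveyed model: the random projection matrices, the layer size d_model, and the toy sequence are hypothetical placeholders standing in for learned weights.

import numpy as np

RNA_ALPHABET = "ACGU"

def one_hot(seq: str) -> np.ndarray:
    """Encode an RNA sequence as an (L, 4) one-hot matrix."""
    idx = {base: i for i, base in enumerate(RNA_ALPHABET)}
    x = np.zeros((len(seq), len(RNA_ALPHABET)))
    for pos, base in enumerate(seq):
        x[pos, idx[base]] = 1.0
    return x

def self_attention(x: np.ndarray, d_model: int = 16, seed: int = 0):
    """Single-head scaled dot-product self-attention.

    Returns the updated representations and the (L, L) attention weights.
    The weights relate every position to every other position, which is what
    lets transformer models capture both local stems and long-range pairings.
    """
    rng = np.random.default_rng(seed)
    L, d_in = x.shape
    # Random projections stand in for learned query/key/value weight matrices.
    Wq = rng.standard_normal((d_in, d_model))
    Wk = rng.standard_normal((d_in, d_model))
    Wv = rng.standard_normal((d_in, d_model))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(d_model)                 # (L, L) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V, weights

if __name__ == "__main__":
    seq = "GGGAAACCC"                                    # toy hairpin-like sequence
    _, attn = self_attention(one_hot(seq))
    # End-to-end predictors further constrain a symmetrised L x L map like this
    # (e.g. at most one pairing partner per base) to obtain the final structure.
    print(np.round((attn + attn.T) / 2, 2))

In trained models the projection matrices are learned from data and stacked over many layers and heads; the point of the sketch is only that a single attention layer already scores every position against every other, which is why such models can represent long-range interactions and pseudoknots that are hard for nearest-neighbour dynamic programming to handle.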

Keywords

RNA structure, Deep learning, Transformer models, Thermodynamic methods, Pseudoknot prediction

DOI

https://doi.org/10.21833/ijaas.2025.09.006

Citation (APA)

Askar, M. N. A., Abdullah, A. A., Mashor, M. Y., Mohamed-Hussein, Z.-A., Mohamed, Z., Ang, W. C., & Kanaya, S. (2025). Deep learning and attention mechanisms in RNA secondary structure prediction: A critical survey. International Journal of Advanced and Applied Sciences, 12(9), 61–78. https://doi.org/10.21833/ijaas.2025.09.006