BibTeX Citation Data:
@article{JOIV1165,
  author   = {Tan Wen Yit and Rohayanti Hassan and Noor Hidayah Zakaria and Shahreen Kasim and Sim Hiew Moi and Alif Ridzuan Khairuddin and Hidra Amnur},
  title    = {Transformer in mRNA Degradation Prediction},
  journal  = {JOIV : International Journal on Informatics Visualization},
  volume   = {7},
  number   = {2},
  year     = {2023},
  keywords = {Transformer; optimizer; AdaBelief},
  abstract = {The instability of mRNA and the advantages of mRNA vaccines have encouraged many experts worldwide to tackle the degradation problem. Machine learning models are widely used in bioinformatics and healthcare to gain insights from biological data, and they therefore play an important role in predicting the degradation rate of mRNA vaccine candidates. Stanford University held the OpenVaccine Challenge competition on Kaggle to gather top solutions to this problem, with the multi-column root mean square error (MCRMSE) as the main performance metric. The Nucleic Transformer has been proposed as a deep learning solution that combines a self-attention mechanism with a Convolutional Neural Network (CNN). This paper enhances the existing Nucleic Transformer by using the AdaBelief or RangerAdaBelief optimizer together with a proposed decoder that consists of a normalization layer between two linear layers. The experimental results show that the enhanced Nucleic Transformer outperforms the existing solution. In this study, the AdaBelief optimizer performs better than the RangerAdaBelief optimizer, even though the latter possesses Ranger's advantages. The advantage of the proposed decoder appears only when data is limited; when data is sufficient, the performance is similar, remaining better than the linear decoder only when the AdaBelief optimizer is used. As a result, the combination of the AdaBelief optimizer with the proposed decoder performs best, with a 2.79% and 1.38% improvement in public and private MCRMSE, respectively.},
  issn     = {2549-9904},
  pages    = {588--599},
  doi      = {10.30630/joiv.7.2.1165},
  url      = {https://joiv.org/index.php/joiv/article/view/1165}
}
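For readers who want a concrete picture of the two technical pieces named in the abstract, the sketch below shows a decoder with a normalization layer between two linear layers and the MCRMSE metric. This is a minimal PyTorch illustration, not the authors' implementation: the class name, layer sizes (d_model, n_targets), and the choice of LayerNorm are assumptions made here for demonstration only.

```python
import torch
import torch.nn as nn


class ProposedDecoderSketch(nn.Module):
    """Illustrative decoder: a normalization layer placed between two
    linear layers, as described in the abstract. Dimensions are assumed."""

    def __init__(self, d_model: int = 256, n_targets: int = 5):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_model)
        self.norm = nn.LayerNorm(d_model)  # normalization layer (assumed LayerNorm)
        self.fc2 = nn.Linear(d_model, n_targets)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) encoder output -> (batch, seq_len, n_targets)
        return self.fc2(self.norm(self.fc1(x)))


def mcrmse(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Multi-column root mean square error: RMSE computed per target
    column, then averaged over columns."""
    # pred, target: (n_samples, n_targets)
    col_rmse = torch.sqrt(((pred - target) ** 2).mean(dim=0))
    return col_rmse.mean()
```

In training, such a decoder would simply replace a single linear output head on top of the transformer encoder; the AdaBelief optimizer mentioned in the abstract is available as a third-party PyTorch package and would take the place of the usual Adam optimizer in the training loop.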
RefWorks Citation Data:
@article{{JOIV}{1165},
  author  = {Yit, T., Hassan, R., Zakaria, N., Kasim, S., Moi, S., Khairuddin, A., Amnur, H.},
  title   = {Transformer in mRNA Degradation Prediction},
  journal = {JOIV : International Journal on Informatics Visualization},
  volume  = {7},
  number  = {2},
  year    = {2023},
  doi     = {10.30630/joiv.7.2.1165},
  url     = {}
}

Refbacks
- There are currently no refbacks.

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
__________________________________________________________________________
JOIV : International Journal on Informatics Visualization
ISSN 2549-9610 (print) | 2549-9904 (online)
Organized by the Society of Visual Informatics, the Institute of Visual Informatics - UKM, and the Soft Computing and Data Mining Centre - UTHM
W : http://joiv.org
E : joiv@pnp.ac.id, hidra@pnp.ac.id, rahmat@pnp.ac.id