Effectiveness Analysis of RoBERTa and DistilBERT in Emotion Classification Task on Social Media Text Data
DOI: https://doi.org/10.21512/emacsjournal.v7i1.12618
Keywords: DistilBERT, Emotion Classification, RoBERTa
Abstract
The growth of social media offers many benefits, particularly for the dissemination of information and for communication. Through social media, users can express their opinions and even their feelings, and the information or opinions they share often reflect their emotions. This can also give rise to aggressive online behavior, including cyberbullying, which fuels unhealthy debate on social media. Deep learning models have been developed for a range of tasks, emotion classification among them, and classification tasks have also been addressed with transformer architectures such as BERT. Because the BERT model continues to be refined, this study analyzes and explores the application of BERT derivatives, namely RoBERTa and DistilBERT, to emotion classification on social media text. The best result of this study is an accuracy of 92.69%, obtained with the RoBERTa model.
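As a rough illustration of the kind of setup the abstract describes (not the authors' exact pipeline), the sketch below fine-tunes RoBERTa for six-class emotion classification with the Hugging Face transformers Trainer. The model checkpoint, hyperparameters, and the dair-ai/emotion dataset (used here as a stand-in for the study's social-media emotion data) are assumptions for the example.

```python
# Minimal sketch: fine-tuning RoBERTa for emotion classification.
# Assumptions: "roberta-base" checkpoint, the dair-ai/emotion dataset
# (6 labels), and illustrative hyperparameters, none confirmed by the paper.
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model_name = "roberta-base"  # swap for "distilbert-base-uncased" to compare DistilBERT
dataset = load_dataset("dair-ai/emotion")

tokenizer = AutoTokenizer.from_pretrained(model_name)

def tokenize(batch):
    # Truncate/pad tweets to a fixed length for batched training.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=6)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Report accuracy, the metric highlighted in the abstract.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=preds, references=labels)

args = TrainingArguments(
    output_dir="emotion-roberta",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    compute_metrics=compute_metrics,
)

trainer.train()
print(trainer.evaluate(eval_dataset=encoded["test"]))  # held-out accuracy
```

Running the same script with model_name set to "distilbert-base-uncased" would reproduce the DistilBERT side of the comparison under the same (assumed) training configuration.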