Fine-Tuning BERT for Automated News Classification

Authors

  • Mohammed I. Salih, Computer Information System Department, Technical College of Zakho, Duhok Polytechnic University, Duhok, KRG, Iraq
  • Salim M. Mohammed, Computer Science Department, College of Science, University of Zakho, Duhok, KRG, Iraq
  • Asaad Kh. Ibrahim, Computer Information System Department, Technical College of Zakho, Duhok Polytechnic University, Duhok, KRG, Iraq
  • Omar M. Ahmed, Computer Information System Department, Technical College of Zakho, Duhok Polytechnic University, Duhok, KRG, Iraq
  • Lailan M. Haji, Computer Science Department, College of Science, University of Zakho, Duhok, KRG, Iraq
Volume: 15 | Issue: 3 | Pages: 22953-22959 | June 2025 | https://doi.org/10.48084/etasr.10625

Abstract

Text classification is a fundamental task in Natural Language Processing (NLP) with a wide range of applications, such as sentiment analysis, document classification, and content recommendation. Traditional approaches such as Naive Bayes (NB), Support Vector Machine (SVM), and Random Forest (RF) relied on hand-crafted feature engineering and lacked contextual understanding. Deep learning reshaped text classification with transformer models such as Bidirectional Encoder Representations from Transformers (BERT), which model word context bidirectionally. In this article, we fine-tune a pre-trained BERT model on the Reuters-21578 dataset to classify news articles, aiming to measure the performance of transfer learning against common machine learning models and non-fine-tuned BERT. The fine-tuned model achieves 91.77% accuracy, significantly outperforming non-fine-tuned BERT as well as classical classifiers such as NB, SVM, and RF. These results show that fine-tuning allows BERT to capture domain-specific intricacies, improving classification performance. We also discuss the computational trade-offs associated with transformer models, highlighting the need for efficient deployment methods. This study thus supports the use of fine-tuned BERT for automatic news classification, which is of significant value for information retrieval and content personalization.
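The classical baselines named in the abstract (NB and SVM over engineered features such as TF-IDF) can be sketched as follows. This is a minimal illustration with scikit-learn on an invented toy corpus, not the paper's actual experimental setup: the example sentences and labels are hypothetical stand-ins for Reuters-21578 topic categories, and the paper's exact preprocessing and hyperparameters are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC


def baseline_predictions(train_texts, train_labels, test_texts):
    """Fit TF-IDF + NB and TF-IDF + SVM pipelines, return each model's predictions."""
    preds = {}
    for name, clf in (("nb", MultinomialNB()), ("svm", LinearSVC())):
        model = make_pipeline(TfidfVectorizer(), clf)
        model.fit(train_texts, train_labels)
        preds[name] = list(model.predict(test_texts))
    return preds


if __name__ == "__main__":
    # Toy stand-in for Reuters-21578 topic labels (hypothetical examples).
    texts = [
        "oil prices rise as crude supply tightens",
        "opec cuts crude output to lift oil prices",
        "central bank raises interest rates to curb inflation",
        "interest rate hike expected as inflation climbs",
        "wheat harvest forecast raised after good rains",
        "grain exports grow as wheat production expands",
    ]
    labels = ["crude", "crude", "interest", "interest", "grain", "grain"]
    print(baseline_predictions(texts, labels, ["crude oil supply and prices"]))
```

Such pipelines learn no contextual word representations, which is the gap the fine-tuned BERT model is meant to close.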

Keywords:

fine-tuning, BERT, news classification, natural language processing, text classification


References

A. K. Mehta and S. Kumar, "Comparative Analysis and Optimization of Spam Filtration Techniques Using Natural Language Processing," in 2024 International Conference on Communication, Computer Sciences and Engineering, Gautam Buddha Nagar, India, 2024, pp. 1005–1010.

H. Singh, S. Sood, H. Maity, and Y. Kumar, "Exploring Pre-processing Strategies and Feature Extraction in practical aspect for Effective Spam Detection," in 2024 IEEE International Conference on Interdisciplinary Approaches in Technology and Management for Social Innovation, Gwalior, India, 2024, pp. 1–6.

T. Vo, "A topic-driven graph-of-words convolutional network for improving text classification," Journal of Science and Technology on Information and Communications, vol. 1, no. 1, pp. 10–19, Mar. 2022.

O. M. Ahmed, L. M. Haji, A. M. Ahmed, and N. M. Salih, "Bitcoin Price Prediction using the Hybrid Convolutional Recurrent Model Architecture," Engineering, Technology & Applied Science Research, vol. 13, no. 5, pp. 11735–11738, Oct. 2023.

L. M. Haji, O. M. Mustafa, S. A. Abdullah, and O. M. Ahmed, "Enhanced Convolutional Neural Network for Fashion Classification," Engineering, Technology & Applied Science Research, vol. 14, no. 5, pp. 16534–16538, Oct. 2024.

J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, Minnesota, 2019, pp. 4171–4186.

H. Yuan, "Natural Language Processing for Chest X-Ray Reports in the Transformer Era: BERT-Like Encoders for Comprehension and GPT-Like Decoders for Generation," iRADIOLOGY, Jan. 2025.

Z. Lin, J. Xie, and Q. Li, "Multi-modal news event detection with external knowledge," Information Processing & Management, vol. 61, no. 3, May 2024, Art. no. 103697.

B. A. Baltes, Y. Cardinale, and B. Arroquia-Cuadros, "Automated Fact-checking based on Large Language Models: An application for the press," in Proceedings of the 1st Workshop on Countering Disinformation with Artificial Intelligence, Santiago de Compostela, Spain, 2019, pp. 40–53.

B. Florea and A. Iftene, "The News in Brief - Leveraging Machine Learning and Artificial Intelligence in News Clustering, Summarization and Evaluation," in 18th International Conference on Linguistic Resources and Tools for Processing the Natural Language, Brasov, Romania, 2023.

S. Mishra, P. Shukla, and R. Agarwal, "Analyzing Machine Learning Enabled Fake News Detection Techniques for Diversified Datasets," Wireless Communications and Mobile Computing, vol. 2022, no. 1, Mar. 2022, Art. no. 1575365.

H.-G. Yoon, H.-J. Song, S.-B. Park, and K. Y. Kim, "A Personalized News Recommendation using User Location and News Contents," Applied Mathematics & Information Sciences, vol. 9, no. 2L, pp. 439–449, Apr. 2015.

S. A. Abdullah, M. I. Salih, and O. M. Ahmed, "Improving Sentiment Classification using Ensemble Learning," International Journal of Informatics, Information System and Computer Engineering, vol. 6, no. 2, pp. 200–211, Dec. 2025.

H. Alamoudi et al., "Arabic Sentiment Analysis for Student Evaluation using Machine Learning and the AraBERT Transformer," Engineering, Technology & Applied Science Research, vol. 13, no. 5, pp. 11945–11952, Oct. 2023.

H. Dabjan and M.-B. Kurdy, "Efficient Streamlined Online Arabic Web Page Classification Using Artificial Bee Colony Optimization," Iraqi Journal of Science, vol. 65, pp. 7220–7236, Dec. 2024.

M. Z. Naeem, F. Rustam, A. Mehmood, Mui-zzud-din, I. Ashraf, and G. S. Choi, "Classification of movie reviews using term frequency-inverse document frequency and optimized machine learning algorithms," PeerJ Computer Science, vol. 8, Mar. 2022, Art. no. e914.

B. Yosra and M. Hakim, "Enhancing Twitter Sentiment Analysis Using Hybrid Transformer and Sequence Models," Japan Journal of Research, vol. 6, no. 1, Nov. 2024, Art. no. 089.

W. Khan, A. Daud, K. Khan, S. Muhammad, and R. Haq, "Exploring the frontiers of deep learning and natural language processing: A comprehensive overview of key challenges and emerging trends," Natural Language Processing Journal, vol. 4, Sep. 2023, Art. no. 100026.

D. Uribe, E. Cuan, and E. Urquizo, "Fine-Tuning of BERT models for Sequence Classification," in 2022 International Conference on Mechatronics, Electronics and Automotive Engineering, Cuernavaca, Mexico, 2022, pp. 140–144.

G. W. Tatchum, A. J. N. Nzeko’o, F. S. Makembe, and X. Y. Djam, "Class-Oriented Text Vectorization for Text Classification: Case Study of Job Offer Classification," Journal of Computer Science and Engineering, vol. 5, no. 2, pp. 116–136, Aug. 2024.

M. M. Alnaddaf and M. S. Başarslan, "Sentiment Analysis Using Various Machine Learning Techniques on Depression Review Data," in 2024 8th International Artificial Intelligence and Data Processing Symposium, Malatya, Turkiye, 2024, pp. 1–5.

X. Jiang, X. Ding, C. Liu, X. Li, Y. Zhang, and S. Wang, "Research on the application of computer-aided deep learning model in natural language processing," Journal of Electronics and Information Science, vol. 9, no. 3, pp. 153–159, Nov. 2024.

S. Mikaeelzadeh and A. Mirzaei, "A Review of Natural Language Processing Models for Classifying Non-Functional Software Requirements," ResearchGate, Dec. 03, 2024. [Online]. Available: https://www.researchgate.net/publication/384441061_A_Review_of_Natural_Language_Processing_Models_for_Classifying_Non-Functional_Software_Requirements.

M. A. Wani, A. A. A. El-Latif, M. ELAffendi, and A. Hussain, "AI-based Framework for Discriminating Human-authored and AI-generated Text," IEEE Transactions on Artificial Intelligence, vol. 1, no. 1, pp. 1–15, Dec. 2024.

L. Lilli et al., "Lupus Alberto: A Transformer-Based Approach for SLE Information Extraction from Italian Clinical Reports," in Proceedings of the 10th Italian Conference on Computational Linguistics (CLiC-it 2024), Pisa, Italy, 2024, pp. 510–516.

Z. Z. Chen et al., "A Survey on Large Language Models for Critical Societal Domains: Finance, Healthcare, and Law." arXiv, Nov. 21, 2024.

O. Jokinen, "Document-level embeddings and graph-augmented learning for Finnish articles," M.S. thesis, Faculty of Science, University of Helsinki, Finland, 2024.

R. Abilio, G. P. Coelho, and A. E. A. da Silva, "Evaluating Named Entity Recognition: A comparative analysis of mono- and multilingual transformer models on a novel Brazilian corporate earnings call transcripts dataset," Applied Soft Computing, vol. 166, Nov. 2024, Art. no. 112158.

D. D. Lewis, "Reuters-21578 Text Categorization Test Collection." [Online]. Available: https://www.daviddlewis.com/resources/testcollections/reuters21578/.


How to Cite

Salih, M.I., Mohammed, S.M., Ibrahim, A.K., Ahmed, O.M. and Haji, L.M. 2025. Fine-Tuning BERT for Automated News Classification. Engineering, Technology & Applied Science Research. 15, 3 (Jun. 2025), 22953–22959. DOI: https://doi.org/10.48084/etasr.10625.
