Authors: Ünal, Muhammed Cihat; Yurtalan, Gökhan; Karatas, Yahya Bahadır; Karamanlıoğlu, Alper; Demirel, Berkan
Collection: 01. Çankaya Üniversitesi
Date accessioned: 2025-10-06
Date available: 2025-10-06
Date issued: 2025
ISBN: 9798331566555
DOI: https://doi.org/10.1109/SIU66497.2025.11112276
Handle: https://hdl.handle.net/20.500.12416/15680
Publisher: Isik University

Abstract: In recent years, transformer-based models pre-trained on extensive corpora have played a critical role in advancing Natural Language Processing methods. In particular, BERT-based approaches have demonstrated remarkable performance across a variety of tasks, offering robust capabilities for deep semantic understanding of text. Despite these advances, studies applying these technologies to the aviation sector remain scarce. This paper develops a multi-class classification model for aviation-specific texts using BERT variants. The study covers collecting aircraft-related web content, labeling it, and training the models. The dataset is described in detail, and the results are assessed using the macro F1-score and accuracy of the different models. © 2025 Elsevier B.V., All rights reserved.

Language: tr
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Fighter Aircraft; Natural Language Processing Systems; Personnel Training; Text Processing; Aviation Sector; Classification Models; Labelings; Language Processing; Multi-Class Classification; Natural Languages; Performance; Robust Capability; Text Classification; Web Content; Training Aircraft
Title (Turkish): AviBERT: Transformer Tabanlı Hava Aracı Metni Sınıflandırma
Title (English): AviBERT: Transformer-based Aircraft Text Classification
Type: Conference Object
DOI: 10.1109/SIU66497.2025.11112276
Scopus EID: 2-s2.0-105015382990
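
Note: As a rough illustration of the workflow the abstract describes (fine-tuning a pre-trained BERT variant for multi-class classification of aviation texts and scoring it with accuracy and macro F1), the following Python sketch uses the Hugging Face transformers and datasets libraries with scikit-learn metrics. The checkpoint name, label set, and placeholder texts are assumptions for illustration only; the paper's actual dataset, labels, and BERT variants are not reproduced here.

    # Minimal sketch, assuming a Hugging Face-style fine-tuning workflow.
    # Model name, labels, and example texts below are hypothetical.
    import numpy as np
    from datasets import Dataset
    from sklearn.metrics import accuracy_score, f1_score
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    MODEL_NAME = "dbmdz/bert-base-turkish-cased"   # assumed BERT variant
    LABELS = ["fighter", "trainer", "transport", "helicopter"]  # hypothetical classes

    # Hypothetical labeled web texts about aircraft (replace with real data).
    train_data = {"text": ["..."], "label": [0]}
    eval_data = {"text": ["..."], "label": [0]}

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

    def tokenize(batch):
        # Tokenize and pad texts to a fixed length for batching.
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=256)

    train_ds = Dataset.from_dict(train_data).map(tokenize, batched=True)
    eval_ds = Dataset.from_dict(eval_data).map(tokenize, batched=True)

    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=len(LABELS))

    def compute_metrics(eval_pred):
        # Report the two scores mentioned in the abstract: accuracy and macro F1.
        logits, labels = eval_pred
        preds = np.argmax(logits, axis=-1)
        return {"accuracy": accuracy_score(labels, preds),
                "macro_f1": f1_score(labels, preds, average="macro")}

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="avibert-out",
                               num_train_epochs=3,
                               per_device_train_batch_size=16),
        train_dataset=train_ds,
        eval_dataset=eval_ds,
        compute_metrics=compute_metrics,
    )
    trainer.train()
    print(trainer.evaluate())  # prints eval_accuracy and eval_macro_f1

The same loop can be repeated over several BERT checkpoints to compare their macro F1-scores and accuracies, which mirrors the multi-model comparison described in the abstract.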