 

Deep learning methods with pre-trained word embeddings and pre-trained transformers for extreme multi label text classification

dc.authorscopusid 57478988600
dc.authorscopusid 7006606908
dc.contributor.author Erciyes, N.E.
dc.contributor.author Görür, A.K.
dc.date.accessioned 2023-07-19T13:21:54Z
dc.date.available 2023-07-19T13:21:54Z
dc.date.issued 2021
dc.department Çankaya University en_US
dc.department-temp Erciyes N.E., Computer Engineering Dept., Çankaya University, Ankara, Turkey; Görür A.K., Software Engineering Dept., Çankaya University, Ankara, Turkey en_US
dc.description.abstract In recent years, there has been a considerable increase in textual documents online. This increase requires the creation of highly improved machine learning methods to classify text in many different domains. The effectiveness of these machine learning methods depends on the model's capacity to understand the complex nature of unstructured data and the relations among its features. Many different machine learning methods, such as SVM, kNN, and Rocchio classification, have long been proposed to solve text classification problems. These shallow learning methods have achieved undoubted success in many different domains. For big and unstructured data like text, deep learning methods, which can learn representations and features from the input data without using any feature extraction methods, have proven to be one of the major solutions. In this study, we explore the accuracy of recently proposed deep learning methods for multi-label text classification, ranging from simple RNN and CNN models to pre-trained transformer models. We evaluated these methods' performance by computing multi-label evaluation metrics and compared the results with previous studies. © 2021 IEEE en_US
dc.identifier.citation Erciyes, Necdet Eren (2022). Deep learning methods with pre-trained word embeddings and pre-trained transformers for extreme multi label text classification / Çoklu etiket sınıflandırması için önceden eğitilmiş kelime vektörleri ve önceden eğitilmiş transformer modelleri ile derin öğrenme yöntemleri. Published master's thesis. Ankara: Çankaya Üniversitesi Fen Bilimleri Enstitüsü. en_US
dc.identifier.doi 10.1109/UBMK52708.2021.9558977
dc.identifier.endpage 55 en_US
dc.identifier.isbn 9781665429085
dc.identifier.scopus 2-s2.0-85122120404
dc.identifier.scopusquality N/A
dc.identifier.startpage 50 en_US
dc.identifier.uri https://doi.org/10.1109/UBMK52708.2021.9558977
dc.identifier.wosquality N/A
dc.language.iso en en_US
dc.publisher Institute of Electrical and Electronics Engineers Inc. en_US
dc.relation.ispartof Proceedings - 6th International Conference on Computer Science and Engineering, UBMK 2021 -- 15 September 2021 through 17 September 2021 -- Ankara en_US
dc.relation.publicationcategory Conference Item - International - Institutional Faculty Member en_US
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.scopus.citedbyCount 4
dc.subject Deep Learning en_US
dc.subject Machine Learning en_US
dc.subject Multi-Label Text Classification en_US
dc.subject Transformers en_US
dc.subject Word Embedding en_US
dc.title Deep learning methods with pre-trained word embeddings and pre-trained transformers for extreme multi label text classification tr_TR
dc.title Deep Learning Methods With Pre-Trained Word Embeddings and Pre-Trained Transformers for Extreme Multi-Label Text Classification en_US
dc.type Conference Object en_US
dspace.entity.type Publication
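
The abstract above refers to computing multi-label evaluation metrics for the compared models. As a minimal, hypothetical illustration (assumed setup, not code from the paper), the Python sketch below computes common multi-label scores with scikit-learn, assuming ground truth and predictions are given as binary label-indicator arrays named y_true and y_pred.

# Minimal sketch (assumption, not the authors' code): common multi-label
# evaluation metrics over binary indicator arrays, using scikit-learn.
import numpy as np
from sklearn.metrics import f1_score, hamming_loss, precision_score, recall_score

# Hypothetical ground truth and predictions for 4 documents over 5 labels.
y_true = np.array([[1, 0, 1, 0, 0],
                   [0, 1, 0, 0, 1],
                   [1, 1, 0, 0, 0],
                   [0, 0, 0, 1, 1]])
y_pred = np.array([[1, 0, 0, 0, 0],
                   [0, 1, 0, 0, 1],
                   [1, 0, 0, 1, 0],
                   [0, 0, 0, 1, 1]])

print("Micro-F1:", f1_score(y_true, y_pred, average="micro"))
print("Macro-F1:", f1_score(y_true, y_pred, average="macro"))
print("Micro-precision:", precision_score(y_true, y_pred, average="micro"))
print("Micro-recall:", recall_score(y_true, y_pred, average="micro"))
print("Hamming loss:", hamming_loss(y_true, y_pred))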

Files

Original bundle

Name: thesis.pdf
Size: 1.06 MB
Format: Adobe Portable Document Format
Description: Author version

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission