Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11779/1519
Full metadata record
DC Field | Value | Language
dc.contributor.author | Aras, Gizem | -
dc.contributor.author | Makaroğlu, Didem | -
dc.contributor.author | Demir, Şeniz | -
dc.contributor.author | Çakır, Altan | -
dc.date.accessioned | 2021-07-27T10:47:43Z | -
dc.date.available | 2021-07-27T10:47:43Z | -
dc.date.issued | 2021 | -
dc.identifier.issn | 0957-4174 | -
dc.identifier.uri | https://doi.org/10.1016/j.eswa.2021.115049 | -
dc.identifier.uri | https://hdl.handle.net/20.500.11779/1519 | -
dc.description.abstract | Named entity recognition (NER) is an extensively studied task that extracts and classifies named entities in a text. NER is crucial not only in downstream language processing applications such as relation extraction and question answering but also in large-scale big data operations such as real-time analysis of online digital media content. Recent research efforts on Turkish, a less studied language with a morphologically rich nature, have demonstrated the effectiveness of neural architectures on well-formed texts and yielded state-of-the-art results by formulating the task as a sequence tagging problem. In this work, we empirically investigate the use of recent neural architectures (bidirectional long short-term memory (BiLSTM) and Transformer-based networks) proposed for Turkish NER tagging in the same setting. Our results demonstrate that Transformer-based networks, which can model long-range context, overcome the limitations of BiLSTM networks where different input features at the character, subword, and word levels are utilized. We also propose a Transformer-based network with a conditional random field (CRF) layer that leads to the state-of-the-art result (95.95% F-measure) on a common dataset. Our study contributes to the literature that quantifies the impact of transfer learning on processing morphologically rich languages. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.rights | info:eu-repo/semantics/openAccess | en_US
dc.subject | Named entity recognition | en_US
dc.subject | Turkish | en_US
dc.subject | Transfer learning | en_US
dc.subject | CRF | en_US
dc.subject | Digital media industry | en_US
dc.title | An evaluation of recent neural sequence tagging models in Turkish named entity recognition | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.eswa.2021.115049 | -
dc.identifier.scopus | 2-s2.0-85107884455 | en_US
dc.authorid | Author ID | -
dc.description.woscitationindex | Science Citation Index Expanded | -
dc.identifier.wosquality | Q1 | -
dc.description.WoSDocumentType | Article |
dc.description.WoSInternationalCollaboration | Without international collaboration - NO | en_US
dc.description.WoSPublishedMonth | November | en_US
dc.description.WoSIndexDate | 2021 | en_US
dc.description.WoSYOKperiod | YÖK - 2021-22 | en_US
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US
dc.identifier.endpage | 11 | en_US
dc.identifier.startpage | 1 | en_US
dc.identifier.issue | 15 | en_US
dc.identifier.volume | 182 | en_US
dc.department | Faculty of Engineering, Department of Computer Engineering | en_US
dc.relation.journal | Expert Systems with Applications | en_US
dc.identifier.wos | WOS:000688460900011 | en_US
dc.institutionauthor | Demir, Şeniz | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.grantfulltext | open | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.fulltext | With Fulltext | -
item.openairetype | Article | -
crisitem.author.dept | 02.02. Department of Computer Engineering | -
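The abstract above describes formulating Turkish NER as sequence tagging with a Transformer-based encoder topped by a CRF layer. The sketch below is a rough illustration of that general architecture, not the authors' implementation: it assumes PyTorch, the Hugging Face transformers library, the pytorch-crf package, the dbmdz/bert-base-turkish-cased checkpoint as an example Turkish encoder, and a placeholder tag set and sentence.

```python
# Minimal sketch of a Transformer-encoder + CRF sequence tagger (illustrative only).
# Assumptions: PyTorch, `transformers`, and `pytorch-crf`; checkpoint name and
# tag-set size are examples, not taken from the paper.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer
from torchcrf import CRF


class TransformerCRFTagger(nn.Module):
    def __init__(self, model_name: str, num_tags: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.emission = nn.Linear(hidden, num_tags)  # per-token tag scores
        self.crf = CRF(num_tags, batch_first=True)   # learned tag-transition scores

    def forward(self, input_ids, attention_mask, tags=None):
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        emissions = self.emission(hidden_states)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag sequence.
        return self.crf.decode(emissions, mask=mask)


# Illustrative usage with an assumed Turkish BERT checkpoint.
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
model = TransformerCRFTagger("dbmdz/bert-base-turkish-cased", num_tags=7)  # e.g. BIO tags for PER/LOC/ORG + O
batch = tokenizer(["Mustafa Kemal Atatürk Ankara'da konuştu."],
                  return_tensors="pt", truncation=True)
with torch.no_grad():
    predicted_tag_ids = model(batch["input_ids"], batch["attention_mask"])
print(predicted_tag_ids)
```

The CRF layer is there because BIO-style tags are interdependent (e.g. I-PER cannot follow B-LOC), so decoding the whole sequence jointly, rather than token by token, is what typically motivates adding a CRF on top of the encoder.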
Appears in Collections:
Bilgisayar Mühendisliği Bölümü Koleksiyonu / Computer Engineering Department Collection
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Description | Size | Format
1-s2.0-S0957417421004905-main.pdf | Full Text / Tam Metin | 1.17 MB | Adobe PDF
SCOPUS Citations: 20 (checked on Aug 1, 2024)
Web of Science Citations: 13 (checked on Jun 23, 2024)
Page view(s): 4 (checked on Jun 26, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.