Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11779/1804
Full metadata record
DC Field: Value
dc.contributor.author: Bayram, Buket
dc.contributor.author: Kulavuz, Bahadır
dc.contributor.author: Ertuğrul, Berkay
dc.contributor.author: Bayram, Bülent
dc.contributor.author: Bakırman, Tolga
dc.contributor.author: Çakar, Tuna
dc.contributor.author: Doğan, Metehan
dc.date.accessioned: 2022-07-19T12:31:56Z
dc.date.available: 2022-07-19T12:31:56Z
dc.date.issued: 2022
dc.identifier.citation: Bayram, B., Kulavuz, B., Ertugrul, B., Bayram, B., Bakirman, T., Cakar, T., & Doğan, M. (2022). Classification of Skin Lesion Images with Deep Learning Approaches. Baltic Journal of Modern Computing, 10(2), pp. 241-250. https://doi.org/10.22364/bjmc.2022.10.2.10
dc.identifier.issn: 2255-8950
dc.identifier.issn: 2255-8942
dc.identifier.uri: https://doi.org/10.22364/bjmc.2022.10.2.10
dc.identifier.uri: https://hdl.handle.net/20.500.11779/1804
dc.description.abstract: Skin cancer is one of the most dangerous cancer types in the world. As with any other cancer type, early detection is the key factor in the patient's recovery. Integrating artificial intelligence with medical image processing can help reduce misdiagnosis. The purpose of this article is to show that deep learning-based image classification can aid doctors in the healthcare field in better diagnosing skin lesions. The VGG16 and ResNet50 architectures were chosen to examine the effect of CNN networks on the classification of skin cancer types. For the implementation of these networks, the ISIC 2019 Challenge dataset was chosen for its richness of data. As a result of the experiments, confusion matrices were obtained, and the ResNet50 architecture achieved 91.23% accuracy while the VGG16 architecture achieved 83.89%. The study shows that deep learning methods can be sufficiently exploited for skin lesion image classification. © 2022 Baltic Journal of Modern Computing. All rights reserved.
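The accuracy figures quoted in the abstract are derived from confusion matrices; the standard overall-accuracy metric is the matrix trace divided by the total count. A minimal sketch of that computation (the matrix below uses illustrative numbers, not the paper's actual results):

```python
def accuracy_from_confusion(cm):
    """Overall accuracy for a square confusion matrix
    (rows = true class, columns = predicted class):
    sum of the diagonal divided by the sum of all entries."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

# Toy 3-class confusion matrix (hypothetical values for illustration)
cm = [[50, 3, 2],
      [4, 45, 1],
      [2, 2, 41]]
print(round(accuracy_from_confusion(cm), 4))  # → 0.9067
```

The same calculation applied to the paper's 8-class ISIC 2019 confusion matrices yields the reported 91.23% (ResNet50) and 83.89% (VGG16) figures.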
dc.language.iso: en
dc.publisher: University of Latvia
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Deep Learning
dc.subject: Image classification
dc.subject: ISIC 2019
dc.subject: ResNet50
dc.subject: VGG16
dc.title: Classification of skin lesion images with deep learning approaches
dc.type: Article
dc.identifier.doi: 10.22364/bjmc.2022.10.2.10
dc.identifier.scopus: 2-s2.0-85133124398
dc.authorid: Tuna Çakar / 0000-0001-8594-7399
dc.description.woscitationindex: Emerging Sources Citation Index
dc.description.WoSDocumentType: Article; Proceedings Paper
dc.description.WoSInternationalCollaboration: Not produced through international collaboration - NO
dc.description.WoSPublishedMonth: July
dc.description.WoSIndexDate: 2022
dc.description.WoSYOKperiod: YÖK - 2021-22
dc.identifier.scopusquality: Q3
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.identifier.endpage: 250
dc.identifier.startpage: 241
dc.identifier.issue: 2
dc.identifier.volume: 10
dc.department: Faculty of Engineering, Department of Computer Engineering
dc.relation.journal: Baltic Journal of Modern Computing
dc.identifier.wos: WOS:000821052300011
dc.institutionauthor: Çakar, Tuna
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.grantfulltext: open
item.languageiso639-1: en
item.cerifentitytype: Publications
item.fulltext: With Fulltext
item.openairetype: Article
crisitem.author.dept: 02.02. Department of Computer Engineering
Appears in Collections: Department of Computer Engineering Collection
Scopus Indexed Publications Collection
WoS Indexed Publications Collection
Files in This Item:
File: 10_2_10_Bayram.pdf (Full Text - Article, 676.46 kB, Adobe PDF)
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.