Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11779/1989
Full metadata record
DC Field | Value | Language
dc.contributor.author | Karamatlı, Ertuğ | -
dc.contributor.author | Kırbız, Serap | -
dc.date.accessioned | 2023-10-18T12:06:14Z | -
dc.date.available | 2023-10-18T12:06:14Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Karamatlı, E., & Kırbız, S. (2022). MixCycle: Unsupervised Speech Separation via Cyclic Mixture Permutation Invariant Training. IEEE Signal Processing Letters, 29, 2637-2641. | en_US
dc.identifier.issn | 1070-9908 | -
dc.identifier.issn | 1558-2361 | -
dc.identifier.uri | https://hdl.handle.net/20.500.11779/1989 | -
dc.identifier.uri | https://doi.org/10.1109/LSP.2022.3232276 | -
dc.description.abstract | We introduce two unsupervised source separation methods, which involve self-supervised training from single-channel two-source speech mixtures. Our first method, mixture permutation invariant training (MixPIT), enables learning a neural network model which separates the underlying sources via a challenging proxy task without supervision from the reference sources. Our second method, cyclic mixture permutation invariant training (MixCycle), uses MixPIT as a building block in a cyclic fashion for continuous learning. MixCycle gradually converts the problem from separating mixtures of mixtures into separating single mixtures. We compare our methods to common supervised and unsupervised baselines: permutation invariant training with dynamic mixing (PIT-DM) and mixture invariant training (MixIT). We show that MixCycle outperforms MixIT and reaches a performance level very close to the supervised baseline (PIT-DM) while circumventing the over-separation issue of MixIT. Also, we propose a self-evaluation technique inspired by MixCycle that estimates model performance without utilizing any reference sources. We show that it yields results consistent with an evaluation on reference sources (LibriMix) and also with an informal listening test conducted on a real-life mixtures dataset (REAL-M). | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.rights | info:eu-repo/semantics/openAccess | en_US
dc.subject | Training | en_US
dc.subject | Recording | en_US
dc.subject | Source separation | en_US
dc.subject | Time-domain analysis | en_US
dc.subject | Task analysis | en_US
dc.subject | Optimized production technology | en_US
dc.subject | Unsupervised learning | en_US
dc.subject | Blind source separation | en_US
dc.subject | deep learning | en_US
dc.subject | self-supervised learning | en_US
dc.subject | unsupervised learning | en_US
dc.title | MixCycle: Unsupervised Speech Separation via Cyclic Mixture Permutation Invariant Training | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1109/LSP.2022.3232276 | -
dc.identifier.scopus | 2-s2.0-85146250664 | en_US
dc.description.woscitationindex | Science Citation Index Expanded | -
dc.identifier.wosquality | Q2 | -
dc.description.WoSDocumentType | article | -
dc.description.WoSInternationalCollaboration | Without international collaboration - NO | en_US
dc.description.WoSPublishedMonth | January | en_US
dc.description.WoSIndexDate | 2022 | en_US
dc.description.WoSYOKperiod | YÖK - 2022-23 | en_US
dc.identifier.scopusquality | Q1 | -
dc.relation.publicationcategory | Article - International Refereed Journal - Institutional Faculty Member | en_US
dc.identifier.endpage | 2641 | en_US
dc.identifier.startpage | 2637 | en_US
dc.identifier.volume | 29 | en_US
dc.department | Faculty of Engineering, Department of Industrial Engineering | en_US
dc.relation.journal | IEEE Signal Processing Letters | en_US
dc.identifier.wos | WOS:000910559500004 | en_US
dc.institutionauthor | Kırbız, Serap | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.grantfulltext | open | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.fulltext | With Fulltext | -
item.openairetype | Article | -
crisitem.author.dept | 02.05. Department of Electrical and Electronics Engineering | -
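The abstract above outlines the MixPIT proxy objective that MixCycle builds on: the model receives a mixture of two mixtures and is trained to recover the two observed single mixtures under the best output permutation. The snippet below is a minimal, hedged sketch of how such an objective could look in PyTorch; it is not the authors' implementation, and the two-output separator interface, the use of SI-SNR as the training signal, and all names (si_snr, mixpit_loss, model, mix1, mix2) are assumptions made only for illustration.

# Minimal, illustrative sketch of the MixPIT proxy objective described in the
# abstract -- NOT the authors' reference implementation. The separator
# interface, the SI-SNR objective, and all names here are assumptions.
import itertools
import torch


def si_snr(est, ref, eps=1e-8):
    """Scale-invariant SNR (dB) between estimates and references, computed
    over the last (time) dimension."""
    ref = ref - ref.mean(dim=-1, keepdim=True)
    est = est - est.mean(dim=-1, keepdim=True)
    proj = (est * ref).sum(dim=-1, keepdim=True) * ref / (
        ref.pow(2).sum(dim=-1, keepdim=True) + eps
    )
    noise = est - proj
    return 10 * torch.log10(
        proj.pow(2).sum(dim=-1) / (noise.pow(2).sum(dim=-1) + eps) + eps
    )


def mixpit_loss(model, mix1, mix2):
    """MixPIT proxy task: the model separates a mixture of two mixtures, and
    the two observed single mixtures serve as targets under the best output
    permutation. MixCycle (not sketched here) reuses this loss on
    pseudo-mixtures built from the model's own separated outputs, so the
    proxy task gradually approaches ordinary single-mixture separation."""
    mom = mix1 + mix2                        # mixture of mixtures: (batch, time)
    est = model(mom)                         # assumed output shape: (batch, 2, time)
    refs = torch.stack([mix1, mix2], dim=1)  # targets: (batch, 2, time)
    perm_losses = []
    for perm in itertools.permutations(range(2)):
        perm_losses.append(-si_snr(est[:, list(perm)], refs).mean(dim=-1))
    # Keep the lower loss of the two possible output-target assignments.
    return torch.stack(perm_losses, dim=0).min(dim=0).values.mean()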
Appears in Collections: Endüstri Mühendisliği Bölümü Koleksiyonu / Industrial Engineering Department Collection
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Description | Size | Format
MixCycle_Unsupervised_Speech_Separation_via_Cyclic_Mixture_Permutation_Invariant_Training.pdf | Full Text - Article | 604.92 kB | Adobe PDF
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.