Center transfer for supervised domain adaptation

dc.authorid: Witold Pedrycz / 0000-0002-9335-9930
dc.authorscopusid: Witold Pedrycz / 56854903200
dc.authorwosid: Witold Pedrycz / FPE-7309-2022
dc.contributor.author: Huang, Xiuyu
dc.contributor.author: Zhou, Nan
dc.contributor.author: Huang, Jian
dc.contributor.author: Zhang, Huaidong
dc.contributor.author: Pedrycz, Witold
dc.contributor.author: Choi, Kup-Sze
dc.date.accessioned: 2023-08-31T10:45:23Z
dc.date.available: 2023-08-31T10:45:23Z
dc.date.issued: 2023
dc.department: İstinye Üniversitesi, Mühendislik ve Doğa Bilimleri Fakültesi, Bilgisayar Mühendisliği Bölümü
dc.description.abstract: Domain adaptation (DA) is a popular strategy for pattern recognition and classification tasks. It leverages a large amount of data from the source domain to help train the model applied in the target domain. Supervised domain adaptation (SDA) approaches are desirable when only a few labeled samples from the target domain are available, and they can be easily adopted in many real-world applications where data collection is expensive. In this study, we propose a new supervision signal, namely center transfer loss (CTL), to efficiently align features under the SDA setting in the deep learning (DL) field. Unlike most previous SDA methods, which rely on pairing up training samples, the proposed loss is trainable with a single one-stream input based on the mini-batch strategy. CTL serves two main functions during training that improve the performance of DL models, i.e., domain alignment and increasing the discriminative power of the features. CTL also dispenses with the hyper-parameter used to balance these two functions, which is its second improvement over previous approaches. Extensive experiments on well-known public datasets show that the proposed method outperforms recent state-of-the-art approaches.
dc.identifier.citation: Huang, X., Zhou, N., Huang, J., Zhang, H., Pedrycz, W., & Choi, K. S. (2023). Center transfer for supervised domain adaptation. Applied Intelligence, 1-17.
dc.identifier.doi: 10.1007/s10489-022-04414-2
dc.identifier.issn: 0924-669X
dc.identifier.issn: 1573-7497
dc.identifier.pmid: 36718382
dc.identifier.scopus: 2-s2.0-85146871386
dc.identifier.uri: http://dx.doi.org/10.1007/s10489-022-04414-2
dc.identifier.uri: https://hdl.handle.net/20.500.12713/3958
dc.identifier.wos: WOS:000918506600001
dc.identifier.wosquality: Q2
dc.indekslendigikaynak: Scopus
dc.indekslendigikaynak: PubMed
dc.indekslendigikaynak: Web of Science
dc.institutionauthor: Pedrycz, Witold
dc.language.iso: en
dc.publisher: Springer
dc.relation.ispartof: Applied Intelligence
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Supervised Domain Adaptation
dc.subject: Deep Learning
dc.subject: Center Transfer Loss
dc.subject: Transfer Learning
dc.title: Center transfer for supervised domain adaptation
dc.type: Article
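
The abstract describes CTL only at a conceptual level: a single loss, computed on one-stream mini-batches, that both aligns the source and target domains and makes features more discriminative, without a balancing hyper-parameter. The PyTorch sketch below is a hypothetical illustration of a center-based transfer loss in that spirit, not the paper's exact formulation; the class name CenterTransferSketch, the per-domain EMA center update, and the momentum value are assumptions introduced here for clarity.

# Illustrative sketch only: per-class centers are kept for each domain as
# buffers and updated with an exponential moving average (an assumption,
# not the paper's update rule). The loss sums (a) a compactness term that
# pulls each feature toward its own (domain, class) center and (b) an
# alignment term that pulls it toward the same class's center in the
# other domain, with no balancing hyper-parameter.

import torch
import torch.nn as nn


class CenterTransferSketch(nn.Module):
    def __init__(self, num_classes: int, feat_dim: int, momentum: float = 0.9):
        super().__init__()
        self.momentum = momentum
        # One center per class and per domain (index 0 = source, 1 = target).
        self.register_buffer("centers", torch.zeros(2, num_classes, feat_dim))

    @torch.no_grad()
    def _update_centers(self, feats, labels, domain: int):
        # EMA update of the centers for the classes present in the mini-batch.
        for c in labels.unique().tolist():
            batch_mean = feats[labels == c].mean(dim=0)
            self.centers[domain, c].mul_(self.momentum).add_(
                (1.0 - self.momentum) * batch_mean
            )

    def forward(self, feats, labels, domains):
        # feats:   (N, feat_dim) features from one mixed mini-batch (one stream)
        # labels:  (N,) class labels
        # domains: (N,) 0 for source samples, 1 for target samples
        labels, domains = labels.long(), domains.long()
        for d in (0, 1):
            mask = domains == d
            if mask.any():
                self._update_centers(feats[mask].detach(), labels[mask], d)

        # (a) compactness: distance to the feature's own (domain, class) center.
        own_centers = self.centers[domains, labels]
        compact = (feats - own_centers).pow(2).sum(dim=1).mean()

        # (b) alignment: distance to the same class's center in the other
        #     domain, which drags the two domains toward each other.
        cross_centers = self.centers[1 - domains, labels]
        align = (feats - cross_centers).pow(2).sum(dim=1).mean()

        # Summed without a balancing hyper-parameter, mirroring the abstract's claim.
        return compact + align

In practice, feats would come from the penultimate layer of the backbone, and a term like this would typically be added to a standard cross-entropy loss on the labeled source and target samples.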

Files

Original bundle
Now showing 1 - 1 of 1
Name: Huang-2023-Center-transfer-for-supervised-doma.pdf
Size: 2.95 MB
Format: Adobe Portable Document Format
Description: Full Text
License bundle
Now showing 1 - 1 of 1
Name: license.txt
Size: 1.44 KB
Format: Item-specific license agreed upon to submission
Description: