A Generalized f-Divergence With Applications in Pattern Classification

dc.authorscopusidWitold Pedrycz / 58861905800
dc.authorwosidWitold Pedrycz / HJZ-2779-2023
dc.contributor.authorXiao, Fuyuan
dc.contributor.authorDing, Weiping
dc.contributor.authorPedrycz, Witold
dc.date.accessioned2025-04-18T08:27:11Z
dc.date.available2025-04-18T08:27:11Z
dc.date.issued2025
dc.departmentİstinye Üniversitesi, Mühendislik ve Doğa Bilimleri Fakültesi, Bilgisayar Mühendisliği Bölümü
dc.description.abstractIn multisource information fusion (MSIF), Dempster-Shafer evidence (DSE) theory offers a useful framework for reasoning under uncertainty. However, measuring the divergence between belief functions within this theory remains an unresolved challenge, particularly for managing conflicts in MSIF, which is crucial for enhancing decision-making. In this paper, several divergence and distance functions are proposed to quantitatively measure discrimination between belief functions in DSE theory, including the reverse evidential Kullback-Leibler (REKL) divergence, evidential Jeffrey's (EJ) divergence, evidential Jensen-Shannon (EJS) divergence, evidential χ² (Eχ²) divergence, evidential symmetric χ² (ESχ²) divergence, evidential triangular (ET) discrimination, evidential Hellinger (EH) distance, and evidential total variation (ETV) distance. On this basis, a generalized f-divergence, also called the evidential f-divergence (Ef divergence), is proposed. Depending on the choice of kernel function, the Ef divergence degenerates into several specific classes: the EKL, REKL, EJ, EJS, Eχ², and ESχ² divergences, ET discrimination, and EH and ETV distances. Notably, when basic belief assignments (BBAs) are transformed into probability distributions, these classes of Ef divergence revert to their classical counterparts in statistics and information theory. In addition, several Ef-MSIF algorithms are proposed for pattern classification based on these classes of Ef divergence. The Ef-MSIF algorithms are evaluated on real-world datasets to demonstrate their practical effectiveness in solving classification problems. In summary, this work represents the first attempt to extend the classical f-divergence within the DSE framework, capitalizing on the distinct properties of BBA functions.
Experimental results show that the proposed Ef-MSIF algorithms improve classification accuracy, with the best-performing Ef-MSIF algorithm achieving an overall performance gap approximately 1.22 times smaller than that of the suboptimal method and 14.12 times smaller than that of the worst-performing method. © 1989-2012 IEEE.
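The abstract notes that, once BBAs are transformed into probability distributions, the Ef divergence classes revert to their classical counterparts. As background, here is a minimal sketch of the classical f-divergence D_f(P‖Q) = Σᵢ qᵢ·f(pᵢ/qᵢ) over probability distributions, showing how different kernel functions f recover the Kullback-Leibler divergence, Pearson χ² divergence, and total variation distance. The kernel forms used here are the standard statistical definitions, assumed for illustration; they are not taken from the paper's own Ef formulation over belief functions.

```python
import numpy as np

def f_divergence(p, q, f):
    """Classical f-divergence D_f(P || Q) = sum_i q_i * f(p_i / q_i).

    p, q: probability distributions with matching, strictly positive support.
    f: convex kernel with f(1) = 0, applied elementwise to the ratio p/q.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Standard kernel choices (assumed textbook forms, not the paper's kernels):
kl_kernel   = lambda t: t * np.log(t)        # Kullback-Leibler divergence
chi2_kernel = lambda t: (t - 1.0) ** 2       # Pearson chi-squared divergence
tv_kernel   = lambda t: 0.5 * np.abs(t - 1)  # total variation distance

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d_kl = f_divergence(p, q, kl_kernel)
d_tv = f_divergence(p, q, tv_kernel)
```

With the KL kernel, the sum telescopes to Σᵢ pᵢ·log(pᵢ/qᵢ), and with the TV kernel it equals ½·Σᵢ|pᵢ − qᵢ|, matching the familiar definitions; any f-divergence of a distribution with itself is 0 since f(1) = 0.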
dc.identifier.citationXiao, F., Ding, W., & Pedrycz, W. (2025). A Generalized f-Divergence With Applications in Pattern Classification. IEEE Transactions on Knowledge and Data Engineering.
dc.identifier.doi10.1109/TKDE.2025.3530524
dc.identifier.issn1041-4347
dc.identifier.scopusqualityQ1
dc.identifier.urihttp://dx.doi.org/10.1109/TKDE.2025.3530524
dc.identifier.urihttps://hdl.handle.net/20.500.12713/6568
dc.indekslendigikaynakScopus
dc.institutionauthorPedrycz, Witold
dc.institutionauthoridWitold Pedrycz / 0000-0002-9335-9930
dc.language.isoen
dc.publisherIEEE Computer Society
dc.relation.ispartofIEEE Transactions on Knowledge and Data Engineering
dc.relation.publicationcategoryMakale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı
dc.rightsinfo:eu-repo/semantics/closedAccess
dc.subjectBelief Divergence
dc.subjectDecision-making
dc.subjectPattern Classification
dc.titleA Generalized f-Divergence With Applications in Pattern Classification
dc.typeArticle
