A Generalized f-Divergence With Applications in Pattern Classification

Date

2025

Journal Title

Journal ISSN

Volume Title

Publisher

IEEE Computer Society

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

In multisource information fusion (MSIF), Dempster-Shafer evidence (DSE) theory offers a useful framework for reasoning under uncertainty. However, measuring the divergence between belief functions within this theory remains an unresolved challenge, particularly in managing conflicts in MSIF, which is crucial for enhancing decision-making. In this paper, several divergence and distance functions are proposed to quantitatively measure the discrimination between belief functions in DSE theory, including the reverse evidential Kullback-Leibler (REKL) divergence, evidential Jeffrey's (EJ) divergence, evidential Jensen-Shannon (EJS) divergence, evidential χ² (Eχ²) divergence, evidential symmetric χ² (ESχ²) divergence, evidential triangular (ET) discrimination, evidential Hellinger (EH) distance, and evidential total variation (ETV) distance. On this basis, a generalized f-divergence, also called the evidential f-divergence (Ef divergence), is proposed. Depending on the choice of kernel function, the Ef divergence reduces to several specific classes: the EKL, REKL, EJ, EJS, Eχ², and ESχ² divergences, the ET discrimination, and the EH and ETV distances. Notably, when basic belief assignments (BBAs) are transformed into probability distributions, these classes of Ef divergence revert to their classical counterparts in statistics and information theory. In addition, several Ef-MSIF algorithms are proposed for pattern classification based on the classes of Ef divergence. These Ef-MSIF algorithms are evaluated on real-world datasets to demonstrate their practical effectiveness in solving classification problems. In summary, this work represents the first attempt to extend the classical f-divergence within the DSE framework, capitalizing on the distinct properties of BBA functions. Experimental results show that the proposed Ef-MSIF algorithms improve classification accuracy, with the best-performing Ef-MSIF algorithm achieving an overall performance difference approximately 1.22 times smaller than that of the suboptimal method and 14.12 times smaller than that of the worst-performing method.
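For context on the classical counterparts the abstract refers to, the following is a minimal sketch of the standard f-divergence between probability distributions P = (p_i) and Q = (q_i) and the usual kernel (generator) choices from information theory; this is an assumption based on the standard definitions, not a reproduction of the paper's evidential (BBA-based) formulas.

D_f(P \,\|\, Q) = \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right), \qquad f \text{ convex on } (0,\infty), \; f(1) = 0

f(t) = t \log t \quad (\text{Kullback-Leibler divergence})
f(t) = -\log t \quad (\text{reverse Kullback-Leibler divergence})
f(t) = (t-1)\log t \quad (\text{Jeffrey's divergence, i.e., symmetrized KL})
f(t) = \tfrac{t}{2}\log\tfrac{2t}{t+1} + \tfrac{1}{2}\log\tfrac{2}{t+1} \quad (\text{Jensen-Shannon divergence})
f(t) = (t-1)^2 \quad (\chi^2 \text{ divergence})
f(t) = (t-1)^2(t+1)/t \quad (\text{symmetric } \chi^2 \text{ divergence})
f(t) = (t-1)^2/(t+1) \quad (\text{triangular discrimination})
f(t) = (\sqrt{t}-1)^2 \quad (\text{squared Hellinger distance, up to a constant factor})
f(t) = |t-1|/2 \quad (\text{total variation distance})

The evidential versions proposed in the paper replace the probability distributions with BBA functions; their exact definitions are given in the article itself.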

Description

Keywords

Belief Divergence, Decision-making, Pattern Classification

Source

IEEE Transactions on Knowledge and Data Engineering

WoS Q Value

Scopus Q Value

Q1

Volume

Issue

Citation

Xiao, F., Ding, W., & Pedrycz, W. (2025). A Generalized f-Divergence With Applications in Pattern Classification. IEEE Transactions on Knowledge and Data Engineering.