Browse by author "Wang, Jie"
Now showing 1 - 2 of 2
Item: TrueCome: Effective data truth discovery based on fuzzy clustering with prior constraints (Elsevier Inc., 2025)
Gao, Lijun; Wu, Fei; Wang, Jie; Yan, Zheng; Pedrycz, Witold

Data truth discovery is the process of determining accurate information from multiple conflicting data sources. Existing truth discovery schemes suffer from low efficiency and insufficient accuracy caused by noisy data and source unreliability, and in particular face three key limitations: (1) inability to leverage inter-attribute constraints for distance metric learning, (2) lack of effective mechanisms for distinguishing truth clusters in noisy streaming data, and (3) static source reliability estimation that fails to adapt to streaming data dynamics; few existing schemes can discover the truth for streaming data with noise. To overcome these problems, we propose TrueCome, a possibilistic C-Means truth discovery scheme that leverages constraints between different attributes of an object and applies dynamically updated data source reliability to discover truth for both static and streaming data. TrueCome contains two functional modules: distance learning and truth discovery. The distance learning module constructs a distance function by mining prior constraints among object attributes. The truth discovery module then obtains the true values of an object in three steps: data clustering based on data sample distances, data source reliability, and attribute weights; truth cluster identification by calculating cluster trust degrees; and truth acquisition from True Value Clusters (TVCs). In particular, TrueCome employs Maximum A Posteriori (MAP) estimation to adaptively update source reliability (i.e., source weights), allowing it to handle both static and streaming data effectively.
Extensive experiments on two real-world datasets and one synthetic dataset demonstrate the superiority of TrueCome over several baselines in terms of accuracy and efficiency, particularly for streaming data with noise. We also validate the design rationality of TrueCome through ablation studies. © 2025 Elsevier Inc.

Item: TrustGuard: GNN-based robust and explainable trust evaluation with dynamicity support (Institute of Electrical and Electronics Engineers Inc., 2024)
Wang, Jie; Yan, Zheng; Lan, Jiahe; Bertino, Elisa; Pedrycz, Witold

Trust evaluation assesses trust relationships between entities and facilitates decision-making. Machine Learning (ML) shows great potential for trust evaluation owing to its learning capabilities. In recent years, Graph Neural Networks (GNNs), as a new ML paradigm, have demonstrated superiority in dealing with graph data. This has motivated researchers to explore their use in trust evaluation, since trust relationships among entities can be modeled as a graph. However, current trust evaluation methods that employ GNNs fail to fully account for the dynamic nature of trust, overlook the adverse effects of trust-related attacks, and cannot provide convincing explanations of evaluation results. To address these problems, we propose TrustGuard, a GNN-based accurate trust evaluation model that supports trust dynamicity, is robust against typical attacks, and provides explanations through visualization. Specifically, TrustGuard is designed with a layered architecture that contains a snapshot input layer, a spatial aggregation layer, a temporal aggregation layer, and a prediction layer. Among them, the spatial aggregation layer adopts a defense mechanism to robustly aggregate local trust, and the temporal aggregation layer applies an attention mechanism for effective learning of temporal patterns.
Extensive experiments on two real-world datasets show that TrustGuard outperforms state-of-the-art GNN-based trust evaluation models in both single-timeslot and multi-timeslot trust prediction, even in the presence of attacks. In addition, TrustGuard can explain its evaluation results by visualizing both spatial and temporal views.
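The TrueCome entry above builds on possibilistic C-Means (PCM) clustering, in which noisy claims receive low typicality in every cluster rather than being forced into one. As a rough illustration of that core primitive only — not the authors' algorithm, which additionally uses constraint-learned distances, attribute weights, and MAP-updated source reliability — a minimal PCM sketch in NumPy:

```python
import numpy as np

def pcm(X, centers, m=2.0, iters=50):
    """Minimal possibilistic C-Means: returns final centers and the
    typicality matrix T (n x k). Illustrative sketch only."""
    T = np.full((len(X), len(centers)), 1.0 / len(centers))
    for _ in range(iters):
        # squared distance from each sample to each center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
        # per-cluster bandwidth eta_i: typicality-weighted mean squared distance
        eta = (T**m * d2).sum(0) / (T**m).sum(0)
        # PCM typicality update: outliers get low typicality in all clusters
        T = 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))
        w = T**m
        centers = (w.T @ X) / w.sum(0)[:, None]
    return centers, T

# six conflicting claims about one attribute, from different sources
X = np.array([[0.0], [0.1], [-0.1], [5.0], [5.1], [4.9]])
centers, T = pcm(X, centers=X[[0, 3]])
# a toy analogue of a cluster trust degree: total typicality mass per
# cluster; the highest-trust cluster would play the role of a TVC
trust = T.sum(axis=0)
```

In TrueCome the trust degree would also fold in the dynamically estimated source reliabilities, so a cluster backed by reliable sources outranks a larger cluster of noisy claims.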
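The TrustGuard abstract describes per-snapshot spatial aggregation followed by attention-weighted temporal aggregation. A minimal NumPy sketch of that two-stage pattern — all names, shapes, and weights here are hypothetical illustrations, and the real model additionally includes a robust-aggregation defense mechanism and a trained prediction layer:

```python
import numpy as np

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def trust_forward(adjs, feats, W_spatial, w_att):
    """Sketch: spatial aggregation per trust-graph snapshot, then
    attention over snapshots to produce per-node trust embeddings."""
    spatial = []
    for A, X in zip(adjs, feats):
        # spatial aggregation: mean over each node's neighbors, linear map
        deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
        spatial.append(np.tanh(((A @ X) / deg) @ W_spatial))
    H = np.stack(spatial)                          # (T, n, d)
    # temporal aggregation: per-node attention weights over snapshots
    alpha = softmax((H * w_att).sum(-1), axis=0)   # (T, n), sums to 1 over T
    return (alpha[..., None] * H).sum(axis=0), alpha  # (n, d) embeddings

# toy data: 3 snapshots of a 4-node trust graph with 2-dim features
rng = np.random.default_rng(1)
adjs = [rng.integers(0, 2, (4, 4)).astype(float) for _ in range(3)]
feats = [rng.normal(size=(4, 2)) for _ in range(3)]
emb, alpha = trust_forward(adjs, feats, rng.normal(size=(2, 2)),
                           rng.normal(size=2))
```

The resulting node embeddings would feed a prediction layer that scores trust between node pairs; the attention weights give the temporal view used for explanation.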