Yazar "Pal, Nikhil R." seçeneğine göre listele
Listeleniyor 1 - 3 / 3
Item: Adaptive Nonstationary Fuzzy Neural Network (Elsevier, 2024). Authors: Chang, Qin; Zhang, Zhen; Wei, Fanyue; Wang, Jian; Pedrycz, Witold; Pal, Nikhil R.
Abstract: Fuzzy neural network (FNN) plays an important role as an inference system in practical applications. To enhance its ability to handle uncertainty without incurring high computational cost, and to take variations in rules into consideration as well, we propose a new inference framework: the nonstationary fuzzy neural network (NFNN). The NFNN is composed of a series of zero-order TSK FNNs with the same structure but slightly perturbed fuzzy sets in the corresponding neurons; it is inspired by nonstationary fuzzy sets and can mimic the variation in the human decision-making process. To obtain a concise and adaptive rule base for the NFNN, a modified affinity propagation (MAP) clustering method is proposed. MAP determines the number of rules in an adaptive manner and is used to initialize the rule parameters of the NFNN, yielding the Adaptive NFNN (ANFNN). Numerical experiments have been carried out over 17 classification datasets and three regression datasets. The experimental results demonstrate that ANFNN exhibits better accuracy, generalization ability, and fault-tolerance than the classical type-1 fuzzy neural network. On 15 of the 17 classification datasets, ANFNN achieves the same or better accuracy than interval type-2 FNNs in about half the time. This work confirms the feasibility of integrating simple-structured type-1 TSK FNNs to achieve the performance of interval type-2 FNNs, and shows that ANFNN can be a more accurate and reliable alternative to the classical type-1 FNN.
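As a rough illustration of the ensemble-of-perturbed-TSK-systems idea described in this abstract (a minimal sketch, not the authors' implementation; the Gaussian membership functions, the perturbation model, and all names are assumptions), the following Python snippet averages the outputs of a zero-order TSK system evaluated under slightly perturbed fuzzy-set parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_order_tsk(x, centers, sigmas, consequents):
    """Zero-order TSK output for one input vector x.

    centers, sigmas: (n_rules, n_features) Gaussian MF parameters.
    consequents:     (n_rules,) constant rule consequents.
    """
    # Firing strength of each rule: product of per-feature Gaussian memberships.
    firing = np.prod(np.exp(-((x - centers) ** 2) / (2 * sigmas ** 2)), axis=1)
    # Normalized weighted sum of the constant consequents.
    return np.dot(firing, consequents) / (firing.sum() + 1e-12)

def nonstationary_ensemble(x, centers, sigmas, consequents,
                           n_copies=10, noise_scale=0.05):
    """Average the outputs of several TSK systems whose membership
    functions are slightly perturbed copies of a shared base system."""
    outputs = []
    for _ in range(n_copies):
        c = centers + noise_scale * rng.standard_normal(centers.shape)
        s = sigmas * (1.0 + noise_scale * rng.standard_normal(sigmas.shape))
        outputs.append(zero_order_tsk(x, c, np.abs(s) + 1e-6, consequents))
    return np.mean(outputs)

# Toy usage: 3 rules over 2 features.
centers = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]])
sigmas = np.full((3, 2), 0.3)
consequents = np.array([-1.0, 0.0, 1.0])
print(nonstationary_ensemble(np.array([0.4, 0.6]), centers, sigmas, consequents))
```

Averaging over perturbed copies is one simple way to mimic the variation that nonstationary fuzzy sets introduce; the published ANFNN additionally initializes its rule base with modified affinity propagation, which is not reproduced here.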
Item: Bi-Level Spectral Feature Selection (IEEE, 2024). Authors: Hu, Zebiao; Wang, Jian; Zhang, Kai; Pedrycz, Witold; Pal, Nikhil R.
Abstract: Unsupervised feature selection (UFS) aims to learn an indicator matrix, relying on some characteristics of the high-dimensional data, to identify the features to be selected. However, traditional unsupervised methods operate only at the feature level, i.e., they directly select useful features by feature ranking. Such methods pay no attention to interaction with other tasks such as classification, which severely degrades their feature selection performance. In this article, we propose a UFS method that also takes the classification level into account and selects features that perform well in both clustering and classification. To achieve this, we design a bi-level spectral feature selection (BLSFS) method, which combines the classification level and the feature level. More concretely, at the classification level, we first apply spectral clustering to generate pseudolabels and then train a linear classifier to obtain the optimal regression matrix. At the feature level, we select useful features by maintaining the intrinsic structure of the data in the embedding space with the regression matrix learned at the classification level, which in turn guides classifier training. A balancing parameter seamlessly bridges the classification and feature levels into a unified framework. A series of experiments on 12 benchmark datasets demonstrates the superiority of BLSFS in both clustering and classification performance.
Item: Takagi-Sugeno-Kang Fuzzy Systems for High-Dimensional Multilabel Classification (IEEE, 2024). Authors: Bian, Ziwei; Chang, Qin; Wang, Jian; Pedrycz, Witold; Pal, Nikhil R.
Abstract: Multilabel classification (MLC) refers to associating each instance with multiple labels simultaneously. MLC has gained much importance because it better reflects the complexity of real-world classification problems. A fuzzy system (FS) has excellent nonlinear modeling capability and strong interpretability, which makes it a promising model for complex MLC problems. However, it is widely known that FSs suffer from the "curse of dimensionality." Here, an adaptive membership function (MF), along with its generalized version, is proposed to address high-dimensional problems. These MFs can effectively overcome "numeric underflow" in an FS while preserving interpretability as much as possible. On this basis, a novel fuzzy-rule-based MLC framework, the multilabel high-dimensional Takagi-Sugeno-Kang fuzzy system (ML-HDTSK FS), is proposed. The model can handle data with more than ten thousand dimensions. In addition, ML-HDTSK FS uses a decomposed label-correlation learning strategy to efficiently capture both high- and low-level relationships between labels, and adopts a group L21 penalty to learn label-specific features. Combining these two multilabel learning strategies with the novel adaptive MF makes ML-HDTSK FS a more powerful tool for various MLC problems. The effectiveness of ML-HDTSK FS is demonstrated on 17 benchmark multilabel datasets, and its performance is compared with 11 MLC algorithms. The experimental results confirm the validity of the proposed ML-HDTSK FS and demonstrate its superiority in dealing with MLC problems, especially high-dimensional ones.
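The "numeric underflow" mentioned in the last abstract arises because a rule's firing strength is typically the product of one membership value per input feature, and a product of ten thousand numbers in (0, 1] collapses to zero in double precision. The sketch below (a generic Python illustration, not the adaptive membership function proposed in the paper; all parameter values are assumptions) demonstrates the failure mode and a common log-domain workaround:

```python
import numpy as np

rng = np.random.default_rng(1)
n_rules, n_features = 4, 10_000  # ten-thousand-dimensional input

x = rng.random(n_features)
centers = rng.random((n_rules, n_features))
sigmas = np.full((n_rules, n_features), 0.5)

# Per-feature Gaussian membership values, each in (0, 1].
memberships = np.exp(-((x - centers) ** 2) / (2 * sigmas ** 2))

# Naive firing strength: the product over 10,000 memberships underflows to 0.
naive = np.prod(memberships, axis=1)
print(naive)                      # -> [0. 0. 0. 0.]

# Log-domain firing strengths stay finite and comparable across rules.
log_firing = np.sum(np.log(memberships), axis=1)
weights = np.exp(log_firing - log_firing.max())   # shift before exponentiating
normalized = weights / weights.sum()
print(normalized)                 # normalized rule weights remain well-defined
```

Working in the log domain keeps the relative rule strengths usable; the paper's adaptive MF addresses the same underflow issue while aiming to retain the interpretability of the fuzzy sets.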