Yazar "Yoon, Jin Hee" seçeneğine göre listele
Listeleniyor 1 - 2 / 2
Item
Design of progressive fuzzy polynomial neural networks through gated recurrent unit structure and correlation/probabilistic selection strategies (Elsevier, 2023)
Wang, Zhen; Oh, Sung-Kwun; Wang, Zheng; Fu, Zunwei; Pedrycz, Witold; Yoon, Jin Hee

This study focuses on two critical design aspects of a progressive fuzzy polynomial neural network (PFPNN): the influence of the gated recurrent unit (GRU) structure and the implementation of fitness-based candidate neuron selection (FCNS) through two probabilistic strategies. The primary objectives are to enhance modeling accuracy and to reduce the computational load associated with nonlinear regression tasks. Compared with existing fuzzy rule-based modeling architectures, the proposed dynamic model combines the GRU structure with a hybrid fuzzy polynomial architecture. In the initial two layers of the PFPNN, we introduce three types of polynomial and fuzzy rules into the GRU neurons (GNs) and fuzzy polynomial neurons (FPNs), which can effectively reveal potential complex relationships in the data space. The synergy of the FCNS strategies and the l2 regularization learning method yields a progressive regression model adept at melding the GRU structure with a self-organizing architecture. The proposed GRU structure and polynomial-based neurons significantly improve modeling accuracy on time-series datasets. Judicious use of the FCNS strategies reinforces the network structure and uncovers the latent performance of the network's neurons. Furthermore, the inclusion of l2 norm regularization provides additional stability to the proposed model and mitigates the overfitting issue commonly encountered in many existing learning methods. We validated the proposed neural networks using six time-series, four machine learning, and two real-world datasets. The PFPNN outperformed the other models in the comparative studies on 83.3% of the datasets, underscoring its ability to develop a stable deep structure from diverse candidate neurons while reducing computational overhead. (c) 2023 Elsevier B.V. All rights reserved.
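The abstract above combines a GRU-style recurrent structure, polynomial neurons, and an l2-regularized learning method. As a rough illustration only (not the authors' PFPNN design), the sketch below pairs a standard GRU update with a second-order polynomial expansion of the hidden state and a ridge (l2-regularized) fit of the output weights; the layer sizes, polynomial degree, and regularization strength are assumed values.

# Minimal sketch: GRU-style recurrent neuron feeding a polynomial output layer
# trained with l2 (ridge) regularization. Sizes and constants are illustrative
# assumptions, not the PFPNN architecture from the paper.
import numpy as np

rng = np.random.default_rng(0)

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One gated recurrent unit update: h_t = (1 - z) * h + z * h_tilde."""
    z = 1.0 / (1.0 + np.exp(-(Wz @ x + Uz @ h)))      # update gate
    r = 1.0 / (1.0 + np.exp(-(Wr @ x + Ur @ h)))      # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))          # candidate state
    return (1.0 - z) * h + z * h_tilde

def poly_features(h, degree=2):
    """Second-order polynomial expansion [1, h_i, h_i*h_j] of the hidden state."""
    feats = [np.ones(1), h]
    if degree >= 2:
        feats.append(np.outer(h, h)[np.triu_indices(len(h))])
    return np.concatenate(feats)

# Toy time series: predict x_{t+1} from the GRU hidden state at time t.
T, n_in, n_hid = 200, 1, 4
x_series = np.sin(0.1 * np.arange(T + 1))[:, None]
params = [rng.normal(scale=0.3, size=(n_hid, d)) for d in
          (n_in, n_hid, n_in, n_hid, n_in, n_hid)]    # Wz, Uz, Wr, Ur, Wh, Uh

h = np.zeros(n_hid)
Phi, y = [], []
for t in range(T):
    h = gru_step(x_series[t], h, *params)
    Phi.append(poly_features(h))
    y.append(x_series[t + 1, 0])
Phi, y = np.array(Phi), np.array(y)

# l2-regularized (ridge) least squares for the polynomial output weights.
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print("training RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))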
Item
Reinforced Interval Type-2 Fuzzy Clustering-Based Neural Network Realized Through Attention-Based Clustering Mechanism and Successive Learning (IEEE-Inst Electrical Electronics Engineers Inc, 2024)
Liu, Shuangrong; Oh, Sung-Kwun; Pedrycz, Witold; Yang, Bo; Wang, Lin; Yoon, Jin Hee

In this article, a novel attention-based reinforced interval type-2 fuzzy clustering neural network (ARIT2FCN) is developed to improve the generalization performance of fuzzy clustering-based neural networks (FCNNs). Commonly, fuzzy rules in FCNNs are generated through a clustering-based rule generator. However, the generated fuzzy rules may not fully describe the given data, because the clustering-based rule generator does not simultaneously consider the intracluster homogeneity and intercluster heterogeneity of both the data characteristics and the label information when defining the membership functions (MFs) of the fuzzy rules. This prevents the fuzzy rules from accurately quantifying the interclass heterogeneity and intraclass homogeneity and degrades the performance of FCNNs. The ARIT2FCN is proposed with the aid of an attention-based clustering mechanism and a successive learning method. The attention-based clustering mechanism is designed to define MFs by simultaneously considering data characteristics and label information. The successive learning method is adopted to construct the desired fuzzy rules that can capture the interclass heterogeneity and intraclass homogeneity. Moreover, L2 norm regularization is used to alleviate overfitting. The performance of ARIT2FCN is evaluated on machine learning datasets against 16 comparative methods. In addition, two real-world problems are used to validate the effectiveness of ARIT2FCN. Experimental results demonstrate that the ARIT2FCN outperforms the comparative methods, and statistical tests also support the superiority of ARIT2FCN.
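The fuzzy rules in this second article rest on interval type-2 membership functions. As an illustrative sketch only (not the ARIT2FCN implementation), the code below builds an interval type-2 Gaussian MF with an uncertain width and computes a rule's firing interval with a product t-norm; the centers and widths are hypothetical values, whereas in the paper they would come from the attention-based clustering mechanism.

# Minimal sketch of an interval type-2 Gaussian membership function: each rule
# keeps a Gaussian MF with an uncertain width, yielding a lower and an upper
# membership grade per input. All parameters below are assumed for illustration.
import numpy as np

def it2_gaussian_mf(x, center, sigma_lo, sigma_hi):
    """Interval type-2 Gaussian MF with uncertain std in [sigma_lo, sigma_hi].

    Returns (lower, upper) membership grades for input x.
    """
    lower = np.exp(-0.5 * ((x - center) / sigma_lo) ** 2)   # narrow width
    upper = np.exp(-0.5 * ((x - center) / sigma_hi) ** 2)   # wide width
    return lower, upper

def firing_interval(x_vec, centers, sigmas_lo, sigmas_hi):
    """Rule firing strength as a product t-norm over all input dimensions."""
    f_lo, f_hi = 1.0, 1.0
    for x, c, slo, shi in zip(x_vec, centers, sigmas_lo, sigmas_hi):
        lo, hi = it2_gaussian_mf(x, c, slo, shi)
        f_lo *= lo
        f_hi *= hi
    return f_lo, f_hi

# Example: one rule over a 2-D input with hypothetical parameters.
x = np.array([0.4, -0.1])
centers = np.array([0.5, 0.0])
sigmas_lo = np.array([0.2, 0.3])
sigmas_hi = np.array([0.4, 0.5])
print(firing_interval(x, centers, sigmas_lo, sigmas_hi))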