Nonsingular Gradient Descent Algorithm for Interval Type-2 Fuzzy Neural Network

IEEE Trans Neural Netw Learn Syst. 2022 Dec 7:PP. doi: 10.1109/TNNLS.2022.3225181. Online ahead of print.

Abstract

The interval type-2 fuzzy neural network (IT2FNN) is widely used to model nonlinear systems. Unfortunately, the gradient descent-based IT2FNN with uncertain variances always suffers from slow convergence due to its inherent singularity. To cope with this problem, a nonsingular gradient descent algorithm (NSGDA) is developed in this article to update the IT2FNN. First, the widths of the type-2 fuzzy rules are transformed into root inverse variances (RIVs), which always satisfy the sufficient condition for differentiability. Second, the singular RIVs are reformulated using nonsingular Shapley-based matrices associated with the type-2 fuzzy rules. This averts the convergence stagnation caused by the zero derivatives of singular RIVs, thereby sustaining gradient convergence. Third, an integrated-form update strategy (IUS) is designed to obtain the derivatives of the parameters of the IT2FNN, including the RIVs, centers, weight coefficients, deviations, and proportionality coefficient. These parameters are packed into multiple subvariable matrices, which accelerate gradient convergence through parallel calculation instead of sequential iteration. Finally, experiments show that the proposed NSGDA-based IT2FNN achieves faster convergence through the improved learning algorithm.
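The idea behind the RIV reparameterization in the first step can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's exact formulation: we assume a Gaussian membership function, replace the rule width σ with its root inverse variance r = 1/σ, and show that the resulting gradient is a polynomial in r, with no division by the width.

```python
import numpy as np

def gaussian_mf(x, center, riv):
    """Gaussian membership value, parameterized by riv = 1/sigma.

    Assumption: the paper's RIV stands in for the rule width sigma, so
    the exponent becomes -0.5 * ((x - center) * riv)**2 and no division
    by sigma ever occurs in the forward pass.
    """
    return np.exp(-0.5 * ((x - center) * riv) ** 2)

def grad_riv(x, center, riv):
    """d(membership)/d(riv) = -mu * (x - center)**2 * riv.

    Note the gradient is polynomial in riv: unlike d(mu)/d(sigma),
    which contains a 1/sigma**3 factor, it stays finite everywhere.
    """
    d = x - center
    return -gaussian_mf(x, center, riv) * d ** 2 * riv
```

As a quick sanity check, `gaussian_mf(c, c, r)` is 1 for any `r`, and `grad_riv` vanishes at the center, so a gradient step never divides by a vanishing width.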