Curse of Dimensionality for TSK Fuzzy Neural Networks: Explanation and Solutions

by Yuqi Cui et al.

The Takagi–Sugeno–Kang (TSK) fuzzy system with Gaussian membership functions (MFs) is one of the most widely used fuzzy systems in machine learning. However, it usually has difficulty handling high-dimensional datasets. This paper explores why TSK fuzzy systems with Gaussian MFs may fail on high-dimensional inputs. After rewriting the defuzzification operation as an equivalent softmax function, we find that the poor performance is due to the saturation of the softmax. We show that two defuzzification operations, LogTSK and HTSK, the latter of which is first proposed in this paper, can avoid the saturation. Experimental results on datasets with various dimensionalities validate our analysis and demonstrate the effectiveness of LogTSK and HTSK.
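The saturation described in the abstract is easy to reproduce: with Gaussian MFs, the normalized firing levels form a softmax over negative scaled squared distances, and those exponents grow with input dimensionality until one rule dominates. Below is a minimal NumPy sketch of this effect; the rule centers, input, and `sigma` are illustrative choices rather than values from the paper, and the HTSK-style variant here simply averages the exponent over dimensions instead of summing, following the paper's idea of keeping the exponent bounded.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

def firing_levels(x, centers, sigma=1.0, high_dim=False):
    """Normalized rule firing levels of a Gaussian-MF TSK system.

    Standard TSK sums squared distances over all input dimensions, so the
    softmax exponents grow with dimensionality and the softmax saturates.
    With high_dim=True (HTSK-style), the exponent is averaged over
    dimensions instead, which keeps it bounded as dimensionality grows.
    """
    d = ((x - centers) ** 2).sum(axis=1) / (2 * sigma ** 2)
    if high_dim:
        d = d / x.shape[0]  # average over dimensions instead of summing
    return softmax(-d)

rng = np.random.default_rng(0)
R, D = 5, 200                       # 5 rules, 200-dimensional input
x = rng.normal(size=D)
centers = rng.normal(size=(R, D))

tsk = firing_levels(x, centers)                  # saturated in high dimensions
htsk = firing_levels(x, centers, high_dim=True)  # stays soft
print("TSK  max firing level:", tsk.max())
print("HTSK max firing level:", htsk.max())
```

Running this with a large `D`, the standard TSK firing levels are close to one-hot (a single rule takes nearly all the weight, starving the others of gradient), whereas the HTSK-style averaging keeps the distribution soft.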
