Evolving Fuzzy k-Nearest Neighbors Using an Enhanced Sine Cosine Algorithm: Case Study of Lupus Nephritis

08/16/2021
by Ali Asghar Heidari, et al.

Because of its simplicity and effectiveness, fuzzy K-nearest neighbors (FKNN) is widely used in the literature. Its parameters have a substantial impact on performance and therefore need to be tuned for different problems; selecting more representative features can further improve the classifier. This research proposes an improved optimization technique based on the sine cosine algorithm (LSCA), which introduces a linear population size reduction mechanism to enhance the original algorithm's performance. Building on this, we developed an FKNN model driven by LSCA that simultaneously performs feature selection and parameter optimization. Firstly, the search performance of LSCA is verified on the IEEE CEC2017 benchmark test functions against classical and improved algorithms. Secondly, the validity of the LSCA-FKNN model is verified on three medical datasets. Finally, the proposed LSCA-FKNN is used to predict lupus nephritis classes, where it shows competitive results. An online web service for questions about the paper is available at https://aliasgharheidari.com.
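The sketch below is a minimal illustration, not the authors' implementation, of the two standard ingredients the abstract refers to: a linear population size reduction schedule of the kind used to shrink an SCA population over iterations, and the fuzzy k-NN class-membership rule with fuzzy strength parameter m. The function names `linear_pop_size` and `fuzzy_knn_predict` are illustrative only.

```python
import numpy as np

def linear_pop_size(t, max_iter, n_init, n_min=4):
    """Population size at iteration t, shrinking linearly from n_init to n_min
    (an assumed schedule; the paper's exact settings may differ)."""
    return round(n_init - (n_init - n_min) * t / max_iter)

def fuzzy_knn_predict(X_train, y_train, x, k=5, m=2.0):
    """Fuzzy k-NN: weight the k nearest training samples by inverse distance
    raised to 2/(m-1), then return the class with the largest membership."""
    d = np.linalg.norm(X_train - x, axis=1)          # distances to all training points
    idx = np.argsort(d)[:k]                          # indices of the k nearest neighbors
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))
    classes = np.unique(y_train)
    memberships = np.array([w[y_train[idx] == c].sum() for c in classes]) / w.sum()
    return classes[np.argmax(memberships)], memberships
```

In the LSCA-FKNN setting, a wrapper search would encode the feature subset together with k and m in each candidate solution and score it by cross-validated accuracy of `fuzzy_knn_predict`, while `linear_pop_size` trims the population as iterations progress.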


