Unsupervised Fuzzy eIX: Evolving Internal-eXternal Fuzzy Clustering

03/25/2020
by   Charles Aguiar, et al.
Federal University of Lavras

Time-varying classifiers, namely, evolving classifiers, play an important role in scenarios in which information is available as a never-ending online data stream. We present a new unsupervised learning method for numerical data called evolving Internal-eXternal Fuzzy clustering method (Fuzzy eIX). We develop the notion of double-boundary fuzzy granules and elaborate on its implications. Type-1 and type-2 fuzzy inference systems can be obtained from the projection of Fuzzy eIX granules. We apply the principle of balanced information granularity within Fuzzy eIX classifiers to achieve a higher level of model understandability. Internal and external granules are updated from a numerical data stream at the same time that the global granular structure of the classifier is autonomously evolved. A synthetic nonstationary problem called Rotation of the Twin Gaussians shows the behavior of the classifier. The Fuzzy eIX classifier maintained its accuracy in a scenario in which offline-trained classifiers would clearly have their accuracy drastically dropped.


I Introduction

Real-world data streams usually undergo changes over time, which portray the evolution of the environment they come from. To conceive, represent, and handle information, a computational model should be able to adapt itself in response to changes of the underlying process or phenomenon. The ability to self-adjust consists in updating the model parameters and structure to track unknown events, behavioral patterns, and gradual and abrupt changes of the system operating conditions. Inference and learning algorithms able to notice changes in input data streams and update model parameters and structure to keep a synopsis of the current data profile have been widely studied [27, 1, 14, 15].

Fuzzy models can be constructed and updated by means of online incremental algorithms. As discussed in [27, 1, 14, 15, 5, 6], evolving models are characterized by:

  • ability to learn online, i.e., the model is updated, generally on an instance-per-instance basis, as data become available;

  • ability to gradually tune its parameters and structure to track concept drifts and shifts; and

  • no need for a priori knowledge about the data, data properties, and the number of classes/patterns.

Online learning should trade off structural plasticity and stability. In other words, learning procedures should parsimoniously reconcile and decide between creating new granules and updating existing ones. Structural plasticity means creating new granules to memorize new concepts. Plasticity prevents learned granules from being exposed to catastrophic forgetting. Structural stability preserves the model structure, but allows adaptation of existing granules to smooth and gradual changes. Usually, online machine learning and data mining algorithms are not effective in finding such an equilibrium because methods to assess the current relevance of information granules, balance the size of granules along the problem dimensions, and merge similar granules are not considered.

Storage of large volumes of data from nonstationary environments is very often infeasible or ineffective [30, 26, 22]. Evolving intelligent systems provide an autonomous approach for data stream analysis. These systems usually comprise a single-scan-through-the-data learning method, which is of utmost importance in time-critical, high-frequency, nonstationary, and big data applications. Furthermore, evolving systems offer an open platform in which local components, e.g., granules, neurons, clusters, leaves or rules, can be automatically generated, updated, merged, split, and recalled based on the behaviour of the data stream [34, 11, 1]. Evolving models have shown to be self-organizing since no previous knowledge about the data is needed [6, 25, 27, 19].

In addition to changing over time, data streams produced by electronic devices and real-world perceptions consist of inherently uncertain values. Uncertainty is a feature that indicates how much a measured value deviates from the true value. Uncertainty can originate from fluctuations of random nature in data flows, numerical imprecision due to binary computation, fusion of information from different sources, non-ideal measuring instruments, data pre-processing methods, and others [15, 17, 8, 18]. Evolving granular models particularly capable of learning from interval and fuzzy data streams, and of incorporating and taking advantage of data uncertainties, are given in [16, 13, 12]. Most evolving models, however, learn from numerical data streams, while the representation of the data is given by fuzzy or interval objects (local models) [27].

This paper presents a new online learning framework called evolving Internal-eXternal Fuzzy clustering method, or Fuzzy eIX for short. A Fuzzy eIX model is useful for classification. Different from any other evolving approach, a Fuzzy eIX model is formed by double-boundary information granules extracted from an unsupervised numerical data stream. The granules contain an internal structure which summarizes the statistics of a fraction of the dataset. Therefore, local internal information is used for autonomous decision-making along the online learning process. The double-boundary feature of Fuzzy eIX granules can be used to translate clustering results into a type-1 or type-2 fuzzy inference system at any time. Additionally, Fuzzy eIX applies the principle of balanced information granularity [3], which affirms that granules should be balanced along all dimensions for a better general understandability of the results in an application domain. Therefore, Fuzzy eIX is strongly aligned with the concept of eXplainable Artificial Intelligence (XAI), which says that the solutions provided by machine learning algorithms and models should be understood by humans.

The remainder of this paper is structured as follows. Section II reviews some related online unsupervised algorithms. Section III describes practical implications of developing double-boundary information granules. Section IV outlines the Fuzzy eIX framework for unsupervised numerical data-stream modeling. We show how internal and external boundaries arise and are updated from the data to represent uncertainties. Section V gives preliminary results on a synthetic nonstationary problem called Rotation of Twin Gaussians [15, 16]. Section VI concludes the paper.

II Related Literature

We review related studies on evolving clustering and briefly discuss studies that touch on the idea of double-boundary granulation.

II-A Evolving Clustering

Historically, the evolving Clustering Method (ECM) [29] was the first evolving approach to the clustering problem. ECM is based on the Euclidean distance to define the similarity between new data and clusters. A distance threshold between an instance and the center of a cluster is used to compute membership levels. A new cluster is created if an instance is sufficiently distant from all clusters. Cluster centers are dragged toward instances that have a significant membership degree in the cluster. Although ECM updates clusters’ centers and radii and creates new clusters on the fly, it does not delete, split, or merge clusters.

SOStream is an evolving clustering algorithm based on self-organizing density maps [9]. The algorithm processes a new instance similarly to the ECM algorithm. However, at each iteration SOStream checks for clusters that overlap at a given level, and merges them. Clusters are created until a minimum value of a meta-parameter is reached. The ability to merge and update cluster structures based on the data flow gives SOStream the property of self-organization. However, if little or nothing is known about the data, it is impractical to provide an assertive value for the minimum number of clusters to be generated. An excessive number of clusters may be created to represent the data. This issue is aggravated since the approach is not supplied with a deletion mechanism.

Typicality and Eccentricity-based Data Analysis (TEDA) is a framework by [2] that introduces the concepts of eccentricity and typicality. Eccentricity means how distinct a particular data instance is from other instances and from the current focal points of clouds. Typicality is based on how similar a data instance is in relation to the entire data set and a cloud [4]. These concepts provide measures of density and proximity. For each new instance, the eccentricity and typicality are recursively calculated to determine whether an instance is typical or anomalous. TEDA establishes data density levels that allow the identification of anomalies based on similarity levels. TEDA is applicable to fault and outlier detection problems, and prediction. In [28], a variation of the TEDA method is given for weather prediction.

II-B Double-Boundary Granulation

Pawlak rough sets [21] can be used to perform approximate classification of uncertain and inaccurate data. A rough set is defined by lower and upper approximation sets, which can be crisp or fuzzy. The region between the lower and upper sets is the boundary region, in which a point may or may not belong to the set. A rough set is a uni-granular construct, i.e., the boundary of a single knowledge granule is the basis for the definition of a rough model [32]. However, many studies have extended the original ideas toward multi-granular constructs.

Rough-set-based models for information granulation are addressed in [35]. Optimistic multi-granular rough sets are proposed in [23]. A pessimistic multi-granular rough set approach for problem solving in the context of multiple granulation is presented in [24]. Several properties and extensions of optimistic and pessimistic multi-granulation methods are described in [33, 31, 20].

III Why Fuzzy eIX Clustering?

We envision implications of developing Fuzzy eIX granules from online data streams.

III-A Type-1 Evolving Fuzzy Inference System

By projecting the internal and external boundaries of Fuzzy eIX granules in orthogonal axes representing the attributes of a problem, trapezoidal membership functions and, therefore, an evolving type-1 fuzzy inference system, are obtained. As granules are created and updated in the Cartesian product space, the core and support of associated membership functions evolve from the data stream. Figure 1 shows an example. Naturally, the overall Fuzzy eIX model can be read linguistically from a set of If-Then rules – a rule per granule.

Fig. 1: Evolving type-1 membership functions and a type-1 inference system from Fuzzy eIX double-boundary granules
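As a sketch of this projection, a trapezoidal membership function can be computed directly from the four per-axis bounds of a double-boundary granule: the outer interval gives the support and the inner interval gives the core. The function and argument names below are illustrative, not the paper's notation.

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership with support [a, d] and core [b, c].

    For a Fuzzy eIX granule projected on one axis, [a, d] would come from
    the external (outer) interval and [b, c] from the internal (inner) one.
    """
    if x <= a or x >= d:
        return 0.0                     # outside the support
    if b <= x <= c:
        return 1.0                     # inside the core
    if x < b:
        return (x - a) / (b - a)       # ascending ramp: outer to inner lower bound
    return (d - x) / (d - c)           # descending ramp: inner to outer upper bound
```

For example, with outer interval [1, 4] and inner interval [2, 3], a point at 1.5 receives membership 0.5.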

III-B Type-2 Evolving Fuzzy Inference System

A rigorous way to project a Fuzzy eIX granule on orthogonal axes consists in considering the midpoint of the granule as the prototype, and the information of the internal granule only. Rigorous membership functions that describe the uncertainty related to the proximity of a point to the prototypical point of a granule are established by means of the inner boundaries of the granule. An evolving type-2 fuzzy inference system can be obtained from the inner and outer projections, yielding and , respectively, as shown in Fig. 2. The hatched region between the membership functions is called the footprint of uncertainty (FOU). The larger the FOU area, the greater the uncertainty, and vice-versa. As the parameters of and are directly obtained from evolving granules, their values are learned autonomously from the data stream.

Fig. 2: Evolving type-2 membership functions and a type-2 inference system from Fuzzy eIX double-boundary granules
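The inner and outer projections can likewise be combined into an interval type-2 membership: a trapezoid over the outer interval acts as the upper membership function, and a narrower function built from the inner interval and its midpoint (the assumed prototype) acts as the lower one; the gap between the two values at a point is a slice of the FOU. This is a sketch under assumptions; the exact shape of the lower function and all names are ours.

```python
def _trapmf(x, a, b, c, d):
    """Trapezoidal membership with support [a, d] and core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def it2_membership(x, inner, outer):
    """Interval type-2 membership of x on one axis of a double-boundary granule.

    `inner` and `outer` are (lower, upper) bound pairs. The upper membership
    function spans the outer interval; the lower one is a triangle over the
    inner interval peaking at the midpoint (the assumed prototype).
    """
    li, ui = inner
    lo, uo = outer
    mid = 0.5 * (li + ui)
    upper = _trapmf(x, lo, li, ui, uo)
    lower = _trapmf(x, li, mid, mid, ui)   # triangular lower membership function
    return lower, upper                    # [lower, upper] is the FOU slice at x
```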

III-C Local and Global Inter-Granular Uncertainty

A hyper-rectangular double-boundary granule contains four parameters per problem dimension, which are the bounds of the internal and external intervals. Updating intervals proportionally, considering the values of a single attribute at a time, is a straightforward and fast way of learning in a dynamic environment. In addition to the local uncertainty representation given by the hatched regions of Fig. 3, information about inter-granular uncertainty can also be computed and used to make decisions within an evolving learning algorithm. For example, multiple merging and conflict-solving procedures can be derived by comparing double-boundary granules.

Fig. 3: Inter-granular and local uncertainty information in Fuzzy eIX granules

IV Fuzzy eIX: Learning Algorithm

Fuzzy eIX granules are delimited by inner and outer hyper-boxes. The membership degree of an instance that belongs to the inner region of a granule is 1. An instance belonging to the area between the inner and outer boxes is partially considered a member of the granule. Central points and inner and outer bounds of granules are recursively updated over time according to a data stream.

IV-A Initialization

Let a data stream be denoted by , . Given the first instance, , namely, – being the number of attributes – the first granule is created. Its center, , is equal to . The initial widths of the inner and outer intervals of are equal to and , respectively, in any dimension , being a meta-parameter. Therefore, the inner and outer bounds are given by and , respectively. The vectors of centers and inner and outer bounds are adaptive over time.

A generic granule of a collection is characterized by:

  • a prototypical point or center, ; and

  • lower and upper inner bounds, , , and lower and upper outer bounds, , .

Additionally, the meta-parameter defines the initial and minimum possible width of the attributes of granules. The meta-parameter is useful for the merging procedure, as described later. Thus, is a five-fold collection of -dimensional vectors. Figure 4 shows a bi-dimensional example.

Fig. 4: A generic bi-dimensional Fuzzy eIX granule,
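A generic granule's five-fold parameterization can be sketched as a small data structure. Since the initial width values are not recoverable from this text, the sketch assumes an inner width of rho and an outer width of 2*rho, where rho stands in for the paper's width meta-parameter.

```python
from dataclasses import dataclass

@dataclass
class Granule:
    """Double-boundary hyper-rectangular granule (illustrative field names).

    Per dimension j: center[j], inner bounds [l_in[j], u_in[j]],
    and outer bounds [l_out[j], u_out[j]].
    """
    center: list
    l_in: list
    u_in: list
    l_out: list
    u_out: list
    count: int = 1          # instances absorbed by the inner region

def make_granule(x, rho=0.1):
    """Create a granule around the first (or an eccentric) instance x.

    Inner width rho and outer width 2*rho are assumptions, not the
    paper's exact initialization.
    """
    return Granule(
        center=list(x),
        l_in=[v - rho / 2 for v in x],
        u_in=[v + rho / 2 for v in x],
        l_out=[v - rho for v in x],
        u_out=[v + rho for v in x],
    )
```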

IV-B Instance in Inner Region

If an instance is placed within the limits of the inner region of a granule , i.e.,

(1)

then the membership degree of in is 1 by means of any T-norm. As the certainty about the placement of increases with the inclusion of , the internal and external widths become smaller. The lower inner bound is increased from

(2)

with given by

(3)

The default value of is 0.3. The lower outer bound, , is increased analogously to (2)-(3). The upper inner bound, , is reduced proportionally from

(4)

Similarly, the upper outer bound, , is obtained from (4) using the lower outer bounds in the last term. Notice that the shrinkage is larger when is closer to .

The size of Fuzzy eIX granules is limited such that the width of the internal and external regions, in any dimension, is greater than and , respectively, at any iteration.

After shrinking the -th granule, the granule slides toward the current instance in inverse proportion to its density, i.e., to the number of instances, , that previously belonged to its inner region (a weighted drift).

The center is moved using

(5)

Let be any boundary , , , . To preserve symmetry of the hyper-rectangular shape, inner and outer boundaries move accordingly,

(6)

Figure 5 shows an example of the shrinking and sliding procedures due to the current instance , which belongs to the inner region of .

Fig. 5: Sliding and shrinking the -th granule as the current instance belongs to its inner region and therefore increases the certainty of its location
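The slide step can be sketched as follows. Since the exact formulas (5)-(6) are not fully recoverable here, the sketch uses a running-mean drift, which matches the stated behavior: the center moves toward the instance in inverse proportion to the number of instances the inner region has absorbed, and every boundary moves by the same offset so the hyper-rectangle stays symmetric.

```python
def slide(center, bounds, x, count):
    """Slide a granule toward instance x with a density-weighted drift.

    `bounds` maps bound names (e.g. 'l_in', 'u_out') to per-dimension lists;
    all of them are shifted by the same offset as the center. The step
    (x - c) / (count + 1) is a stand-in for the paper's Eq. (5).
    """
    step = [(xj - cj) / (count + 1) for cj, xj in zip(center, x)]
    new_center = [cj + sj for cj, sj in zip(center, step)]
    new_bounds = {name: [b + s for b, s in zip(vals, step)]
                  for name, vals in bounds.items()}
    return new_center, new_bounds
```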

IV-C Instance in Outer Region

An instance may belong to the outer granule, with one or more of its attributes placed outside the inner region. In this case,

(7)

In addition to the feasibility of (7), Eq. (1) must be false for at least one dimension, . The distance from to is greater than the distance between and any instance within the inner bounds of .

Instances belonging to the outer region of imply that the center should not be drifted as we are not sure about its content. The uncertainty on the membership of in rather suggests granular expansion for inclusion.

Internal and external widths become larger as follows. The lower inner bound is reduced from

(8)

with

(9)

considering only the dimensions in which . Otherwise, is not changed. The default value of is 0.3; and . The lower outer bound, , is reduced analogously, using (8) and the same obtained in (9).

The upper inner bound, , is increased proportionally from

(10)

Similarly, the upper outer bound, , is obtained from (10) using the lower outer bounds in the last term.

Additionally, for the dimensions in which , Eq. (8) is applied using , instead of ; and is obtained from

(11)

In this case, . The upper outer bound, , is increased by analogy, i.e., using (8), and as in (11). Moreover, the lower inner bound, , is reduced proportionally from

(12)

Similarly, the lower outer bound, , is obtained from (12) using the upper outer bounds in the last term. Notice that the expansion is larger as the attributes of approach the outer borders.

The size of Fuzzy eIX granules is limited such that the width of the internal and external regions, in any dimension and at any iteration, is greater than and , respectively. Figure 6 shows an example of granular expansion in reaction to the evidence , which belongs to the outer region of .

Fig. 6: Granular expansion as a consequence of the instance being included in the outer region of , but not in its inner region

IV-D Eccentric Instance

If an instance does not belong to the internal or external region of any granule, a new granule is created to include it. The granule creation procedure is similar to that used when the first instance arrives. The new granule grasps the new information brought by the instance.

IV-E Balanced Information Granularity

The balanced granularity principle [3] states that preference should be given to the development of local objects with balanced granularity along the different axes. This means that the width of the lines that delineate a hyper-rectangle should be ideally similar. A better understandability of the results in an application domain can be reached by means of a balanced granular model.

Fuzzy eIX internal and external granules are balanced at each iteration based on average widths. Let

(13)

be the internal and external widths, respectively.

Let be the current number of granules. For individual attributes , , the internal and external average widths of all granules along the -th axis are

(14)

Individual widths (13) are updated toward average widths (14) considering one attribute at a time. Relatively smaller sides of hyper-rectangles are gradually expanded whereas larger sides are reduced. Formally,

(15)

in which is the balancing rate. We set as the default value. Higher values of increase the speed of convergence of granules to a similar size, and provide higher model interpretability at the price of a potential loss in accuracy related to the existence of classes with different spreads. If accuracy is the most important aspect, then smaller values of keep the original hyper-rectangular geometry of granules and their different spreads.
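The balancing step can be sketched as a per-axis relaxation of each granule's width toward the average width across granules; eta below plays the role of the balancing rate, and both its name and the linear relaxation form are assumptions consistent with the description of (14)-(15).

```python
def balance(widths, eta=0.1):
    """Move each granule's per-axis width toward the across-granule average.

    widths[i][j] is the width of granule i along axis j. Smaller sides are
    expanded and larger sides shrunk, at rate eta per iteration.
    """
    n_axes = len(widths[0])
    avg = [sum(w[j] for w in widths) / len(widths) for j in range(n_axes)]
    return [[w[j] + eta * (avg[j] - w[j]) for j in range(n_axes)]
            for w in widths]
```

With a high eta all granules converge quickly to a common size (more interpretable, possibly less accurate); with a low eta the original geometry is largely preserved, mirroring the trade-off discussed above.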

IV-F Weighted Mean and Convex Hull Merging

Merging happens when a pair of granules is notably overlapped. Often, a sequence of data instances belongs to the gap between granules, which used to be disjoint at a former time instant. Therefore, redundancy is avoided by merging them [27]. When two granules, say and , are close enough, i.e., when their centers, and , are such that

(16)

with , then they are merged.

We introduce two merging methods. In the Weighted Mean method, the center of the new granule, , is

(17)

in which is the number of times the -th granule was chosen to be updated by input instances. The new granule is placed on the line between and , and depends on the density of the data the previous granules used to represent. The internal, , , and external, , , endpoints of the new granule, , are obtained from (17) by analogy.

Fig. 7: Convex hull approach to merge Fuzzy eIX granules

An alternative merging approach, the Convex Hull method, consists in producing a coarser granule that encapsulates all the information inherent to the previous granules. In this case, the center of the new granule arises naturally from operations on endpoints. Namely, let

(18)

Then,

(19)

is the midpoint of the granule. At first sight, this merging approach increases the coverage area of the model, and preserves past information. The granule tends to be shrunk afterwards, toward average widths. Figure 7 exemplifies the convex-hull merging procedure.
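Both merging rules can be sketched directly from the prose: the Weighted Mean method places the new center on the segment between the old centers, weighted by the number of instances each granule represented, while the Convex Hull method takes the smallest box covering both granules and uses its midpoint as the new center. Function and variable names are illustrative.

```python
def merge_weighted_mean(c1, n1, c2, n2):
    """New center on the segment between c1 and c2, weighted by the
    instance counts n1 and n2 (endpoints merge analogously)."""
    return [(n1 * a + n2 * b) / (n1 + n2) for a, b in zip(c1, c2)]

def merge_convex_hull(lo1, hi1, lo2, hi2):
    """Coarsest box covering both granules; its midpoint is the new center."""
    lo = [min(a, b) for a, b in zip(lo1, lo2)]
    hi = [max(a, b) for a, b in zip(hi1, hi2)]
    center = [0.5 * (a + b) for a, b in zip(lo, hi)]
    return lo, hi, center
```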

IV-G Summary

The Fuzzy eIX unsupervised algorithm is summarized below. The resulting classifier is parametrically and structurally adaptive. Hence, it deals with nonstationary data streams.

Result: : a set of granules
,
foreach , (data stream) do
       if  then
             MakeGranule(, )
       else
             foreach  do
                   if  then
                         ShrinkInternal(, , )
                         ShrinkExternal(, , )
                         SlideCenter(, )
                   else if  then
                         ExpandInternal(, )
                         ExpandExternal(, )
                   else
                         MakeGranule(, )
                   end if
                  
             end foreach
            
       end if
      MergeClusters(, )
       BalanceGranules(, )
end foreach
Algorithm 1 FuzzyEIX (, )
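The control flow of Algorithm 1 can be sketched as a compact, self-contained loop over simple box granules (a center plus inner and outer half-widths). Only the branching structure follows the algorithm (inner region: shrink and slide; outer region: expand; otherwise: create a new granule); the numeric shrink and grow factors are stand-ins for the paper's update equations, and merging and balancing are omitted for brevity.

```python
def in_box(x, center, radius):
    """True if x lies within the axis-aligned box center +/- radius."""
    return all(abs(xj - cj) <= rj for xj, cj, rj in zip(x, center, radius))

def fuzzy_eix_step(granules, x, rho=0.1, shrink=0.7, grow=1.3):
    """Process one instance of the data stream (stand-in update rules)."""
    for g in granules:
        if in_box(x, g["c"], g["r_in"]):          # inner region: certainty grows
            g["n"] += 1
            g["r_in"] = [r * shrink for r in g["r_in"]]
            g["r_out"] = [r * shrink for r in g["r_out"]]
            g["c"] = [cj + (xj - cj) / g["n"]     # density-weighted drift
                      for cj, xj in zip(g["c"], x)]
            return granules
        if in_box(x, g["c"], g["r_out"]):         # outer region: expand to include
            g["r_in"] = [r * grow for r in g["r_in"]]
            g["r_out"] = [r * grow for r in g["r_out"]]
            return granules
    # eccentric instance: no granule covers x, so create one around it
    granules.append({"c": list(x), "r_in": [rho] * len(x),
                     "r_out": [2 * rho] * len(x), "n": 1})
    return granules
```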

V Preliminary Results

The experiment called Rotation of the Twin Gaussians [15] is a nonstationary classification problem useful to evaluate the effectiveness of the Fuzzy eIX method.

A data stream is generated from two partially-overlapped Gaussian functions. The Gaussians are initially centered at and , and have a standard deviation of . They rotate around the point according to

(20)

in which is the counterclockwise angle, around , of the position of the center of the -th Gaussian. Initially, as the centers are and , then degrees for the ‘Class 1’, and degrees for the ‘Class 2’. We consider two stages. First, the rotating rate, , is during time steps (stationary Gaussians). Then, the rotating rate, , is from to (concept drift). An instance is produced from one of the Gaussians per time step.

The data is scaled in [0,1]. We assume that the first class is the positive class. Consider a confusion matrix consisting of two rows and two columns, which represent the numbers of true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN) [7]. The accuracy of a classifier is obtained from

ACC = (TP + TN) / (TP + FP + TN + FN)     (21)

Table I shows the Fuzzy eIX results for the stationary ( ) and nonstationary () stages, considering the convex hull merging procedure and different initial meta-parameters.

Stationary stage (200 instances)
Parameters   Accuracy (%)   Avg. Granules   Time (s)
85.0 6.01 0.26
88.0 5.78 0.10
94.5 3.12 0.18
84.5 4.41 0.21
90.0 3.86 0.19
81.0 3.07 0.24
Nonstationary stage (200 instances)
Parameters   Accuracy (%)   Avg. Granules   Time (s)
81.0 12.78 0.45
83.0 9.32 0.41
90.5 5.69 0.36
79.5 4.12 0.31
85.5 5.43 0.35
77.5 4.25 0.29
TABLE I: Fuzzy eIX - Rotation of the Twin Gaussians

For the parameters and , accuracies of and , using and granules, were reached for the stationary and nonstationary stages, respectively. For higher values of , the accuracy is degraded since larger granules are assigned to instances belonging to different classes in the initial time steps. An average of additional granules is needed to keep the accuracy up during the gradual rotation of the data sources. Figure 8 shows the Fuzzy eIX local elements and the decision boundary for the best classifier.

(a) Stationary stage
(b) Nonstationary stage
Fig. 8: Final position of Fuzzy eIX granules for the best setting of meta-parameters, ,

Figure 9 shows the evolution of the number of granules over time for the best classifier. Notice that the number of granules generally increases faster after to address the class drift. Granules are created on the fly to cover new instances. From Table I, it is also noteworthy that the model’s accuracy is not significantly affected by the concept change, since structural and parametric changes are carried out by the Fuzzy eIX algorithm.

Fig. 9: Evolution of the structure of the Fuzzy eIX classifier

VI Conclusion and Perspectives

We proposed an unsupervised evolving granular framework that uses double-boundary granules and numerical data streams to construct classification models. The framework is called Fuzzy eIX. Local internal and external regions are updated recursively for instances with partial or full membership in a granule. Data properties and model structure are unknown beforehand, which gives a Fuzzy eIX classifier the characteristic of self-organization. Type-1 and type-2 fuzzy inference systems can be obtained from the projection of Fuzzy eIX granules on orthogonal axes. Moreover, the rule-based system obtained is linguistically understandable in an application, since the Fuzzy eIX learning algorithm balances the granules along the problem dimensions. Encouraging results have been obtained, such as those in the time-varying Rotation of the Twin Gaussians problem. The classifier could keep its accuracy at a certain level (90.5 - 94.5%) in a scenario in which offline-trained classifiers would clearly have their accuracy drastically dropped.

Several possible extensions of the Fuzzy eIX framework can be mentioned:

  • Granules’ initial dimensions as well as minimum acceptable dimensions can be time-varying by updating the key meta-parameter, ;

  • The weighted sliding procedure takes place if an instance lies within the inner bounds of a granule. Other relations should be considered to set weights, e.g., local width and uncertainty information;

  • Merging methods should be further analysed and compared considering hyper-volumes, weighted means, and optimistic and pessimistic similarities. For instance, Hausdorff distance should be considered, as in [10];

  • Heuristics will be evaluated to update the meta-parameter to achieve faster or slower balancing speed;

  • Results of Fuzzy eIX will be compared with those of other evolving classifiers using benchmark datasets;

  • Interval and fuzzy data streams, and weak supervision shall be addressed in the future.

References

  • [1] P. Angelov (2013) Evolving rule-based models: a tool for design of flexible adaptive systems. Vol. 92, Physica. Cited by: §I, §I, §I.
  • [2] P. Angelov (2014) Outside the box: an alternative data analytics framework. J. Autom. Mob. Robot. Intell. Syst. 8 (2), pp. 29–35. Cited by: §II-A.
  • [3] A. Bargiela and W. Pedrycz (2016) Granular computing. In Handbook on Computational Intelligence, pp. 43–66. Cited by: §I, §IV-E.
  • [4] C. Bezerra, B. Costa, L. A. Guedes, and P. Angelov (2020) An evolving approach to data streams clustering based on typicality and eccentricity data analytics. Information Sciences 518, pp. 13–28. Cited by: §II-A.
  • [5] A. Bouchachia, B. Gabrys, and Z. Sahel (2007) Overview of some incremental learning algorithms. In 2007 IEEE Int Conf Fuzzy Syst, pp. 1–6. Cited by: §I.
  • [6] L. A. Cordovil, P. H. Coutinho, I. Bessa, M. F. D’Angelo, and R. Palhares (2019) Uncertain data modeling based on evolving ellipsoidal fuzzy information granules. IEEE Transactions on Fuzzy Systems, pp. 11p. DOI: doi.org/10.1109/TFUZZ.2019.2937052. External Links: Document Cited by: §I, §I.
  • [7] T. Fawcett (2006) An introduction to roc analysis. Pattern recognition letters 27 (8), pp. 861–874. Cited by: §V.
  • [8] C. Garcia, A. Esmin, D. Leite, and I. Skrjanc (2019) Evolvable fuzzy systems from data streams with missing values: with application to temporal pattern recognition and cryptocurrency prediction. Pattern Recognition Letters 128, pp. 278–282. Cited by: §I.
  • [9] C. Isaksson, M. H. Dunham, and M. Hahsler (2012) SOStream: self organizing density-based clustering over data stream. In Int Wksp on Mach Learn and Data Mining, pp. 264–278. Cited by: §II-A.
  • [10] J. Jeng, C. Chen, S. Chang, and C. Chuang (2019) IPFCM clustering algorithm under euclidean and hausdorff distance measure for symbolic interval data. Int J Fuzzy Syst 21 (7), pp. 2102–2119. Cited by: 3rd item.
  • [11] I. Lana, J. L. Lobo, E. Capecci, J. Del Ser, and N. Kasabov (2019) Adaptive long-term traffic state estimation with evolving spiking neural networks. Transportation Research Part C: Emerging Technologies 101, pp. 126–144. Cited by: §I.
  • [12] D. Leite and I. Skrjanc (2019) Ensemble of evolving optimal granular experts, owa aggregation, and time series prediction. Information Sciences 504, pp. 95–112. Cited by: §I.
  • [13] D. Leite, R. Palhares, V. Campos, and F. Gomide (2015) Evolving granular fuzzy model-based control of nonlinear dynamic systems. IEEE Trans Fuzzy Syst 23 (4), pp. 923–938. Cited by: §I.
  • [14] D. Leite, I. Škrjanc, and F. Gomide (2020) An overview on evolving systems and learning from stream data. Evolving Systems, pp. 18p. DOI: doi.org/10.1007/s12530–020–09334–5. External Links: Document Cited by: §I, §I.
  • [15] D. Leite (2012) Evolving granular systems. Ph.D. Thesis, State University of Campinas (UNICAMP), Sao Paulo - Brazil. Cited by: §I, §I, §I, §I, §V.
  • [16] D. Leite (2013) Evolving granular neural networks from fuzzy data streams. Neural Netw 38, pp. 1–16. Cited by: §I, §I.
  • [17] W. Lio and B. Liu (2018) Residual and confidence interval for uncertain regression model with imprecise observations. Journal of Intelligent & Fuzzy Systems 35 (2), pp. 2573–2583. Cited by: §I.
  • [18] B. Liu (2007) Uncertainty theory. In Uncertainty theory, pp. 205–234. Cited by: §I.
  • [19] E. Lughofer (2011) Evolving fuzzy systems-methodologies, advanced concepts and applications. Vol. 53, Springer. Cited by: §I.
  • [20] M. Nagaraju and B. Tripathy (2015) Approximate equalities for covering based optimistic multigranular rough sets and their properties. IIOAB Journal 6 (4), pp. 77–97. Cited by: §II-B.
  • [21] Z. Pawlak (1982) Rough sets. International journal of computer & information sciences 11 (5), pp. 341–356. Cited by: §II-B.
  • [22] M. Pratama, W. Pedrycz, and E. Lughofer (2018) Evolving ensemble fuzzy classifier. IEEE Transactions on Fuzzy Systems 26 (5), pp. 2552–2567. Cited by: §I.
  • [23] Y. Qian and J. Liang (2006) Rough set method based on multi-granulations. In IEEE Int Conf on Cognitive Info, pp. 297–304. Cited by: §II-B.
  • [24] Y. Qian, J. Liang, and W. Wei (2010) Pessimistic rough decision. Sec Int Wksp on Rough Set Theory 29 (5), pp. 440–449. Cited by: §II-B.
  • [25] A. Silva, W. Caminhas, A. Lemos, and F. Gomide (2014) A fast learning algorithm for evolving neo-fuzzy neuron. Applied Soft Computing 14, pp. 194–209. Cited by: §I.
  • [26] P. Silva, H. Sadaei, R. Ballini, and F. Guimaraes (2019) Probabilistic forecasting with fuzzy time series. IEEE Transactions on Fuzzy Systems, pp. 14p. DOI: doi.org/10.1109/TFUZZ.2019.2922152. External Links: Document Cited by: §I.
  • [27] I. Škrjanc, J. A. Iglesias, A. Sanchis, D. Leite, E. Lughofer, and F. Gomide (2019) Evolving fuzzy and neuro-fuzzy approaches in clustering, regression, identification, and classification: a survey. Info Sci 490, pp. 344–368. Cited by: §I, §I, §I, §I, §IV-F.
  • [28] E. Soares, P. Costa Jr, B. Costa, and D. Leite (2018) Ensemble of evolving data clouds and fuzzy models for weather time series prediction. Applied Soft Computing 64, pp. 445–453. Cited by: §II-A.
  • [29] Q. Song and N. Kasabov (2001) ECM-a novel on-line, evolving clustering method and its applications. Foundations of cognitive science, pp. 631–682. Cited by: §II-A.
  • [30] P. V. Souza, T. Rezende, A. Guimaraes, V. Araujo, L. Batista, G. Silva, and V. Silva (2019) Evolving fuzzy neural networks to aid in the construction of systems specialists in cyber attacks. Journal of Intelligent & Fuzzy Systems 36 (6), pp. 6773–6763. Cited by: §I.
  • [31] B. Tripathy and K. G. Rajulu (2014) On covering based pessimistic multigranular rough sets. In 2014 Int Conf on Comput Intel and Comm Net, pp. 708–713. Cited by: §II-B.
  • [32] B. Tripathy, P. Saraf, and S. C. Parida (2015) On multigranular approximate rough equivalence of sets and approximate reasoning. In Computational Intelligence in Data Mining-Volume 2, pp. 605–616. Cited by: §II-B.
  • [33] B. Tripathy (2014) Multi-granular computing through rough sets. In Advances in Secure Comput, Internet Serv, and Appl, pp. 1–34. Cited by: §II-B.
  • [34] S. W. Tung, C. Quek, and C. Guan (2013) ET2FIS: an evolving type-2 neural fuzzy inference system. Inform Sciences 220, pp. 124–148. Cited by: §I.
  • [35] Y. Yao (2001) Information granulation and rough set approximation. Int J Intell Syst 16 (1), pp. 87–104. Cited by: §II-B.