I Introduction
The world is full of different kinds of information, and more and more of it is being produced. How to properly measure information with uncertainty has therefore become a hot topic in recent years, because measuring information helps extract a general picture of the information provided before it is processed in detail. To satisfy this need, many related theories have been designed. Among them, some are very representative in managing uncertain information, such as fuzzy mathematics [24, 13, 37, 33], extensions of evidence theory [45, 8, 47, 31], soft theories [14, 39, 41, 12], numbers [3, 4, 27, 26], numbers [28, 20] and maximum theory [46, 5, 18, 48]. All of the theories mentioned perform well in extracting truly useful information from uncertainty. With the development of computer technology, some new views on the representation of information have been proposed. The related works attempt to handle information from a completely new dimension, namely quantum theory [22, 10, 1, 17] and complex functions [44, 19, 43]. Due to the effectiveness of these theories in handling information, some practical applications also benefit from these meaningful works, such as target recognition [32, 11], decision making [52, 15, 16, 25] and pattern classification [42, 29].
Among all of the previously proposed theories, Dempster-Shafer evidence theory (DS evidence theory) [2, 35] is the representative work for handling uncertainty contained in information. To better adapt to actual open-world application environments, an improved version of DS evidence theory has been proposed, namely generalized evidence theory (GET) [7]. However, both theories treat all incidents as a still picture and lack a description of the dynamic process by which things transition between states. Fortunately, the Markov chain can be utilized to avoid this drawback by placing all the incidents under judgment in a random and mutually connected process. Some works have applied the Markov chain in different applications [34, 40, 53, 54], where it works very well. Therefore, in order to give generalized evidence theory a dynamic view of the given evidences, a specially customised Markov model is introduced into GET. When the Markov model is integrated into GET, the completely new frame of evidence possesses some new properties, which are given as follows:
(1) The improved generalized evidence theory can manifest the process of transition of propositions, which describes a dynamic process of evidences.

(2) The transition probability and its further extensions can be generated as a dual certificate of the degree of belief of the first dimension.

(3) A completely new distance measure, similarity measure, uncertainty measure and combination method are proposed based on the concept of MMGET.
The rest of the paper is organized as follows. The preliminaries section introduces some basic concepts related to the work proposed in this paper. The next section then presents every detail of the proposed model. Finally, the numerical examples section gives some examples to verify the validity and correctness of the proposed method.
II Preliminaries
In this section, some basic concepts are introduced. Many meaningful works have been completed to solve different kinds of related problems [30, 49, 51, 38, 23].
II-A Generalized evidence theory (GET) [7]
Definition 1.
Mass function
Assume there exists a frame of discernment (FOD), X, in an open world. 2^X denotes the power set of the FOD, which consists of 2^|X| elements. For A ∈ 2^X, a mass function, which is also a mapping, can be defined as:
(1) m : 2^X → [0, 1]
And the properties the mass function satisfies can be defined as:
(2) Σ_{A ∈ 2^X} m(A) = 1
(3) m(∅) ≥ 0
Thus, m is a generalized basic probability assignment (GBPA) of the FOD, X. Note that m(∅) = 0 is not a restriction in a GBPA, which means the empty set ∅ can also be regarded as a focal element in generalized evidence theory. When the value of m(∅) is equal to 0, the GBPA degenerates exactly into the form of a classic BPA.
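The GBPA conditions above can be checked mechanically. A minimal sketch, assuming a GBPA is represented as a dict from frozenset propositions to masses (the representation and the helper name are ours, not the paper's):

```python
def is_valid_gbpa(gbpa, tol=1e-9):
    """Check the GBPA conditions: masses in [0, 1] summing to 1.
    Unlike a classic BPA, m(empty set) may be non-zero in an open world."""
    if any(v < -tol or v > 1 + tol for v in gbpa.values()):
        return False
    return abs(sum(gbpa.values()) - 1.0) < tol

# A legal GBPA even though the empty set carries mass 0.1:
gbpa = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.3, frozenset(): 0.1}
print(is_valid_gbpa(gbpa))  # True
```

Setting the empty-set mass to 0 recovers an ordinary closed-world BPA under the same check.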
Definition 2.
Generalized belief function (GBF)
Assume there is a GBPA m; the GBF with respect to m can be defined as:
(4) GBel(A) = Σ_{B ⊆ A, B ≠ ∅} m(B)
(5) GBel(∅) = m(∅)
Definition 3.
Generalized plausible function (GPF)
Assume there is a GBPA m; the GPF with respect to m can be defined as:
(6) GPl(A) = Σ_{B ∩ A ≠ ∅} m(B)
(7) GPl(∅) = m(∅)
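The two interval bounds of Definitions 2 and 3 can be sketched directly from the sums above, assuming (as reconstructed here) that GBel(∅) = GPl(∅) = m(∅) and GBPAs are dicts from frozenset to mass:

```python
def gbel(gbpa, A):
    """Generalized belief: total mass of the non-empty subsets of A."""
    if not A:
        return gbpa.get(frozenset(), 0.0)
    return sum(v for B, v in gbpa.items() if B and B <= A)

def gpl(gbpa, A):
    """Generalized plausibility: total mass of the sets intersecting A."""
    if not A:
        return gbpa.get(frozenset(), 0.0)
    return sum(v for B, v in gbpa.items() if B & A)

m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, frozenset(): 0.2}
print(gbel(m, {"a"}), gpl(m, {"a"}))  # 0.5 0.8
```

As in the classic theory, GBel(A) ≤ GPl(A) for every non-empty A, so the pair still brackets the support for a proposition.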
Definition 4.
Generalized combination rule (GCR)
In GET, given two GBPAs m1 and m2 on the same frame of discernment, the GCR can be defined as:
(8) m(A) = (1 − m(∅)) · Σ_{B ∩ C = A} m1(B) m2(C) / (1 − K), for A ≠ ∅
(9) K = Σ_{B ∩ C = ∅} m1(B) m2(C)
(10) m(∅) = m1(∅) · m2(∅)
(11) m(∅) = 1 if and only if K = 1
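A sketch of the generalized combination rule, under the assumptions made in the reconstruction above: K sums all mass products with empty intersections, m(∅) = m1(∅) · m2(∅) is assigned directly, and K = 1 sends all mass to the empty set. The representation (dicts from frozenset to mass) is ours:

```python
def gcr(m1, m2):
    """Generalized combination rule (sketch of Def. 4)."""
    empty = frozenset()
    m_empty = m1.get(empty, 0.0) * m2.get(empty, 0.0)
    # Conflict coefficient K: all pairs whose intersection is empty.
    K = sum(v1 * v2 for B, v1 in m1.items() for C, v2 in m2.items()
            if not (B & C))
    if abs(1.0 - K) < 1e-12:
        return {empty: 1.0}   # total conflict: everything to the empty set
    fused = {}
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            if A:
                fused[A] = fused.get(A, 0.0) + v1 * v2
    fused = {A: (1.0 - m_empty) * v / (1.0 - K) for A, v in fused.items()}
    fused[empty] = m_empty
    return fused

m1 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}
m2 = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
res = gcr(m1, m2)
```

When both inputs assign zero mass to ∅, the rule reduces to classic Dempster combination, which is a useful sanity check.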
Definition 5.
Generalized evidence distance (GED)
Assume there exist two GBPAs, m1 and m2, on the frame of discernment X. The generalized evidence distance between m1 and m2 can be defined as:
(12) d_GBPA(m1, m2) = sqrt( (1/2) (m1 − m2)^T D (m1 − m2) )
in which D is a 2^|X| × 2^|X| matrix whose elements are expressed as:
(13) D(A, B) = |A ∩ B| / |A ∪ B|, with D(∅, ∅) = 1
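The distance above can be sketched by enumerating the power set and accumulating the quadratic form; the D(∅, ∅) = 1 convention is an assumption of this reconstruction:

```python
from itertools import combinations
from math import sqrt

def powerset(frame):
    """All subsets of the frame as frozensets, including the empty set."""
    s = sorted(frame)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def ged(m1, m2, frame):
    """Generalized evidence distance: sqrt(0.5 * diff^T D diff)."""
    ps = powerset(frame)
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in ps]
    acc = 0.0
    for i, A in enumerate(ps):
        for j, B in enumerate(ps):
            if not (A | B):
                d = 1.0                       # D(empty, empty) = 1 (assumed)
            elif not (A & B):
                d = 0.0
            else:
                d = len(A & B) / len(A | B)   # Jaccard similarity
            acc += diff[i] * d * diff[j]
    return sqrt(0.5 * acc)
```

Two GBPAs concentrated on disjoint singletons are at distance 1, the maximum, while identical GBPAs are at distance 0.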
II-B The Markov chain model
Definition 6.
Markov property
Assume there exists a random sequence of states, {X_n, n ≥ 0}, namely a Markov chain, defined on a probability space in which P denotes the probability measure mapping events to probabilities. All probabilities lie in the range from 0 to 1 under this mapping. For any given moment n, any states i and j belonging to the state space, and any underlying and possible series of states i_0, i_1, …, i_{n−1} before the moment n, the Markov property can be defined as:
(14) P(X_{n+1} = j | X_0 = i_0, X_1 = i_1, …, X_{n−1} = i_{n−1}, X_n = i) = P(X_{n+1} = j | X_n = i)
Definition 7.
The transition probability matrix of Markov chain
Assume the transition probability is represented by p_ij(m, n), which means the possibility of a state i transiting to state j from moment m to moment n. It can be expressed as:
(15) p_ij(m, n) = P(X_n = j | X_m = i)
In order to simplify the notation, when the factor of time is not relevant to the aspects of the events being discussed, the corresponding probability can be written as p_ij. For a complete Markov chain, the transition probability matrix (TPM) can then be defined as:
(16) P = [ p_ij ], the square matrix whose (i, j) entry is p_ij
For every element contained in the matrix, the following properties are supposed to be satisfied:
(17) p_ij ≥ 0
(18) Σ_j p_ij = 1
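The two TPM properties, and a single step of the chain, can be sketched as follows (the helper names are ours):

```python
def is_stochastic(P, tol=1e-9):
    """Check Eqs. (17)-(18): non-negative entries, each row summing to one."""
    return all(all(p >= -tol for p in row) and abs(sum(row) - 1.0) <= tol
               for row in P)

def step(dist, P):
    """Advance a state distribution by one step: row vector times the TPM."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

P = [[0.9, 0.1],
     [0.2, 0.8]]
print(is_stochastic(P))     # True
print(step([1.0, 0.0], P))  # [0.9, 0.1]
```

Iterating `step` gives the n-step distribution, which is the Chapman-Kolmogorov picture of the chain's dynamics.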
II-C The method to obtain the transition probability matrix
Definition 8.
Cohort approach
For the N_i observed targets under a given state i, the transition probability of transferring from state i to state j within a period of inspection can be defined as:
(19) p̂_ij(T) = N_ij(T) / N_i
where N_i is the number of targets being observed in state i and T represents the moment when the observation terminates. However, for all the observed targets, it is necessary to weight the transition probabilities over the periods of observation, and the process can be defined as:
(20) p̄_ij = Σ_k w_k · p̂_ij(T_k), with weights w_k proportional to the number of targets observed in period k
In order to restrict the sum of the elements in every row to be exactly equal to 1, a step of normalization is carried out, and the detailed process is defined as:
(21) p_ij = p̄_ij / Σ_j p̄_ij
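The cohort idea of counting observed transitions and normalizing each row can be sketched as follows; the self-loop convention for states that are never left is our assumption, not the paper's:

```python
from collections import Counter

def cohort_tpm(trajectories, states):
    """Estimate a TPM by counting i -> j transitions across all targets'
    observed trajectories, then normalizing each row to sum to one."""
    counts = Counter()
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[(a, b)] += 1
    tpm = []
    for i in states:
        row = [counts[(i, j)] for j in states]
        total = sum(row)
        # States never observed leaving get a self-loop (assumed convention).
        tpm.append([c / total for c in row] if total else
                   [1.0 if i == j else 0.0 for j in states])
    return tpm

# Two targets observed over three moments each:
tpm = cohort_tpm([["s", "s", "r"], ["s", "r", "r"]], ["s", "r"])
```

This is the standard maximum-likelihood estimate of a time-homogeneous TPM from panel data.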
II-D Some entropy theories
Definition 9.
Deng entropy [9]
Given a FOD X, the Deng entropy can be defined as:
(22) E_d = − Σ_{A ⊆ X} m(A) log2( m(A) / (2^|A| − 1) )
In the expression of Deng entropy, m is a mass function defined on the FOD and A is a focal element. The mass m(A) indicates the degree of belief supporting proposition A. Besides, the cardinality of proposition A is represented by |A|.
Definition 10.
Shannon entropy [36]
Given a probability distribution p_1, p_2, …, p_n, the Shannon entropy can be defined as:
(23) H = − Σ_{i=1}^{n} p_i log2 p_i
The sum of the p_i is equal to 1.
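Both entropies can be computed in a few lines; note that Deng entropy coincides with Shannon entropy when all focal elements are (non-empty) singletons, since 2^1 − 1 = 1:

```python
from math import log2

def shannon_entropy(probs):
    """Eq. (23): H = -sum p_i log2 p_i, for probabilities summing to one."""
    return -sum(p * log2(p) for p in probs if p > 0)

def deng_entropy(gbpa):
    """Eq. (22): E_d = -sum m(A) log2( m(A) / (2^|A| - 1) ),
    over non-empty focal elements represented as frozensets."""
    return -sum(v * log2(v / (2 ** len(A) - 1))
                for A, v in gbpa.items() if v > 0)

print(shannon_entropy([0.5, 0.5]))                              # 1.0
print(deng_entropy({frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}))  # 1.0
```

For a multi-element focal set the 2^|A| − 1 term raises the entropy, reflecting the extra non-specificity carried by compound propositions.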
II-E A measure of similarity of evidences
Definition 11.
Deng et al.’s method [6]
Assume there exist two pieces of evidence m_i and m_j, whose distance d(m_i, m_j) can be calculated by the algorithm proposed in [21]. For any two pieces of evidence m_i and m_j, the similarity between them can be calculated as:
(24) sim(m_i, m_j) = 1 − d(m_i, m_j)
Then, for the whole body of evidences, a matrix manifesting the similarity among evidences (SMM) can be given as:
(25) SMM = [ sim(m_i, m_j) ]_{n × n}
Therefore, after obtaining the SMM, the corresponding support degree of each piece of evidence is defined as:
(26) Sup(m_i) = Σ_{j ≠ i} sim(m_i, m_j)
Then, the credibility degree (CRD) of the according evidences is defined as:
(27) Crd(m_i) = Sup(m_i) / Σ_k Sup(m_k)
which indicates an underlying relationship between evidences. To obtain modified values of the propositions of the evidences, a weighted evidence is defined as:
(28) MAE = Σ_i Crd(m_i) · m_i
After getting the modified values of the propositions, if there exist n pieces of evidence, the modified evidence is combined n − 1 times.
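The similarity-support-credibility pipeline can be sketched as follows; the distance function is caller-supplied (the paper uses the distance of [21]), and the toy distance in the example is purely illustrative:

```python
def credibility_weights(evidences, distance):
    """Support = row sums of the off-diagonal similarity matrix (sim = 1 - d);
    credibility = support normalized to sum to one."""
    n = len(evidences)
    sup = [sum(1.0 - distance(evidences[i], evidences[j])
               for j in range(n) if j != i) for i in range(n)]
    total = sum(sup)
    return [s / total for s in sup] if total else [1.0 / n] * n

def weighted_average_evidence(evidences, weights):
    """Credibility-weighted mass average, to be combined n - 1 times."""
    avg = {}
    for w, m in zip(weights, evidences):
        for A, v in m.items():
            avg[A] = avg.get(A, 0.0) + w * v
    return avg

evs = [{frozenset({"a"}): 1.0}, {frozenset({"a"}): 1.0},
       {frozenset({"b"}): 1.0}]
toy = lambda x, y: 0.0 if x == y else 1.0   # toy distance for illustration
w = credibility_weights(evs, toy)
```

Under the toy distance, the outlier evidence gets zero credibility, which is exactly the intended down-weighting of conflicting sources.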
II-F Details of Z-numbers
Definition 12.
Z-numbers [50]
A Z-number is composed of two fuzzy numbers to give a corresponding picture of practical situations. Suppose there exists a kind of incident, V, related to a Z-number pair. The first dimension of the Z-number is a measure of the probability of the incident V happening, and the second dimension is a check on the reliability of the judgement given in the first dimension. Therefore, the mathematical form of a Z-number can be defined as:
(29) Z = (A, B)
II-G Sigmoid function
The sigmoid function is often utilized as an activation function in neural networks to map a variable into the range (0, 1), and is defined as:
(30) f(x) = 1 / (1 + e^{−x})
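Eq. (30) in code, with its squashing behaviour made visible:

```python
from math import exp

def sigmoid(x):
    """Eq. (30): maps any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + exp(-x))

print(sigmoid(0))  # 0.5
```

The function is symmetric about 0, so sigmoid(x) + sigmoid(−x) = 1; this makes it convenient for turning unbounded scores into probability-like values.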
III Proposed Markov model for GET
III-A The matrix form of GET (MFGET)
Assume n pieces of evidence are given on an FOD X. Then, the matrix of the GBPA distributions of the evidences can be defined as:
(31) M = [ m_i(A_j) ]_{n × 2^|X|}
Each column of the matrix can be regarded as a detailed picture of the situation of a specific proposition A_j across the evidences. The vectorial form (VF) with respect to any proposition A_j can be defined as:
(32) VF(A_j) = ( m_1(A_j), m_2(A_j), …, m_n(A_j) )^T
III-B An elimination designed to erase dirty data
An important probability distribution in mathematics, physics and engineering, the normal distribution, is utilized in this customised elimination. Its formula is defined as:
(33) f(x) = (1 / (σ √(2π))) · exp( −(x − μ)² / (2σ²) )
which can also be written as X ~ N(μ, σ²). The parameters μ and σ² can be defined respectively as:
(34) μ = (1/n) Σ_{i=1}^{n} x_i
(35) σ² = (1/n) Σ_{i=1}^{n} (x_i − μ)²
where the total variance is denoted by σ², x_i represents each variable, the mean of the whole is represented by μ and the number of examples is n. The normal distribution is presented in Figure 2. Therefore, for a vectorial form VF(A_j), the corresponding parameters can be given as:
(36) μ_j = (1/n) Σ_{i=1}^{n} m_i(A_j)
(37) σ_j² = (1/n) Σ_{i=1}^{n} (m_i(A_j) − μ_j)²
After both of the parameters are obtained, the evidences whose values for the proposition lie in the corresponding range are retained, to ensure the data is effective and not in conflict with the main body of evidences.
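A minimal sketch of this elimination for one column of the MFGET matrix. The retained range is assumed here to be mean ± k·sigma; the paper does not fix the width k in this excerpt, so it is exposed as a parameter:

```python
from math import sqrt

def filter_column(values, k=1.0):
    """Keep the mass values for one proposition that lie within
    mean +/- k * sigma (Eqs. (34)-(35), population statistics).
    The width k is an assumption, not the paper's setting."""
    n = len(values)
    mu = sum(values) / n
    sigma = sqrt(sum((x - mu) ** 2 for x in values) / n)
    return [x for x in values if mu - k * sigma <= x <= mu + k * sigma]

# Three consistent masses and one outlier for the same proposition:
kept = filter_column([0.5, 0.52, 0.48, 0.9])
```

The outlier 0.9 falls outside the one-sigma band around the mean and is discarded, while the consistent masses survive.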
III-C A match between the Markov chain (MC) and GET
A sequence of conditions is contained in a Markov chain, which is very similar to the propositions A_j contained in a frame of discernment (FOD). Let IN_j denote an incident corresponding to a proposition defined on the frame of discernment, which can be expressed as:
(38) IN_j ↔ A_j
Suppose there exists a series of incidents; the chain of incidents can be given as:
(39) IN_{(1)} → IN_{(2)} → IN_{(3)} → …
According to the definition of GET, each proposition contained in the FOD is allocated a mass of GBPA, which means the proposition has a possibility of taking place. In the most optimistic situation, every proposition can happen, which indicates that when one incident occurs, it can be transferred into another condition in which the same or a different proposition takes place. This can be appropriately and accurately described by a transiting chain matrix (TCM), which can be defined as:
(40) TCM =
[ IN_1→IN_1        IN_1→IN_2        …  IN_1→IN_{2^|X|}
  IN_2→IN_1        IN_2→IN_2        …  IN_2→IN_{2^|X|}
  …                …                …  …
  IN_{2^|X|}→IN_1  IN_{2^|X|}→IN_2  …  IN_{2^|X|}→IN_{2^|X|} ]
where IN_i→IN_j represents the probability of the transformation from incident IN_i to incident IN_j. In order to simplify the presentation of each proposition's changes, the notation T_ij is introduced, and the matrix can be rewritten as:
(41) TCM = [ T_ij ]_{2^|X| × 2^|X|} =
[ T_11            T_12            …  T_{1,2^|X|}
  T_21            T_22            …  T_{2,2^|X|}
  …               …               …  …
  T_{2^|X|,1}     T_{2^|X|,2}     …  T_{2^|X|,2^|X|} ]
Besides, one restriction is supposed to be satisfied, which is defined as:
(42) Σ_j T_ij = 1, for every i
Assume there exists a FOD; the Markov process of the FOD is given in Figure 3.
III-D The definition of transition probability
A series of GBPAs is given in a piece of evidence on the FOD, and it can be concluded that the higher the GBPA value of a proposition, the bigger the probability of the incident corresponding to that proposition, which also affects the next state of the development of things. Therefore, it is reasonable and rational to regard that if the absolute difference between the GBPA values of two propositions is high, then the underlying possibility of a transfer between the two propositions is low. Then, a cost function to measure the cost of the process of transferring can be defined as:
(43) 
The settings of the cost function are based on the theory of entropy increase. It can easily be concluded that everything is simultaneously becoming more and more uncertain, which corresponds to the phenomenon of a state with a high mass transferring to a state with a low mass. Therefore, the cost of that process is regarded as relatively low. On the other side, if an uncertain state with a very low mass intends to become a certain state with a relatively high mass, the cost is reasonably much bigger, due to the property that everything tends to become more and more chaotic.
So, with respect to the range of the masses of the propositions given in a piece of evidence, the transition probability can be defined as:
(44) 
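Eqs. (43) and (44) define the paper's cost function and the resulting transition probability, which are not reproduced in this excerpt. The following is therefore only an illustrative stand-in consistent with the stated intuition: cost grows with the mass gap, moving toward a higher mass is penalized extra (the entropy-increase argument), and each row of costs is turned into probabilities by normalizing exp(−cost). Every formula choice here is our assumption:

```python
from math import exp

def transition_probs(masses, alpha=1.0):
    """Illustrative sketch only, not the paper's Eqs. (43)-(44):
    cost_ij = |m_i - m_j|, doubled when moving to a higher mass,
    then row-normalized exp(-alpha * cost) as transition probabilities."""
    n = len(masses)
    tpm = []
    for i in range(n):
        row = []
        for j in range(n):
            cost = abs(masses[i] - masses[j])
            if masses[j] > masses[i]:    # low-mass -> high-mass costs more
                cost *= 2.0
            row.append(exp(-alpha * cost))
        s = sum(row)
        tpm.append([r / s for r in row])
    return tpm

tpm = transition_probs([0.7, 0.2, 0.1])
```

Staying in the same state has zero cost, so the diagonal entry dominates each row, matching the steady-state intuition of the certainty measure discussed below.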
III-E Concomitant sets for incidents in the MC
The transition probabilities of an incident can be obtained through the procedure proposed above. Then, a concomitant set (CS) can be given to describe the condition of an incident, which can be defined as:
(45) 
It can easily be concluded that if the sum of the transition probabilities away from an incident is relatively low, then the certainty of the proposition is relatively high, which means the next state of the sequence of things is probably still the same incident. Therefore, the certainty measure (CM) of the incident can be defined as:
(46) 
In the expression of the CM, the bigger the value of CM, the more certain the proposition or incident is, which can be regarded as a kind of steady state of the given evidence. If an evidence has a prominent CM for one proposition, then the information conveyed by that evidence is explicit and not confusing.
III-F Belief function for MMGET (BF)
Assume there exists an incident and its CS is given. According to the definition of the GBF, the specially designed belief function for MMGET can be defined as:
(47) 
The variable is the number of transfer processes from one state to another.
A simple example is designed to illustrate the process of obtaining the value of the BF. Assume there exist three CSs. Then, with respect to the given incident, its BF can be calculated as:
III-G Plausible function for MMGET (PF)
Assume there exists an incident and its CS is given. According to the definition of the GPF, the specially designed plausible function for MMGET can be defined as:
(48) 
The variable is the number of transfer processes from one state to another.
A simple example is designed to illustrate the process of obtaining the value of the PF. Assume there exist three CSs. Then, with respect to the given incident, its PF can be calculated as:
III-H Combination rule for MMGET
In MMGET, an according mass is allocated to each incident. The certainty measure proposed in this paper, CM, can be utilized as a credibility proof of a specific proposition. For a piece of evidence after modification, the allocation of the masses of the propositions can be defined as:
(49) 
In the process of combination, the element ∅ is treated differently due to its effect in indicating the extent of completeness of the whole FOD. The detailed process of combination for MMGET can be defined as:
(50) 
(51) 
(52) 
(53) 
A simple example is designed to illustrate the process of the combination. Assume there exist two series of mass distributions. The process of combination can be given as:
III-I Distances of elements in the MP
III-I1 Distance of MCs in MMGET
Assume there exist two MCs, and their vectorial forms are given. Then, the distance measure for MCs is defined as:
(54) 
in which D is a matrix whose elements are expressed as:
(55) 
A simple example is devised to illustrate the effectiveness of the proposed method of measuring the distance between MCs. Assume there exist two MCs. The process of calculating the distance can be given as:
III-I2 Distance of CSs in MMGET
Assume there exist two CSs, and their vectorial forms are given. Then, the distance measure for CSs is defined as:
(56) 
Besides, for the corresponding series of TPs of the propositions, the distance between the TPs is defined as:
(57) 
Note: For the TP distance, the incidents of the matrix are the ones corresponding to the transferred state.
in which D is a matrix whose elements are expressed as:
(58) 
Overall, the distance of CSs is defined as:
(59) 
The process of calculating the distances is the same as the procedure provided in the section on distances of MCs.
III-I3 Distance of modified evidences in MMGET
Assume there exist two pieces of modified evidence, whose vectorial forms are given. Then, the distance measure for modified evidences is defined as:
(60) 
in which D is a matrix whose elements are expressed as:
(61) 
The process of calculating the distances is the same as the procedure provided in the section on distances of MCs.
III-J Measure of similarity of original evidences
It is necessary to calculate the degree of divergence of a piece of evidence from the main body of evidences. In this part, the differences in the probability assignment of an incident, in the corresponding series of transition probabilities and in the modified evidences are all taken into consideration. Therefore, the similarity measure of original evidences is defined as:
(62) 
From the formula defined above, it can be concluded that if the distance from a piece of evidence to the other evidences is relatively big, meaning the judged evidence differs greatly from the others, then the value of the similarity parameter is correspondingly lower. A simple example is designed to illustrate the effectiveness of the measure.
Assume there exists a series of groups of vectorial forms. Then, the degree of reliability of each VF can be calculated as:
Example: Assume there exists a series of mass distributions, with three propositions contained in the frame of discernment and three pieces of evidence obtained. The fluctuation of the variable SM is given in Figure 6.
III-K Certainty measure of the FOD
In GET, the value of m(∅) indicates the degree of uncertainty of the provided frame of discernment. Therefore, an innovative method to recognize the uncertainty is designed to manifest this kind of phenomenon. First, as the values of the propositions and the empty set change, the degree of uncertainty of the frame of discernment also changes. Figure 7 presents this kind of fluctuation.
It can easily be concluded that if all of the propositions except the empty set can much more easily transfer from their original state to the state of the empty set, then the FOD tends to be regarded as incomplete, which also indicates that the bigger the sum of the transition probabilities to the empty set, the more incomplete the FOD. To reflect the level of uncertainty of the FOD, an uncertainty-recognition entropy (CRE) is defined as:
(63) 
Note: If ∅ is the object a proposition transfers to, then the cardinality value is set as the biggest cardinality among the propositions. And if the transfer starts from ∅, then the value is set as the smallest cardinality among the propositions. The two settings are designed to help better present the degree of uncertainty of the FOD.
From the definition of the formula, if the transition probability gets bigger, then the CRE also gets bigger, which indicates that if the states corresponding to some propositions easily transit to the state of the empty set, then the FOD tends to be more unsteady. It should be pointed out that if a state of multiple propositions transfers, the degree of uncertainty reduces, which implies a much bigger cost in the process of transition and shows that the transition to the state of the empty set is much more attractive, indicating a more unsteady FOD.
Assume the number of elements contained in a proposition is less than 10; Figure 8 indicates the fluctuation of the values of the CRE.
Besides, a simple example is devised to illustrate the process of obtaining the CRE. Assume there exists a series of TPs whose according propositions are single propositions. The CRE can be obtained as:
Finally, a flow chart is designed to illustrate the details of the proposed model. The whole process is given in Figure 9.
IV Numerical example
Example: Illustration of the similarity measure
Assume there exist three pieces of evidence over an FOD with 3 propositions. The values of the propositions contained in the evidences are listed in Table I. Besides, the corresponding costs and TPs are listed in Tables II and III. According to the definition of the similarity measure, three kinds of distances are supposed to be calculated as a basis for obtaining the value of similarity. The three kinds of distances are produced by the uniform distance measure and are given in Tables IV, V and VII. Finally, the resulting judgments on the similarity of the evidences are listed in Table VIII.