Article

A New Total Uncertainty Measure from A Perspective of Maximum Entropy Requirement

School of Electronics And Information, Northwestern Polytechnical University, Xi’an 710072, China
* Author to whom correspondence should be addressed.
Submission received: 30 July 2021 / Revised: 12 August 2021 / Accepted: 12 August 2021 / Published: 17 August 2021
(This article belongs to the Special Issue Recent Progress of Deng Entropy)

Abstract

The Dempster-Shafer theory (DST) is an information fusion framework that is widely used in many fields. However, the uncertainty measure of a basic probability assignment (BPA) is still an open issue in DST. Many methods exist to quantify the uncertainty of BPAs, but they have some limitations. In this paper, a new total uncertainty measure from a perspective of maximum entropy requirement is proposed. The proposed method can measure both dissonance and non-specificity in a BPA and includes two components. The first component is consistent with Yager's dissonance measure; the second measures non-specificity and admits different candidate functions. We also prove the desirable properties of the proposed method. Besides, numerical examples and applications are provided to illustrate the effectiveness of the proposed total uncertainty measure.

1. Introduction

With the development of sensor technology, it has become a trend for complex systems to be equipped with multiple sensors. Compared with single-sensor monitoring, multi-sensor monitoring can offer higher reliability, so effectively fusing multi-sensor information is a key issue. Many techniques have been proposed to address it, such as the Dempster-Shafer theory (DST) [1,2], Kalman filtering (KF) [3,4], fuzzy theory [5], Bayesian reasoning methods [6,7], neural networks [8], and so on. However, the real world contains many kinds of uncertainty, for example, randomness and imprecision, and the treatment of uncertainty is an important aspect of information fusion theories. Among them, DST is an effective framework for dealing with uncertain information. This theory was first proposed by Dempster [9] and further developed by Shafer [10]. It is widely used in fault diagnosis [11,12,13,14], decision-making [15,16], risk assessment [17], and so on. Many studies in recent years have focused on conflict resolution [18,19,20], evidence revision [21], combination rules [22,23,24,25], and information volume [26,27]. Many methods for uncertainty quantification have also been proposed [28]. However, the existing methods have some limitations, and the uncertainty measure of BPAs is still an open issue in DST.
The concept of entropy was first proposed by the German physicist Clausius in 1865. In thermodynamics, entropy is a measure of the “chaos” of a system [29]. In information theory, entropy, also known as Shannon entropy, represents a measure of the uncertainty of a random variable [30]. In addition, Ilya Prigogine proposed the famous statement: “The entropy …leads to an ordering process” [31]. Parker and Jeynes also showed, from a MaxEnt standpoint, that the (stupendously gigantic) entropy of the supermassive black hole at the centre of the Milky Way can account for the geometrical stability of the galaxy [32]. Among these measures, Shannon entropy, verified over the past few decades, is an effective way to measure uncertainty in probability theory (PT), but its direct application in DST is inappropriate. That is because PT describes the probability of the occurrence of singletons, while evidence theory is based on non-additive probabilities, which can represent the possibility of propositions with multiple elements [33].
Based on the above analysis, many scholars have proposed different entropy-like measures to quantify the uncertainty of bodies of evidence (BOEs) in DST. For instance, Nguyen proposed a belief entropy based on the original basic probability assignment (BPA) [34]. Dubois and Prade proposed a weighted Hartley entropy for measuring the non-specificity of BPAs [35]. In addition, many other belief entropies have been proposed, including Höhle’s entropy [36], Yager’s dissonance measure [37], Klir and Ramer’s discord measure [38], Klir and Parviz’s strife measure [39], Jousselme’s ambiguity measure (AM) [40], Deng entropy [41], Yang and Han’s measure [42], the aggregated uncertainty measure (AU) [43], Wang and Song’s measure (SU) [44], Jirousek and Shenoy’s entropy (JS) [45], Deng’s measure [46], and so on [47,48,49]. These methods can effectively measure the uncertainty of BOEs in some cases and satisfy some desirable properties of uncertainty quantification in DST [50]. Intuitively, when the system is completely unknown, that is, $m(\Omega) = 1$ where $\Omega$ is the frame of discernment (FOD), the uncertainty of the evidence is greatest; this is called the maximum entropy property. Some of the existing methods do not support this property. However, we argue that maximum entropy is a property that must be satisfied.
Motivated by the above discussions, a new total uncertainty measure from a perspective of maximum entropy requirement is proposed. The proposed method can measure both dissonance (conflict) and non-specificity in a BPA and includes two components. The first component is consistent with Yager’s dissonance measure. The second component measures non-specificity via different candidate functions. We also prove the majority of the desired properties of the proposed method, such as non-negativity, monotonicity, probability consistency, and so on. The main contributions are summarized as follows.
  • We propose a new total uncertainty measure from the perspective of the maximum entropy requirement to quantify the uncertainty of BPAs in DST. Besides, properties of the proposed method have also been proved, such as non-negativity, monotonicity, maximum entropy, and so on.
  • We conduct some numerical examples to evaluate the effectiveness of our proposed method. The simulation results indicate that the proposed total uncertainty measure degrades to Shannon entropy when the BPA is a Bayesian mass function. Furthermore, the proposed entropy can effectively deal with the redundant information of focal elements.
The remainder of this paper is organized as follows. Section 2 introduces some preliminaries. Section 3 illustrates the proposed method in detail. Section 4 presents simulation results and comparisons with other methods. Section 5 gives an application to feature evaluation for pattern classification based on the Iris dataset. Section 6 concludes the paper.

2. Preliminaries

Some basic concepts and existing methods are briefly introduced in this section.

2.1. Dempster-Shafer Theory

The Dempster-Shafer theory, proposed by Dempster [9] and expanded by Shafer [10], is a mathematical theory for handling multi-source information. It can effectively cope with uncertainty and is widely used in target identification, fusion-based decision-making, and so on. Some definitions of this theory are as follows.
  • Frame of Discernment (FOD). If $\Theta = \{\theta_1, \theta_2, \ldots, \theta_r\}$ is a finite complete set of $r$ mutually exclusive elements, it is called a frame of discernment [9,10,51].
  • Basic Probability Assignment (BPA). Let $\Theta$ be a FOD; its power set $2^\Theta$ constitutes a set of propositions. If a function $m: 2^\Theta \to [0,1]$ satisfies the following formula [9,10,52]:
    $$m(\emptyset) = 0, \qquad \sum_{S \subseteq \Theta} m(S) = 1,$$
    the mass function $m$ is a BPA. In this definition, $m(S)$ is the basic probability number of proposition $S$ and indicates the belief assigned to $S$.
  • Dempster's Combination Rule (DCR). Let $m_1$ and $m_2$ be two BPAs; the Dempster combination rule is then as follows [9] (a code sketch follows this list):
    $$m(\emptyset) = 0, \qquad m(C) = \frac{\sum_{S \cap B = C} m_1(S)\,m_2(B)}{1-k}, \quad C \neq \emptyset,$$
    where $k = \sum_{S \cap B = \emptyset} m_1(S)\,m_2(B)$.
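As a concrete illustration, the rule fits in a few lines of Python. This is a minimal sketch under our own conventions (a BPA as a dict mapping frozenset focal elements to masses; the name `combine` is ours, not the paper's):

```python
from itertools import product

def combine(m1, m2):
    # Dempster's rule for two BPAs represented as {frozenset: mass} dicts.
    combined, k = {}, 0.0
    for (s, ms), (b, mb) in product(m1.items(), m2.items()):
        c = s & b
        if c:
            combined[c] = combined.get(c, 0.0) + ms * mb
        else:
            k += ms * mb          # conflicting mass assigned to the empty set
    if k == 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {c: v / (1.0 - k) for c, v in combined.items()}

m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("b"): 0.3, frozenset("ab"): 0.7}
print(combine(m1, m2))    # conflict k = 0.18; masses renormalized by 1 - k
```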

2.2. Belief and Plausibility Function

Let $m$ be a BPA on FOD $\Theta$. If $Bel: 2^\Theta \to [0,1]$ satisfies [9]:
$$Bel(S) = \sum_{B \subseteq S} m(B), \quad \forall S \in 2^\Theta,$$
then $Bel(S)$ is called the belief measure of proposition $S$.
$Pl(S)$ is the plausibility function, which is defined as follows:
$$Pl(S) = 1 - Bel(\bar{S}) = \sum_{B \cap S \neq \emptyset} m(B).$$
$Pl(S)$ measures the degree to which one does not disagree with proposition $S$.
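The belief and plausibility functions are direct sums over focal elements. A minimal sketch in the same dict-of-frozensets representation (the helper names `bel` and `pl` are our own):

```python
def bel(m, s):
    # Belief: total mass committed to subsets of s.
    return sum(v for b, v in m.items() if b <= s)

def pl(m, s):
    # Plausibility: total mass of focal elements intersecting s.
    return sum(v for b, v in m.items() if b & s)

m = {frozenset("a"): 0.2, frozenset("b"): 0.2, frozenset("ab"): 0.6}
print(bel(m, frozenset("a")), pl(m, frozenset("a")))   # 0.2 0.8
```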

2.3. Shannon Entropy

Let $X$ be a sample space with possible values $x_1, x_2, \ldots, x_n$. The Shannon entropy is then defined as [30]:
$$H_s = \sum_{x_i \in X} p(x_i) \log_2 \frac{1}{p(x_i)},$$
where $p(x_i)$ is the probability of $x_i$, satisfying $\sum_{x_i \in X} p(x_i) = 1$.
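For reference, a minimal sketch of this formula in Python (the helper name `shannon` is ours); zero-probability terms are skipped by the usual convention $0 \log 0 = 0$:

```python
from math import log2

def shannon(p):
    # H_s = sum p_i * log2(1 / p_i); zero-probability terms contribute nothing.
    return sum(pi * log2(1.0 / pi) for pi in p if pi > 0)

print(shannon([0.5, 0.5]), shannon([0.25] * 4))   # 1.0 2.0
```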

2.4. Some Existing Entropies in DST

In information theory, Shannon entropy has been widely used, but it has an inherent limitation in handling the uncertainty in DST. Nonetheless, the idea of Shannon entropy still plays a crucial role in guiding the uncertainty measurement of BPAs. Many scholars have proposed uncertainty measurement methods. Let $X$ be a FOD; some existing uncertainty measures in DST are listed as follows.
Nguyen’s entropy. 
Nguyen [34] proposed a belief entropy based on the original BPA:
$$H_N = \sum_{S \in 2^X} m(S) \log_2 \frac{1}{m(S)}.$$
Weighted Hartley entropy. 
Dubois and Prade [35] proposed an entropy for the non-specificity measure:
$$H_{DP} = \sum_{S \in 2^X} m(S) \log_2 |S|,$$
where $|S|$ is the cardinality of $S$.
Aggregated uncertainty measure (AU). 
Harmanec and Klir [43] proposed a total uncertainty measure of non-specificity and inconsistency:
$$AU(m) = \max\left[-\sum_{x \in X} p_x \log_2 p_x\right],$$
where the maximum is taken over all probability distributions $\{p_x\}$ satisfying:
$$p_x \in [0,1]\ \forall x \in X; \qquad \sum_{x \in X} p_x = 1; \qquad Bel(S) \le \sum_{x \in S} p_x \le Pl(S)\ \forall S \subseteq X.$$
Yager’s entropy. 
Yager [37] proposed a dissonance measure of BPAs based on the plausibility function:
$$H_Y(m) = -\sum_{S \in 2^X} m(S) \log_2 Pl(S),$$
where $Pl(S)$ is the plausibility of $S$ under $m$.
Deng entropy. 
Deng [41] proposed a new uncertainty measurement method, namely, ”Deng entropy”. It is defined as:
$$H_{Deng}(m) = -\sum_{S \in 2^X} m(S) \log_2 \frac{m(S)}{2^{|S|}-1} = \sum_{S \in 2^X} m(S) \log_2 \frac{1}{m(S)} + \sum_{S \in 2^X} m(S) \log_2 \left(2^{|S|}-1\right).$$
Höhle entropy. 
Höhle [36] proposed a belief entropy based on a belief function, which is defined as:
$$H_c(m) = -\sum_{S \in 2^X} m(S) \log_2 Bel(S),$$
where $Bel(S)$ is the belief measure of $S$.
Yang and Han’s measure ($TU^I$). 
Yang and Han [42] defined a total uncertainty measure as:
$$TU^I(m) = 1 - \frac{\sqrt{3}}{n} \sum_{x \in X} d^I\!\left([Bel(\{x\}), Pl(\{x\})], [0,1]\right),$$
where $n$ is the number of elements of FOD $X$. The interval distance is defined as:
$$d^I([c_1,d_1],[c_2,d_2]) = \sqrt{\left(\frac{c_1+d_1}{2} - \frac{c_2+d_2}{2}\right)^2 + \frac{1}{3}\left(\frac{d_1-c_1}{2} - \frac{d_2-c_2}{2}\right)^2}.$$
Deng’s measure ($TU^{EI}$). 
In addition, Deng et al. [46] proposed an improved total uncertainty measure based on belief intervals:
$$TU^{EI}(m) = \frac{1}{|X|}\sum_{x \in X}\left(1 - d^{EI}\!\left([Bel(\{x\}), Pl(\{x\})], [0,1]\right)\right),$$
where $d^{EI}(\cdot)$ denotes the Euclidean distance between the interval numbers $[Bel(\{x\}), Pl(\{x\})]$ and $[0,1]$.
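Several of these measures can be computed directly from a BPA. The sketch below implements the weighted Hartley entropy, Yager's dissonance measure, Deng entropy, and our reading of $TU^{EI}$ (assuming the unnormalized Euclidean interval distance averaged over the FOD, which reproduces the $TU^{EI}$ values reported in Table 2 below); all helper names are our own:

```python
from math import log2, sqrt

def pl(m, s):
    return sum(v for b, v in m.items() if b & s)

def bel(m, s):
    return sum(v for b, v in m.items() if b <= s)

def weighted_hartley(m):
    return sum(v * log2(len(s)) for s, v in m.items())        # non-specificity only

def yager(m):
    return -sum(v * log2(pl(m, s)) for s, v in m.items())     # dissonance only

def deng(m):
    return -sum(v * log2(v / (2 ** len(s) - 1)) for s, v in m.items())

def tu_ei(m, fod):
    # Belief-interval measure: mean over x of 1 - ||[Bel({x}), Pl({x})] - [0, 1]||.
    total = 0.0
    for x in fod:
        s = frozenset([x])
        total += 1 - sqrt(bel(m, s) ** 2 + (pl(m, s) - 1) ** 2)
    return total / len(fod)

m1 = {frozenset("a"): 0.2, frozenset("b"): 0.2, frozenset("ab"): 0.6}
print(round(weighted_hartley(m1), 4), round(yager(m1), 4),
      round(deng(m1), 4), round(tu_ei(m1, "abcd"), 4))
# 0.6  0.1288  2.3219  0.3586  (the m1 column of Table 2 below)
```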

3. Proposed Uncertainty Measure in DST

3.1. The Proposed Method

In this section, a new total uncertainty measure of BPAs in DST is proposed from a perspective of maximum entropy requirement. It quantifies the total uncertainty of BPAs, including both conflict and non-specificity. For the conflict measure of BPAs, we utilize Yager's dissonance entropy [37]. For the non-specificity measure of BPAs, $H_{ns}(m)$, we require consistency with the maximum entropy requirement. For example, the uncertainty of a BPA $m(S) = 1$ defined on FOD $X$, $S \subseteq X$, should equal the uncertainty of the vacuous BPA $m(\Omega) = 1$, where $\Omega$ is a FOD with $|\Omega| = |S|$. Furthermore, the uncertainty of $m(S) = a$ should then be a function of $a$ and the uncertainty degree of the BPA $m(S) = 1$, rather than being measured by the weighted Hartley entropy.
Based on the above idea, the proposed new total uncertainty measure is defined as follows:
$$H(m) = \sum_{S \in 2^X} m(S) \log_2 \frac{1}{Pl(S)} + \sum_{S \in 2^X} m(S)\, Q(|S|) = \sum_{S \in 2^X} m(S) \log_2 \frac{2^{Q(|S|)}}{Pl(S)},$$
where $X$ is the FOD and $Q(|S|)$ represents the maximum entropy in $S$, that is, the uncertainty of $m(S) = 1$. Logically, for $S \subseteq X$, $Q(|S|)$ is a monotonically increasing function of the cardinality of $S$, so that $Q(|X|) \ge Q(|S|)$. In addition, when the BPA is a Bayesian mass function, we expect the new entropy to degrade to Shannon entropy; therefore, $Q(|S|) = 0$ when $|S| = 1$. In summary, $Q$ is a function $Q: |S| \to \mathbb{R}$ satisfying (i) $Q(|S|) = 0$ if $|S| = 1$; and (ii) $\frac{dQ(|S|)}{d|S|} \ge 0$.
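The definition can be implemented generically, with the choice of $Q$ passed in as a parameter. A minimal sketch (the names `H`, `pl`, and `q` are ours):

```python
from math import log2

def pl(m, s):
    return sum(v for b, v in m.items() if b & s)

def H(m, q):
    # Proposed total uncertainty: sum over focal elements S of
    # m(S) * [ Q(|S|) + log2(1 / Pl(S)) ], for any admissible Q.
    return sum(v * (q(len(s)) + log2(1.0 / pl(m, s))) for s, v in m.items())

# The vacuous BPA on a 3-element FOD attains the maximum H = Q(3):
m_vac = {frozenset("abc"): 1.0}
print(H(m_vac, lambda k: 2 * log2(k)))   # 2 * log2(3) ≈ 3.1699
```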

3.2. Properties of the Proposed Method

Similar to probability theory (PT), there are some properties that need to be satisfied by an uncertainty measure in DST, such as probability consistency, additivity, non-negativity, and so forth. The properties of the proposed entropy are analyzed as follows.
Property 1 
(Non-negativity). $H(m) \ge 0$; equality holds if and only if $m(\{x\}) = 1$ for some $x \in X$.
Proof. 
Given $X = \{x_1, x_2, \ldots, x_n\}$, for any $S \in 2^X$:
$$0 \le m(S) \le 1, \qquad |S| > 0, \qquad Q(|S|) \ge 0,$$
and since $Pl(S) \le 1$ for every focal element,
$$\sum_{S \in 2^X} m(S) \log_2 \frac{1}{Pl(S)} \ge 0, \qquad \sum_{S \in 2^X} m(S)\, Q(|S|) \ge 0;$$
hence, $H(m) \ge 0$. If $H(m) = 0$, then for every $S$ either $m(S) = 0$ or $2^{Q(|S|)}/Pl(S) = 1$, that is, $m(\{x_i\}) = 1$ for some $x_i \in X$. □
Property 2 
(Set Monotonicity). Let $m_1(X_1) = 1$ be a vacuous BPA on FOD $X_1$, and let $m_2(X_2) = 1$ be a vacuous BPA on FOD $X_2$. If $|X_1| < |X_2|$, then $H(m_1) < H(m_2)$.
Proof. 
For any vacuous BPA $m(X) = 1$, we have $Pl(X) = 1$; then
$$H(m) = m(X)\,Q(|X|) + m(X)\log_2 \frac{1}{Pl(X)} = m(X)\,Q(|X|) = Q(|X|).$$
From the analysis in Section 3.1, $Q(|X|)$ is a monotonically increasing function of $|X|$, so $Q(|X_1|) < Q(|X_2|)$ if $|X_1| < |X_2|$. Hence, the proposed method satisfies the set monotonicity property. □
Property 3 
(Maximum entropy). For all BPAs defined on a FOD $X$, the vacuous BPA $m(X) = 1$ has the greatest uncertainty.
Proof. 
Let $m$ be a BPA on FOD $X$. According to the analysis in Section 3.1, $Q(|S|)$ is a monotonically increasing function of the cardinality of $S$, and the proposed measure attains the value $Q(|X|)$ exactly at the vacuous BPA. Therefore, the proposed method satisfies the maximum entropy property. □
Property 4 
(Probability consistency). For a Bayesian BPA defined on FOD $X$, its uncertainty equals the Shannon entropy, $H(m) = \sum_{S \subseteq X} m(S) \log_2 \frac{1}{m(S)}$.
Proof. 
If $m$ is a Bayesian BPA, then $Pl(S) = m(S)$ for each (singleton) focal element $S$; hence,
$$H(m) = \sum_{S \subseteq X} m(S) \log_2 \frac{1}{Pl(S)} + \sum_{S \subseteq X} m(S)\,Q(|S|) = \sum_{S \subseteq X} m(S) \log_2 \frac{1}{m(S)},$$
since $Q(|S|) = 0$ when $|S| = 1$. Therefore, the proposed method satisfies the property of probability consistency. □
Property 5 
(Range). The range of the proposed entropy is $[0, Q(|X|)]$, where $Q(|X|)$ is a function of $|X|$.
Property 6 
(Non-Additivity). Let $m_X$ and $m_Y$ be two BPAs defined on FODs $X$ and $Y$, respectively. Then, in general, $H(m_X \otimes m_Y) \neq H(m_X) + H(m_Y)$.
Proof. 
Let $Z = X \times Y$ be the Cartesian product space. The joint focal elements have the form $z = S \times B$ with $S \in 2^X$, $B \in 2^Y$, and $(m_X \otimes m_Y)(S \times B) = m_X(S)\,m_Y(B)$; moreover, $|S \times B| = |S||B|$ and $Pl(S \times B) = Pl(S)\,Pl(B)$. The new entropy of $m_X \otimes m_Y$ is therefore:
$$H(m_X \otimes m_Y) = \sum_{z \in 2^Z} m(z) \log_2 \frac{2^{Q(|z|)}}{Pl(z)} = \sum_{S \in 2^X} \sum_{B \in 2^Y} m_X(S)\,m_Y(B) \log_2 \frac{2^{Q(|S||B|)}}{Pl(S)\,Pl(B)},$$
while
$$H(m_X) + H(m_Y) = \sum_{S \in 2^X} \sum_{B \in 2^Y} m_X(S)\,m_Y(B) \log_2 \frac{2^{Q(|S|)+Q(|B|)}}{Pl(S)\,Pl(B)}.$$
The two expressions coincide only when $Q(|S||B|) = Q(|S|) + Q(|B|)$, which does not hold for a general admissible $Q$. Therefore, the proposed method does not satisfy the additivity property. □
Property 7 
(Generalized Set Consistency). When $m(S) = 1$ for an arbitrary subset $S$ of a FOD, $H(m) = f(|S|)$, where $f$ is a monotonically increasing function of $|S|$.
Proof. 
Assume $m$ is a BPA defined on FOD $X$ with $m(S) = 1$ for some $S \in 2^X$. The uncertainty measured by the proposed entropy is:
$$H(m) = m(S)\,Q(|S|) + m(S) \log_2 \frac{1}{Pl(S)} = m(S)\,Q(|S|) = Q(|S|),$$
which is a monotonically increasing function of $|S|$. In particular, the proposed entropy satisfies the classical set consistency property, $H(m) = \log_2 |S|$, if and only if $Q(|S|) = \log_2 |S|$. □

4. Numerical Examples

In this section, we give three different forms of $Q(|S|)$ and then present several numerical examples to verify the rationality and effectiveness of the proposed method.
Case 1 ($Q_1$). 
According to [45], the maximum entropy is $2 \log_2 |X|$, where $X$ is a FOD. In this paper, $Q$ represents the maximum entropy on $S$. Hence, based on the above analysis, one function form of $Q$ can be defined as:
$$Q_1(|S|) = 2 \log_2 |S|.$$
Case 2 ($Q_2$). 
According to [26], the maximum Deng entropy is $\log_2 \sum_{S \in 2^X} (2^{|S|}-1)$. Theoretically, $Q_2(|S|)$ should then be $\log_2 \sum_{B \in 2^S} (2^{|B|}-1)$. However, the maximum of Deng entropy is attained at $m(S) = \frac{2^{|S|}-1}{\sum_{S \in 2^X}(2^{|S|}-1)}$, which is inconsistent with our idea that the uncertainty of $m(S) = a$ should be a function of $a$ and the uncertainty degree of the BPA $m(S) = 1$. Hence, for any $B \in 2^S$, only the single case $B = S$ is retained, and another function form of $Q$ can be defined as:
$$Q_2(|S|) = \log_2 \left(2^{|S|} - 1\right).$$
Case 3 ($Q_3$). 
According to [44,46], the maximum entropy is $|X|$, where $X$ is a FOD. Similarly, the third function form of $Q$ can be defined as:
$$Q_3(|S|) = \begin{cases} 0, & |S| = 1 \\ |S|, & |S| > 1 \end{cases}$$
Then, the proposed entropy can be written as follows:
$$H_1(m) = \sum_{S \in 2^X} m(S) \log_2 \frac{1}{Pl(S)} + \sum_{S \in 2^X} 2\,m(S) \log_2 |S| = \sum_{S \in 2^X} m(S) \log_2 \frac{|S|^2}{Pl(S)},$$
$$H_2(m) = \sum_{S \in 2^X} m(S) \log_2 \frac{1}{Pl(S)} + \sum_{S \in 2^X} m(S) \log_2 \left(2^{|S|}-1\right) = \sum_{S \in 2^X} m(S) \log_2 \frac{2^{|S|}-1}{Pl(S)},$$
$$H_3(m) = \sum_{S \in 2^X,\, |S| = 1} m(S) \log_2 \frac{1}{Pl(S)} + \sum_{S \in 2^X,\, |S| > 1} m(S) \log_2 \frac{2^{|S|}}{Pl(S)}.$$
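The three instantiations differ only in the $Q$ function plugged into the generic measure. A sketch under the same dict-of-frozensets conventions as above; the vacuous-BPA check reproduces the corresponding columns of Table 1:

```python
from math import log2

def pl(m, s):
    return sum(v for b, v in m.items() if b & s)

def H(m, q):
    return sum(v * (q(len(s)) + log2(1.0 / pl(m, s))) for s, v in m.items())

Q1 = lambda k: 2 * log2(k)               # Case 1
Q2 = lambda k: log2(2 ** k - 1)          # Case 2
Q3 = lambda k: 0 if k == 1 else k        # Case 3

# A vacuous BPA on an n-element FOD, e.g. n = 3:
m_vac = {frozenset({1, 2, 3}): 1.0}
print([round(H(m_vac, q), 4) for q in (Q1, Q2, Q3)])   # [3.1699, 2.8074, 3.0]
```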

4.1. Example 1

This example is adapted from [53]. Let the FOD be $X = \{x_1, x_2, \ldots, x_n\}$. We give a Bayesian BPA as $m(\{x_i\}) = \frac{1}{n}$, $i = 1, \ldots, n$, and calculate the uncertainty of this BPA as $n$ changes.
According to the definition and the desired properties of the entropy, it can be inferred that as n increases, the uncertainty of this BPA increases. In addition, when the BPA is a Bayesian mass function, the uncertainty of the BPA should be consistent with Shannon entropy.
We calculate the uncertainty of this BPA in this example based on the proposed method and some existing methods, as shown in Figure 1.
For this example, all methods gave exactly the same result as Deng entropy, except the weighted Hartley entropy, Yang and Han's measure, and Deng's measure ($TU^{EI}$). Note that "Deng entropy" is the uncertainty measure of BPAs proposed by Deng in 2016 [41], while "Deng's measure ($TU^{EI}$)" is the improved total uncertainty measure based on belief intervals proposed later by Deng et al. [46].
In Figure 1, the uncertainty calculated by Yang and Han's measure, $TU^{EI}$, and the weighted Hartley entropy shows a downward trend as $n$ increases, which is inconsistent with intuition. In contrast, the uncertainty calculated by the methods proposed in this paper and by the remaining existing methods gradually increases with $n$, consistent with the results calculated by Shannon entropy. Therefore, the proposed methods are effective when the BPA is a Bayesian mass function.

4.2. Example 2

Let the FOD be $X = \{x_1, x_2, \ldots, x_n\}$, and consider the vacuous BPA $m(\{x_1, \ldots, x_n\}) = 1$. When $n$ increases from 1 to 14, the uncertainty of the BPA measured by the proposed method and other existing methods is shown in Table 1. In addition, to visualize how the results change with $n$, the measurements of the different methods are plotted in Figure 2.
As shown in Figure 2, the uncertainty measured by Yager's dissonance entropy is always 0. Intuitively, however, the uncertainty of this BPA should increase as $n$ increases. Therefore, Yager's dissonance entropy obtains wrong results when $m$ is a vacuous BPA.
The uncertainty results obtained by AU, the weighted Hartley entropy, and AM are the same: for a vacuous BPA, all three methods give $\log_2 n$. The degree of uncertainty obtained by these three methods increases with $n$, which is consistent with expectations. Similarly, the degree of uncertainty calculated by the methods proposed in this paper ($H_1$, $H_2$, $H_3$), Deng entropy, SU, JS, Yang and Han's measure, and $TU^{EI}$ also increases with $n$. Additionally, the growth trend of the proposed method $H_2$ is basically the same as that of Deng entropy, because $m(A) = Pl(A)$ for a vacuous BPA, so the two methods have the same functional form here. The proposed method $H_3$, SU, Yang and Han's measure, and $TU^{EI}$ also share the same growth trend. We hold that when the change trend is consistent with the theoretical connotation of uncertainty, a method can be considered reasonable and effective. Hence, the proposed methods are all effective when the BPA is vacuous.
For this example, the proposed method $H_2$ gave the same results as Deng entropy.

4.3. Example 3

Let $X = \{a, b, c, d\}$ be the FOD. We give two BPAs as follows:
$$m_1: \ m_1(\{a\}) = \tfrac{1}{5}, \quad m_1(\{b\}) = \tfrac{1}{5}, \quad m_1(\{a,b\}) = \tfrac{3}{5}; \qquad m_2: \ m_2(\{a\}) = \tfrac{1}{5}, \quad m_2(\{b\}) = \tfrac{1}{5}, \quad m_2(\{c,d\}) = \tfrac{3}{5}.$$
For $m_1$:
$$Pl_{m_1}(\{a\}) = \tfrac{4}{5}, \quad Pl_{m_1}(\{b\}) = \tfrac{4}{5}, \quad Pl_{m_1}(\{a,b\}) = 1;$$
for $m_2$:
$$Pl_{m_2}(\{a\}) = \tfrac{1}{5}, \quad Pl_{m_2}(\{b\}) = \tfrac{1}{5}, \quad Pl_{m_2}(\{c,d\}) = \tfrac{3}{5};$$
then the uncertainties based on the proposed methods are:
$$H_1(m_1) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{4/5} + \tfrac{3}{5} \log_2 \tfrac{2^2}{1} = 1.3288,$$
$$H_2(m_1) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{4/5} + \tfrac{3}{5} \log_2 \tfrac{2^2 - 1}{1} = 1.0797,$$
$$H_3(m_1) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{4/5} + \tfrac{3}{5} \log_2 \tfrac{2^2}{1} = 1.3288,$$
$$H_1(m_2) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{1/5} + \tfrac{3}{5} \log_2 \tfrac{2^2}{3/5} = 2.5710,$$
$$H_2(m_2) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{1/5} + \tfrac{3}{5} \log_2 \tfrac{2^2 - 1}{3/5} = 2.3219,$$
$$H_3(m_2) = 2 \times \tfrac{1}{5} \log_2 \tfrac{1}{1/5} + \tfrac{3}{5} \log_2 \tfrac{2^2}{3/5} = 2.5710.$$
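These hand calculations can be verified mechanically. A self-contained sketch (same conventions as the earlier snippets) that reproduces the six values above:

```python
from math import log2

def pl(m, s):
    return sum(v for b, v in m.items() if b & s)

def H(m, q):
    return sum(v * (q(len(s)) + log2(1.0 / pl(m, s))) for s, v in m.items())

Q1 = lambda k: 2 * log2(k)
Q2 = lambda k: log2(2 ** k - 1)
Q3 = lambda k: 0 if k == 1 else k

m1 = {frozenset("a"): 0.2, frozenset("b"): 0.2, frozenset("ab"): 0.6}
m2 = {frozenset("a"): 0.2, frozenset("b"): 0.2, frozenset("cd"): 0.6}
for m in (m1, m2):
    print([round(H(m, q), 4) for q in (Q1, Q2, Q3)])
# [1.3288, 1.0797, 1.3288]
# [2.571, 2.3219, 2.571]
```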
In addition, the uncertainties measured by other methods are shown in Table 2.
Obviously, owing to the differences in the focal elements of these BPAs, the uncertainty degrees of $m_1$ and $m_2$ are different even though the mass values of the two BPAs are the same, and $H(m_1)$ should be less than $H(m_2)$. However, the Deng entropy and the weighted Hartley entropy yield the same result for $m_1$ and $m_2$. The results obtained by the other methods are as expected, although Yager's dissonance entropy does not consider the non-specificity measure. The methods proposed in this paper obtain reasonable results while considering the total uncertainty. Therefore, when the focal elements differ but the mass values are the same, the proposed method can effectively measure the degree of uncertainty.

4.4. Example 4

Let $X = \{a, b, c, d\}$ be a FOD. Two BPAs defined on the FOD are given as follows; note the intersection relationship between the propositions of $m_2$:
$$m_1: \ m_1(\{a,b\}) = 0.4, \quad m_1(\{c,d\}) = 0.6; \qquad m_2: \ m_2(\{a,c\}) = 0.4, \quad m_2(\{b,c\}) = 0.6.$$
The uncertainties are measured by different methods, as shown in Table 3.
For the body of evidence (BOE) $m_1$, the intersection of the two propositions $\{a,b\}$ and $\{c,d\}$ is empty. For BOE $m_2$, the intersection of the two propositions $\{a,c\}$ and $\{b,c\}$ is the single element $c$; however, the mass values of the two BPAs are the same. Based on the above analysis, the uncertainties of the two BOEs should obviously differ. However, according to Table 3, Deng entropy, AU, the weighted Hartley entropy, Yang and Han's measure, and Deng's measure give the same result for $m_1$ and $m_2$. Besides, the uncertainty of BOE $m_2$ calculated by Yager's dissonance entropy is 0, which is clearly wrong: because the two propositions of $m_2$ intersect, each has plausibility 1, and Yager's dissonance entropy thereby loses the distinction between them. On the contrary, the three methods we propose can all distinguish the uncertainty difference between the two BOEs. Therefore, the proposed method can effectively distinguish the uncertainty when there is an intersection relationship between propositions.

4.5. Example 5

Let $X = \{1, 2, \ldots, 15\}$ be a FOD with 15 elements. A BPA defined on the FOD is:
$$m(\{3,4,5\}) = 0.05, \quad m(\{7\}) = 0.05, \quad m(A) = 0.8, \quad m(X) = 0.1,$$
where $A$ is a variable subset of $X$ whose cardinality changes from 1 to 14. This example is adapted from [53].
The results are shown in Table 4, Table 5, and Figure 3. Table 4 covers cardinalities of $A$ from 1 to 7, Table 5 covers cardinalities of $A$ from 8 to 14, and all the results are plotted in Figure 3.
As shown in Figure 3, as the number of elements in $A$ increases, the uncertainty calculated by Yager's dissonance entropy shows a downward trend, which is inconsistent with the connotation of uncertainty. The reason is that as $A$ grows it gradually intersects with the other propositions, and Yager's entropy does not reflect this difference. This suggests that Yager's entropy does not correctly measure the uncertainty of the evidence in this example.
From a “common sense” point of view, the uncertainty of the BPA increases as the number of elements in $A$ increases, and all methods except Yager's dissonance entropy show an increasing trend on the whole; the corresponding values can be found in Table 4 and Table 5. However, it should be noted that when $A$ changes from $\{1,2\}$ to $\{1,2,3\}$, $A$ begins to intersect with the proposition $\{3,4,5\}$; therefore, the change in uncertainty should be slightly smaller than the change from $A = \{1\}$ to $A = \{1,2\}$. From Figure 3, it can be seen that the proposed methods present this change. As for the uncertainty measure of BOEs, as far as we know, there is no reasonable evaluation index at present, and it is not certain that the greater the uncertainty, the better. Nevertheless, when the change trend is consistent with the theoretical connotation of uncertainty, a method can be considered reasonable and effective.

4.6. Example 6

Let $X = \{\theta_1, \theta_2\}$ be a FOD with two elements, and consider the BPA:
$$m(\{\theta_1\}) = a, \quad m(\{\theta_2\}) = b, \quad m(\{\theta_1, \theta_2\}) = 1 - a - b,$$
where $a, b \in [0, 0.5]$. This example is adapted from [53]. We calculate the uncertainty values given by the proposed methods and some existing methods as $a$ and $b$ change; the results are shown in Figure 4.
For the proposed methods $H_1$, $H_2$, and $H_3$, Yang and Han's measure, the weighted Hartley entropy, SU, JS, and Deng's measure ($TU^{EI}$), the maximum uncertainty is obtained at $m(X) = 1$, which is consistent with the maximum entropy property; as the value of $m(X)$ decreases, the uncertainty of the BOE decreases gradually. As for Deng entropy, according to [26], its maximum is attained at $m(\{\theta_1\}) = \frac{2^{|\{\theta_1\}|}-1}{(2^{|\{\theta_1\}|}-1) + (2^{|\{\theta_2\}|}-1) + (2^{|\{\theta_1,\theta_2\}|}-1)} = 0.2$, $m(\{\theta_2\}) = 0.2$, and $m(X) = 0.6$, which is consistent with Figure 4d and does not satisfy the maximum entropy property. For Yager's dissonance entropy, the maximum is obtained at $m(\{\theta_1\}) = 0.5$, $m(\{\theta_2\}) = 0.5$, $m(X) = 0$; as the value of $m(X)$ decreases, the measured uncertainty increases, which is counter-intuitive. Hence, in this example, Yager's dissonance entropy fails to measure the uncertainty of BOEs, and the AU method also clearly fails.
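The maximum entropy behavior of $H_2$ in this example can be checked numerically with a simple grid sweep; a sketch (the closed form of `h2` below follows from $Pl(\{\theta_1\}) = 1-b$, $Pl(\{\theta_2\}) = 1-a$, $Pl(X) = 1$):

```python
import numpy as np

# Example 6: m({θ1}) = a, m({θ2}) = b, m(X) = 1 - a - b, with Q2(2) = log2(3).
def h2(a, b):
    return (a * np.log2(1 / (1 - b)) + b * np.log2(1 / (1 - a))
            + (1 - a - b) * np.log2(3))

grid = [(a, b) for a in np.linspace(0, 0.5, 51) for b in np.linspace(0, 0.5, 51)]
a_best, b_best = max(grid, key=lambda ab: h2(*ab))
print(a_best, b_best, round(h2(a_best, b_best), 4))   # 0.0 0.0 1.585 -> vacuous BPA
```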
The above numerical examples can be summarized as follows. For Example 1, since we give a Bayesian mass function, the uncertainty of the Bayesian BPA grows with the number of elements $n$ in the FOD under the proposed method. This can also be understood from the concept of entropy: as the number of elements of the FOD increases, the degree of “chaos” of the information in this example also increases. The results obtained by all methods are consistent with this understanding, except the weighted Hartley entropy and the measures of Yang and Han and of Deng ($TU^{EI}$). For Example 2, we give a vacuous BPA. Obviously, as the number of elements in the FOD increases, the disorder of the system increases. This holds for all methods except Yager's dissonance entropy, because Yager's dissonance entropy only measures dissonance, not non-specificity. For Examples 3 and 4, the two BPAs in each example are assigned the same mass values but with different propositions. Because the uncertainty of a BPA is related to both its mass values and its propositions, the uncertainty clearly differs depending on whether the propositions are completely disjoint or partially intersecting. Example 5 further illustrates the problem: the degree of intersection between the different propositions of the BPA changes gradually, that is, the belief values for elements in the FOD change gradually, so the degree of confusion in the system and the results of the uncertainty measure change accordingly. Overall, as the number of elements in $A$ increases, the confusion the BPA brings to the system should also increase. For Example 6, different mass values are assigned while the propositions stay the same. Obviously, when $m(\{\theta_1, \theta_2\}) = 1$, the system is in a completely unknown state, so the uncertainty should be at its maximum, which is the maximum entropy.
The proposed method can effectively measure the uncertainty of BOEs in the above examples. However, for the non-specificity part, the $Q$ function is determined based on the maximum entropies of three existing uncertainty measures. Actually, many other entropies could be considered, such as info-entropy [32]; this is a good guide for our future research.

5. Application

In this section, feature evaluation is performed on the Iris dataset to further verify the rationality of the proposed uncertainty measure. The Iris dataset surveys three types of iris plants: “Setosa”, “Versicolour”, and “Virginica”. Sepal length (SL), sepal width (SW), petal length (PL), and petal width (PW) are taken as the four features. With respect to each iris class, each feature of the instances is modeled as a Gaussian distribution with a different mean and standard deviation, as shown in Table 6 and Figure 5.
As shown in Figure 5, PL intuitively has the best class discriminability, which is attributed to the best separation of the Gaussian probability density functions (PDFs) of the three iris types, while the PDFs of the three iris types on SW almost overlap; thus, the class discriminability of SW is the worst.
In addition, the method proposed in [54] is utilized to quantify the discriminability of the different features, as shown below:
$$J = \frac{tr(S_w)}{tr(S_b)},$$
where $tr(\cdot)$ denotes the trace of a matrix, and $S_w$ and $S_b$ are the within-class and between-class scatter matrices, respectively:
$$S_w = \sum_{i=1}^{C} P(C_i)\, E\!\left[\left(X - \frac{1}{N_i}\sum_{X \in C_i} X\right)\left(X - \frac{1}{N_i}\sum_{X \in C_i} X\right)^{T}\right],$$
$$S_b = \sum_{i=1}^{C} P(C_i)\left(\frac{1}{N_i}\sum_{X \in C_i} X - M\right)\left(\frac{1}{N_i}\sum_{X \in C_i} X - M\right)^{T},$$
with
$$M = \frac{1}{C}\sum_{i=1}^{C}\frac{1}{N_i}\sum_{X \in C_i} X,$$
where $X$ is the feature vector of a sample and $M$ is the mean of all class centroids.
The smaller the value of $J$, the better the discriminability of the corresponding feature. For the four features, the $J$ values are:
$$J_{SL} = 0.6163, \quad J_{SW} = 1.5518, \quad J_{PL} = 0.0623, \quad J_{PW} = 0.0766.$$
The above results agree with the intuition obtained from Figure 5. The resulting rank of the four features is $PL \succ PW \succ SL \succ SW$.
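The $J$ values can be reproduced in a few lines; a sketch assuming the copy of the Iris data bundled with scikit-learn and population (biased) variances, in which the traces reduce to scalar variances for a single feature:

```python
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
classes = np.unique(y)
priors = np.array([np.mean(y == c) for c in classes])

for j, name in enumerate(["SL", "SW", "PL", "PW"]):
    x = X[:, j]
    mus = np.array([x[y == c].mean() for c in classes])               # class centroids
    sw = sum(p * x[y == c].var() for p, c in zip(priors, classes))    # tr(S_w), 1-D case
    M = mus.mean()                                                    # mean of centroids
    sb = sum(p * (mu - M) ** 2 for p, mu in zip(priors, mus))         # tr(S_b), 1-D case
    print(name, round(sw / sb, 4))
# Roughly: SL 0.6163, SW 1.5518, PL 0.0623, PW 0.0766
```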
Now, we turn to using uncertainty measures for feature evaluation, including weighted Hartley entropy, AU, Yang and Han’s measure, Deng’s measure, AM, JS, Deng entropy, SU, and the proposed method.
Step 1 (BPA generation). 
For each feature in $\{SL, SW, PL, PW\}$, generate the BPA corresponding to each sample in the dataset according to [55].
Step 2 (Uncertainty measurement of BPAs). 
For each feature, calculate the uncertainty of each BPA using all of the above uncertainty measures.
Step 3 (Average uncertainty measure). 
Calculate the average uncertainty value on each feature for each method (a pipeline sketch follows these steps).
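A minimal end-to-end sketch of Steps 1–3 follows. The BPA generator here is a deliberately simplified stand-in (Gaussian class likelihoods as singleton masses plus a fixed non-specific residual), not the method of [55], so the resulting numbers will differ from Table 7; all helper names are our own:

```python
import numpy as np
from math import log2
from sklearn.datasets import load_iris

def pl(m, s):
    return sum(v for b, v in m.items() if b & s)

def H2(m):
    return sum(v * (log2(2 ** len(s) - 1) + log2(1.0 / pl(m, s))) for s, v in m.items())

def make_bpa(value, mus, sigmas):
    # Toy generator: Gaussian likelihood of each class as singleton mass plus a
    # fixed 10% residual on the whole FOD. NOT the method of [55].
    lik = np.exp(-0.5 * ((value - mus) / sigmas) ** 2) + 1e-12
    lik = 0.9 * lik / lik.sum()
    m = {frozenset([c]): lik[c] for c in range(3)}
    m[frozenset(range(3))] = 0.1
    return m

X, y = load_iris(return_X_y=True)
for j, name in enumerate(["SL", "SW", "PL", "PW"]):
    x = X[:, j]
    mus = np.array([x[y == c].mean() for c in range(3)])
    sig = np.array([x[y == c].std() for c in range(3)])
    print(name, round(np.mean([H2(make_bpa(v, mus, sig)) for v in x]), 4))
# Numbers differ from Table 7 (the BPA generator differs), but PL is expected
# to come out with the smallest average uncertainty.
```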
The results are shown in Table 7 and visualized as histograms in Figure 6.
Features with smaller average uncertainty have better discriminability. Table 7 and Figure 6 show that, for the proposed methods $H_1$, $H_2$, and $H_3$, the average uncertainty on feature PL is the smallest, indicating that PL has the best ability to distinguish the iris types. The resulting ranking of the discriminability of the four features is $PL \succ PW \succ SL \succ SW$, which is consistent with the intuition obtained from Figure 5. The same result is obtained by Deng entropy, AM, SU, JS, Yang and Han's measure, and Deng's measure, but not by the weighted Hartley entropy and AU. Therefore, the application demonstrates the effectiveness of the proposed method.

6. Conclusions

In this work, we proposed a new total uncertainty measure from the perspective of the maximum entropy requirement. The properties of the proposed method are analyzed, such as non-negativity, monotonicity, maximum entropy, and so on. Besides, we give three uncertainty measure functions for bodies of evidence, and analyze the effectiveness and reasonableness of the proposed methods through several numerical examples and an application. These examples show that our methods are in general agreement with the connotation of uncertainty. Compared with Deng entropy, the proposed method can effectively measure the uncertainty of a BPA when propositions with the same mass value intersect, and it satisfies the maximum entropy property. In addition, a larger uncertainty value for a BPA does not by itself make a measure better. How can uncertainty measure methods be evaluated more rationally in DST? Is there a reasonable indicator system for such an evaluation? We hold that when the uncertainty trend of the measured evidence is consistent with the theory, the applied method can be considered reasonable and effective. Our study provides a framework for further studies to assess the performance characteristics of uncertainty functions. Our results are encouraging and should be validated in application areas such as decision-making, fault diagnosis, target recognition, and many others; we will pursue these applications in depth in future work. Beyond that, the maximum entropy argument of Parker and Jeynes showed that the entropy of the supermassive black hole at the centre of the Milky Way can account for the geometrical stability of the galaxy; we believe this is a good guide for our future work on uncertainty measures.

Author Contributions

Conceptualization, Y.Z., X.D. and W.J.; methodology, Y.Z., X.D. and W.J.; software, Y.Z. and F.H.; validation, Y.Z. and F.H.; data curation, Y.Z. and F.H.; writing—original draft preparation, Y.Z.; writing—review and editing, Y.Z. and F.H.; supervision, X.D. and W.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Shaanxi Key Research and Development Program (Grant No.2021ZDLGY01-04), the Innovation Foundation for Doctor Dissertation of Northwestern Polytechnical University (Grant No.CX2021078) and the National Natural Science Foundation of China (Grant No.61671384).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
PT: probability theory
DST: Dempster-Shafer evidence theory
FOD: frame of discernment
BPA: basic probability assignment
BOE: body of evidence

References

  1. Denoeux, T. A k-nearest neighbor classification rule based on Dempster-Shafer theory. IEEE Trans. Syst. Man Cybern. 1995, 25, 804–813. [Google Scholar] [CrossRef] [Green Version]
  2. Xiao, F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion 2019, 46, 23–32. [Google Scholar] [CrossRef]
  3. Farag, W. Kalman-filter-based sensor fusion applied to road-objects detection and tracking for autonomous vehicles. Proc. Inst. Mech. Eng. Part I J. Syst. Control. Eng. 2021, 23, 1125–1138. [Google Scholar]
  4. Liu, X.; Zhou, B.; Huang, P.; Xue, W.; Li, Q.; Zhu, J.; Qiu, L. Kalman Filter-Based Data Fusion of Wi-Fi RTT and PDR for Indoor Localization. IEEE Sens. J. 2021, 21, 8479–8490. [Google Scholar] [CrossRef]
  5. Xiao, F. CaFtR: A Fuzzy Complex Event Processing Method. Int. J. Fuzzy. Syst. 2021. [Google Scholar] [CrossRef]
  6. Sauta, E.; Demartini, A.; Vitali, F.; Riva, A.; Bellazzi, R. A Bayesian data fusion based approach for learning genome-wide transcriptional regulatory networks. BMC Bioinform. 2020, 21, 1–28. [Google Scholar] [CrossRef]
  7. Chen, H.; Maduranga, D.A.K.; Mundra, P.; Zheng, J. Bayesian Data Fusion of Gene Expression and Histone Modification Profiles for Inference of Gene Regulatory Network. IEEE-ACM Trans. Comput. Biol. Bioinform. 2020, 17, 516–525. [Google Scholar] [CrossRef] [PubMed]
  8. Holzinger, A.; Malle, B.; Saranti, A.; Pfeifer, B. Towards multi-modal causability with Graph Neural Networks enabling information fusion for explainable AI. Inf. Fusion 2021, 71, 28–37. [Google Scholar] [CrossRef]
  9. Dempster, A. Upper and Lower Probabilities Induced by a Multivalued Mapping. Ann. Mathmatical Stat. 1967, 38, 325–339. [Google Scholar] [CrossRef]
  10. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976. [Google Scholar]
  11. Lin, Y.; Li, Y.; Yin, X.; Dou, Z. Multisensor Fault Diagnosis Modeling Based on the Evidence Theory. IEEE Trans. Reliab. 2018, 67, 513–521. [Google Scholar] [CrossRef]
  12. Zhang, Y.; Jiang, W.; Deng, X. Fault diagnosis method based on time domain weighted data aggregation and information fusion. Int. J. Distrib.Sens. Netw. 2019, 15. [Google Scholar] [CrossRef]
  13. Liu, Z.; Xiao, F. An Intuitionistic Evidential Method for Weight Determination in FMEA Based on Belief Entropy. Entropy 2019, 21, 211. [Google Scholar] [CrossRef] [Green Version]
  14. Ji, X.; Ren, Y.; Tang, H.; Shi, C.; Xiang, J. An intelligent fault diagnosis approach based on Dempster-Shafer theory for hydraulic valves. Measurement 2020, 165, 108129. [Google Scholar] [CrossRef]
  15. Pisano, R.; Sozzo, S. A Unified Theory of Human Judgements and Decision-Making under Uncertainty. Entropy 2020, 22, 738. [Google Scholar] [CrossRef] [PubMed]
  16. Zhang, H.; Jiang, W.; Deng, X. Data-driven multi-attribute decision-making by combining probability distributions based on compatibility and entropy. Appl. Intell. 2020, 50, 4081–4093. [Google Scholar] [CrossRef]
  17. Seiti, H.; Hafezalkotob, A.; Najafi, S.E.; Khalaj, M. A risk-based fuzzy evidential framework for FMEA analysis under uncertainty: An interval-valued DS approach. J. Int, Fuzzy Syst. 2018, 35, 1419–1430. [Google Scholar] [CrossRef]
  18. Zhang, W.; Deng, Y. Combining conflicting evidence using the DEMATEL method. Soft Comput. 2018. [Google Scholar] [CrossRef]
  19. Li, H.; Xiao, F. A method for combining conflicting evidences with improved distance function and Tsallis entropy. Int. J. Intell. Syst. 2020, 35, 1814–1830. [Google Scholar] [CrossRef]
  20. Liang, H.; Cai, R. A new correlation coefficient of BPA based on generalized information quality. Int. J. Intell. Syst. 2021. [Google Scholar] [CrossRef]
  21. Ni, S.; Lei, Y.; Tang, Y. Improved Base Belief Function-Based Conflict Data Fusion Approach Considering Belief Entropy in the Evidence Theory. Entropy 2020, 22, 801. [Google Scholar] [CrossRef] [PubMed]
  22. Smets, P. The Combination of Evidence in the Transferable Belief Model. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 447–458. [Google Scholar] [CrossRef]
  23. Zhan, J.; Jiang, W. A modified combination rule in generalized evidence theory. Appl. Intell. 2017, 46, 630–640. [Google Scholar]
  24. Wang, J.; Qiao, K.; Zhang, Z. An improvement for combination rule in evidence theory. Futur. Gener. Comp. Syst. 2019, 91, 1–9. [Google Scholar] [CrossRef]
  25. Matsuyama, T. Belief formation from observation and belief integration using virtual belief space in Dempster-Shafer probability model. In Proceedings of the 1994 IEEE International Conference on MFI’94. Multisensor Fusion and Integration for Intelligent Systems 1994, Las Vegas, NV, USA, 2–5 October 1994; pp. 379–386. [Google Scholar] [CrossRef]
  26. Deng, Y. Information Volume of Mass Function. arXiv 2020, arXiv:2012.07507. [Google Scholar]
  27. Zhou, Q.; Deng, Y. Higher order information volume of mass function. Int. J. Comput. Commun. Control 2020, 15. [Google Scholar] [CrossRef]
  28. Xiao, F. CEQD: A complex mass function to predict interference effects. IEEE Trans. Cybern. 2021. [Google Scholar] [CrossRef]
  29. Rényi, A. On Measures of Entropy and Information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability; University of California Press: Berkeley, CA, USA, 1961; Volume 1, pp. 547–561. [Google Scholar]
  30. Shannon, C. A mathematical theory of communication. ACM Sigmobile Mob. Comput. Commun. Rev. 2001, 5, 3–55. [Google Scholar] [CrossRef]
  31. Prigogine, I. The End of Certainty; Free Press: New York, NY, USA, 1997; ISBN 9780684837055. [Google Scholar]
  32. Parker, M.C.; Jeynes, C. Maximum Entropy (Most Likely) Double Helical and Double Logarithmic Spiral Trajectories in Space-Time. Sci. Rep. 2019, 9, 10779. [Google Scholar] [CrossRef]
  33. Zhou, M.; Liu, X.; Yang, J.; Chen, Y.; Wu, J. Evidential reasoning approach with multiple kinds of attributes and entropy-based weight assignment. Knowl. Syst. 2019, 163, 358–375. [Google Scholar] [CrossRef]
  34. Nguyen, H.T. On entropy of random sets and possibility distributions. Anal. Fuzzy Inf. 1987, 1, 145–156. [Google Scholar]
  35. Dubois, D.; Prade, H. Properties of measures of information in evidence and possibility theories. Fuzzy Sets Syst. 1987, 24, 161–182. [Google Scholar] [CrossRef]
  36. Höhle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th International Symposium on Multiple-Valued Logic, Paris, France, 25–27 May 1982. [Google Scholar]
  37. Yager, R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260. [Google Scholar] [CrossRef]
  38. Klir, G.J.; Ramer, A. Uncertainty in the Dempster-Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166. [Google Scholar] [CrossRef]
  39. Klir, G.J.; Parviz, B. A note on the measure of discord. In Uncertainty in Artificial Intelligence: Proceedings of the Eighth Conference, 1992; pp. 138–141. [Google Scholar]
  40. Jousselme, A.L.; Liu, C.; Grenier, D.; Bosse, E. Measuring ambiguity in the evidence theory. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2006, 36, 890–903. [Google Scholar] [CrossRef]
  41. Deng, Y. Deng entropy. Chaos, Solitons Fractals 2016, 91, 549–553. [Google Scholar] [CrossRef]
  42. Yang, Y.; Han, D. A New Distance-Based Total Uncertainty Measure in the Theory of Belief Functions. Know. Based Syst. 2016, 94, 114–123. [Google Scholar] [CrossRef]
  43. Harmanec, D.; Klir, G.J. Measuring total uncertainty in Dempster-Shafer theory: A novel approach. Int. J. Gen. Syst. 1994, 22, 405–419. [Google Scholar] [CrossRef]
  44. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2018, 48, 1672–1688. [Google Scholar] [CrossRef]
  45. Jirousek, R.; Shenoy, P. A new definition of entropy of belief functions in the Dempster-Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65. [Google Scholar] [CrossRef] [Green Version]
  46. Deng, X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. Int. J. Intell. Syst. 2018, 33, 1775–1985. [Google Scholar] [CrossRef]
  47. Abellán, J.; Moral, S. Completing a total uncertainty measure in the Dempster-Shafer theory. Int. J. Gen. Syst. 1999, 28, 299–314. [Google Scholar] [CrossRef]
  48. Yager, R.R. Interval valued entropies for Dempster–Shafer structures. Know. Based Syst. 2018, 161, 390–397. [Google Scholar] [CrossRef]
  49. Xue, Y.; Deng, Y. Interval-valued belief entropies for Dempster-Shafer structures. Soft Comput. 2021, 25, 8063–8071. [Google Scholar] [CrossRef] [PubMed]
  50. Abellán, J.; Masegosa, A. Requirements for total uncertainty measures in Dempster–Shafer theory of evidence. Int. J. Gen. Syst. 2008, 37, 733–747. [Google Scholar] [CrossRef]
  51. Deng, X.; Jiang, W.; Zhang, J. Zero-Sum Matrix Game with Payoffs of Dempster-Shafer Belief Structures and Its Applications on Sensors. Sensors 2017, 17, 922. [Google Scholar] [CrossRef] [Green Version]
  52. Jiang, W.; Hu, W. An improved soft likelihood function for Dempster–Shafer belief structures. Int. J. Intell. Syst. 2018, 33, 1264–1282. [Google Scholar] [CrossRef]
  53. Deng, Y. Uncertainty measure in evidence theory. Sci. China Inf. Sci. 2020, 63, 210201. [Google Scholar] [CrossRef]
  54. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification; John Willey & Sons: Hoboken, NJ, USA, 2012. [Google Scholar]
  55. Masson, M.; Denoeux, T. ECM: An evidential version of the fuzzy c-means algorithm. Pattern Recognit. 2008, 41, 1384–1397. [Google Scholar] [CrossRef]
Figure 1. Comparison results of Example 1.
Figure 2. Comparison results of Example 2.
Figure 3. Comparison results of Example 5.
Figure 4. Comparison results of Example 6. (a) The proposed method $H_1$; (b) the proposed method $H_2$; (c) the proposed method $H_3$; (d) Deng entropy; (e) AU; (f) weighted Hartley entropy; (g) Yang and Han's measure; (h) SU; (i) JS; (j) AM; (k) Deng's measure ($TU^{EI}$); (l) Yager's dissonance entropy.
Figure 5. Probability density functions (PDFs) of different features of samples in the Iris dataset.
Figure 6. Average uncertainty of samples on each feature based on different uncertainty measures.
Table 1. The comparison between the proposed method and some existing methods in Example 2. Each row lists the values for n = 1 through n = 14.
Deng entropy: 0, 1.5850, 2.8074, 3.9069, 4.9542, 5.9773, 6.9887, 7.9944, 8.9972, 9.9986, 10.9993, 11.9996, 12.9998, 13.9999
AU: 0, 1, 1.5850, 2, 2.3219, 2.5850, 2.8074, 3, 3.1699, 3.3219, 3.4594, 3.5850, 3.7004, 3.8074
Weighted Hartley entropy: 0, 1, 1.5850, 2, 2.3219, 2.5850, 2.8074, 3, 3.1699, 3.3219, 3.4594, 3.5850, 3.7004, 3.8074
Yang and Han's measure: 0, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
SU: 0, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
JS: 0, 2, 3.1699, 4, 4.6439, 5.1699, 5.6147, 6, 6.3399, 6.6439, 6.9189, 7.1699, 7.4009, 7.6147
AM: 0, 1, 1.5850, 2, 2.3219, 2.5850, 2.8074, 3, 3.1699, 3.3219, 3.4594, 3.5850, 3.7004, 3.8074
Deng's measure ($TU^{EI}$): 0, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
Yager's dissonance entropy: 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Proposed method $H_1$: 0, 2, 3.1699, 4, 4.6439, 5.1699, 5.6147, 6, 6.3399, 6.6439, 6.9189, 7.1699, 7.4009, 7.6147
Proposed method $H_2$: 0, 1.5850, 2.8074, 3.9069, 4.9542, 5.9773, 6.9887, 7.9944, 8.9972, 9.9986, 10.9993, 11.9996, 12.9998, 13.9999
Proposed method $H_3$: 0, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
Table 2. The uncertainty measured by other methods in Example 3. Each row lists the values for ($m_1$, $m_2$).
Deng entropy: 2.3219, 2.3219
AU: 1, 1.9710
Weighted Hartley entropy: 0.6, 0.6
Yang and Han's measure: 0.4, 0.4394
SU: 1.6, 2.5710
JS: 1.6, 2.4113
AM: 1, 1.9710
Deng's measure ($TU^{EI}$): 0.3586, 0.3877
Yager's dissonance entropy: 0.1288, 1.3710
Table 3. The uncertainty measured by different methods in Example 4. Each row lists the values for ($m_1$, $m_2$).
Deng entropy: 2.5559, 2.5559
AU: 1.9710, 1.9710
Weighted Hartley entropy: 1.0000, 1.0000
Yang and Han's measure: 0.5000, 0.5000
SU: 2.9710, 2.4855
JS: 2.9710, 2.4855
AM: 1.9710, 1.4855
Deng's measure ($TU^{EI}$): 0.5000, 0.5000
Yager's dissonance entropy: 0.9710, 0
Proposed method $H_1$: 2.9710, 2.0000
Proposed method $H_2$: 2.5559, 1.5850
Proposed method $H_3$: 2.9710, 2.0000
Table 4. The comparison between the proposed method and some existing methods in Example 5. Each row lists the values for |A| = 1 through |A| = 7.
Deng entropy: 2.6623, 3.9303, 4.9082, 5.7878, 6.6256, 7.4441, 8.2532
AU: 1.4328, 2.2180, 2.6707, 2.9821, 3.0378, 3.2447, 3.2956
Weighted Hartley entropy: 0.4699, 1.2699, 1.7379, 2.0699, 2.3275, 2.5379, 2.7158
Yang and Han's measure: 0.1246, 0.2216, 0.2749, 0.3283, 0.3816, 0.4349, 0.4867
SU: 4.3583, 5.7797, 6.4096, 7.0394, 7.6693, 8.3716, 8.9394
JS: 3.8322, 4.4789, 4.8870, 5.2250, 5.5200, 5.8059, 6.0425
AM: 1.3461, 2.1037, 2.4623, 2.7011, 2.8762, 3.0684, 3.1083
Deng's measure ($TU^{EI}$): 0.1195, 0.2199, 0.2732, 0.3266, 0.3799, 0.4332, 0.4853
Yager's dissonance entropy: 0.3953, 0.3953, 0.1997, 0.1997, 0.1997, 0.1997, 0.0074
Proposed method $H_1$: 1.3352, 2.9352, 3.6756, 4.3396, 4.8547, 5.2756, 5.4390
Proposed method $H_2$: 2.0357, 3.3036, 4.0860, 4.9656, 5.8035, 6.6219, 7.2387
Proposed method $H_3$: 2.0453, 3.6453, 4.2497, 5.0497, 5.8497, 6.6497, 7.2574
Table 5. The comparison between the proposed method and some existing methods in Example 5 (continued). Each row lists the values for |A| = 8 through |A| = 14.
Deng entropy: 9.0578, 9.8600, 10.6612, 11.4617, 12.2620, 13.0622, 13.8622
AU: 3.4497, 3.5796, 3.6909, 3.7824, 3.8538, 3.8986, 3.9069
Weighted Hartley entropy: 2.8699, 3.0059, 3.1275, 3.2375, 3.3379, 3.4303, 3.5158
Yang and Han's measure: 0.5400, 0.5933, 0.6467, 0.7000, 0.7533, 0.8067, 0.8600
SU: 9.6417, 10.3440, 11.0463, 11.7486, 12.4510, 13.1533, 13.8556
JS: 6.2772, 6.4921, 6.6903, 6.8743, 7.0461, 7.2071, 7.3587
AM: 3.2511, 3.3747, 3.4833, 3.5797, 3.6663, 3.7446, 3.8160
Deng's measure ($TU^{EI}$): 0.5386, 0.5920, 0.6453, 0.6986, 0.7520, 0.8053, 0.8586
Yager's dissonance entropy: 0.0074, 0.0074, 0.0074, 0.0074, 0.0074, 0.0074, 0.0074
Proposed method $H_1$: 5.7473, 6.0192, 6.2624, 6.4824, 6.6832, 6.8680, 7.0390
Proposed method $H_2$: 8.0432, 8.8455, 9.6466, 10.4472, 11.2475, 12.0476, 12.8477
Proposed method $H_3$: 8.0574, 8.8574, 9.6574, 10.4574, 11.2574, 12.0574, 12.8574
Table 6. The mean and standard deviation of each feature. Each row lists the values for (Setosa, Versicolour, Virginica).
SL: mean 5.0060, 5.9360, 6.5880; standard deviation 0.3525, 0.5162, 0.6359
SW: mean 3.4180, 2.7700, 2.9740; standard deviation 0.3810, 0.3138, 0.3225
PL: mean 1.4640, 4.2600, 5.5520; standard deviation 0.1735, 0.4699, 0.5519
PW: mean 0.2440, 1.3260, 2.0260; standard deviation 0.1072, 0.1978, 0.2747
Table 7. The average uncertainty for different features. Each row lists the values for (SL, SW, PL, PW).
Weighted Hartley entropy: 0.6642, 0.6660, 0.5540, 0.6556
Deng entropy: 3.7394, 3.8863, 3.1025, 3.7501
AU: 1.5850, 1.5850, 1.4869, 1.5850
AM: 1.5333, 1.5756, 1.2764, 1.5183
SU: 2.1950, 2.2364, 1.8396, 2.1723
JS: 2.2226, 2.2463, 1.9536, 2.2068
Yang and Han's measure: 0.6072, 0.6227, 0.4787, 0.5986
Deng's measure ($TU^{EI}$): 1.6208, 1.6532, 1.3023, 1.5981
Proposed method $H_1$: 1.7057, 1.7509, 1.3665, 1.6833
Proposed method $H_2$: 1.4729, 1.5178, 1.1709, 1.4542
Proposed method $H_3$: 1.6810, 1.7260, 1.3467, 1.6586

