Article

An Entropy-Based Knowledge Measure for Atanassov’s Intuitionistic Fuzzy Sets and Its Application to Multiple Attribute Decision Making

1
Air and Missile Defense College, Air Force Engineering University, Xi’an 710051, China
2
Aviation Maintenance NCO Academy, Air Force Engineering University, Xinyang 464000, China
*
Author to whom correspondence should be addressed.
Submission received: 22 November 2018 / Revised: 10 December 2018 / Accepted: 16 December 2018 / Published: 17 December 2018
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract: As the complementary concept of intuitionistic fuzzy entropy, the knowledge measure of Atanassov's intuitionistic fuzzy sets (AIFSs) has attracted increasing attention and remains an open topic. The amount of knowledge is important for evaluating intuitionistic fuzzy information. An entropy-based knowledge measure for AIFSs is defined in this paper to quantify the knowledge amount conveyed by AIFSs. An intuitive analysis of the properties of the knowledge amount in AIFSs is put forward to facilitate the introduction of an axiomatic definition of the knowledge measure. We then propose a new knowledge measure based on the entropy-based divergence measure with respect to the difference between the membership degree, the non-membership degree, and the hesitancy degree. The properties of the new knowledge measure are investigated from a mathematical viewpoint. Several examples are used to illustrate the performance of the new knowledge measure. Comparison with several existing entropy and knowledge measures indicates that the proposed knowledge measure has a greater ability to discriminate different AIFSs and is robust in quantifying the knowledge amount of different AIFSs. Lastly, the new knowledge measure is applied to the problem of multiple attribute decision making (MADM) in an intuitionistic fuzzy environment. Two models are presented to determine attribute weights for the cases where information on attribute weights is partially known and completely unknown. After obtaining the attribute weights, we develop a new method to solve intuitionistic fuzzy MADM problems. An example is employed to show the effectiveness of the new MADM method.

1. Introduction

The concept of the fuzzy set [1] was developed by Zadeh to model and process uncertain information more effectively. By assigning a membership degree between 0 and 1 to elements with respect to a set, the fuzzy set can describe states between “belong to” and “not belong to.” Therefore, many kinds of uncertainty that cannot be depicted by classical sets can be well described by fuzzy sets. Since its inception, fuzzy set theory has been applied in many areas such as automatic control, pattern recognition, and decision making [2,3,4,5,6]. To enhance its ability and agility in handling uncertainty, many researchers have been dedicated to extending the concept of the fuzzy set. Atanassov’s intuitionistic fuzzy set (AIFS) [7], as an extension of the fuzzy set, was defined by introducing a hesitancy degree to quantify the gap between 1 and the sum of the membership and non-membership degrees. The hesitancy degree depicts the uncertainty about the membership and non-membership grades, which brings additional capability for dealing with uncertainty in practical applications. Due to this advantage in coping with uncertainty, AIFS theory has attracted attention from researchers [8,9]. Current research on AIFS theory mainly focuses on its mathematical characteristics [10,11,12], its application in decision making [13,14,15,16], and its relation to other uncertainty theories [17,18].
The concept of entropy was first introduced in thermodynamics. The entropy proposed by Boltzmann is a classical entropy, also known as Boltzmann entropy [19]. However, there was no effective method for calculating Boltzmann entropy until Shannon developed an alternative way [20]. Recently, some methods for computing Boltzmann entropy have been developed for specific applications [19,21,22,23]. The entropy of a fuzzy set was first proposed by Zadeh [1] to depict fuzziness. Following Zadeh’s work, De Luca and Termini [24] proposed a probabilistic entropy measure for fuzzy sets. They also put forward axiomatic properties for the fuzzy entropy measure, according to which fuzzy entropy can be defined. Yager [25] proposed an entropy measure for fuzzy sets based on the distance between a fuzzy set and its complement. Yager’s concept was extended by Higashi and Klir [26] to a more general kind of fuzzy complement. Because of its importance in depicting a fuzzy set, the entropy measure of fuzzy sets has developed into an active topic in fuzzy set theory.
Similarly, the entropy measure of AIFSs has also attracted researchers. Burillo and Bustince [27] first presented an entropy measure for AIFSs to quantify their intuitionism. Then, Szmidt and Kacprzyk [28] developed the axioms proposed by Burillo and Bustince [27] and introduced a new entropy measure for AIFSs based on the ratio between the nearer distance and the further distance. Hung and Yang [29] presented axiomatic definitions for the entropy of AIFSs from a probabilistic point of view. It was pointed out by Vlachos and Sergiadis [30] that the entropy of an AIFS should capture both the fuzziness and the intuitionism of the AIFS. Based on this perspective, Vlachos and Sergiadis [30] defined an entropy measure based on the concept of discrimination information and cross entropy between AIFSs. Szmidt et al. [31,32] also insisted that we cannot measure the uncertainty hidden in AIFSs merely by using the entropy measure.
The knowledge measure is usually regarded as the dual measure of entropy. In fact, if an entropy cannot capture all the uncertainty in AIFSs, it is not reasonable to treat knowledge and entropy as dual measures. We hold the perspective that the uncertainty of an AIFS includes both fuzziness and intuitionism. Therefore, the entropy of an AIFS is the combination of fuzzy entropy and intuitionistic entropy. In such a case, the entropy of an AIFS can be called intuitionistic fuzzy entropy, which captures all kinds of uncertainty in an AIFS. This kind of intuitionistic fuzzy entropy is also an uncertainty measure of AIFSs. Thus, the knowledge measure can be regarded as the dual measure of intuitionistic fuzzy entropy or uncertainty. Many attempts were made [32] to cope with the knowledge measure of AIFSs by combining an entropy measure and the hesitancy margin. Nguyen [33] presented a knowledge measure for AIFSs based on the distance between an AIFS and the most uncertain one. Guo [34] also proposed an axiomatic definition for the knowledge measure of AIFSs and developed a new knowledge measure following his axioms. However, we find that Nguyen’s and Guo’s knowledge measures may produce unreasonable results due to the distance measures they use. Therefore, it is desirable to develop a new knowledge measure for AIFSs.
To provide a more effective and reasonable knowledge measure for AIFSs, we revisit the definition of the knowledge measure of AIFSs. In this paper, we define a new knowledge measure based on the two factors affecting the knowledge amount. Our aim is to provide a new technique to measure the knowledge amount conveyed by AIFSs. Based on an intuitive analysis of the properties of the knowledge amount, we present a new axiomatic definition of the knowledge measure, which differs from existing axiomatic definitions in the desired monotonicity. Afterward, a new knowledge measure is put forward together with its properties and related proofs. Comparison with other measures is made to illustrate the performance of the new knowledge measure. To validate its applicability, we apply the new knowledge measure to multiple attribute decision making (MADM) problems in an intuitionistic fuzzy environment. A new method for solving MADM problems under intuitionistic fuzzy conditions is developed based on the new knowledge measure. An illustrative example is presented to show the effectiveness and rationality of the provided method for solving intuitionistic fuzzy MADM problems. The main contribution of this paper lies in the introduction of an effective knowledge measure, which is shown to be robust in distinguishing the knowledge amount conveyed by different AIFSs.
The rest of this paper is organized as follows. A brief introduction to AIFSs is given in Section 2. A new knowledge measure for AIFSs is introduced in Section 3, following the proposed axiomatic definition of the knowledge measure. The properties of the new knowledge measure are also investigated in Section 3. Numerical examples and a comparative analysis are presented in Section 4 to validate the performance of the new knowledge measure. The new knowledge measure is applied to MADM problems in an intuitionistic fuzzy environment in Section 5, where two models for determining attribute weights and a method for solving MADM problems are developed. In Section 5, we also use an example of an MADM problem to verify the effectiveness of the proposed method for solving MADM problems under intuitionistic fuzzy conditions. Conclusions are presented in the last section.

2. Atanassov’s Intuitionistic Fuzzy Sets

The concept of the fuzzy set was introduced by Zadeh in 1965. It is defined as follows.
Definition 1 [1]. 
Let $X = \{x_1, x_2, \ldots, x_n\}$ be the universe of discourse; then a fuzzy set A defined in X can be expressed as follows:
$$A = \{\langle x, \mu_A(x) \rangle \mid x \in X\}$$
where $\mu_A(x): X \to [0,1]$ is the degree of membership of x with respect to A.
Definition 2 [7].
An intuitionistic fuzzy set A in X defined by Atanassov can be written as:
$$A = \{\langle x, \mu_A(x), v_A(x) \rangle \mid x \in X\}$$
where $\mu_A(x): X \to [0,1]$ is named the membership degree and $v_A(x): X \to [0,1]$ is named the non-membership degree. Their sum is no greater than 1, i.e.,
$$0 \le \mu_A(x) + v_A(x) \le 1$$
The hesitancy degree of the element $x \in X$ with respect to the set A can be written as:
$$\pi_A(x) = 1 - \mu_A(x) - v_A(x)$$
It is obvious that $\pi_A(x) \in [0,1]$, $\forall x \in X$.
When $\pi_A(x) = 0$, $\forall x \in X$, the AIFS degenerates into an ordinary fuzzy set.
For clarity, the couple $\langle \mu_A(x), v_A(x) \rangle$ is usually written as $\langle \mu, v \rangle$, which is also called an intuitionistic fuzzy value (IFV).
Definition 3 [7]. 
For two AIFSs A and B defined in X, the following relations can be defined:
(R1)
$A \subseteq B \Leftrightarrow \forall x \in X,\ \mu_A(x) \le \mu_B(x),\ v_A(x) \ge v_B(x)$;
(R2)
$A = B \Leftrightarrow \forall x \in X,\ \mu_A(x) = \mu_B(x),\ v_A(x) = v_B(x)$;
(R3)
$A^C = \{\langle x, v_A(x), \mu_A(x) \rangle \mid x \in X\}$, where $A^C$ is the complement of A.
Definition 4 [7]. 
For two IFVs $a = \langle \mu_a, v_a \rangle$ and $b = \langle \mu_b, v_b \rangle$, the partial order between them is defined as: $a \le b \Leftrightarrow \mu_a \le \mu_b,\ v_a \ge v_b$.
For a linear order of IFVs, Chen and Tan [35] defined the score function of an IFV as $S(a) = \mu_a - v_a$ to rank multiple IFVs. Then Hong and Choi [36] defined an accuracy function $H(a) = \mu_a + v_a$ to measure the accuracy of an IFV. Based on the score and accuracy functions, Xu and Yager [37] developed a linear order relation for IFVs. Given two IFVs $a = \langle \mu_a, v_a \rangle$ and $b = \langle \mu_b, v_b \rangle$, we have the following relations: $S(a) > S(b) \Rightarrow a > b$; $S(a) = S(b),\ H(a) = H(b) \Rightarrow a = b$; $S(a) = S(b),\ H(a) > H(b) \Rightarrow a > b$.
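As a sketch, the Xu–Yager linear order can be implemented directly from the score and accuracy functions; the function names below are ours, for illustration only:

```python
from functools import cmp_to_key

def score(mu, v):
    """Chen-Tan score function S(a) = mu - v."""
    return mu - v

def accuracy(mu, v):
    """Hong-Choi accuracy function H(a) = mu + v."""
    return mu + v

def compare_ifv(a, b):
    """Xu-Yager linear order: higher score wins; ties broken by accuracy."""
    if score(*a) != score(*b):
        return -1 if score(*a) < score(*b) else 1
    if accuracy(*a) != accuracy(*b):
        return -1 if accuracy(*a) < accuracy(*b) else 1
    return 0

# Rank three IFVs (as (mu, v) pairs) from worst to best under the linear order.
ifvs = [(0.5, 0.4), (0.6, 0.2), (0.5, 0.3)]
ranked = sorted(ifvs, key=cmp_to_key(compare_ifv))
# ranked: [(0.5, 0.4), (0.5, 0.3), (0.6, 0.2)]
```

Here the scores are 0.1, 0.2, and 0.4, so the score alone decides the order; the accuracy function only breaks score ties.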

3. A New Knowledge Measure for AIFSs

Let $X = \{x_1, x_2, \ldots, x_n\}$ be the discourse universe. For an AIFS A defined in X, the measure used to quantify its knowledge amount should have some intuitive properties. Rationally, the knowledge measure must be a nonnegative function. The knowledge amount of an AIFS and that of its complement should be equal. When an AIFS degrades into a classical fuzzy set, its knowledge amount increases with the difference between the membership degree and the non-membership degree. Moreover, for an AIFS in which the difference between membership and non-membership grades is fixed, the knowledge amount behaves dually to the hesitancy degree. Since a crisp set provides the maximum knowledge, the knowledge amount reaches its maximum if the AIFS reduces to a crisp set. On the contrary, $\pi_A(x_i) = 1$ for each $i = 1, 2, \ldots, n$ indicates full ignorance, so the knowledge amount reaches its minimum value 0 in such a case. Moreover, under the condition $\mu_A(x_i) = v_A(x_i) \ge 0$, a greater hesitancy degree $\pi_A(x_i)$ implies a smaller knowledge amount.
Having these intuitive properties in mind, we can propose the following axiomatic definition for the knowledge measure of AIFSs.
Definition 5. 
For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, its knowledge measure is a mapping $K: AIFS \to [0,1]$ satisfying the following properties:
(KP1) 
$K(A) = 1$ if and only if A is a crisp set.
(KP2) 
$K(A) = 0$ if and only if $\pi_A(x_i) = 1$, $\forall i \in \{1, 2, \ldots, n\}$.
(KP3) 
$K(A)$ is increasing with $\Delta_A(x_i) = |\mu_A(x_i) - v_A(x_i)|$ and decreasing with $\pi_A(x_i)$, $i = 1, 2, \ldots, n$.
(KP4) 
$K(A^C) = K(A)$.
We note that in References [33,34], the third property of the knowledge measure for AIFSs is stated as: $K(A) \ge K(B)$ if A is less fuzzy than B, i.e., $A \subseteq B$ for $\mu_B(x) \le v_B(x)$, $\forall x \in X$, or $A \supseteq B$ for $\mu_B(x) \ge v_B(x)$, $\forall x \in X$. This property indicates that the knowledge amount behaves dually to the fuzziness of an AIFS, without considering the impact of the hesitancy degree. The condition $A \subseteq B$ with $\mu_B(x_i) \le v_B(x_i)$ indicates $\mu_A(x_i) \le \mu_B(x_i) \le v_B(x_i) \le v_A(x_i)$, and therefore $|\mu_A(x_i) - v_A(x_i)| \ge |\mu_B(x_i) - v_B(x_i)|$. Similarly, given $A \supseteq B$ and $\mu_B(x_i) \ge v_B(x_i)$, we have $\mu_A(x_i) \ge \mu_B(x_i) \ge v_B(x_i) \ge v_A(x_i)$ and hence $|\mu_A(x_i) - v_A(x_i)| \ge |\mu_B(x_i) - v_B(x_i)|$. Therefore, we can say the knowledge measure is decreasing with fuzziness. However, it may be arbitrary to require $K(A) \ge K(B)$ whenever A is less fuzzy than B; only when the hesitancy degree is fixed can we assert $K(A) \ge K(B)$ if A is less fuzzy than B, which is consistent with our proposed property. Moreover, both cases $\mu_A(x_i) \le \mu_B(x_i) \le v_B(x_i) \le v_A(x_i)$ and $\mu_A(x_i) \ge \mu_B(x_i) \ge v_B(x_i) \ge v_A(x_i)$ imply $|\mu_A(x_i) - v_A(x_i)| \ge |\mu_B(x_i) - v_B(x_i)|$, but $|\mu_A(x_i) - v_A(x_i)| \ge |\mu_B(x_i) - v_B(x_i)|$ cannot infer either of these orderings. Generally, it is known that the fuzziness of an AIFS (including classical fuzzy sets) is related to the difference between the membership and non-membership grades. Thus, it is not comprehensive to equate the concept of fuzziness with the inclusion relation.
The above analysis indicates that the amount of knowledge of an AIFS decreases with fuzziness, but there is no determinate relation between them if the hesitancy degree is ignored. The third property in Reference [34] is thus too strict a constraint for defining the knowledge measure. Similarly, the corresponding properties for intuitionistic fuzzy entropy and uncertainty measures [28,33,38] are also incomplete and overly strict because of the absence of the hesitancy degree and the equating of fuzziness with the inclusion relation of AIFSs.
It is known that several divergence measures have been proposed based on Shannon entropy. Kullback–Leibler divergence (K–L divergence) [39] is one of the most popular divergence measures developed from Shannon entropy [20].
Let X be a discrete random variable, and let P1 and P2 be two probability distributions for X. The K–L divergence between P1 and P2 is defined by the equation below [39].
$$D_{KL}(P_1, P_2) = \frac{1}{\ln 2} \sum_j p_{1j} \ln \frac{p_{1j}}{p_{2j}}$$
where $p_{ij}$ is the probability of occurrence of the value $X = x_j$ under the probability distribution $P_i$, $i = 1, 2$. To construct a symmetric divergence measure, the divergence between P1 and P2 can be defined by the formula below.
$$D_{KL}^{S}(P_1, P_2) = D_{KL}(P_1, P_2) + D_{KL}(P_2, P_1) = \frac{1}{\ln 2} \sum_j (p_{1j} - p_{2j}) \ln \frac{p_{1j}}{p_{2j}}$$
In applications, to avoid the undefined case of zero denominator, the symmetric divergence measure can be modified by the equation below.
$$D_{KL}^{SM}(P_1, P_2) = \frac{1}{\ln 2} \sum_j (p_{1j} - p_{2j}) \ln \frac{1 + p_{1j}}{1 + p_{2j}}$$
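For concreteness, the modified symmetric divergence of Equation (7) can be sketched in Python as follows; the function name is ours:

```python
import math

def d_klsm(p1, p2):
    """Modified symmetric K-L divergence (Equation (7)): the '+1' shift
    inside the logarithm avoids an undefined zero denominator."""
    return sum((a - b) * math.log((1 + a) / (1 + b))
               for a, b in zip(p1, p2)) / math.log(2)

# Symmetric, non-negative, and zero exactly when the distributions coincide.
p, q = [0.2, 0.8], [0.5, 0.5]
```

Each summand $(p_{1j} - p_{2j})\ln\frac{1+p_{1j}}{1+p_{2j}}$ is a product of two factors with the same sign, so the measure is non-negative.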
Then, we can construct a knowledge measure for an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$ by measuring the divergence between A and the most uncertain AIFS $U = \{\langle x, 0, 0 \rangle \mid x \in X\}$. Based on the previously mentioned axiomatic definition for the knowledge measure of AIFSs and the modified divergence measure in Equation (7), the knowledge measure can be expressed by the following equation.
$$K_S(A) = \frac{1}{2n \ln 2} \sum_{i=1}^{n} \left[ \Delta_A(x_i) \ln(\Delta_A(x_i) + 1) + (\pi_A(x_i) - 1) \ln \frac{\pi_A(x_i) + 1}{2} \right]$$
where $\Delta_A(x_i) = |\mu_A(x_i) - v_A(x_i)|$, $i = 1, 2, \ldots, n$.
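The measure in Equation (8) is straightforward to compute. Below is a minimal Python sketch; representing an AIFS as a list of (μ, v) pairs is our convention, not the paper's:

```python
import math

def k_s(aifs):
    """Knowledge measure K_S of Equation (8); `aifs` is a list of
    (membership, non-membership) pairs over the universe X."""
    total = 0.0
    for mu, v in aifs:
        delta = abs(mu - v)          # difference of membership grades
        pi = 1.0 - mu - v            # hesitancy degree
        total += delta * math.log(delta + 1) + (pi - 1) * math.log((pi + 1) / 2)
    return total / (2 * len(aifs) * math.log(2))

# Boundary behaviour required by (KP1) and (KP2):
# a crisp element <x, 1, 0> contributes ln 2 + ln 2 = 2 ln 2, giving K_S = 1;
# the fully unknown element <x, 0, 0> contributes 0, giving K_S = 0.
```

For a crisp element the summand equals $2\ln 2$, so the normalization $1/(2n\ln 2)$ makes the measure reach exactly 1; both terms vanish at $\langle x, 0, 0 \rangle$.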
Then, we will prove that the proposed knowledge measure KS(A) satisfies all properties in Definition 5.
Theorem 1.
For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, $K_S(A) = 1$ if and only if A is a crisp set.
Proof. 
(1) For a crisp subset of X, we have $\pi_A(x) = 0$ and $\Delta_A(x) = 1$, $\forall x \in X$. Then we get $K_S(A) = 1$. (2) For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, we have $\Delta_A(x_i) = |\mu_A(x_i) - v_A(x_i)| \in [0,1]$ and $\pi_A(x_i) \in [0,1]$. It then follows that $1 \le \Delta_A(x) + 1 \le 2$, $1/2 \le (\pi_A(x) + 1)/2 \le 1$ and $-1 \le \pi_A(x) - 1 \le 0$.
Considering the function $f(x,y) = x \ln(x+1) + (y-1) \ln \frac{y+1}{2}$ with $x, y \in [0,1]$, we have:
$$\frac{\partial f(x,y)}{\partial x} = \ln(x+1) + \frac{x}{x+1} > 0, \qquad \frac{\partial f(x,y)}{\partial y} = \ln \frac{y+1}{2} + \frac{y-1}{y+1} < 0.$$
This indicates that f(x,y) is strictly increasing in x and strictly decreasing in y. Hence, f(x,y) has a single maximum point (1,0), where $x \ln(x+1) = \ln 2$, $(y-1) \ln \frac{y+1}{2} = \ln 2$ and $f(x,y) = 2 \ln 2$.
Therefore, each part of $K_S(A)$, i.e., $\Delta_A(x_i) \ln(\Delta_A(x_i) + 1)$ and $(\pi_A(x_i) - 1) \ln \frac{\pi_A(x_i) + 1}{2}$, is no greater than $\ln 2$. From $K_S(A) = 1$, we can infer that $\Delta_A(x_i) \ln(\Delta_A(x_i) + 1) = \ln 2$ and $(\pi_A(x_i) - 1) \ln \frac{\pi_A(x_i) + 1}{2} = \ln 2$ for all $x_i \in X$, which indicates that $\pi_A(x_i) = 0$ and $\Delta_A(x_i) = 1$ for all $x_i \in X$.
Therefore, A is a crisp set.
Therefore, we have K S ( A ) = 1 if and only if A is a crisp set. □
Theorem 2.
For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, $K_S(A) = 0$ if and only if $\pi_A(x_i) = 1$ for each $x_i \in X$.
Proof. 
The condition $\pi_A(x_i) = 1$, $\forall x_i \in X$, indicates that $\mu_A(x_i) = v_A(x_i) = 0$, $\forall x_i \in X$. Therefore, we have $\Delta_A(x_i) = 0$, $\forall x_i \in X$. Then $K_S(A) = 0$ can be yielded.
For $x_i \in X$, given $1 \le \Delta_A(x_i) + 1 \le 2$, $1/2 \le (\pi_A(x_i) + 1)/2 \le 1$, and $-1 \le \pi_A(x_i) - 1 \le 0$, we get $\Delta_A(x_i) \ln(\Delta_A(x_i) + 1) \ge 0$ and $(\pi_A(x_i) - 1) \ln \frac{\pi_A(x_i) + 1}{2} \ge 0$. $K_S(A) = 0$ then indicates that $\Delta_A(x_i) \ln(\Delta_A(x_i) + 1) = 0$ and $(\pi_A(x_i) - 1) \ln \frac{\pi_A(x_i) + 1}{2} = 0$ for all $x_i \in X$. Therefore, we have $\pi_A(x_i) = 1$ for each $x_i \in X$.
Then, we can get that K S ( A ) = 0 if and only if π A ( x i ) = 1 for each x i X . □
Theorem 3.
For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, $K_S(A)$ is increasing with $\Delta_A(x_i) = |\mu_A(x_i) - v_A(x_i)|$ and decreasing with $\pi_A(x_i)$, $i = 1, 2, \ldots, n$.
Proof. 
In the proof of Theorem 1, it was pointed out that the function $f(x,y) = x \ln(x+1) + (y-1) \ln \frac{y+1}{2}$ with $x, y \in [0,1]$ is strictly increasing in x and strictly decreasing in y.
Therefore, the summation of all $f(\Delta_A(x_i), \pi_A(x_i))$ is increasing with $\Delta_A(x_i)$ and decreasing with $\pi_A(x_i)$.
It then follows that $K_S(A)$ is increasing with $\Delta_A(x_i) = |\mu_A(x_i) - v_A(x_i)|$ and decreasing with $\pi_A(x_i)$, $i = 1, 2, \ldots, n$. □
Theorem 4.
For an AIFS A defined in $X = \{x_1, x_2, \ldots, x_n\}$, $K_S(A^C) = K_S(A)$.
Proof. 
This is straightforward by the definition of A C and K S . □
Theorems 1–4 illustrate that the proposed mapping $K_S: AIFS \to [0,1]$ satisfies all properties in the axiomatic definition of the knowledge measure. Therefore, $K_S$ is a knowledge measure for AIFSs.
To provide a visual perception of the proposed knowledge measure, we consider an AIFS A defined in X = {x}. Based on the geometric interpretation of AIFSs proposed by Szmidt and Kacprzyk [28], the values of the knowledge amount are projected onto the hyperplane in the unit intuitionistic fuzzy cube, as shown in Figure 1. The change of the knowledge amount according to the distribution of the membership and non-membership grades can be read from this figure. We note that, under the condition $\mu_A = v_A$, the knowledge amount decreases with the hesitancy degree. For a fixed hesitancy degree, the knowledge amount increases with the difference between the membership degree and the non-membership degree. This is consistent with the proposed axiomatic properties of the knowledge measure.

4. Numerical Examples

In this section, the performance of the proposed knowledge measure KS is validated through numerical examples. To illustrate its effectiveness and performance, some existing intuitionistic fuzzy entropy and knowledge measures are adopted for comparison. We therefore first recall some widely used entropy measures for AIFSs.
The entropy measure proposed by Zeng and Li [40] is shown below.
$$E_{ZL}(A) = 1 - \frac{1}{n} \sum_{i=1}^{n} |\mu_A(x_i) - v_A(x_i)|$$
The entropy measure proposed by Burillo and Bustince [27] is shown below.
$$E_{BB}(A) = \frac{1}{n} \sum_{i=1}^{n} \left( 1 - \mu_A(x_i) - v_A(x_i) \right)$$
The entropy measure proposed by Szmidt and Kacprzyk [28] is based on the equation below.
$$E_{SK}(A) = \frac{1}{n} \sum_{i=1}^{n} \frac{\min(\mu_A(x_i), v_A(x_i)) + \pi_A(x_i)}{\max(\mu_A(x_i), v_A(x_i)) + \pi_A(x_i)}$$
The entropy measure proposed by Vlachos and Sergiadis [30] is shown below.
$$E_{VS}(A) = -\frac{1}{n \ln 2} \sum_{i=1}^{n} \left[ \mu_A(x_i) \ln \mu_A(x_i) + v_A(x_i) \ln v_A(x_i) - (1 - \pi_A(x_i)) \ln(1 - \pi_A(x_i)) \right] + \frac{1}{n} \sum_{i=1}^{n} \pi_A(x_i)$$
The entropy measure proposed by Hung and Yang [29] is illustrated below.
$$E_{HC}^{2}(A) = \frac{1}{n} \sum_{i=1}^{n} \left( 1 - (\mu_A(x_i))^2 - (v_A(x_i))^2 - (\pi_A(x_i))^2 \right)$$
The knowledge measure proposed by Szmidt, Kacprzyk, and Bujnowski [32] is shown in the equation below.
$$K_{SKB}(A) = 1 - \frac{1}{2n} \sum_{i=1}^{n} \left[ \frac{\min(\mu_A(x_i), v_A(x_i)) + \pi_A(x_i)}{\max(\mu_A(x_i), v_A(x_i)) + \pi_A(x_i)} + \pi_A(x_i) \right]$$
Nguyen’s knowledge measure [33] equals the following.
$$K_N(A) = \frac{1}{n\sqrt{2}} \sum_{i=1}^{n} \sqrt{(\mu_A(x_i))^2 + (v_A(x_i))^2 + (\mu_A(x_i) + v_A(x_i))^2}$$
Guo’s knowledge measure [34] is based on the equation below.
$$K_G(A) = 1 - \frac{1}{2n} \sum_{i=1}^{n} \left( 1 - |\mu_A(x_i) - v_A(x_i)| \right) \left( 1 + \pi_A(x_i) \right)$$
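For the comparisons that follow, several of the measures above can be sketched in Python; the bodies below are transcriptions of Equations (9), (10), (14), (15), and (16) as reconstructed here, and the function names are ours:

```python
import math

def e_zl(aifs):
    """Zeng-Li entropy, Equation (9)."""
    return 1 - sum(abs(mu - v) for mu, v in aifs) / len(aifs)

def e_bb(aifs):
    """Burillo-Bustince entropy, Equation (10): mean hesitancy degree."""
    return sum(1 - mu - v for mu, v in aifs) / len(aifs)

def k_skb(aifs):
    """Szmidt-Kacprzyk-Bujnowski knowledge measure, Equation (14)."""
    total = 0.0
    for mu, v in aifs:
        pi = 1 - mu - v
        total += (min(mu, v) + pi) / (max(mu, v) + pi) + pi
    return 1 - total / (2 * len(aifs))

def k_n(aifs):
    """Nguyen's knowledge measure, Equation (15)."""
    return sum(math.sqrt(mu**2 + v**2 + (mu + v)**2)
               for mu, v in aifs) / (len(aifs) * math.sqrt(2))

def k_g(aifs):
    """Guo's knowledge measure, Equation (16); note 1 + pi = 2 - mu - v."""
    return 1 - sum((1 - abs(mu - v)) * (2 - mu - v)
                   for mu, v in aifs) / (2 * len(aifs))
```

A crisp set yields 0 under both entropies and 1 under all three knowledge measures, while the fully unknown AIFS <x,0,0> yields knowledge amount 0, matching the boundary cases discussed in the examples.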
Example 1.
Four AIFSs defined in X = {x} are given as: A1 = <x,0.5,0.5>, A2 = <x,0.25,0.25>, A3 = <x,0.25,0.5>, A4 = <x, 0.2,0.3>.
According to the eight existing measures and our proposed knowledge measure, we can calculate the uncertainty degree and the amount of knowledge of these AIFSs. The comparative results are shown in Table 1.
From Table 1, we can see that the entropy measure EZL, ESK, and EVS cannot discriminate the uncertainty of A1 and A2. Since the entropy measure EBB is defined based on the hesitancy degree of AIFSs, it assigns zero uncertainty to A1, which is unreasonable. Moreover, the uncertainty grades of A2 and A4 cannot be distinguished by EBB because their hesitancy degrees are identical. It is also shown that uncertainty grades of A2 and A4 calculated by the entropy measure E2HC are equal to each other.
It is also shown that the knowledge measures KSKB, KN, and KG, together with our proposed measure KS, can discriminate these four AIFSs well from the perspective of the amount of knowledge. For the ranking order of the knowledge amount, we see that the measures KSKB and KG yield the result K(A3) > K(A1) > K(A4) > K(A2). When the knowledge measure KN is applied, we obtain K(A1) > K(A3) > K(A2) > K(A4). Our proposed measure KS leads to the order K(A1) > K(A3) > K(A4) > K(A2). Comparing the AIFSs A1 and A3, we can see that the difference between the membership and non-membership degrees of A1 is less than that of A3, but A3 has a greater hesitancy degree than A1. According to the monotonicity of the knowledge measure proposed in Definition 5, the relative knowledge amounts of A1 and A3 cannot be determined. For A2 and A4, however, A2 has a smaller Δ, i.e., difference between its membership and non-membership grades, and the same hesitancy degree π as A4. Therefore, A4 should convey a greater amount of knowledge than A2. In this respect, the knowledge measure KN is less reasonable than the other three knowledge measures.
This example indicates that our proposed knowledge measure is effective in measuring the knowledge amount. It is competent to reflect the intuitive relation between different AIFSs.
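As a check, computing $K_S$ of Equation (8) for the four AIFSs reproduces the stated order. The sketch below is self-contained; the AIFS-as-pair-list representation is our convention:

```python
import math

def k_s(aifs):
    """Knowledge measure K_S of Equation (8)."""
    total = 0.0
    for mu, v in aifs:
        delta, pi = abs(mu - v), 1 - mu - v
        total += delta * math.log(delta + 1) + (pi - 1) * math.log((pi + 1) / 2)
    return total / (2 * len(aifs) * math.log(2))

A = {"A1": [(0.5, 0.5)], "A2": [(0.25, 0.25)],
     "A3": [(0.25, 0.5)], "A4": [(0.2, 0.3)]}
ranking = sorted(A, key=lambda name: k_s(A[name]), reverse=True)
# ranking: ['A1', 'A3', 'A4', 'A2'], i.e. K(A1) > K(A3) > K(A4) > K(A2)
```

The computed values are roughly 0.5 for A1, 0.29 for A3, 0.11 for A4, and 0.10 for A2, so A4 and A2 are close but still separated.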
Example 2.
Nine AIFSs defined in X = {x} are considered. These AIFSs are given as: A1 = <x,0.7,0.2>, A2 = <x,0.5,0.3>, A3 = <x,0.5,0>, A4 = <x,0.5,0.5>, A5 = <x,0.5,0.4>, A6 = <x,0.6,0.2>, A7 = <x,0.4,0.4>, A8 = <x,1,0>, and A9 = <x,0,0>.
Based on the five widely used entropy measures EZL, EBB, ESK, EVS, and E2HC, the three knowledge measures KSKB, KN, and KG, and our proposed measure KS, we can calculate the uncertainty of these nine AIFSs. The results are shown in Table 2.
We note that the AIFS A9 = &lt;x,0,0&gt; is the most uncertain one, conveying the least amount of knowledge. Therefore, its uncertainty measure should be 1 and its knowledge amount 0. It can be seen that only E2HC fails to produce a reasonable result for the AIFS A9 = &lt;x,0,0&gt;. The AIFS A8 = &lt;x,1,0&gt; is the most certain one, conveying the maximum knowledge amount. Hence, its uncertainty degree is 0 and its knowledge amount is 1, as produced by all measures in Table 2. Moreover, we can see that all entropy measures may produce counter-intuitive results, which are highlighted in bold in Table 2. These unreasonable results show that these entropy measures are not competent in distinguishing different AIFSs. The knowledge measure KSKB assigns the same knowledge amount to the AIFSs A3 = &lt;x,0.5,0&gt; and A4 = &lt;x,0.5,0.5&gt;, which indicates the poorer discriminating ability of KSKB. By the proposed axiomatic definition, we cannot rank the knowledge amounts of A3 and A4, since $\Delta_{A_3}$ and $\pi_{A_3}$ are greater than $\Delta_{A_4}$ and $\pi_{A_4}$, respectively. Comparing A1 and A3, we find that $\Delta_{A_1} = \Delta_{A_3}$ and $\pi_{A_1} < \pi_{A_3}$. Therefore, A1 has a greater knowledge amount than A3, which is yielded by all knowledge measures. The AIFSs A1 and A5 have the same hesitancy degree; the greater difference between the membership and non-membership grades of A1 brings a greater knowledge amount than that of A5, as shown by the results of all knowledge measures. We can rank the knowledge amounts of the AIFSs A2, A6, and A7 in the same way. We note that the AIFSs A4, A7, and A9 have the same Δ, so a smaller hesitancy degree indicates a greater knowledge amount. Thus, the knowledge conveyed by A4, A7, and A9 should be ranked as K(A4) > K(A7) > K(A9). It is shown that all knowledge measures provide this ranking order. This example tells us that our proposed knowledge measure is effective in distinguishing the knowledge amounts of different AIFSs.
Example 3.
We consider an AIFS A defined in X = {6,7,8,9,10}. The AIFS A is defined as:
$$A = \{\langle 6, 0.1, 0.8 \rangle, \langle 7, 0.3, 0.5 \rangle, \langle 8, 0.5, 0.5 \rangle, \langle 9, 0.9, 0 \rangle, \langle 10, 1, 0 \rangle\}.$$
De et al. [41] defined an exponent operation for an AIFS A defined in X. Given a non-negative real number m, $A^m$ is defined as:
$$A^m = \{\langle x, (\mu_A(x))^m, 1 - (1 - v_A(x))^m \rangle \mid x \in X\}$$
Based on the operation in Equation (17), we have:
$$A^{0.5} = \{\langle 6, 0.316, 0.553 \rangle, \langle 7, 0.548, 0.293 \rangle, \langle 8, 0.707, 0.293 \rangle, \langle 9, 0.949, 0 \rangle, \langle 10, 1, 0 \rangle\}$$
$$A^{2} = \{\langle 6, 0.010, 0.960 \rangle, \langle 7, 0.090, 0.750 \rangle, \langle 8, 0.250, 0.750 \rangle, \langle 9, 0.810, 0 \rangle, \langle 10, 1, 0 \rangle\}$$
$$A^{3} = \{\langle 6, 0.001, 0.992 \rangle, \langle 7, 0.027, 0.875 \rangle, \langle 8, 0.125, 0.875 \rangle, \langle 9, 0.729, 0 \rangle, \langle 10, 1, 0 \rangle\}$$
$$A^{4} = \{\langle 6, 0.0001, 0.998 \rangle, \langle 7, 0.008, 0.938 \rangle, \langle 8, 0.062, 0.938 \rangle, \langle 9, 0.656, 0 \rangle, \langle 10, 1, 0 \rangle\}$$
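These powers follow directly from Equation (17); a small sketch (the helper name is ours, and the pair-list encoding of the AIFS is our convention):

```python
def aifs_power(aifs, m):
    """Exponent operation of De et al., Equation (17):
    <x, mu, v>  ->  <x, mu**m, 1 - (1 - v)**m>."""
    return [(mu ** m, 1 - (1 - v) ** m) for mu, v in aifs]

# "LARGE" from Example 3, as (mu, v) pairs over X = {6, 7, 8, 9, 10}.
A = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.5), (0.9, 0.0), (1.0, 0.0)]
A_half = aifs_power(A, 0.5)   # "More or less LARGE"
A_sq = aifs_power(A, 2)       # "Very LARGE"
```

Rounding `A_half` to three decimals reproduces the listed values of the power 0.5, e.g. the first element becomes (0.316, 0.553).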
Considering the characterization analysis on linguistic variables, we can regard the AIFS A as ‘‘LARGE’’ in X. Correspondingly, AIFSs A0.5, A2, A3 and A4 can be regarded as “More or less LARGE”, “Very LARGE”, “Quite very LARGE”, and “Very very LARGE”, respectively.
Intuitively, from A0.5 to A4, the hidden uncertainty decreases and the conveyed knowledge amount increases. Hence, the following relations should hold:
$$E(A^{0.5}) > E(A) > E(A^2) > E(A^3) > E(A^4)$$
$$K(A^{0.5}) < K(A) < K(A^2) < K(A^3) < K(A^4)$$
To make a comparison, the entropy measures EZL, EBB, ESK, EVS, and E2HC and the knowledge measures KSKB, KN, and KG are employed to facilitate the analysis. In Table 3, we present the results obtained by the different measures to facilitate a comparative analysis.
We can note that the AIFS A will be assigned more entropy than the AIFS A0.5 when entropy measures EZL, EBB, and ESK are applied. The ranking orders obtained based on these measures are listed below.
$$E_{ZL}(A) > E_{ZL}(A^{0.5}) > E_{ZL}(A^2) > E_{ZL}(A^3) > E_{ZL}(A^4)$$
$$E_{BB}(A) > E_{BB}(A^{0.5}) > E_{BB}(A^2) > E_{BB}(A^3) > E_{BB}(A^4)$$
$$E_{SK}(A) > E_{SK}(A^{0.5}) > E_{SK}(A^2) > E_{SK}(A^3) > E_{SK}(A^4)$$
It is shown that these ranking orders do not satisfy the intuitive analysis in Equation (18), while the other entropy measures induce the desired results; in this example, E2HC and EVS perform well. This illustrates that the entropy measures EZL, EBB, and ESK are not robust enough to distinguish the uncertainty of AIFSs with linguistic information.
Moreover, the results produced by knowledge measures KSKB, KN, and KG are also not reasonable, which are shown as the equations below.
$$K_{SKB}(A) < K_{SKB}(A^{0.5}) < K_{SKB}(A^2) < K_{SKB}(A^3) < K_{SKB}(A^4)$$
$$K_{N}(A) < K_{N}(A^{0.5}) < K_{N}(A^2) < K_{N}(A^3) < K_{N}(A^4)$$
$$K_{G}(A) < K_{G}(A^{0.5}) < K_{G}(A^2) < K_{G}(A^3) < K_{G}(A^4)$$
However, our proposed knowledge measure KS indicates that:
$$K_S(A^{0.5}) < K_S(A) < K_S(A^2) < K_S(A^3) < K_S(A^4)$$
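A numerical check confirms this ordering. The sketch below reuses the Equation (8) and Equation (17) formulas and is self-contained; the pair-list encoding is our convention:

```python
import math

def k_s(aifs):
    """Knowledge measure K_S of Equation (8)."""
    total = 0.0
    for mu, v in aifs:
        delta, pi = abs(mu - v), 1 - mu - v
        total += delta * math.log(delta + 1) + (pi - 1) * math.log((pi + 1) / 2)
    return total / (2 * len(aifs) * math.log(2))

def aifs_power(aifs, m):
    """Exponent operation of Equation (17)."""
    return [(mu ** m, 1 - (1 - v) ** m) for mu, v in aifs]

# "LARGE" from Example 3 over X = {6, 7, 8, 9, 10}.
A = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.5), (0.9, 0.0), (1.0, 0.0)]
ks_values = [k_s(aifs_power(A, m)) for m in (0.5, 1, 2, 3, 4)]
# ks_values is strictly increasing, matching
# K_S(A^0.5) < K_S(A) < K_S(A^2) < K_S(A^3) < K_S(A^4).
```

The increase from $A^{0.5}$ to $A$ is small (roughly 0.653 versus 0.656), which is exactly the distinction the coarser measures in Table 3 miss.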
Therefore, the knowledge measures KSKB, KN, and KG are not suitable for differentiating the knowledge amount conveyed by these AIFSs. The effectiveness of our proposed knowledge measure KS is indicated by this example once again.
From the above examples, we can conclude that the entropy measures EZL, EBB, ESK, EVS, and E2HC perform poorly because of their lack of robustness and discriminability. Our proposed knowledge measure performs much better than the knowledge measures KSKB, KN, and KG.

5. Application in Solving Intuitionistic Fuzzy MADM

In this section, the proposed knowledge measure is applied to solving multiple attribute decision making (MADM) problems. The MADM problems to be considered can be described as follows.
All alternatives form a set denoted as $G = \{G_1, G_2, \ldots, G_m\}$. The set of all considered attributes is expressed as $A = \{A_1, A_2, \ldots, A_n\}$. The weight vector of the attributes is $w = (w_1, w_2, \ldots, w_n)^T$ with $\sum_{j=1}^{n} w_j = 1$. Due to the limitation of the decision-maker's knowledge and expertise, the evaluation information provided under each attribute is expressed in intuitionistic fuzzy form. The intuitionistic fuzzy decision matrix given by the decision-maker is expressed as:
$$R = \begin{pmatrix} \langle \mu_{11}, v_{11} \rangle & \langle \mu_{12}, v_{12} \rangle & \cdots & \langle \mu_{1n}, v_{1n} \rangle \\ \langle \mu_{21}, v_{21} \rangle & \langle \mu_{22}, v_{22} \rangle & \cdots & \langle \mu_{2n}, v_{2n} \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle \mu_{m1}, v_{m1} \rangle & \langle \mu_{m2}, v_{m2} \rangle & \cdots & \langle \mu_{mn}, v_{mn} \rangle \end{pmatrix}$$
with rows indexed by the alternatives $G_1, G_2, \ldots, G_m$ and columns by the attributes $A_1, A_2, \ldots, A_n$,
where $r_{ij} = \langle \mu_{ij}, v_{ij} \rangle$ is the evaluation result of alternative $G_i$ with respect to attribute $A_j$, provided by the decision-maker in the form of an intuitionistic fuzzy value.
The weights of the attributes are important when solving MADM problems. If the attribute weights are completely known, the MADM problem can be solved by aggregating the intuitionistic fuzzy information under the different attributes and comparing the resulting intuitionistic fuzzy values. In practical applications, however, the attribute weights are usually only partially known or completely unknown [42]. Therefore, the attribute weights must be determined before the MADM problem is solved. The attribute weights can be assigned empirically by experts, but this approach is subjective and may not make sufficient use of the partial information on the attribute weights. We therefore propose a new model to determine the attribute weights based on the proposed knowledge measure. Generally, we hope that the evaluation results of all alternatives under an attribute are distinguishable enough to facilitate decision making. Accordingly, we take the total knowledge amount as the objective function of an optimization problem. Maximizing the sum of the knowledge amounts under all attributes yields the following model.
$$
\max T = \sum_{j=1}^{n} w_j \sum_{i=1}^{m} K_{ij} \quad \text{s.t.} \quad w \in H, \quad \sum_{j=1}^{n} w_j = 1, \quad w_j \geq 0, \ j = 1, 2, \ldots, n
$$
where $H$ is the set of constraints representing the incomplete information on the attribute weights and $K_{ij}$ is the knowledge amount calculated by our proposed knowledge measure, i.e., $K_{ij} = K_S(r_{ij})$.
When the attribute weights are completely unknown, an attribute under which the total knowledge amount is greater should be assigned a larger weight. The attribute weights can therefore be calculated by the following equation.
$$
w_j = \frac{\sum_{i=1}^{m} K_{ij}}{\sum_{j=1}^{n} \sum_{i=1}^{m} K_{ij}} \tag{19}
$$
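As a small illustration of this weighting rule, the following Python sketch normalizes the column sums of a knowledge matrix. The K values below are hypothetical placeholders, since the closed form of KS is defined earlier in the paper:

```python
# Weight determination when attribute weights are completely unknown:
# w_j = (sum_i K_ij) / (sum_j sum_i K_ij).
# The knowledge amounts below are hypothetical placeholders for K_S(r_ij).
K = [
    [0.53, 0.32, 0.25, 0.11],   # alternative G1 under attributes A1..A4
    [0.76, 0.54, 0.39, 0.29],   # alternative G2
    [0.65, 0.50, 0.41, 0.30],   # alternative G3
]

col_totals = [sum(row[j] for row in K) for j in range(len(K[0]))]
w = [t / sum(col_totals) for t in col_totals]
# The weights sum to one, and the attribute with the largest total
# knowledge amount receives the largest weight.
print(w)
```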
Once the attribute weights are obtained, the intuitionistic fuzzy MADM problem can be solved by aggregating the intuitionistic fuzzy information under all attributes. Many weighted aggregation operators have been proposed for integrating intuitionistic fuzzy information. In this section, we use the well-known intuitionistic fuzzy weighted averaging (IFWA) operator [43] to solve MADM problems in an intuitionistic fuzzy environment. The intuitionistic fuzzy information of alternative $G_i$ can be aggregated into an intuitionistic fuzzy value $z_i$, which is expressed as:
$$
z_i = \langle \mu_i, v_i \rangle = \mathrm{IFWA}_w(r_{i1}, r_{i2}, \ldots, r_{in}) = \left\langle 1 - \prod_{j=1}^{n} (1 - \mu_{ij})^{w_j}, \ \prod_{j=1}^{n} (v_{ij})^{w_j} \right\rangle
$$
Then the score function $S(z_i)$ and the accuracy function $H(z_i)$ of $z_i$ $(i = 1, 2, \ldots, m)$ can be calculated as:
$$S(z_i) = \mu_i - v_i,$$
$$H(z_i) = \mu_i + v_i.$$
Lastly, all alternatives can be ranked according to the linear order relation on IFVs based on the score and accuracy functions: alternatives are first compared by their scores, and ties are broken by their accuracies.
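The aggregation-and-ranking procedure above can be sketched as follows. The IFWA closed form and the score/accuracy functions mirror the equations above; the two sample IFV lists are hypothetical:

```python
import math

def ifwa(ifvs, w):
    """IFWA operator: <1 - prod_j (1 - mu_j)^w_j, prod_j nu_j^w_j>."""
    mu = 1.0 - math.prod((1.0 - m) ** wj for (m, _), wj in zip(ifvs, w))
    nu = math.prod(n ** wj for (_, n), wj in zip(ifvs, w))
    return (mu, nu)

def rank_key(z):
    """Linear order on IFVs: score S = mu - nu first, accuracy H = mu + nu
    as the tie-breaker."""
    mu, nu = z
    return (mu - nu, mu + nu)

# Two hypothetical alternatives evaluated under two equally weighted attributes.
z1 = ifwa([(0.6, 0.3), (0.4, 0.5)], [0.5, 0.5])
z2 = ifwa([(0.5, 0.4), (0.5, 0.4)], [0.5, 0.5])
best = max([z1, z2], key=rank_key)
```

Aggregating identical IFVs returns the same IFV (the IFWA operator is idempotent), so z2 ≈ ⟨0.5, 0.4⟩ here.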
Next, an example is used to demonstrate the performance of the proposed method for solving MADM problems in an intuitionistic fuzzy environment.
Example 4.
An investment company will invest a sum of money in a project. Considering the complexity of economic development, it selects five companies as candidates:
  • G1: A cell phone company.
  • G2: A food company.
  • G3: An automobile sales company.
  • G4: A computer company.
  • G5: A TV company.
The investment company evaluates these five companies with respect to four attributes:
  • A1: The investment risk.
  • A2: The capital gain.
  • A3: The social and political impact.
  • A4: The environmental impact.
The evaluation results with intuitionistic fuzzy information are:
$$
R = \begin{array}{c|cccc}
 & A_1 & A_2 & A_3 & A_4 \\ \hline
G_1 & \langle 0.5, 0.4 \rangle & \langle 0.6, 0.3 \rangle & \langle 0.3, 0.6 \rangle & \langle 0.2, 0.7 \rangle \\
G_2 & \langle 0.7, 0.3 \rangle & \langle 0.7, 0.2 \rangle & \langle 0.7, 0.2 \rangle & \langle 0.4, 0.5 \rangle \\
G_3 & \langle 0.6, 0.4 \rangle & \langle 0.5, 0.4 \rangle & \langle 0.5, 0.3 \rangle & \langle 0.6, 0.3 \rangle \\
G_4 & \langle 0.8, 0.1 \rangle & \langle 0.6, 0.3 \rangle & \langle 0.3, 0.4 \rangle & \langle 0.2, 0.6 \rangle \\
G_5 & \langle 0.6, 0.2 \rangle & \langle 0.4, 0.3 \rangle & \langle 0.7, 0.1 \rangle & \langle 0.5, 0.3 \rangle
\end{array}
$$
Case 1. 
The information on the attribute weights is incomplete. The partially known information on the attribute weights is given by the set $H = \{0.15 \le w_1 \le 0.2,\ 0.16 \le w_2 \le 0.18,\ 0.3 \le w_3 \le 0.35,\ 0.3 \le w_4 \le 0.45\}$.
The total knowledge amount under each attribute can be calculated by the equations below.
$$K_1 = \sum_{i=1}^{5} K_{i1} = \sum_{i=1}^{5} K_S(r_{i1}) = 2.5663; \quad K_2 = \sum_{i=1}^{5} K_{i2} = \sum_{i=1}^{5} K_S(r_{i2}) = 2.0436;$$
$$K_3 = \sum_{i=1}^{5} K_{i3} = \sum_{i=1}^{5} K_S(r_{i3}) = 2.0230; \quad K_4 = \sum_{i=1}^{5} K_{i4} = \sum_{i=1}^{5} K_S(r_{i4}) = 2.0872.$$
The optimal model to determine the attribute weights can be constructed as:
$$
\max T = 2.5663 w_1 + 2.0436 w_2 + 2.0230 w_3 + 2.0872 w_4 \quad \text{s.t.} \quad w \in H, \quad \sum_{j=1}^{4} w_j = 1, \quad w_j \geq 0, \ j = 1, 2, 3, 4
$$
Then the attribute weighting vector is obtained as:
w = ( 0.20 , 0.16 , 0.30 , 0.34 ) T
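Since the objective and the constraints are linear, the model above is a small linear program, so the stated weight vector can be reproduced with an off-the-shelf LP solver. A sketch using SciPy (the choice of SciPy's `linprog` is ours; the paper does not prescribe a solver):

```python
from scipy.optimize import linprog

# Total knowledge amounts K_1..K_4 under attributes A1..A4 (from the text).
K = [2.5663, 2.0436, 2.0230, 2.0872]

# Partial weight information H from Case 1, expressed as box bounds.
bounds = [(0.15, 0.20), (0.16, 0.18), (0.30, 0.35), (0.30, 0.45)]

# linprog minimizes, so negate the coefficients to maximize
# T = sum_j w_j * K_j, subject to the weights summing to one.
res = linprog(c=[-k for k in K],
              A_eq=[[1.0, 1.0, 1.0, 1.0]], b_eq=[1.0],
              bounds=bounds, method="highs")
print(res.x)  # ≈ [0.20, 0.16, 0.30, 0.34], matching the weight vector above
```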
Using the IFWA operator, we can aggregate the intuitionistic fuzzy information for each alternative with respect to all attributes. Then we have:
Z1 = <0.3738, 0.5218>, Z2 = <0.6203, 0.2962>, Z3 = <0.5568, 0.3327>,
Z4 = <0.4787, 0.3323>, Z5 = <0.5776, 0.1990>.
Calculating the score values of the integrated intuitionistic fuzzy values, we get:
S(Z1) = −0.1480, S(Z2) = 0.3241, S(Z3) = 0.2240, S(Z4) = 0.1464, S(Z5) = 0.3787.
According to these scores and the linear order relation between IFVs, all alternatives can be ranked in the following order: $G_5 \succ G_2 \succ G_3 \succ G_4 \succ G_1$.
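Case 1 can be reproduced end to end from the decision matrix and the weight vector above; a Python sketch (the IFWA closed form and the score function are as given in Section 5):

```python
import math

# Decision matrix from Example 4: rows = alternatives G1..G5,
# columns = attributes A1..A4, entries = (membership, non-membership).
R = [
    [(0.5, 0.4), (0.6, 0.3), (0.3, 0.6), (0.2, 0.7)],
    [(0.7, 0.3), (0.7, 0.2), (0.7, 0.2), (0.4, 0.5)],
    [(0.6, 0.4), (0.5, 0.4), (0.5, 0.3), (0.6, 0.3)],
    [(0.8, 0.1), (0.6, 0.3), (0.3, 0.4), (0.2, 0.6)],
    [(0.6, 0.2), (0.4, 0.3), (0.7, 0.1), (0.5, 0.3)],
]
w = [0.20, 0.16, 0.30, 0.34]  # Case 1 weight vector

def ifwa(row, w):
    """IFWA operator: <1 - prod_j (1 - mu_j)^w_j, prod_j nu_j^w_j>."""
    mu = 1.0 - math.prod((1.0 - m) ** wj for (m, _), wj in zip(row, w))
    nu = math.prod(n ** wj for (_, n), wj in zip(row, w))
    return (mu, nu)

z = [ifwa(row, w) for row in R]
scores = [mu - nu for mu, nu in z]        # S(z_i) = mu_i - nu_i
order = sorted(range(5), key=lambda i: scores[i], reverse=True)
print([f"G{i + 1}" for i in order])       # ['G5', 'G2', 'G3', 'G4', 'G1']
```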
Using the method proposed in Reference [44] based on $E_M^{1.5}$ and $CE_M^{1.5}$, we can also solve this MADM problem for this case. The weighting vector obtained is $w = (0.19, 0.16, 0.35, 0.30)^T$, and the ranking order of all alternatives is $G_5 \succ G_2 \succ G_3 \succ G_4 \succ G_1$, which is identical to the ranking order obtained by our proposed method. This indicates that the proposed MADM method is reasonable and effective.
Case 2.
There is no information on the attribute weights. Based on the proposed knowledge measure, we can calculate the total knowledge amount of all intuitionistic fuzzy information under each attribute:
$$K_1 = \sum_{i=1}^{5} K_{i1} = \sum_{i=1}^{5} K_S(r_{i1}) = 2.5663; \quad K_2 = \sum_{i=1}^{5} K_{i2} = \sum_{i=1}^{5} K_S(r_{i2}) = 2.0436;$$
$$K_3 = \sum_{i=1}^{5} K_{i3} = \sum_{i=1}^{5} K_S(r_{i3}) = 2.0230; \quad K_4 = \sum_{i=1}^{5} K_{i4} = \sum_{i=1}^{5} K_S(r_{i4}) = 2.0872.$$
The attribute weights can be obtained by Equation (19):
$$w_1 = K_1 \Big/ \sum_{j=1}^{4} K_j = 0.2943; \quad w_2 = K_2 \Big/ \sum_{j=1}^{4} K_j = 0.2344; \quad w_3 = K_3 \Big/ \sum_{j=1}^{4} K_j = 0.2320; \quad w_4 = K_4 \Big/ \sum_{j=1}^{4} K_j = 0.2393.$$
Therefore, the weighting vector is w = ( 0.2943 , 0.2344 , 0.2320 , 0.2393 ) T .
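The normalization in Equation (19) can be checked directly from the knowledge totals above:

```python
# Each attribute weight is its total knowledge amount K_j divided by the
# grand total (values taken from the text).
K = [2.5663, 2.0436, 2.0230, 2.0872]
w = [k / sum(K) for k in K]
print(w)  # ≈ (0.2943, 0.2344, 0.2320, 0.2393), the weighting vector above
```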
Applying these weights and the IFWA operator, we can aggregate the evaluation results of each alternative across all attributes. The integrated result for each alternative is:
Z1 = <0.4259, 0.4697>, Z2 = <0.6459, 0.2806>, Z3 = <0.5561, 0.3493>,
Z4 = <0.5616, 0.2740>, Z5 = <0.5660, 0.2064>.
The scores of the intuitionistic fuzzy evaluation information of all alternatives can be obtained as:
S(Z1) = −0.0438, S(Z2) = 0.3562, S(Z3) = 0.2069, S(Z4) = 0.2876, S(Z5) = 0.3596.
All alternatives can be ranked in the following order: $G_5 \succ G_2 \succ G_4 \succ G_3 \succ G_1$.
For comparative analysis, we can also solve this decision-making problem with Xia and Xu's method based on $E_M^{1.5}$ and $CE_M^{1.5}$ [44]. The resulting weighting vector is $w = (0.2659, 0.2486, 0.2370, 0.2486)^T$, and the ranking order of the five alternatives is $G_5 \succ G_2 \succ G_3 \succ G_4 \succ G_1$. Both our proposed method based on KS and the method in Reference [44] take G5 as the best choice for investment. Even though the order between G3 and G4 obtained by our method differs from that obtained by Xia and Xu's method, this difference has no effect on choosing the best company to invest in. In fact, the solution of an MADM problem only concerns the best alternative; the order of the other alternatives is beyond its ultimate goal.
This example demonstrates that the proposed methods for solving MADM problems are capable of producing reasonable results. Compared with Xia and Xu's method, our proposed optimization model is simpler, which reduces the computational burden. Moreover, our knowledge measure is more concise than the cross-entropy and entropy measures defined in Reference [44]. Thus, the superiority of the new knowledge measure is verified once more.

6. Conclusions

In this paper, the definition of a knowledge measure for AIFSs is addressed. An axiomatic definition of the knowledge measure is first proposed based on an intuitive analysis of the knowledge amount. We then propose a new knowledge measure for AIFSs and investigate its properties from a mathematical viewpoint. Numerical examples illustrate the performance of the new knowledge measure. Comparative analysis of the experimental results indicates that the proposed knowledge measure is effective in discriminating the knowledge amount of different AIFSs. Unlike other existing measures, which may lead to unreasonable results, the proposed knowledge measure is robust in quantifying the knowledge amount of different kinds of AIFSs. Based on the new knowledge measure, we propose two methods for determining attribute weights in intuitionistic fuzzy MADM problems, for the cases in which the information on attribute weights is partially known and completely unknown, respectively. Following the determination of attribute weights, we develop a new method to solve the MADM problem. To verify the effectiveness and rationality of the proposed models and the MADM method, we use an MADM problem as an example for comparison. The results show that the proposed method obtains reasonable results, and that it has advantages in reducing the computational burden and easing implementation.
We must point out that the proposed knowledge measure is neither the only effective knowledge measure for AIFSs nor necessarily the most effective one. In the future, further investigation into the definition of more effective knowledge measures should be carried out. Moreover, the deeper relation between the knowledge amount and entropy is not investigated in this paper. Although we claim that the membership and non-membership grades influence the knowledge amount, the mechanism behind this influence has not been uncovered. These are also areas to focus on in future research.

Author Contributions

Conceptualization, G.W. and Y.S. Methodology, J.Z. and Y.S. Validation, Y.S. and G.W. Writing—Original Draft Preparation, J.Z. Writing—Review and Editing, Y.S. Funding Acquisition, Y.S.

Funding

This research was funded by the National Natural Science Foundation of China (grant numbers 61703426, 61876189, 61806219, 61273275, and 61503407) and the China Postdoctoral Science Foundation (grant number 2018M633680).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control. 1965, 8, 338–356. [Google Scholar] [CrossRef]
  2. Bogiatzis, A.C.; Papadopoulos, B.K. Producing fuzzy inclusion and entropy measures and their application on global image thresholding. Evol. Syst. 2018, 9, 331–353. [Google Scholar] [CrossRef]
  3. Merigó, J.M.; Gil-Lafuente, A.M.; Yager, R.R. An overview of fuzzy research with bibliometric indicators. Appl. Soft Comput. 2015, 27, 420–433. [Google Scholar] [CrossRef]
  4. Blanco-Mesa, F.; Merigó, J.M.; Gil-Lafuente, A.M. Fuzzy decision making: A bibliometric-based review. J. Intell. Fuzzy Syst. 2017, 32, 2033–2050. [Google Scholar] [CrossRef] [Green Version]
  5. Pozna, C.; Minculete, N.; Precup, R.E.; Kóczy, L.T.; Ballagi, Á. Signatures: Definitions, operators and applications to fuzzy modeling. Fuzzy Sets Syst. 2012, 201, 86–104. [Google Scholar] [CrossRef]
  6. Jankowski, J.; Kazienko, P.; Watróbski, J.; Lewandowska, A.; Ziemba, P.; Zioło, M. Fuzzy multi-objective modeling of effectiveness and user experience in online advertising. Exp. Syst. Appl. 2016, 65, 315–331. [Google Scholar] [CrossRef]
  7. Atanassov, K. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  8. Yu, D.; Liao, H.C. Visualization and quantitative research on intuitionistic fuzzy studies. J. Intell. Fuzzy Syst. 2016, 30, 3653–3663. [Google Scholar] [CrossRef]
  9. Yu, D.; Shi, S. Researching the development of Atanassov intuitionistic fuzzy set: Using a citation network analysis. Appl. Soft Comput. 2015, 32, 189–198. [Google Scholar] [CrossRef]
  10. Chen, S.M.; Chang, C.H. A novel similarity measure between Atanassov’s intuitionistic fuzzy sets based on transformation techniques with applications to pattern recognition. Inf. Sci. 2015, 291, 96–114. [Google Scholar] [CrossRef]
  11. Song, Y.; Wang, X.; Lei, L.; Xue, A. A novel similarity measure on intuitionistic fuzzy sets with its applications. Appl. Intell. 2015, 42, 252–261. [Google Scholar] [CrossRef]
  12. Song, Y.; Wang, X.; Quan, W.; Huang, W. A new approach to construct similarity measure for intuitionistic fuzzy sets. Soft Comput. 2017, 1–14. [Google Scholar] [CrossRef]
  13. Li, D.F.; Wang, Y.C.; Liu, S.; Shan, F. Fractional programming methodology for multi-attribute group decision-making using IFS. Appl. Soft Comput. 2009, 9, 219–225. [Google Scholar] [CrossRef]
  14. Yager, R.R. An intuitionistic view of the Dempster–Shafer belief structure. Soft Comput. 2014, 18, 2091–2099. [Google Scholar] [CrossRef]
  15. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2018, 48, 1672–1688. [Google Scholar] [CrossRef]
  16. Fan, C.; Song, Y.; Fu, Q.; Lei, L.; Wang, X. New operators for aggregating intuitionistic fuzzy information with their application in decision making. IEEE Access. 2018, 6, 27214–27238. [Google Scholar] [CrossRef]
  17. Fan, C.; Song, Y.; Lei, L.; Wang, X.; Bai, S. Evidence reasoning for temporal uncertain information based on relative reliability evaluation. Exp. Syst. Appl. 2018, 113, 264–276. [Google Scholar] [CrossRef]
  18. Song, Y.; Wang, X.; Zhu, J.; Lei, L. Sensor dynamic reliability evaluation based on evidence and intuitionistic fuzzy sets. Appl. Intell. 2018, 48, 3950–3962. [Google Scholar] [CrossRef]
  19. Gao, P.C.; Zhang, H.; Li, Z.L. An efficient analytical method for computing the Boltzmann entropy of a landscape gradient. Trans. GIS 2018, 22, 1046–1063. [Google Scholar] [CrossRef]
  20. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  21. Cushman, S.A. Calculating the configurational entropy of a landscape mosaic. Landsc. Ecol. 2016, 31, 481–489. [Google Scholar] [CrossRef]
  22. Cushman, S.A. Calculation of configurational entropy in complex landscapes. Entropy 2018, 20, 298. [Google Scholar] [CrossRef]
  23. Gao, P.C.; Zhang, H.; Li, Z.L. A hierarchy-based solution to calculate the configurational entropy of landscape gradients. Landsc. Ecol. 2017, 32, 1133–1146. [Google Scholar] [CrossRef]
  24. De Luca, A.; Termini, S. A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control. 1972, 20, 301–312. [Google Scholar] [CrossRef]
  25. Yager, R.R. On the measure of fuzziness and negation, part 1: Membership in the unit interval. Int. J. Gen. Syst. 1979, 5, 221–229. [Google Scholar] [CrossRef]
  26. Higashi, M.; Klir, G. On measures of fuzziness and fuzzy complements. Int. J. Gen. Syst. 1982, 8, 169–180. [Google Scholar] [CrossRef]
  27. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy Sets. Fuzzy Sets Syst. 1996, 78, 305–316. [Google Scholar] [CrossRef]
  28. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477. [Google Scholar] [CrossRef]
  29. Hung, W.L.; Yang, M.S. Fuzzy entropy on intuitionistic fuzzy sets. Int. J. Intell. Syst. 2006, 21, 443–451. [Google Scholar] [CrossRef]
  30. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information-applications to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206. [Google Scholar] [CrossRef]
  31. Szmidt, E.; Kacprzyk, J.; Bujnowski, P. On some measures of information and knowledge for intuitionistic fuzzy sets. Notes IFS 2010, 16, 1–11. [Google Scholar]
  32. Szmidt, E.; Kacprzyk, J.; Bujnowski, P. How to measure the amount of knowledge conveyed by Atanassov’s intuitionistic fuzzy sets. Inf. Sci. 2014, 257, 276–285. [Google Scholar] [CrossRef]
  33. Nguyen, H. A new knowledge-based measure for intuitionistic fuzzy sets and its application in multiple attribute group decision making. Expert. Syst. Appl. 2015, 42, 8766–8774. [Google Scholar] [CrossRef]
  34. Guo, K. Knowledge measure for Atanassov’s intuitionistic fuzzy sets. IEEE Trans. Fuzzy Syst. 2016, 24, 1072–1078. [Google Scholar] [CrossRef]
  35. Chen, S.M.; Tan, J.M. Handling multicriteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. 1994, 67, 163–172. [Google Scholar] [CrossRef]
  36. Hong, D.H.; Choi, C.H. Multicriteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. 2000, 114, 103–113. [Google Scholar] [CrossRef]
  37. Xu, Z.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. 2006, 35, 417–433. [Google Scholar] [CrossRef]
  38. Guo, K.; Song, Q. On the entropy for Atanassov’s intuitionistic fuzzy sets: An interpretation from the perspective of amount of knowledge. Appl. Soft Comput. 2014, 24, 328–340. [Google Scholar] [CrossRef]
  39. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Statist. 1951, 22, 79–86. [Google Scholar] [CrossRef]
  40. Zeng, W.Y.; Li, H.X. Relationship between similarity measure and entropy of interval-valued fuzzy sets. Fuzzy Sets Syst. 2006, 157, 1477–1484. [Google Scholar] [CrossRef]
  41. De, S.K.; Biswas, R.; Roy, A.R. Some operations on intuitionistic fuzzy sets. Fuzzy Sets Syst. 2000, 114, 477–484. [Google Scholar] [CrossRef]
  42. Wei, G.W. Maximizing deviation method for multiple attribute decision making in intuitionistic fuzzy setting. Knowl.-Based Syst. 2008, 21, 833–836. [Google Scholar] [CrossRef]
  43. Xu, Z. Intuitionistic fuzzy aggregation operators. IEEE Trans. Fuzzy Syst. 2007, 15, 1179–1187. [Google Scholar]
  44. Xia, M.; Xu, Z. Entropy/cross entropy-based group decision making under intuitionistic fuzzy environment. Inf. Fusion. 2012, 13, 31–47. [Google Scholar] [CrossRef]
Figure 1. The value of the knowledge measure KS (A).
Table 1. Comparison results of Example 1 (counter-intuitive results are in bold type).
| AIFS | EZL | EBB | ESK | EVS | E2HC | KSKB | KN | KG | KS |
|---|---|---|---|---|---|---|---|---|---|
| A1 = <x, 0.5, 0.5> | 1 | 0 | 1 | 1 | 0.5 | 0.5 | 0.8660 | 0.5000 | 0.5 |
| A2 = <x, 0.25, 0.25> | 1 | 0.5 | 1 | 1 | 0.6250 | 0.25 | 0.4330 | 0.2500 | 0.1038 |
| A3 = <x, 0.25, 0.5> | 0.7500 | 0.2500 | 0.6667 | 0.9387 | 0.6250 | 0.5417 | 0.7089 | 0.5313 | 0.2945 |
| A4 = <x, 0.2, 0.3> | 0.9 | 0.5 | 0.8750 | 0.9855 | 0.6200 | 0.3125 | 0.4259 | 0.3250 | 0.1106 |
Table 2. Comparison results of Example 2 (counter-intuitive results are in bold type).

| AIFS | EZL | EBB | ESK | EVS | E2HC | KSKB | KN | KG | KS |
|---|---|---|---|---|---|---|---|---|---|
| A1 = <x, 0.7, 0.2> | 0.5 | 0.1 | 0.3750 | 0.7878 | 0.4600 | 0.7625 | 0.8185 | 0.7250 | 0.5344 |
| A2 = <x, 0.5, 0.3> | 0.8 | 0.2 | 0.7143 | 0.9635 | 0.6200 | 0.5429 | 0.7000 | 0.5200 | 0.3211 |
| A3 = <x, 0.5, 0> | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5000 | 0.5000 | 0.6250 | 0.2500 |
| A4 = <x, 0.5, 0.5> | 1 | 0 | 1 | 1 | 0.5 | 0.5000 | 0.8660 | 0.5000 | 0.5000 |
| A5 = <x, 0.5, 0.4> | 0.9 | 0.1 | 0.8333 | 0.992 | 0.58 | 0.5333 | 0.7810 | 0.5050 | 0.3950 |
| A6 = <x, 0.6, 0.2> | 0.6 | 0.2 | 0.5 | 0.849 | 0.56 | 0.6500 | 0.7211 | 0.6400 | 0.3919 |
| A7 = <x, 0.4, 0.4> | 1 | 0.2 | 1 | 1 | 0.64 | 0.4000 | 0.6928 | 0.4000 | 0.2948 |
| A8 = <x, 1, 0> | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 |
| A9 = <x, 0, 0> | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 |
Table 3. Comparative results of all AIFSs with respect to B (counter-intuitive results are in bold type).

| AIFS | EZL | EBB | ESK | EVS | E2HC | KSKB | KN | KG | KS |
|---|---|---|---|---|---|---|---|---|---|
| A0.5 | 0.4291 | 0.0683 | 0.3518 | 0.5640 | 0.3355 | 0.7899 | 0.8680 | 0.7633 | 0.6532 |
| A | 0.4400 | 0.0800 | 0.4073 | 0.5233 | 0.3280 | 0.7563 | 0.8641 | 0.7600 | 0.6564 |
| A2 | 0.2160 | 0.0760 | 0.1677 | 0.3369 | 0.2891 | 0.8782 | 0.8950 | 0.8828 | 0.7579 |
| A3 | 0.1364 | 0.0752 | 0.1101 | 0.2212 | 0.2602 | 0.9074 | 0.9108 | 0.9230 | 0.8157 |
| A4 | 0.1082 | 0.0800 | 0.0950 | 0.1612 | 0.2397 | 0.9125 | 0.9133 | 0.9337 | 0.8395 |

Wang, G.; Zhang, J.; Song, Y.; Li, Q. An Entropy-Based Knowledge Measure for Atanassov’s Intuitionistic Fuzzy Sets and Its Application to Multiple Attribute Decision Making. Entropy 2018, 20, 981. https://0-doi-org.brum.beds.ac.uk/10.3390/e20120981
