Entropy measures for Atanassov intuitionistic fuzzy sets based on divergence
In the literature, there are two different approaches to defining the entropy of Atanassov intuitionistic fuzzy sets (AIFS, for short). The first approach, given by Szmidt and Kacprzyk, measures how far an AIFS is from its closest crisp set, while the second approach, given by Burillo and Bustince, measures how far an AIFS is from its closest fuzzy set. On the other hand, divergence measures are functions that quantify how different two AIFSs are. Our work generalizes both types of entropy using local measures of divergence. This yields at least two benefits: depending on the application, one may choose from a wide variety of entropy measures, and the local nature provides a natural way to compute entropy in parallel, which is important for large data sets. In this context, we provide the necessary and sufficient conditions for defining entropy measures under both frameworks using divergence measures for AIFS. We show that the usual examples of entropy measures can be obtained as particular cases of our more general framework. We also investigate the connection between knowledge measures and divergence measures. Finally, we apply our results to a multi-attribute decision-making problem to obtain the weights of the experts.
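As an illustration of the first approach mentioned above (not the paper's general divergence-based construction), the classical Szmidt-Kacprzyk entropy can be sketched as follows. Each element of an AIFS is a pair (mu, nu) of membership and non-membership degrees with mu + nu <= 1, and pi = 1 - mu - nu is the hesitation margin; the entropy is the mean, over elements, of (min(mu, nu) + pi) / (max(mu, nu) + pi).

```python
def sk_entropy(aifs):
    """Szmidt-Kacprzyk entropy of an AIFS given as a list of (mu, nu) pairs.

    Returns the mean of (min(mu, nu) + pi) / (max(mu, nu) + pi),
    where pi = 1 - mu - nu is the hesitation margin.
    """
    total = 0.0
    for mu, nu in aifs:
        pi = 1.0 - mu - nu
        total += (min(mu, nu) + pi) / (max(mu, nu) + pi)
    return total / len(aifs)

# A crisp set (every element is (1, 0) or (0, 1)) has entropy 0,
# while maximal uncertainty (mu = nu at every element) gives entropy 1.
print(sk_entropy([(1.0, 0.0), (0.0, 1.0)]))  # -> 0.0
print(sk_entropy([(0.5, 0.5), (0.0, 0.0)]))  # -> 1.0
```

Because the entropy is a per-element average, the sum can be split across workers and recombined, which is the kind of local, parallel-friendly structure the abstract refers to.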
Montes, Ignacio; Pal, Nikhil R.; and Montes, Susana, "Entropy measures for Atanassov intuitionistic fuzzy sets based on divergence" (2018). Journal Articles. 1297.