Entropy measures for Atanassov intuitionistic fuzzy sets based on divergence

Article Type

Research Article

Publication Title

Soft Computing

Abstract

In the literature, there are two different approaches to defining the entropy of Atanassov intuitionistic fuzzy sets (AIFS, for short). The first approach, due to Szmidt and Kacprzyk, measures how far an AIFS is from its closest crisp set, while the second approach, due to Burillo and Bustince, measures how far an AIFS is from its closest fuzzy set. On the other hand, divergence measures are functions that quantify how different two AIFSs are. Our work generalizes both types of entropies using local measures of divergence. This brings at least two benefits: depending on the application, one may choose from a wide variety of entropy measures, and the local nature provides a natural way to compute entropy in parallel, which is important for large data sets. In this context, we provide necessary and sufficient conditions for defining entropy measures under both frameworks using divergence measures for AIFS. We show that the usual examples of entropy measures can be obtained as particular cases of our more general framework. We also investigate the connection between knowledge measures and divergence measures. Finally, we apply our results to a multi-attribute decision-making problem to obtain the weights of the experts.
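To illustrate the two notions of entropy mentioned in the abstract, the following is a minimal sketch in Python. It assumes a simple Hamming-style local divergence between AIFS elements and two illustrative choices of "closest" element; the function names and the particular divergence are hypothetical and are not the measures defined in the paper.

```python
# Illustrative sketch only: divergence-based entropies for an AIFS given as
# (membership, non-membership) pairs. The specific divergence and the choice
# of "closest" crisp/fuzzy element below are assumptions for demonstration,
# not the constructions proposed in the article.

def hamming_div(a, b):
    """Local Hamming-style divergence between two AIFS elements (mu, nu)."""
    pa, pb = 1 - a[0] - a[1], 1 - b[0] - b[1]  # hesitation degrees
    return 0.5 * (abs(a[0] - b[0]) + abs(a[1] - b[1]) + abs(pa - pb))

def entropy_crisp(aifs):
    """Szmidt-Kacprzyk style: average divergence from the nearest crisp element."""
    total = 0.0
    for mu, nu in aifs:
        crisp = (1.0, 0.0) if mu >= nu else (0.0, 1.0)  # nearest crisp element
        total += hamming_div((mu, nu), crisp)
    return total / len(aifs)

def entropy_fuzzy(aifs):
    """Burillo-Bustince style: average divergence from a fuzzy element
    (same membership, zero hesitation), which here reduces to the mean hesitation."""
    total = 0.0
    for mu, nu in aifs:
        fuzzy = (mu, 1.0 - mu)  # hesitation is zero for this element
        total += hamming_div((mu, nu), fuzzy)
    return total / len(aifs)

# Example: a small AIFS over three alternatives.
A = [(0.5, 0.3), (0.2, 0.2), (0.9, 0.05)]
print(entropy_crisp(A), entropy_fuzzy(A))
```

Because both entropies are built from a local divergence evaluated element by element, the per-element terms can be computed independently and summed, which is the parallelization property the abstract refers to.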

First Page

5051

Last Page

5071

DOI

10.1007/s00500-018-3318-3

Publication Date

8-1-2018

Comments

All Open Access, Green

