A Scale-Invariant Generalization of the Rényi Entropy, Associated Divergences and Their Optimizations under Tsallis' Nonextensive Framework

Article Type

Research Article

Publication Title

IEEE Transactions on Information Theory

Abstract

Entropy and relative or cross entropy measures are two fundamental concepts in information theory and are widely used for statistical inference across disciplines. The related optimization problems, in particular the maximization of the entropy and the minimization of the cross entropy or relative entropy (divergence), are essential for general logical inference in our physical world. In this paper, we discuss a two-parameter generalization of the popular Rényi entropy and associated optimization problems. We derive the desired entropic characteristics of the new generalized entropy measure, including its positivity, expandability, extensivity and generalized (sub-)additivity. More importantly, when considered over the class of sub-probabilities, our new family turns out to be scale-invariant; this property does not hold for most existing generalized entropy measures. We also propose the corresponding cross entropy and relative entropy measures and discuss their geometric properties, including generalized Pythagorean results over β-convex sets. The maximization of the new entropy and the minimization of the corresponding cross or relative entropy measures are carried out explicitly under the non-extensive ('third-choice') constraints given by Tsallis' normalized q-expectations, which also correspond to the β-linear family of probability distributions. Important properties of the associated forward and reverse projection rules are discussed along with their existence and uniqueness. In this context, we identify, for the first time, a class of entropy measures, a subfamily of our two-parameter generalization, that leads to the classical (extensive) exponential family of MaxEnt distributions under the non-extensive constraints; this discovery is illustrated through the useful concept of escort distributions and can potentially be important for future research in information theory.
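As a quick numerical illustration of the quantities named above (a sketch, not code from the paper; the function names are illustrative), the following computes the Rényi entropy, the escort distribution, and Tsallis' normalized q-expectation for a finite probability vector:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def escort(p, q):
    """Escort distribution P_i = p_i^q / sum_j p_j^q (Tsallis framework)."""
    w = p ** q
    return w / w.sum()

def normalized_q_expectation(x, p, q):
    """Tsallis' normalized q-expectation: the ordinary expectation of x
    taken under the escort distribution of p."""
    return float(np.dot(escort(p, q), x))

p = np.array([0.5, 0.3, 0.2])
x = np.array([1.0, 2.0, 3.0])
# At q = 1 the escort distribution is p itself, so the normalized
# q-expectation reduces to the ordinary expectation 0.5 + 0.6 + 0.6 = 1.7.
print(normalized_q_expectation(x, p, 1.0))
```

At q = 1 the 'third-choice' constraint coincides with the usual linear expectation constraint, which is why the escort construction bridges the extensive and non-extensive settings.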
Other members of the new entropy family, however, lead to the power-law type generalized q-exponential MaxEnt distributions, in conformity with Tsallis' nonextensive theory. Our new family therefore provides a wide range of entropy and associated measures, combining both the extensive and nonextensive MaxEnt theories under one umbrella.
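The contrast between the extensive (exponential) and power-law MaxEnt families can be seen in the q-exponential itself. A minimal sketch, assuming the standard Tsallis definition (not code from the paper):

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)}.
    The ordinary exponential is recovered in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # the cutoff [.]_+ in the definition
    return base ** (1.0 / (1.0 - q))

# q = 1 gives the classical exponential family; q != 1 gives power-law tails.
print(q_exponential(1.0, 1.0))  # e = 2.718...
print(q_exponential(0.5, 2.0))  # (1 - 0.5)^(-1) = 2.0
```

For q ≠ 1 the MaxEnt densities decay as a power law rather than exponentially, which is the hallmark of the nonextensive regime.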

First Page

2141

Last Page

2161

DOI

10.1109/TIT.2021.3054980

Publication Date

4-1-2021

Comments

Open Access, Green
