Optimal guessing under nonextensive framework and associated moment bounds

Article Type

Research Article

Publication Title

Statistics and Probability Letters

Abstract

We consider the problem of guessing the realization of a random variable, but under Tsallis' more general non-extensive entropic framework rather than the classical Maxwell–Boltzmann–Gibbs–Shannon framework. We treat both the conditional guessing problem, where some related side information is available, and the unconditional one, where no such side information exists. For both types of problems, we derive the non-extensive moment bounds on the required number of guesses; here we use the q-normalized expectation in place of the usual (linear) expectation to define the non-extensive moments. These moment bounds are seen to be functions of the logarithmic norm entropy, a recently developed two-parameter generalization of the Rényi entropy, which provides their information-theoretic interpretation. We also consider the case of an uncertain source distribution and derive the non-extensive moment bounds for the corresponding mismatched guessing function. Interestingly, these mismatched bounds are linked with an important family of robust statistical divergences known as the relative (α,β)-entropies; a similar link is discussed between the optimum mismatched guessing and the extremes of these relative entropy measures.

DOI

https://doi.org/10.1016/j.spl.2023.109812

Publication Date

6-1-2023

Comments

Open Access, Green
