Characterizing the Functional Density Power Divergence Class

Article Type

Research Article

Publication Title

IEEE Transactions on Information Theory

Abstract

Divergence measures have a long association with statistical inference, machine learning, and information theory. The density power divergence and related measures have produced many useful (and popular) statistical procedures that provide a good balance between model efficiency on the one hand and outlier stability or robustness on the other. The logarithmic density power divergence, a particular logarithmic transform of the density power divergence, has also been very successful in producing efficient and stable inference procedures, and has additionally led to significant demonstrated applications in information theory. The success of the minimum divergence procedures based on the density power divergence and the logarithmic density power divergence (which also go by the names $\beta$-divergence and $\gamma$-divergence, respectively) makes it imperative and meaningful to look for other, similar divergences that may be obtained as transforms of the density power divergence in the same spirit. With this motivation we search for such transforms of the density power divergence, referred to herein as the functional density power divergence class. The present article characterizes this functional density power divergence class, and thus identifies the divergence measures available within this construct that may be explored further for possible applications in statistical inference, machine learning, and information theory.
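The abstract names the density power divergence ($\beta$-divergence) without stating its form. As a hedged illustration only, the sketch below numerically evaluates the standard density power divergence formula from the wider literature (Basu et al., 1998), $d_\alpha(g,f) = \int \{ f^{1+\alpha} - (1 + 1/\alpha) f^{\alpha} g + (1/\alpha) g^{1+\alpha} \} \, dx$ for $\alpha > 0$; the formula, function names, and integration settings are assumptions taken from outside this page, not from the article itself.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def dpd(g, f, alpha, lo=-10.0, hi=10.0, n=20001):
    """Numerically approximate the density power divergence d_alpha(g, f).

    Uses the trapezoid rule on [lo, hi] for the integrand
    f^{1+a} - (1 + 1/a) f^a g + (1/a) g^{1+a}, with a = alpha > 0.
    (Illustrative sketch; formula assumed from Basu et al., 1998.)
    """
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        fx, gx = f(x), g(x)
        val = (fx ** (1 + alpha)
               - (1 + 1 / alpha) * (fx ** alpha) * gx
               + (1 / alpha) * gx ** (1 + alpha))
        w = 0.5 if i in (0, n - 1) else 1.0  # trapezoid endpoint weights
        total += w * val
    return total * h

# The divergence vanishes when the two densities coincide and is
# positive when they differ.
f = lambda x: normal_pdf(x, 0.0, 1.0)
g = lambda x: normal_pdf(x, 1.0, 1.0)
print(dpd(f, f, alpha=0.5))  # approximately 0
print(dpd(g, f, alpha=0.5))  # strictly positive
```

The $\gamma$-divergence mentioned in the abstract is a logarithmic transform of quantities of this type; the article's functional density power divergence class characterizes which such transforms again yield genuine divergences.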

First Page

1141

Last Page

1146

DOI

https://doi.org/10.1109/TIT.2022.3210436

Publication Date

2-1-2023

Comments

Open Access, Green
