The extended Bregman divergence and parametric estimation

Article Type

Research Article

Publication Title

Statistics

Abstract

Minimization of suitable statistical distances (between the data and model densities) is a useful technique in the field of robust inference. Apart from the class of ϕ-divergences, the Bregman divergence is extensively used for this purpose. However, since the data density must enter linearly in the term involving both the data and model densities in this structure, several useful divergences cannot be captured by the usual Bregman form. We provide an extension of the Bregman divergence by considering an exponent of the density function as the argument rather than the density itself. Many useful divergences that are not ordinarily Bregman divergences can be accommodated within this extended description. Using this formulation, one can develop many new families of divergences which may be useful in robust inference. In particular, through an application of this extension, we propose the new GSB divergence family. We explore the applicability of the minimum GSB divergence estimator in discrete parametric models. Simulation studies and real data examples are provided to demonstrate the performance of the estimator and to substantiate the theory developed.
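
For orientation, the following sketch contrasts the usual Bregman divergence with the exponent-based extension the abstract describes. The notation (generator B, exponent α) and the exact placement of the exponent are illustrative assumptions; the paper's precise definition and regularity conditions should be consulted.

```latex
% Usual Bregman divergence between data density g and model density f,
% generated by a convex function B. Note that g enters the cross term
% -g(x) B'(f(x)) linearly, which is the restriction noted in the abstract.
\[
  D_B(g,f) \;=\; \int \Big[ B\big(g(x)\big) - B\big(f(x)\big)
      - \big(g(x)-f(x)\big)\, B'\big(f(x)\big) \Big]\, dx .
\]
% Schematic of the extension (our assumed form): the exponentiated
% densities g^\alpha, f^\alpha replace g, f as arguments, so g itself
% need no longer appear linearly in the cross term.
\[
  D_B^{(\alpha)}(g,f) \;=\; \int \Big[ B\big(g(x)^{\alpha}\big)
      - B\big(f(x)^{\alpha}\big)
      - \big(g(x)^{\alpha}-f(x)^{\alpha}\big)\, B'\big(f(x)^{\alpha}\big) \Big]\, dx .
\]
```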

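The minimum-divergence workflow for discrete parametric models can also be sketched in code. Since the GSB divergence itself is defined only in the paper, this hypothetical example substitutes the density power divergence, a classical member of the Bregman class, to show the generic estimation recipe for a Poisson model; the function name dpd_objective and all tuning choices (β = 0.5, truncated support, outlier fraction) are illustrative assumptions, not the authors' method.

```python
# Minimum divergence estimation sketch for a discrete (Poisson) model.
# Stand-in objective: the density power divergence (DPD), a classical
# Bregman divergence; the GSB objective from the paper is analogous.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def dpd_objective(lam, data, beta=0.5, support_max=200):
    """Empirical DPD objective for Poisson(lam), dropping terms free of lam:
    H_n(lam) = sum_x f(x)^(1+beta) - (1 + 1/beta) * mean_i f(X_i)^beta
    The support sum is truncated at support_max, where the pmf is negligible.
    """
    x = np.arange(support_max + 1)
    f = poisson.pmf(x, lam)
    model_term = np.sum(f ** (1.0 + beta))
    data_term = np.mean(poisson.pmf(data, lam) ** beta)
    return model_term - (1.0 + 1.0 / beta) * data_term

rng = np.random.default_rng(0)
data = rng.poisson(3.0, size=200)
data[:10] = 20  # a few gross outliers to illustrate robustness

res = minimize_scalar(dpd_objective, bounds=(0.1, 15.0), args=(data,),
                      method="bounded")
print("minimum-DPD estimate:", res.x)        # should stay near 3
print("sample mean (MLE):   ", data.mean())  # pulled toward the outliers
```

The contrast in the two printed values illustrates the robustness motivation of the abstract: the minimum-divergence estimate downweights the planted outliers, while the maximum likelihood estimate (the sample mean, for Poisson) does not.
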
First Page

699

Last Page

718

DOI

10.1080/02331888.2022.2070622

Publication Date

1-1-2022

Comments

Open Access, Green
