The minimum S-divergence estimator under continuous models: the Basu–Lindsay approach
Article Type
Research Article
Publication Title
Statistical Papers
Abstract
Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to classical maximum likelihood-based techniques. Recently, Ghosh et al. (A Generalized Divergence for Statistical Inference, 2013a) proposed a general class of divergence measures for robust statistical inference, named the S-divergence family. Ghosh (Sankhya A, doi:10.1007/s13171-014-0063-2, 2014) discussed its asymptotic properties under discrete models. In the present paper, we develop the asymptotic properties of the minimum S-divergence estimators under continuous models. Here we use the Basu–Lindsay approach (Ann Inst Stat Math 46:683–705, 1994) of smoothing the model densities, which, unlike previous approaches, avoids many of the complications of kernel bandwidth selection. Extensive simulation studies and real data examples illustrate the performance of the resulting estimators in terms of both efficiency and robustness.
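The core idea summarized in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's S-divergence machinery: it uses the simple L2 distance as the divergence and a normal location model N(theta, 1). Following the Basu–Lindsay device, the same normal kernel is applied to both the data (a kernel density estimate) and the model density; for a normal model the smoothed model is available in closed form as N(theta, 1 + h^2), which is the kind of simplification that mitigates bandwidth sensitivity. All numerical values (data, bandwidth, grids) are arbitrary choices for the sketch.

```python
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Toy sample (symmetric about 0, so the location estimate should be near 0).
data = [-0.4, -0.3, -0.2, -0.1, 0.0, 0.0, 0.1, 0.2, 0.3, 0.4]
h = 0.5                                   # kernel bandwidth (illustrative)
xs = [-6 + 0.05 * i for i in range(241)]  # integration grid

# Kernel density estimate f*_n of the data, computed once on the grid.
f_star = [sum(normal_pdf(x, xi, h * h) for xi in data) / len(data) for x in xs]

def l2_objective(theta):
    # Basu-Lindsay smoothing of the model: convolving N(theta, 1) with the
    # N(0, h^2) kernel gives N(theta, 1 + h^2) in closed form, so the model
    # is smoothed with the SAME kernel as the data, and no numerical
    # convolution is needed.
    g = [normal_pdf(x, theta, 1 + h * h) for x in xs]
    # Trapezoid-style Riemann sum of the squared density difference.
    return sum((fi - gi) ** 2 for fi, gi in zip(f_star, g)) * 0.05

# Minimum-divergence estimate by grid search over theta.
theta_hat = min((0.01 * k for k in range(-100, 101)), key=l2_objective)
print(theta_hat)
```

Because both the smoothed data density and the smoothed model pass through the same kernel, the bandwidth h enters both sides of the divergence symmetrically; the estimate of theta is therefore far less sensitive to the choice of h than a plain KDE-versus-model comparison would be.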
First Page
341
Last Page
372
DOI
10.1007/s00362-015-0701-3
Publication Date
6-1-2017
Recommended Citation
Ghosh, Abhik and Basu, Ayanendranath, "The minimum S-divergence estimator under continuous models: the Basu–Lindsay approach" (2017). Journal Articles. 2569.
https://digitalcommons.isical.ac.in/journal-articles/2569
Comments
Open Access, Green