Robust density power divergence based tests in multivariate analysis: A comparative overview of different approaches

Article Type

Research Article

Publication Title

Journal of Multivariate Analysis

Abstract

Hypothesis testing is one of the fundamental paradigms of statistical inference. The three canonical hypothesis testing procedures available in the statistical literature are the likelihood ratio (LR) test, the Wald test and the Rao (score) test. All of them have good optimality properties, and past research has not identified any of these three procedures as a clear winner over the other two. However, the classical versions of these tests are based on the maximum likelihood estimator (MLE), which, although asymptotically the most efficient estimator, is known for its lack of robustness under outliers and model misspecification. In the present paper we provide an overview of the analogues of these tests based on the minimum density power divergence estimator (MDPDE), which offers a strongly robust and highly efficient alternative. Since these tests have, so far, been studied mostly for univariate responses, here we primarily focus on their performance for several important hypothesis testing problems in the multivariate context under the multivariate normal model family.
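For readers unfamiliar with the MDPDE referenced in the abstract, the sketch below illustrates the general idea under the multivariate normal model: the estimator minimizes the empirical density power divergence objective with a tuning parameter alpha, which downweights observations with low model density and thereby resists outliers. This is a minimal illustrative sketch, not the authors' code; the function names, the Cholesky parameterization and the choice of optimizer are my own assumptions.

```python
# Minimal sketch of the minimum density power divergence estimator (MDPDE)
# for a p-variate normal N(mu, Sigma); NOT the authors' implementation.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal


def dpd_objective(params, x, alpha):
    """Empirical DPD objective for N_p(mu, Sigma):
    integral of f^(1+alpha) minus (1 + 1/alpha) * mean of f(x_i)^alpha."""
    n, p = x.shape
    mu = params[:p]
    # Sigma is parameterized through its Cholesky factor (log-diagonal)
    # so that it stays positive definite during optimization.
    L = np.zeros((p, p))
    L[np.tril_indices(p)] = params[p:]
    L[np.diag_indices(p)] = np.exp(np.diag(L))
    sigma = L @ L.T
    dens = multivariate_normal(mean=mu, cov=sigma).pdf(x)
    det = np.linalg.det(sigma)
    # Closed-form integral of f^(1+alpha) for the multivariate normal density.
    integral_term = ((2 * np.pi) ** (-p * alpha / 2)
                     * det ** (-alpha / 2) * (1 + alpha) ** (-p / 2))
    return integral_term - (1 + 1 / alpha) * np.mean(dens ** alpha)


def mdpde_normal(x, alpha=0.3):
    """Return (mu_hat, Sigma_hat) by numerically minimizing the DPD objective."""
    n, p = x.shape
    # Start from the MLE (sample mean and covariance), the alpha -> 0 limit.
    mu0 = x.mean(axis=0)
    L0 = np.linalg.cholesky(np.cov(x, rowvar=False))
    L0[np.diag_indices(p)] = np.log(np.diag(L0))
    start = np.concatenate([mu0, L0[np.tril_indices(p)]])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})
    mu_hat = res.x[:p]
    L = np.zeros((p, p))
    L[np.tril_indices(p)] = res.x[p:]
    L[np.diag_indices(p)] = np.exp(np.diag(L))
    return mu_hat, L @ L.T


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.multivariate_normal([0, 0], np.eye(2), size=95)
    outliers = rng.multivariate_normal([8, 8], 0.1 * np.eye(2), size=5)
    x = np.vstack([clean, outliers])
    mu_hat, sigma_hat = mdpde_normal(x, alpha=0.5)
    print("MDPDE mean:", mu_hat)           # stays near (0, 0) despite outliers
    print("Sample mean:", x.mean(axis=0))  # pulled toward the outliers
```

Robust Wald-, score- and LR-type test statistics of the kind surveyed in the article are then built from such MDPDE fits rather than from the MLE; their exact forms depend on the hypothesis under test and are given in the paper.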

DOI

10.1016/j.jmva.2021.104846

Publication Date

March 1, 2022

Comments

Open Access, Bronze
