A mixture of g-priors for variable selection when the number of regressors grows with the sample size

Article Type

Research Article

Publication Title

Test

Abstract

We consider the problem of variable selection in linear regression using mixtures of g-priors. A number of mixtures that work well have been proposed in the literature, especially when the number of regressors p is fixed. In this paper, we propose a mixture of g-priors suitable for the case where p grows with the sample size n, more specifically when p = O(n^b), 0 < b < 1. The marginal density based on the proposed mixture admits a close approximation with a closed-form expression, which makes application of the method as tractable as an information criterion-based method. The proposed method satisfies fundamental properties such as model selection consistency when the true model lies in the model space, as well as consistency in an appropriate sense under a misspecified model setup. The method is quite robust in the sense that these properties are not confined to normal linear models; they continue to hold under reasonable conditions for a general class of error distributions. Finally, we compare the performance of the proposed prior theoretically with that of some other mixtures of g-priors, and we also compare it with several other Bayesian methods of model selection using simulated data sets. Both theoretically and in simulations, it emerges that, unlike most of the other model selection methods, the proposed prior performs well in selecting the true model irrespective of its dimension.
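The abstract emphasizes that a closed-form (approximate) marginal density makes g-prior model selection as cheap to apply as an information criterion. The sketch below illustrates that general workflow only: it enumerates submodels and ranks them by the standard closed-form Bayes factor against the null model under a plain Zellner g-prior with a fixed g = n (a common unit-information choice). It does not reproduce the paper's proposed mixture of g-priors or its approximation; all function names here are illustrative.

# Minimal sketch (not the paper's method): Bayes-factor variable selection
# under a plain Zellner g-prior with fixed g = n. The Bayes factor of a
# candidate model against the null (intercept-only) model has the standard
# closed form (1+g)^((n-1-p_gamma)/2) / (1 + g(1 - R_gamma^2))^((n-1)/2).

import itertools
import numpy as np

def r_squared(y, X):
    """Ordinary R^2 of y regressed on an intercept plus the columns of X."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    tss = np.sum((y - y.mean()) ** 2)
    return 1.0 - np.sum(resid ** 2) / tss

def log_bayes_factor(y, X, gamma, g):
    """Log Bayes factor of the model with columns 'gamma' versus the null model."""
    n = len(y)
    p_gamma = len(gamma)
    if p_gamma == 0:
        return 0.0  # null model compared with itself
    R2 = r_squared(y, X[:, list(gamma)])
    return (0.5 * (n - 1 - p_gamma) * np.log1p(g)
            - 0.5 * (n - 1) * np.log1p(g * (1.0 - R2)))

def select_model(y, X, g=None):
    """Enumerate all submodels and return the index set with the largest Bayes factor."""
    n, p = X.shape
    g = n if g is None else g
    candidates = itertools.chain.from_iterable(
        itertools.combinations(range(p), k) for k in range(p + 1))
    return max(candidates, key=lambda gamma: log_bayes_factor(y, X, gamma, g))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 6
    X = rng.standard_normal((n, p))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(n)
    print("selected columns:", select_model(y, X))  # typically selects (0, 2)

Replacing the fixed g with a prior on g (a mixture of g-priors), as the paper does, changes only the marginal density used to rank models; the enumeration-and-rank structure stays the same.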

First Page

377

Last Page

404

DOI

10.1007/s11749-016-0516-0

Publication Date

6-1-2017

Comments

Open Access, Green
