Convergence rates for kernel regression in infinite-dimensional spaces
Article Type
Research Article
Publication Title
Annals of the Institute of Statistical Mathematics
Abstract
We consider a nonparametric regression setup, where the covariate is a random element in a complete separable metric space, and the parameter of interest associated with the conditional distribution of the response lies in a separable Banach space. We derive the optimum convergence rate for the kernel estimate of the parameter in this setup. The small ball probability in the covariate space plays a critical role in determining the asymptotic variance of kernel estimates. Unlike the case of finite-dimensional covariates, we show that the asymptotic orders of the bias and the variance of the estimate achieving the optimum convergence rate may be different for infinite-dimensional covariates. Also, the bandwidth, which balances the bias and the variance, may lead to an estimate with suboptimal mean square error for infinite-dimensional covariates. We describe a data-driven adaptive choice of the bandwidth and derive the asymptotic behavior of the adaptive estimate.
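For readers unfamiliar with the setting, the following sketch uses the standard notation of functional kernel regression; the paper's precise estimator and assumptions may differ in detail. For a covariate X taking values in a metric space (S, d), a response Y, a kernel K, and a bandwidth h > 0, a Nadaraya-Watson type kernel estimate of the regression function m(x) = E(Y | X = x), together with the small ball probability mentioned in the abstract, can be written as

\hat{m}(x) \;=\; \frac{\sum_{i=1}^{n} Y_i \, K\big(d(X_i, x)/h\big)}{\sum_{i=1}^{n} K\big(d(X_i, x)/h\big)},
\qquad
\varphi_x(h) \;=\; P\big(d(X, x) \le h\big).

Here n\varphi_x(h) plays the role of an effective local sample size, and the variance of \hat{m}(x) is of order (n\varphi_x(h))^{-1}. For a p-dimensional covariate, \varphi_x(h) is of order h^p, whereas for infinite-dimensional covariates it typically decays faster than any polynomial in h (for instance, exponentially in a power of 1/h). This is why, as the abstract notes, the bias and variance orders at the optimum rate can differ, and the bandwidth that merely balances them need not minimize the mean square error.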
First Page
471
Last Page
509
DOI
10.1007/s10463-018-0697-2
Publication Date
4-1-2020
Recommended Citation
Chowdhury, Joydeep and Chaudhuri, Probal, "Convergence rates for kernel regression in infinite-dimensional spaces" (2020). Journal Articles. 344.
https://digitalcommons.isical.ac.in/journal-articles/344
Comments
Open Access, Green