Uniform integrability of the OLS estimators, and the convergence of their moments

  • Georgios Afendras [1]; Marianthi Markatou [1]
    1. [1] University at Buffalo, State University of New York, Buffalo, United States

  • Published in: Test: An Official Journal of the Spanish Society of Statistics and Operations Research, ISSN-e 1863-8260, ISSN 1133-0686, Vol. 25, No. 4, 2016, pp. 775–784
  • Language: English
  • DOI: 10.1007/s11749-016-0498-y
  • Full text not available
  • Abstract
    • The problem of convergence of moments of a sequence of random variables to the moments of its asymptotic distribution is important in many applications. These include the determination of the optimal training sample size in the cross-validation estimation of the generalization error of computer algorithms, and the construction of graphical methods for studying dependence patterns between two biomarkers. In this paper, we prove the uniform integrability of the ordinary least squares estimators of a linear regression model, under suitable assumptions on the design matrix and the moments of the errors. Further, we prove the convergence of the moments of the estimators to the corresponding moments of their asymptotic distribution, and study the rate of the moment convergence. The canonical central limit theorem corresponds to the simplest linear regression model. We investigate the rate of the moment convergence in the canonical central limit theorem, proving a sharp improvement of von Bahr's (Ann Math Stat 36:808–818, 1965) theorem.
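
For context, the setting the abstract describes can be sketched as follows. This is a generic sketch in our own notation, not the authors'; the paper's precise assumptions on the design matrix and the error moments are in the full text. The linear model and its ordinary least squares (OLS) estimator are

    y = X\beta + \varepsilon, \qquad \hat{\beta} = (X^\top X)^{-1} X^\top y,

and a sequence of random variables \{Z_n\} is uniformly integrable if

    \lim_{c \to \infty} \sup_{n} \mathbb{E}\bigl[ |Z_n| \, \mathbf{1}\{ |Z_n| > c \} \bigr] = 0.

Uniform integrability is the classical bridge from convergence in distribution to convergence of moments: if Z_n converges in distribution to Z and \{|Z_n|^k\} is uniformly integrable, then \mathbb{E}[Z_n^k] \to \mathbb{E}[Z^k]. The "simplest linear regression model" mentioned in the abstract is the intercept-only model y_i = \beta + \varepsilon_i, whose OLS estimator is the sample mean, so moment convergence there is moment convergence in the classical central limit theorem.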

