
Documat


Predictor selection for positive autoregressive processes

  • Authors: Ching-Kang Ing, Chiao-Yi Yang
  • Published in: Journal of the American Statistical Association, ISSN 0162-1459, Vol. 109, No. 505, 2014, pp. 243-253
  • Language: English
  • DOI: 10.1080/01621459.2013.836974
  • Full text not available
  • Abstract
    • Let observations y1, …, yn be generated from a first-order autoregressive (AR) model with positive errors. In both the stationary and unit root cases, we derive moment bounds and limiting distributions of an extreme value estimator of the AR coefficient. These results enable us to provide asymptotic expressions for the mean squared error (MSE) of this estimator and the mean squared prediction error (MSPE) of the corresponding predictor of yn + 1. Based on these expressions, we compare the relative performance of this predictor (estimator) and the least-squares predictor (estimator) from the MSPE (MSE) point of view. Our comparison reveals that the better predictor (estimator) is determined not only by whether a unit root exists, but also by the behavior of the underlying error distribution near the origin, and hence is difficult to identify in practice. To circumvent this difficulty, we suggest choosing the predictor (estimator) with the smaller accumulated prediction error and show that the predictor (estimator) chosen in this way is asymptotically equivalent to the better one. Both real and simulated datasets are used to illustrate the proposed method. Supplementary materials for this article are available online.
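The selection rule described in the abstract can be illustrated with a small simulation. This is a minimal sketch, not the authors' implementation: it assumes the extreme value estimator is the minimum of the ratios y_t / y_{t-1} (which exceeds the true coefficient whenever the errors are positive), uses exponential errors, ignores the error mean in the one-step predictor, and measures accumulated prediction error as the sum of squared recursive out-of-sample one-step errors.

```python
import numpy as np

def simulate_positive_ar1(a, n, rng):
    """Generate y_t = a*y_{t-1} + e_t with positive (exponential) errors."""
    e = rng.exponential(1.0, n)
    y = np.empty(n)
    y[0] = e[0]
    for t in range(1, n):
        y[t] = a * y[t - 1] + e[t]
    return y

def extreme_value_estimate(y):
    # With e_t > 0 and y_t > 0, each ratio y_t/y_{t-1} = a + e_t/y_{t-1}
    # lies strictly above a, so the minimum ratio estimates a from above.
    return np.min(y[1:] / y[:-1])

def least_squares_estimate(y):
    # Ordinary least-squares slope through the origin.
    return np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])

def accumulated_prediction_error(y, estimator, burn_in=10):
    """Sum of squared one-step-ahead errors, where each prediction
    uses only the data observed before that time point."""
    ape = 0.0
    for t in range(burn_in, len(y)):
        a_hat = estimator(y[:t])
        ape += (y[t] - a_hat * y[t - 1]) ** 2
    return ape

rng = np.random.default_rng(0)
y = simulate_positive_ar1(0.6, 400, rng)

ape_ev = accumulated_prediction_error(y, extreme_value_estimate)
ape_ls = accumulated_prediction_error(y, least_squares_estimate)

# Pick whichever predictor has the smaller accumulated prediction error.
chosen = "extreme-value" if ape_ev < ape_ls else "least-squares"
```

Which predictor wins depends, as the abstract notes, on the error distribution near the origin and on whether a unit root is present; the point of the accumulated-prediction-error rule is that the data-driven choice tracks the better one asymptotically.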

