
Documat


Variable inclusion and shrinkage algorithms

  • Authors: Peter Radchenko, Gareth M. James
  • Published in: Journal of the American Statistical Association, ISSN 0162-1459, Vol. 103, No. 483, 2008
  • Language: English
  • DOI: 10.1198/016214508000000481
  • Full text not available
  • Abstract
    • The Lasso is a popular and computationally efficient procedure for automatically performing both variable selection and coefficient shrinkage on linear regression models. One limitation of the Lasso is that the same tuning parameter is used for both variable selection and shrinkage. As a result, it typically ends up selecting a model with too many variables to prevent overshrinkage of the regression coefficients. We suggest an improved class of methods called variable inclusion and shrinkage algorithms (VISA). Our approach is capable of selecting sparse models while avoiding overshrinkage problems and uses a path algorithm, and so also is computationally efficient. We show through extensive simulations that VISA significantly outperforms the Lasso and also provides improvements over more recent procedures, such as the Dantzig selector, relaxed Lasso, and adaptive Lasso. In addition, we provide theoretical justification for VISA in terms of nonasymptotic bounds on the estimation error that suggest it should exhibit good performance even for large numbers of predictors. Finally, we extend the VISA methodology, path algorithm, and theoretical bounds to the generalized linear models framework.
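The abstract's central point is that a single Lasso tuning parameter couples selection with shrinkage: a penalty large enough to zero out noise variables also biases the retained coefficients toward zero. The sketch below illustrates this with a minimal NumPy coordinate-descent Lasso on simulated data, followed by a relaxed-Lasso-style OLS refit on the selected support. This is only one simple way to decouple selection from shrinkage; it is not the VISA algorithm from the paper, and all names and data here are invented for illustration.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            # soft-thresholding update
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return b

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta = np.array([3.0, 2.0, 1.5] + [0.0] * 7)  # three true signals
y = X @ beta + 0.5 * rng.standard_normal(n)

lam = 0.5  # penalty large enough to exclude the noise variables
b_lasso = lasso_cd(X, y, lam)
support = np.flatnonzero(np.abs(b_lasso) > 1e-8)

# Decouple selection from shrinkage: ordinary least squares refit
# restricted to the variables the Lasso selected.
b_refit = np.zeros(p)
b_refit[support], *_ = np.linalg.lstsq(X[:, support], y, rcond=None)

# The one-parameter Lasso shrinks the true signals well below their
# actual values; the refit recovers their magnitude on the same support.
print(b_lasso[:3])
print(b_refit[:3])
```

Running the sketch shows the Lasso estimates of the true coefficients pulled noticeably below 3, 2, and 1.5, while the refit on the same support is close to unbiased, which is the overshrinkage trade-off the VISA methodology is designed to avoid with a single procedure.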

