Documat


Stabilized nearest neighbor classifier and its statistical properties

  • Authors: Will Wei Sun, Xingye Qiao, Guang Cheng
  • Published in: Journal of the American Statistical Association, ISSN 0162-1459, Vol. 111, No. 515, 2016, pp. 1254-1254
  • Language: English
  • DOI: 10.1080/01621459.2015.1089772
  • Full text not available
  • Abstract
    • The stability of statistical analysis is an important indicator for reproducibility, which is one main principle of the scientific method. It entails that similar statistical conclusions can be reached based on independent samples from the same underlying population. In this article, we introduce a general measure of classification instability (CIS) to quantify the sampling variability of the prediction made by a classification method. Interestingly, the asymptotic CIS of any weighted nearest neighbor classifier turns out to be proportional to the Euclidean norm of its weight vector. Based on this concise form, we propose a stabilized nearest neighbor (SNN) classifier, which distinguishes itself from other nearest neighbor classifiers by taking stability into consideration. In theory, we prove that SNN attains the minimax optimal convergence rate in risk, and a sharp convergence rate in CIS. The latter rate result is established for general plug-in classifiers under a low-noise condition. Extensive simulated and real examples demonstrate that SNN achieves a considerable improvement in CIS over existing nearest neighbor classifiers, with comparable classification accuracy. We implement the algorithm in a publicly available R package snn. [web URL: http://www.tandfonline.com/doi/full/10.1080/01621459.2015.1089772]
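
The notion of instability described in the abstract can be made concrete with a small simulation: train the same nearest neighbor rule on two independent samples from one population and measure how often the two fitted classifiers disagree on new observations. The sketch below is only an illustration of that disagreement measure under assumed ingredients (Python with scikit-learn's KNeighborsClassifier and a made-up two-Gaussian population); it is not the paper's SNN classifier, its asymptotic CIS formula, or the authors' R package snn.

    # Illustrative sketch only: a Monte Carlo estimate of classification
    # instability, i.e., how often classifiers trained on two independent
    # samples from the same population disagree on a fresh test point.
    # The population and classifier choices here are assumptions for the demo.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    def draw_sample(n):
        # Hypothetical toy population: two Gaussian classes in R^2.
        y = rng.integers(0, 2, size=n)
        x = rng.normal(loc=y[:, None] * 1.0, scale=1.0, size=(n, 2))
        return x, y

    def estimate_instability(k, n_train=200, n_test=2000, n_reps=50):
        x_test, _ = draw_sample(n_test)
        disagree = []
        for _ in range(n_reps):
            # Fit the same k-NN rule on two independent training samples.
            x1, y1 = draw_sample(n_train)
            x2, y2 = draw_sample(n_train)
            f1 = KNeighborsClassifier(n_neighbors=k).fit(x1, y1)
            f2 = KNeighborsClassifier(n_neighbors=k).fit(x2, y2)
            # Fraction of test points on which the two fits disagree.
            disagree.append(np.mean(f1.predict(x_test) != f2.predict(x_test)))
        return float(np.mean(disagree))

    for k in (1, 5, 25):
        print(f"k={k:2d}  estimated instability={estimate_instability(k):.3f}")

In this toy setup, larger k typically yields lower disagreement between the two fits, which mirrors the abstract's point that stability varies across nearest neighbor classifiers and can be traded off against accuracy.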

