
Documat


Gradient-based kernel dimension reduction for regression

  • Authors: Kenji Fukumizu, Chenlei Leng
  • Published in: Journal of the American Statistical Association, ISSN 0162-1459, Vol. 109, No. 505, 2014, pp. 359-370
  • Language: English
  • DOI: 10.1080/01621459.2013.838167
  • Full text not available
  • Abstract
    • This article proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive-definite kernels, or reproducing kernel Hilbert spaces (RKHSs). The aim of the dimension reduction is to find directions in the space of explanatory variables that sufficiently explain the response; this is called sufficient dimension reduction. The proposed method is based on an estimator for the gradient of the regression function, considered for the feature vectors mapped into RKHSs. It is proved that the method estimates directions that achieve sufficient dimension reduction. Compared with other existing methods, the proposed one is widely applicable without strong assumptions on the distributions or the types of variables, and needs only an eigendecomposition to estimate the projection matrix. The theoretical analysis shows that the estimator is consistent at a certain rate under some conditions. The experimental results demonstrate that the proposed method finds effective directions with efficient computation, even for high-dimensional explanatory variables.
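The recipe in the abstract (a kernel-based estimator of the regression-function gradient, followed by an eigendecomposition of averaged gradient outer products) can be sketched as follows. This is an illustrative reconstruction, not the authors' reference code: the Gaussian kernels, the median-heuristic bandwidths, and the ridge parameter `eps` are assumptions, and the paper's exact estimator and tuning may differ.

```python
import numpy as np

def median_heuristic(Z):
    """Median pairwise distance, a common kernel bandwidth choice (assumed here)."""
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=2)
    return np.sqrt(np.median(sq[sq > 0]))

def gkdr(X, Y, d, eps=1e-3):
    """Sketch of gradient-based kernel dimension reduction.

    Averages kernel-estimated gradient outer products of the regression
    function over the sample and returns the top-d eigenvectors as an
    estimated basis of the effective (sufficient) subspace.
    """
    n, p = X.shape
    sx, sy = median_heuristic(X), median_heuristic(Y)
    # Gaussian Gram matrices for X and Y
    Gx = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2) / (2 * sx**2))
    Gy = np.exp(-np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=2) / (2 * sy**2))
    # Regularized inverse, as in kernel ridge regression
    Ginv = np.linalg.inv(Gx + n * eps * np.eye(n))
    F = Ginv @ Gy @ Ginv
    M = np.zeros((p, p))
    for i in range(n):
        # Derivative of k(x_j, x) at x = x_i for the Gaussian kernel
        D = (X - X[i]) / sx**2 * Gx[:, [i]]          # shape (n, p)
        M += D.T @ F @ D
    M /= n
    vals, vecs = np.linalg.eigh(M)                    # eigenvalues ascending
    return vecs[:, ::-1][:, :d]                       # top-d directions
```

On toy data where the response depends on one direction, e.g. Y = sin(X b) + noise, the leading eigenvector should align with b. The cost is dominated by the n x n regularized inverse and the n gradient terms, i.e. roughly O(n^3 + n^2 p), consistent with the abstract's claim that only an eigendecomposition (plus standard kernel matrix algebra) is needed.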

