

A generalized least-square matrix decomposition

  • Authors: Genevera I. Allen, Logan Grosenick, Jonathan Taylor
  • Source: Journal of the American Statistical Association, ISSN 0162-1459, Vol. 109, No. 505, 2014, pp. 145-159
  • Language: English
  • DOI: 10.1080/01621459.2013.852978
  • Full text not available
  • Abstract
    • Variables in many big-data settings are structured, arising, for example, from measurements on a regular grid as in imaging and time series or from spatial-temporal measurements as in climate studies. Classical multivariate techniques ignore these structural relationships often resulting in poor performance. We propose a generalization of principal components analysis (PCA) that is appropriate for massive datasets with structured variables or known two-way dependencies. By finding the best low-rank approximation of the data with respect to a transposable quadratic norm, our decomposition, entitled the generalized least-square matrix decomposition (GMD), directly accounts for structural relationships. As many variables in high-dimensional settings are often irrelevant, we also regularize our matrix decomposition by adding two-way penalties to encourage sparsity or smoothness. We develop fast computational algorithms using our methods to perform generalized PCA (GPCA), sparse GPCA, and functional GPCA on massive datasets. Through simulations and a whole brain functional MRI example, we demonstrate the utility of our methodology for dimension reduction, signal recovery, and feature selection with high-dimensional structured data. Supplementary materials for this article are available online.
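The following is a minimal sketch of the unpenalized decomposition described in the abstract, assuming the transposable quadratic norm takes the form ||A||_{Q,R}^2 = trace(Q A R A^T) with known symmetric positive-definite row and column operators Q and R. Under that assumption the best rank-k fit reduces to an ordinary SVD in a transformed space; the code below is an illustrative reduction only (variable names are hypothetical), not the authors' implementation or their sparse and functional GPCA algorithms.

    # A minimal sketch, assuming ||A||_{Q,R}^2 = trace(Q @ A @ R @ A.T)
    # with Q and R symmetric positive definite; illustrates only the
    # unpenalized rank-k decomposition, not the paper's regularized variants.
    import numpy as np

    def gmd(X, Q, R, k):
        """Rank-k generalized least-square matrix decomposition (sketch)."""
        def sym_sqrt(M):
            # Symmetric square root and inverse square root via eigendecomposition.
            w, V = np.linalg.eigh(M)
            return (V * np.sqrt(w)) @ V.T, (V / np.sqrt(w)) @ V.T

        Qh, Qih = sym_sqrt(Q)
        Rh, Rih = sym_sqrt(R)

        # An ordinary SVD of the transformed data gives the best rank-k fit
        # in the ||.||_{Q,R} norm when Q and R are full rank.
        Ut, d, Vt = np.linalg.svd(Qh @ X @ Rh, full_matrices=False)

        U = Qih @ Ut[:, :k]        # satisfies U.T @ Q @ U = I_k
        V = Rih @ Vt[:k, :].T      # satisfies V.T @ R @ V = I_k
        return U, d[:k], V

    # Toy usage with a banded Q encoding local row dependence (hypothetical example).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 40))
    Q = np.eye(50) + 0.3 * (np.diag(np.ones(49), 1) + np.diag(np.ones(49), -1))
    R = np.eye(40)
    U, d, V = gmd(X, Q, R, k=3)
    X_hat = U @ np.diag(d) @ V.T   # rank-3 approximation accounting for Q and R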
