
Modelling covariance matrices by the trigonometric separation strategy with application to hidden Markov models

  • Luigi Spezia (Biomathematics & Statistics Scotland)
  • Published in: Test: An Official Journal of the Spanish Society of Statistics and Operations Research, ISSN-e 1863-8260, ISSN 1133-0686, Vol. 28, No. 2, 2019, pp. 399-422
  • Language: English
  • DOI: 10.1007/s11749-018-0580-8
  • Full text not available
  • Abstract
    • Bayesian inference on the covariance matrix is usually performed after placing an inverse-Wishart or a multivariate Jeffreys as a prior density, but both of them, for different reasons, present some drawbacks. As an alternative, the covariance matrix can be modelled by separating out the standard deviations and the correlations. This separation strategy takes advantage of the fact that it is usually more straightforward and flexible to set priors on the standard deviations and the correlations rather than on the covariance matrix. On the other hand, the priors must preserve the positive definiteness of the correlation matrix. This can be obtained by considering the Cholesky decomposition of the correlation matrix, whose entries are reparameterized using trigonometric functions. The efficiency of the trigonometric separation strategy (TSS) is shown through an application to hidden Markov models (HMMs) whose conditional distributions are multivariate normal. In the case of an unknown number of hidden states, estimation is conducted using a reversible jump Markov chain Monte Carlo algorithm based on the split-and-combine and birth-and-death moves, whose design is straightforward because of the use of the TSS. Finally, an example in remote sensing is described, where an HMM containing the TSS is used for the segmentation of a multi-colour satellite image.
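
    A minimal sketch of the construction summarised above, assuming the standard
    spherical-coordinate (angle-based) parameterization of the Cholesky factor of a
    correlation matrix; this is an illustration, not the paper's implementation, and
    the function names (trig_cholesky, covariance_from_separation) are hypothetical.
    Any set of angles in the open interval (0, pi) yields a correlation matrix with
    unit diagonal that is positive definite by construction; rescaling by the standard
    deviations then recovers the covariance matrix (the separation strategy).

    import numpy as np

    def trig_cholesky(angles):
        # Lower-triangular Cholesky factor L of a correlation matrix.
        # angles: (p, p) array; only the strictly lower-triangular entries are used,
        # each assumed to lie in (0, pi).  Row i of L is a unit vector written in
        # spherical coordinates, so diag(L @ L.T) == 1.
        p = angles.shape[0]
        L = np.zeros((p, p))
        L[0, 0] = 1.0
        for i in range(1, p):
            sin_prod = 1.0
            for j in range(i):
                L[i, j] = np.cos(angles[i, j]) * sin_prod
                sin_prod *= np.sin(angles[i, j])
            L[i, i] = sin_prod            # remaining product of sines
        return L

    def covariance_from_separation(std_devs, angles):
        # Covariance = D R D, with D = diag(std_devs) and R = L L^T.
        L = trig_cholesky(angles)
        R = L @ L.T                       # valid correlation matrix by construction
        D = np.diag(std_devs)
        return D @ R @ D

    # Example: a 3 x 3 covariance matrix from 3 standard deviations and 3 angles.
    rng = np.random.default_rng(0)
    p = 3
    std_devs = np.array([1.5, 0.8, 2.0])
    angles = np.zeros((p, p))
    angles[np.tril_indices(p, k=-1)] = rng.uniform(0.1, np.pi - 0.1, size=p * (p - 1) // 2)
    Sigma = covariance_from_separation(std_devs, angles)
    assert np.all(np.linalg.eigvalsh(Sigma) > 0)          # positive definite
    assert np.allclose(np.diag(Sigma), std_devs ** 2)     # variances on the diagonal

    In a Bayesian setting, priors can then be placed directly on the standard
    deviations and on the angles (for instance, uniform on (0, pi)), which is what
    makes the separation convenient to embed within MCMC moves.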

