Documat


Abstract of Expectation propagation for likelihood-free inference

Simon Barthelmé, Nicolas Chopin

  • Many models of interest in the natural and social sciences have no closed-form likelihood function, which means that they cannot be treated using the usual techniques of statistical inference. In the case where such models can be efficiently simulated, Bayesian inference is still possible thanks to the approximate Bayesian computation (ABC) algorithm. Although many refinements have been suggested, ABC inference is still far from routine. ABC is often excruciatingly slow due to very low acceptance rates. In addition, ABC requires introducing a vector of "summary statistics" s(y), the choice of which is relatively arbitrary and often requires some trial and error, making the whole process laborious for the user. We introduce in this work the EP-ABC algorithm, which is an adaptation to the likelihood-free context of the variational approximation algorithm known as expectation propagation. The main advantage of EP-ABC is that it is faster by a few orders of magnitude than standard algorithms, while producing an overall approximation error that is typically negligible. A second advantage of EP-ABC is that it replaces the usual global ABC constraint ‖s(y) − s(y⋆)‖ ≤ ε, where s(y⋆) is the vector of summary statistics computed on the whole observed dataset, by n local constraints of the form ‖si(yi) − si(yi⋆)‖ ≤ ε that apply separately to each data point. In particular, it is often possible to take si(yi) = yi, making it possible to do away with summary statistics entirely. In that case, EP-ABC makes it possible to approximate directly the evidence (marginal likelihood) of the model. Comparisons are performed in three real-world applications that are typical of likelihood-free inference, including one application in neuroscience that is novel, and possibly too challenging for standard ABC techniques.
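The global ABC constraint described above can be illustrated with a minimal rejection sampler. This is a sketch on a toy Gaussian model, not the authors' EP-ABC code: the model, prior, summary statistic, and all names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    """Simulate a dataset of size n from a toy model: y_i ~ Normal(theta, 1)."""
    return rng.normal(theta, 1.0, size=n)

def summary(y):
    """Summary statistic s(y); here simply the sample mean."""
    return np.mean(y)

def abc_rejection(y_obs, n_samples, eps):
    """Standard (global) ABC: draw theta from the prior, simulate a full
    dataset, and accept theta only when ||s(y) - s(y_obs)|| <= eps."""
    s_obs = summary(y_obs)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.normal(0.0, np.sqrt(10.0))   # prior: theta ~ Normal(0, 10)
        y_sim = simulate(theta, len(y_obs))
        if abs(summary(y_sim) - s_obs) <= eps:   # global ABC constraint
            accepted.append(theta)
    return np.array(accepted)

y_obs = simulate(2.0, 100)                        # observed data, true theta = 2
post = abc_rejection(y_obs, n_samples=200, eps=0.1)
print(post.mean())                                # posterior mean near theta = 2
```

The low acceptance rate of this global accept/reject step is the bottleneck the abstract mentions; EP-ABC instead enforces the per-data-point constraints ‖si(yi) − si(yi⋆)‖ ≤ ε, processing one observation at a time.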
