Bayesian model comparison with un-normalised likelihoods

2016, Statistics and Computing

Abstract

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors (BFs) for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating BFs that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented. In some cases we observe an advantage in using biased weight estimates, and some support for their use is given, but we advocate caution in relying on such estimates.
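The core idea behind random-weight importance sampling can be illustrated with a toy sketch (this is an illustration of the general principle, not the paper's algorithm): if each importance weight is replaced by an unbiased estimate of it, the resulting evidence estimator remains unbiased. In the hypothetical model below, theta ~ N(0, 1) and y | theta ~ N(theta, 1), so the exact evidence p(y) = N(y; 0, 2) is available for comparison; the log-normal noise stands in for a Monte Carlo estimate of an intractable likelihood term.

```python
# Toy random-weight importance sampling for the model evidence p(y).
# Assumed setup (hypothetical, for illustration only):
#   prior:      theta ~ N(0, 1)
#   likelihood: y | theta ~ N(theta, 1)
# Proposal = prior, so the importance weight is just the likelihood.
import math
import random

random.seed(1)

y = 0.7        # single observation (hypothetical)
s = 0.3        # noise scale of the unbiased weight estimator
N = 200_000    # number of importance-sampling particles

def likelihood(theta):
    return math.exp(-0.5 * (y - theta) ** 2) / math.sqrt(2.0 * math.pi)

def noisy_likelihood(theta):
    # Unbiased estimate: multiply the true weight by a log-normal
    # factor with mean 1, mimicking simulation-based estimation
    # of an intractable normalizing constant.
    return likelihood(theta) * math.exp(random.gauss(0.0, s) - 0.5 * s * s)

# Average of unbiasedly estimated weights -> unbiased evidence estimate.
estimate = sum(noisy_likelihood(random.gauss(0.0, 1.0)) for _ in range(N)) / N

exact = math.exp(-0.25 * y * y) / math.sqrt(4.0 * math.pi)  # N(y; 0, 2)
print(f"random-weight IS estimate: {estimate:.4f}  exact: {exact:.4f}")
```

Replacing the mean-one noise with a biased estimator (e.g. one whose expectation is not 1) biases the evidence estimate, which is the trade-off the abstract cautions about.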
