Comments on: Bayesian computation: a summary of the current state, and samples backwards and forwards
https://xianblog.wordpress.com/2015/06/25/bayesian-computation-a-summary-of-the-current-state-and-samples-backwards-and-forwards/
an attempt at bloggin, nothing more...
Fri, 26 Jun 2015 07:07:37 +0000
By: xi'an
https://xianblog.wordpress.com/2015/06/25/bayesian-computation-a-summary-of-the-current-state-and-samples-backwards-and-forwards/comment-page-1/#comment-99995
Fri, 26 Jun 2015 07:07:37 +0000
In reply to samuelwxy.

Thanks for the comments. Your paper is on my [long] reading list!

By: samuelwxy
https://xianblog.wordpress.com/2015/06/25/bayesian-computation-a-summary-of-the-current-state-and-samples-backwards-and-forwards/comment-page-1/#comment-99987
Fri, 26 Jun 2015 05:17:51 +0000
Thanks! This is a great summary! I have been looking forward to such a comprehensive review of big-data Bayesian computation for a while. I have one quick comment on the parallel MCMC part, as it is related to my own work. Neiswanger et al. (2013) does not really solve the mixture-component explosion issue, even though they use the “one-at-a-time” trick. Supposing each subset posterior has 10,000 samples, their method needs to multiply at least two kernel densities, which already yields 10^8 mixture components, each associated with a different weight. An independent Metropolis sampler works poorly in such a situation: it is likely to get stuck at one “good” component and never move away, so it handles multimodality badly. We recently wrote a new paper that fully addresses this component-explosion issue by using multi-scale histograms (http://arxiv.org/pdf/1506.03164v1.pdf, comments are gratefully welcomed!). One example in the paper has a posterior with two heterogeneous modes, and it seems that Neiswanger et al. (2013) cannot allocate the correct probability mass to the two modes, for the reason above.
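[A sketch of the component-explosion point above, assuming 1-D Gaussian kernels with a common bandwidth h; the function name `product_kde_components` and the sample sizes are illustrative, not from either paper. Multiplying two N-component Gaussian KDEs yields one Gaussian per pair of kernels, hence N^2 weighted components, so 10,000 samples per subset gives 10^8 components after a single product.]

```python
import numpy as np

def product_kde_components(samples1, samples2, h):
    """Pointwise product of two Gaussian KDEs with bandwidth h.

    Each kernel pair (i, j) contributes one Gaussian component:
    N(x; m1, h^2) * N(x; m2, h^2) is proportional to
    N(m1; m2, 2 h^2) * N(x; (m1 + m2) / 2, h^2 / 2),
    so the product mixture has len(samples1) * len(samples2) components.
    """
    m1 = samples1[:, None]           # shape (N1, 1)
    m2 = samples2[None, :]           # shape (1, N2)
    means = (m1 + m2) / 2.0          # component means, shape (N1, N2)
    # unnormalised pair weight: N(m1; m2, 2 h^2), up to a constant factor
    weights = np.exp(-(m1 - m2) ** 2 / (4.0 * h ** 2))
    weights = weights / weights.sum()
    return means.ravel(), weights.ravel()

# Only 100 samples per subset already gives 100 * 100 = 10,000 components;
# with 10,000 samples per subset the product has 10^8.
rng = np.random.default_rng(0)
s1 = rng.normal(0.0, 1.0, size=100)
s2 = rng.normal(0.1, 1.0, size=100)
means, weights = product_kde_components(s1, s2, h=0.3)
print(len(means))  # 10000
```

An independent Metropolis sampler over this discrete mixture must propose among all N^2 components, which is where the "stuck at one good component" behaviour described above arises.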