This sounds like profile likelihood. But a more interesting interpretation would be to separate easily simulated parameters from harder-to-simulate parameters and to replace the former by their MAP, in order to facilitate the exploration of the posterior of the latter… Interesting, indeed!

If you replace mean with mode, you get Besag’s iterated conditional modes (ICM) algorithm, which he developed in the context of Markov random fields.
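To make the connection concrete, here is a minimal sketch of ICM on a toy target. The bivariate normal target, the correlation `rho`, and the closed-form conditional modes are all illustrative assumptions, not part of the original discussion; the point is only the cyclic replacement of each coordinate by the mode of its full conditional.

```python
# Sketch of Besag's iterated conditional modes (ICM) on a toy target.
# Assumed example: a standard bivariate normal with correlation rho,
# chosen because its conditional modes have a closed form.
rho = 0.8

def conditional_mode_x(y):
    # mode of p(x | y) for the bivariate normal: rho * y
    return rho * y

def conditional_mode_y(x):
    # mode of p(y | x), by symmetry
    return rho * x

def icm(x0, y0, n_iter=50):
    """Cyclically replace each coordinate by the mode of its full
    conditional; for this unimodal target the iterates converge to
    the joint mode (0, 0)."""
    x, y = x0, y0
    for _ in range(n_iter):
        x = conditional_mode_x(y)
        y = conditional_mode_y(x)
    return x, y

x, y = icm(3.0, -2.0)
```

Replacing each mode update with a draw from the corresponding full conditional recovers the Gibbs sampler, which is exactly the mean-versus-mode contrast made above.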

Through PMCMC [particle Markov chain Monte Carlo] sampling, we can separate the variables of interest into those which may be easily sampled by using traditional MCMC techniques and those which require a more specialized SMC approach. Consider for instance the use of simulated annealing in an SMC framework (Neal, 2001; Del Moral et al., 2006). Rather than finding the maximum a posteriori estimate of all parameters, PMCMC sampling now allows practitioners to combine annealing with traditional MCMC methods to maximize over some dimensions simultaneously while exploring the full posterior in others.

It’d be interesting to study the properties of such an approach; as you say, it is perhaps closer to EM than MCMC.
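A minimal sketch of such a hybrid, under assumptions of my own: a toy joint log-posterior `log_post` over a parameter `theta` (to be explored in full) and a nuisance variable `z` (to be maximized over via an annealed acceptance step). The target and the annealing schedule are purely illustrative, not the scheme from the quoted paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy joint log-posterior, for illustration only:
# theta | z ~ N(z, 1), z ~ N(0, 1), so the theta-marginal is N(0, 2)
# and the conditional mode of z given theta is theta / 2.
def log_post(theta, z):
    return -0.5 * (theta - z) ** 2 - 0.5 * z ** 2

def hybrid(n_iter=5000):
    """Metropolis on theta at temperature 1 (posterior exploration),
    while z is updated with a tempered acceptance ratio whose
    temperature decreases to zero, driving z toward its conditional
    mode rather than sampling it."""
    theta, z = 0.0, 5.0
    samples = []
    for t in range(1, n_iter + 1):
        temp = max(1.0 / t, 1e-3)  # annealing schedule for z
        # standard Metropolis step on theta
        prop = theta + rng.normal(scale=0.5)
        if np.log(rng.uniform()) < log_post(prop, z) - log_post(theta, z):
            theta = prop
        # annealed Metropolis step on z: as temp -> 0 this becomes
        # an uphill-only move, i.e. a stochastic hill climb on z
        zprop = z + rng.normal(scale=0.5)
        if np.log(rng.uniform()) < (log_post(theta, zprop) - log_post(theta, z)) / temp:
            z = zprop
        samples.append(theta)
    return np.array(samples), z

samples, z_final = hybrid()
```

The theta-chain keeps exploring its posterior while z is frozen near a mode, which is what makes the resulting procedure feel closer to (a stochastic) EM than to a standard MCMC sampler, and why its stationary behaviour deserves separate study.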
