Normalising flows are a fun area, producing machine learning generative models with a rigorous probabilistic foundation. Another good blog post on them is:

http://akosiorek.github.io/ml/2018/04/03/norm_flows.html
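For readers new to the idea, here is a minimal sketch (my own toy example, not code from the paper) of the change-of-variables principle behind normalising flows: push base samples z ~ N(0,1) through an invertible map, and the density of the output follows from the inverse map's Jacobian.

```python
import numpy as np

def affine_flow(z, mu=2.0, sigma=0.5):
    """Simplest possible flow: x = mu + sigma * z (invertible for sigma > 0)."""
    return mu + sigma * z

def log_density(x, mu=2.0, sigma=0.5):
    """Exact log-density of x = affine_flow(z) via change of variables:
    log p_x(x) = log p_z(z) + log |dz/dx|, with z = (x - mu) / sigma."""
    z = (x - mu) / sigma                         # inverse map
    log_pz = -0.5 * (z ** 2 + np.log(2 * np.pi))  # standard normal base density
    return log_pz - np.log(sigma)                # + log |dz/dx| = -log(sigma)

# The flow pushes N(0,1) to N(mu, sigma^2); check against the closed form.
x = np.array([1.0, 2.0, 3.0])
exact = (-0.5 * ((x - 2.0) / 0.5) ** 2
         - 0.5 * np.log(2 * np.pi) - np.log(0.5))
assert np.allclose(log_density(x), exact)
```

Real flows stack many such invertible layers with learnable parameters (planar, coupling, autoregressive, etc.), but the density computation is always this same formula.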

The target distribution in the paper’s queueing example is the ABC posterior for a Gaussian ABC kernel. So the final approximate posterior (in Figure 3) is an ABC approximation, but for a very small value of the bandwidth parameter. The other example (sinusoidal) uses a different tempering scheme, but the details matter less in the end, as the algorithm ends up targeting the exact posterior.

As you note, the method is limited to fairly low-dimensional target distributions (and also to targets that factorise in a nice way – more on this in the next version!) Hopefully this can be improved to some extent in future by plugging in more advanced normalising flows. There’s also a lot of scope to improve other aspects of the algorithm. As you mention, variance reduction of the gradient estimator would be great, and many other ideas could be adapted from SMC, adaptive importance sampling, the cross-entropy method, etc.

The 54th qubit of the machine didn’t work properly, so it wasn’t used in the experiment.

For the random generation application, the idea is to use the output string of the quantum machine as input to a randomness extractor, which turns the biased bits into a shorter, approximately uniform random string (this also requires a short uniform seed).
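As an illustration of the seeded-extractor idea (this is a generic textbook construction, not the specific extractor used in the experiment; all names and parameters here are my own), Toeplitz hashing turns n biased bits into m near-uniform bits using a seed of n + m - 1 uniform bits that defines a binary Toeplitz matrix:

```python
import numpy as np

def toeplitz_extract(bits, seed_bits, out_len):
    """Seeded randomness extractor via Toeplitz (universal) hashing.

    bits: n biased input bits; seed_bits: n + out_len - 1 uniform seed bits
    defining a Toeplitz matrix; returns out_len near-uniform output bits.
    """
    n = len(bits)
    assert len(seed_bits) == n + out_len - 1
    # T[i, j] = seed_bits[i - j + n - 1]: constant along each diagonal,
    # so the whole out_len x n matrix is specified by the short seed.
    T = np.array([[seed_bits[i - j + n - 1] for j in range(n)]
                  for i in range(out_len)], dtype=np.int64)
    # Matrix-vector product over GF(2): XOR of the selected input bits.
    return (T @ np.asarray(bits, dtype=np.int64)) % 2

# Toy usage: heavily biased raw bits, extracted down to 32 output bits.
rng = np.random.default_rng(0)
raw = (rng.random(128) < 0.8).astype(np.uint8)            # biased source
seed = rng.integers(0, 2, 128 + 32 - 1).astype(np.uint8)  # uniform seed
out = toeplitz_extract(raw, seed, 32)
```

The compression ratio (128 in, 32 out here) is set by how much min-entropy the source is certified to contain; in the quantum application that certification is what the Bell-type statistics provide.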

Teaching at Dauphine, you should be aware of this, no? ;-]
