mixed feelings
Two recent questions on X validated dealt with mixtures:
- One on the expected log-likelihood in the EM algorithm potentially exploding to minus infinity for a mixture of components with differing supports: “I was hoping to use the EM algorithm to fit a mixture model in which the mixture components can have differing support. I’ve run into a problem during the M step because the expected log-likelihood can be [minus] infinite” This mistake stems from a confusion between the current parameter estimate, which drives the expectation, and the free parameter to optimise: an observation outside a component’s support receives zero responsibility under the current estimate, so its minus-infinite log-density is weighted by zero and contributes nothing to the E function.
- Another one on the Gibbs sampler apparently failing for a two-component mixture with only the weights unknown, when the components are close to one another: “The algorithm works fine if σ is far from 1 but it does not work anymore for σ close to 1.” The asker did not consider a wide posterior as a legitimate posterior: when both components are nearly identical, they are delicate to distinguish from one another and the posterior on the weight unavoidably remains diffuse, close to the prior.
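To illustrate the first point, here is a minimal EM sketch on a toy model of my own (not the asker's), mixing an Exponential(1), supported on [0,∞), with a N(-2,1), supported on the whole real line. The responsibilities computed in the E step are exactly zero for negative observations under the exponential component, so the weighted M-step updates never evaluate a minus-infinite log-density:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: 200 draws from Exponential(1) (support [0, inf))
# and 200 draws from N(-2, 1) (support the whole real line)
x = np.concatenate([rng.exponential(1.0, 200), rng.normal(-2.0, 1.0, 200)])

w, lam, mu, sig = 0.5, 1.0, 0.0, 1.0     # starting values
for _ in range(200):
    # E step: responsibilities under the CURRENT parameter estimate;
    # the exponential density is exactly 0 for x < 0
    d1 = w * np.where(x >= 0, lam * np.exp(-lam * x), 0.0)
    d2 = (1 - w) * np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    g = d1 / (d1 + d2)                   # P(component 1 | x, current estimate)
    # M step: wherever g[i] == 0 the -inf log-density is weighted by 0,
    # hence contributes 0 (not -inf) to the expected log-likelihood, and the
    # weighted maximum-likelihood updates below never touch those terms
    w = g.mean()
    lam = g.sum() / (g * x).sum()
    mu = ((1 - g) * x).sum() / (1 - g).sum()
    sig = np.sqrt(((1 - g) * (x - mu) ** 2).sum() / (1 - g).sum())
```

The key line is the computation of g: it uses the current estimate, not the free parameter, which is where the asker's confusion arose.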
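For the second point, a minimal sketch of a setup matching the question (my own reconstruction: a mixture of N(0,1) and N(0,σ²) with σ known and only the weight p unknown, under a Uniform(0,1) prior). Running the same Gibbs sampler for a well-separated σ and for σ close to 1 shows the posterior on p widening towards the flat prior, which is the correct behaviour, not a failure of the sampler:

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_weight(x, sigma, iters=3000, burn=500):
    # Gibbs sampler for the weight p in p N(0,1) + (1-p) N(0,sigma^2),
    # with sigma known and a Uniform(0,1) prior on p
    n = len(x)
    phi1 = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
    phi2 = np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    p, draws = 0.5, []
    for t in range(iters):
        # allocate each observation to component 1 with its posterior probability
        z = rng.random(n) < p * phi1 / (p * phi1 + (1 - p) * phi2)
        # conjugate update: p | z ~ Beta(1 + n1, 1 + n - n1)
        p = rng.beta(1 + z.sum(), 1 + n - z.sum())
        if t >= burn:
            draws.append(p)
    return np.array(draws)

def run(sigma, n=500, p_true=0.3):
    comp1 = rng.random(n) < p_true
    x = np.where(comp1, rng.normal(0.0, 1.0, n), rng.normal(0.0, sigma, n))
    return gibbs_weight(x, sigma)

narrow = run(sigma=3.0)   # components well separated: informative posterior on p
wide = run(sigma=1.05)    # components nearly identical: posterior stays close
                          # to the flat prior, hence legitimately wide
```

Comparing the spreads of the two chains (e.g. their standard deviations) makes the point: the σ≈1 chain is not broken, it is faithfully reporting that the data carry almost no information about p.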