Thanks for the response, much appreciated. I’ve been working with geometric mixtures (in the context of logarithmic pooling) and was wondering if we could use them in your framework. But you’re right: linear pools (aka arithmetic mixtures) are more tractable for model choice and testing. Besides, the interpretation of alpha seems to me to be more straightforward.

Best,

Luiz

’Tis true that geometric mixtures of exponential family densities again belong to an exponential family. This does not always mean the normalising constant is available in closed form, except when both components come from the same exponential family. And in any case my point is rather that, on a general basis, when considering testing and model choice, the arithmetic mixture is always manageable.
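To illustrate the same-family case: for two Gaussian components, the geometric pool p₁^α·p₂^(1−α) is again a Gaussian kernel, with precision-weighted parameters, so its normalising constant is closed form. Below is a minimal numerical sketch checking this; the particular parameter values (alpha, means, scales) are arbitrary choices of mine, not taken from the discussion.

```python
import numpy as np

# Geometric pool of N(mu1, s1^2) and N(mu2, s2^2) with weight alpha:
# the product of Gaussian kernels is again a Gaussian kernel, so the
# pooled density is N(m, 1/tau) with precision-weighted parameters.

def gauss_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

alpha, mu1, s1, mu2, s2 = 0.3, -1.0, 1.0, 2.0, 0.5

# Closed-form pooled parameters (precisions combine linearly)
tau = alpha / s1**2 + (1 - alpha) / s2**2
m = (alpha * mu1 / s1**2 + (1 - alpha) * mu2 / s2**2) / tau
s_pool = 1.0 / np.sqrt(tau)

# Numerical check: normalise the raw geometric pool on a fine grid
x = np.linspace(-10.0, 10.0, 20001)
unnorm = gauss_pdf(x, mu1, s1) ** alpha * gauss_pdf(x, mu2, s2) ** (1 - alpha)
dx = x[1] - x[0]
Z = np.sum(unnorm) * dx          # numerical normalising constant
numeric = unnorm / Z

analytic = gauss_pdf(x, m, s_pool)
err = np.max(np.abs(numeric - analytic))
print(err)                        # agreement up to quadrature error
```

The same precision-weighting argument goes through for any two densities in a common exponential family, as long as the combined natural parameter stays inside the natural parameter space.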

Pretty much what I thought. Thanks for clarifying. If the densities involved are from the exponential family, it’s possible to show that the resulting mixture will also be an exponential family density. For simple cases such as Gaussian, gamma and beta, the resulting mixture will be in the same family. Do you think then that in these cases geometric mixtures would be worth looking at?

Cheers,

Luiz

If I take a geometric mixture of two arbitrary densities, the resulting function requires a normalising constant to be a probability density (i.e., to integrate to one). In most cases, this constant cannot be derived analytically, which is what I call an intractable constant.
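As a concrete instance of that intractability: the geometric pool of a standard Gaussian and a standard Cauchy has, to my knowledge, no closed-form normalising constant for general α, so the constant must be approximated numerically. A minimal sketch (the grid bounds and α = 0.5 are my own illustrative choices):

```python
import numpy as np

# Geometric pool f(x) ∝ N(x;0,1)^alpha * Cauchy(x;0,1)^(1-alpha).
# The normalising constant Z has no known closed form for general alpha,
# so it is approximated here by quadrature on a fine grid.

alpha = 0.5
x = np.linspace(-50.0, 50.0, 200001)
gauss = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
cauchy = 1.0 / (np.pi * (1.0 + x**2))
unnorm = gauss**alpha * cauchy**(1 - alpha)

dx = x[1] - x[0]
Z = np.sum(unnorm) * dx          # the "intractable" constant, numerically
pooled = unnorm / Z
print(Z, np.sum(pooled) * dx)    # second value is 1 by construction
```

By Cauchy–Schwarz, Z lies strictly between 0 and 1 here, but its exact value depends on α and admits no simple expression, which is precisely the obstacle for testing and model choice with geometric pools.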

When in the paper you say that the normalising constant is intractable for geometric mixtures, what do you mean exactly?

Best,

Luiz
