Slowly cooked pulled pork with a hellish amount of red peppers, meaning I ended up eating most of it by myself over a few days. Tried cauliflower risotto, and liked it. Took my mom to a nice restaurant in Caen, À Contre Sens, after an oyster breakfast with her on the quays of a nearby Channel harbour, with a surprise lunch based on local (Norman) products. Finding hardly anyone in the restaurant due to COVID regulations made the experience even more enjoyable. And such a difference from the previous Michelin we sampled this summer!

Wasted hours watching the US presidential vote count slowly unfolding, computing & recomputing the share of the remaining ballots Biden needed to catch up, and refreshing my NYT & FiveThirtyEight webpages way too often. And I remain fazed by an electoral system stuck in a past when fewer than 50,000 men elected George Washington.

Cleaned up our vegetable patch after collecting the last tomatoes, pumpkins, and peppers. And made a few jars of green tomato jam, albeit not too sweet, so that it can also serve as chutney!

Watched the TV series *The Boys*, after reading super-positive reviews in Le Monde and other journals. It is a welcome satire on the endless sequence of superhero movies and series, simply pushing the truism that with super-powers does not come super-responsibility, or even the merest hint of ethics. Plus some embarrassing closeness to the deeds and sayings of the real Agent Orange. Among the weaknesses, a definite excess of blood and gore, the ambiguous moral stance of the [far from] “good” guys, who do not mind shooting sprees in the least, and some very slow episodes. Among the top items, the boat-meets-whale incident, “Frenchie” from Marseille almost managing a French accent when speaking some semblance of French, and Karl Urban’s maddening accent, a pleasure to listen to even when I catch one sentence out of two, at best.


**A**venue de Wagram is one of the avenues radiating from the Arc de Triomphe in Paris, named after a (bloody) Napoléonic battle (1809). This is also where I locked my bike today before joining my son for a quick lunch, and where I found my back wheel completely dismantled when I came back! Not only had the wheel been removed from the frame, but the axle had been taken away, damaging the ball bearings… After much cursing, I looked around for the different pieces and remounted the wheel on the bike. The ride back to the local repair shop was slower than usual, as the wheel acted as a constant brake. I am somewhat bemused that this happened in the middle of the day, on a rather busy street, and puzzled at the motivation for it. A disgruntled third-year student furious with the mid-term exam? An unhappy author after a Biometrika rejection?

Not a great week for biking since I also crashed last weekend on my way back from the farmers’ market when my pannier full of vegetables got caught in between the spokes. Nothing broken, apart from a few scratches and my cell phone screen… [Note: the title is stolen from Hugo’s Waterloo! Morne plaine!, a terrible and endless poem about the ultimate battle of Napoléon in 1815. With a tenth of the deaths at Wagram… Unsurprisingly, no Avenue de Waterloo leaves from Arc de Triomphe! ]

The question deals with a shifted Exponential distribution [weirdly called *negative exponential* in the question], meaning the (minimal) sufficient statistic is made of the first order statistic and of the sample sum (or average), or equivalently

Finding the joint distribution of T is rather straightforward as the first component is a drifted Exponential again and the second a Gamma variate with n-2 degrees of freedom and the scale θ². (Devroye’s Bible can be invoked since the Gamma distribution follows from his section on Exponential spacings, p.211.) While the derivation of a function with constant expectation is straightforward for the alternate exponential distribution
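As a quick sanity check of the first component of T being a drifted Exponential, here is a short Monte Carlo approximation in R (the values of μ, θ, and n are arbitrary illustration choices, not taken from the question):

```r
set.seed(1)
mu <- 3; theta <- 2; n <- 10        # arbitrary illustration values
N <- 1e5
# the minimum of n shifted Exponentials is distributed as mu + Exp(theta/n)
x1 <- replicate(N, min(mu + rexp(n, rate = 1/theta)))
mean(x1)                            # close to mu + theta/n = 3.2
```

The sample mean of the simulated minima indeed settles near μ + θ/n, confirming the drifted Exponential behaviour of X₍₁₎.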

since the ratio of the components of T has a fixed distribution, it proved harder in the current case, as I was seeking a parameter-free transform. When attempting to explain the difficulty on my office board, I realised I was seeking the wrong property, since an expectation was enough. Removing the dependence on θ was simpler and led to

but one version of a transform with fixed expectation. This also led me to wonder at the range of possible functions of θ one could use as scale and still retrieve incompleteness of T. Any power of θ should work but what about exp(θ²) or sin²(θ³), i.e. functions for which there exists no unbiased estimator..?

A question that came up when reading the paper with our PhD students is whether or not the coupled chains stay identical after meeting once. When facing two different targets, separating again seems inevitable, and indeed Lemma 2 seems to show that they do not stay together. A strong lemma that does not [need to] state what happens outside the diagonal Δ.

One of the essential tricks is to optimise several kinds of maximal coupling, incl. one for the Bernoullesque choice of moving, as given on p.3.

Algorithm 1 came as a novelty to me as it first seemed (to me!) that the two chains may never meet, but this was before I read the small print about the transition (proposal) kernel being maximally coupled with itself. Algorithm 2 may be the earliest example of Metropolis-Hastings coupling I have seen, namely in 1999 in Crete, in connection with a talk by Laird Breyer and Gareth Roberts at a workshop of our ESSS network. As explained by the authors, this solution is not always a maximal coupling for the reason that
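For readers unfamiliar with the construction, here is a minimal sketch (not the paper's exact Algorithm 2, and with an arbitrary N(0,1) target and Normal random-walk proposals of my choosing) of two Metropolis-Hastings chains whose proposals are drawn from a maximal coupling of the two kernels and whose accept/reject steps share a common uniform:

```r
# maximal coupling of the two proposal kernels N(x,s²) and N(y,s²),
# via the classical rejection construction
maxprop <- function(x, y, s = 1) {
  p <- rnorm(1, x, s)
  if (runif(1) * dnorm(p, x, s) <= dnorm(p, y, s)) return(c(p, p))
  repeat {                           # sample from the residual of the second kernel
    q <- rnorm(1, y, s)
    if (runif(1) * dnorm(q, y, s) > dnorm(q, x, s)) return(c(p, q))
  }
}
set.seed(1)
x <- -5; y <- 5; t <- 0
while (x != y && t < 1e4) {          # run until the chains meet (with a safety cap)
  pr <- maxprop(x, y)
  u <- runif(1)                      # common uniform for both acceptance steps
  if (u < dnorm(pr[1]) / dnorm(x)) x <- pr[1]
  if (u < dnorm(pr[2]) / dnorm(y)) y <- pr[2]
  t <- t + 1
}
t                                    # (random) meeting time of the two chains
```

Note that with a single common target as here, the chains remain merged once they have met, since from then on they face the same kernel and the same randomness; this is precisely what fails when the two targets differ.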

min(q¹,q²) min(α¹,α²) ≤ min(q¹α¹,q²α²)

(with q for the transition kernel and α for the acceptance probability). Lemma 1 is interesting in that it describes the probability to un-meet (!) as the surface between one of the move densities and the minimum of the two.
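The inequality is elementary, since each product qⁱαⁱ dominates the product of the two minima, but a brute-force numerical confirmation in R over random stand-in values is immediate:

```r
set.seed(42)
N <- 1e4
q1 <- runif(N); q2 <- runif(N)   # stand-ins for the two transition densities
a1 <- runif(N); a2 <- runif(N)   # stand-ins for the two acceptance probabilities
# the product of minima never exceeds the minimum of products
all(pmin(q1, q2) * pmin(a1, a2) <= pmin(q1 * a1, q2 * a2))   # TRUE
```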

The first solution is to couple by plain Accept-Reject with the first chain being the proposed value and if rejected [i.e. not in C] to generate from the remainder or residual of the second target, in a form of completion of acceptance-rejection (accept when *above* rather than *below*, i.e. in A or A’). This can be shown to be a maximal coupling. Another coupling using reflection residuals works better but requires some spherical structure in the kernel. A further coupling on the acceptance of the Metropolis-Hastings move seems to bring an extra degree of improvement.
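The accept-reject construction of a maximal coupling described above can be sketched in a few lines of R for two arbitrary Normal distributions (my choice of N(0,1) and N(1,1), purely for illustration), with the meeting frequency matching the overlap ∫min(p,q):

```r
# maximal coupling of p = N(0,1) and q = N(1,1) by plain accept-reject:
# accept X ~ p as the common value when falling below min(p,q), otherwise
# draw Y from the residual of q (the part of q *above* the minimum)
maxcoup <- function() {
  x <- rnorm(1)
  if (runif(1) * dnorm(x) <= dnorm(x, 1)) return(c(x, x))
  repeat {
    y <- rnorm(1, 1)
    if (runif(1) * dnorm(y, 1) > dnorm(y)) return(c(x, y))
  }
}
set.seed(1)
sims <- replicate(1e4, maxcoup())
mean(sims[1, ] == sims[2, ])   # ≈ 2*pnorm(-1/2) ≈ 0.617, the Normal overlap
```

The empirical meeting probability agrees with the total variation bound 1 − ‖p − q‖, which is what makes the coupling maximal.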

In the introduction, the alternatives about the acceptance probability α(·,·), e.g. Metropolis-Hastings versus Barker, are mentioned but would it make a difference to the preferred maximal coupling when using one or the other?

A further comment is that, in larger dimensions (I mean larger than one!), a Gibbsic form of coupling could be considered, which would certainly decrease the coupling probability but may still speed up the overall convergence by coupling more often. See “maximality is sometimes less important than other properties of a coupling, such as the contraction behavior when a meeting does not occur.” (p.8)

As a final pun, I noted that *Vaserstein* is not a typo, as Leonid Vaseršteĭn is a Russian-American mathematician, currently at Penn State.

Give the maximum integer that cannot be written as 105x+30y+14z. Same question for 105x+70y+42z+30w.

These are indeed Diophantine equations and the existence of a solution is linked with Bézout’s Lemma. Take the first equation. Since 105 and 30 have a greatest common divisor equal to 3×5=15, there exists a pair (x⁰,y⁰) such that

105 x⁰ + 30 y⁰ = 15

hence a solution to every equation of the form

105 x + 30 y = 15 a

for any integer a. Similarly, since 14 and 15 are co-prime,

there exists a pair (a⁰,b⁰) such that

15 a⁰ + 14 b⁰ = 1

hence a solution to every equation of the form

15 a + 14 b = c

for every integer c, meaning 105x+30y+14z=c can be solved in all cases. The same result applies to the second equation. Since algorithms for the Bézout decomposition are readily available, there is little point in writing R code..! However, the original question must impose the coefficients to be non-negative, which of course kills the Bézout identity argument. Stack Exchange provides the answer as the linear Diophantine problem of Frobenius! While there is no closed-form solution for three and more base integers, Mathematica enjoys a FrobeniusNumber solver, producing 271 and 383 as the largest non-representable integers. Also found by my R code

e=c(105,30,14) #coefficients of the first question
o=function(i,e,x){ #recursive search for a representation of x
  if((a<-sum(!!i))==sum(!!e)) sol=(sum(i*e)==x) else{
    sol=0
    for(j in 0:(x/e[a+1])) sol=max(sol,o(c(i,j),e,x))}
  sol}
a=(min(e)-1)*(max(e)-1) #upper bound
M=b=((l<-length(e)-1)*prod(e))^(1/l)-sum(e) #lower bound
for(x in a:b){
  sol=0
  for(i in 0:(x/e[1])) sol=max(sol,o(i,e,x))
  M=max(M,x*!sol)}

(And this led me to recover the earlier ‘Og entry on the coin problem! As of last November.) The published solution does not shed any useful light on why 383 is the solution, except for demonstrating that 383 is non-representable and that any larger integer is representable.
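A generic recursive check of representability with non-negative coefficients confirms both Frobenius numbers (a brute-force sketch, with an arbitrary verification horizon of 500):

```r
# is x a non-negative integer combination of the entries of e?
representable <- function(x, e) {
  if (length(e) == 1) return(x %% e == 0)
  for (k in 0:(x %/% e[1]))
    if (representable(x - k * e[1], e[-1])) return(TRUE)
  FALSE
}
representable(271, c(105, 30, 14))                           # FALSE
representable(383, c(105, 70, 42, 30))                       # FALSE
all(sapply(272:500, representable, e = c(105, 30, 14)))      # TRUE
all(sapply(384:500, representable, e = c(105, 70, 42, 30)))  # TRUE
```

This exhaustive search is obviously slower than Mathematica's FrobeniusNumber but makes the non-representability of 271 and 383 (and the representability of everything just above, up to the chosen horizon) directly visible.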
