An operation þ applies to all pairs of natural numbers, with the properties

0 þ (a+1) = (0 þ a)+1,  (a+1) þ (b+1) = (a þ b)+1,  271 þ 287 = 77777,  2018 þ 39 = 2018×39

Find the smallest integer d>287 such that there exists c<d leading to c þ d = c×d, and the smallest integer f>2017 such that 2017 þ f = 2017×40. Is there any known integer f such that f þ 2017 = 40×2017?

The major appeal of this puzzle (where no R programming seems to help!) is that the “data” does not completely define the operation þ! Indeed, when a<b, it is straightforward to deduce that a þ b = (0 þ 0)+b, hence solving the first two questions by deriving (0 þ 0)=270×287 [with d=2×287 and f=2017×40-270×287], but the opposite quantity b þ a is not defined, apart from (2018-39) þ 0. This however brings a resolution since

(2018-39) þ 0 = 2017×39 and (2018-39+2017) þ 2017 = 2017×39+2017 = 2017×40

leading to f=2018-39+2017=3996.
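As a quick numerical check (a minimal R sketch of the above derivation, not part of the original solution), the two recursions can be applied directly to confirm that the stated values are consistent with the puzzle data:

```r
# for a <= b, the two recursions give a þ b = (0 þ 0) + b,
# so 271 þ 287 = 77777 identifies 0 þ 0 = 77490 = 270*287
zer = 77777 - 287                # 0 þ 0
thorn = function(a, b) zer + b   # valid for a <= b only
zer == 270*287                   # TRUE
thorn(271, 287) == 77777         # TRUE
# first question: c þ d = c*d with c < d means c = 1 + zer/d
d = 2*287; c = 1 + zer/d         # c = 136 < d = 574
thorn(c, d) == c*d               # TRUE
# second question: 2017 þ f = zer + f = 2017*40
f = 2017*40 - zer                # f = 3190
thorn(2017, f) == 2017*40        # TRUE
# third question: stepping down from 2018 þ 39 = 2018*39 gives
# (2018-39) þ 0 = 2018*39 - 39 = 2017*39, then adding 2017 to both sides
2018 - 39 + 2017                 # 3996, with 3996 þ 2017 = 2017*40
```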

Filed under: Books, Kids Tagged: competition, Le Monde, mathematical puzzle, number theory, R, simulation

gainz=function(b,c,T=1e4,type="raw"){
  # four successive U(0,1) draws per round, with running maxima
  x=matrix(runif(4*T),ncol=4)
  maz=t(apply(x,1,cummax))        # cumulative maxima, left to right
  zam=t(apply(x[,4:1],1,cummax))  # cumulative maxima, right to left
  if (type=="raw"){               # average ratio of kept draw to overall maximum
    return(mean(((x[,2]>b*x[,1])*x[,2]+
      (x[,2]<b*x[,1])*((x[,3]>c*maz[,2])*x[,3]+
        (x[,3]<c*maz[,2])*x[,4]))/maz[,4]))}
  if (type=="global"){            # frequency of keeping the overall maximum
    return(mean(((x[,2]>b*x[,1])*(x[,2]==maz[,4])+
      (x[,2]<b*x[,1])*((x[,3]>c*maz[,2])*(x[,3]==maz[,4])+
        (x[,3]<c*maz[,2])*(x[,4]==maz[,4])))))}
  if (type=="remain"){            # frequency of beating all remaining draws
    return(mean(((x[,2]>b*x[,1])*(x[,2]==zam[,3])+
      (x[,2]<b*x[,1])*((x[,3]>c*maz[,2])*(x[,3]==zam[,2])+
        (x[,3]<c*maz[,2])*(x[,4]==zam[,2])))))}}

where the data is generated from a U(0,1) distribution, as the loss functions are made scale-free by deciding to always sacrifice the first draw, x¹. This function is to be optimised in (b,c), hence I used a plain vanilla simulated annealing R code:

avemale=function(T=3e4,type){
  b=bs=c=cs=.5   # initialise current and best (b,c) values
  maxtar=targe=gainz(b,c,T=1e4,type)
  temp=0.1
  for (t in 1:T){
    bp=b+runif(1,-temp,temp)
    cp=c+runif(1,-temp,temp)
    parge=(bp>0)*(cp>0)*gainz(bp,cp,T=1e4,type)
    if (parge>maxtar){
      # new overall best: accept and record it
      b=bs=bp;c=cs=cp;maxtar=targe=parge
    }else{
      # otherwise accept with the annealing probability
      if (runif(1)<exp((parge-targe)/temp)){
        b=bp;c=cp;targe=parge}}
    temp=.9999*temp}
  return(list(bs=bs,cs=cs,max=maxtar))}

with outcomes

- b=1, c=.5, and optimum 0.8 for the raw type
- b=c=1 and optimum 0.45 for the global type
- b undefined, c=2/3 and optimum 0.75 for the remain type

Filed under: Statistics Tagged: mathematical puzzle, R, simulated annealing, The Riddler


While the prior distribution (of the weights) of the Dirichlet mixture process is easy to generate via the stick-breaking representation, the posterior distribution is trickier, as the weights are multiplied by the values of the sampling distribution (likelihood) at the corresponding parameter values and they cannot be normalised. Introducing a uniform variable to replace all weights in the mixture with an indicator that the uniform is less than those weights corresponds to a (latent variable) completion [or a demarginalisation, as we called this trick in Monte Carlo Statistical Methods]. As elaborated in the paper, the Gibbs steps corresponding to this completion are easy to implement, involving only a finite number of components, meaning that the allocation to a component of the mixture can be operated rather efficiently. Or not, when considering that the weights in the Dirichlet mixture are not monotone, hence that a large number of them may need to be computed before picking the next index in the mixture when the uniform draw happens to be quite small.
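To make the completion concrete, here is a small illustrative R sketch (the names and the concentration value are mine, not the paper's implementation) of how the slice variable truncates the stick-breaking sequence to a finite set of candidate components:

```r
set.seed(1)
alpha = 1                # Dirichlet process concentration (arbitrary choice here)
v = rbeta(1, 1, alpha)   # first stick-breaking proportion
w = v                    # weights w_k = v_k * prod_{j<k} (1 - v_j)
u = runif(1, 0, w[1])    # slice variable, uniform below the current weight
# extend the stick until the unbroken mass cannot conceal a weight above u
while (1 - sum(w) > u) {
  v = rbeta(1, 1, alpha)
  w = c(w, v * (1 - sum(w)))
}
keep = which(w > u)      # finite set of components with w_k > u
# the Gibbs allocation step then draws the next index within 'keep', with
# probabilities proportional to w_k times the likelihood at component k
```

The key point the sketch shows is that the slice u makes the comparison set finite, but when u is quite small the while loop may have to break a large number of sticks before stopping, which is the inefficiency mentioned above.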

Filed under: Books, Statistics, University life Tagged: Communications in Statistics, Dirichlet process, Gibbs sampler, intractability, MCMC, Monte Carlo Statistical Methods, normalising constant, slice sampling


The book is more interesting as a dystopia on electoral systems and on the way the information revolution can produce a step back in democracy, with the systematisation of fake news and voter manipulation, where the marketing research group YouGov has become a party, than as a science-fiction (or politics-fiction) book. Indeed, it tries too hard to replicate its cyberpunk reference, William Gibson’s Neuromancer, with the same construct of interlacing threads, the same fascination for Japan, airports, and luxury hotels, if not for brands, and a similar ninja-geek pair of characters. And with very little invention about the technology of the 21st Century. (And a missed opportunity to exploit artificial intelligence themes and the prediction of outcomes, when *Information* builds a fake vote database but does not seem to mind about Benford’s Law.) The acknowledgement section somewhat explains this imbalance, in that the author worked many years in humanitarian organisations and is currently completing a thesis at Sciences Po (Paris).

Filed under: Books, Travel Tagged: Benford's Law, book review, cyberpunk literature, elections, Neuromancer, Paris, Science Po', Tokyo, William Gibson, YouGov