Archive for Pascal

Extending R

Posted in Books, Kids, R, Statistics on July 13, 2016 by xi'an

As I was previously unaware of this book coming up, my surprise and excitement were both extreme when I received it from CRC Press a few weeks ago! John Chambers, one of the fathers of S, precursor of R, had just published a book about extending R. It covers some reflections of the author on programming and the story of R (Parts 2 and 1), then focuses on object-oriented programming (Part 3) and the interfaces from R to other languages (Part 4). While this is “only” a programming book, and thus not strictly appealing to statisticians, reading one of the original actors’ thoughts on the past, present, and future of R is simply fantastic!!! And John Chambers is definitely not calling to simply start over and build something better, as Ross Ihaka did in this [most read] post a few years ago. (It is also great to see the names of friends appearing at times, like Julie, Luke, and Duncan!)

“I wrote most of the original software for S3 methods, which were useful for their application, in the early 1990s.”

In the (hi)story part, Chambers delves into the details of the evolution of S at Bell Labs, as described in his [first] “blue book” (which I kept on my shelf until very recently, next to the “white book“!) and of the emergence of R in the mid-1990s. I find those sections fascinating, maybe all the more because I am somewhat of a contemporary, having first learned Fortran (and Pascal) in the mid-1980s, before moving in the early 1990s to C (which I mostly coded as translated Pascal!), S-plus and eventually R, in conjunction with a (forced) migration from Unix to Linux, as my local computer managers abandoned Unix and mainframe in favour of some virtual Windows machines, and as I started running R on laptops with the help of friends more skilled than I (again keeping some of the early R manuals on my shelf until recently). Maybe one of the most surprising things about those reminiscences is that the very first version of R was dated Feb 29, 2000! Not because of Feb 29, 2000 (which, as Chambers points out, is the first use of the third-order correction to the Gregorian calendar, although I would have thought 1600 was the first one), but because I would have thought it appeared earlier, in conjunction with my first Linux laptop, but this memory is alas getting too vague!

As indicated above, the book is mostly about programming, which means in my case that some sections are definitely beyond my reach! For instance, reading “the onus is on the person writing the calling function to avoid using a reference object as the argument to an existing function that expects a named list” is not immediately clear… Nonetheless, most sections are readable [at my level] and enlightening about the mottoes “everything that exists is an object” and “everything that happens is a function” repeated throughout.  (And about my psycho-rigid ways of translating Pascal into every other language!) I obviously learned about new commands and notions, like the difference between

x <- 3

and

x <<- 3

(but I was disappointed to learn that the number of <‘s was not related to the depth or height of the allocation!) In particular, I found the part about replacement fascinating, explaining how a command like

diag(x)[i] = 3

could modify x directly. (While definitely worth reading, the chapter on R packages could have benefited from more details. But as Chambers points out, there are whole books about this.) Overall, I am afraid the book will not improve my (limited) way of programming in R but I definitely recommend it to anyone even moderately skilled in the language.
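Both points above can be seen in a few lines at the R prompt. This is my own minimal sketch, not an example from the book: a closure where <<- reaches the enclosing environment instead of creating a local copy, and a replacement call diag(x)[i] <- 3 that R expands into a call to the replacement function `diag<-`, so that x itself ends up modified.

```r
# <<- assigns in the enclosing environment, while <- would only
# create a new local variable n inside the inner function
counter <- function() {
  n <- 0
  function() {
    n <<- n + 1   # updates n in counter()'s environment
    n
  }
}
tick <- counter()
tick()  # returns 1
tick()  # returns 2

# diag(x)[1] <- 3 is sugar for: d <- diag(x); d[1] <- 3; diag(x) <- d,
# the last step calling the replacement function `diag<-`, so x changes
x <- matrix(0, 2, 2)
diag(x)[1] <- 3
x[1, 1]  # now 3
```

The same mechanism underlies names(x) <- …, dim(x) <- …, and any user-defined `foo<-` function.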

Dennis Ritchie 1941-2011

Posted in Books, R, University life on October 29, 2011 by xi'an

I just got the “news” that Dennis Ritchie died, although this happened on October 12… The announcement was surprisingly missing from my information channels and certainly got little media coverage, compared with Steve Jobs‘ demise. (I did miss the obituaries in the New York Times and in the Guardian. The Economist has the most appropriate heading, printf(“goodbye, Dennis”); !!!) Still, Dennis Ritchie contributed to computer science to extents comparable to Steve Jobs’, if on a lesser commercial plane: he is a founding father of both the C language and the Unix operating system. I remember spending many days poring over his reference book, The C programming language, co-written with Brian Kernighan. (I kept trying programming in C until Olivier Cappé kindly pointed out to me that I was merely translating my Pascal vision into C code, missing most of the appeal of the language!) And, of course, I also remember discovering Unix when arriving at Purdue as a logical and much more modern operating system: just four years after programming principal components on punched cards and in SAS, this was a real shock! I took a few evening classes at Purdue run by the Computer Department and I still carry around the Purdue University UNIX Pocket Guide. Although I hardly ever use it, it is there on the first shelf on top of my desk… As is The C programming language even though I have not opened it in years!

So we (geeks, computer users, Linuxians, R users, …) owe a lot to Dennis Ritchie and it is quite sad both that he passed away by himself and that his enormous contribution was not better acknowledged. Thus, indeed,

for (unsigned long long i = 0; i < ULLONG_MAX; i++)
    printf("thanks a lot, Dennis\n");

The great’08 Pascal challenge

Posted in Statistics on October 8, 2008 by xi'an

In order to make advances in the processing of their datasets and experiments, and in the understanding of the fundamental parameters driving the general relativity model, cosmologists are launching a competition called the great’08 challenge through the Pascal European network. Details about the challenge are available in an arXiv document (arXiv:0802.1214); the model is clearly defined from a statistical point of view as a combination of lensing shear (the phenomenon of interest) and of various (=three) convolution noises that make the analysis so challenging, the data being a collection of images of galaxies. The fundamental problem is to identify a 2d linear distortion applied to all images within a certain region of the sky, up (or down) to a precision of 0.003, the distortion being identified through an isotropy assumption on the un-distorted images. The solution must also be efficient, in that it is to be tested on 27 million galaxies! A standard MCMC mixture analysis on each galaxy is thus unlikely to converge before the challenge is over, next April. I think the challenge is worth considering by statistical teams, even though this represents a considerable involvement over the next six months…
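To give a feel for the kind of 2d linear distortion involved, here is a hypothetical sketch in R (my own illustration, not taken from the challenge document): in the usual weak-lensing parameterisation, a small shear (g1, g2) maps each point of an image through a symmetric 2×2 matrix, turning circular sources into slight ellipses; estimating (g1, g2) to the stated 0.003 precision from noisy, blurred images is the statistical problem.

```r
# Apply the shear matrix A = [1+g1, g2; g2, 1-g1] to an n x 2 matrix
# of (u, v) coordinates; g1 and g2 are assumed small (|g| << 1)
shear <- function(pts, g1, g2) {
  A <- matrix(c(1 + g1, g2, g2, 1 - g1), 2, 2)
  pts %*% t(A)
}

# A unit circle of 100 points becomes a slightly elongated ellipse
theta  <- seq(0, 2 * pi, length.out = 100)
circle <- cbind(cos(theta), sin(theta))
distorted <- shear(circle, g1 = 0.05, g2 = 0)
```

With g2 = 0, the map simply stretches the u-axis by 1 + g1 and shrinks the v-axis by 1 - g1, which is why a 0.003 target on g is such a demanding precision.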
