Archive for climate change

agent-based models

Posted in Books, pictures, Statistics on October 2, 2018 by xi'an

An August issue of Nature I recently browsed [on my NUS trip] contained a news feature on agent-based models applied to understanding the opioid crisis in the US. (With a rather sordid picture of a drug injection in Philadelphia, hence my own picture.)

To create an agent-based model, researchers first ‘build’ a virtual town or region, sometimes based on a real place, including buildings such as schools and food shops. They then populate it with agents, using census data to give each one its own characteristics, such as age, race and income, and to distribute the agents throughout the virtual town. The agents are autonomous but operate within pre-programmed routines — going to work five times a week, for instance. Some behaviours may be more random, such as a 5% chance per day of skipping work, or a 50% chance of meeting a certain person in the agent’s network. Once the system is as realistic as possible, the researchers introduce a variable such as a flu virus, with a rate and pattern of spread based on its real-life characteristics. They then run the simulation to test how the agents’ behaviour shifts when a school is closed or a vaccination campaign is started, repeating it thousands of times to determine the likelihood of different outcomes.
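The recipe in the quote can be sketched in a few lines of code. To be clear, everything below (population size, contact rule, transmission rate) is an invented toy for illustration, not the researchers' actual model; only the 5% skip-work and 50% meeting chances echo the quote:

```python
import random

def run_once(n_agents=200, n_days=60, p_skip=0.05, p_meet=0.5,
             p_transmit=0.2, rng=None):
    """One simulation run: returns the final number of infected agents."""
    rng = rng or random.Random()
    infected = [False] * n_agents
    infected[0] = True                       # introduce the 'virus'
    for _ in range(n_days):
        # each agent has a 5% daily chance of skipping work
        at_work = [i for i in range(n_agents) if rng.random() > p_skip]
        sick_at_work = [i for i in at_work if infected[i]]
        for i in at_work:
            if infected[i]:
                continue
            # 50% chance of meeting any given sick contact at work
            for j in sick_at_work:
                if rng.random() < p_meet and rng.random() < p_transmit:
                    infected[i] = True
                    break
    return sum(infected)

# repeat the simulation many times to get a distribution of outcomes
rng = random.Random(42)
outcomes = [run_once(rng=rng) for _ in range(100)]
print(min(outcomes), max(outcomes))
```

The last step is the one the article emphasises: a single run is just one realisation, and only the distribution over repeated runs says anything about the likelihood of different outcomes.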

While I am obviously supportive of simulation-based solutions, I cannot but express some reservation about the outcome, given that it is the product of the assumptions in the model. In Bayesian terms, this is purely prior predictive rather than posterior predictive. There is no hard data to create “realism”, apart from the census data. (The article also mixes the outcome of the simulation with real data, or rather with epidemiological data, not yet available according to the authors.)
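The prior versus posterior predictive distinction can be made concrete with a made-up Beta-Binomial toy (nothing to do with the article's agent-based model): the same simulator produces very different predictions depending on whether its parameter is drawn from the prior alone or from the prior conditioned on observed data.

```python
import random

def predictive(a, b, n=20, reps=1000, rng=None):
    """Simulate n binary outcomes, reps times, drawing the success
    probability from a Beta(a, b) distribution each time."""
    rng = rng or random.Random()
    draws = []
    for _ in range(reps):
        theta = rng.betavariate(a, b)                 # parameter draw
        draws.append(sum(rng.random() < theta for _ in range(n)))
    return draws

rng = random.Random(0)
# prior predictive: parameters drawn from the flat prior Beta(1, 1)
prior_pred = predictive(1, 1, rng=rng)
# posterior predictive: after observing 15 successes in 20 trials,
# the Beta(1, 1) prior updates to Beta(16, 6)
post_pred = predictive(16, 6, rng=rng)
print(sum(prior_pred) / 1000, sum(post_pred) / 1000)
```

Without hard data to condition on, an agent-based model can only produce the analogue of the first distribution, however realistic the virtual town looks.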

In response to the opioid epidemic, Bobashev’s group has constructed Pain Town — a generic city complete with 10,000 people suffering from chronic pain, 70 drug dealers, 30 doctors, 10 emergency rooms and 10 pharmacies. The researchers run the model over five simulated years, recording how the situation changes each virtual day.
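A skeleton of such a setup might look as follows; the population counts come from the quoted description, but every rate and behavioural rule below is a placeholder invented for illustration, not Bobashev's actual model:

```python
import numpy as np

# Counts taken from the quoted description of Pain Town
POPULATION = {"patient": 10_000, "dealer": 70, "doctor": 30,
              "emergency_room": 10, "pharmacy": 10}

def run(years=5, p_transition=0.0005, seed=1):
    """Run the model day by day over five simulated years, recording
    the state each virtual day. The daily transition probability is a
    hypothetical placeholder for the model's behavioural rules."""
    rng = np.random.default_rng(seed)
    susceptible = POPULATION["patient"]   # chronic-pain patients at risk
    affected = 0
    history = []
    for _ in range(365 * years):
        # aggregate draw: how many patients' situations worsen today
        new = rng.binomial(susceptible, p_transition)
        susceptible -= new
        affected += new
        history.append(affected)
    return history

history = run()
print(len(history), history[-1])
```

The record-each-virtual-day structure is the point: the output is a full trajectory per run, not a single summary number.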

This is not to criticise the use of such tools to experiment with social, medical or political interventions, which practically and ethically cannot be tested in real life. Working with such targeted versions of the Sims game can paradoxically be more convincing when dealing with policy makers, provided they do not object to the artificiality of the outcome, as they often do for climate change models. Just from reading this general-public article, I thus wonder whether model selection and validation tools are implemented in conjunction with agent-based models…

and it only gets worse…

Posted in Kids, pictures on October 6, 2017 by xi'an

“An internal Interior Department memo has proposed lifting restrictions on exploratory seismic studies in the Arctic National Wildlife Refuge, a possible first step toward opening the pristine wilderness area to oil and gas drilling.” NYT, Sept 17, 2017

“The Trump administration opened the door to allowing more firearms on federal lands. It scrubbed references to “L.G.B.T.Q. youth” from the description of a federal program for victims of sex trafficking. And, on the advice of religious leaders, it eliminated funding to international groups that provide abortion.” NYT, Sept 11, 2017

“On Aug. 18, the National Academies of Sciences, Engineering and Medicine received an order from the Interior Department that it stop work on what seemed a useful and overdue study of the health risks of mountaintop-removal coal mining.” NYT, Sept 9, 2017

“Last month the National Oceanic and Atmospheric Administration dissolved its 15-member climate science advisory committee, a panel set up to help translate the findings of the National Climate Assessment into concrete guidance for businesses, governments and the public.” NYT, Sept 9, 2017

“Climate contrarians, like Trump’s EPA administrator Scott Pruitt and Energy Secretary Rick Perry, don’t understand how scientific research works. They are basically asking for a government handout to scientists to do what scientists should already be doing. They are also requesting handouts for scientists who have been less successful in research and publications – a move antithetical to the survival-of-the-fittest approach that has formed the scientific community for decades.” The Guardian, Aug 31, 2017

sinking ever deeper in a bottomless pit…

Posted in Kids on June 2, 2017 by xi'an

and it only gets worse…

Posted in Kids, pictures, Travel on April 7, 2017 by xi'an

“The State Department said on Monday it was ending U.S. funding for the United Nations Population Fund, the international body’s agency focused on family planning as well as maternal and child health in more than 150 countries.” Reuters, April 3, 2017

“When it comes to science, there are few winners in US President Donald Trump’s first budget proposal. The plan, released on 16 March, calls for double-digit cuts for the Environmental Protection Agency (EPA) and the National Institutes of Health (NIH). It also lays the foundation for a broad shift in the United States’ research priorities, including a retreat from environmental and climate programmes.” Nature, March 16, 2017

“In light of the recent executive order on visas and immigration, we are compelled to speak out in support of our international members. Science benefits from the free expression and exchange of ideas. As the oldest scientific society in the United States, and the world’s largest professional society for statisticians, the ASA has an overarching responsibility to support rigorous and robust science. Our world relies on data and statistical thinking to drive discovery, which thrives from the contributions of a global community of scientists, researchers, and students. A flourishing scientific culture, in turn, benefits our nation’s economic prosperity and security.” ASA, March 2017

Spring, already?!

Posted in pictures on February 17, 2014 by xi'an


simulating Nature

Posted in Books, Statistics on July 25, 2012 by xi'an

This book, Simulating Nature: A Philosophical Study of Computer-Simulation Uncertainties and Their Role in Climate Science and Policy Advice, by Arthur C. Petersen, was sent to me twice by the publisher for review in CHANCE. As I could not find a nearby “victim” to review the book, I took it with me to Australia and read it in bits and pieces along the trip.

“Models are never perfectly reliable, and we are always faced with ontic uncertainty and epistemic uncertainty, including epistemic uncertainty about ontic uncertainty.” (page 53)

The author, Arthur C. Petersen, was a member of the United Nations’ Intergovernmental Panel on Climate Change (IPCC) and works as chief scientist at the PBL Netherlands Environmental Assessment Agency. He mentions that the first edition of this book, Simulating Nature, achieved some kind of cult status while now being out of print, which is why he wrote this second edition. The book centres on the notion of uncertainty connected with computer simulations in the first part (pages 1-94) and on the same analysis applied to the simulation of climate change, based on the experience of the author, in the second part (pages 95-178). I must warn the reader that, as the second part got too focussed and acronym-filled for my own taste, I did not read it in depth, even though the issues of climate change and of the human role in this change are definitely of interest to me. (Readers of CHANCE must also realise that there is very little connection with Statistics in this book or my review of it!) Note that the final chapter is actually more of a neat summary of the book than a true conclusion, so a reader eager to get an idea about the contents of the book can grasp them through the eight pages of the eighth chapter.

“An example of the latter situation is a zero-dimensional (sic) model that aggregates all surface temperatures into a single zero-dimensional (re-sic) variable of globally averaged surface temperature.” (page 41)

The philosophical questions of interest therein are that a computer simulation of reality is not reproducing reality and that the uncertainty(ies) pertaining to this simulation cannot be assessed in its (their) entirety. (This is the inherent meaning of the first quote, epistemic uncertainty relating to our lack of knowledge about the genuine model reproducing Nature or reality…) The author also covers the more practical issue of the interface between scientific reporting and policy making, which reminded me of Christl Donnelly’s talk at the ASC 2012 meeting (about cattle epidemics in England). The book naturally does not bring answers to any of those questions, naturally because a philosophical perspective should consider different sides of the problem, but I find it more interested in typologies and classifications (of types of uncertainties, in crossing those uncertainties with panel attitudes, &tc.) than in the fundamentals of simulation. I am obviously incompetent in the matter, however, as a naïve bystander, it does not seem to me that the book makes any significant progress towards setting epistemological and philosophical foundations for simulation. The part connected with the author’s involvement in the IPCC sheds more light on the difficulties of operating in committees and panels made of members with heavy political agendas than on the possible assessments of uncertainties within the models adopted by climate scientists… With the same provision as above, the philosophical aspects do not seem very deep: the (obligatory?!) reference to Karl Popper does not bring much to the debate, because what is falsification to simulation? Similarly, Lakatos’ prohibition of “direct[ing] the modus tollens at [the] hard core” (page 40) does not turn into a methodological assessment of simulation praxis.

“I argue that the application of statistical methods is not sufficient for adequately dealing with uncertainty.” (page 18)

“I agree (…) that the theory behind the concepts of random and systematic errors is purely statistical and not related to the locations and other dimensions of uncertainty.” (page 55)

Statistics is mostly absent from the book, apart from the remark that statistical uncertainty (understood as the imprecision induced by a finite amount of data) differs from modelling errors (the model is not reality), which the author considers cannot be handled by statistics (stating that Deborah Mayo’s theory of statistical error analysis cannot be extended to simulation, see the footnote on page 55). [In other words, this book has no connection with Monte Carlo Statistical Methods! With or without capitals… Except for a mention of `real’ random number generators in one of the many footnotes on page 35.] Mention is made of “subjective probabilities” (page 54), presumably meaning a Bayesian perspective. But the distinction between statistical uncertainty and scenario uncertainty which “cannot be adequately described in terms of chances or probabilities” (page 54) misses the Bayesian perspective altogether, as does the following sentence that “specifying a degree of probability or belief [in such uncertainties] is meaningless since the mechanism that leads to the events are not sufficiently known” (page 54).

“Scientists can also give their subjective probability for a claim, representing their estimated chance that the claim is true. Provided that they indicate that their estimate for the probability is subjective, they are then explicitly allowing for the possibility that their probabilistic claim is dependent on expert judgement and may actually turn out to be false.” (page 57)

In conclusion, I fear the book does not bring enough of a conclusion on the philosophical justifications of using a simulation model instead of the actual reality and on the more pragmatic aspects of validating/invalidating a computer model and of correcting its imperfections with regard to data/reality. I am quite conscious that this is an immensely delicate issue and that, were it to be entirely solved, the current level of fight between climate scientists and climato-skeptics would not persist. As illustrated by the “Sound Science debate” (pages 68-70), politicians and policy-makers are very poorly equipped to deal with uncertainty and even less with decision under uncertainty. I however do not buy the (fuzzy and newspeak) concept of “post-normal science” developed in the last part of Chapter 4, where the scientific analysis of a phenomenon is abandoned for decision-making, “not pretend[ing] to be either value-free or ethically neutral” (page 75).

Le Monde on E. Wegman

Posted in Statistics on December 31, 2011 by xi'an

In addition to the solution to the wrong problem, Le Monde of last weekend also dedicated a full page of its Science supplement to the coverage of Michael Mann’s hockey-stick curve of temperature increase and the hard time he has been given by climato-skeptics since its publication in 1998… The page includes an insert on Ed Wegman’s 2006 [infamous] report for the U.S. Congress, amply documented on Andrew’s blog. And mentions the May 2011 editorial of Nature on the plagiarism investigation. (I reproduce it above as it is not available on the Le Monde website.)