The Royal Statistical Society reinvents statistics

I read this post by Nassim Nicholas Taleb yesterday morning, checked out the links and still can’t quite believe my eyes. Last December, the Royal Statistical Society announced its first ever International Statistic of the Year, part of “a new initiative that celebrates how statistics can help us better understand the world around us.” The winning statistic was 69:

This is the annual number of Americans killed, on average, by lawnmowers – compared to two Americans killed annually, on average, by immigrant Jihadist terrorists.

The figure was highlighted in a viral tweet this year from Kim Kardashian in response to a migrant ban proposed by President Trump; it had originally appeared in a Richard Todd article for the Huffington Post…

Todd and Kardashian’s use of these figures shows how everyone can deploy statistical evidence to inform debate and highlight misunderstandings of risk in people’s lives.

The Huffington Post article essentially claimed (inter alia) that lawnmowers are more of a danger to American lives than terrorists because the number of people killed by lawnmowers in the US was greater than the number killed by terrorists in 2005-14. According to the author’s estimates of the 10-year averages, 69 people per year were killed by lawnmowers, 31 by lightning, and 14 (if I understand his table correctly) by Islamic and “far right-wing” terrorists. Therefore, the author believes, “the odds are greater that you will be struck by lightning than… be killed by an ISIS terrorist.”
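For reference, the arithmetic behind such claims is just a division. Here's a minimal sketch of the naive calculation, using the annual averages quoted above; the round 320 million population figure and the conversion into a "1 in N" annual risk are my own additions for illustration, not anything from Todd's article:

```python
US_POPULATION = 320_000_000  # assumed round figure for 2005-14

# Annual averages as quoted in the Huffington Post piece
avg_deaths_per_year = {
    "lawnmowers": 69,
    "lightning": 31,
    "terrorism (Islamic + far right-wing)": 14,
}

for cause, per_year in avg_deaths_per_year.items():
    annual_risk = per_year / US_POPULATION  # naive per-capita annual risk
    print(f"{cause}: {per_year}/yr, about 1 in {1 / annual_risk:,.0f} per year")
```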

This isn’t valid statistical inference; I would call it a case of naive frequentism. N. N. Taleb has pointed out the weakness of this logic on YouTube, Twitter and Facebook. Queen Mary University of London professors Norman Fenton and Martin Neil (who are also co-founders of Agena) have produced a note explaining the fallacies in Todd’s reasoning in reasonably non-technical terms. Here’s their summary assessment:

Contrary to the statement in the Royal Statistical Society citation, the figures directly comparing numbers killed by lawnmower with those killed by Jihadist terrorists, do NOT ‘highlight misunderstandings of risk’ or ‘illuminate the bigger picture’. They do the exact opposite as we explain here.

Fenton and Neil, as well as Taleb, provide a number of good reasons why the statistical inference that so delighted the RSS makes no sense. I don’t have much to add except to restate one of their obvious points. Even if terrorist attacks were natural events like major earthquakes or volcanic eruptions, rather than products of human behavior, this reasoning would be comedy material. “In the past ten years, no one’s died in an earthquake around here – this area is safe, perfectly safe.”
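To see the fallacy in numbers, here is a toy simulation; every probability and casualty figure in it is invented for illustration, not an estimate of anything real:

```python
import random

random.seed(1)

# A steady, "lawnmower-like" process: roughly the same toll every year.
def steady_year():
    return random.gauss(69, 8)

# A heavy-tailed, "attack-like" process: almost nothing most years,
# with a small chance of a mass-casualty year.
def heavy_tailed_year():
    return 3000 if random.random() < 0.02 else max(0.0, random.gauss(2, 1))

# Long-run mean of the heavy-tailed process: 0.02*3000 + 0.98*2, about 62/yr.
long_run = sum(heavy_tailed_year() for _ in range(100_000)) / 100_000

# But most 10-year windows contain no catastrophic year (0.98**10 is about 82%),
# so the windowed average looks reassuringly tiny.
windows = [sum(heavy_tailed_year() for _ in range(10)) / 10 for _ in range(10_000)]
looks_safe = sum(1 for w in windows if w < 10) / len(windows)

print(f"typical 10-year 'lawnmower' average: {sum(steady_year() for _ in range(10)) / 10:.0f}/yr")
print(f"long-run mean of the heavy-tailed process: {long_run:.0f}/yr")
print(f"share of 10-year windows that look 'safe' (under 10/yr): {looks_safe:.0%}")
```

The quiet windows say nothing about the tail, which is exactly the point of the earthquake joke above.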

So how come? Perhaps the seven-member judging panel was staffed with non-statisticians? Well, yes and no. Mona Chalabi is a “data journalist”; Mark Easton is a BBC editor; Ben Page, the CEO of Ipsos MORI, is a corporate manager. Diane Coyle, a Harvard-trained economist, and Jil Matheson, with decades of experience at government statistical agencies, cannot be ignorant of elementary inference errors, but it’s understandable how political considerations might have gotten the better of them.

This leaves two academically trained statisticians: chairman David Spiegelhalter, formerly a student of Adrian Smith at Nottingham, and Liberty Vittert, a lecturer at the University of Glasgow, where she received her PhD in math and statistics, having earned her undergrad degree from MIT. Dr. Vittert is the only person on the jury who seems to be actually doing research in her main field (Spiegelhalter is mostly a science popularizer these days). With this in mind, it’s depressing to read her exulting in Kim Kardashian’s statistical perspicacity:

Everyone on the panel was particularly taken by this statistic and its insight into risk – a key concept in both statistics and everyday life. When you consider that this figure was put into the public domain by Kim Kardashian, it becomes even more powerful because it shows anyone, statistician or not, can use statistics to illustrate an important point and illuminate the bigger picture.

I hope someone generalizes Poe’s law to cover these kinds of situations: you’d think this is a quote from The Onion. Outside their discipline, out in the real world, mathematicians can be as clueless, deluded and dishonest as anybody else. But this is Dr. Vittert’s core competency. Is her paean to fake stats a rite of passage of sorts? Does one have to incant nonsense to get tenure these days?

I would understand if the RSS had focused on another statistic – the number of Americans shot by other Americans, 1,737 per year on average. That’s 25 times the number of lawnmower deaths, and the nature of this risk is closer to the risk of being killed by terrorists. The Society’s choice would have held water, at least for some time. I’d still be in favor of the right to bear arms, but it’s a conviction that feeds on the visceral rather than the empirical.

3 Comments

  1. The comparison fails for me on the grounds that using a lawnmower is useful, and hence the risks associated with it are a trade-off, but there is no utility to be had from terrorists killing someone. It can be argued that terrorism is a risk associated with other, perhaps useful, things – such as immigration or relatively lax security controls – but then they should be comparing those things with lawnmowers.

    • The RSS probably didn’t mean to get into cost-benefit analysis. They were trying to say that the average American is more likely to be killed by a lawnmower than by a terrorist. The “proof” is the average number of fatalities per year caused by lawnmowers and terrorists in 2005-14.

      Among the flaws in this argument is that it gets destroyed by moving or extending this 10-year period (see the sketch below). Kardashian quoted an article from 2016, when the most recent stats available were for 2014. Move the window forward to 2008-17, and you’ll get a higher number. Expand it backward to include 2001, and you’ll get a picture that differs by orders of magnitude (3,000 victims over 15 years = 200 per year).

      Also, some stats are well predicted by their past frequencies, like the number of births and deaths in any given year. They aren’t very sensitive to short-term government policies, especially the number of deaths. Terrorist attacks are the opposite – the past is a poor guide for the future, and government policies matter a lot.
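      A quick sketch of that window sensitivity; every yearly count below is a flat placeholder except the roughly 3,000 deaths of 2001, so treat it as an illustration rather than real data:

      ```python
      # Placeholder yearly counts: a flat 14/yr, except ~3,000 in 2001.
      deaths = {year: 14 for year in range(2001, 2016)}
      deaths[2001] = 3000

      def window_avg(first, last):
          years = range(first, last + 1)
          return sum(deaths[y] for y in years) / len(years)

      print(f"2005-2014 average: {window_avg(2005, 2014):.0f}/yr")  # the quoted window
      print(f"2001-2015 average: {window_avg(2001, 2015):.0f}/yr")  # same data, plus 2001
      ```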
