I decided to start this blog to highlight some of the misuses of statistical data found in journalistic articles, and to give some ideas as to how to spot them. I don’t have as much experience as writers like the Bad Science columnist Ben Goldacre, or the authors of numerous other blogs on the subject (links to which will gradually appear on this site in due course). But I am inspired enough by reading his book, and enraged enough by the causes and consequences of poor statistics in the mainstream media, that I want to add my voice to the cause.
I have two major topics of interest:
- Public health (my professional interest – I’m currently doing a PhD on the social causes of differences in life expectancy)
- Free and open source software (a hobby of mine – though something I feel particularly strongly about).
I have noticed a number of parallels between the two subjects, in terms of how data is misused to mislead the general public, along with a rather alarming wider problem: a societal inability to critically assess what numbers actually mean and what they can tell us. Arguments are frequently made on the basis that “the numbers say it all” or “the stats speak for themselves”. This is rarely (quite possibly, never) the case. I believe this stems from the widely held belief that anything to do with numbers involves complicated maths that the average person on the street couldn’t possibly understand, yet is somehow magically authoritative. This, again, is usually not the case.