This is an extended version of a comment I left on an article in the Irish Times by Dr David Robert Grimes, a science writer and physicist at Oxford University. The article is titled ‘Ideological fixations can lead people to believe what they want to believe’. In it, the author describes how quantitative information ‘can be distorted by people’s political bias’, and how ‘we are quite statistically innumerate and easily misled’.
It’s unfortunate that the author, who denounces ‘poor statistical reasoning in media’, has become a factor in the very social phenomenon he set out to denounce. He has used dodgy evidence to support his claim of a ‘vast gulf’ between what the public thinks is true and what is true in fact. If such a ‘chasm’ is indeed vast, the author has just widened it.
Let’s ‘think like scientists’, and evaluate the evidence for the claims made in this article ‘in a critical and detached manner’, as the author recommends.
To support his claim that there is a ‘vast gulf between what is popularly accepted and what is objectively true’, the author says ‘almost half the country are (sic) under the impression that politicians receive the most money from the public purse’. This evidence is from an Irish Times survey question, which reads as follows:
‘Which of the following groups receive the most from the public purse in terms of direct payments to them?’
48% of those surveyed answered ‘politicians’. The Irish Times said the correct answer was ‘Welfare recipients’. The author agrees.
But there are also excellent grounds for answering ‘politicians’. Read per individual rather than in aggregate, the typical welfare recipient or public servant receives far less in direct payments from the public purse than an elected TD does. The question is ambiguous, and therefore unreliable, and so too is the evidence the author draws from it.
Why doesn’t such a question strike even people with considerable experience of scientific research as patently bogus? Why isn’t it obvious that such a question has been fashioned in order to confirm a particular conception of what and how the public thinks – indeed, what the public is?
It isn’t hard to imagine a plausible genesis of this question. Let’s momentarily cast scientific rigour to the wind, and speculate that it happened something like this. Political correspondent has conversations with politicians who complain to him about members of the public complaining how much they hate politicians, how overpaid they are, and so on and so forth. Political correspondent thinks some of the politicians he meets are dacent spuds and that the public, as is its wont, is being unreasonable. Political correspondent automatically dismisses the possibility that calls for cuts to politicians’ pay are based not on statistical innumeracy and ignorance, but on a sense that someone who earns lots of money by everyday standards has no business taking decisions that impoverish those who are already struggling financially. Political correspondent thinks, what kind of a question can I come up with to measure the rabblement’s ignorance on this particular point? Political correspondent contacts polling company. Question is formulated.
Granted, it mightn’t have happened like this at all. It may just be that poorly formulated multiple choice guess questions with arbitrarily selected answers conform to the most respected scientific methods for establishing how well the public is informed. Which wouldn’t make such an approach any more reliable, but never mind.
The irony here is that the author’s dodgy evidence effectively supports his own claims about ‘our tendency to acknowledge evidence which agrees with our views and to disregard conflicting evidence’, and ‘ideological fixation’. If you’re convinced the public is poorly informed, you’ll seek out evidence that confirms this belief, even when the evidence doesn’t conform to the minimum standards of scientific rigour you might otherwise demand: what the author describes as ‘confirmation bias’.
The irony is compounded by the fact that his claims are published in the same organ that produced the dodgy evidence in the first place. It isn’t the author’s fault, but this is an organ which only yesterday ran the headline that ‘Reports of welfare fraud’ (reports, that is) had risen by 2500%. The detail of the article revealed that only 16 per cent of the reports analysed (reports analysed, not reports on the whole) had led to payments being stopped.
In other words, 84 per cent of the welfare fraud reports analysed (there was no indication of the sample size) were based on nothing more than prejudice, a consideration that troubled neither the Department of Social Protection nor the Irish Times. (At last check, the report was also claiming that 8,350 was more than half of 21,000.) By sheer coincidence, the same organ on the same day published a cotton-wool-soft piece on the Minister for Social Protection, Joan Burton. Perhaps this tells us something about why the Irish Times might publish articles that confirm the view that the public is dangerously ignorant.