Contributing Editor David Hand explains that while statistics as a field is not impossible to understand, it is often misunderstood:

The importance of public understanding of science, and of outreach activities more generally, is now widely accepted. Much research funding comes from public sources, so there’s an obligation to ensure that the money is well spent, and that people understand what they are getting in return. Beyond that, a well-informed population is necessary so that people understand the modern world and can make rational decisions about their lives.

However, we have to acknowledge that science is a rather esoteric activity. Contrary to what is believed in some quarters, scientists do not deliberately create technical jargon with the aim of obscuring their meaning to the non-cognoscenti. Rather, the technical jargon is created to reduce the risk of ambiguities and mistaken meanings, and to encapsulate complex concepts in concise terms.

Language difficulties aside, the fact is that science is difficult to understand. How could it be otherwise? If it were easy, it would not be necessary for researchers to devote their lives to it.

But this very difficulty, along with the necessarily complex language, does stimulate suspicions about scientific research. Some years ago, I led a discussion about research for an audience of interested laypeople at the Dana Centre in the UK. I was mainly talking about clinical trials, and I still recall the person who said they didn’t trust any published research results. “After all,” they said, “someone had to fund the research, and why would they do that unless they had a vested interest in the outcome?”

Clearly the key here is education. This point was also driven home by the results of a study exploring the public’s views on using administrative data for research purposes, carried out by Ipsos MORI for the UK’s Office for National Statistics and the Economic and Social Research Council. This study found that “participants generally had a very low initial awareness and understanding of social research … at the beginning of the dialogues, low awareness of the uses of social research drove scepticism about its value…. [Participants in the study] who started the day with low awareness of social research (and therefore low trust) found discussing the issues and speaking to experts interesting and reassuring.”

So the answer is education, enhanced public understanding, and so on. But the obstacle is that it’s tough. It’s one thing to find that members of the public who are prepared to spend a day listening to and talking to researchers grow in understanding, awareness, and appreciation of research, but we can hardly expect everyone to do so—even if we scientists had the time.

If these challenges are tough in science in general, they are particularly troublesome in statistics. This is not because statistics is a uniquely tough discipline—though tough it is. It’s because, on the one hand, many people must have a basic grasp of the subject in order to carry out their jobs (statistics is almost uniquely ubiquitous), while on the other hand, a basic grasp may not do the job. Things are then further complicated by the ready availability of tools which allow people to appear to undertake highly sophisticated analyses, regardless of whether they understand what they are doing. After all, a computer program just manipulates the numbers, in ignorance of what those numbers mean.

Take medical research, for example. There have been several studies investigating statistical errors in published medical research papers (with rather alarming results). In most cases, they are errors of the kind one would hope a professional statistician would avoid. A large part of the danger is the dumbing down of statistics, to simple, easily understood—and wrong—answers. Statistical ideas are often subtle: think of the Monty Hall problem, or concepts such as regression to the mean or Simpson’s paradox, for example. It can take extensive study to understand, appreciate, and avoid such misconceptions.
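These puzzles are easy to state but hard to reason about informally, which is part of the point. As a minimal sketch (in Python, and not from the original article), here is a simulation of the Monty Hall problem; the function and variable names are purely illustrative. It estimates the win rate for a player who stays with their first door and for one who switches, and switching comes out ahead at roughly two-thirds, contrary to most people's first intuition.

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)    # door hiding the car
    pick = random.choice(doors)   # player's initial choice
    # The host opens a door that is neither the player's pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # The player moves to the one remaining unopened door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning under a fixed strategy by simulation."""
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

if __name__ == "__main__":
    print(f"Stay:   {win_rate(switch=False):.3f}")  # close to 1/3
    print(f"Switch: {win_rate(switch=True):.3f}")   # close to 2/3
```

The simulation settles in a few lines what informal argument often gets wrong, which is precisely why a basic grasp of statistics can mislead as easily as it helps.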

The underlying point is that statistical concepts require more careful study than most researchers in other domains, or people presented with statistics, have time for. The resulting misunderstandings help to promote mistrust of the discipline.

All of this makes me wonder if, for statistics, the emphasis should be not so much on public understanding as on public appreciation. That is, on trying to communicate a recognition of the importance of the discipline, coupled with an appreciation that one needs advanced technical skills to do it properly, rather than placing so much emphasis on the (probably doomed) effort of trying to equip people with those skills themselves. Perhaps we need a greater recognition of the fact that building valid understanding from data requires sophisticated expertise, of the kind possessed by statisticians, just as building successful rockets requires sophisticated expertise, of the kind possessed by rocket scientists.
