The director of marketing at my college, who also teaches a communication course, asked me to order The Misinformation Age: How False Beliefs Spread. It turns out this book discusses many of the things I’m studying in my “Science and the Media” course right now. The title might lead you to think this book is about fake news, but it goes far beyond that, covering the many ways real information can be manipulated or shared selectively to alter what people think.
A couple of things stood out for me as I read this book. First is something we talk about a lot in the library world — inaccurate or misleading information is a far greater problem than outright fake news. Really fake news — like the Pizzagate story — is often sensational and headline-grabbing, and we usually indulge in some collective hand-wringing after one of these stories explodes. What’s more dangerous, as Cailin O’Connor and James Owen Weatherall carefully detail in their book, is the deliberate or even inadvertent spread of information that is factual but shared in ways that give people the wrong idea. For example, the tobacco industry knew it couldn’t undo or entirely discredit the research linking smoking and cancer, so it settled on a different strategy, summed up in a memo that O’Connor and Weatherall quote: “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the public.”
In other words, the tobacco companies not only didn’t care that their product caused cancer; they also worked to make the public doubt the truth so they could go on selling cigarettes. The book goes into a fair bit of detail about their misinformation campaigns, which didn’t run solely through advertising — they recruited scientists to do research and then shared only the results they wanted public. What they shared wasn’t untrue, just highly selective, and it deluded people into thinking smoking was less harmful than it is.
That is the root of The Misinformation Age. O’Connor and Weatherall share mathematical models that explain how scientists and others share and assess information. The way we do this — ostensibly to get to the most accurate view we can of something — is informed by a number of psychological tendencies related to how we decide who and what to trust. When bad actors, like the tobacco industry or other commercial or political operatives, interfere with the way we receive information, we sometimes never even have the chance to reach the right conclusions. The second thing that stood out for me as I read is that these social influences affect not only the public but also experts in science and the media, often slowing, if not completely halting, those experts’ progress toward truth.
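If you’re curious what those models look like, here is a rough sketch, in Python, of the kind of network epistemology model the authors build on: scientists who believe a hypothesis test it, publish their results, and update their credences on everything published, while a propagandist also runs experiments but publishes only the runs that happened to look bad. To be clear, this is my own toy reconstruction, not the authors’ code, and every name and parameter in it is invented for illustration.

```python
# A toy network model of the kind O'Connor and Weatherall describe. All
# names and parameters are my own inventions; this is not the authors' code.
import random

N_AGENTS = 10    # scientists who all see one another's published results
N_TRIALS = 10    # experiments each believer runs per round
P_GOOD = 0.6     # true success rate if hypothesis H is true
P_ALT = 0.4      # success rate if H is false
PROP_RUNS = 50   # experiments the propagandist runs (publishing selectively)
N_ROUNDS = 50

def bayes_update(credence, successes, trials):
    """Update credence in H on a published result, using the likelihood
    ratio P(data | H) / P(data | not H) for binomial data (the binomial
    coefficients cancel in the ratio)."""
    lr = ((P_GOOD ** successes) * ((1 - P_GOOD) ** (trials - successes)) /
          ((P_ALT ** successes) * ((1 - P_ALT) ** (trials - successes))))
    return credence * lr / (credence * lr + 1 - credence)

def experiment(trials):
    """Number of successes in `trials` tries at the true rate P_GOOD."""
    return sum(random.random() < P_GOOD for _ in range(trials))

def run(propagandist=False):
    credences = [random.random() for _ in range(N_AGENTS)]
    for _ in range(N_ROUNDS):
        # Only scientists who currently believe H bother testing it.
        published = [(experiment(N_TRIALS), N_TRIALS)
                     for c in credences if c > 0.5]
        if propagandist:
            # Selective sharing: run many experiments, publish only the
            # runs that happened to look bad, the tobacco strategy.
            for _ in range(PROP_RUNS):
                s = experiment(N_TRIALS)
                if s <= N_TRIALS * P_ALT:
                    published.append((s, N_TRIALS))
        # Everyone updates on every published result.
        for s, t in published:
            credences = [bayes_update(c, s, t) for c in credences]
    return sum(c > 0.5 for c in credences)

random.seed(1)
print("believers in H, honest community:   ", run(False), "of", N_AGENTS)
print("believers in H, with a propagandist:", run(True), "of", N_AGENTS)
```

Even this crude version tends to show the dynamic the book describes: once the selectively shared failures drag enough scientists below the belief threshold, they stop experimenting, honest evidence dries up, and the propaganda faces no counterweight.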
And this is a book that doesn’t shy away from the idea that there is such a thing as truth. O’Connor and Weatherall are philosophers of science, so they come from a science perspective, but it’s worth remembering that many fields boil down to the same thing: there are facts (what happened, when, where, and with whom), which can be measured, quantified, described, and verified. And then there is how we view the facts. Truth is the raw material, and our conclusions can contain the truth but are not themselves necessarily the truth. So when we take in only selected facts, or facts that have been manipulated to steer us toward a particular conclusion, or facts produced in a particular way to benefit a particular person, group, or commercial or political entity, we will form views based on only part of the truth. Online media (both traditional and social) makes it very easy to package truth according to a particular frame or value and share it widely.
And that is much harder to fight than “fake” news. As O’Connor and Weatherall note, “Merely sussing out industrial or political funding or influence in the production of science is not sufficient. We also need to be attuned to how science is publicized and shared.” This means watching out for balance bias: “If journalists make efforts to be ‘fair’ by presenting results from two sides of scientific debate, they can bias what results the public sees in deeply misleading ways.” I recently gave up listening to national NPR coverage because I’d had it with how often a guest with prepared talking points not grounded in fact is invited on air and allowed to repeat them, without the reporter or host noting that the views expressed are unsubstantiated.
Last week I did tune in to an NHPR show, The Exchange, for an episode on vaccinations. I was delighted that the host and the panel responded to uninformed callers the way media should — they acknowledged that the anti-vaxx view exists, then calmly and factually explained why it is unsubstantiated. The host even responded to a caller who claimed the show was one-sided by noting that, given the level of consensus among medical professionals that vaccination is safe, effective, saves lives, and eradicates disease, it would be wrong to present “both sides” as if they were equal. This is responsible media. Especially in reporting science, rather than creating the false impression that all theories have merit, the media should explain when a consensus has been reached, how certain it is, and what conclusions can be drawn, even if that means discrediting views that aren’t evidence-based.
I can’t recommend this book highly enough. It’s a tough read, and you’ll be angry when you’re through — after all, you are part of this: “Public beliefs are often worse than ignorant: they are actively misinformed and manipulated.” But you may feel better equipped to seek evidence and resist misinformation, which is good for all of us, after reading this well-documented, well-reasoned book.