Via the Cliscep post “Don’t call me an alarmist,” says alarmist, I landed on this Live Science article: Treading the Fine Line Between Climate Talk and Alarmism. It is an op-ed written by Sarah E. Myhre about climate change communication and her wish not to be called an alarmist.
One thing that caught my attention in the op-ed was this statement:
We would never fault an oncologist for informing patients about the cancer risks that come with smoking. Why would we expect Earth scientists to be any different, when we’re just as certain?
It is not clear from the text what we should expect from those Earth scientists. Luckily, the link goes to an article in Scientific American, titled “Climate Risks as Conclusive as Link between Smoking and Lung Cancer”. So apparently, she means that Earth scientists know as much about climate risks as medical scientists know about the link between smoking and lung cancer…
Looking at the data of the second experiment of the Neutralizing misinformation through inoculation paper, I came across something rather strange. The time to complete the survey was also recorded, and some of the participants finished the survey in an incredibly short timespan.
Let me first explain how I got there. I stumbled on it incidentally while looking at something else that initially puzzled me. This is how the population of experiment 2 was selected, as explained in the section “Participants”:
Participants (N = 400) were a representative U.S. sample, recruited through Qualtrics.com, based on U.S. demographic data on gender, age, and income in the same fashion as for Experiment 1 (49.2% female, average age M ≈ 43 years, SD ≈ 15 years). The sample delivered by Qualtrics comprised only participants who had successfully answered all attention filter items. None of the participants had participated in Experiment 1. Outliers in the time taken to complete the survey (n = 8) were eliminated according to the outlier labelling rule as in Experiment 1. The final sample of participants (N = 392) were randomly allocated to the four experimental conditions: control (n = 98), inoculation (n = 98), misinformation (n = 99), and inoculation+misinformation (n = 97).
As explained here, I understood that there were 400 participants, 8 of whom were outliers in the time they took to complete the survey. Subtracting those eight outliers left 392 final participants, who were randomly allocated to the four experimental conditions.
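That outlier-removal step can be sketched in code. The paper only names the “outlier labelling rule”; the usual form of that rule (Hoaglin and Iglewicz) fences values at the quartiles ± g × IQR with g = 2.2. The multiplier and the completion times below are assumptions for illustration, not data from the paper:

```python
import statistics

def outlier_labelling_bounds(values, g=2.2):
    """Return the (lower, upper) fences of the outlier labelling rule:
    Q1 - g*IQR and Q3 + g*IQR (g = 2.2 per Hoaglin & Iglewicz)."""
    q1, _, q3 = statistics.quantiles(values, n=4, method="inclusive")
    iqr = q3 - q1
    return q1 - g * iqr, q3 + g * iqr

# Hypothetical completion times in seconds (NOT the paper's data)
times = [55, 300, 420, 510, 540, 600, 660, 720, 780, 9000]

low, high = outlier_labelling_bounds(times)
kept = [t for t in times if low <= t <= high]
# The extremely slow response (9000 s) is flagged as an outlier,
# but the extremely fast one (55 s) survives, because the lower
# fence lands far below zero.
```

Note the asymmetry this rule produces on completion times: with a skewed distribution the lower fence can be negative, so no completion time is ever “too fast” to be flagged, which is consistent with very short times remaining in the data.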
In the order in which it is explained in that paragraph, it didn’t make much sense to me.
This is already the third post on the Neutralizing misinformation through inoculation paper of Cook, Lewandowsky and Ecker (2017). This post will focus on one paragraph in the general conclusions of this paper (my emphasis):
The ongoing focus on questioning the consensus, in concert with the gateway belief status of perceived consensus, underscores the importance of communicating the consensus [68,69]. However, positive consensus messaging is not sufficient, given recent findings that misinformation can undermine positive information about climate change [33,56]. As a complement to positive messages, inoculation interventions are an effective way to neutralize the influence of misinformation.
Although these are nice-sounding conclusions, I have one problem with them: they could never be among the conclusions that can be drawn from the two experiments described in the paper…
When I hear the complaint that there is a problem with false balance in media coverage of the climate debate, I am always surprised. False balance means that both sides of the debate get equal time, so the perception is that both are equally likely, even when this is not the case. Personally, I am surprised because I hardly see this in practice. When there is a debate on climate change on, for example, our television, radio or in our newspapers, it is between like-minded individuals from the alarmist side who might differ on some technicalities, but generally agree with each other.
Heck, when I look back on my believer years, I can’t even remember who those “deniers” were or what they stood for. If I heard a skeptical argument, it came from an alarmist, who presented it in a denigrating way and then shot it down. You know, THAT much balance.
This may differ in other countries, of course. The Inoculation paper of Cook and Lewandowsky had one experiment devoted to false balance media coverage, so apparently the authors considered it a big enough problem to devote one of the two experiments in their paper to it. I wondered in which country this is considered a problem. The study was approved by an Australian university, but the participants were recruited from the US population. Whatever the country may be, the skeptics there should be glad to have the opportunity to debate the other side with equal representation 😉
Experiment 1 in this paper investigates the effect of false balance on the perceived consensus of the participants, but the authors also investigated whether it was possible to counteract this effect by “inoculating” the participants against it: either by showing in advance how a certain misconception works, or by explaining in advance that there is a consensus.
There were 5 groups in this experiment:
John Cook and Stephan Lewandowsky (together with Ulrich Ecker) have released a new paper at the beginning of May 2017. It is called Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence.
The paper is about (skeptical) “misconceptions” and how to “neutralize” them by means of inoculation theory. It is a topic that I recognize: John Cook has written about this several times in the past. I was rather wary of his argumentation back then, and this time it is no different.
The paper is certainly more thoughtfully written than the Alice-in-Wonderland paper (by two of the same authors), but reading it, my impression is that this is not the work of neutral researchers. I noticed that right at the beginning, when I read the abstract. This is how it starts:
Misinformation can undermine a well-functioning democracy. For example, public misconceptions about climate science can lead to lowered acceptance of the reality of climate change and lowered support for mitigation policies.