Neutralizing misinformation through inoculation: false balance and the perceived consensus

When I hear the complaint that false balance in media coverage is a problem in the climate debate, I am always surprised. False balance means that both sides of a debate get equal time, creating the perception that both positions are equally credible, even when this is not the case. Personally, I am surprised because I hardly ever see this in practice. When there is a debate on climate change on, for example, our television, radio or in our newspapers, it is between like-minded individuals from the alarmist side who might differ on some technicalities, but generally agree with each other.

Heck, when I look back at my believer years, I can’t even remember who those “deniers” were or what they stood for. If I heard a skeptical argument, it came from an alarmist, who presented it in a denigrating way and then cut it down. You know, THAT much balance.

This may differ in other countries of course. The inoculation paper of Cook and Lewandowsky had one experiment devoted to false balance media coverage, so apparently the authors considered it a big enough problem to devote one of the two experiments in their paper to it. I wondered in which country this is considered a problem. The study was approved by an Australian university, but the participants were recruited from the US population. Whatever the country may be, the skeptics there should be glad to have the opportunity to debate the other side with equal representation 😉

Experiment 1 in this paper investigates the effect of false balance on the participants’ perceived consensus, but the authors also investigated whether it was possible to counteract this effect by “inoculating” the participants against it: either by showing in advance how a certain misleading technique works, or by explaining in advance that there is a consensus.

There were five groups in this experiment:

  1. the control group
    (not sure whether they got a text unrelated to climate change or a text related to climate change without “misinformation”)
  2. the group that was subjected to the false balance only
    (they got a mock news article that first featured scientists presenting research supporting AGW, followed by contrarian scientists rejecting AGW and proposing alternative explanations)
  3. the group that was “inoculated” before being subjected to the false balance
    (a textual explanation of the “false balance” strategy used by the tobacco industry to confuse the public about the level of scientific agreement by staging a fake debate)
  4. the group that got the message that there was a consensus among scientists before being subjected to the false balance
    (a text-only description of various studies reporting 97% scientific agreement on human-caused global warming)
  5. the group that got both the inoculation text and the consensus message before being subjected to the false balance.

The group with the lowest perceived consensus was indeed the group that was subjected to the misinformation only (average perceived consensus on a scale of 0 → 100 was 63.5). That is not exactly surprising; that is how our brain works. We are sensitive to how information is presented, at least if we trust the source. If an issue is presented as a balanced debate, then that will become our reality. I experienced that in my believer years, when I accepted the view as it was presented in the media (which I believed without much questioning at the time).

The second lowest was the control group (68.9), closely followed by the inoculation group (70.0); the group that got both the inoculation and the consensus text scored 83.9, and the consensus-text group did best (86.1).
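To put those numbers side by side, here is a minimal sketch in Python (the group means are the ones quoted above; the labels and variable names are mine) that tabulates each group’s mean perceived consensus and its difference from the control group:

```python
# Mean perceived consensus (scale 0-100) per group, as quoted above.
means = {
    "false balance only": 63.5,
    "control": 68.9,
    "inoculation + false balance": 70.0,
    "inoculation + consensus + false balance": 83.9,
    "consensus + false balance": 86.1,
}

control = means["control"]

# Print each group from lowest to highest mean, with its offset from control.
for group, mean in sorted(means.items(), key=lambda kv: kv[1]):
    print(f"{group:<42} {mean:5.1f}  ({mean - control:+5.1f} vs control)")
```

Laid out like that, the inoculation group ends up only 1.1 points above the control group, while the consensus group ends up 17.2 points above it. Keep that contrast in mind for what follows.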

The conclusion of experiment 1 was:

In sum, the effect of false-balance media coverage had the greatest effect on perceived consensus among the various climate attitudes measured. However, a consensus message presented with the false-balance message was effective in increasing perceived consensus, thus neutralizing the negative influence of the misinformation. In addition, we found that an inoculation message was effective in neutralizing the effect of misinformation on perceived consensus.

I was surprised that the result of the consensus intervention was so downplayed. The consensus intervention and the combined consensus/inoculation intervention showed by far the best results, while the inoculation intervention (which is the focus of the paper) did only slightly better than the control group. Just looking at the data, it would be tempting to see the consensus intervention as the most promising. I have the impression that the authors didn’t expect this result (it was not confirmed by other research) and just considered it a neutralizing effect, like the other interventions.

Then there was this explanation of the inoculation technique in the general conclusion:

It is also noteworthy that the inoculations in this study did not mention the specific misinformation that was presented after the inoculation, but rather warned about misinformation in a broader sense by explaining the general technique being used to create doubt about an issue in the public’s mind. The purpose of this type of intervention is to stimulate critical thinking through the explanation of argumentative techniques, thus encouraging people to move beyond shallow heuristic-driven processing and engage in deeper, more strategic scrutinizing of the presented information. A consequence of this approach is that generally-framed inoculations could potentially neutralize a number of misleading arguments that employ the same technique or fallacy.

Hoooooooo, hold your horses! If I were given a text explaining a misleading technique and then later presented with a case that uses the very same technique, chances are that I would spot the similarities between the two and conclude that the second case also uses this technique…

This has nothing to do with the inoculating message “stimulating critical thinking” or the person “engaging in deeper, more strategic scrutinizing of the presented information”. It is simply the (conscious or unconscious) link that I would make with the limited information I received at that point. In that case, the effect might not last very long (just as with a real inoculation).

The effect should also be indifferent to the message that is provided. It should work just as well if, for example, the participant were “inoculated” with an explanatory message before being presented with some alarmist statement. It just depends on how the issue is framed.

This inoculation looks an awful lot like an exercise in planting suggestions in such a way that perceived consensus is guided in a certain direction.
