On Monday, I reported on the latest study to take a bite out of the idea of human rationality. In a paper just published in Pediatrics, Brendan Nyhan of Dartmouth College and his colleagues showed that presenting people with information confirming the safety of vaccines triggered a “backfire effect,” in which people who already distrusted vaccines actually became less likely to say they would vaccinate their kids.
Unfortunately, this is hardly the only documented example of such a frustrating response. Nyhan and his frequent coauthor, Jason Reifler of the University of Exeter, have captured several others, as have separate research teams. Here are some examples:
1. Tax cuts increase revenue? In a 2010 study, Nyhan and Reifler asked people to read a fake newspaper article containing a real quotation from George W. Bush, in which the former president asserted that his tax cuts “helped increase revenues to the Treasury.” In some versions of the article, this false claim was then debunked by economic evidence: a correction appended to the end of the article stated that in fact, the Bush tax cuts “were followed by an unprecedented three-year decline in nominal tax revenues, from $2 trillion in 2000 to $1.8 trillion in 2003.” The study found that conservatives who read the correction were twice as likely to believe Bush’s claim was true as were conservatives who did not read the correction.
2. Death panels! Another notorious political falsehood is Sarah Palin’s claim that Obamacare would create “death panels.” To test whether they could undo the damage caused by this highly influential morsel of misinformation, Nyhan and his colleagues had study subjects read an article about the “death panels” claim, which in some cases ended with a factual correction explaining that “nonpartisan health care experts have concluded that Palin is wrong.” Among survey respondents who were strongly pro-Palin and highly politically knowledgeable, the correction actually made them more likely to embrace the false “death panels” claim.
3. Obama is a Muslim! And if that’s still not enough, yet another Nyhan and Reifler study examined the persistence of the “President Obama is a Muslim” myth. In this case, respondents watched a video of President Obama denying that he is a Muslim or even stating affirmatively, “I am a Christian.” Once again, the correction, uttered in this case by the president himself, often backfired in the study, increasing belief in the falsehood that Obama is a Muslim among certain participants. What’s more, the backfire effect was particularly notable when the researchers administering the study were white. When they were nonwhite, subjects were more willing to change their minds, an effect the researchers explained by noting that “social desirability concerns may affect how respondents behave when asked about sensitive topics.” In other words, in the company of someone of a different race from their own, people tend to shift their responses based on what they think that person’s worldview might be.
4. The alleged Iraq-Al Qaeda link. In a 2009 study, Monica Prasad of Northwestern University and her colleagues directly challenged Republican partisans about their false belief that Iraq and Al Qaeda collaborated in the 9/11 attacks, a common charge during the Bush years. The so-called challenge interviews cited the findings of the 9/11 Commission and even a statement by George W. Bush asserting that his administration had “never said that the 9/11 attacks were orchestrated between Saddam and Al Qaeda.” Despite these facts, only one of the 49 partisans changed his or her mind after the factual correction. Forty-one of the partisans “deflected” the information in a variety of ways, and seven actually denied holding the belief in the first place (although they clearly had).
5. Global warming. On the climate issue, there does not appear to be any study that clearly documents a backfire effect. However, in a 2011 study, researchers at American and Ohio State universities found a closely related “boomerang effect.” In the experiment, research subjects from upstate New York read news articles about how climate change might increase the spread of West Nile virus, accompanied by pictures of farmers who might be affected. But in one case, the people were said to be farmers in upstate New York (in other words, victims who were quite socially similar to the research subjects); in the other, they were described as farmers from either Georgia or France (much more distant victims). The articles were intended to raise concern about the health consequences of climate change, but when Republicans read about the more distant farmers, their support for action on climate change decreased, and the effect grew stronger as their Republican partisanship increased. (When Republicans read about the proximate New York farmers, there was no boomerang effect, but they did not become more supportive of climate action either.)
Together, all of these studies support the theory of “motivated reasoning”: the idea that our prior beliefs, commitments, and emotions drive our responses to new information, such that when we are faced with facts that deeply challenge these commitments, we fight back against them to defend our identities. So next time you feel the urge to argue back against some idiot on the internet…pause, take a deep breath, and realize not only that arguing might not do any good, but that in fact, it might very well backfire.