Can factual information change people's minds? Most people assume the answer is "yes." After all, if people believe something that isn't true, then exposing them to the truth should cause them to abandon that belief, right?
Wrong. There's plenty of evidence that life is much more complicated. An interesting post on MotherJones.com entitled "The Backfire Effect" alerts us to a study suggesting that knowledge may even have the opposite effect to what you expect. (Hat Tip: Canadian Cynic)
The paper is here.
When Corrections Fail: The Persistence of Political Misperceptions
Brendan Nyhan and Jason Reifler
The authors review the literature and conclude that substantial numbers of people are quite resistant to facts when they hold strong opinions. Surprisingly, some people actually become more convinced they are right after hearing facts that contradict their belief. This phenomenon, called "the backfire effect," is actually familiar to us in another context. One example given in the paper is "... that hearing a Democrat argue against using military force in some cases causes Republicans to become more supportive of doing so."
In one of the studies conducted by Nyhan and Reifler, students were divided into two groups. Both groups read a news report quoting from a speech by President Bush in October 2004, a year and a half after the invasion of Iraq. One group's version of the report also described the Duelfer Report, which all but proved that Iraq did not have weapons of mass destruction before the war began. This additional information counts as "the correction."
Students were then asked whether they agreed with the following statement:

Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.

Here's the result ...
For very liberal subjects, the correction worked as expected, making them more likely to disagree with the statement that Iraq had WMD compared with controls. The correction did not have a statistically significant effect on individuals who described themselves as liberal, somewhat left of center, or centrist. But most importantly, the effect of the correction for individuals who placed themselves to the right of center ideologically is statistically significant and positive. In other words, the correction backfired – conservatives who received a correction telling them that Iraq did not have WMD were more likely to believe that Iraq had WMD than those in the control condition.

I'm pretty skeptical about these sorts of studies because there are so many variables and the sample sizes are quite small. Nevertheless, this "backfire effect" makes some sense given my own experience in trying to debate various issues.
I exhibit it myself sometimes. Faced with opponents who are vigorously disagreeing with me, I can feel myself being driven to a hardened, more extreme position than I would otherwise hold. In other words, when presented with uncomfortable facts that contradict my point of view, I sometimes work even harder to refute or rationalize those facts. That's more comforting than being forced to admit I'm wrong.
My opponents do this too. In fact, they do it far more often than I do because they are far more likely to start off being wrong.
The bottom line is that you have to be careful to remain objective in the face of factual information. Be prepared to re-evaluate your position if the facts are against you. And don't assume that your opponents will be swayed by correcting their misperceptions. That's only the first step toward changing their minds.
The paper goes on to describe other studies and discusses possible explanations. It's a good read and I recommend it.
3 comments:
Nowadays just agreeing on the facts is a big problem. Disinformation campaigns make up a respectable percentage of websites, so the internet and mass media are sources of enormous amounts of false and distorted information.
I read the article and liked it. Here are a couple of logical errors I sometimes make:
1. Sometimes a person's position can be right even though the argument they are advancing is completely wrong. I sometimes let a bad argument become a reason for rejecting the position itself.
2. Sometimes I am tempted to disagree because I don't like the person making the argument. On occasion, an idiot is correct about something.
Would Republicans still behave like this with respect to information coming from Democrats, and vice versa, if there were no Republicans and Democrats (the nonsensical nature of that statement aside)? In other words, if we eliminated the division lines drawn along things of purely imaginary significance (like party politics), which are totally irrelevant to the actual issues we have to solve, would this still be the case?