Thursday, August 26, 2010

Backfire and Sentience

Last year I posted about Selective Inattention. To quote a pertinent part of that post:
The more convinced one is of the validity of his position, the less capable he is of accommodating contrary factors. Moreover, psychological and spiritual sanity require certainty on some issues as the basis for both the clarity and security needed for examining others.

I came across an article by Joe Keohane, "How Facts Backfire," that adds more to this subject. Even though Keohane is writing about political beliefs, his points also apply in the realm of religion.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

(Cognitive dissonance is "an uncomfortable feeling caused by holding contradictory ideas simultaneously. The theory of cognitive dissonance proposes that people have a motivational drive to reduce dissonance." It is interesting to note that our pioneers are held up as examples in the Wikipedia article on cognitive dissonance: "The Great Disappointment of 1844 is an example of cognitive dissonance in a religious context.")

Keohane made some very disturbing points (drawn from recent research such as the Michigan study mentioned above):

Facts do not cure misinformation -
When misinformed people are presented with facts that correct their beliefs, "they rarely changed their minds. In fact, they often became even more strongly set in their beliefs."

Our beliefs often dictate the facts we notice
"Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information."

Motivated Reasoning - We passively accept as truth any information that confirms our beliefs and actively dismiss information that doesn't. This is like selective inattention.

Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.”

Salience
The more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire.

Insecurity breeds selective inattention

If you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

Some of the cures for this are:
Directness
There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas.

Education (but it probably won't work)
And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”

As Moore said of selective inattention, "it is relatively easy to discern in others and almost impossible to see in one's self." We should be humble and take correction to heart.

This is tough work. I don't know if I'm up to it. But it does explain much of the inconsistency I see in the anti-Trinitarians. But where is it in my life? What am I strongly attached to? Where am I ignoring facts?