Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason. To wit: The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.

By Dan Gifford

One of the more common beliefs expressed by Second Amendment rights activists is that those who oppose those rights would change their minds if they were told "the real facts."

It's at that point that names like Florida State University criminologist Gary Kleck, former University of Chicago professor John Lott, or academic-fraud destroyer extraordinaire Clayton Cramer may be invoked as fact tellers. All have exposed much "unassailable" ivory-tower research about firearms as academic fraud or political-narrative bunk. But as numerous studies have shown, “once formed, impressions are remarkably perseverant,” immune to any amount of fact that contradicts them. And one doesn't have to be a poker whiz to read that immunity on a person's face.

For instance, look at this woman's face at an Oklahoma City Moms Demand Action (a Michael Bloomberg-funded anti-Second Amendment group) confrontation with elected officials, where she and her group demanded more gun control laws. Would any rational person seriously believe that face would be open to any amount of contrary factual reasoning?

Consider Trofim Lysenko, the Soviet director of agronomy and genetics who maintained that crops germinate according to Marxist principles. No amount of failed crops or mass starvation could convince the ideologue Lysenko of the stupidity of that notion. But then, he didn't have to alter his view: Soviet dictator Joseph Stalin had around 3,000 scientists who disagreed with Lysenko shot.

Where that leaves us at the end of the trait-development trail is with a principle that's rather simple, even if the reasons for it may not be. Stanford studies have shown that even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs.” Those 1970s Stanford studies became famous for proving the contention that people too rarely think straight.

That reasonable-seeming people are often totally irrational was a stunner at the time of the experiments.  Many other experiments have since confirmed and added to that finding. So how come? What causes that frame of mind? How did we come to be that way?

Cognitive scientists Hugo Mercier and Dan Sperber had a go at determining the answer. Their studies found that reason is not innate but is a trait that evolved among the earliest humans, and that it must be understood against that background.

Wading through the highfalutin academic-jargon jungle gets us to the evolutionary clearing the researchers found. There wasn't much advantage to lucid reasoning within a hunter-gatherer society. But there was a whole lot to be gained from winning arguments that allowed the winner to control others.

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument is that the biggest advantage humans have over other species is our ability to cooperate. However, cooperation is difficult to establish and difficult to sustain. For any individual, freeloading is almost always the most attractive course of action. That meant early reason developed not to enable us to solve abstract, logical problems, or even to help us draw conclusions from unfamiliar data. Instead, it appears to have developed to resolve the leadership problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or downright stupid from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective. Chief among those habits is confirmation bias.

That's the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. It is a form of faulty thinking, but it's possibly the best catalogued. Much of that cataloging was done at Stanford as well.

Cutting to the chase: those taking part in the experiments not only rejected facts contrary to their beliefs, the facts made them even more resolute in those beliefs.

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias in the face of false facts. Mercier and Sperber believe that flaw must have some adaptive function, and that function is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own. This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group.

Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. As mentioned before, there was little advantage to clear reasoning, while there was much to be gained from winning arguments.

It’s no wonder, then, that today reason often seems to fail us. As Mercier and Sperber write, “This is one of many cases in which the environment changed too quickly for natural selection to catch up.”

Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.

In a study they describe, graduate students were asked to explain how a range of everyday devices like toilets actually work. The revelation was that not only did the students know far less than they thought, they defended their own ignorance in order to protect their self-esteem.

Sloman and Fernbach call this the “illusion of explanatory depth.” That is, people believe they know way more than they actually do. It's a trait with potentially deadly consequences, explored in another book by psychiatrist Jack Gorman and his daughter, Sara. In their “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), the authors reveal there's a gap between what science tells us and what we tell ourselves.

The Gormans' focus is on beliefs that, by their own lights, are not just provably false but also potentially deadly, because the believer cannot bring himself or herself to set aside his or her bias. Why? Adaptive reasons aside, the Gormans say their research shows people get a "rush" of pleasure from information that supports their views, especially when that information is supported or lauded by others. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

But esteemed researchers are not immune from their own biases. Against all fact, the Gormans believe the long-discredited "truth" that handguns are dangerous to those who own them, and they have sought to use their research to convince others who know better. The Gormans' findings thus contain more than a bit of irony in their conclusion: “The challenge that remains is to figure out how to address the tendencies that lead to false scientific belief.”

Indeed.


Dan Gifford is a national Emmy-winning, Oscar-nominated film producer and a former reporter for CNN, The MacNeil/Lehrer NewsHour and ABC News.

###