Scientific reasons for why people don't believe science

We all wear mental blinders. That's the nature of how the mind/brain works: we perceive reality through cultural, neurological, genetic, emotional, and other filters.

Science is one of the best ways to see things more clearly. Individually we're prone to significant biases. Collectively, though, it's possible to systematically correct for blind spots and home in on a truthful perspective.

I came across an interesting article by Chris Mooney in Mother Jones, "The Science of Why We Don't Believe Science." It starts off by describing a cult, the Seekers, who thought they were communicating with aliens, including an astral incarnation of Jesus Christ.

When the predicted alien cataclysm failed to strike the Earth on the appointed day, the cult members had to either face the facts or carry on with their strange beliefs. No big surprise: their blind faith became even stronger.

Mooney writes:

In the annals of denial, it doesn't get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin's space cult might lie at the far end of the spectrum of human self-delusion, there's plenty to go around.

And since Festinger's day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions.

This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds, fast enough to detect with an EEG device, but long before we're aware of it.

That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We're not driven only by emotions, of course; we also reason, deliberate. But reasoning comes later, works slower, and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation: a new hominid, say, that confirms our evolutionary origins.

What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information, and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."

In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers. Our "reasoning" is a means to a predetermined end, winning our "case," and is shot through with biases.

They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
