
Decision-Making Shortcuts

By Richard Turcotte

On several occasions, both here at OpEdNews and on my own political blog, I've discussed the concept of Motivated Reasoning, most often associated with the terrific body of work by Jonathan Haidt [author and currently a professor at the New York University Stern School of Business].

In my decidedly and admittedly layman's terms, Motivated Reasoning, as the concept is most often called, can be explained this way: When we believe something, our tendency is to seek out and quickly accept anything that supports that belief, while dismissing information that might cause us to question it.

It's an important psychological mechanism most of us rely on from time to time, more so in policy debates--or so it seems. The concept is worth elaborating on, if for no other reason than that it now permeates almost every policy discussion of note--cultural, political, economic, environmental ... you name it.

If it were not so useful to us in our daily affairs, we would have abandoned it long ago. But its automatic inclusion in our debate and discussion strategies does not mean its utility or value should go unquestioned. With a nation seemingly growing more polarized by the day, there might be more than a little merit in asking ourselves whether we can do a better job of addressing the pressing issues we face.

[O]ur preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called 'motivated reasoning' helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, 'death panels,' the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call 'affect'). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds--fast enough to detect with an EEG device, but long before we're aware of it ... We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself. [Links/citations in the original article.]


Another way of looking at it comes courtesy of Michael Shermer, in an excerpt from his book The Believing Brain: From Ghosts and Gods to Politics and Conspiracies--How We Construct Beliefs and Reinforce Them as Truths [St. Martin's Griffin 2012]:

We form our beliefs for a variety of subjective, emotional and psychological reasons in the context of environments created by family, friends, colleagues, culture and society at large. After forming our beliefs, we then defend, justify and rationalize them with a host of intellectual reasons, cogent arguments and rational explanations. Beliefs come first; explanations for beliefs follow. I call this process, wherein our perceptions about reality are dependent on the beliefs that we hold about it, belief-dependent realism. Reality exists independent of human minds, but our understanding of it depends on the beliefs we hold at any given time.


A moment's pause is all one needs to appreciate the logic, truth, and value in these two assessments. Who among us wants to spend every waking moment twisting ourselves into knots conducting ferocious internal debates about the pros and cons of every topic before us? We rely on these psychological adaptations because, at the very least, they get us from one day to the next.

But on matters of great national significance--whether the impact and evidence are in full bloom today or will more likely unfold over time--we may be doing ourselves, our families, our communities, and our society more harm than we realize by failing to pause now and then to consider the facts at hand. It is a flattering thought that our support for or opposition to a policy issue of note is the end product of careful analysis and reasoning. But is that what we actually do?

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information. And then we vote.
What's going on? How can we have things so wrong, and be so sure that we're right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn't. This is known as 'motivated reasoning.' Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.


And thus the problem....

Would it make sense for a business owner to disregard marketing information or manufacturing considerations offered by her or his own experts and rely solely on cherry-picked "facts" consistent with personal beliefs (not to completely discount "gut instinct") and/or on whatever is most soothing psychologically or emotionally? Should a football coach ignore the clear tendencies and skill sets of the next opponent and go instead with comfortable beliefs that are false as applied to the game at hand?

The natural response from most of us would be to suggest that more information and better understanding are the quick and easy tipping point to sounder decisions and actions. Certainly we'd all like to believe that that is exactly what we do when evaluating new ideas or data. Not so fast.

Here's another observation from the Boston Globe article quoted above:

Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs.
This bodes ill for a democracy, because most voters -- the people making decisions about how the country runs -- aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
'The general idea is that it's absolutely threatening to admit you're wrong,' says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon -- known as 'backfire' -- is 'a natural defense mechanism to avoid that cognitive dissonance.'*



 
