[M]oral coherence describes the tendency for people to fit their factual beliefs to their moral world-view, so that what is right morally becomes what is right practically as well. Finding political middle-ground is hard enough when policies push people's moral buttons, but when different factual beliefs arise from different moral evaluations, compromise becomes even more difficult.... Moral coherence between factual and moral beliefs sheds light on why liberals and conservatives seem, at times, to be experiencing completely different realities. (Links in the original article from which this quote is taken.)
The internal shortcuts we adopt to support our beliefs and views serve important purposes in establishing and maintaining our self-esteem. But when they become obstacles preventing us from speaking with one another--rather than at or past each other--their value to our long-term well-being diminishes. Understanding the role these obstacles play, respecting what they are intended to provide, but then moving beyond them when they cannot serve greater purposes is the task at hand, and it is a task for all of us.
The problems we face--climate change, economic growth, and energy supplies chief among them--and the need to commit ourselves to finding meaningful, lasting solutions to those challenges demand that we find a way to move past the barriers which prevent us from engaging in meaningful conversations.
More and more we respond by shutting out the assault of cognitive dissonance and retreating from any unwelcome input. We surround ourselves with news outlets, friends and even neighbors who carefully reinforce what we want to believe. We are building our own reality to support our chosen narrative. It doesn't seem to be working out well on a personal level and it's rotting our politics.
Compounding the difficulties we create for ourselves are related psychological shortcuts we incorporate into both our decision-making and negotiation processes. We place too much emphasis on preferred sources [MSNBC, Fox] or on bits of information that we quickly decide unequivocally confirm our positions, never giving ourselves the opportunity to consider more. Eliminating options before they can be properly evaluated may be a time-saver, but if that's our priority in problem-solving endeavors, we need new priorities.
At first or second glance, there's nothing especially unusual or even troubling about these psychological adaptations. Human nature is what it is, and obviously there are any number of benefits derived from these cognitive shortcuts. In most cases, there are probably no severe consequences when we "rule out" information, perspectives, and opinions or suggestions that come from outside our group or that conflict with our established beliefs.
It makes perfect sense that the more validation we seek out and add to our already-formed beliefs, discarding whatever creates some inner discord for whatever reason, the more certain we become about the facts, impact, and "benefits" of the positions we stake out on matters of both personal and public importance. The issues that matter most to us matter most, so of course we want the supporting information anchored firmly.
But it's not a perfect system for self-preservation or validation. For those of us who want to feel a sense of worth, confidence, and value, along with their related brethren--all of us, in other words--does it make sense that on matters of great and enduring significance, accepting the facts and the truths is in the end more beneficial and important than "sticking to our guns"? [Duly noted that this ideal is simpler in concept than in application.]
Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper....
The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation. Decades of research into a variety of cognitive biases shows you tend to see the world through thick, horn-rimmed glasses forged of belief and smudged with attitudes and ideologies.
Letting the world pass us by, ignoring more beneficial information, or declining "better" opportunities--all because of a stubborn insistence that we must not or will not change our ways--is a strategy, and one used quite often by just about everyone. But every choice we do and do not make carries consequences. We hope most are the good kind. But that's not always the case.
When we have an opportunity to make better and more informed decisions--discomforting though they may be in the moment--does it make sense to allow for some unease and confusion in the short term if more advantageous outcomes are the end result?
Should a lack of vision, courage, and wisdom be our legacy in this new century?
How much better do we choose to be?
Adapted from a recent blog post of mine.