Don’t let motivated reasoning distort your decisions

Communication · Culture
07 January 2019

Why we create a selective version of reality – and how to give facts a fighting chance.

In 2016, Oxford Dictionaries’ word of the year was “post-truth” – defined as “denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”.

The choice is a telling sign of the times. “Fake news” has spread around the world with alarming speed over the last few years, accelerated by social media and messaging apps like WhatsApp. People continue to believe and share stories that have been objectively proven false, while denying hard evidence. And this isn’t exclusive to any one group – it’s happening across countries, cultures and political ideologies.

Why do reasonable, often well-educated people subscribe to such a skewed version of reality?

Reasoning doesn’t happen in an emotional vacuum. Our pre-existing feelings about a topic affect the way we evaluate new information about it. Within a split second, our biases kick in and set us on a certain path of thinking. As Chris Mooney puts it in his article “The Science of Why We Don’t Believe Science”:

We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

The process of choosing to believe information that confirms our existing beliefs while ignoring facts that contradict them is called motivated reasoning. To avoid changing our minds, we perform a series of mental gymnastics: we deny, ignore, excuse, justify, and rationalise. Mooney explains that while we might think of ourselves as information scientists, we’re actually behaving like lawyers:

Our “reasoning” is a means to a predetermined end – winning our “case” – and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

So, this week, my message focuses on why each of us is susceptible to motivated reasoning, and what we can do to foster a culture of greater accuracy and objectivity.

The role of denial

Denial is a key component of motivated reasoning. The goal is to avoid mental discomfort and punishment (which is quite understandable), or to hoard power (which is less so). It also helps us maintain the status quo, so we don’t need to face the unknown or risk being disillusioned. Denial – be it conscious or unconscious – is a common human instinct. You can find plenty of examples, not only in the world at large but also at work and in your personal life.

Think of the way in which young children immediately deny any wrongdoing to avoid being scolded, or the manner in which close-knit groups (like families or teams) instinctively turn a blind eye to the bad behaviour of one of their members. Consider how certain people deny, even to themselves, the ill-effects of addiction or abuse. Think about how some leaders ignore clearly emerging trends and scientific evidence.

At times, denial does serve a purpose. For example, it’s often the first stage of grief after a sudden and painful loss, giving people the time they need to adjust to the shock gradually. Denial can also build cohesion in groups. In “How We’re Programmed to Believe Fake News, Lies, and Abusers”, Darlene Lancer elaborates:

It’s a unifying force between spouses, and among families, groups, or political parties. We overlook things that might cause arguments, hurt, or separation. One study showed that people will forgive a member of a clique four to five times more than a stranger. Idealization supports denial and blinds us to anything that would mar respect for a partner, family or group member, or leader.

To an extent, we can understand and even practise this behaviour. Overlooking each other’s minor mistakes and irritating habits allows us to get along and build stronger relationships. But with more serious issues, denial can become downright dangerous.

The perils of motivated reasoning

The social dangers of building a selective version of reality are obvious. Fake news is proliferating, and trust in the media is eroding. Political polarisation is stronger than ever, as all sides grow increasingly unwilling to entertain notions that challenge their own stance. The same kind of extreme divide can be found in workplaces, especially when leaders encourage the formation of rigid “camps” and groupthink.

Motivated reasoning also creates a serious deficit of trust within an organisation. It destroys generalised trust, replacing it with particularised trust – the tendency to believe only “people like us”, within our own little group.

Think about any office cliques you’ve noticed: their members are much more likely to accept information from one another than from those outside the circle. And, of course, clique members are likelier to share views that already align with the group’s existing beliefs – it’s a vicious cycle.

At an individual level, motivated reasoning can trap you in a negative situation. It keeps you from acknowledging the problem, taking responsibility, and doing the work that’s needed. At best, you remain in an unhappy status quo; at worst, the situation blows up and people (including you) get hurt. Plus, not facing the issue can lead to displaced anger, passive-aggressive behaviour, and even mental health issues.

Fostering accuracy and objectivity

Here are some suggestions to curb motivated reasoning and create a clearer picture of reality:

1. Inoculate yourself

Yes, motivated reasoning often happens unconsciously – but being more aware of it can help us overcome the tendency to pick and choose facts. A recent study found that people can be pre-emptively “inoculated” against misinformation. The researchers delivered a message about the strong scientific consensus on climate change, along with a warning that some groups wrongly claim there is a lot of conflict among scientists on this issue. This forewarning proved effective, showing that awareness can change the way we respond to evidence.

When you come across uncomfortable facts that diverge from your existing beliefs, make a special effort to keep an open mind and be objective. Practise this with your team as well: when a left-field view emerges, don’t dismiss it out of hand. Give it a fair hearing and examine the evidence.

2. Break partisanship

Motivated reasoning flourishes wherever two or more opposing camps hold fixed beliefs. But if people see that leaders give their support based on facts (not just along group lines), they too become less biased. This is as true of the workplace as it is of politics. As leaders, we must endorse strategies based on evidence, encourage lively disagreement (even among ourselves), and avoid creating homogeneous ideological blocs.

3. Slow down information-sharing

Social media and instant messaging apps have played a huge role in fostering a culture of misinformation. Many times, we receive a story that confirms our own convictions and immediately share it with our entire network – without any fact-checking whatsoever. Instead, read the entire piece (not just the headline!), check the source, and verify that it’s actually true before forwarding it on.

4. Facilitate diverse discussions

If everyone in a group has the same perspective, it becomes an echo chamber. But if people hold varying opinions, group interaction can actually boost the reasoning process – provided it’s done the right way. Make sure you have a diversity of opinions in the room, leave your ego at the door, and encourage open-mindedness and dissent. Demonstrate your willingness to consider facts that go against your own well-known views – your team members will take their cue from you. As Lee McIntyre, author of “Respecting Truth: Willful Ignorance in the Internet Age”, points out:

One real advantage of group reasoning is that you get critical feedback. If you’re in a silo, you don’t get critical feedback, you just get applause.

5. Frame facts within values

In the article mentioned earlier, Mooney explains that the way you frame facts can affect how they’re received. He cites a study by Dan Kahan in which the basic science of climate change was packaged into two newspaper articles: one was headlined “Scientific Panel Recommends Anti-Pollution Solution to Global Warming”, while the other read “Scientific Panel Recommends Nuclear Solution to Global Warming”. Those with pro-industry views were much more open to the second headline than the first, even though both articles conveyed essentially the same science. So, if you’re trying to get someone to accept the facts, present them in a context that lines up with that person’s core values.

As always, I look forward to your thoughts.
