During a 2015 debate at the Ronald Reagan Presidential Library in Simi Valley, California, Dr. Ben Carson, a pediatric neurosurgeon, was asked to comment about his rival Donald Trump’s assertion that autism was linked to childhood vaccines.
“Well, let me put it this way,” replied Dr. Carson, “there have been numerous studies, and they have not demonstrated that there is any correlation between vaccinations and autism.” Referring to Trump, Carson added, “I think he’s an intelligent man and will make the correct decision after getting the real facts.”
This argument had little impact. Trump had his own evidence to consider: “Just the other day, two-years-old, two-and-a-half-years-old, a child, a beautiful child went to have the vaccine, and came back, and a week later got a tremendous fever, got very, very sick, now is autistic.”
We should not be surprised that the now-President put more weight on an acquaintance's experience than on empirical studies examining thousands of individuals. Both neuroscience and social science suggest this is normal: we pick and choose the evidence we consider, and evaluate it in relation to our pre-existing beliefs. This is known as confirmation bias, and all of us possess it to some degree.
What we should focus our attention on instead is Dr. Carson's unsuccessful attempt to change his opponent's mind. If you are like most people, your instinct is like Carson's: we try to alter people's beliefs and actions by introducing evidence to prove that we are right and they are wrong. And yet this approach often fails.
But perhaps, instead of treating the brain as a perfectly rational machine and trying to fight human biases that have emerged over millions of years of evolution, we need to work with those biases to make a change.
Recently Andreas Kappes and I, together with others, conducted a study to try to understand what goes on inside the brain when people are confronted with opinions that contradict their own. We recorded the brain activity of pairs of individuals who were making financial decisions together, and found that when a duo disagreed, their brains immediately became less sensitive to the information presented by the other person. However, when they agreed, each person’s brain activity reflected precise encoding of the information provided by the other.
What this implies is that to elicit change we must first identify arguments that rely on common ground.
Take the alleged link between autism and childhood vaccines. Just like Carson, many health professionals try to change the minds of parents who refuse to vaccinate their children by presenting data suggesting there is no link. Studies show, however, that this approach has little impact.
To solve this problem, a group of scientists came up with a new idea: instead of trying to persuade people that the MMR vaccine does not cause autism, they would remind them that it protects children from deadly illnesses. In the heated debate, people had forgotten what measles, mumps and rubella were.
Everyone agreed that the vaccine would protect children from these illnesses, and everyone's priority was the child's health. Focusing on what they had in common rather than on what they disagreed about was successful: people's attitudes toward vaccines changed three times as much as when the CDC's standard approach was followed.
In life, we tend to focus on our differences, because those carry the most information about what makes each person unique. We forget that our commonalities far outweigh them. When conducting experiments, I am often amazed by how similarly people respond to questions and perform tasks, especially when those involve emotional or social factors.
So if we want to affect the behaviors and beliefs of the person in front of us, the first thing we need to do is figure out what goes on inside their head. The good news is that the human brain has a remarkable ability to think about what another person is thinking and feeling.
And it is important to consider not only what people already believe (what cognitive scientists call ‘priors’), but also what they want to believe. Messages that tap into basic human desires — such as the need for agency, a craving for hope, a longing to feel part of a group — are more likely to have impact.
Consider the need for agency, for instance. When people feel in control of their life and environment, they become happier. But when they believe their ability to control their environment has been removed — when they think another political group is determining the law of the land or feel their spouse is dictating their actions — they become anxious. Because people desire to feel in control, they are also more receptive to information that expands their sense of it.
For example, a study conducted at Harvard University found that allowing citizens to suggest how their taxes should be used (e.g., for education, healthcare, science and so on) doubled their intention to pay. Likewise, explaining to parents how vaccination helps them protect their children enhances their sense of control and makes them more open to the message.
The problem with Carson’s assertion that someone, anyone, will simply “make the correct decision after getting the real facts” is that it overlooks the core of what makes us human: our fears, our desires, our prior beliefs. To make a change, we must tap into those motives, presenting information in a frame that emphasizes common beliefs, triggers hope and expands people’s sense of agency.
This is the second in a two-part op-ed by Tali Sharot.