
Why you think you’re right, even when you’re wrong

Mar 9, 2017

Are you a soldier or a scout? Your answer to this question, says decision-making expert Julia Galef, could determine how clearly you see the world.

Imagine for a moment you’re a soldier in the heat of battle — perhaps a Roman foot soldier, medieval archer or Zulu warrior. Regardless of your time and place, some things are probably constant. Your adrenaline is elevated, and your actions stem from your deeply ingrained reflexes, reflexes that are rooted in a need to protect yourself and your side and to defeat the enemy.

Now, try to imagine playing a very different role: the scout. The scout’s job is not to attack or defend; it’s to understand. The scout is the one going out, mapping the terrain, identifying potential obstacles. Above all, the scout wants to know what’s really out there as accurately as possible. In an actual army, both the soldier and the scout are essential.

You can also think of the soldier and scout roles as mindsets — metaphors for how all of us process information and ideas in our daily lives. Having good judgment and making good decisions, it turns out, depends largely on which mindset you’re in. To illustrate the two mindsets in action, let’s look at a case from 19th-century France, where, in 1894, an innocuous-looking piece of torn-up paper launched one of the biggest political scandals in history. Officers on the French general staff found it in a wastepaper basket, and when they pieced it back together, they discovered that someone in their ranks had been selling military secrets to Germany. They launched a big investigation, and their suspicions quickly converged on one man: Alfred Dreyfus. He had a sterling record, no history of wrongdoing, no motive as far as they could tell.

However, Dreyfus was the only Jewish officer at that rank in the army, and unfortunately, at the time the French Army was highly anti-Semitic. The other officers compared Dreyfus’s handwriting to that on the paper and concluded it was a match, even though outside professional handwriting experts were much less confident about the similarity. They searched Dreyfus’s apartment and went through his files, looking for signs of espionage. They didn’t find anything. This just convinced them that not only was Dreyfus guilty, he was also sneaky, because clearly he had hidden all of the evidence. They looked through his personal history for incriminating details. They talked to his former teachers and learned he had studied foreign languages in school, which demonstrated to them a desire to conspire with foreign governments later in life. His teachers also said that Dreyfus had had a good memory, which was highly suspicious, since a spy must remember a lot of things.

The case went to trial, and Dreyfus was found guilty. Afterwards, officials took him out into the public square; they ritualistically tore his insignia from his uniform and broke his sword in two. This was called the Degradation of Dreyfus. He was sentenced to life imprisonment on the aptly named Devil’s Island, a barren rock off the coast of South America. He spent his days there alone, writing letter after letter to the French government, begging them to reopen his case so they could discover his innocence. While you might guess that Dreyfus had been set up or intentionally framed by his fellow officers, historians today don’t think that was what happened. As far as they can tell, the officers genuinely believed that the case against Dreyfus was strong.

Other pieces of information are the enemy, and we want to shoot them down.

So the question arises: What does it say about the human mind that we can find such paltry evidence to be compelling enough to convict a man? This is a case of what scientists refer to as “motivated reasoning,” a phenomenon in which our unconscious motivations, desires and fears shape the way we interpret information. Some pieces of information feel like our allies — we want them to win; we want to defend them. And other pieces of information are the enemy, and we want to shoot them down. That’s why I call motivated reasoning “soldier mindset.”

While you’ve probably never persecuted a French-Jewish officer for high treason, you might follow sports or know someone who does. When the referee judges that your team has committed a foul, for example, you’re probably highly motivated to find reasons why he’s wrong. But if he judges that the other team committed a foul — that’s a good call. Or maybe you’ve read an article or a study that examined a controversial policy, like capital punishment. As researchers have demonstrated, if you support capital punishment and the study shows it’s not effective, then you’re highly motivated to point out all the reasons why the study was poorly designed. But if it shows that capital punishment works, it’s a good study. And vice versa if you don’t support capital punishment.

Our judgment is strongly influenced, unconsciously, by which side we want to win — and this is ubiquitous. It shapes how we think about our health and our relationships, how we decide how to vote, and what we consider fair or ethical. What’s most scary to me about motivated reasoning, or soldier mindset, is just how unconscious it is. We can think we’re being objective and fair-minded and still wind up ruining the life of an innocent person like Dreyfus.

Fortunately for Dreyfus, there was also a man named Colonel Picquart. He was another high-ranking officer in the French Army, and like most people, he assumed Dreyfus was guilty. Also like most of his peers, he was somewhat anti-Semitic. But at a certain point, Picquart began to wonder, “What if we’re all wrong about Dreyfus?” He discovered evidence that the spying for Germany had continued, even after Dreyfus was in prison. He also discovered that another officer in the army had handwriting that perfectly matched the torn-up memo.

It took Picquart ten years to clear Dreyfus’s name, and for part of that time, he himself was imprisoned for the crime of disloyalty to the army. Some people feel that Picquart shouldn’t be regarded as a hero because he was an anti-Semite, and I agree that kind of bias is bad. But I believe the fact that Picquart was anti-Semitic makes his actions more admirable: he had the same reasons to be biased as his fellow officers, yet his motivation to find and uphold the truth trumped all of that.

To me, Picquart is a poster child for what I call “scout mindset,” the drive not to make one idea win or another lose, but to see what’s there as honestly and accurately as you can, even if it’s not pretty, convenient or pleasant. I’ve spent the last few years examining scout mindset and figuring out why some people, at least sometimes, seem able to cut through their own prejudices, biases and motivations and attempt to see the facts and the evidence as objectively as they can. The answer, I’ve found, is emotional.

Scout mindset means seeing what’s there as accurately as you can, even if it’s not pleasant.

Just as soldier mindset is rooted in emotional responses, scout mindset is, too — but it’s simply rooted in different emotions. For example, scouts are curious. They’re more likely to say they feel pleasure when they learn new information or solve a puzzle. They’re more likely to feel intrigued when they encounter something that contradicts their expectations.

Scouts also have different values. They’re more likely to say they think it’s virtuous to test their own beliefs, and they’re less likely to say that someone who changes her mind seems weak. And, above all, scouts are grounded, which means their self-worth isn’t tied to how right or wrong they are about any particular topic. For example, they can believe that capital punishment works, and if studies come out showing it doesn’t, they can say, “Looks like I might be wrong. Doesn’t mean I’m bad or stupid.” This cluster of traits is what researchers have found — and I’ve found anecdotally — predicts good judgment.

The key takeaway about the traits associated with scout mindset is that they have little to do with how smart you are or how much you know. They don’t correlate closely with IQ; they’re about how you feel. I keep coming back to a particular quote from Antoine de Saint-Exupéry, author of The Little Prince. “If you want to build a ship, don’t drum up your men to collect wood and give orders and distribute the work,” he said. “Instead, teach them to yearn for the vast and endless sea.”

In other words, if we really want to improve our judgment as individuals and as societies, what we need most is not more instruction in logic, rhetoric, probability or economics, even though those things are all valuable. What we most need to use those principles well is scout mindset. We need to change the way we feel — to learn how to feel proud instead of ashamed when we notice we might have been wrong about something, or to learn how to feel intrigued instead of defensive when we encounter some information that contradicts our beliefs. So the question you need to consider is: What do you most yearn for — to defend your own beliefs or to see the world as clearly as you possibly can?