By Stephen T. Messenger
September 14, 2021
Kathryn Schulz, in her 2011 TED Talk, asks us what it feels like to be wrong. Not knowingly incorrect, but obliviously wrong.
She says it feels great!
When you don’t know you’re wrong but think you’re right, you feel pretty good about yourself.
I find myself in that position often… usually multiple times a day. I was packing the back of the car in preparation for a trip last week and found myself among a group of family spectators who all thought they knew a better way to pack than I did. Now, I've loaded this exact car hundreds of times with similar items, and I was feeling pretty smug.
But the dreaded moment came when the hatch didn’t quite close, and all my onlookers gave me the “I told you so” glance followed by a flurry of more suggestions. I went from feeling great to terrible.
It’s easy to be wrong. Human nature wants to be right all the time, but odds are we’re going to mess something up… and soon. It’s important to gather all the data, have a trusted network that can help us see through our biases, and find better ways to solve problems.
During WWII, the Statistical Research Group at Columbia University conducted a study of bullet holes in bombers returning from Europe. They observed that the majority of the pockmarked areas were in the wings and fuselage, not the engines. The military planners believed that if you armored the areas taking hits, the planes would become more survivable, albeit weighed down by the extra armor, resulting in a slower, less maneuverable, shorter-range aircraft.

Just like me packing a car—completely wrong.
Abraham Wald, one of the members of the group, inserted himself in the conversation. He realized they were missing one important set of data—the planes that never made it home.
While the bombers that returned routinely had bullet holes everywhere except the engines, the ones that were shot down most likely experienced damage to the motors and crashed because of it. The armor in fact needed to protect the engines.
This failure is known as survivorship bias. It occurs when only the data at hand is considered, and the data that never made it back is dismissed. It was seen again in the military when planners observed an increase in recorded head injuries after the Brodie helmet was introduced in World War I. This was alarming, leading many military leaders to believe the helmet didn't work. However, they failed to consider that soldiers without the protection had simply been killed, so their injuries never made it into the records. The Brodie helmet was actually saving lives.

Before you think you’re right, you need to ask yourself:
What data is missing?
Are my assumptions valid?
Are there people around with more experience whom I should listen to?
Have I considered follow-on effects?
Am I able to admit that I could be wrong and listen to others to see a different point of view?
It feels great to be wrong and not know it. The challenge is to be open to new ideas and fresh perspectives. This comes by gathering additional data, leveraging experienced partners, and thinking through problems. Only then will we have the best chance of getting to the right answer.
After all, while feeling obliviously wrong feels great, it feels even better to be knowingly right.