I just finished Nassim Nicholas Taleb’s study of randomness, The Black Swan. Yes, I know, even the updated edition is several years old now. But if you have not read it, I recommend the book.

Often my reading uncovers things in flocks. In this case, I’m coupling Taleb’s view of experts with that of a piece in the MIT Sloan Management Review. (Note: I am sure that Taleb would flinch at the idea of pairing him with an MIT piece, but it fits.)

Let me take a moment for the “exception that proves the rule,” as they say. In certain sciences, there are genuine experts. In the field of most of the people reading this, there are process control experts who truly can give excellent and accurate advice on their subject. They know what they are doing through a cycle of study, experience, reflection, more experience, and iteration.

In these latter stages of my career, I’ve been labeled “SME” (subject matter expert). That scares me. I have a friend who introduces me as an expert. That worries me. Mostly I am a student and observer who thinks about what I learn and observe. And thanks to being blindsided by the economy or company politics a few times, I also try to have antennae scanning the ecosystem watching for curious blips on the horizon that just might signal crisis—or opportunity.

Back to Taleb. He says the power of random events, “black swans,” can confound even Nobel-laureate-level experts. These events cannot be predicted, especially not with bell curves or linear regression. One can, however, train one’s awareness to become sensitive to vulnerabilities to these black swan events.

Taleb writes with an irreverence that I appreciate, although reading some of the comments on the book revealed the sensitivities of those he tweaks (or skewers).

Taleb noticed that experts had nothing over ordinary people. He tells the story of being a youth in Lebanon during the civil war, when his uncle was a minister in the government. One day he asked his uncle’s chauffeur what he thought about the war. Neither the minister nor the chauffeur seemed to have a better grasp of what was happening.

He further noticed during his career as a trader that business executives don’t know much about the situation when they make decisions, oblivious to the danger of not knowing what they don’t know.

People, especially experts, carry a mental map of how things are that is not based on empirical evidence. This blinds them to the potential outliers waiting to pounce.

In the Winter 2019 issue of the MIT Sloan Management Review, in the essay “Think Critically About the Wisdom of Experts,” Andrew A. King debunks the myth of the expert: “Expert analysis informs the decisions we make as leaders and managers — and in our everyday lives. Much of our knowledge is ultimately garnered from the testimony of teachers, mentors, colleagues, and authors who write for publications like this one.”

He goes on to say, “But we also live in a world where, almost daily, some expert’s previous certainty is discredited by new analysis.”

Take these examples from King:

  • Diets once thought to be foolproof are ridiculed; management practices once decried are suddenly praised.
  • In the second most popular TED talk of all time, social psychologist Amy Cuddy tells us that holding certain physical postures boosts our power hormones and makes us more courageous; however, attempts to replicate that result have failed.
  • European governments chose to adopt austerity policies in part because esteemed Harvard economists Carmen Reinhart and Kenneth Rogoff told them that high debt levels cause a sudden drop in economic growth. Then a graduate student, Thomas Herndon, discovered that their claim was influenced by an Excel spreadsheet error.
  • The replication crisis — whereby scientific findings are increasingly being revealed as tough to reproduce — is plaguing psychology, economics, and medical research.

So how should we treat the next piece of advice we get from a scholar or a consultant? We should always think critically about what we hear or read.
