The Black Swan – The Impact of the Highly Improbable

I have recently finished listening to The Black Swan by Nassim Nicholas Taleb (a former derivatives trader who has been Dean’s Professor in the Sciences of Uncertainty and has taught at the Courant Institute of Mathematical Sciences).

The book is about the extreme impact of certain kinds of rare, unpredictable events and our tendency to reach for simplistic explanations of them. I have pulled together a few extracts I found valuable.

The Black Swan 

Before 1697, teachers confidently taught European schoolchildren that all swans were white. They had little reason to think otherwise. But then the Dutch explorer Willem de Vlamingh landed in Australia and found dark-feathered birds that looked remarkably like swans. Black swans? Indeed. Once observed, they were as unmistakable as they had been unimaginable, and they forced Europeans to revise forever their concept of “swan.” In time, black swans came to seem ordinary. This pattern is common. Just because you haven’t seen a black swan doesn’t mean that there are no black swans. Unlikely events seem impossible when they lie in the unknown or in the future, but after they happen, people assimilate them into their conception of the world. Think of the advent of World Wars I and II, the popping of the 1990s Internet stock bubble, or world-changing inventions like the internal combustion engine, the personal computer and the Internet: none of these were widely foreseen, yet in hindsight they seem almost inevitable. Why?

The human mind is wonderful at simplifying the onslaught of today’s “blooming, buzzing confusion” of data. Mental schemas, heuristics, biases, self-deception – these are not “bugs” in the cognitive system, but useful features that allow the human mind to concentrate on the task at hand and not get overwhelmed by an effectively infinite amount of data. But human simplifying mechanisms are not without their costs. Take stories, for example.

The Narrative Fallacy

Stories help people remember and make sense of the past. Think of a typical business magazine profile of a successful businessman. The story begins in the present, after he has become rich beyond his wildest dreams. The story then cuts back to his humble beginnings. He started with nothing and wanted to get rich (in terms of story structure, his “dramatic need”). He faced obstacle after obstacle (perhaps he had a rival – the “antagonist”). But he made shrewd decisions and flouted the wisdom of the Cassandras who counselled caution (“Idiots!”). As success built on success, he amassed a fortune. He retired early, married a model and now has brilliant children who play Chopin blindfolded and will all attend Ivy League colleges. His virtues will be extolled in a B-school case study. Wide-eyed business students will sit rapt at his feet when he visits their schools on a lecture tour promoting his latest book. He is a superman, an inspiration.

Now consider an alternative hypothesis: he got lucky. His putative “virtues” had nothing to do with his success. He is, essentially, a lottery winner. The public looks at his life and concocts a story about how brilliant he was, when, in fact, he was merely in the right place at the right time. This is the “ludic fallacy” (ludus is Latin for game): people underestimate luck in life – though they ironically overestimate it in certain games of “chance.” Even the businessman himself falls victim to flawed thinking through the self-sampling bias. He looks at himself, a sample of one, and draws a sweeping conclusion, such as, “If I can do it, anyone can!” Notice that the same reasoning would apply had he merely bought a winning lottery ticket: “I’m a genius for picking 3293927! Those long odds didn’t mean a darn thing. After all, I won, didn’t I?”

In the case of the inspiring businessman, consider his population cohort. Where are all the similarly situated people who started out like him, with the same attributes? Are they also rich? Or homeless? Usually you can’t find this sort of “silent” disconfirming evidence. The mind uses many more simplifying schemas that can lead to error. Once people have theories, they seek confirming evidence; this is called “confirmation bias.” They fall victim to “epistemic arrogance,” becoming overconfident about their ideas and failing to account for randomness.

“Mediocristan” or “Extremistan”?

So, the human mind tends to smooth away the rough features of reality. Does this matter? It can matter, and a lot, depending on whether you’re in “Mediocristan” or “Extremistan.” Mediocristan refers to phenomena you can describe with standard statistical concepts, such as the bell curve. Extremistan refers to phenomena where a single event or person can radically skew the distribution. For instance, if you and Bill Gates share a cab, the average wealth in the cab can be north of $25 billion. But the distribution is certainly not even. When this happens, odds are you’re no longer in Kansas. You’re in Extremistan.
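
To make the cab example concrete, here is a minimal Python sketch (the net-worth figures are invented for illustration; only the shared-cab thought experiment comes from the book). One passenger drags the mean into the billions while the median still describes the typical rider:

```python
from statistics import mean, median

# Hypothetical net worths: nine ordinary passengers plus one billionaire.
wealths = [50_000] * 9 + [100_000_000_000]

print(f"mean:   ${mean(wealths):,.0f}")    # $10,000,045,000, set by one rider
print(f"median: ${median(wealths):,.0f}")  # $50,000, the typical rider
```

In Mediocristan (height, weight, calorie intake) no single observation can move the average like this; in Extremistan (wealth, book sales, market moves) one observation can dwarf all the others combined.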

Phony Forecasting (or Nerds and Herds)

Extremistan might not be so bad if you could predict when outliers would occur and what their magnitude might be. But no one can do this precisely. Consider Harry Potter. Nobody could have predicted whether a book by a mother on welfare about a boy magician with an odd birthmark would flop or make the author a billionaire. Stock prices are the same way. Anyone who claims to be able to predict the price of a stock or commodity years in the future is a charlatan. Yet the magazines are filled with the latest “insider” advice about what the market will do. Ditto for technology. Do you know what the “next big thing” will be? No. No one does. Prognosticators generally miss the big, important events – the black swans that impel history. Chalk these errors up to “nerds and herds.” Nerds are people who can only think in terms of the tools they have been taught to use. When all you have is a hammer, everything becomes a nail. If all you have is the standard deviation and mild, ordinary randomness, you’ll see bell curves everywhere and will explain away disconfirming data as “outliers,” “noise” or “exogenous shocks.” (The proliferation of Excel spreadsheets that let every user fit a regression line to any messy series of data doesn’t help.)
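
As a toy illustration of the hammer-and-nail problem (my own sketch, not an example from the book), the snippet below fits a bell curve’s mean and standard deviation to fat-tailed data drawn from a Pareto distribution – a stand-in for wealth- or market-like processes – and asks how “impossible” the largest observation looks under that fit:

```python
import random
import statistics

random.seed(7)  # reproducible toy data

# Fat-tailed data: Pareto with shape alpha = 1.5, i.e. infinite variance.
samples = [random.paretovariate(1.5) for _ in range(10_000)]

mu = statistics.fmean(samples)
sigma = statistics.stdev(samples)
z = (max(samples) - mu) / sigma  # how extreme the maximum looks to a Gaussian

print(f"mean={mu:.2f}  stdev={sigma:.2f}  max={max(samples):.2f}  z={z:.1f}")
# The maximum typically lands tens of standard deviations from the mean.
# A Gaussian model treats anything past about 7 sigma as effectively
# impossible, so an analyst armed only with the bell curve files it under
# "outlier" or "noise", even though the process produces such values
# routinely.
```

The point is not the particular numbers but the mismatch: the bell curve’s own summary statistics declare the data’s most important observation a freak, which is exactly how black swans get explained away.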

Further, humans follow the herd and look to “experts” for guidance. Yet, some domains can’t have experts because the phenomena the expert is supposed to know are inherently and wildly random. Of course, this discomforting thought requires a palliative, which is to think that the world is much more orderly and uniform than it often is. This soothing belief usually serves people well. Then comes a stock market drop or a meteor strike (on the downside), or Star Wars and the Internet (on the upside), and the curve is shot.

Befriending Black Swans

Even given these grim facts, you can tame, if not befriend, the black swan by cultivating some “epistemic virtues”:

  • Keep your eyes open for black swans – Look around and realize when you are in Extremistan rather than Mediocristan. Social contagion and rich-get-richer phenomena are clues that you’ve just gotten off the bus in Extremistan.
  • Beliefs are “sticky,” but don’t get glued to them – Revise your beliefs when confronted with contrary evidence. Dare to say, “I don’t know,” “I was wrong” or “It didn’t work.”
  • Know where you can be a fool and where you can’t – Are you trying to predict what sort of birthday cake your daughter wants? Or the price of oil in 17 years after investing your life’s savings in oil futures? You can’t help being foolish – no one can. But sometimes foolishness is dangerous, and sometimes it is benign.
  • As a forecasting period lengthens, prediction errors grow exponentially – Suspend judgment where evidence is lacking and be wary of overly precise predictions. “Fuzzy” thinking can be more useful. Often you should focus on consequences rather than on precise probabilities.
  • Look for the non-obvious – Seek out disconfirming evidence for pet theories. Think, “What event would refute this theory?” rather than just stacking up confirming evidence for the sake of consistency and tuning out any evidence that contradicts your notion. In other words: amassing confirming evidence doesn’t prove a theory or a mental model.
  • Know that in many cases, you cannot know – Think outside your usual, customary conceptual categories. Eliminate alternatives that you know are wrong rather than always trying to find out what is right.
  • Avoid dogmatism – “De-narrate” the past and remember that stories mislead. That’s the whole point: They are psychological armour against the “slings and arrows of outrageous fortune.” Think for yourself. Avoid nerds and herds. This universe, this planet and your life were highly unlikely. But they happened. Enjoy your good fortune and remember that you are a black swan.
