The Black Swan
Nassim Nicholas Taleb
Until a black swan was sighted in Western Australia by early explorers, it was assumed that swans were white; it was part of the definition of swans that they were white. But as Nassim Nicholas Taleb points out in this sprawling, brilliant work, all you need is one variation to show up the falsity of your assumptions.
From this simple observation, derived from Hume, Taleb creates an entire theory of events and causality. His definition of a ‘black swan’ event is that it happens against all expectation, and has an extreme impact. Most intriguingly, human nature after the fact tries to explain it away, as if it were predictable.
Our history has become the story of big events no one expected. No one, for instance, foresaw the severity of World War One, the rise of Hitler, the sudden collapse of the Soviet bloc, the spread of the Internet, or 9/11. No one foresees particular ideas, fashions, or art genres coming into vogue. And yet, Taleb points out, “A small number of Black Swans explain almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.” Moreover, the effect of Black Swans is increasing because the world is becoming more complicated. The combination of low predictability and large impact causes a problem for the human mind, because our brains are built to focus on the known and visible.
Taleb imagines two places to express our ways of seeing the world. ‘Mediocristan’ is a state in which there is an equal relation between effort and result, where the future can be predicted, and where most things fall into a wide band of averages. ‘Extremistan’ is an inherently unstable, unpredictable, winner-takes-all kind of place. It is the latter that we actually live in, and accepting the fact is the first step to thriving in it.
A self-described ‘sceptical empiricist’, Taleb counts Hume, Sextus Empiricus, and Karl Popper among his heroes. He is very critical of the kind of philosophy focused on language that fills academia. While interesting, it has nothing to do with the real world, he says - a world in which people have to live with uncertainty.
What we don’t know...
The black swan effect has made a mockery of attempts to curb uncertainty, whether in the form of fancy financial algorithms that purport to eliminate risk, or the predictions of social scientists. Think about your own life: how many things, from meeting your spouse to the profession you entered, came according to plan or on schedule? Who expected that you would be fired, exiled, enriched or impoverished? Taleb observes, “Black Swan logic makes what you don’t know far more relevant than what you do know”, because it is these big unexpected things that shape our lives. And if that is so, why do we keep believing things will go as they have done in the past? Our minds, he says, suffer from a ‘triplet of opacity’:
- False understanding. We believe we understand more of what’s going on in the world than we actually do;
- Retrospective distortion. We ascribe meaning to events after they’ve happened, creating a story. This is what we call ‘history’;
- Overvaluing facts, statistics, and categories. We shouldn’t fool ourselves that they can predict the future, or even give us an accurate picture of reality now.
We live according to rules of what we consider normal, but normality is rarely the test of anything. When something major happens out of the blue, we are keen to discount its rarity and unexpectedness. We want to be able to explain everything away. Yet we don’t really know a person until we see how they act in an extreme situation, and neither can we assess the danger of a criminal based on what he does on a regular day. It is the rare or unusual event that often defines a situation, not whatever is ‘normal’.
It is not just that the average person does not see what is going on – the so-called experts and people in charge do not either. Taleb’s grandfather was a minister in the Lebanese government during its civil war, yet claimed he knew no more about what was happening than his driver. Taleb does not hold back in pointing out the “epistemic arrogance of the human race”, including CEOs who believe that their company’s success is down to them - and not a million other factors, including blind luck. Such fantasies are encouraged in business schools.
No one expected the rise of the world religions. Christian scholars are baffled by how little mention Roman chroniclers made of their faith in its early days; equally, who could have foreseen the rapid spread of Islam? The historian Paul Veyne noted that religions spread “like bestsellers”. Yet in our minds they quickly become part of the scenery – we normalise them. The same trait means we will be shocked by the sudden rise of the next new religion.
To illustrate his point about extreme events, Taleb asks us to consider a farm turkey. The turkey will look upon the farmer in very kindly terms, since every day he provides an abundance of food, plus shelter. Given the experience of all its days in this environment, how could it not feel this way? But its experience thus far is totally misleading, because one day, totally unexpectedly, it is slaughtered. The moral: despite what we’ve been told, the past generally tells us nothing about the future; the apparent ‘normalness’ of today is “viciously misleading”. EJ Smith, a ship’s captain, said in 1907: “I never saw a wreck and have never been wrecked nor was I ever in any predicament that threatened to end in disaster of any sort.” Five years later, the vessel under his command was the Titanic.
The human brain is wired to make general assumptions from experience. The problem is that, in real life, a black swan can come along after a whole life of seeing only white ones. Better to rest in the fact of how little we know, and know also the faults in our reasoning; the point is not to be able to predict black swan events, only to be a bit more mentally prepared. It is human nature to react to big, unforeseen events with small, focused adaptations that either try to prevent an event happening again (if it was bad) or to make it happen again (if it was good). But what we should be doing is peering into what we don’t know and why we don’t know it. Humans think much less than they believe they do, Taleb says; most of our action is instinctive. This makes us less likely to understand black swan events, because we are always lost in the details, only reacting.
Everything comes from unknown factors, while “all the while we spend our time engaged in small talk, focusing on the known, and the repeated”. Umberto Eco had a 30,000-book library, and separated visitors to it into those who were impressed by the number of titles, and those who were more interested in the knowledge the books contained. Eco was an ‘anti-scholar’, who couldn’t care less about the number of books he had read; it was the knowledge contained in the ones he hadn’t read that was far more important.
...and how to get around it
Humans like certainty, but the wise see that certainty is elusive, that “understanding how to act under conditions of incomplete information is the highest and most urgent human pursuit.”
Taleb notes that a “succession of anecdotes selected to fit a story does not constitute evidence”. Instead of going around trying to confirm our existing ideas, we should, as Popper taught, be trying to falsify them. Only then might we get a semi-accurate sense of the truth. The best investors, like George Soros, try when making a financial bet to find instances where their assumption is wrong. Taleb sees this “ability to look at the world without the need to find signs that stroke one’s ego” as genuine self-confidence.
He admits that “It takes considerable effort to see facts...while withholding judgment and resisting explanations. And this theorizing disease is rarely under our control: it is largely anatomical, part of our biology, so fighting it requires fighting one’s own self.” That we are like this is understandable. We have to make rules and oversimplify in order to put endless information into some order in our heads. Myths and stories enable us to make sense of our world. Science is meant to be different, but instead we use science to organize things for our own benefit. Seen in this context, knowledge is therapy, doing little more than making us feel better. Scientists and academics of all stripes are guilty of this, and of course we see it in the media every day. If a candidate loses an election, ‘causes’ will be trotted out. Whether or not they are correct doesn’t matter; what matters is that there is a narrative quickly put in place as to why an event happened. It would be shocking for the newsreader to say, ‘Smith lost the election, but we actually have no idea why’.
Not only do we not know stuff, we totally overestimate the extent of our knowledge, and how efficient and effective we are. This overconfidence seems to be hardwired. Taleb mentions experiments with students who have to estimate the time needed to complete their assignment. Broken into two groups, the optimistic ones thought they could deliver in 26 days; the pessimists promised they would deliver in 47 days. What was the actual average time for completion? 56 days. (Taleb’s manuscript was delivered to the publisher 15 months late).
Why are we like this? It is because we ‘tunnel’ mentally, not taking account of the ‘unexpected’ things that take us off course. But of course, the ‘unexpected’ should be incorporated into calculations for the achievement of anything.
Taleb’s assertion that “almost no discovery, no technologies of note, came from design and planning – they were just Black Swans” is easily argued against. For example, Du Pont spent years developing nylon, knowing how valuable it would be; and in fact most successful medicines, although they often come from chance discoveries, then need years of development and planning before being brought to market. Yet he is right that organisations and individuals need to focus more on tinkering than planning, because through constant trial and error the chances of creating a positive black swan (an idea that sweeps all before it, a product that becomes the market leader) are increased. The other tip Taleb gives us is to have patience:
“...earthquakes last minutes, 9/11 lasted hours, but historical changes and technological implementations are Black Swans that can take decades. In general, positive Black Swans take time to show their effect while negative ones happen very quickly”.
Building a great enterprise, for instance, will take many years, and though we can never know what the future holds, at least the long view allows us to take obstacles and reversals in our stride.
The book itself is a microcosm of Taleb’s argument about complexity: it has almost too much information, too many startling challenges to our thinking, to be able to summarise neatly. Best to read it yourself, if only for the many entertaining digressions and examples that we have no room for here. As Taleb himself suggests, summarisation takes out random discovery, and it is these discoveries that make all the difference in your life and career, and in our world.
Taleb is famed for having predicted, in the original edition of the book, the 2008 financial crisis, when he wrote about the fragility of the banking sector, suggesting that if one large bank collapsed, they could all go, as they were so entwined with each other. In the second edition (2010) he elaborates on this concept of fragility, noting that few lessons have been learned. His critique of mega-sized companies and institutions is that they can get away with much more than smaller ones, and so their risks tend to be hidden. This makes them more, not less, vulnerable to black swan events.
Source: Philosophy Classics: Thinking, Being, Acting, Seeing, Profound Insights and Powerful Thinking from Fifty Key Books by Tom Butler-Bowdon (London & Boston: Nicholas Brealey).
Nassim Nicholas Taleb
Taleb was born in Amioun, Lebanon, in 1960. His parents had French citizenship, and he attended a French school in Lebanon. During the Lebanese Civil War which began in 1975, he studied for several years in the basement of his home.
A former derivatives trader turned mathematical analyst specializing in problems of probability and uncertainty, he held positions with major banks such as Credit Suisse First Boston, UBS, and BNP Paribas. Taleb is currently Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute and Distinguished Research Scholar at the Saïd Business School, University of Oxford. His degrees include an MBA from the Wharton School, University of Pennsylvania, and a PhD from the University of Paris.
Other books include Fooled by Randomness (2001), Dynamic Hedging (1997), and a book of philosophical aphorisms, The Bed of Procrustes (2010).