An amusing book that will change the way you think about sudden, unexpected events. Taleb explains that we're bad at predicting the future and that we overestimate what we know. He also provides a model for maximising potential upside while reducing downside.
A Black Swan has three attributes. First, it is an outlier, lying outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility (rarity). Second, it carries an extreme impact. Third, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable (retrospective predictability).
The central idea of this book concerns our blindness with respect to randomness, particularly the large deviations.
Platonicity, named after the philosopher Plato, is our tendency to mistake the map for the territory: to focus on pure and well-defined “forms,” whether objects, like triangles, or social notions, like utopias, even nationalities.
The Black Swan is produced in the Platonic fold, the explosive boundary where the Platonic mind-set enters in contact with messy reality, where the gap between what you know and what you think you know becomes dangerously wide.
The human mind suffers from three ailments as it comes into contact with history, what Taleb calls the triplet of opacity: the illusion of understanding, the retrospective distortion, and the overvaluation of factual information.
This distinction allows us to make a clear-cut differentiation between two varieties of uncertainty, two types of randomness: Extremistan and Mediocristan.
Pick a profession that is not scalable (e.g. prostitute). Scalable professions (e.g. speculator) are good only if you are successful: they are more competitive, produce monstrous inequalities, and are far more random, with huge disparities between efforts and rewards.
Mediocristan — When your sample is large, no single instance will significantly change the aggregate or the total.
Extremistan — Inequalities are such that one single observation can disproportionately impact the aggregate, or the total.
Extremistan does not always imply Black Swans. Some events can be rare and consequential, but somewhat predictable, particularly to those who are prepared for them and have the tools to understand them.
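The contrast between the two domains can be made concrete with a small simulation. This is purely illustrative: the distributions and parameters (Gaussian heights, a Pareto wealth distribution with tail index 1.2) are my own choices, not Taleb's.

```python
import random

random.seed(42)
N = 10_000

# Mediocristan: human height, roughly Gaussian (cm).
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan: wealth, fat-tailed (Pareto, tail index 1.2).
wealth = [random.paretovariate(1.2) for _ in range(N)]

# Share of the total contributed by the single largest observation.
height_share = max(heights) / sum(heights)
wealth_share = max(wealth) / sum(wealth)

print(f"Mediocristan (height): top observation = {height_share:.4%} of total")
print(f"Extremistan (wealth):  top observation = {wealth_share:.4%} of total")
```

In Mediocristan the tallest person barely moves the total; in Extremistan a single fortune can account for a noticeable slice of all the wealth in the sample.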
Consider a turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race “looking out for its best interests,” as a politician would say. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.
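The turkey's growing, and fatally misplaced, confidence can be formalized with Laplace's rule of succession. This is my own illustration, not a calculation Taleb performs: after n consecutive safe days, the naive inductive estimate of the probability that tomorrow is also safe is (n + 1) / (n + 2).

```python
def turkey_confidence(n_safe_days: int) -> float:
    """Naive inductive estimate (Laplace's rule of succession):
    P(safe tomorrow) after n consecutive safe days."""
    return (n_safe_days + 1) / (n_safe_days + 2)

for day in (1, 10, 100, 1000):
    print(f"day {day:4d}: estimated P(safe tomorrow) = {turkey_confidence(day):.4f}")
```

The estimate is at its highest on day 1,000, the eve of Thanksgiving, which is exactly when the risk to the turkey is greatest.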
How can we know the future, given knowledge of the past; or, more generally, how can we figure out properties of the (infinite) unknown based on the (finite) known?
Our tendency to confuse absence of evidence ("no evidence of Black Swans") with evidence of absence ("evidence of no Black Swans").
For example, the medical literature uses the acronym NED, which stands for No Evidence of Disease. There is no such thing as END, Evidence of No Disease.
The narrative fallacy is the backward-looking mental tripwire that causes us to attribute a linear and discernible cause-and-effect chain to our knowledge of the past.
The fallacy is associated with our vulnerability to overinterpretation and our predilection for compact stories over raw truths.
Memory is more of a self-serving dynamic revision machine: you remember the last time you remembered the event and, without realising it, change the story at every subsequent remembrance.
Judgement and decision making researchers have mapped our activities into (roughly) a dual mode of thinking, which they separate as “System 1” and “System 2,” or the experiential and the cogitative.
System 1, the experiential one, is effortless, automatic, fast, opaque (we do not know that we are using it), parallel-processed, and can lend itself to errors.
System 2, the cogitative one, is what we normally call thinking.
Our misunderstanding of the Black Swan can be largely attributed to our using System 1, i.e., narratives, the sensational and the emotional.
The way to avoid the ills of the narrative fallacy is to favour experimentation over storytelling, experience over history, and clinical knowledge over theories.
The attributes of the uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games.
Those who spend too much time with their noses glued to maps will tend to mistake the map for the territory.
Why on earth do we predict so much? Worse, even, and more interesting: Why don’t we talk about our record in predicting? Why don’t we see how we (almost) always miss the big events?
Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states (i.e., by reducing the space of the unknown).
When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate. Two mechanisms are at play here: confirmation bias and belief perseverance, the tendency not to reverse opinions you already have.
We use reference points in our heads, say sales projections, and start building beliefs around them because less mental effort is needed to compare an idea to a reference point than to evaluate it in the absolute (System 1 at work!). We cannot work without a point of reference.
You search for what you know (say, a new way to reach India) and find something you didn’t know was there (America). If you think that the inventions we see around us came from someone sitting in a cubicle and concocting them according to a timetable, think again: almost everything of the moment is the product of serendipity.
Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.
Make a distinction between positive contingencies and negative ones.
Don’t look for the precise and the local.
Seize any opportunity, or anything that looks like opportunity.
Beware of precise plans by governments.
All these recommendations have one point in common: asymmetry. Put yourself in situations where favourable consequences are much larger than unfavourable ones.
Example: keep high cash reserves while taking more aggressive risks but with a small portion of the portfolio.
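The asymmetry in this barbell example can be sketched with hypothetical numbers. The 90/10 split and the 20x payoff are my assumptions for illustration, not figures from the book.

```python
portfolio = 100_000.0

# Barbell: most of the portfolio in near-riskless cash,
# a small slice in highly speculative, open-ended bets.
cash = 0.90 * portfolio
speculative = 0.10 * portfolio

# Worst case: the speculative slice is wiped out entirely.
worst_case = cash + speculative * 0.0
# Lucky case: a rare, Black Swan-sized 20x payoff on the slice.
lucky_case = cash + speculative * 20.0

print(f"worst case: {worst_case:,.0f}  (downside capped at -10%)")
print(f"lucky case: {lucky_case:,.0f}  (upside open-ended)")
```

The downside is bounded at the size of the speculative slice, while the upside is limited only by how extreme the favourable outcome turns out to be.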
Example: plenty of idleness, some high intensity. Long, very long walks, combined with occasional bouts of high-intensity exercise, outperform just running.
Snub your destiny. I have taught myself to resist running to keep on schedule. This may seem a very small piece of advice, but it registered. In refusing to run to catch trains, I have felt the true value of elegance and aesthetics in behavior, a sense of being in control of my time, my schedule, and my life. Missing a train is only painful if you run after it! Likewise, not matching the idea of success others expect from you is only painful if that’s what you are seeking.
You stand above the rat race and the pecking order, not outside of it, if you do so by choice.
I am sometimes taken aback by how people can have a miserable day or get angry because they feel cheated by a bad meal, cold coffee, a social rebuff, or a rude reception. Recall my discussion in Chapter 8 on the difficulty in seeing the true odds of the events that run your own life. We are quick to forget that just being alive is an extraordinary piece of good luck, a remote event, a chance occurrence of monstrous proportions.
Imagine a speck of dust next to a planet a billion times the size of the earth. The speck of dust represents the odds in favor of your being born; the huge planet would be the odds against it. So stop sweating the small stuff.
Mother Nature likes redundancy, of three different types.
It's not that we need to stop globalisation and prevent travel. We just need to be aware of the side effects, the trade-offs—and few people are. I see the risks of a very strange acute virus spreading throughout the planet.
Thinking that a Black Swan should be a Black Swan to all observers.
Not understanding that doing nothing can be much more preferable to doing something potentially harmful.
The Fourth Quadrant is the Black Swan domain, where fat-tailed (Extremistan) uncertainty meets complex payoffs; it is where the difference between absence of evidence and evidence of absence becomes acute.
The recommendation is to move from the Fourth Quadrant into the third one (Extremistan uncertainty, but simple payoffs), or to reduce your exposure to it.
For an illustration of the way we can be ludicrously domain-specific in daily life, go to the luxury Reebok Sports Club in New York City, and look at the number of people who, after riding the escalator for a couple of floors, head directly to the StairMasters.
Self-confidence is the ability to look at the world without the need to find signs that stroke one’s ego.
Happiness depends far more on the number of instances of positive feelings, what psychologists call “positive affect,” than on their intensity when they hit. In other words, good news is good news first; how good matters rather little.
We do not live in an environment where results are delivered in a steady manner—Black Swans dominate much of human history. The same property in reverse applies to our unhappiness. It is better to lump all your pain into a brief period rather than have it spread out over a longer one.
I propose that if you want a simple step to a higher form of life, as distant from the animal as you can get, then you may have to denarrate, that is, shut down the television set, minimize time spent reading newspapers, ignore the blogs. Train your reasoning abilities to control your decisions; nudge System 1 (the heuristic or experiential system) out of the important ones. Train yourself to spot the difference between the sensational and the empirical.
We attribute our successes to our skills, and our failures to external events outside our control, namely to randomness. We feel responsible for the good stuff, but not for the bad. This causes us to think that we are better than others at whatever we do for a living.
Variability matters: don't cross a river that is four feet deep on average.
To borrow from Warren Buffett, don’t ask the barber if you need a haircut—and don’t ask an academic if what he does is relevant.
It has been shown, by Michael Marmot of the Whitehall Studies, that those at the top of the pecking order live longer, even when adjusting for disease. Marmot’s impressive project shows how social rank alone can affect longevity. It was calculated that actors who win an Oscar tend to live on average about five years longer than their peers who don’t.
While many study psychology, mathematics, or evolutionary theory and look for ways to take it to the bank by applying their ideas to business, I suggest the exact opposite: study the intense, uncharted, humbling uncertainty in the markets as a means to get insights about the nature of randomness that is applicable to psychology, probability, mathematics, decision theory, and even statistical physics. You will see the sneaky manifestations of the narrative fallacy, the ludic fallacy, and the great errors of Platonicity, of going from representation to reality.
I have to confess that I do not worry a lot: I try to worry only about matters I can do something about.
A Black Swan for the turkey is not a Black Swan for the butcher.
There is plenty of research on anchoring that demonstrates the toxicity of giving someone a wrong numerical estimate of risk. German judges, very respectable people, rolled dice before sentencing; when the dice showed a high number, they issued sentences 50 percent longer, without being conscious of it.
People do not realise that success consists mainly in avoiding losses, not in trying to derive profits.
What is fragile should break early, while it’s still small.
Don’t let someone making an “incentive” bonus manage a nuclear plant—or your financial risks. No incentives without disincentives: capitalism is about rewards and punishments, not just rewards.
Image credit: The Black Swan by Nassim Taleb