Nassim Nicholas Taleb on Black Swans
Apr 30 2007

Nassim Taleb talks about the challenges of coping with uncertainty, predicting events, and understanding history. This wide-ranging conversation looks at investment, health, history, and other areas where data play a key role. Taleb, the author of Fooled by Randomness and The Black Swan, imagines two countries, Mediocristan and Extremistan, where the ability to understand the past and predict the future is radically different. In Mediocristan, events are generated by an underlying random process that is normally distributed. These events are often physical and observable, and they tend to cluster around the middle. Most people are near the average height, and no adult is more than nine feet tall. But in Extremistan, the right-hand tail of events is thick and long, and the outlier, the seemingly wildly unlikely event, is more common than our experience with Mediocristan would indicate. Bill Gates is more than a little wealthier than the average. The civil war in Lebanon and the events of 9/11 were far worse than just a typical bad day in Beirut or New York City. Taleb's contention is that we often bring our intuition from Mediocristan to the events of Extremistan, leading us into error. The result is a tendency to be blindsided by the unexpected.

RELATED EPISODE
Nassim Nicholas Taleb on Black Swans, Fragility, and Mistakes
Nassim Taleb, author of The Black Swan and Fooled by Randomness, talks with EconTalk host Russ Roberts about his latest thoughts on robustness, fragility, debt, insurance, uncertainty, exercise, moral hazard, knowledge, and the challenges of fame and fortune.
RELATED EPISODE
Nassim Nicholas Taleb on Rationality, Risk, and Skin in the Game
Nassim Nicholas Taleb, author of Skin in the Game, talks with EconTalk host Russ Roberts about the ideas in the book. This is the third episode of EconTalk with Taleb related to the general topic of skin in the game...
Explore audio transcript, further reading that will help you delve deeper into this week’s episode, and vigorous conversations in the form of our comments section below.

READER COMMENTS

paul
May 1 2007 at 10:09am

Good podcast. Extremely important points that are rarely discussed. Two questions: which papers by Boettke and Shackel are being referenced?

thanks, keep up the good work

Paul

Lauren Landsburg
May 1 2007 at 12:02pm

Hi, Paul.

Russ asked me to add these for you:

The Boettke paper is:

“High Priests and Lowly Philosophers: The Battle for the Soul of Economics”, by Peter Boettke, Christopher Coyne, and Peter Leeson. Mercatus Center Working Paper, no. 44.

There is a bio for Shackle–I’ve corrected the spelling of his name in the Podcast Highlights text–at the New School’s HET site:

George L.S. Shackle, 1903-1992

Lauren
Econlib Editor

Ethan
May 2 2007 at 3:59am

Great podcast, Russ. Nassim Taleb has been one of my favorite guests so far. I really enjoyed hearing about his work and look forward to reading his books.
“I’m an economics major” is a conversation ender at the bars; I think I will use “moral philosopher” from now on.

For future podcasts I’d love to hear about market-based solutions in developing countries (Karol Boudreaux?) or protecting the environment. An interview with James Buchanan would be great as well!

-Ethan Sklar

Torsten
May 4 2007 at 8:20am

I would not say artefacts and new technology are developed by accident or coincidence. I agree that a lot of it isn’t designed in a scientific process using explicit knowledge, but there is also tacit knowledge that contributes to the success of the inventor. The same applies to entrepreneurs.

bjcefola
May 7 2007 at 6:11pm

This was a great podcast. A lot of ideas packed into a 60-minute discussion; I look forward to reading the book.

A missed opportunity: I’d like to have heard Taleb’s thoughts on the usefulness of Sarbanes-Oxley in the context of ludic models…

Rohit
May 8 2007 at 4:05pm

This was a great podcast. It correlates with the Hindu holy book, the “Gita,” where it says, “Just do your job. Don’t think about or predict the result, because you can’t. The only certain thing is that for every action, there is a reaction.”

Looking forward to reading your books.

Mr. Roberts: You have been very helpful in asking him the right questions to clarify certain words/his thinking…

jack jacobson
May 13 2007 at 12:06pm

Great podcast. What is the reference for the statement “80% of epidemiological studies fail to represent real life”?

Thanks

Russ Roberts
May 13 2007 at 7:42pm

Jack Jacobson,

Not sure what Taleb was thinking about. This is what I had in mind:

http://cafehayek.typepad.com/hayek/2005/07/only_a_third.html

“Only” one third of medical studies turn out to be unreliable.

Jack Jacobson
Jun 6 2007 at 12:20am

FYI, I found more information on the source of the 80% quote noted above. The author Taleb was referring to is John P. A. Ioannidis MD PhD from the Department of Hygiene and Epidemiology, University of Ioannina School of Medicine, Ioannina, Greece and Adjunct Professor of Medicine at Tufts University School of Medicine.

His lecture at the AAAS meeting was S048 – Mixed Health Messages: Observational Versus Randomized Trials – presented at the American Association for the Advancement of Science meeting in February in San Francisco, CA, USA, available for purchase at http://shop.lawrencemg.com/aaas-2007-m-16.html?page=3&sort=3a

Among other articles he’s written on this topic are these:
http://medicine.plosjournals.org/perlserv/?request=get-document&doi=10.1371/journal.pmed.0020124
BMJ 322:879–880
http://www.ncbi.nlm.nih.gov/sites/entrez?cmd=Retrieve&db=PubMed&list_uids=16014596&dopt=Abstract

He is also the author of the article you referenced about 1/3 of medical studies being unreliable. Cheers

Erich Doerr
Jun 12 2007 at 9:27pm

EconTalk,
Thank you for providing me with a high weekly surplus, given my low cost of an hour.

This podcast became my favorite, but only after a second listen. The first time around, I couldn’t accurately separate Taleb’s jokes from his explanations. I had to finish The Black Swan and explore his homepage before I put it all together. I’m a shoo-in to buy Fooled by Randomness in the near future.

Thanks again for your effort and time,
Erich

Comments are closed.



AUDIO TRANSCRIPT

 

Time
Podcast Episode Highlights
0:37 Intro. Large library, not as large as Umberto Eco's. Thematic quote from Fooled by Randomness:
We favor the visible, the embedded, the personal, the narrated, the tangible. We scorn the abstract.
Tend to believe all swans are white because you've probably never seen a black swan. Confirmation bias is not taking seriously what you don't see. Works well in a primitive environment but not in less primitive environments. Real world has many observations dominated by fat-tailed distributions--dominated by a small number of observations. Imagine two countries, Mediocristan vs. Extremistan, populated with a good sample of the world's population. Next, think of the heaviest person you could find. How much of the total weight will that person represent? A trivial percentage. The exceptional is inconsequential for weight. Law of large numbers--as your sample becomes very large, no single instance can influence the total. Converge to some stable average. Normal or Gaussian distribution characterizes Mediocristan. Now, for Extremistan, instead of weight, think about income (which is not normally distributed). Add in the richest person you can think of, say Bill Gates. The exceptional in this case is not inconsequential. The two random variables, weight and income, have different distributions. Currency markets. Reichsmark hyperinflation. Extreme events are often not Gaussian. 1992 Israeli interest rate incident.
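A minimal simulation sketch of this contrast, assuming a normal distribution for weight and a Pareto (power-law) distribution for wealth; the parameters below are illustrative assumptions, not numbers from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # size of the hypothetical population

# Mediocristan: body weight in kg, roughly Gaussian (illustrative parameters).
weights = rng.normal(loc=70, scale=15, size=n).clip(min=1)

# Extremistan: wealth drawn from a heavy-tailed Pareto distribution (illustrative).
wealth = (rng.pareto(a=1.2, size=n) + 1) * 10_000

for name, x in [("weight", weights), ("wealth", wealth)]:
    print(f"largest single {name} is {x.max() / x.sum():.6%} of the total")
```

The heaviest person is a vanishing fraction of total weight, while the single richest person can account for a visible share of all wealth--the sense in which the exceptional is inconsequential in Mediocristan but not in Extremistan.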
9:58 Now, look around you. Think about published serious novels: a handful will dominate the sales (even in a world without J. K. Rowling). Between 5 and 20 will dominate sales in any given year. Scorning the abstract works well in Mediocristan, where you don't need a large sample to become acquainted with the general properties. (Common rule of thumb is that you need around 30 to figure out the general properties of the distribution.) In Mediocristan, you don't need to sample everyone. It's cost-effective to use a small sample. The size of the sample may vary, but some small sample will do. The top 100 stocks won't be enough of a sample to protect you; you probably need a broader index, say 500; but you still don't need every single stock in existence. Bogle podcast. Not just the risk, but the unexpected, the risk that is unknown. Risk of stock market is not just losing money, but also missing an opportunity from the unknown.
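A rough check of that "sample of about 30" rule of thumb under the same illustrative assumptions as above (Gaussian for Mediocristan, Pareto for Extremistan): how much does a size-30 sample mean bounce around in each country?

```python
import numpy as np

rng = np.random.default_rng(1)
trials, n = 10_000, 30  # 10,000 independent samples of size 30 each

gauss_means = rng.normal(70, 15, size=(trials, n)).mean(axis=1)
pareto_means = ((rng.pareto(1.2, size=(trials, n)) + 1) * 10_000).mean(axis=1)

for name, m in [("Gaussian", gauss_means), ("Pareto", pareto_means)]:
    # Coefficient of variation of the size-30 sample mean across trials.
    print(f"{name}: relative spread of a size-30 sample mean = {m.std() / m.mean():.2f}")
```

The Gaussian sample mean is already tightly clustered at n = 30; the fat-tailed sample mean is still wildly variable, so a small sample tells you very little about Extremistan.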
17:01 By contrast, think about Extremistan. In general, you tend to be fooled by randomness, to think only about what happened to you in the last few days in the stock market instead of a longer, larger sample. Tend to say "What I've seen is sufficient." In the summer of 1982, U.S. banks suddenly lost more than they'd made cumulatively in the past. They were fooled by small sampling. They'd had many good years in a row, and had forgotten about the small probability of a very bad thing happening. This kind of error is pervasive, way beyond financial markets. Skeptical empiricist: don't just tell me the narrative, the ex-post story, but give me the data so I can figure out the general distribution (which will include estimating the probability of extreme events). This is in contrast to the misuse, the deceptive use of data to give it an air of scientism that is not scientific. In The Black Swan, discussion of the Narrative Fallacy. Hindsight bias, retrospective view, after the fact. We are suckers for stories, and ex post we have stories. 80% of epidemiological studies fail to represent real life--that is, they cannot be replicated. What people with statistical data do is process it and find accidental associations. Can even find a study where smoking lowers the risk of breast cancer. Non-representative statistical knowledge. The more data, the worse our statistical knowledge will be! The more you read the newspaper, the dumber you get. Sometimes the profusion of information is misleading. "Once you have a theory in your head you are going to just look for confirmation." Bruner and Potter paper: show a blurry picture of a dog, too blurry to discern. If you increase the resolution only gradually, viewers form intermediate theories about what it is and don't see the dog; if you increase the resolution in bigger steps, the viewer sees the dog. The person who forms a lot of intermediate theories then clings to them rather than to the evidence. Global warming: you make errors if you only rely on stories and experience.
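A hedged sketch of being fooled by small sampling: a hypothetical position that earns a small gain in 99% of years and takes a large loss in 1% of them (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def yearly_pnl(years):
    # +1 unit in a normal year; -150 units in the rare bad year (1% chance).
    bad_year = rng.random(years) < 0.01
    return np.where(bad_year, -150.0, 1.0)

print("last 10 years:", yearly_pnl(10))              # very likely all gains
print("long-run mean:", yearly_pnl(100_000).mean())  # negative in expectation
```

A decade of uninterrupted gains looks like skill, yet the expected value per year is 0.99*1 - 0.01*150 = -0.51: the short sample simply hasn't yet shown the tail.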
28:32 Riddle of induction. If you use a linear model, you have only one or two degrees of freedom, which means you have limited ability to make predictions. But nonlinear models can allow too many degrees of freedom, so you can predict anything at all if you select the wrong one. Forward and backward problem. See an ice cube on the floor and you can predict how much water it will leave when it melts. If you see a puddle of water on the floor, it's harder to tell where it came from. Empirical medicine. Inherent bias in epidemiological (and economics) studies because if you find nothing it's hard to publish it. Ed Leamer pointed this bias out. George Stigler anecdote: when he was a young economist, if you wanted to test a theory, you might run only three or four regression analyses, and you spent a lot of time thinking about what variables to test because you had to do a lot of computation by hand. Today you can run hundreds of regressions, and you can easily convince yourself that the "good" ones are the ones that worked out. We are fooled by randomness, selecting the ones that confirm our bias. If you have 1000 traders, you are bound to have 3 or 4 who by pure randomness do well. They are the ones who attract capital. Those with the same skills may be unable to get their careers started. A good dentist who makes a lot of money is probably pretty good at dentistry; but a trader who makes a lot of money may be just lucky. With regard to financial markets, the answer to "Is this person skilled?" is "I don't know." You can't infer skill from success; but we reward it the same way. "We are more likely to mistake the random for non-random than the non-random for random," because of behavioral bias and statistics masquerading as science.
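A minimal demonstration of the regression-mining point, assuming nothing but pure noise: with enough candidate regressors, roughly 5% will look "significant" at the conventional threshold by chance alone.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_obs, n_regressions = 100, 200

y = rng.normal(size=n_obs)  # the outcome is pure noise
significant = 0
for _ in range(n_regressions):
    x = rng.normal(size=n_obs)  # a regressor with no real relation to y
    slope, intercept, r, p_value, stderr = stats.linregress(x, y)
    if p_value < 0.05:
        significant += 1

print(f"{significant} of {n_regressions} noise regressions look 'significant'")
```

Selecting the "good" regressions after the fact, like backing the handful of traders who happened to do well, rewards exactly these accidents.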
37:37 We tend to force the world into Mediocristan, but we really should think about Extremistan. What advice for someone who wants to build wealth? We do not know the downside risk, so how can we arrange our financial affairs to sleep well at night? There are investments that are prone to negative black swans (e.g., banking, reinsurance); investments that are black-swan neutral; and those prone to positive black swans (e.g., biotech, publishing). You want to mentally sort headlines into these categories and select the ones that are black-swan neutral or immensely positive. Chapter 13: a how-to chapter. People tend to confuse the risky and the non-risky. Using the Markowitz model to achieve medium risk can be prone to model error. The same average risk can be achieved by securing 80% of your cash (with security guards) and taking all the risks you can with the other 20%. To find out what to do with that 20%, go to parties, try to learn about fat tails, the small chance of something fascinating happening; create exposure to create an envelope of serenity. Ramifications of confirmation bias. But there is another central source of error, more extreme than Hayek described. Looking at a dog, one can believe it was the product of randomness; but looking at a man-made object like a car, it is hard to believe it was the result of randomness. Yet the car was the result of a random process, and maybe even a more random one than the dog. When reading about scientific life, it seems to evolve as if by design. "I have the feeling that we are fooled by our own skills." Most of what has been discovered technologically was found when not looking for it, with a minimum amount of theory about what to look for. E.g., the Internet, the computer, the laser--three recent, very influential discoveries that were all found by accident. Penicillin. National Cancer Institute (NCI), Meyers [Happy Accidents: Serendipity in Modern Medical Breakthroughs, by Morton A. Meyers--Econlib Ed., courtesy of a reader], found that the NCI went through 130,000 compounds before homing in on the ones to use for successful chemotherapy. Gives the illusion of intention, but you are fooled by accidental finds. Very little is found by controlled experiment. Trial and error is inherently hit and miss. The most successful books and movies are often not predicted.
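A back-of-the-envelope sketch of the barbell idea with made-up numbers: compare the worst case of an all-in "medium risk" portfolio whose risk model may be wrong with an 80/20 barbell whose downside is capped by construction.

```python
portfolio = 100.0

assumed_medium_worst_loss = 0.15  # what the risk model claims (assumption)
true_medium_worst_loss = 0.60     # what a model error could actually deliver
risky_bet_worst_loss = 1.00       # the speculative 20% can go to zero

all_in_medium = portfolio * true_medium_worst_loss
barbell = portfolio * 0.20 * risky_bet_worst_loss  # the 80% is treated as safe cash

print(f"worst case, all-in 'medium risk' under model error: -{all_in_medium:.0f}")
print(f"worst case, 80/20 barbell: -{barbell:.0f}")
```

The barbell's loss is bounded at 20 no matter how wrong the model is, while the risky 20% keeps the exposure to large positive surprises.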
50:08 But is everything luck? No. Problem of separability. If I need a tie to become a banker, does it mean that having a tie causes my success? No. It is necessary to do research to find something. Most people have trouble keeping that separate, distinguishing skill from luck. Doesn't mean all success is lucky. Chapter 4: If I tell you that all terrorists are Moslems, people are likely to mistake it for "all Moslems are terrorists." "All tigers are killers" is not the same as "all killers are tigers." You need to make your own luck, as Pasteur said. Look at book owners: they diversify to try to diminish the problem they have of predicting which exact ones will succeed. Ludic and non-ludic games. Ludic games work within the rules, logical induction, assume the experimenter is asking the right questions--the rules of the game are clear, no ambiguity. In typical ecological uncertainty, by contrast, you are unsure about the rules, don't know the probability structure. Mostly Extremistan. No-limit poker, as opposed to limit poker (where you know the distribution). In the real world, you don't usually know the probabilities. Statistics books are all ludic games--the rules are very clear. Hayek's critique. An idiot savant would be very good at a ludic game. Economics is not to be confused with finance. Using Gaussian techniques in a non-Gaussian world, or equilibrium techniques in an (unknown) non-equilibrium world, can lead you to make errors. Knowledge as a vast, unknowable landscape. James Buchanan--we take things like prices as data and look for how they come about, but those prices and preferences emerge in part from the process itself. Sentient beings with imperfect information. Hayek's "Pretence of Knowledge." Pete Boettke. Shackle. People who discuss this become unknown. Bastiat, Seen and Unseen. The French edition of The Black Swan will have more on Bastiat and also Fouchier (?sp.). What is a good name for those who doubt in this way? Hayek, Popper as philosophers. Economists who think about these things used to be called political economists. The Theory of Moral Sentiments, by the founding economist Adam Smith, was a work of philosophy. If you say you are a "moral philosopher," it will cause most people to run away--will that do better for good conversation at a party than saying you are an economist? How about "empirical philosopher"?
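A quick arithmetic illustration of the tigers/killers asymmetry, with counts invented purely for the example:

```python
n_tigers = 4_000            # hypothetical number of tigers
n_killers = 1_000_000       # hypothetical number of all killers
tigers_that_kill = 4_000    # suppose literally every tiger is a killer

p_killer_given_tiger = tigers_that_kill / n_tigers   # = 1.0
p_tiger_given_killer = tigers_that_kill / n_killers  # = 0.004

print(p_killer_given_tiger, p_tiger_given_killer)
```

"All tigers are killers" can be true with probability 1 while "all killers are tigers" holds for only 0.4% of killers in this made-up example: the two conditional statements answer different questions.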
1:10:24 The Black Swan uses statistical arguments against statistics itself. Casino story: Running a casino sounds like an easy way to make money; but what about other problems of running a business? A tiger attack like the one on Siegfried and Roy, which can result in lawsuits and in having to find a new show? Risk can be outside your perception. The same casino had multiple problems not related to the tiger incident. Tunneling. The Gaussian distribution tells you how much data you need to identify it--self-reference: using the data, you first have to discover the distribution you need. Only with the Gaussian distribution can you do this and easily find the parameters. Power laws--cannot find the parameters easily. Chaos theory: the result was that you cannot build such models. N-body, n-billiard-ball problem: much harder to capture a model from the top down with large numbers of people than with only two. But that doesn't mean you cannot predict anything. Second problem: when you predict a variable for 25 years hence, your error won't be just 25 times your error from predicting 1 year forward. Third problem: psychological. Prediction is a kind of therapy. Have to look at your error after the fact. Forecast error is central to decision-making. Traveling to France, you know the size of suitcase you need for a given time of year; but you need a much bigger suitcase if traveling to Mars. Socrates, Apology. There are degrees in your ignorance about what you don't know. The Enlightenment. "I want to turn knowledge into action"--Marx critiquing Hegel--vs. Taleb--"I want to turn lack of knowledge into action." We should embrace the phrase "I don't know." We like experts, but not all domains have experts.
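A hedged illustration of why long-horizon forecast error is not just a multiple of the 1-year error, using the logistic map, a standard toy example of sensitive dependence on initial conditions (not a model from the episode):

```python
def logistic_trajectory(x0, steps, r=4.0):
    # Iterate the chaotic logistic map x -> r * x * (1 - x).
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000, 25)
b = logistic_trajectory(0.400001, 25)  # initial measurement off by only 1e-6

for t in (1, 5, 25):
    print(f"step {t:2d}: forecast error = {abs(a[t] - b[t]):.6f}")
```

The one-step error stays tiny, but by step 25 the two trajectories bear no relation to each other: the error compounds rather than growing linearly with the horizon.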