Luca Dellanna on Risk, Ruin, and Ergodicity
May 29 2023

Author and consultant Luca Dellanna talks with EconTalk host Russ Roberts about the importance of avoiding ruin when facing risk. Along the way, Dellanna makes the arcane concept of ergodicity understandable and shows the importance of avoiding ruin in everyday life.

RELATED EPISODE
Nassim Nicholas Taleb on Antifragility
Nassim Taleb, author of Fooled By Randomness and The Black Swan, talks with EconTalk host Russ Roberts about antifragility, the concept behind Taleb's next book, a work in progress. Taleb talks about how we can cope with our ignorance and...
RELATED EPISODE
Eugene Fama on Finance
Eugene Fama of the University of Chicago talks with EconTalk host Russ Roberts about the evolution of finance, the efficient market hypothesis, the current crisis, the economics of stimulus, and the role of empirical work in finance and economics.
Explore audio transcript, further reading that will help you delve deeper into this week’s episode, and vigorous conversations in the form of our comments section below.

READER COMMENTS

Trent
May 29 2023 at 8:47am

While listening to this podcast, I couldn’t help but think of the adage that Ross Brawn lived by when he helped run Ferrari’s Formula One team:  “To finish first, first you must finish.”

Ferrari’s culture had seemingly always been to build the best, most powerful engine.  Sounds good, unless the engine failed.  What Brawn drilled into the team (and yes, he had to say it more than once) was that you needed to first have a reliable car; once you had that, then focus on having the fastest reliable car in the field.

The results were impressive.  The Brawn/Michael Schumacher/Jean Todt era at Ferrari (1996-2006) was the most successful in the history of the team in terms of Constructors’ and Drivers’ Championships.

Joe
May 29 2023 at 2:57pm

It is interesting to consider the concept of “near misses” with regards to the COVID pandemic. You don’t get the near miss information if you almost get infected. How does this affect personal decision making and transmission dynamics?

Ben Service
May 29 2023 at 5:36pm

I work in an oil refinery (thankfully mainly in the commercial space and not in anything risky or safety related) and was reflecting on the Monsanto part of the discussion, where a worker gets killed in a plant.  OK, so you have to fly out and front up to the CEO within 24 hours, but then you fly back and get back to the business of making money.  You could make it more real by saying that if someone dies at your plant, we will kill one of your kids, or, less extreme, close the plant down; neither of which I think is a good idea.  I guess my point is that this would then drive you to not take any risk and probably shut your plant down anyway (because you don’t want to take the risk, or you put so many controls in place to reduce risk that you aren’t competitive any more).  As one of my managers said, the safest oil refinery is a closed oil refinery, but we are in the business of taking and managing risk.  Society needs oil refineries and chemical plants etc., and they all have a non-zero risk of killing someone, and even a non-zero risk of killing a lot of people, and not just people who work at the plant.  So how do we think about it?  Is this coin flipping from the society perspective, and is this OK?

Also, reflecting on the crossing-the-road-in-Israel discussion: yes, you can choose not to jaywalk, but you’re still not going to reduce the risk to zero (a driver running a light, lights malfunctioning, etc.).  How do you make the right trade-off?  What is a good way to think about it?

I intentionally don’t work in front-line operations any more (I was an operations supervisor for a period of time), because I can’t get my head around how to manage the risk of killing someone.  But I do realise that someone has to do it and make these trade-offs.  How do you pick the right people for that kind of job, and how do you incentivise them to do the “right” thing?

Luke J
Jun 1 2023 at 6:11pm

Re: jaywalking,

Is it really sound to reason that, because a driver might randomly run a light or jump a curb, crossing the street against the traffic laws is a similar kind of risk? It seems to me that the former is random and unpredictable, whereas the latter is reckless.

Robert M.
May 29 2023 at 11:02pm

I am trying to apply ergodicity to the mRNA vaccines. The centralized health and government organizations argued that the “average outcomes” were favorable, as if the costs and benefits to the population were an ergodic situation. But it wasn’t. It was a non-ergodic situation. Children had a near-zero chance of dying from covid, but a much greater chance of getting myocarditis or other maladies from the vaccine.

Certain adults had access to preventative treatments like Montelukast and Ivermectin that were more effective than the mRNA vaccines and without the adverse effects . . . so a non-ergodic situation!

Ben Service
May 30 2023 at 2:44pm

I’ll start by saying I don’t know the data about covid vaccines; there is so much info out there.  An argument for vaccinating everyone originally was the thought that it would reduce transmission; I think it turned out that the vaccines didn’t end up reducing transmission that much.  If this is true, then they should have changed course and let people do a different risk/reward calculation.

However, say the vaccine did greatly reduce transmission: how do we think about the trade-off then?  I suspect different societies would have different points of view.

Kaitlin Johnson
May 30 2023 at 12:13am

When one possible outcome is “game over,” you will not achieve the average outcome of all the other possible outcomes. What a great insight.

I just wanted to thank you for this podcast. I have listened since March 2018, when I was in 7th grade. This week I am graduating high school. I remember my first episode clearly: I was practicing my lacrosse skills by throwing at a wall and catching the rebound. I had become a fan of Jonah Goldberg somehow, and he was on to talk about his latest book. I don’t think I understood that much back then, but I learned how to keep listening even when I didn’t fully follow. I learned how to sit in my confusion and try to glean meaning from the little bits of comprehension. From that first show, I took away that humans are naturally conflicted beings, full of both evil and goodness, “crooked timber,” to take a line from Kant. That idea has certainly rung more and more true as I have come of age and seen more wickedness (but also more unbelievable kindness) of the human heart, both in myself and in everyone around me.

I like to think EconTalk did what the best English classes are supposed to do for adolescents— assist in their moral development through spontaneous conversation. So many little pieces of me come from this show that I don’t even realize. The careful way I speak? Probably from an EconTalk I heard on an 8th grade fishing trip about uncertainty and the brain. My appreciation for fiction that can truly make me feel like I’m inhabiting a different life? Probably something to do with the “Vampire problem,” and the many beautiful shows you’ve hosted about literature, such as episodes on Jane Austen or Dostoyevsky. And so much of my “random” knowledge that my teachers have commended me for— too many facts about truffle hounds, the market for goods in refugee camps and prisons, patents in healthcare, the market for ransom (?!), effective altruism— has come from this podcast. It has been my companion for many trips, runs, barn chores, and Monday mornings when I just wasn’t ready to get going yet. Thank you not just for knowledge but for wisdom. I know the hours I spent with this podcast will prove to serve me for many years to come as I move on to the next academic chapter of my life.

Russ Roberts
May 31 2023 at 11:46pm

Kaitlin,

Congrats on graduating. And wonderful to hear that EconTalk has been part of your journey in so many wonderful ways.

Kathy E
May 30 2023 at 3:56pm

Fascinating.  I’m interested in investment risk.  It’s too simplistic to say the return is the expected outcome and the variance is a good way to measure risk.  That works if one could make a given investment over and over and over, but we cannot.  We make it once and take the gain or take the hit.  We have a limited number of opportunities to invest, a finite time and finite bucket of money.

I’m not sure how to quantify risk for an individual investor but this interview gives some interesting ideas.

Thank you.





AUDIO TRANSCRIPT
Time | Podcast Episode Highlights
0:37

Intro. [Recording date: April 27, 2023.]

Russ Roberts: Today is April 27th, 2023, and my guest is consultant and author Luca Dellanna. This is Luca's second appearance on EconTalk. He was first here in February of 2022, talking about compulsion, self-deception, and the brain.

Our topic for today is his book, Ergodicity, a word I suspect many of you listening have never heard of. Despite the strangeness of the title, I think it's an incredibly interesting concept, an incredibly important concept, and a lovely book. And, while Luca concedes that many of the ideas in his book come from others, Nassim Taleb and Ole Peters, for example, he has managed to write a superb and completely accessible treatment of a very complicated subject.

Luca, welcome back to EconTalk.

Luca Dellanna: Thank you, Russ, for having me again here.

1:33

Russ Roberts: So, let's start with your cousin, who is a skier. In your book, you talk about how, when you were 14, you would ski with your cousin, who was six. And, what happened?

Luca Dellanna: Yes. He was very, very fast and much better than me, because he grew up in the French Alps, in a place where you start skiing probably before you start reading. And, he was very good. He eventually did the world championship for his age bracket. And then, very sadly, one injury after the other; he had to end his professional career. And, from there, one lesson I learned is that it's not the fastest skier who wins the race, but the fastest one of the ones who finish the race.

Russ Roberts: Meaning that the skiers that we see are the ones who have escaped injury, or serious injury, and are allowed to continue their career. This is an obvious point: the skiers who win world championships as adults have obviously avoided career-ending injuries. But, I think we tend to think of those as just luck. Some people get lucky; they don't get hurt badly, and they manage to keep skiing. Other people get bad luck, and they are unable to compete after a while because of the damage to their body.

But in fact, I think you have something more profound to say.

Luca Dellanna: Yes, apart from some factors, of course--genetics and everything--there is also something that has to do with time horizons and risk-taking. You might optimize the way that you ski to win the race, and that will lead you to take too many risks. On one side, those risks are what brings you to win the race; but on the other side, they are also what can bring you to have a career-ending injury. And of course, that's bad. And, if instead you want to win a championship, you of course need to manage your risks a bit differently.

And, one thing which is very tricky is that we think, 'Yes, Luca, but there are a lot of instances in life in which winning or performing well at a task is something that you do over a very short timeframe.' And, my answer is: It looks so, but to participate in that race, or in the task, you had to have some level of skill; and to get to that level of skill took long-term practice. And, unless you had the time and you avoided injuries so that you could train for long enough, you wouldn't be able to be in that race in which you can participate as if it were a short-term endeavor.

Russ Roberts: And, one of the names for this is survivorship bias. And, the way to think about survivorship bias is that we don't see the people who have lost, who have dropped out, who have been damaged, who have been harmed, who became addicted to drugs because of their obsession with winning every race, who took unhealthy risks. And, one answer to that is, 'Well, but skiing is dangerous, and it's always risky. So, are you saying I shouldn't ski?' How do I understand the lesson from your cousin?

Luca Dellanna: Well, the lesson is more evident when we talk more concretely about what you should do. What's the optimal level of risk-taking to maximize your wins?

And, the answer depends: how long do you want your career to be? How long is the period that matters?

Let me give a very concrete example. Imagine that my cousin is an excellent skier who wins 20% of the races in which he participates, but he also takes a lot of risks, and he breaks his leg in 20% of the races in which he participates. And, now a question: How many races is he expected to win on average? And the answer is: It depends on how many races he runs. If he does one single race, the average number of races that he wins is 0,2 [0.2--Econlib Ed.]: one race multiplied by 20% makes 0,2. However, if he runs two races, he can only participate in the second race if he didn't break his leg during the previous race. That means he has only an 80% chance of participating in the second race; and the 20% chance of winning multiplied by the 80% chance of participating makes 0,16 [0.16--Econlib Ed.] expected wins for the second race.

And, if we average it over two races, 0,2 + 0,16 makes an average of 0,18 [0.18--Econlib Ed.] races won. So, we see that the expected number of races that my cousin wins depends on how long the time horizon is.

So, we can reverse that: depending on the time horizon that you want to have, the same strategy yields different outcomes, and therefore different strategies might be optimal for different time horizons.

Russ Roberts: I just would point out that Luca is Italian; he is recording this from Turin, Italy. And, in Italy, and I think in Europe generally, when you want to have a decimal point, you put a comma. But, in America and elsewhere it's a point, not a comma. So, it's 0.2 or 0.16, not comma, but that's a little bit of translation for the non-Italian listeners.
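For readers who want to check the arithmetic, here is a minimal sketch in Python of the calculation above, using Dellanna's hypothetical 20% figures (the function name is ours, for illustration):

```python
# Hypothetical numbers from the example: the skier wins 20% of the races
# he starts, but suffers a career-ending injury in another 20% of them.
P_WIN = 0.20
P_INJURY = 0.20

def average_wins_per_race(n_races: int) -> float:
    """Expected wins per race over n_races, accounting for ruin."""
    total_wins = 0.0
    p_still_racing = 1.0
    for _ in range(n_races):
        total_wins += p_still_racing * P_WIN
        p_still_racing *= 1 - P_INJURY  # must stay uninjured to race again
    return total_wins / n_races

print(average_wins_per_race(1))  # 0.2
print(average_wins_per_race(2))  # 0.18 -- matches the arithmetic above
```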

7:57

Russ Roberts: I think the other way that you share that idea is very powerful. If I tell you you have a 20% chance of winning your races and you run 10 races, and I say, 'So, how many races are you expected to win?' the answer of course seems obvious: it's two. But, that assumes that you finish all of the first nine before you get to the 10th. And, we often just take that for granted.

And, if I had to put the biggest lesson of this book into a single phrase, it would be: You only get the average if you're allowed to continue to play. Even then, you might not get the average, because in a small number of plays you might just be lucky or unlucky; but you have no chance of getting to the average reliably if you're not allowed to play the game.

And, this is an obvious point, but I have found it profoundly helpful in thinking about risk, uncertainty, and decision-making. The fundamental lesson--and Taleb says it his way, and you say it a little bit differently--Taleb says you need to avoid ruin. You mention ruin also, but you also say you need to avoid game-over. If you can't play the game, you're not going to win. And again, this is a cliché. It's a truism. You could say, 'Everybody knows that.' But, not everybody remembers it. And, the purpose of this conversation, and I think of your book, is to help you remember it.

Luca Dellanna: Exactly. And, the reason we sometimes fail to remember it is, again, survivorship bias. We look at the people around us who are wildly successful, and if we aim for that success, the reality is that we need to take risks. And, not just the kind of risk which is good--which has only upside and very low downside. If we aim to be, for example, the richest person in the world, or the most successful entrepreneur in the world, we need to take risks which come with the possibility of game-over. And that, of course, means that the more we aim to be number one, the more we need to take risks that will decrease our most likely outcome. And, that is a trade-off we need to be aware of.

And, for most people, what we want is not to be number one, but to have a very good distribution of results so that we have a very good chance of ending up, maybe not number one, but in the top 10% of people. And, that's a different strategy.

Russ Roberts: And, you might be listening at home saying, 'Well, I don't want to be number one: that's too much. I don't take those kinds of risks.' But, the fact is that if you want to be successful--not just number one; even not as successful as the top 10%, but the top 20%, forget the proportion--if you just want to be successful, much of life--and this is, I think, a profound lesson--has the characteristics of the kind of risks we're talking about.

11:23

Russ Roberts: And, let's turn to Russian roulette, which is again seemingly an irrelevant experience that most of us have never even thought about, and I think it's a very useful way to think about risk and life. So, explain Russian roulette and the kind of mistakes people make when they think about it.

Luca Dellanna: Yeah, so Russian roulette is a gambler's game in which--of course, don't try it at home--you take a gun and you put only one bullet in a cylinder that has places for, say, six bullets; and then you randomly spin the cylinder so that you don't know whether, when you pull the trigger, the bullet will fire. And, you point the gun at yourself, and basically you have a one-in-six chance of dying, or a five-in-six chance of surviving and winning a prize. And, the question is: What's the average expected win from playing Russian roulette?

And, the naive answer is five-sixths of the reward. For example, if the reward for playing is $6,000, you have a five-in-six chance of surviving and winning. So, you will think, 'Of those $6,000, I expect to collect $5,000.' And, that is true only if you play Russian roulette once.

But, if you play Russian roulette many times, the average outcome is not that you win $5,000 per play; the average outcome is that you end up dead.

And, there are a lot of situations which play out like Russian roulette--not that you end up dead, but you end up in some form of game-over. Maybe you invest in something that goes bankrupt, or some negative event happens--your co-founder commits fraud, or the market changes, or you invest in something and then Covid comes and your business goes--like, a lot of things.
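The asymmetry Dellanna describes is easy to verify numerically. Here is a minimal sketch using the $6,000 prize from his example (the function name and the 100-round horizon are ours): the one-shot expectation really is $5,000, but survival compounds, so expected winnings over many rounds flatline far below the naive 'per-play' average.

```python
PRIZE = 6_000
P_SURVIVE = 5 / 6  # one bullet, six chambers

def expected_total_winnings(rounds: int) -> float:
    """Expected total prize money over `rounds` plays; dying ends play."""
    total = 0.0
    p_alive = 1.0
    for _ in range(rounds):
        p_alive *= P_SURVIVE       # must survive this round to collect
        total += p_alive * PRIZE
    return total

print(expected_total_winnings(1))    # 5000.0 -- the naive one-shot average
print(expected_total_winnings(100))  # ~30,000 -- not 100 x $5,000 = $500,000
print(1 - P_SURVIVE ** 100)          # ~0.99999999 -- you are almost surely dead
```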

Russ Roberts: The example you give in the book that I like is: You might have a very ambitious, demanding job, and it pushes you to work long hours; and you're good at it, and you're fine most of the time. But, there comes a stretch, perhaps, where you jeopardize your marriage, or your mental health, or your physical health because you're pushing so hard. And, most of the time--five-sixths, maybe it's even higher--you're successful: you get promoted, your salary goes up. But, if you're going to play for the long haul--which is a really good idea, I think, for most of us--you'd better be aware of those other risks. Which means there are some times when you should take a weekend off, or not work 18 hours that day, and so on. Each time that you're challenged to meet a deadline, and you work those 18-hour days, you think, 'Well, just this time.' But if you continually do that, you're playing Russian roulette.

Luca Dellanna: Exactly. And, that's particularly true because, for example, if you have a problem with your marriage, it's very likely to be irreversible. And, when you have a risk of ruin which is irreversible, you cannot average them out.

For example, in a relationship, it's true that the more time you spend with a person, the more you can deepen the relationship and solve problems, but if you have a problem that causes the relationship to end, then you cannot recover it.

And, this irreversibility is what differentiates risks that you can take from risks that you must be very careful about taking.

Russ Roberts: It's another way to say ruin or game-over. Irreversibility means you can't repair it by relying on the law of large numbers.

So, even though the game is in your favor--and this is, I think, the reason these kinds of risks are so seductive--Russian roulette, five out of six; you could even play a version with 99 out of 100--a much bigger cylinder for the bullets--and you think, 'Well, it's a long shot. I'm okay.' But, the problem is that you will not get that average return if you lose once. One loss. It's not like, 'Oh, okay, I lost. I had a bad day.' No. It's over. You're out of the game. You're dead, or your marriage is ruined, or you're fired because, under the pressure to perform, you cut corners and do things that are, say, unethical.

It's a very deep idea that actually I use all the time when I think about my own risks that I face. It's a very simple idea. The idea is that the average return is not all that counts. It matters whether you can continue to play. But somehow that's sometimes just too hard to remember. So, try to remember it, folks.

Luca Dellanna: Exactly. And, this is a good objective to aim for. Sometimes when we talk about risk management, we try to balance the risk and think about average cost--like: What's the cost? What's the benefit? How much will it cost me to manage this risk? And, you try to see the cost-benefit.

But, another approach is just to ask yourself how you can maximize the time that you remain in the game. Or, in the case of games which are bounded--for example, maybe your business or your career: you are not interested in having it for longer than 50 years--you nevertheless want to maximize the chances that for those 35 or 50 years, you can stay in the game. And, that's another perspective on risk management.

Russ Roberts: I would just mention that I took this job here as President of Shalem College with a five-year contract. I think by law in Israel you're not allowed to be the President of a college for more than 12 years. So, one of the things that the Board of my college should worry about is that I might make decisions whose consequences are going to come in 15 years, or even past my current contract--because they don't know whether I'm going to stay and be happy to renew. And to me, the ethical decision is to treat your situation--in this case, my job--as if I had a lifetime contract and would serve, quote, "forever"--as long as I live. And certainly, the Board should be aware of the incentives that I face. If I'm not behaving that way, they should keep an eye on me, because there are many, many things I can do to push risk into the future versus the present. And, there are many levels of effort I can choose in order to make the long run successful versus the shorter or medium run. And, my natural incentive will be to avoid incurring costs that only benefit the college in the long run. But that would be wrong. So, I want to be aware of that incentive.

So, you can think of it both as an ethical tool and a management tool. For me personally, I should act as if I have skin in the game beyond my formal tenure, and the Board should be aware that that might be hard for me. I like to think it's not, but they should be aware of that.

Luca Dellanna: Exactly. And, there are plenty of situations in which you also have the reverse. I'm thinking, for example, of some consulting companies where part of the model is almost the expectation that the employee stays for just a few years; and therefore there is pressure to make them work overtime, taking risks with their health and their marriage which won't materialize during their expected tenure. So, there should be, for example, an ethic on the side of the company to behave as if it expected its employees to stay for their whole career and couldn't replace them, up to some measure.

So, this is a very good framework that you can apply in a lot of situations where you have two parties with two different time horizons, and you want each party to care about--or at least not to jeopardize--the time horizon of the other.

20:03

Russ Roberts: And now let's move to the distinction between individuals and populations. So, if you play Russian roulette once, you're most likely going to survive and win a prize. If 100 people play at once, roughly 16 or 17% of them are going to die, because you now have the law of large numbers at play.

And, part of this concept--we haven't named it yet, by the way; I've loved that so far we have not used the E-word, or the non-E-word--part of the power of this concept is to think about populations versus individuals. So far, we've talked about short-term versus long-term: you only get the long-term returns invoking the law of large numbers if you stay in the game. When you start talking about populations versus individuals, you see that same kind of potential difference.

Luca Dellanna: Yeah, exactly. Like, one way that you can invoke the law of large numbers is by having a population.

So, for example, Russian roulette, if played by an individual, has this problem of irreversible losses that moves the expected average win from, let's say, $5,000 to $0 over the long term. But, if you see it from the point of view of a hypothetical company that employs Russian roulette players and can just hire new people when someone dies, for that company Russian roulette will still almost always have an expected win of $5,000 per play.

Russ Roberts: Yeah, the only problem is that, as you mentioned before, if they get a reputation for overworking their employees, they may have trouble getting new players to go through this process.

But, certainly you're right. I think many consulting firms, many law firms, and others put extremely high demands on their employees over a very short period of time. They do push the costs into the future. And some employees burn out or have mental-health or physical-health issues in the meanwhile; the firms understand that's part of the cost of doing business. Those employees wash out; they can't handle it. The others survive and thrive, at least in the short run.

But, I think it's a very useful way for thinking about society-wide risks when you're making a distinction between the risks to one person versus the risk to the society or the world.

And, right now, a lot of us--we've done a bunch of episodes on ChatGPT, and if you think that artificial intelligence [AI] threatens humanity's existence, even with a very small percentage, it is prudent to be extremely cautious with respect to it.

Luca Dellanna: Exactly. I think that artificial intelligence in particular is slightly different, because there is the problem of competition between countries, for example. So, you can say, 'In my country I have the power to be very cautious, but what if this means that an adversarial country then gets very powerful AI and can take over?' So, it's a bit more complex in the case of AI.

But it applies to a lot of other risks where you don't have competition. I'm thinking, for example, about virus labs: What if you have some kind of research with a 99% chance of producing some medical advancement, and a 1% chance of causing a deadly pandemic? The chances from that 1% keep accumulating. You might think, 'One percent in one year is nothing.' But, if you ask yourself, 'What about my lifetime?', the cumulative probability that you have a pandemic becomes very high. Now, I don't remember the number, but I think it's in the 50-to-70% range. And so, we want to think about the long term.
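The back-of-the-envelope arithmetic behind that range: a risk with probability p per year accumulates to 1 - (1 - p)^n over n years. A quick check (the specific horizons here are our illustration, not Dellanna's):

```python
# Probability that a 1%-per-year event happens at least once in n years.
for years in (1, 10, 70, 120):
    print(years, round(1 - 0.99 ** years, 3))
# 1 -> 0.01, 10 -> 0.096, 70 -> 0.505, 120 -> 0.7
```

A 70-year horizon lands at about 50%, and 120 years at about 70%, consistent with the range Dellanna recalls.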

24:11

Russ Roberts: And also, there's the additional point that, when you're talking about low-risk activities--low-risk in terms of probability but not in terms of outcome; again, it's hard to keep these distinct: a low chance of a bad outcome, but when the outcome does occur, it's very bad--one of the problems with those kinds of processes, either in your personal life or in a societal setting, is that the first time you play Russian roulette, you probably are going to be okay. And, each time that you're okay, it lulls you into thinking, 'I'm safe.'

I'll give you a trivial example. In Jerusalem, there's a serious fine for jaywalking. I decided when I moved here that I would not jaywalk at all--not because of the fine, but because the traffic patterns here in Jerusalem, and the intersections where you're typically crossing as a pedestrian, are very unusual. It's a little bit like when you go to England for the first time, and the traffic is coming from a different side. That's another, even more dramatic, example: you should not jaywalk, should not go against the light, when you visit England--and even when you've moved there for a while.

So, here in Jerusalem I just don't jaywalk even when I can see, 'Oh, there's no cars coming,' because I realized early on that I'm not aware fully of where the cars can come from, and what I think looks safe actually isn't.

Now, if you didn't follow that rule, most of the time you're not going to get hit by a car, because you do look around. But, there's going to be a time--and again, after the passage of days and then weeks when you don't get hit--you start to think, 'Well, I guess I'm pretty good at this.'

You're not. You probably are not better at it than you used to be. You've just been lucky. And so, you've actually put yourself in quite a bit of danger.

And, there are many, many things like that in life where the daily interaction of you with the risk, especially when it's unlikely or very low probability, can fool you into thinking that you've mastered the danger.

Luca Dellanna: Exactly. And, if I can jump on this, there is a very good framework for understanding these risks, where you have low-probability events which can cause a very big problem.

In manufacturing risk management, there is a principle called the pyramid of risk, an idea that comes from the 1930s, from an engineer who realized that, in general, for each deadly accident there are a few accidents in which an injury occurred. And, for each injury accident, there are a few incidents in which no injury was caused. And, for each accident with no injury, there are a few near-misses, where something almost fails but no one is injured.

And, that creates this pyramid shape, with the deadly accidents at the top--very narrow--and a lot of near-misses at the bottom.

And, the principle is: You do not evaluate whether your behavior is safe based on the injuries at the top of the pyramid. You look at the bottom of the pyramid. If, when you cross the street, you have some near-misses--for example, a car that honks at you--that's a signal that you should treat as if you had been hit by that car, and adjust your behavior. And of course, there are some limitations, but this framework is more useful than not.

28:01

Russ Roberts: That's a fantastic example. I used to share an office at Washington University with Dick Mahoney. Dick was CEO [Chief Executive Officer] of Monsanto, and they have a lot of chemical factories around the world. They're dangerous, and they have a lot of safety procedures in place. And, he had a rule--this is not exactly your point, but it's related--that if anyone died in a factory, the manager of the factory had to be in his office within 24 hours. And, the factories are all over the world; the office is in St. Louis, Missouri. So, you could imagine a horrible tragedy: a worker is killed in an accident, and now the manager gets on an airplane and flies to Missouri. And, I think this is true--it could be he exaggerated, but I doubt it--he told me that when the manager would walk into his office, he would ask him, 'Why did you kill that worker?'

It's a horrible question. And of course, your first reaction is, 'I didn't kill him. It was just bad luck.'

And, Dick would say, 'But, surely there was a procedure you could have put in place that might have prevented his death.'

And so, it's not exactly the same point. And then, of course, the question is, 'What do you need to do now to prevent the next one?'

But, this idea of near misses is quite profound. When one of those things does happen, you are jarred and scared. But then it quickly fades away, unless you make a mental effort to pay attention to it. So, I think it's a lovely example.

Luca Dellanna: Yeah, thank you. And, I relate very much to the story you mentioned about Monsanto because I've worked for a few years in a chemical company, DuPont, and they also had a very strong culture of safety.

And, just to give you a few ideas: One was that before each meeting that included more than two people, the rule was that we would begin by talking about safety. And, in particular, more often than not, it was expected that at least one person in the room mention a near miss that they had witnessed or that had happened to them. And, that's a great way, one, to reinforce the principle: We care about near misses.

Number two, you can imagine a meeting with 10 senior executives. Everyone is very busy; the time cost of spending one minute talking about a near miss is very high. And, because it's very high, it is a very costly and therefore powerful signal that we care about safety not just in words: we believe that the investment in safety is worth it.

And then, the third thing which they did very well--of course, I'm talking about the time I was there, and maybe only about my division--is that safety was an objective that was cascaded down the whole hierarchy. Which means that I, the individual contributor, had an objective for my personal safety. My manager had an objective for the safety of his team. My manager's manager had an objective for the safety of everyone down his hierarchical line. The country director, for the safety of everyone in the company.

And the rule was that if there were too many injuries--or injuries of some severity; I don't remember exactly how it was calculated--basically, the injuries that happened down your hierarchical line affected your bonus and chances of promotion, even if there were multiple layers below you. And that was, again, a very powerful signal and incentive that we should own the safety of everyone below us.

Russ Roberts: As an interesting example, in sports--in general, sports only punishes actual injury. Meaning: if a hockey player lifts a stick above his waist and slashes an opponent, he'll get a serious penalty; he might be suspended for a game or more. Similarly, in baseball, if the pitcher throws a ball and hits a player in the head, the batter gets to go to first base, and sometimes the pitcher can be ejected.

In general, they do not punish near misses, but occasionally they do. Actually, in baseball, I think if a pitcher under certain circumstances gets close to a batter's head with a pitch, he is sometimes thrown out of the game. So, it's a great way to help a person think about it, and think about risk.

33:21

Russ Roberts: Let's talk about the pandemic, and the difference between, say, national averages and local averages. Tragically, you were in a part of Italy, Northern Italy, where corona [COVID-19] hit very hard at the beginning of the pandemic, and a lot of hospitals were overwhelmed. Why do you tell that story? Explain its importance.

Luca Dellanna: I tell this story because I actually checked the occupancy percentage of hospital beds in Italy in that very week in which dead bodies were piling up outside the Bergamo hospital. And the occupancy percentage was very low across Italy.

So, we see an average number which says that the system functions well, and yet it doesn't matter, because if the system fails locally, you might still have a lot of extra deaths even if the average looks fine.

And, for example, what I say in the book is that you do not care about whether the system works on average; you care about whether the system works for you. You do not care about the average; you care about local points of failure.

And, one takeaway is that if you are responsible for safety or for other kinds of risk management, you do not want to look only at the average numbers that you collect across everything, but you also want to care very much about spikes--about local spikes.

Which also means that in your risk management approach, you shouldn't think only about system-wide failures; you should also think about what could cause local failures, and how you can prevent them.

And, one very common way to think about local failures is this: Local failures happen when the system is not able to redistribute load fast enough. There were enough hospital beds in Italy; the problem was that we couldn't take people who were sick in Bergamo and move them to other parts of Italy--for different reasons, of course. But, something you want to think about is this capacity to adapt, to adapt fast, and to redistribute load.

If you look at construction science, most of the strongest materials are strong because they are able to redistribute load across the whole surface. And, that's something that you might want to think about when talking about systems.

36:14

Russ Roberts: One way to think about this, obviously, is even more trite than some of the things we've said so far. And, that is: You don't just care about the average, you care about the variance--another concept in statistics that measures variability.

So, if every time you tossed a coin, you could get a half a head, rather than a head or a tail, you'd get the return to heads very reliably. But you can't.

So, you can have runs: five, 10 heads in a row. And, the point, though, that I think, again, I want to emphasize here is avoiding ruin.

So, everybody understands that variance matters: the variance of your portfolio, the variance of outcomes in all kinds of different parts of your life. The challenge is to avoid the left-hand tail, the parts of the set of outcomes that are not just unpleasant, but are devastating.

And, those two things are very different. They seem like a continuum, but there's a point where unpleasant gets more unpleasant, and more unpleasant, and becomes so unpleasant that you can't play anymore. And that means you can't bounce back. And, that's the game-over point.

So, of course everybody understands that the average number of beds isn't what you really care about. You care about the number of beds in the hospital that you're being taken to. It's not the average: it's the actual number in that hospital versus the number of patients.

But, I think often, again, people forget; and when they assess quality of performance, they only look at the average because it's pretty reliable in some settings. Not all.

Luca Dellanna: Exactly. And, since you're talking about variance, a very good question is not just, 'What's my variance?' but, 'Let's assume there is a moment of high variance. What happens then?' The answers to these questions are usually very illuminating.

Russ Roberts: Well, the way we often say it in statistics or uncertainty: there's a spike. There's a sudden unusually high outcome or an unusually low outcome.

And, another thing I learned from Nassim Taleb is that how big that spike is, is not necessarily determined by how big it has been in the past. One of the other common mistakes people make is to say, 'Well, let's look at the history. What's the worst year for the stock market, or the worst five years, or the worst 10 years? As long as I can survive that, I know I'm okay.' And, they forget that the next bad time might be worse than the worst time of the past. Also very hard to remember.

Luca Dellanna: Exactly. One of the main points that Taleb teaches is that, because in the tail you have very few data points, you also have incredible uncertainty about what the tail looks like. And, that's why I usually advise people not to try to figure out what the tail may look like, but to assume that the tail looks very bad and ask, 'What about it? How can I prevent it?' or, 'How can I mitigate the risk, assuming that it happens?'

39:38

Russ Roberts: And, we're going to talk about that in a minute. So, the lesson here isn't just: sometimes life's really unpleasant and you can't play anymore. I think it's a good time now--we've been talking for almost 40 minutes--to talk about the words 'ergodic' and 'ergodicity,' and 'nonergodic' and 'nonergodicity.' And, you do not define the term, or I think even really use it in any real way, until quite far into the book--a very wise and thoughtful strategy, in my view. A bit of risk-taking there: if you started off talking about ergodicity, people would stop reading your book, and it would be game over for Luca Dellanna. So, let's talk about the difference between ergodic and nonergodic processes.

Luca Dellanna: So, one way to understand ergodic versus nonergodic is this: if the outcome of an activity done many times by one person corresponds to the outcome of the activity done once each by many people, then the activity is ergodic; otherwise, it's nonergodic.

Another way you can think about it: if the expected outcomes--the average expected outcomes--change with, or are influenced by, the time horizon, then it's nonergodic.

Russ Roberts: So, let's take the first one. So, if I flip a coin 1,000 times, and 1,000 people flip a coin once, we're going to get a very similar number of heads, something around 500. It could be a little more, a little less because there's still a random element. The law of large numbers--it requires even larger numbers than 1,000. But, you're very, very, very, very, very unlikely to get a number like five or 995 heads. It's going to concentrate quite intensely around 500 if it's done 1,000 times.

That's not true for Russian roulette. In Russian roulette, a person who plays 1,000 times is going to die; that whole population--you--is not going to make it. But if a thousand people play Russian roulette once, about 17% die, and 83% collect the returns of the game. And so, those are nonergodic processes.
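A minimal simulation of the two cases, reusing the $6,000 prize from earlier (the function names are ours): the coin flip passes the ergodicity test--the ensemble average (many people, one flip each) matches the time average (one person, many flips)--while Russian roulette fails it, because the individual's repeated play ends in ruin.

```python
import random

# Ergodic: 1,000 people flip once vs. one person flipping 1,000 times.
ensemble_heads = sum(random.random() < 0.5 for _ in range(1_000))
one_person_heads = sum(random.random() < 0.5 for _ in range(1_000))
print(ensemble_heads, one_person_heads)  # both cluster tightly around 500

# Nonergodic: Russian roulette with a $6,000 prize per surviving round.
def survives_one_round() -> bool:
    """True if the player survives this round (5/6 chance)."""
    return random.random() < 5 / 6

# Ensemble: 1,000 people play once; ~83% collect, ~17% die.
survivors = sum(survives_one_round() for _ in range(1_000))
print(survivors / 1_000)  # ~0.83

# Time: one person tries to play 1,000 times; ruin arrives early.
winnings = 0
for round_number in range(1_000):
    if not survives_one_round():
        break  # dead: the law of large numbers never gets to rescue you
    winnings += 6_000
print(round_number, winnings)  # typically only ~5 rounds survived
```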

The book really should be called Nonergodicity, because that is really what we're most interested in. And, I think, again, one of Taleb's deep insights is that so much of our understanding of risk and probability comes from coin flipping. And coin flipping--just in and of itself, setting aside the issues of ruin that we're talking about here--is something that's very intuitive after you've done it a lot. And, you've gone to school, and you've heard a lot of examples about coin flipping, or cards--the probability of getting a pair in poker, or playing blackjack. Those are very different games from most of the games of life where risks and decision-making come into play.

Luca Dellanna: Exactly. The question that I get asked very often is, 'Luca, how do you determine whether something is ergodic or nonergodic?' And my answer is: if we are talking about real life, it's always nonergodic. There is no single process I can think of in real life which behaves ergodically, mostly because there are a lot of things which can happen and go wrong outside of the process itself. Take playing poker, for example, or blackjack. If you think it's an ergodic process as long as you place your bets carefully, you're excluding the chance that one day you find a table where someone is cheating, or the possibility that one day you need your money for medical expenses, or a lot of other things. Nothing in real life is ergodic. The big question is: How nonergodic is it? Or, for concrete purposes: How can I make it more ergodic?

Russ Roberts: How can I reduce the variance? How can I reduce the risk of ruin? How can I push my own returns--not the market's, but my own--closer to the average? And of course, as I was saying that, I was thinking of index mutual funds. Index mutual funds are a way to avoid the kind of ruin that occurs if you pick a handful of stocks, even if you're a genius, because you could be wiped out.

Luca Dellanna: Exactly. And, let me give an example from a use case which I have discussed a few times. Someone who works in tech at a startup tells me that they have a lot of stock options in their company, which are worth a lot because the company looks very good and maybe will become the next Google or something like that. And, they say that they know they should diversify, but for them it doesn't make sense to diversify, because any other investment will have a lower expected return. Which, on one side, is true.

But, the question is: What's the distribution of returns? Because, if you keep all your money in the stock options of your company--which, by the way, also provides your salary--what happens is that you have a certain number of possibilities where you become a millionaire or maybe even a billionaire, and then you have a certain other number of possibilities where you lose almost everything.

Conversely, if you take half of your stock options and you invest them, and you find some way to diversify them with an index fund, you reduce your average expected returns, but the distribution of those returns is such that almost certainly you will end up a millionaire.

And, the question is: What do you really want? And, for most people the answer is, 'Yeah, actually, I want to make sure that I end up very comfortable with near certainty.' And, that's another example of why you do not want to look at the average; you want to look at the distribution of outcomes.
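A toy Monte Carlo of this trade-off. The payoff numbers below are invented for illustration (they are not from the book): the concentrated position has the higher mean, but the half-startup, half-index split almost never leaves you with nothing.

```python
import random

def startup_payoff() -> float:
    """Invented toy model: 10% chance the equity 50x's, else it goes to zero."""
    return 50.0 if random.random() < 0.10 else 0.0

def index_payoff() -> float:
    """Invented toy model: a diversified index roughly doubles, small spread."""
    return random.uniform(1.5, 2.5)

N = 100_000
all_in = [startup_payoff() for _ in range(N)]
split = [0.5 * startup_payoff() + 0.5 * index_payoff() for _ in range(N)]

print(sum(all_in) / N, sum(split) / N)  # means: ~5.0 vs. ~3.5
print(sum(x == 0 for x in all_in) / N)  # ~0.90: chance of ending with nothing
print(min(split))                       # >= 0.75: the split has a hard floor
```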

Russ Roberts: I think I may have given a not-very-good example when I talked about picking five stocks. If one of the five stocks goes bankrupt, you're fine. If all five stocks go bankrupt, you're not fine. If two of the stocks go bankrupt and you need money for a medical procedure, you're cooked.

But, the more interesting example is usually where you've borrowed money to finance your investments, and the bill comes due, and you can't cover it; and then you're done. You don't get the average of the four stocks that make it, and the one stock that doesn't. And so, again, I think it's important to keep in mind those different examples.

46:52

Russ Roberts: I want to turn to the strategies for coping with these kind of issues. And, you make the point a number of times in the book that, as you just said a minute ago, you can't avoid nonergodic processes. They're everywhere. They're really part of your life. So, you can't say, 'Oh, I'll just stay away from those.' They're everywhere, in various amounts. And, the question is, what might you do to protect yourself?

Luca Dellanna: So, some strategies. One is redistribution, which means that every now and then you take some of the investments that went very well, or that look the most promising, and you invest them somewhere else, so that even if the ones that were good run into a problem, you are not in a game-over situation. And, I mean this both literally, with investments, and more figuratively. For example, you take some of your wealth to build some relationships, or to have some base--I don't know, a house, or something that you can rely upon--these kinds of things.

Another thing that you want to do is to never go all-in on a bet, no matter how good the bet is. People who know about gambling usually know a concept called the Kelly Criterion, which is basically the idea of: How do I make betting more ergodic? The Kelly Criterion says: You never bet everything; you always bet only a fraction of your wealth--how big a fraction depends on how good the bet is--so that you can keep playing bets for a very long time, and you maximize the chances that, over the long term, you manage to reap the expected average.

However, I just came across a very good point from Harry Crane on the Kelly Criterion, which says that the Kelly Criterion is too aggressive, because it assumes that you know your edge--that you know the real odds of the bet. In real life, you don't really know that. The only way you can know whether you have an edge on a bet is by playing the bet multiple times and seeing whether your returns are actually superior to average. And, to have that certainty, you need to play the bet many times; and, to be sure that you can play the bet many times, you need to bet even smaller fractions on it. So again: Don't go all-in; instead, try to maximize the number of times you can play.
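For reference, here is the standard textbook form of the Kelly Criterion (it is not spelled out in the conversation): for a binary bet with win probability p, loss probability q = 1 - p, and net odds b (you win b dollars per dollar staked), the optimal fraction is f* = (bp - q) / b. A minimal sketch, including the 'bet smaller than full Kelly' haircut that Crane's point motivates:

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly bet fraction: f* = (b*p - q) / b, floored at zero."""
    q = 1.0 - p
    return max(0.0, (b * p - q) / b)

# Even-money bet (b = 1) you believe you win 55% of the time:
f = kelly_fraction(p=0.55, b=1.0)
print(f)      # ~0.10 -> full Kelly says bet 10% of bankroll
print(f / 2)  # ~0.05 -> 'half Kelly': insurance against overestimating your edge
```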

49:55

Russ Roberts: So, I don't know if I have told this story before on EconTalk, but I think you'll like it, Luca. It's about a professor of decision-making at Stanford, Ronald Howard. I heard the story from one of his students, and then I contacted him--he's retired, but I tracked him down and verified at least the basics of the story. It would go like this. He would give an exam--let's say it was a true/false exam. You would put down your answer, true or false, but not just true or false: you would also put down the probability that your answer was correct. And, your score depended on whether your answer was correct and how confident you were in it.

The problem was that the loss function was very steep. If you said you were 100% sure of the answer, and your answer turned out to be wrong, you would get a negative infinity on the exam. And, negative infinity is a beautiful example of game over: it doesn't matter how well you do on the final; if you get a negative infinity on the midterm, it's still going to average out to an F.
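The exact rule Howard used isn't spelled out here, but the behavior Roberts describes matches a logarithmic scoring rule: your score is the log of the probability you assigned to the correct answer, so asserting 100% and being wrong means log(0), i.e., negative infinity. A minimal sketch of one standard version:

```python
import math

def log_score(p_on_correct_answer: float) -> float:
    """Log scoring rule: ln of the probability assigned to the true answer."""
    if p_on_correct_answer == 0.0:
        return float("-inf")  # put 100% on the wrong answer: game over
    return math.log(p_on_correct_answer)

print(log_score(0.99))  # ~-0.01: nearly certain and right -- tiny penalty
print(log_score(0.50))  # ~-0.69: a coin-flip answer -- moderate penalty
print(log_score(0.0))   # -inf: fully confident and wrong -- an automatic F
```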

So, he told people--he begged them--'Do not put 100% on any question.' And now, you're thinking about it, and you think, 'What's the capital of France? It's Paris. I can put 100%; it's okay.' But, he said, 'Don't do that.'

And, the student told me that some student put 100%, was wrong, and flunked the class.

So, I asked Professor Howard if that was true. He wasn't sure about it.

But, what I found most interesting about this example--two things, actually--one is that Ron Howard told me that when he would do this at other universities, when he would visit somewhere, he would be told he couldn't do it. It's too cruel; it's not nice to put students under that kind of pressure.

But, here's the power of it: This student who told me the story 20, 25 years after he graduated, he never forgot the lesson: There's no such thing as a sure bet. No such thing.

And, in particular, when you're playing cards--many movies revolve around this point--a player thinks he's got it won, goes all-in, and forgets the one-in-52 possibility, or whatever it is, that there's another card that could destroy his hand.

And of course, there are many things in life that are just like this. It can be a stock that you think is really going to be a big success. It could be a move to achieve something, or a promotion that you know will make your life great. It could be a relationship that you're going to go all-in on--I'm pretty big on all-in for relationships, at least once you get married.

But, all-in is incredibly dangerous, and you're prone to overestimate your confidence and certainty of success.

Luca Dellanna: Yeah, exactly. And, I think that what the professor was doing there is beautiful, because he was teaching a very important lesson. I remember reading an article by [inaudible 00:53:13] Hobart, where he was saying that Jane Street--the investing firm--used to have something similar for the interns on the last day of the internship. The firm would give them some credits, and then they would have to bet--I don't remember the details--on some events or investment decisions or something like that, betting the credits. And so, they were really playing with uncertainty and probabilities. I think these are very powerful lessons, and I'm grateful to the professor for doing it in a relatively safe setting, where the worst case is that you fail the exam and can try again the next time, rather than losing a huge amount of money.

Russ Roberts: And there's a famous card game called Three-Card Monte. It's played typically on the street--it's very popular in New Orleans and New York City; I've seen it--where there are, say, two red cards and a black card face down, and you just have to decide which card is the black card. The dealer moves the cards around, and then you pick the one you think is the black card.

And of course, you watch someone else play, and they win. You don't know that that person actually works for the person running the game, because you're a sucker. And then you decide, 'Oh, this is going to be easy. I'm going to make a lot of money because I can follow the black card.' And, you can't. In fact, anytime I think I know where the black card is--in any situation like that--an alarm goes off in my head: 'You probably don't know. You're a sucker. Don't play.' There's no such thing as a sure bet.

Luca Dellanna: Yeah, exactly. And, I just wish this lesson were taught more often and earlier. Before, I was talking about spotting near misses: again, the principle is to teach and learn lessons earlier, before the bigger loss comes.

Russ Roberts: Yeah. And, as you mentioned a few times, time horizon matters a lot. I'm 68. My wife is younger than I am, so it's a little bit complicated. But, I am much more interested in the security of my assets than in their return, because the only thing that can ruin my life is to be all-in, say, on the stock market when a very bad set of outcomes occurs. So, it's much better for me to take a safer position with a lower return and bet accordingly.

55:52

Russ Roberts: One more technique is the barbell strategy. Talk about that.

Luca Dellanna: Yes. The barbell strategy is something that Nassim Taleb describes in his book, Antifragile. And, the idea is that you allocate part of your efforts or portfolio to safe bets, and then a small part of your portfolio to risky bets which might go to zero, but which have a good chance of an extreme positive return--so much so that the average expected return is positive.

And, that's a way in which you can expose yourself to bets that have high expected returns while also making sure that, even if they go to zero, your return profile still looks quite good, or at least doesn't look bad.

And, I think that's something that applies not only to investments, but also to careers, to how someone lives their life, and so on: making sure that if you pursue something risky, you expose yourself to it only to an amount which is safe and won't jeopardize you.
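A minimal sketch of a barbell allocation with invented numbers (not from Antifragile): 90% in a near-safe asset, 10% in a long-shot that usually goes to zero but occasionally pays 20x. The worst case is capped, while the expected return stays positive.

```python
import random

def barbell_outcome(safe_fraction: float = 0.90) -> float:
    """One period's ending wealth per $1 invested (toy numbers)."""
    safe = safe_fraction * 1.02  # safe sleeve earns ~2%
    lottery = 20.0 if random.random() < 0.10 else 0.0  # 10% chance of 20x
    return safe + (1 - safe_fraction) * lottery

N = 100_000
outcomes = [barbell_outcome() for _ in range(N)]
print(min(outcomes))      # 0.918: the most you can lose is ~8% of wealth
print(sum(outcomes) / N)  # ~1.118: the expected return is still positive
```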

Russ Roberts: I will say that when I took this job in Israel, my wife and I sold our house, which is a little like burning your ships when you land in a dangerous place. Sometimes you do want to expose yourself to risk, and to do it in a way that ensures you'll keep trying to succeed when things get challenging.

57:53

Russ Roberts: Toward the end of the book, you give an example of when nonergodicity is desirable, and you use the example of baking a cake. Do you remember that?

Luca Dellanna: Yes. I make the example that if you bake a cake once, for the first time in your life, the average outcome is a burnt cake, or maybe a bad cake. If 10 people each bake a cake once, you get 10 bad cakes. But, if you bake a cake 10 times, the first few will be bad, but then you also get some good cakes. And, this is an example--baking cakes is nonergodic, but thankfully so, because that's how we get good cakes.

So, nonergodicity happens when you have irreversible consequences. And, in this case, the irreversible consequence is learning, and that's a positive consequence.

So, there are some cases in life where you want to increase nonergodicity: all the times when you have positive irreversible consequences. It could be learning; it could be creating something that lasts in the world. I'm thinking, for example, of launching 10 products: nine fail, one works, and the returns that the one that works brings are positive and are here to stay. Maybe relationships, maybe some money that you can use for something else.

So, nonergodicity is not necessarily a bad thing. And, none of the advice that I put in the book is prescriptive, as in: you should do it this way. It's much more about: you should know what the true profile of outcomes looks like, and then you can make your decisions accordingly.
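To make the ensemble-versus-time contrast concrete, here is a toy simulation--my own construction, not Dellanna's, with invented learning rates and thresholds--in which skill grows with each bake and a cake is 'good' once skill clears a threshold:

```python
import random

random.seed(0)

GOOD_THRESHOLD = 0.7  # assumed skill needed for a good cake
LEARNING_GAIN = 0.15  # assumed skill gained per attempt
YEARLY_DECAY = 0.10   # assumed skill forgotten between yearly attempts

def bake(skill):
    """A bake is good if skill, plus a bit of luck, clears the threshold."""
    return skill + random.uniform(-0.1, 0.1) >= GOOD_THRESHOLD

# Ensemble average: 10 different people each bake once, all starting at zero skill.
ensemble = sum(bake(0.0) for _ in range(10))

# Time average: one person bakes 10 times in quick succession, learning each time.
skill, over_time = 0.0, 0
for _ in range(10):
    over_time += bake(skill)
    skill += LEARNING_GAIN

# Spread out: one bake per year for 10 years, with forgetting in between.
skill, yearly = 0.0, 0
for _ in range(10):
    yearly += bake(skill)
    skill = max(0.0, skill + LEARNING_GAIN - YEARLY_DECAY)

print(f"good cakes, 10 people x 1 bake each : {ensemble}/10")   # 0/10: nobody learns
print(f"good cakes, 1 person x 10 bakes     : {over_time}/10")  # a few: learning accumulates
print(f"good cakes, 1 bake per year x 10 yrs: {yearly}/10")     # 0/10: forgetting wins
```

The third scenario also sketches why spacing matters: if skill decays between attempts faster than it accumulates, the time average never improves.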

Russ Roberts: Well, learning is a very interesting example because if you bake a cake every week, you do get better at it--most people do, not everybody; and if you don't, stop baking cakes and just buy them.

But, you can forget what you've learned. And, one of the lessons of your book that I really like is how important timing is. And, you talk about organizations: this is really an application of this learning phenomenon.

So, to take your first example: if 10 people each bake a cake once, for the first time, the cakes all come out pretty bad. If one person bakes a cake 10 times, they learn and they get better.

But, if someone bakes a cake once a year for 10 years, the cakes are probably all pretty bad, because the learning fades between attempts. So, you have to invest a little bit of intensity to produce the knowledge so that it can be retained. And, you use that point in a very nice way when you think about organizations trying to focus on an issue or change their culture, from your own consulting experience. Talk about that.

Luca Dellanna: Yes. I say that behavioral change is nonergodic, and I mean that for change to happen and to last, it has to reach a certain critical mass. For example, if you're a manager and you want your people to do things in a different way, you can remind them once a month for your entire career, and their behavior will not change. The reason is that you remind them once, and then the next day maybe they do the thing the old way, and they notice that you don't notice, and then they're, like, 'Okay, then I can just keep doing it the old way. Maybe he doesn't really care; maybe he didn't really mean it. Or maybe it was important yesterday, and it's not important today.' Anyway, it won't stick.

What sticks, instead, is telling your people the same thing once every day for one month and not letting the old habit pass even once. What happens is that the new habit becomes ingrained, and you will not have to repeat yourself for the rest of your career, basically. Of course, there is a little bit of exaggeration here, but the point is the importance of reaching a critical mass.
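A purely illustrative toy model of this critical-mass dynamic--my own sketch, not Dellanna's, with invented parameters--treats habit strength as rising on reminder days and decaying on the others; below some reminder frequency, strength never accumulates:

```python
def days_until_habit_sticks(remind_every_n_days, horizon=90,
                            gain=0.10, decay=0.03, stick_at=1.0):
    """Toy model: strength rises on reminder days, decays on the others.
    Returns the day the habit becomes ingrained, or None if it never does."""
    strength = 0.0
    for day in range(horizon):
        if strength >= stick_at:
            return day
        if day % remind_every_n_days == 0:
            strength += gain
        else:
            strength = max(0.0, strength - decay)
    return None

for every in (1, 2, 7, 30):
    result = days_until_habit_sticks(every)
    label = "never sticks" if result is None else f"sticks after {result} days"
    print(f"reminder every {every:2d} day(s): {label}")
```

With these invented numbers, daily reminders ingrain the habit within a couple of weeks, while weekly or monthly reminders never do: the decay between reminders eats the gains.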

Russ Roberts: It's very important, because I think I've become obsessed with this general idea that the right thing to do is often obvious; it's just hard to remember what it is. And, this is a really good example--we've been talking about examples of it over the whole program. Obviously, you don't want to get hit by a car, you don't want to go bankrupt, you don't want to get fired because you've cut corners. But, these things tragically happen all the time, which suggests that either people didn't know these principles or they forgot them when they had to apply them. And, I think the latter is quite common.

1:02:59

Russ Roberts: So, to come back to the management example: in my experience--limited, but growing every day as a manager in my current job--it's very hard to remember how important it is to remind people, because it's on your mind all the time. I'm very focused on a few things here in my current job. I think about them every day, and I've told people in my organization how important they are, and they know I'm the President, so obviously they're thinking about them all the time, like I am. But of course, they're not. It's the equivalent of changing a habit. And so, you feel a little silly reminding people of certain principles or ideas over and over again, because you think, 'Well, of course they know this already. This must annoy them.' But whether they know it or not doesn't matter: what matters is that they have to remember it when it comes time to make a decision or a choice. And, it's not easy to change your mindset. It takes a relentless cadence--not once every 30 days, but every day for 30 days, or maybe twice a day for 30 or 60 days. The point is that getting other people to be as focused as you think they ought to be is not automatic.

Luca Dellanna: Exactly. And, I have a great example of this. A few years ago, I was consulting for an automotive company, and they had a warehouse that was really not in good order. Sometimes the workers would put components in the wrong place, on the ground, and so on. And, the warehouse manager told me that he had tried: he kept reminding his people, but nothing worked.

What we told him to do was to stop thinking about the whole warehouse at once and to focus only on the corner where there was a fire extinguisher. We told him: for the next month, every time you enter the warehouse, the first thing you should do is look at the corner with the fire extinguisher, and if there are any components in the one or two square meters nearby, you catch the nearest worker and you get him to remove them immediately. And, you need to do it every single time, because the first time you walk into the warehouse, and there is a component on the ground, and you don't say something, the message your people will get is that it doesn't matter.

So, what happened is that during the first week, this manager had to work really hard to remind his people, every single time, about that corner. The second week, the people got the message, and that corner stayed empty of components. The third week, the magic happened, and the whole warehouse got cleared of components on the ground.

And, there are two lessons here. The first lesson is that habits are hard to adopt, but once you adopt them in one corner of your life, they're very easy to spread. The second lesson is that habits require a critical mass. If we had focused on the whole warehouse, the manager would not have had the bandwidth to remind his people all the time. Instead, by focusing on one single corner, he narrowed the scope so that he had the bandwidth to be obsessive about it for one month. And, this is what you need to do if you want to change a habit in your company: narrow the scope so that you have the bandwidth to obsess over it for one month. If you do that, change is almost guaranteed to follow.

Russ Roberts: My guest today has been Luca Dellanna. His book is Ergodicity. It's very short, by the way; it can be read very profitably in a short period of time. Luca, thanks for being part of EconTalk.

Luca Dellanna: Thank you, Russ, for having me.