Intro. [Recording date: August 16, 2018.]
Russ Roberts: Charlan Nemeth's latest book is In Defense of Troublemakers: The Power of Dissent in Life and Business, which is our topic for today.... This book is about decision-making: in particular, about how our awareness of what others say and believe affects our own decisions. Let's start with the power of the majority. How is my decision or opinion affected by knowing what the majority believes, or that there's some kind of consensus?
Charlan Nemeth: That's probably the most powerful phenomenon in social psychology, in all honesty. Back in the 1950s, the classic studies by Solomon Asch showed that even when you are making judgments about things that have factual evidence right in front of your eyes--judgments people make virtually no mistakes on when they are alone, such as which line of three is equal to a standard, or 'Is this blue or is this green?'--anything with a truly factual basis--as few as 3 people agreeing on an erroneous answer can actually get you to agree with them, contrary to what your own senses tell you. And you are often unaware of it. I can go into the specifics of the whys if you want; but those studies had enormous impact in the field, and it's one of the few lines of research that's been repeated in over a dozen countries with essentially the same results. There's enormous power in the judgments of the majority in shaping our own judgments--what we think is accurate. And it's obviously a business model for many, many companies who rely on the opinions of others, particularly in great numbers, in order to sell products--to make you think something is true even if it isn't. I mean, think of fake news: if you see it often enough, and if enough people in your sphere agree, you start to become unsure of what you know and what you believe. It has enormous power, which we see everywhere--in groups, in organizations--and in all these experimental studies which, as I said, have stood the test of time.
Russ Roberts: What's strange for me--and, of course, I may be very unrepresentative of the average person in these experiments--but, I tend to be kind of a contrarian. Not 'kind of.' I'm a contrarian. I tend to be automatically skeptical of what's popular, what "everyone else thinks." Aren't there a lot of people like me who are actually going to go the other way? Who are not going to be swayed by the majority, the so-called consensus?
Charlan Nemeth: There may be fewer than you think. Or, at least, most of us think we are contrarians--myself included; I think there's a reason why I've studied this all my life, and my colleagues, I can guarantee you, would attest that speaking up is certainly not always welcomed. But I think it partly depends on the situation as well. And it also depends to some extent on whether or not it's an issue about which you have conviction or deeply held views, as opposed to something where you are not so sure. But I really do believe that it affects all of us, regardless of how we view ourselves. And the main reason, I think, is that we're unaware of doing it--
Russ Roberts: Yeah, for sure.
Charlan Nemeth: is that many times we do so habitually. And I often mention--because I find it slightly humorous--and will often show, a very, very dated episode of Candid Camera. You are probably too young to remember it. But I certainly remember--
Russ Roberts: No, I remember Candid Camera--
Charlan Nemeth: Do you? Okay. There was one segment that I often use in courses. It was called "Facing the Rear." Basically, as you know, Allen Funt used to do experiments in the street to show human nature, essentially. In one of them, he had three confederates--people who were asked to do this--in an elevator; and then one innocent person walks in. Anyway, the bottom line is that after the doors close, all these people turn in a different direction--toward the rear of the elevator, for example. And when the doors open again, you see that the innocent person has also turned to the rear. It's because there's an assumption that if they're all doing something, it must be right: 'There's something I'm missing.' It's not that many of them think it out and calculate. So, you see this guy turning around--okay, it's funny in a way. But then the thing proceeds where--this was a day when men wore hats--when they take off their hats, he takes off his hat. And he looks puzzled; and he's looking around, because he doesn't really know why he's doing what he's doing. Anyway, without going into all the specifics of the episode: any class you show it to, even decades later, they are just laughing--rolling on the floor--
Russ Roberts: Because they knew they would do that.
Charlan Nemeth: Well, at an intellectual level, yes. But the laughter is because they know they would be doing the same thing. There's a recognition, because you identify with this guy--who, when you think about it, is conforming even in the face of real evidence. Because it isn't just a matter of, 'They're doing it, so I think it's right, so I'll do it.' When that door opens, he sees they are facing the wrong way--namely, that's not where the door is going to open. And yet he continues doing it. What I love about it is that it captures, in a human, humorous way, essentially what all that research shows: namely, how blindly we follow; how unaware we are of what we are doing; and the fact that we do it even despite physical evidence to the contrary.
Russ Roberts: Now, Allen Funt is not a social scientist. He's an entertainer, the host of Candid Camera and the person who created it. But, he doesn't show us the clips where the people didn't turn around, because that's not interesting or amusing. But what you are suggesting is that that's rare: that most people, subconsciously even, follow the majority or follow the consensus. Why don't you speculate a little bit about why you think that's true, either evolutionarily or for whatever other reasons? I think we all recognize that there's a certain wisdom of crowds that we're drawn to. And that if, "everyone's doing it, there must be something to it," as you suggested. Anything more you want to add to that?
Charlan Nemeth: Yeah. Actually, I do. Because, at one level, it's at the heart of one segment of the book. Most of the research has tried to find out why--some of it done through fairly in-depth interviews with people afterwards, but more through experimental follow-ups in which you can actually vary some of these reasons. What it comes down to is, Number One, there is an assumption that truth lies in numbers. It's what you were talking about--the so-called wisdom of crowds. And we can go into that if you want. But the fact is that numbers are only better than the individual under certain circumstances--namely, there's wisdom in crowds only under certain conditions. One that's quite critical is that they need to be independent judgments. If you've got a bunch of people but they are all herding and following each other, that's equivalent to the judgment of one. It is not a group of independent judgments--which does have value and power. It also assumes that the task is something each of them has some knowledge about. So, if you want to know, for example--
Russ Roberts: I've always liked that caveat. Because, some people seem to think, 'It doesn't matter. You just average them all out and it comes up with the right answer.' But, I'm a little skeptical when people are totally unaware of what's going on and not knowledgeable.
Charlan Nemeth: Yeah. Well, think about many of the studies--if you look carefully even at the examples the books on the wisdom of crowds talk about--they're things like estimating the number of balls in a jar: something everybody has some knowledge about. But if I were to ask, 'Who is the Nobel Laureate known for finding the transuranium elements?' a lot of people wouldn't know. And it wouldn't matter if I asked 50 people and they gave me a name. You'd be better off asking somebody who understood chemistry or who had heard of Glenn Seaborg. So, what I'm really getting at is this: majorities can be correct; but they are not necessarily correct. The real concern is that when they are wrong, we still follow them. And we do so in part because we make that assumption that truth lies in numbers. So, as an aside, it concerns me a little that there are books that would have you believe you should trust the crowd, because I think that's a problem we have anyway. We start by assuming there's truth in numbers. And all you need is for some people to come out and tell you that you should trust that and not trust your own judgment. That's exactly what I argue against. Because what we need is independence of judgment. Which means: if you choose to follow the majority after assessing the value of what they are contributing, that's great. There's nothing inherently right about being a contrarian as opposed to following the majority. The question is whether you are reflective about it, and whether you make a considered, independent judgment. While I'm at it, though, the second reason--sorry to go on, but I'll stop--is a powerful one: people fear being in the minority. They fear being the odd man out--the old adage that the nail that sticks up gets hammered down.
They fear reprisal, and they fear rejection. And so what happens is--sometimes this isn't even conscious. It isn't even a considered decision not to disagree: they actually don't know that they have a different opinion. The assumption that truth lies in the numbers and that they must be missing something, coupled with the fact that that's the more convenient truth to hold--namely, to join them rather than be the odd person out--those two conspire to be very powerful reasons for following the majority even when it's in error.
Russ Roberts: It seems to me--we've been talking a lot about tribalism in recent episodes. And part of this is a variant of the desire to make sure you are in the tribe. If you are an outcast--if you fail to meet the norms of the tribe--you are going to lose access to the tribe's benefits. You risk, as you say, reprisal, rejection, quarantine. And just being lonely. So, I think there is a very powerful impulse we have to go along. To conform. And many times it's a good impulse, right? In the tribal setting, it keeps families together. It keeps marriages together. It keeps religions together. Communities of various kinds. You kind of accept the norms of the group. And there comes a day, sometimes, when you wake up and go, 'Uh oh. What have I been doing?' And you are forced to confront the fact that you've been blindly following these norms. And again, many times that's great--that blind following economizes on time. You can't sit every minute thinking, 'Should I do this? Should I not do it? Which way should I face on the elevator? Should I wear a hat? What kind of hat?' We make all those decisions in much of life unconsciously--which is really helpful most of the time. But when we are acting politically, when we are making business decisions, investment decisions, personal-life decisions, you probably want to pay a little more attention. And I love this idea that runs through the book, that you really have to be aware of this. Because it's working on you sometimes--maybe always--when you don't know it.
Charlan Nemeth: Yes. That's exactly right. I mean, as you were talking, it's clear that you don't have to be authentic and discuss everything and say everything that's on your mind. I think to some extent even some of the recent conversations--and I'm sympathetic with the principles--to use a Dalail[?] quote of transparency. You don't have to say everything that is on your mind every single moment, without any regard to what the impact is going to be, or whether you are going to succeed, or other considerations. What I'm really arguing, though, is that, Number One, so often we aren't aware of when we are just following blindly. I think you need that wake-up call. The second thing is that people are afraid to speak up--much for the reasons you are saying. So what happens is: we are not honest with one another. And it's easier said than done. There are many cases--it doesn't even have to be something dramatic. But let me go for a moment to the more dramatic kinds of things, where people recognize the importance of this: you have planes falling out of the sky--an illustration I use--because somebody doesn't speak up, or doesn't speak up forcefully. They note something in passing, as an aside, rather than saying, 'You know, we're running out of fuel,' and that plane falls out of the sky. You have surgeons who operate on the wrong limb. And again, you can have members of the surgical staff who may not be absolutely sure, but it looks as though, say, a piece of equipment is malfunctioning. Instead of speaking up and saying, 'Hold it. Stop. This may be a problem,' they stay quiet--now, again, you are weighing that decision, obviously; there are some urgent situations in which you have to be quite sure; and that's still going to be a calculation.
But nonetheless, there are many, many reports of people who had a differing viewpoint and never spoke up. And catastrophes occur. You see them in business, with mergers--even the old AOL/Time Warner one. It was done in a matter of a few months, and there was strong dissent from people within the company, some of whom happened to be correct, at least with hindsight. But they weren't consulted. They had no opportunity to express it in that particular case. The point is that there are sometimes truths out there that go unexpressed, because people are afraid to speak. And there are studies showing that something like 70% of people report not speaking up about a problem in a company. And that's not a healthy state of affairs.
Russ Roberts: Yeah. It's perfectly rational on one level, which is, 'If they're not saying anything--if the surgeon, who is much smarter than I am, much more experienced than I am, thinks that's the right leg--I must be the one that's wrong.' So, there's a certain natural self-doubt there in the face of "expertise." But a lot of it is like what you said; and I think this is the more interesting point. I think a lot of it is fear--just pure fear of the remark coming back at you: 'What are you thinking? Of course we're doing the right leg.' And I think the lesson here for managers and for families and for friends is to react graciously to negative comments or constructive criticism. That's very hard for us. Right? We don't like to be told we are doing the wrong thing. And the more we bark back at the people who criticize us in those settings, the more we create groupthink. Which, as you said, is the Wisdom of One.
Charlan Nemeth: Mmmhmm. Yeah. I had several thoughts as you were talking. Because--I mean--first, let me just--when you commented on expertise: It's interesting because you sounded almost as though you trusted that.
Russ Roberts: Well, I tried to put it in scare quotes. Very subtle. Maybe you missed it. Because I'm kind of a skeptic about expertise generally.
Charlan Nemeth: Okay. I was just going to say, I certainly have been skeptical of it. Because I live in a world where everybody presumes expertise--they'll trot out all their degrees and this, that, and the other thing. But I did jury consulting for a number of years at one point, and I was often struck by the fact--because judges always think they know more than jurors, you know--that common sense is not so common, and that many times ordinary people can see through things. You can see judges who are very articulate and can write a brief that looks very brilliant; but the fact is that it represents a single perspective--his own opinion. He's just more artful in the way he conveys it. So, I learned long ago not to trust things of status and prestige--or at least to question them.
Russ Roberts: Yup.
Charlan Nemeth: So, that falls in the same camp as the assumption that truth lies in numbers. Because what happens is that we just assume that what we know or what we see can't be true, and we tend to discount it. Which is about the worst way to make a decision, you know.
Russ Roberts: You know, a variant on that is when you go to a movie that everyone is raving about, and you are just struck by how bad it is. And I walk out of that--it happens to me not frequently, but occasionally, right? And part of it is--well, I'm not like everybody else. But a lot of the time I'm thinking, 'Did I miss something here? Was I in a bad mood when I watched this movie?' Or, 'What were those other people thinking? Did they just get swept up?' And I'm just struck, after going to certain movies or reading certain books: 'Gee, this didn't work for me at all.' It's not, 'Well, I didn't like it as much as they did.' It's, 'I didn't like it. I actually disliked it.' And, 'How could they have raved about it?' And you wonder, 'How much were they swept along by that consensus, that majority feeling?' and 'How much am I maybe misreading its real value? Maybe I need to see it again.' And occasionally I have. I've gone back and watched a movie a second time that was raved about, or read a book or parts of a book that were raved about that I couldn't get into. It's not as if it's different the second time. But often it does speak to me, just the way it is.
Charlan Nemeth: Yeah. I have that experience often, particularly with restaurants that are raved about. In San Francisco, eating is basically all we do, you know. But when I go, I sometimes realize that the place is overpriced, or remarkably mediocre, or just novel for the sake of being novel, even though it isn't a particularly pleasant experience. You can go through all kinds of things. I think, though, that sometimes it really is a herd. There is something to be said for a place because they work on its marketing--it has a kind of cachet. It has the buzz, if you will. It's where all the hip people are going.
Russ Roberts: Yup.
Charlan Nemeth: I mean, Apple was great--whenever they were launching a product they'd have long lines out the front. They'd turn it into a happening--
Russ Roberts: It's brilliant--
Charlan Nemeth: and so people would go just to be part of the crowd, part of the latest whatever-it-was. And, you're an economist--I'm clearly not. But a lot of bubbles are attributed to herds on the way up and herds on the way down. Hence those notions of contagion--and the difficulty of being a contrarian even in those situations. Because you think: if you are a contrarian and you are right, you are a genius; and if you are wrong, you are an idiot, with big reputational fallout. But sometimes even if you are right, there's a quick way to view you reputationally as somebody who just got lucky, for example.
Russ Roberts: Yep. Hard to know.
Charlan Nemeth: So, I'm just saying that it has costs in many different fields, being that contrarian. So, I don't underestimate for a moment the difficulty of it.
Russ Roberts: Well, let's turn to dissent. Because, that's the troublemaker part of your book, and the part that, being again myself a little bit of an outlier and not much of a majoritarian in terms of my outlook, I've gotten used to being something of a dissenter. And, dissent is a fascinating piece of your book. You argue that dissent, even when it's wrong, helps our decision-making process. Explain that, and talk about dissent more generally.
Charlan Nemeth: Okay. Well, I think that's the powerful message, in all honesty. Because most of the time people think, 'Well, dissent is fine,' because they think, 'We're tolerant people, and everybody should have a voice.' But the reality is--myself included--we're not immune to the human tendency to get irritated when someone challenges a view that we believe. We don't mind them saying it; but we don't like it when they persist. So what happens is you are tolerant for a short time; and then you start to both think and, many times, speak in rather derisive ways toward them. It's like when I was teaching classes on this sort of thing: I'd sometimes do demos. I'd bring people up practically at random, and I'd plant one individual ahead of time who I knew would take a very different viewpoint--on something like how to deal with a juvenile delinquent. They'd have this discussion in front of the class. And when that one individual espouses what I knew was clearly going to be a deviant viewpoint, what happens--and you can see it dramatically--is one of those things where you take no risk in doing the demo: the phenomenon is so powerful that it works every time, and it doesn't matter who is up there. And it supports the research findings, which the students will now start to pay attention to because they've witnessed it. The first thing is that all the communication gets directed at the dissenter: 'Why do you think that?' 'How can you believe that?'--whatever. Everything zeroes in on him. The same thing happens in the film 12 Angry Men: when Henry Fonda's is the lone hand going up for not guilty, everyone goes at him, with the opening remark: 'There's always one.' Okay? But as it persists, you start to see derisive behavior.
And so, even when you do the demo in class, you'll find that the other individuals may be perfectly well-behaved college students, but they will start--at first by innuendo and then more directly--to suggest that the guy is either stupid or immoral, one of the two: 'How could he hold this particular position?' And there's real pressure on him. And then at the end, I ask them all, essentially, 'If the group had to be smaller and you had to exclude somebody, whom would you vote out?' You might think there would be a bunch of hands admiring this lone dissenter. There are very, very few. Almost everybody makes it clear he should be thrown out--
Russ Roberts: Well, he's ruining it. He's ruining it for everybody.
Charlan Nemeth: Yeah. I'm sorry--I went off on a tangent. I think maybe I didn't address the question you were raising. So, bring us back, if--
Russ Roberts: Well, we started off by saying that a dissenter bears a cost. They get treated derisively, sometimes violently. Sometimes they are shoved aside or ignored. I think I've probably told the story on EconTalk where I voiced some support for shopping at Walmart at a dinner party, with a little bit of humor, and the reaction from the person I was talking to was to stand up and say, 'I don't have to listen to this.' And she walked away. Not, 'Oh, that's interesting. Oh, you're an economist. How would that work?' People's normal reaction is that an enormous wall goes up. They are not interested in hearing a contrary opinion. Most people aren't. There are exceptions, obviously. So, the first thing is that the dissenter gets derided, criticized, insulted, and usually treated with some anger, as you point out many times in the book. But the other part that's interesting, which you haven't said much about yet, is how it forces the group to evaluate, to re-evaluate its consensus. To confront it. Even though they might not say it publicly--which is even more interesting to me. They might publicly just continue to resent and dismiss the minority opinion, the dissenter. But inside, something is actually going on. So, talk about how a dissenter improves the process.
Charlan Nemeth: Okay. Because, again, you are bringing up a lot of thoughts. First, in terms of it being private: many, many studies we did, particularly in the early days--mock jury deliberations and things of that sort--show that people won't move one inch in public toward the dissenter. Sometimes I'd actually be asked for combat pay by the people I put in that position of being the dissenter. But if you asked the others later--and you can't ask them very directly; if you say, 'Did you change your mind?' it would be 'No, no, no,' and 'He's still an idiot.' You can have all the great questionnaires you want, but it still comes down to the fact that they dislike him and think he's ridiculous, etc. But if you are even slightly subtle--if you say, 'Well, supposing he had asked for twice the amount in compensation. What would your judgment have been then?' Or, if you say--
Russ Roberts: For an injury. It's an injury case. Right?
Charlan Nemeth: Yeah. Yeah. This is a personal injury case--the one I'm thinking about. Or, if you ask, 'If you were deciding along with a judge, rather than as a member of a group operating as a jury, what would your decision be?' Those are variants; you'd think the answers wouldn't vary at all. But more importantly: you then give them totally new injury cases, where the facts are quite different, and ask for compensation judgments. What you find is that there is noticeable attitude change: they have actually moved in the direction of the minority judgment--the one that persisted over time. And that's a caveat here: to have any impact as a dissenter, you have to be consistent over time. It isn't a one-shot thing and then you are out of there. You really do have to continue, which means you are taking the risk. You are going to be the subject of derision. But that's where the impact is. Don't expect it to be public, or to be applauded for it. You will be derided. Yet at the end of it all there is noticeable attitude change; it just occurs at the private level. And that's the hard part about this message: you can be standing up and taking the risk, but you won't necessarily benefit from it, in the sense that people will think better of you. Many times people will not even be aware that their opinions have changed. What I think is more important, in terms of the research we were doing later--over many years of my career--is looking at influence at a different level: not whether or not you win as a dissenter, whether you've changed their minds, but whether you've changed the way they think about the issue. So, the reason I use a film like 12 Angry Men is not so much for what most people see--the ability of a lone dissenter, the Henry Fonda character, to win, to actually change the minds of the others--
but rather that he changed the nature of thought about the issue. What I pay attention to--and again, it's more of a dramatic vehicle in 12 Angry Men--is that you start to see the mechanisms, some of which have to be strategic, for keeping the discussion open. Because otherwise they'll just stop talking to you, and essentially try to yell you out of the room if they could. But what you start to see is that they begin to look at the downside as well as the upside of their own positions. Before, it was, 'We've got two eyewitnesses. What's there to talk about? I'm out of here.' Then they start to question whether that eyewitness is accurate. They start to notice and remember other pieces of evidence that call the testimony into question. And in one case, they re-create the scene, only to find that it couldn't have happened as the witness described it. Now, that's a dramatic vehicle. But it parallels what we found in many different studies, in which we tried to look at the elements of what constitutes good decision-making. Things like: when you search for information, do you search on both sides of the issue, as opposed to favoring one? Is your recall selective? Or, in other studies: do you consider alternatives, as opposed to just ruminating about the one position, the yes or the no? Sorry--I'm trying to scan some of the different studies. But essentially, if I use one word for what you are looking at, it's divergent thinking: instead of a kind of linear, problem-solving thinking, you start to think about the ups and downs of each position. You start considering alternatives you wouldn't have considered otherwise. In problem-solving, we've got studies where people utilize multiple strategies. And to great effect, because the performance is much better.
But the bottom line is that the mind opens. And it opens because it's challenged. So, we can think that personally people don't want to hear it. But if they do have to hear it, and it persists over time--so it's in a meeting and they can't just walk out, or tell you to go home from your dinner party, as you described--that challenge changes the nature of the discourse. And that's what all these studies show. For me, the most heartening finding was this: whether the dissenters were right or wrong, that kind of thinking is what was stimulated by the challenge to the majority opinion. That's in some ways surprising, because we see dissent's value when it's right--I mean, who would argue that the guy shouldn't have spoken up when they were running out of fuel on the plane? What's harder for people to get their arms around--but I think is most important--is that you profit from dissent even when it's wrong. Because even if you are not totally aware of it, even if you don't want to give the dissenter credit, study after study shows that you think better. You actually are wiser in the way you search for information, assess it, and think about it. And that's the beauty of it. So, it really suggests that you shouldn't just tolerate dissent: you should welcome it. I'm paraphrasing Senator Fulbright there--a quote I happen to like. I've got certain authors that I love quoting. That's one of them.
Russ Roberts: It reminds me of long-ago EconTalk guest Ed Leamer, who said, 'Man is a pattern-seeking, story-telling animal.' I think it's also in the work of Nassim Taleb. It's a common problem we all have. We have a linear story; we cherry-pick the data that confirms that story. And we're not really interested in looking at other parts of the story: 'It's a nice story. Don't ruin it.' And when the dissenter comes along and says, 'Are you sure? What about this?' it forces you to imagine that there is more than one story. And I think that's really important--again, for our personal decision-making. What I've found in interacting with people--both when I use this technique and when I feel it used on me--is how powerful it is. Someone will make a bold claim: 'So-and-so is the all-time leader for doubles in a season, in baseball.' And I'm thinking, 'I don't know if that's right. I think it's Earl Webb. I think it's 67.' So when this person names somebody else, sometimes I just say: 'Are you sure?' I don't say, 'Oh, that's ridiculous. It's Earl Webb.' Instead I say, 'Are you sure?' Those three words, 'Are you sure?' It's remarkable how quickly people climb down from their overconfidence. And I feel it in myself. I'll make a claim; and part of the confidence I'm exuding is just pure conversation, right? And in a meeting, that can be very powerful. Someone will say, 'Are you sure?' And I realize immediately, 'Oh my gosh: maybe I'm not so sure.' Just opening that lid a little bit does make a difference.
Charlan Nemeth: I agree. And that's a very artful way--
Russ Roberts: of dissenting, I am sure--
Charlan Nemeth: of dealing with it. But also, I think you are honest enough that when people say, 'Are you sure?' you take that seriously, and at least ask yourself whether or not you have full confidence or there is an area of doubt.
Russ Roberts: You can double down.
Charlan Nemeth: Yeah. I mean, some people, you know--we see it every day--for a lot of people, 'Are you sure?' would be kind of like, 'Of course I am. You just don't know what you are talking about.' Or, you know, it can get any kind of a reaction. But I think that, if you do that to an honest person who is listening, it does open the door for conversation. And that's an enviable way for it to occur. I think in a lot of settings, though, and particularly ones where the decisions are really important, you are not really asking whether they have a doubt. You are really telling them that you think they are wrong.
Russ Roberts: Yeah.
Charlan Nemeth: That there's something else that's true. And, you can even say, 'I'm not 100%, but this is what I believe and think is true. And here is why.' However you want to phrase it: that you think they are wrong, and that there is an authentic difference of opinion. And I think that's the challenge that stimulates thinking. You see, you are opening a door to open-minded people who really worry about whether they are being accurate. I think, for a lot of people--I hate to say this, but I think for most people--they have to be challenged frontally, in a way. It doesn't mean you have to be rude, or angry, or certainly not disrespectful. But, you have to be clear. And it is when there is an opposing truth that someone really believes that the challenge comes, really, for re-thinking your own position. For wanting to search for information, to find out which is accurate. For considering other options. I think there's power in that. It's like--often, I think about martyrs. You can think that they are dying for something that you can't imagine or that may even seem wrong-headed to you. But the fact that they are that convinced and willing to pay that price--you don't just dismiss it. And it is certainly food for thought. And many times, in ways that you are not even fully aware of, it is a stimulant for reflection, for searching for other information. All the things that I think are extremely important in good decision-making or in a vibrant society.
Russ Roberts: And you mentioned authenticity. And in the book, you point out--this also, I found very interesting--that being a Devil's Advocate is not enough. You can't just say, 'Well, what about this? Have you ever thought about this?' It's the persistent, consistent, authentic statement of an alternative truth that gets people to think. Merely stating an alternative view--particularly if you make it clear, 'Well, I don't really believe this, but I just have to play Devil's Advocate to make sure we consider all viewpoints'--doesn't have the same effect. Talk about that.
Charlan Nemeth: Oh, yeah. Yeah. You know, to me, 'Devil's Advocate' is one of a bunch of things that have always kind of bugged me. Because they seem like an intellectual game. As though everything is about the transferring of information, and if we can kind of intellectualize it and be logical about it, we can kind of arrive at truth. Or that we are really going to consider multiple options because we are such well-meaning people. And we are well-meaning. It's not that our intentions are bad. The thing is that I don't think people operate that way. They may try for a while, but they don't seriously reconsider something they believe. And they can pretend--I mean, it's not total pretense. They are sort of trying. But at one level, it's sort of lip service. I liken it to an intellectual game. And I always had that sense about Devil's Advocate. So, for years, whenever I taught groupthink, there was always, you know, the antidote to groupthink--and one of them was Devil's Advocate. And they'd been teaching this forever in business schools, for example. And I just never believed it. And so, finally--I think particularly because a couple of students really got interested, and one did work with me on it--we did a couple of studies, and then it garnered a fair amount of attention, or at least it garnered a lot of discussion and rethinking. Okay, even if people were pushing back because they were kind of married to a--
Russ Roberts: You were dissenting.
Charlan Nemeth: Yeah. Well, also, I mean, they were married to that, um--that antidote. I mean, not just academics, who might like it because it was sort of, you know, treasured territory for them. But even, in many ways, businesses: they were searching for a way to get some of the benefits of dissent--at least as well as they understood them. It was usually thinking that maybe there was some truth in there, someplace. But always they wanted to do it where we sort of still stay friendly, and we're high, and we're kind of nice to each other, and kind of back-slapping. And so, if you can use a device where, you know, you don't create the conflict and irritation but you still gain some benefits: I mean, that looks like a win-win. It's just that it doesn't really work. It may be better than absolutely nothing, where you are just racing totally to consensus. But, at least in our studies, it doesn't work anywhere nearly as well as authentic dissent. Because, in fact, you know someone is role playing. You know they are kind of 'what if.' Or--it drives me nuts if you watch the news--the interviewers will often say, 'Well, let me play devil's advocate,' or even if they don't say those words, they'll say something akin to, 'Well, some people might say.' It's still a variant on the same thing. They aren't just saying, 'What about this?' or 'I don't agree with that.' Instead of its being authentic discussion, it's couched as someone else's opinion, or as role-playing. And so what we're doing is engaging in sort of an intellectual exercise. That doesn't really stimulate the kind of thinking, though, that an authentic difference does. And as I said, people always have in their mind that authentic difference has to be rude and impolite and confrontational and all of that.
It's different when there is an authentic difference, in which you are clear about what it is you believe and you are willing to say so, knowing full well that you'll probably pay a price for it. There's power in that. And I think there's value in it, I guess, is also what I really believe.
Russ Roberts: Well, there's integrity, too. Long term, there's an incredibly corrosive effect of going along with things you think are wrong--in meetings, family gatherings, whatever setting it is. I think of a lot of nonprofits that I've been involved with, been on the board of, been in meetings of; and, like you said: people want to get along. They are going to see these people. They are going to synagogue with these people, or to church, or whatever communal activity they are doing. And they know they are going to see them for the next x years. You don't want to be seen as the person who is disrespectful, or rude, or just didn't want to go along. And so, it's very hard to raise those kinds of questions in those settings.
Russ Roberts: I want to talk about style for a minute, because I don't think you do this much in the book, and it seems to me to be important. And, as an economist, I want to focus on Milton Friedman. For a long time, Milton Friedman was a lone voice--in more than one area; but I'll just pick one. He believed, based on his research--in 1962, I think--that inflation was typically caused by monetary policy. Which is now widely accepted by most people, although the last few years have been a little bit funky, given the Fed's behavior; and people debate about that. But, for the last 50 years or so, he won that battle. And he started off virtually alone. And was derided, and made fun of, and mocked. And, of course, my joke is he also missed all the good cocktail parties. It wasn't just--right? He paid a social price for it in his daily life. And, if you watch film of him--he was also a contrarian on many other things--when you watch him discuss controversial issues with people who are clearly hostile to his viewpoint, he always smiles. And he's never disrespectful. He's never crude, or rude. He never raises his voice. He never shows anger. And I wrote about this--there's a chapter in my book, The Invisible Heart, where I describe a different dinner scene from the one I mentioned earlier--the character is fictional, but I have the character basically have to leave the dinner party. And I think, when I was younger--although that dinner party never happened the way I wrote about it--there were times when I would be angry and I would state my position with vehemence. And I think, deep down, I was thinking, 'This way they'll think I have these strong convictions, so maybe it's worth considering.' And, as I've gotten older, I've realized that's not fruitful. In fact, that makes them think, 'This is not a nice person. I don't want to be on his team.'
And my view is that the lone dissenter who is polite and who smiles and is pleasant has an advantage. And I don't know if you agree with that. It's not necessarily consistent with the way you've described it in our conversation right now or in the book. But, it seems to me that the dissenter who wants to have an impact has to keep his or her cool and stay polite and respectful of the people, and just state their views firmly and with confidence. Is that--what do you think of that?
Charlan Nemeth: I think you're absolutely right. There's a reason that, in the last chapter in my book, for example, I sort of felt compelled to have a big section on what this book is about and then what this book is not about. And part of it is, I realized, that words conjure up images. And many times the image of a dissenter is one of fists raised and a loud voice and all the things that people say you should do to look confident. You know: beat your chest. And those things are not what dissent is about. And it's really not even what the point of the book is about. It has to do with authenticity and conviction. And yes, it does require consistency over time. And you know that's going to make you perceived as unpleasant. Or, it certainly will invoke dislike. But it isn't because of the way you're behaving. In fact, I think you are exactly right: rudeness is maladaptive. If you are rude, it's almost easier to dismiss you, because then it's much easier to attribute it to your personality. And to essentially ignore the fact that you have a position about which you have conviction. You can show conviction without yelling. You can certainly show it without rudeness. But I think what's absolutely essential, particularly when you start thinking about functioning groups that, as you point out, many times have to see each other over time--there are two words that are the hardest ones to really get in a company but are essential. One of them is respect: no matter how much you disagree, the notion that you respect the fact that this is something they believe, even though you strongly disagree with it. That's a very different attitude than resorting to name-calling, for example.
Russ Roberts: Yep.
Charlan Nemeth: And again, we see this in the political discourse on many different levels--the kind of reasoned judgment that doesn't compromise, that doesn't back off from what it believes, but that shows basic respect for the other person. And not only their right to say it, but even their right to believe--to think--differently, which is probably the ultimate freedom. And so, I think that becomes very important. And again, if I go back to the 12 Angry Men example: Henry Fonda never loses his temper. He never resorts to even what some of the members of the majority, who are armed with their hubris, feel that they can dish out. But showing respect doesn't undermine the sense that you have conviction. In fact, I think it augments it.
Russ Roberts: Yeah, no; I think that's well said. I think it's exactly right. Let me take another example, and I want to use it to criticize my general outlook and get your reaction. Another dissenter who I feel is getting a lot of attention for that dissent right now is Jordan Peterson, who was a guest on this program. And I find his writings and his videos and other things provocative. I don't agree with everything he says. But he's made me think; and I wrote an essay about what I've learned from interacting with his ideas. And he's very lonely. He's increasingly joined by large groups of people at his speeches, but he's staked out a very lonely position. He states it with total confidence. Politeness. He doesn't raise his voice. He's respectful--more or less--of the other side; you can debate that. But, in one-on-one conversation, he never resorts to insults. And it grabs your attention. It's an incredibly effective way for a lone voice to operate. And what I want to say, critically of myself, is that I emphasize humility on this program. And the importance of saying, 'I don't know.' And the willingness to be open to other viewpoints. As the saying goes, 'Strong opinions, weakly held'--I think that would be a decent description of my general outlook. And yet, I wonder if the overconfident people of the world make an important contribution, because it takes courage to go out on a limb. It takes courage to be that lonely voice standing athwart history. We're going to be talking about Aleksandr Solzhenitsyn soon on this program, and his book, In the First Circle. Solzhenitsyn, Milton Friedman--they stood and took enormous intellectual beatings--and sometimes physical beatings, in the case of Solzhenitsyn--for being that dissenter. And to have that courage in the face of that dislike requires tremendous confidence, I would think. And so, I wonder sometimes about the cost of my humility and the value of being overconfident.
Charlan Nemeth: Oooh. I mean, you are a wonderful interviewer, because these are nuanced and complex questions--in all honesty. You know, I've thought a little about this. I probably would use the term that they aren't so much confident as they have conviction.
Russ Roberts: Well said.
Charlan Nemeth: And, they are stating something that they really have thought about and really believe. But that doesn't mean that they don't also, at a very deep level, know--or in fact are sure--that they don't have the full answer. That they haven't considered every possibility, or everything contrary to what they've come to believe. And so I think you can be both. And I think your phrasing of the 'strong opinions, weakly held'--the 'weakly held' is a willingness to know in your heart, and sometimes even to acknowledge, that you might be wrong. That you still have much to learn. But this is really what you believe, given that you've given it a lot of thought and you are quite convinced that this seems to be accurate--I mean, given what you are capable of. But it also means you are aware--it's almost what I was referring to earlier--being aware of the fact that we are human beings means that we fall prey even to the things that we study. So, we could talk about embracing dissent; and then we get so annoyed at somebody--it doesn't matter that you are studying it. I mean, you remember Danny Kahneman talking about this more personally. He studies bias, and he's subject to biases all the time. You know, and strong ones. It doesn't remove you from the humanity of what it is we are talking about. I guess the other thing--because it's also a person I happen to be very fond of--is Karl Weick, the organizational psychologist. Everybody quotes this, but many people don't attribute it to him--who is, by the way, prominent in the field. His phrase is: Argue as though you are right, but listen as though you are wrong. And that, to some extent, captures doing both of those things--I think it's getting close to what you were suggesting: you are arguing as though you are right, like the strong opinions. But you listen as though you are wrong.
And that's because you know they are weakly held and there's still much to learn. So, I don't think that we're--that one removes the other, I guess.
Russ Roberts: That's a fantastic, subtle distinction. Which I actually find deeply comforting. This is not a joke; this is not an intellectual game for me. It's not like, 'Oh, I pretend I'm humble.' I think it's important to be humble; I think it's an important part of being a good human being. And yet I worry about this. We live in times where it's very unclear what direction the country is headed in, and what the right direction is. And if you are not careful, humility leads to paralysis: 'Well, we just don't know what to do. We don't know what's right.' But, as you point out, you still have to have conviction sometimes. You have to make a decision, often, in the face of imperfect information. In fact, always in the face of imperfect information. And you have to rely on your judgment. And I think the deep lesson of your book is that the rush to judgment, particularly in groups, is something to be very, very aware of. There's a great quote you have in the book--I think it's a subheading or something. It says "Group Decisions Often in Error, Never in Doubt." I think that's just a great line. And when we leave the meeting having made that decision, we are all high: "We did a great job. We did the right thing. We came to the right conclusion"--whatever it is. And being aware of the possibility that you are wrong is a great thing to be reminded of, and I think that's one of the things your book does.
Charlan Nemeth: Well, thank you. I appreciate that. I think even as a child--it's funny--I've been thinking of quotes lately that kind of registered with me from way back when; and they sometimes take on new meaning when you have enough experiences in life. But, I can remember always being told--whenever I thought about the term 'humility,' which we were always taught was important, because pride was the biggest-sin-type-of-thing--I remember the line that 'Humility is truth.' Namely, it is not false modesty; it's not saying, 'I don't know what I'm doing,' or 'I'm stupid' when I know I am not stupid. We don't need to go into it, but I spent three very happy years doing interviews with 5 Nobel Laureates in Physics and Chemistry, a number of years ago now, so my memory is a little bit faded. But, I remember the thing I was so struck by is that, to a one--and these were individuals who had really done something. They were not just good physicists; our world had changed. And I would hear colleagues of mine say something like, 'I sat next to x or y and, you know, he couldn't be bothered talking to me. They are so arrogant.' And I thought--I actually had an unkind thought about whether I'd talk to this colleague of mine. But nonetheless, my experience couldn't have been more different: contrary to almost all my colleagues, they were, at one level, the most humble people I've ever met. They didn't pretend they weren't smart. They knew they were smart. But, you know, it's the sort of thing like with Charlie Townes: if you are thinking about the cosmos, or issues related to the Prime Mover, the origins of the universe--I mean, if you are smart enough to know how small you are in the context of history or the cosmos, you don't get so full of yourself. But, back to the notion that it was truth: it wasn't a false humility.
It just was an awareness of how little all of us matter at some level, and how frail our understanding is at any given moment. But that shouldn't stop us from essentially speaking in an authentic voice, because, as you say, if you don't, there really is a kind of a paralysis where everybody gets so confused nobody knows what they know. Which is why I go back to even those early studies. Which is: if I know this is blue, this is not green--and I can tell you we've done studies; my colleagues would never believe anyone would call them green, for example--and yet, they do. And so you begin to realize how ephemeral all this is, but how serious it is that, at some point, we get ourselves in a pretzel where we are so worried about getting along, and when we are articulating a judgment we're half-thinking, 'What are they thinking of me? Could I be wrong? They're not going to like me. This is a stupid strategic move'--whatever it is--you can't even think any more. And you don't even know what you believe. The scariest part for me, in all the study about the shaping by majorities--which we haven't gotten into, but how cults use it in myriad ways--is that they shape the mind so that it literally is not aware of the bubble in which it exists. And terrible things happen as a result of that. And so, the worst part is that we don't know what we know. I mean, somebody can lie to you five times, and you can call and you can double-check every fact, and you can find out they are lying. And they are lying again, and again, and again. And yet, sometimes, when you don't want to believe that, you can still kind of go back and want to make excuses for them. Or, whatever. But, at some point, we lose the ability to have clarity about what it is we know. Even when we've done due diligence. I actually ran into this recently, where I had to take a stand at the University, because there was some very inappropriate behavior, I felt.
And I felt it gave the lie to all the rhetoric. And I ended up--part of me thought, 'God--I don't need another battle in my life,' you know, kind of thing. But then I thought, 'You know, you can't preach this stuff and talk about what you believe, and not act on it.' But I know, I know: you're going to stand up, and you are going to see no action. You are going to see irritation with you. And some days you just don't feel like it. But, I'm going to do it.
Russ Roberts: Good for you. Good for you. I just want to mention a quote--I know you like quotes--my favorite version of that humility of great scientists. I've read it from Taleb, and he attributes it to a Phoenician proverb: "The farther from the shore, the deeper the ocean." And for me, that really sums it up: as I get older and as I get farther from shore, I know a lot more than I did when I was 25 or 26, fresh out of grad school; and yet somehow I feel like I know a lot less. And I think that's a sweet, sweet feeling that comes with age, maybe if you are lucky. The alternative, probably, is getting more and more confident as you get older. But at least for me, that's the way it works.
Russ Roberts: But I'm glad you mentioned the campus thing, because I wanted to ask you: One of the things that disturbs me deeply about America today--one of the things--is the low quality of political discourse and the lack of respect on both sides. That really distresses me. But the other thing is the failure on campuses to always encourage dissent and conversation. And it seems to me that's the essence of education. It's the underlying theme of your book, really: you can't think clearly if you are not open to the possibility that there is a different viewpoint. And, to me, that's what a great university is about. It's how Western civilization--civilization generally--goes forward. I only said 'Western' because it's the one I know better than others. But that's the way civilization advances: through the back-and-forth conversation between great ideas. At least I like to think so. So, what do you think of the current state of campus debate, and the willingness of universities to squelch it? At least, that's the way it seems to me.
Charlan Nemeth: Oooh. That's a very heady issue.
Russ Roberts: You mean, 'Turn the recording off for a few minutes?' No, go ahead.
Charlan Nemeth: [laughter] Actually, I might want you to. No; there's, I think, a lot of concern, and certainly discourse--you know, again, I've been at kind of leftist universities my whole life. My first job was at the University of Chicago, for example; I was there from 1968 to 1973, and it was the real turmoil of 1968: a colleague was beaten up and nearly killed in his office, basically because he was on the news a lot backing the anti-Vietnam protests. So, I mean, those social issues were really in our faces in those years. And the thing is, I often look back to those days, by the way, Russ--maybe it's a sign of age, but I look back on them with an enormous kind of warmth, because people cared enough that they would get up at almost town-meeting-like gatherings and argue vociferously about every topic imaginable. But there was an engagement; and there was an authenticity. And, of course, it would make you nuts. You had a sense of, 'God--I wish this rhetoric would calm down.' But I have never in my life experienced a period, and a location, where I found that I thought so much. Where I really had a sense of growth in terms of my own convictions. And then I went to Virginia and British Columbia and so on, and ended up at Berkeley. And I've often been aware of the fact that it's very difficult for students who have different political persuasions than the ethos--say, Young Republican students--to feel comfortable in classes: they are afraid of speaking up. It wouldn't have mattered if it was a liberal student at a very conservative college: it would be the same thing. The main thing is that you are the odd person out. And so you do see this reluctance. And so you have to go to a lot of trouble in order to invite their speaking up. And to really make clear that you welcome it and you want to hear it; otherwise you can't have an honest discourse.
I mean, how can you talk about, you know, women's issues, for example, if somebody who has a contrary view doesn't at least engage you? The dialog is better for it. Let me say that I think it's not only good in the sense that it informs people who didn't adhere to that position, but--again, back to that line--it gets them to think. And I think the beauty of thinking is that, even if you end up believing exactly what you did at the start, you hold it with more clarity. Because you've really engaged in re-thinking an issue that you held, to some extent, almost blindly. I think, though, that now there are moves afoot to bring in speakers, for example, who offer differing viewpoints. Now, obviously, if I had my druthers, they would engage in debate. In respectful debate. I think that's the way you are more likely to learn. I mean, I'd prefer to be a member of that audience. I think what happens, though, is that it's much like watching the political discourse, where you don't see people engaging in dialog. You see them making their talking points side by side. And occasionally sniping at each other. Which is not what we're talking about. We're talking about intellectual engagement, essentially, with someone who differs. But I think what's happened is, you know, they end up on different parts of campus: they usually pick people who are particularly controversial, who want to speak to their base--
Russ Roberts: crazier choices. Yeah.
Charlan Nemeth: And so, security becomes a very serious issue on an open campus. And you get a lot of people who are drawn to these occasions in part looking for a fight, essentially. And I think that turns this whole thing into a whole different level of problem. I do think, though--at a seemingly less dramatic level--what I get concerned about is that I watch even faculty who have spent their whole lives thinking about fairness and equality and inclusion and rights and everything: what bothers me is that the speech is good but the actions are lacking. And what I watch is a kind of deafening silence at wrongdoing. Or an unwillingness to stand up and to pay that price. And understandably--as we're talking about, they're concerned that they're going to see each other. There are long-range relationships. Academics can have very long memories, you know, if they feel slighted or whatever. So, it's all of those human reasons. But I don't see them at all immune from this silence and lack of authenticity, even about issues that they at least say matter to them. Because it takes more than speech, I think. And that's where the dissent--I even winced at the title, by the way, because I wanted the title to dignify dissent, with courage as the image. And I thought 'troublemaker' is the pejorative way in which people think about it. But the publisher was right, in a way: the term 'troublemaker' has actually had an interesting kind of appeal, where people even identify with it. Just two days ago I saw a posting from a guy who was given my book, saying that it must have been his autobiography. He was at some Hilton worldwide conference or whatever; and he's standing there posing with it against his chest, proud of the title, in a way. And so I found myself thinking I've learned a lot from the interviews and reactions of people, because the word has such different meanings to people. So, ergo. Yeah.
Russ Roberts: Well, I'm thinking about--it's funny you mention that. I hadn't thought about it till just now: the fabulous, just incredibly great ad that Apple had early, early on, called 'Think Different.' And it got a lot of attention because it's grammatically incorrect--probably, maybe, I don't know; there's a certain argument for 'think different' instead of 'think differently.' But what that ad did--I think one of the opening lines might be 'Here's to the troublemakers'--it just highlighted dissenter after dissenter after dissenter. And, of course, some dissenters are wrong. They are wrong at the time; and they get brushed aside. And others make history. You mentioned Gandhi and Snowden in your book, and others we haven't had a chance to talk about. I also want to mention, just in passing, George Orwell, who had the courage to call out the Communists in the Spanish Civil War. And I encourage listeners to go back and listen to my interview with Christopher Hitchens on EconTalk about that; we'll put a link up to it. But these people--they were incredibly brave. And the world's a different place. And mostly a better place. And those nails that stick up and get hammered down pay a terrible, terrible price. But they maintain their integrity and they make a difference. Put a dent in the universe, as Steve Jobs would say.
Charlan Nemeth: Yeah. I mean, thank God for them. And, of course, Jobs obviously, you know, was not universally loved. But no one questions the impact; and in many ways, you know, people really admire that kind of independence, if you will, and that appreciation for the role of people who really stake out a different route or who challenge. By the way, I should also mention to you--before I forget--when you mentioned interviewing Christopher Hitchens: my son, who thinks that nothing I do is all that important, really sat up when he knew that I was going to be interviewed by you and that you had interviewed Christopher Hitchens, who is one of his, you know, kind of favorites of all time. So, you have made my family interaction actually a lot more positive.
Russ Roberts: Well, I'm happy to hear that. What I was going to say is that--we were talking about the admiration people have for Steve Jobs--you always admire the dissenter more after they're dead than people did when they were alive. They get romanticized tremendously in death. Which is interesting. But I'm glad we brought up Christopher Hitchens. And I'm happy about your son. But there's a story in your book which blew me away. I mean, literally, I just could not believe it to be true, about Mother Teresa. Could you tell that story, about her and Christopher Hitchens?
Charlan Nemeth: Yeah. Well, I just--I read about it, actually. You know. And basically, because what struck me is that, you know, the Devil's Advocate is a device--you know, a mechanism that originated in the Roman Catholic Church. And it's used with someone being considered for Sainthood. And so, you know, you go through beatification and then sainthood. And, you know, kind of like my joke about it is that you don't want to find out after the fact--the person is a saint, and then you find out he was a pervert or something. You know. So, what they had for centuries is this mechanism. And that's what the Devil's Advocate--
Russ Roberts: It's an actual person. Appointed.
Charlan Nemeth: Yeah. They have--you know, I think it's--I don't sit in the Vatican, needless to say. But, where the process is to make sure that someone takes a Devil's Advocate position, and my understanding is that they go to great trouble to find out everything wrong with the decision to put this person up for Sainthood. And they've always used it as a mechanism--much like the devil's advocate has come to be known--which is essentially kind of role-playing. Namely, someone tries to dig up all the dirt and to present it. So, at the informational level, it's seeing the negatives of this position. But what I was reading--I'd have to look up my source, though--now, is that, at the time of Mother Teresa, Christopher Hitchens was apparently one of her great critics. Because, if I have this correct, he viewed her as less saintly, mainly because she was trying to convert people to Catholicism. And so he viewed her more in terms of someone who was proselytizing, and therefore had an agenda in addition to or other than simply caring for the poor. And so he, from a purist point of view, or at least from his point of view, saw her as kind of fraudulent, if you will. I mean, it actually took on that kind of a negativity. So, what I found kind of interesting when I read it, is apparently the Church had invited him. I don't know the details of exactly how much he spoke or whatever. But, according to the report, he was asked to essentially argue his position: Namely that she should not be granted Sainthood. Now, I found that interesting, because I thought, 'You know, people have been looking at these contrivances--the Devil's Advocate--as a way of kind of acting as though they are considering the opposite.' Which I never believed; and our own studies show it's not a very good device. 
And I thought, 'Boy, if the Roman Catholic Church even after devising this and using it for so long is actually bringing in a true dissenter, whose opinions now they are going to listen to, rather than someone within who is role-playing a dissenter,' I thought, 'Boy, that's promise for the future.' And that underscoring of the point I want to make in the book. So, that's really why I used it--
Russ Roberts: Yeah. It's juicy--
Charlan Nemeth: [?] So, of all the groups you'd think would not bring in a real dissenter--you know, the Roman Church has never been one for wanting to hear heresy. You know.
Russ Roberts: [?] Yeah. That's just such a great story. I hope it's true. I'm tempted to say I'm sure it's true, and then you might have to backtrack. But, I'm going to leave that alone. We're going to look for a link to that story and find some source for it.
Charlan Nemeth: Well, you've got me doing it, too, because I know when I first looked it up--because this book has been in various forms over quite a period of time--I know I looked it up at the time. And I remember reading about it because it was such a great--I didn't treat it as a proof of concept. But more as something consistent with it.
Russ Roberts: Yeah. Yeah; yeah.
Charlan Nemeth: But I know that I would not have used it--I know I researched and it was a credible source. What I can't tell you is exactly what the source was now, unless I go back and revisit it. But I actually will do that.
Russ Roberts: Yeah; I will, too.
Russ Roberts: Let's close with the replication crisis in Psychology. Which, you know, your work, formal work--we didn't talk much about it. It came up in passing, indirectly, of course. But you've done many, many studies that you are drawing on for these conclusions. And, unfortunately, almost by the nature of the study, they are small groups. Because you are interested in small-group decision-making. And many studies in Social Psychology have recently struggled to replicate when redone with larger groups. It's a little bit of--for me--it goes way beyond Psychology, of course. But it started in Psychology with the work of Brian Nosek, who has been a guest on the program. And I know it's a very controversial theme in Psychology today. A lot of people have pushed back against Nosek, saying, you know, 'We've gone too far.' It's always easy to say, 'I don't know if these replications are done well. Maybe it's the original studies that are right and the replications are wrong.' What are your thoughts on that? And has it caused you at all to revisit some of your work and wonder about its reliability?
Charlan Nemeth: Yeah. I guess the short answer is, Yes. I've thought about it some. I haven't tracked all the minutiae lately of attempts to try to correct it. And most of them don't strike me as going to solve the problem. Let me start with kind of what I believe, in a way, which is that I used to struggle, when I was very young, with the fact that I could spend a lifetime doing one study. And, essentially, that--you know, whenever you do it--
Russ Roberts: At least--
Charlan Nemeth: you certainly say--well, yeah, yeah. Like, what if I had older people rather than younger people? What if I had people in China as opposed to the United States? You know. What if the task that they did in order to show performance had been a different type of task? Okay? Now, just in that statement, I could set up about 6 different studies. Okay? And you could spend a lifetime trying to find which variables matter and which ones are boundary conditions. And even then, the problem is, that if you are going to replicate exactly, and you are going to change one variable, it means you have to keep everything else constant. And you never know if you changed any one of those things, or even if the zeitgeist had changed--you know, in terms of the meaning of that instruction. Namely, you'd go crazy. The bottom line is that, without going into all the logic of that thinking, I had a sense in which, you know, I could have ended up in an asylum; or a sense in which you could spend a lifetime ending up essentially with every variant on the theme of a single study, and you still don't have a full answer. And I thought, 'This is ridiculous.' And, essentially, what I think I came to terms with is that--and I really differ with some of my colleagues who really think that a study, since it has data, and since they've done careful statistics, becomes a fact. And I used to argue with colleagues about this. Because, particularly when I do testimony, like in legal cases--like, when I did some consulting back when--colleagues of mine would go in and talk about studies as if they were facts laid in stone. And I knew that wasn't true. Because, to some extent, I could choreograph almost any study and have gotten a different finding--a different variant on it. You know--you can shape, you know, what they are doing, what they think their task is, who they--
Russ Roberts: throw out--
Charlan Nemeth: if you want to--
Russ Roberts: if you want to get results--
Charlan Nemeth: [?] you can do that. Okay? And I realized that it was very hard to do well. Social psychology is very hard to do well. In part there's still an element of an art, in which you have to put together and integrate a lot of different studies. In many ways, I think for those of us, at least as we try to be responsible, we know that some studies should be believed more than others. Either because we know more about the person who did it--or, you start--the devil is always in the details. You start kind of looking at how did they exactly select the subjects? Exactly what was the task? How did they use that term? Because, they use the term and you look at the operational definition of it, and 'cohesion' means they only checked whether or not they liked each other. Okay. And yet, they are spinning it as though the word 'cohesion' has many other meanings; but that isn't what they studied. I mean, I could go into--believe me, I know, probably, you know, at least as well as most, all of the failings that you could come to. So, I know how hard it is to do well. I know how easy it is to criticize each study. And I'm very aware of that. So, I have an appreciation for studies that are well done. Having said that: I think there's partly a bit of a crisis, because what has emerged--which used to be very, very rare and known only by rumor--is people who kind of, if you will, coax their data. I say that to distinguish it from pure, outright manipulation or fraud--you know, putting in data that wasn't true. There are many ways in which, you know, you can stop collecting the data short of what you would have otherwise because you've got the findings you want; or, on the other side, you add to it thinking you could juice up the confidence level of your significant difference. You know, there's all kinds of ways in which it's really hard to legislate ethics. 
And for me, increasingly, when you think about that ocean being deep as you go farther from the shore. For me, so many things in life to me come down to values, come down to ethics, come down to whether or not someone is honest and whether I can trust them. You know. And I think that's true even in the profession: That you count on those norms as a way of guiding behavior and giving you confidence in what you read and see. And, those are a little bit--I don't think it's on a large scale. Don't get me wrong. I think the ethics are still fundamentally there in the field. But you also have, now, I think, a somewhat scarier development. And that's that I think a lot of academics are becoming like mini CEOs [Chief Executive Officers].
Russ Roberts: Yep. A lot of money at stake.
Charlan Nemeth: And there are--yeah. And I know people, and they get grants; they are really interested in ego, and so--
Russ Roberts: [?]--
Charlan Nemeth: you find that people appropriate ideas. They'll re-bottle it and give it a new name. And then they cite themselves as though they are the originator of the idea. People, many times--and particularly when you look at popular writers--will appropriate even the phrasing that, like, one of us might use in an academic article. And really not cite it. I mean, some are careful about it. But some are not. And, because people don't pay a lot of attention to who the originator of the ideas is. They are interested in whether the idea sounds interesting. And then what happens is that people profit from essentially being a little slippery about that. But even within academe, what you find is that sometimes--which I really object to--graduate students won't even collect their own data. It's like, you know, the professor has 20 people--I never used to have more than 5, which is full time for me--and each of those people would have a study, and they'd have, like, undergrads collecting data. But you can't have that. They don't know what they're doing. They have no idea of the rigor involved. You can't all of a sudden just change a word in the instructions because you feel like it and you think you are--I mean, the whole thing is ridiculous. And so, nonetheless, what I'm getting at, is that when you find that the benefits in academe often have to do with the quantity of the publications, not the quality, or they have to do with how much buzz or notoriety it creates, you get all kinds of incentives to essentially kind of rush to seeing what differences look significant after the fact, without carefully looking at whether the design has been really thought through--whether it's been vetted; whether, if it didn't work out, you can track what happened. And where you use it as a learning experience, not just a confirmation experience. 
All of those things we took for granted in terms of diligent and vigilant studies--which sometimes took us 3 years to do one, by the way, if you do group stuff, okay?--the cost/benefit analysis for many people is not to go that route, but to do surveys. And to treat the fact that they think they were creative as being equivalent to whether they were creative. And I'm here to tell you that almost all the studies show that whether people think they're creative and what they actually produce are almost orthogonal. You know, I mean, you can't trust the perception, because they confuse morale and feeling good and all of that with whether or not they produce anything. What I'm getting at, though, is that there are many kind of layers to this. But I do think that the incentives--which used to be more in the Business School than in Arts and Sciences--are, if you will, kind of eroding things throughout. And the draw of--you know, fame, and money, and all of those consequences, which are there throughout society, have permeated it more at least than I remember seeing it many years ago.