Intro. [Recording date: August 17, 2022.]
Russ Roberts: Today is August 17th, 2022 and my guest is journalist and author, David McRaney. He hosts the podcast, You Are Not So Smart. His latest book and topic for today's conversation is How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. David, welcome to EconTalk.
David McRaney: Wow. Thank you so much for having me.
Russ Roberts: Underlying this book is the simple but often-unappreciated idea that we are not fully aware of what we know, what we don't know, and what motivates us about what we think we know. How did you get interested in these ideas?
David McRaney: The entry point of this--I think there's a lot of things. One was growing up in the Deep South, but being part of a generation that had so much media coming at you that you started to feel like: Hmm, there's another world out there. We were kinda, sorta finding the others through Saturday Night Live, and MTV [Music Television], and Liquid Television, and Beavis and Butthead. It was clear there were other voices, other people living in other places thinking different things. That was part of it. And, just having the real world on your television and then walking around in a space that was full of a lot of prejudice and bigotry, and religious fundamentalism--that friction always fascinated me.
My dad is also a Vietnam vet who has a pretty strong conspiratorial streak in him. And, that's understandable sometimes and not understandable in others. Just as a real love-my-country, don't-trust-my-government, fear-my-government kind of mentality. All of those things played into all of that.
I think that when I finally found my way into a university setting, I had my first psychology professor just overwhelm me with ideas and perspectives and ways of seeing the world that played into all these things. Plugged into all of these things.
And, especially when it came to motivated reasoning, that was something that started to really make me feel like, 'Okay, that makes a lot of sense. That makes a lot of things fall into place for me.'
And, if you're not familiar with motivated reasoning, anybody listening to this, you are. You're very familiar with it. I promise.
And, someone is falling in love with someone--this is my favorite example of this. When someone is falling in love with someone and you ask them, like, 'What reasons? Why do you like this person?'
And they say, 'Oh, wow. They've introduced me to this interesting music. They have all these wild opinions. The way they talk, the way they walk, the way they cut their food.'
And, if that same person is breaking up with that same person and you ask them, 'Well, what reasons? Why are you doing that?' They'll say, 'Oh, they have these stupid opinions and the music they'll choose is bad.' And, 'The way they talk. The way they walk. The way they cut their food.'
So, the reasons for become reasons against, when the motivation for searching for reasons changes. And, that's sort of the essence of motivated reasoning.
And, I find that immensely fascinating and applicable to so much in life.
Russ Roberts: Yeah. We had Luca Dellanna on, talking about many of these issues: that it's discomforting to think that you might not understand why you believe what you do. Because, we think we know exactly why we believe what we do. We think, 'I've got the facts on my side. The theory behind this is iron-clad,' and, 'I've learned so many things that point me toward the truth.'
But, of course, a lot of times we say things that really have nothing to do with why we do what we do, believe what we do, say what we do. You have a story in the book. And, I think it's also in Robert Burton's book--that you referenced Burton in your book. The person--
David McRaney: Love Robert Burton.
Russ Roberts: The person who--she can't open one of her eyes, but she's got some--
David McRaney: That's an Eagleman story, David Eagleman.
Russ Roberts: Yeah. Tell that story, because it's a nice example of this--that, if you're going to come up with something, you're going to explain something. You have to. Just your brain is going to take over for you.
David McRaney: That's right. In neuroscience, psychology, they call these causal narratives.
And, the Eagleman example that I used--and great thanks to David Eagleman, who was helpful in putting together a lot of the neuroscience stuff in this book--he had a patient who suffered from something called anosognosia, which is the denial of disorder. I also spoke with several people who treat patients with anosognosia. It often manifests as one of your limbs not working properly or being paralyzed, and yet--for some reason that used to be mysterious and is much better understood now--the person will simply deny that anything is wrong with the arm.
And, then you ask, 'Well, then why can't you lift it?' And, they'll come up with an incredible array of rationalizations and justifications. 'I don't want to,' is the first way. But then they can go into saying, 'Oh, well, I slept strangely on it.' There's all these things they'll say.
And if you keep knocking it down, these justifications they bring forth, they'll sometimes go into territory like, 'That's not my arm. That's my mom's arm. She's playing a trick on me.'
With this patient I talk about in the book, she had something wrong with her vision. He was trying to sort out what exactly was wrong with her, or what she was experiencing. And, he asked her to close both of her eyes, but she could only close one. And, when he asked her what was going on, she believed--she thought--that she had closed both of her eyes.
And so, he asked her to look at a mirror and hold up a couple of fingers, two fingers, or hold up a random number of fingers. And, she was holding up two fingers, and he said, 'How many fingers are you holding up?' And, she said, 'Two.'
And, he said, 'If both of your eyes are closed, how are you seeing how many fingers you're holding up?'
And, in this case, what I thought was--unlike the arm raising thing, she was faced with undeniable evidence that she could see her own reflection. But she also believed in some other manifestation of her subjective reality that her eyes were closed.
So, in that case, what she did was nothing. She just stopped. She didn't respond at all for a little while. I thought of it in the book as almost like she had a blue screen. She had to reboot for a second. And, she didn't become confused or alarmed, but she also--after that was over, when he kept speaking with her, she didn't update her beliefs, either.
And, what was going on, as he explored the case further, was that she had a particular set of lesions and damage that had caused problems with her anterior cingulate cortex.
And this, to shrink down the full explanation of that from all the neurological stuff: This is a portion of the brain that's key for experiencing cognitive dissonance.
And, she just could not experience that phenomenon of cognitive dissonance when she had one input that said one thing to her and another input that said something else. When these two perceptions would lead to two different conclusions, instead of feeling the dissonance that most of us would feel--whenever two beliefs don't seem to fit together, two attitudes don't fit together, two interpretations don't fit together--usually what happens is you go through a period of discomfort, and you want to resolve that discomfort in the direction of one or the other.
She just didn't have this urge.
And, since she didn't have the urge, she simply allowed both of these things just to be true at the same time. And, she exhausted--the way he described it was the brain just reached a cognitive filibuster and exhausted itself and just started back over again and said, 'Okay. What were we talking about?'
And, she just skipped past the experience and returned to the before of the experience. It's a real insight into some of the mechanisms that take place whenever we feel that something--if we're reading information that doesn't comport with our views, or maybe we support a political candidate who does something naughty, at least we will feel that dissonance of like, 'Hmm. I have an attitude that's positive, and an attitude that's negative at the same time.' And, that urges us to try to interpret the evidence in the direction of one of those feelings.
And, if you don't have the wetware inside your skull to do that, you just engage in a cognitive filibuster and escape the situation by doing nothing.
Russ Roberts: Well, it's not the worst thing in the world for most of us because we have lots of contradictory things almost by definition. And, I think most of us, when confronted with--we'll talk a little bit about how we process new information and sometimes change our mind. But, most of the times we don't change our mind at all. We either put it down, that new piece of information. Just push it back and ignore it. Don't want to think about it. Sometimes we can convince ourselves it must not really be a fact, this new thing we were told. It's the bad people who are propagating this story about this person I like.
And, sometimes we just go on with life. We don't push it away, we just go, 'Well, there's two things. Maybe I'll figure out how they work together, but I can't be bothered right now. I got to pick up the kids, or do the dishes.' And, life is--
David McRaney: It really depends on the motivation, right?
Russ Roberts: Yeah.
David McRaney: Some things you feel strongly motivated: I need to resolve those.
And, for other things, you can let it go.
Like, that phrase, 'Change your mind,' is so fascinating to me, because oftentimes something does change. I think about, like--the political example is a great one, because people become very motivated. There's a group identity. There's a self-identity. There's a value. There are so many things wrapped up in a political concept or a political candidate. And, if you have a strong, positive emotion toward the candidate, or toward a party, or toward the people who are in the party or support the candidate like you do, and then they do something, or something happens that seems, 'Well, that's quite negative. This violates my other values. This makes me feel very negative emotions about what just occurred,' you feel that dissonance of, 'Well, I have a very strong positive emotion, but I also have a very strong negative emotion right now. So, which way do I go?'
And, there's an easy way out of it for a lot of people, which is, 'Well, I'll just interpret what just happened as actually being positive.'
Or, 'I can do something even more'--
Russ Roberts: It's even better--
David McRaney: 'I can do something even more problematic,' which is: I will interpret that: 'That was not actually something that happened. I will interpret that it didn't happen.' Or, 'The people doing that, those actually aren't people who are in my political party or who support my candidate. Those are actors of some kind or those are agents from another force.'
It can go the other way, too. You can have a very strong negative opinion toward a political candidate or a party. And, then they do something nice or something good.
And, it's very easy to interpret that as, 'Oh, well, they're just doing that for gain.' Or, 'They're just doing that to trick me.'
So, there are a lot of ways out of dissonance that feel like you're not changing your mind. But you did just update your priors and your understanding of the situation: you have a new set of beliefs and attitudes about this whole situation going forward. And, that has changed.
Russ Roberts: I will say one thing about that before we go on, which I think makes it a lot trickier, which is: I always find it amusing when somebody on Twitter, referring to something someone else has said on Twitter, writes: 'That's the most ridiculous thing--that's a horrible statement,' or 'That's not true,' or 'What an idiot.' And, I'm thinking, 'Do you really think people always write things on Twitter that they actually believe, rather than the things they think will advance their career or advance their number of followers?' I mean, don't you think people are sometimes strategic?
With politicians, we're pretty sensitized to that, but we still make that mistake, I think often. And, we say, 'Well, he doesn't really--why would he say that?' And, I'm thinking, 'Well, to get elected.' I don't know. It would be a normal reason. That's usually what politicians do.
But we want to believe that we're getting a window into their soul and we don't always think of them as behaving strategically. Especially when you don't like them, it's really easy to say, 'Why would he say that? He doesn't really believe that. Does he?' And, the answer is, 'Of course he doesn't. He's a politician. He's trying to'--it's like, 'Why would you think he would be transparent on Twitter or in the press conference? That's not the goal. That's not what the pre[?]--it's theater! Don't you get it?' We want to have, like, the suspension of disbelief: 'No, this is actually what he's thinking.' No, it's not. It's what somebody told him to say.
David McRaney: It's so easy. It's so easy to slip into that frame, because on social media, you feel like you are an individual looking at the information: I'm one person looking at the tweet, one person looking at the Facebook post, one person watching the TikTok. But you are actually part of a gigantic audience. And the person on the other side of that equation knows that they're speaking to an audience. So, they speak the way a person would speak to an audience. And, status concerns, reputation concerns, making sure you don't say things that are going to completely torpedo whatever it is you are up to--your image, your social support network--those are all major concerns once you feel like you're speaking in front of an audience. And that's where tweets come from.
But, on the other side of it, it's easy to think, 'Oh, yeah. I'm just sitting across from this person having coffee.' That's not the dynamic.
Russ Roberts: Yeah. It's what we call a performance. When an actor says something in character, we don't say, 'Boy, that's really mean. I always thought he was a really nice guy. Why is he saying that horrible thing?' Because it's in the script. Okay.
And of course, I'm different, David. I don't know about you. But, when I'm on Twitter--
David McRaney: Of course. I'm nothing like any of these people.
Russ Roberts: No, I'm totally--
David McRaney: When I go to the grocery store, I'm like, 'These people in this grocery store, why do they act this way?' As if, of course, I'm not in the grocery store, doing the same thing. Yes. I'm subject to the third-person effect all the time, as well. We all feel that way on Twitter: 'I'm the only rational person on here.' I'm the only person who has gone down into the bowels of my castle and looked at all of my scrolls and thought to myself, 'Aha, this is what I think about gun control.' We all think we're that person.
Russ Roberts: Yeah. Well, what I always find interesting--and you don't have to volunteer anything--but I always find it interesting that I spend more time than most people thinking about, say, confirmation bias and motivated reasoning. I think I'm at least in the top tenth of 1%. I'm certainly in the top 1% of people who think about it.
David McRaney: Everything I read seems to confirm that.
Russ Roberts: Yeah, exactly. And here, and yet, every once in a while I do notice that I fall victim to it. And when I do, I realize, 'Wow, I think about it all the time and it sometimes gets me. It must get other people even more often.'
David McRaney: Yeah: If it gets me this often, then it must get everybody else so much more.
Russ Roberts: It's bad, it's bad. But, I think that's true. I do think that's true, by the way. I do think that if you think about it a lot, you are maybe a little more inoculated against the problem.
David McRaney: I think it'd be like a pilot's checklist. A pilot knows that you shouldn't trust yourself. A pilot knows that: Even though I've got a thousand hours in the sky--obviously, more than that--I've got so much time in the sky. I went through all this training. I went through all these situations. I've done this startup sequence so many times I could do it without even looking. I could fly this plane, at least take it off and get it going in the right direction without opening my eyes. They still have a checklist that they go through because they don't trust the fact that we are--first of all, we're biological. We're a big lump of atoms and molecules. Things can go wrong there.
But also, there's just simply systems of thinking, reasoning, and decision-making that it's best to inoculate yourself against or have some sort of safety in place. I advocate for all of that in all situations.
Russ Roberts: Hear, hear.
Russ Roberts: Now, a chunk of the book is about people's--I would call them facts. What happened on 9/11? Is the earth flat? And, I'm pretty sure the answer is no, but there are people who think the answer is yes. There are people who think 9/11 was a conspiracy by the U.S. government. And, you tell stories of engaging with people on this--just the question, just the facts. And, what's interesting to me about that is--I mean, you point out in the very opening of the book that sharing facts with people is remarkably unhelpful in getting them to change their mind.
It takes a massive amount--a cornucopia--of facts to get people to think, 'Oh, maybe I'm not right about this.'
And I think--I realized a long, long time ago, partly from being the host of EconTalk, that I used to think I had the facts, and that the other side, in terms of economic policy, was just wrong. They don't know. What they believe isn't true.
Fortunately for my own wellbeing, I think, I've come to believe that's not the case. They have their facts and I have mine. And they don't always overlap, which is the problem. I've cherry-picked the facts that fit my worldview and they've done the same. And, now what?
And I think--you know, one of the questions in economic policy, I think, is: these two groups never confront that. They just keep producing more facts. They do one more study that shows the minimum wage does this or that to the wellbeing of poor people.
And, I think at some--I think most of us believe deep down that it's facts. But it's not facts. It's tribe or identity more than so many other things that push us to believe what we believe, I think.
David McRaney: Yeah. This is something that comes up so often. And, scientists are not immune to this and certainly journalists are not. No academic is immune to this.
It used to be called the 'information deficit hypothesis' in science communication, and governments have also fallen prey to it. It's an old idea--you see it in the 19th-century rationalist philosophers, in the Founding Fathers of the United States. The Founding Fathers were like, 'All we have to do is build a bunch of public libraries, and then everyone will have access to the same information. Democracy will finally reach this utopian dream.'
The 19th-century rationalist philosophers were like, 'All we need is public education. Once all the people working on the farms and in the fields and the factories have public education, everyone will have access to all the same facts,' and then the rationalist utopia of democracy.
Then even the cyberpunks--at first it was just computers, and then the Internet comes along. At first, it was Encarta on CD-ROM [Compact Disc, Read-Only Memory]. And, then it was, 'No, no, the Internet in total. You'll be able to email professors.'
Timothy Leary had this thing called Power to the Pupil, which is his idea that once every single person in the world can get rid of all the gatekeepers and they have total power over what goes into their eyeballs, rationalist, democratic utopia will prevail.
This is the idea all the time, because--you know, I still see this, too. I'm not starting a beef with Neil deGrasse Tyson, but I've seen him put this forth, too--the whole idea of rationality, of this beautiful utopian democracy where everybody has a degree in science: 'We're all STEM [science, technology, engineering, mathematics] people. And, if we do that, we'll have Star Trek: The Next Generation in a generation.' That seems to be the view.
What's always missing from that is that people don't work that way. It's the idea that people believe or don't believe things purely based on how much ignorance they have about the issue. And, if you just give them all the facts, then they'll see things exactly the way you see them. Not only will they be knowledgeable about the subject, but their attitudes will change. Their values will change. The policies they support will change. The way they behave will change. And, strangely, the way it will change will be to become just like you. They'll see the world just like you do.
Thankfully, we have psychology and neuroscience and other social sciences that have done an incredible amount of research over the last 100 years. And, the evidence just keeps piling up that, well, that doesn't seem to be the case.
The smarter you are, the more educated you are, the more connected you are, the more access you have to search engines, the better you become at rationalizing and justifying whatever you already felt, valued, believed, and supported.
There's a long string of mechanisms that make this so.
So, I think the easiest one to explain is that when it comes to reasoning, in a psychological sense, it's not big-R Reason--the propositions and logic you would learn in a philosophy or debate class. Reasoning, psychologically speaking, is just coming up with reasons for what you think, feel, and believe.
And, those reasons are motivated by a desire--a drive--to be considered trustworthy by your peers.
So, not only are you driven to come up with reasons for what you think, feel, and believe; you want them to be plausible. And, plausible, in this sense, means your most trusted peers, your social network, would say, 'Oh, yeah. That's a reasonable way to see that.'
And, in a different environment--if you're in a more prototypical, proto-human environment, if you're in a tent in the woods and you hear a sound and you think, 'Oh, that made me feel a negative emotion,' you would be motivated to go search for information to confirm that your anxiety is justified. And, that's good. I mean, you should make sure: Is that a bear? And, you go looking; you might get false positives and false negatives. But you search with your flashlight. You're trying to see if your anxiety is justified, because there is this sense that you're going to argue this to your peers: 'Hey, there's a bear in the woods, and I found some evidence.'
Because, they're going to ask, like, 'What do you mean?' 'Well, I heard a sound.' 'Not good enough.' So, you need a little bit more evidence.
You take all of that and put it online. Something happens in the world that gives you a negative emotion. Some anxiety starts to come up. It could be for really good reasons, but it could also be because you have some sort of prejudice or some sort of political bias.
So, then you do that thing. You go, 'Hmm. Let me search for evidence that justifies the anxiety that I'm feeling.'
And, when you do that, online, you absolutely will find something that suggests your anxiety was justified.
And, you also might find people talking about that. And, you might end up wanting to talk with them about that. You might end up spending a lot of time talking with them.
And slowly you can radicalize yourself. You can cultivate yourself into it: you start snipping away your connections to people who don't share the attitudes being expressed in that community, and you start strengthening the connections you do have with those who do.
And, now all of a sudden, you're in a group. You're in a community. And, the great sociologist Brooke Harrington told me that, if there was an E=mc2 [energy equals mass times the square of the speed of light, Einstein's Equation] of Social Science, it would be: the fear of social death is greater than the fear of physical death. And, if your reputation is on the line, if the ship is going down, you'll put your reputation in the lifeboat and you'll let your body go to the bottom of the ocean.
We saw that with a lot of reactions to COVID [coronavirus disease]. As soon as the issue became politicized, as soon as it became a signal--a badge of loyalty or a mark of shame to wear a mask or to get vaccinated--as soon as it became an issue of 'Will my trusted peers think poorly if I do this thing or think this thing or express this feeling or attitude or belief,' people were willing to go to their deathbed over something that was previously just neutral. As neutral as talking about volcanoes.
And, these all come together into one thing and we can talk about it more deeply. It is that there's an idea in psychology called elaboration. And, what it comes down to is, when you're trying to convince somebody to change their mind about something or see a different perspective, you can't start at the end of the process.
It's a process that can be invisible to yourself. You went through all of this reasoning to get to your conclusion. Well, they went through a whole reasoning process to get to theirs--that whole search-the-woods-for-the-bear thing has happened.
And, you've got your conclusions that feel like facts, but what they really are, are justifications and rationalizations that you cherry-picked out of all the possible evidence that you could find on the issue. And, when you approach a person with, 'Shouldn't this be obvious to you? Shouldn't I just give you this YouTube link?' and, you go, 'Hey, why don't you just watch this video and feel the way I feel about the issue?' of course they're not going to do it.
Has that ever happened to you? Has someone ever said, 'I don't know. Watch this flat-earth video and see what you think'? Can you imagine watching a YouTube video and going, 'Oh, the earth is flat'? No.
Russ Roberts: Or not.
David McRaney: Or not. So, you can't copy and paste your reason into another person. You have to hold space with the other person to engage in a reasoning process.
And, that's what I'm talking about when it comes to this sort of thing in the book. That's why facts can work in a good-faith environment where everybody's playing by the same rules. But, when you're just walking up to another person, or you're talking to a person online, it's very difficult to establish that kind of good-faith environment where you can just put out raw facts and vet them like you're doing research papers with each other. You need to employ a different technique that takes into account all these propensities and compulsions to justify, rationalize, and all these sorts of things.
Russ Roberts: Let me take us on a little bit of a--what might look like a digression. I don't think it is, actually. I think it's a really interesting application of what you're saying. You're making the point that I've gone through this experience, this intellectual--we'll call it cognitive journey--and I've come to this destination. And, as a result, I have an ideology or I have a view of the world on a particular issue.
And, at that point, because I'm already in Rome, it seems easy for me to say, 'Hey, join me in Rome. It's really nice here.' And, you're thinking, 'But, I'm in Warsaw. Rome? What's that? I don't like Rome. I hear bad things about Rome.' And, it's so obvious to me that Rome is great that, like you said, I just say, 'You just watch this video. Read this essay. Read this book, even.'
And, I once was at a meeting--I've probably told this story, maybe not on EconTalk before--but there was a group of people who wanted to educate people about the power of markets and capitalism. And, they were funders of this project. And I was there because--I don't know why I was there. I wasn't one of the funders, for sure. But, finally, we were batting around a bunch of different ideas. And, finally somebody said, 'I got it. I got it.' This person was so excited, and he said, 'We just need a book that explains how capitalism works.'
And, I just--I didn't know what to say for a while; they were all excited. And I said, 'I don't mean to discourage anybody, but, you know, there have already been a few books. Many of them very good. Maybe 100. One hundred and one wouldn't be a bad thing.' But, in their mind, if you just had the right arguments, I'd give them to you and you'd read them and go, 'I have been wrong my whole life. I never realized this.'
Russ Roberts: I suggested, actually, they'd be better off making a movie, because a movie with an emotional kick might do a much better job. I've often suggested maybe we should have an anthem. I've tried to write a couple myself at various times and ways.
But, the idea that someone would just need to explain it better--there is a certain type of person, by the way, who has been on that journey to Rome. And, that journey to Rome was filled with logical arguments--cherry-picked, of course. And, they've decided they'll just give you the last step: 'Look at this postcard of Rome. Isn't it beautiful? I mean, wouldn't you want to live there?' Basically.
David McRaney: That's a great way to put it. That's a great way to put it. Facts can be like a postcard of Rome.
Russ Roberts: And, the same applies to things that are not ideological, that are purely the result of life experience.
So, you know, I've had a bunch of life experiences. You've had your own. And, I come to you and I say, 'You know, you're making a terrible mistake. You shouldn't be a journalist. It's a terrible career.' Now, it might be, and it might even be a terrible career for you.
And, let's say it was for me, and we have a lot of things in common. And so, at some point I pivoted away from journalism. And, I'm trying to say to you, 'David, let me save you the trouble. In 10 years you're going to say: I should have done this sooner.' And, it could be: quit law school. It could be: quit being a lawyer. It could be: go to law school. Any life lesson. And even more importantly: 'You should talk to your parents more. Now that I'm 67, I realize I didn't talk enough to my dad. And, I want you to know that, so I'm going to tell you.' What are the odds that goes into your brain and causes you to change your behavior?
And, I think about this because--you've written a book, more than one--and I've written more than one. Sometimes people say, 'Oh, there's nothing new in there.' Well, of course there isn't. There's rarely anything new under the sun. Part of what makes a book useful isn't that it's informative. It's that it helps you understand why these things are important, and to remember them.
And so often we don't remember them. We know what's good. We know what's right. We know what we should be doing. The challenge isn't to know what's on that list--exercise more, eat fewer desserts. The hard part is implementing the advice.
And I wonder--in many ways, this showing up at the last minute with the video, with the YouTube video, in some ways is like coming to the person and saying, 'You know, it's a really important thing to have spent time with your parents when they get older.' Or whatever is this life lesson that you've accumulated. It may be the case that the wisdom of the elderly--which I sometimes imagine I might have a little of--is absolutely useless, in the same way that YouTube video is. I hope I'm wrong. What do you think?
David McRaney: You're on the right path here. It's complicated, like anything else is, and nuanced. There are two things that make those appeals not work well and there are ways around them.
The first one is just something called reactance. It's a psychological term. It's something that was fully developed, understood, fleshed out, and studied in the therapeutic domain. Most of that work was done with people who came to therapists wanting help with alcoholism; and it's expanded into other things, especially drug abuse and all sorts of things.
Basically, anything where a person goes to a therapist where they aren't necessarily looking to introspect as much as they are: 'I have a very specific behavior I would like to extinguish. Can you help me with that?'
Often what would happen, in the earlier days, is they would come to the therapist and the therapist would say, 'Well, you know what your real problem is. You should be doing this.' Or, 'I don't know if you've noticed, but you don't do this very much. You should do this.' Or they'd say, 'I get clients in here like this all the time and here's what I think you ought to do based off of what they--.' All that feels pretty good. In psychology they now call that the righting reflex. And, we've all felt that, where someone is saying something and you're like, 'Oh, I have the advice for them. I know what to tell that person.'
But, you have also experienced this other thing that happens, and it seems to be universal to human beings across all cultures. It's just a feature of the brain we're issued at birth, a feature of human thinking, rationality, psychology. Human brains do this. It's called reactance. In the psychological parlance, they'll say something along the lines of: you feel motivationally aroused to remove the influence of the attitude object, which just means: 'You made me feel a feeling I don't like and I want it to go away. So, I'm going to push you away,' or 'I'm going to disengage.'
What is the feeling that's causing the motivational arousal? It's the sense that your agency is under threat--your autonomy is under threat. It's the 'Unhand me, you fools' feeling. We've all felt this. If you've ever been a teenager or you've ever spoken to a teenager, you know what I'm talking about. You could be an adult telling a teenager--you could even be another teenager telling a teenager--'Hey, you shouldn't do that.' But it's especially strong with adults: 'You should eat this. You shouldn't do this. You should study more.' This is good advice that the person, when they're 35, will go, 'Man, my parents were right about that.'
But, in that moment it's just the fact that you're saying, 'I have a thing in my head that should be in your head and I want it to be in your head.'
And, oddly enough, it's the want that creates the reactance. The person feels that you have approached them in some way and said, 'I want you to think, feel, believe, or act in a certain way that you're not doing right now,' and it feels coercive. It feels like they've come at you and they're threatening you. They've got a knife in their hand, and they're saying, 'Walk this way.' That's what it feels like.
We just, at a visceral level, will react by saying 'no thanks' to that, and we'll push against it.
And, often what happens in a therapeutic framework, when a person pushes against it, then you react. Like, you will have reactance by, 'Oh, this person is pushing against me and trying to get me to stop telling them what to do'--
Russ Roberts: 'Giving them this great advice? You're crazy.'
David McRaney: Yes. So, then you're like, I push, you push--the person feels the push, which is, 'I want you to act a certain way.' Basically, what you're saying is 'I have a goal and I'm not even concerned with what your goal is. This is the goal that I want you to go toward.' Then they say, 'Oh yeah, well, no. How about I don't do that? I want you to stop talking to me that way.' Well, now you feel reactance, because you're like, 'Oh, you're telling me how to talk to you? How about I double down?'
And, then you enter into a negative--you get this horrible feedback loop. What ends up happening, and I hate this phrase, which is, 'Let's agree to disagree.' Like, of course, we already agreed to disagree. That's how come we're having an argument, or a discussion, or a debate, or we're deliberating.
So, this happened so often in therapeutic frameworks that they said, 'We should really develop a way to stop doing that.' Because what started happening was people would come in wanting to extinguish a behavior and then they would leave therapy more likely to engage in the behavior than if they had never seen a therapist. It went something like this: they had arguments for and arguments against, so they were in a state of ambivalence when they arrived, but they wanted a little more energy on the side of 'Let's not do the thing anymore.' But, because of the pushback, they counter-argued with the therapist. They generated counter-arguments inside themselves that put more weight on the side of continuing to do the thing. So, they walked away with more arguments for than against, compared to when they walked in.
This is also what happens when we have a conversation with someone where we disagree on an issue. Very often, if we create that feedback loop, they will walk away with more arguments in their mind than they had coming in to continue believing or feeling in the way they had before we had the conversation.
So, oddly in this--believe it or not, I am actually going to answer your question now.
You were talking about giving advice--the advice from the elderly, or at least advice from a position of wisdom. And, if it generates reactance, then it doesn't land. In fact, it would cause the person to say, 'Ah, whatever.' It won't go into their mind at that moment. It may come around later when three or four other people tell them the same thing and they notice a pattern.
You're right, though, that if you offer a person a movie or a TV show, oftentimes it's much more effective, because there's a thing called narrative transport in psychology: when a person gets completely immersed in a story, they basically forget to counter-argue. They feel as if it were an experience they are actually having themselves. The vicariousness of it kind of washes away, and it feels like, 'Oh, I learned something here about life. I have an empathy for a perspective that I've never experienced personally.' And there's no counter-arguing that takes place. So, whatever arguments for or against an issue are generated by watching the film, they get to stand in isolation and not get counterbalanced by something else after the end of the program.
So, that's one thing to consider, the reactance. Even when you're giving good advice, when you're helping people or trying to help them see something and you really are an expert on the topic, you can generate reactance by saying, 'I want you to feel this way. I want you to see that you're wrong and I want you to believe differently.' Or, 'There are other people out there who want you to do that.' If the person feels that removal of agency or autonomy, that's one thing that will cause it.
The other thing is to make a person feel that they should be ashamed of their current position. You may not mean to come across this way, but if what you say can be interpreted as, 'You should be ashamed of what you believe. You should be ashamed of what you feel. You should be ashamed of that value or that intent to behave'--even if, hand at the side of my mouth, they really should be ashamed--if you communicate it that way, then you're going to activate the person's fear of ostracism. And, like we said, there's nothing more fearful for a social primate than the suggestion that they may be ostracized. So, if you tell them they ought to be ashamed for feeling that way, it's going to cause them to feel very viscerally upset and angry, and they're going to push away from the conversation.
So, those are two things to avoid.
And, I have one other point--we can talk about it for a second--one other thing you can do, called moral reframing, which is something that often gets lost in all of this. But, that's a whole other issue we can get into. I've seen this recently with a lot of these political ads coming across social media for places that I don't live. I saw one today set in, like, the Midwest, and it had these two people trying to survive in the desert. One of them is doing everything right because he's a cowboy and he understands how to survive in the wilderness. The other one is a Senator who has no idea how to survive in the wilderness. But, the cowboy dies on Day 2 from a heart attack, because he doesn't have good healthcare; and the Senator lives, because he's got great healthcare.
And the whole idea of the ad is: See, Senators have the healthcare that you don't get to have. And, even though you're a good, rugged individual who can survive in the wilderness, they'll out-survive you because they're taking away the healthcare you need.
That seems like a great political ad because it focuses on the identity of the individual that you're approaching. But, it is an awful political ad based off of everything that I've learned in this domain, because it only feels like a great political ad to people on the Left--to liberals. It feels like a great ad to people who already hold the values that make that scenario infuriating. It's the inability to see that you can't make an argument from your moral framework to a person who is in a different moral framework and expect it to land. You have to actually couch the argument in that person's moral framework and their values. It's a very difficult thing to do.
And so, everything that I've discussed in this long-winded answer falls under a giant category called cognitive empathy. Of all things in the book, I use the dress to explain how that works, under a framework called SURFPAD. It's a huge, complex idea, but I think it all plays into what we've been talking about previously, which is that sense of naive realism, where you just think: 'All people have to do is see the things that I've seen and they'll naturally agree with the things that I think.'
And what it shows is a complete lack of cognitive empathy--that other people come from completely different priors and experiences and social influences that affect the way they form their beliefs, but also the way they interpret evidence.
Russ Roberts: Well, I just want to put a plug in for Arnold Kling's book, The Three Languages of Politics, which does one version of this, or trying to get people to see how political disagreement is not over the facts, but rather on the underlying worldview. And, that explains why a seemingly nice person could have such a horrific perspective on politics. They are a nice person. They have a different lens that they're using. I think I have two interviews with Arnold on that book--
David McRaney: That's awesome--
Russ Roberts: Which I recommend.
David McRaney: I'm taking a note for myself.
Russ Roberts: Yeah, it's a tremendous book. I don't want to go into the framework right now because we have too many other things to talk about, but I encourage people to go listen to those if you missed them the first time around.
Russ Roberts: I want to just come back to one thing you just said, and then I want to turn to street epistemology, which I think is really interesting.
Russ Roberts: I think the autonomy point is so powerful, so undervalued. And the E=mc² of social science--say it again?
David McRaney: Oh: The fear of social death is greater than the fear of physical death.
Russ Roberts: Yeah, that's really powerful.
The second piece of it is that we cannot bear the feeling that we are controlled by others, and we will cut off our nose to spite our face. That expression doesn't really make sense but I think we all know what it means. We will adopt harmful things, like drinking or eating things that aren't good for us or indulging in various bad behavior because we're in charge. We're afraid that someone else is trying to run our lives.
The teenager is the best example of it. I think most parents are puzzled--I certainly was, at times--by how uneager our children are to take our wisdom. And we don't take into account the fact that it has nothing to do with the quality of the wisdom, and nothing to do with the teenager's assessment of whether that advice is wise. It's simply: 'You are not going to tell me what to do.' Which is such a powerful human impulse that--
David McRaney: Yeah, of course--
Russ Roberts: But, similarly: forget teenagers. When we're talking to another adult about a political phenomenon or a policy disagreement, a lot of the intellectual sparring that takes place, I think is about--I've underestimated it--a lot of it is about control. It's, like, 'You're not going to tell me what to think. I don't even care if you're right 100 times over. These are my views and I decide what I believe. I don't care about your facts. And, in particular, I'm not going to listen. My brain is going to switch off, if I think you're proselytizing.'
David McRaney: I see it all the time. I see stand-up comedians slipping into this. I see politicians slipping into this. I see it every day on the Internet. I see it among family members and friends. That moment when a person comes in white hot, barreling in, and saying--even if they're not coming in aggressively; I've seen a sort of pill-in-the-pie approach, hiding the medicine. But we're too savvy for it. This is too ancient a mechanism. You can't trick this thing. You know when you're being--because what it feels like is someone is pulling a weapon out and saying, 'Step into this room, please.' That's what it feels like.
And, they're like, 'Oh, okay. Well, I know you think that, but have you seen this? This is what you should think.' Or, 'That's wrong. You shouldn't think that.'
What I want to emphasize here is you can be very much correct. The facts can be on your side. You can be really trying to reduce actual harm in this world. You can have the moral high ground, and you can be dealing with a person whose intent, whose actions and behavior, whose political stance harms you. They may even hate you.
So, what I'm saying is you can be on the right side of all of this, however you want to define the word right--you can be on the correct side of all those things. And yet, if you generate reactance from the other person to what we're talking about, you will not be able to change their mind. You lose out.
And, it's a very difficult thing to offer a person the space and give them the respect that would avoid reactance when you are dealing with a person that you feel like doesn't deserve that treatment from you. And, I totally understand that.
But, there is a way out of it. And here's the easiest thing, a very teeny bit of advice. All you have to do is get out of the debate frame with the other person. Don't make this feel like, 'I need to win and you need to lose. I am right and you are wrong.' Just get out of that frame.
And, the easiest way to get out of that frame is to, first of all, say something along the lines of--instead of saying, 'I want to show you what you ought to think, feel, and believe,' you say, 'Hmmm. You seem to know a lot about this issue and you seem to care about it a lot. You seem to see that these problems are problems. I'm wondering, given what you know, I wonder how it is that--because I look at a lot of this stuff, too. I wonder why we disagree on this issue? It's really curious to me. I would love to talk to you a little bit more about that. I wonder if we could look at this issue and see what is it we disagree on here?'
What you want to do in that frame is give the other person a chance to feel like, instead of being face to face, you're going to go shoulder to shoulder, and we're going to--instead of looking at each other as obstacles, we're going to turn and face in the same direction and look at the problem at hand, the goal at hand, the issue at hand. And, we're going to collaborate now. We're going to work together and say: Well, you've got your side of things, and your views, and your experiences, I've got mine. I bet if we joined forces, we could get to an even deeper truth on this or higher truth or a solution that works well for both of us.
You don't even have to put it in those words. That's another thing we have an innate inclination for, which is, 'Oh wow, we get to snap together and work together on a problem.' You can frame things that way with just a slight change in approach and language, and you will escape the debate frame that leads to reactance; and it's much more fruitful.
Russ Roberts: I would just add--and I'm going to add a couple of asides and then we'll go to street epistemology--but I would just add that if you think you're doing that because it'll make you a more effective convincer of the other person, you won't be so effective. It would actually be better to open your mind to the possibility that the other person does have something to teach you--this kind of empathy. You'll have a much better relationship with your fellow human beings. You will probably learn something. But the catch is that some of your deepest-held truths may become a little more fragile.
So, it has strategic significance, certainly in parenting. I recommend it. But, in most of life, I think it's also good to actually believe that you could be shoulder to shoulder with someone as opposed to seeing it as a trick to get them to come over to your side.
David McRaney: Yeah. I feel you so strongly on that. That's why I spend way too much time, maybe, in the introduction--let me put it this way: I spend just the right amount of time in the introduction--trying to show we're not going to be talking about manipulation or coercion here. This is a completely different thing. It's about reframing the conversation space so that--what we're hoping to do here is give the other person a chance to see how the message at hand aligns with their way of seeing the world.
It's all called elaboration in psychology. And, I was worried, too. Like, I did not want to write a book that was How to Win Friends and Influence People, Part Two. I wasn't trying to create some sort of guide for brainwashing. I wanted this to be why is it that I can't seem to get people to--why am I getting into all these arguments? Why are these conversations crashing against the side of the mountain here? I can't seem to get things to work. Are we lost? Is it over? Are we going to go into two different realities from now on?
I was just deeply concerned with wanting to find out: Does science have anything to say about this? And, I was surprised to find, 'Oh wait, it has a lot to say about it.' And, that's why I always want to make the case for--or always suggest--'Yes, let's stay on ethical and moral grounds.' And, the weird thing about that is exactly what you just said. When you enter that different space for conversation: 'Oh, it turns out we might both be wrong. It turns out I might be wrong.'
And, that's absolutely the better frame to be in than the--because if you have a debate--the only person who wins a debate is the person who doesn't learn anything. It's the person that walks away and was like, 'Hah, I was right all along.'
Russ Roberts: The two asides I was going to mention: first, I have a friend who majored in sociology, and he is one of the most gifted STEM people I've ever known. And, when I found out he majored in sociology, I said, 'Well, why did you choose sociology?' He said, 'My parents ruined everything else.' And, I thought, 'Oh, that's that reactance thing.' Now, I've got a name for it. I'd understood it myself, for a bit.
And, the other phenomenon, which we're all familiar with, is: you have a book or a movie that you love and you keep telling the other person how great it is, and then they don't like it so much because you've taken away their autonomy in some dimension.
So, you try to be sort of lukewarm, 'I think you might like this.' And, then they don't read it. So, you're, like, going, 'No, no, no. You don't understand. This is one of the greatest books ever written. You're going to love it. You're going to love it. You're going to love it.' And, they go, 'Eh, it was all right, I guess.'
David McRaney: Yeah. There's a lot of reactance in that. Plus, you've also put them in that state of mind where, the whole time they're watching it, they're thinking, 'After I watch this, I have to tell that person what I thought about it.' So now they're building arguments for and against everything as they're watching. You stole their ability to reach that narrative-immersion space. You stole their ability to be completely washed away by the story, because part of their mind is still thinking about you the whole time.
Russ Roberts: Yeah. Problem.
Russ Roberts: Let's talk about street epistemology.
Russ Roberts: What is it?
David McRaney: The most surprising thing in this whole book--and there were a lot of surprises. I'd never intended for this book to be--I didn't think that this was going to be a selling point, some sort of marketing trick where I would tell people, 'I wrote a book about how minds change, and I changed my mind about how minds change by writing a book about how minds change.' I thought it's really nice to say all that stuff, but it's oddly true. The incepting point of this book was someone in a lecture came up to me and asked about their father who had slipped into a conspiracy theory and they said, 'What can I do about that?'
And, I told them, 'Nothing.' They said, 'How do I change his mind?' I said, 'You can't.' And, I really felt, the second I said it, that: I don't know enough about this to say something like that. I don't even know if I believe what I just said, but I know one thing I don't like this attitude I have about this issue. I should at least learn more about it.
And, if I was in that same situation today, I would actually be able to say, 'Oh, here's what you should do. Here's what you should say.' I no longer believe anyone is unreachable. I no longer believe anyone is unpersuadable. I think that a lot of the frustration we feel is the frustration you would feel if you were trying to reach the moon with a ladder: you don't get there, and you say, 'The moon is unreachable.' I think that's the kind of frustration we often have. We try these information-deficit models. We try these corrective mechanisms, and we suffer from our own motivated reasoning that we don't see, and naive realism, and all these things.
And, we use those intuitively in conversations that don't work out the way we think, and we blame the other side. We say, 'They're dumb. They're mean. They're evil. They're ignorant. They are unreachable, unchangeable, stuck in their ways.' All the things we say--which are all forgivenesses. These are all things that we are using to forgive ourselves for failing.
Now, that's my new frame on all this.
And, I think the thing that was most surprising in all that was discovering that there were all these different organizations that had said, 'Okay. Well, what do we do about this?' And, they started A/B testing conversation techniques. I found deep canvassing, and street epistemology, and smart politics, and then all the therapeutic models that I mentioned--motivational interviewing and cognitive behavioral therapy. And, on and on and on. There are so many.
And, the thing that was most surprising was: most of them had never heard of each other, never seen each other's work. Many of them--the majority, if they weren't in therapeutic domains--weren't aware of the science that would support what they were up to. Yet, independently, they all came up with pretty much the exact same technique. And, if you put it in a step-by-step order, it's almost in the exact same order every time, too.
And, that seems to me like something almost in the world of physics or chemistry, in that if you were to build an airplane--if you were the first person to build an airplane--it was always going to look like an airplane. It doesn't matter where they built it. It doesn't matter what culture they were from. It doesn't matter how old they were, what they looked like, what they knew about anything. Airplanes have to look like airplanes because physics works like physics on the planet Earth.
Conversational techniques that actually shift attitudes and open people up to different perspectives, that get past resistance, all pretty much work the same way because brains resist for universal reasons and brains work in a very particular way.
We have these evolved responses and we have these evolved mechanisms for both generating and receiving argumentation.
And, there are chapters about all that in the book, because I want you to understand all that before we get to around page 200 where I reveal things like street epistemology.
So, street epistemology is something I came across. I did an episode of my podcast about some of the stuff I had found in deep canvassing, and people in the street epistemology world reached out. Actually, it was someone who was at a school for the deaf and hard of hearing who said: We use something called street epistemology here that's very similar to the deep canvassing thing that you talked about. You should look into it.
And, I looked into it and I found some of the biggest spokespeople for that. I went to Texas where a lot of them live and work. And, just like everything else in the book, I just spent a couple weeks with everybody and hung out with them and got to see it in person.
I did one of the most journalistic things ever for this. I hid in a bush while somebody engaged in street epistemology with strangers. They had one AirPod, I had the other; and I could hear everything they were saying and took notes.
Street epistemology came out of the world of the angry atheists and the militant agnostics who were having their own reaction to getting online and meeting each other. And, they've gone through several phases of growth and evolution themselves where they have schismed off. And, there's some who are still very angry and there are some who are much more humanistic and empathetic.
And, within all of that, there was this movement that came about where they wanted to know, like, 'How do we talk to people in a way that could avoid the angry pushback that we so often get when we speak with people who are not in our subculture or do not see or have our same theistic or atheistic views?'
And, they did the same thing that people did in deep canvassing. They went out. They had conversations with people. They recorded those conversations. They shared them with other people in the group. And, when something seemed to work well or get them closer to having a good conversation, they kept it. Anything that made it go the other way, they threw away. And, through thousands of A/B-tested conversations, they started to zero in on something that worked.
And, now they've expanded it to: this can be applied to anything. You don't have to be in their sub-community or have their theistic views to use it.
In the book, I talk about how there are techniques that work well on politics, techniques that work well on attitudes and values. And, then this one specifically works best with fact-based claims, things like, 'Is the earth flat?'
It's a stepwise method for having the conversation that we all should be having on any issue. Without spending an hour trying to go through all the steps, I'll give you the quick version, which is: You open with a lot of the stuff we've talked about before--you open by establishing rapport. That's assuring the other person you're not out to shame them. Assuring the other person you're not even there to change their mind. What you are there to do is explore their reasoning. You say, 'I would love to have a conversation with you in which we explore your reasoning on a topic and see what your views are and understand them better. Maybe you might shift, but you will have a deeper understanding of what we're talking about.' However you want to frame it. Use your own language. You're telling a person you're going to listen. And, most people will take you up on that offer.
I'm doing it right now. You asked if I would talk about something; you said you would listen; and I'm doing that right now. The podcast world depends on the fact that we're all very willing to tell people what we think and feel about things.
So, give people that opportunity. You open the space for it. In this method, you ask for a claim. You ask for a very specific claim. It could be: Is the earth round or flat? And, then the person tells you--and then you repeat back the claim in the other person's words. You make sure that you're always using the other person's words, because the big lesson in all of these techniques is that you are in their head, not yours.
You stay on their side. And, your job is to hold space for the other person to non-judgmentally listen and give them a chance to have a safety net, to metacognate and introspect.
And so you repeat the claim back to them. If they have definitions for terms, you ask for them; and you use their definitions, not yours. Like, if they say 'the government,' don't assume that they're talking about something from a civics textbook the way you look at it. They might be thinking of a group of reptiles in a round room talking about how they're going to divide the country up to play golf. They have a different view of it. Let them--use their definitions.
And, then this is the big moment--and this is true across all of the conversation techniques. They all open in a pretty similar way with this space-creating moment. And, then they move to this thing that is magic.
It is: ask the other person, on a scale from zero to 10, or one to 100--and this is also in motivational interviewing, by the way, which we talked about earlier with reactance. The scale is a great way to get out of the debate frame and to assure the other person that this is not going to be a binary, right/wrong, black-and-white view of things. And, it even works with the movie example you gave earlier, which is like, 'Hey, Top Gun: Maverick--what did you think of it?' A person will say, 'I loved it.' That's a very black-and-white, binary abstraction. 'Oh yeah? What would you give it on a scale from zero to 10?'
There is a moment when you ask a person a question like that where they'll go, 'Oh, well--.' That moment is when they pop into that metacognating frame. It could be like, 'What did you think of this talk?' 'Loved it.' 'What would you give it on a scale of one to 10?' 'Oh, well--.' That moment is what you're looking for on any conversation topic.
And you ask them, 'What would you put it on a scale from one to 10, or zero to 10, or one to 100?' Whatever they tell you, ask, 'Why does that number feel right to you?'
This will encourage the other person to engage in reasoning--motivated reasoning, most often. And, you let them do it, however they want to do it. They're going to come up with reasons that seem plausible for that position. But, what's likely is that they've never done this--not in this sort of 'Please, present your reasoning to me' kind of way.
It's marvelous to witness a person saying--well, if they're talking about Top Gun Maverick, they'll have to start thinking, 'Why do I have this emotion? Why was that so quick--why was it just like--it popped right in my head. What caused that to happen?' And, they start coming up with reasons why that could be. Most of these are exploratory and they're definitely going to be justifications and rationalizations.
Then, if you are actively hoping to get the person to see things closer to your perspective--if you've already done this for yourself and you know where you're at on the number scale--ask the person how come they're not further toward the other end, whichever end is appropriate to the issue. So, if I say, 'Is the earth flat?' and they say, 'Absolutely,' and I say, 'How certain are you of that on a scale from zero to 10?' They say, 'I'm probably a seven.'
Well, what you would ask is--first, you'd ask, 'Why a seven?' The next thing you'd ask--and this comes from motivational interviewing--is, 'How come you didn't say eight? How come you didn't say nine?' Because you're asking how come they didn't go all the way to 100% confidence. And, they must, on their own at that moment, generate their own counter-arguments against their position. But, you didn't do that. No reactance. You're not telling them what to think or feel. You're not giving them your counter-arguments. It's not your reasoning. They have to generate reasoning that counter-argues their position--reasoning that will be new, that will be fresh, and that will be added to the collection of counter-arguments in their mind. And, it will affect how they see things going forward.
With street epistemology, it's more about just getting the person to examine: are they using a good epistemology to vet what they think and feel? So, after you have done all of these things with the number scales, you'd ask them what method they used to judge the quality of the reasons they presented. And, then you just stay in that space for the rest of the conversation, as long as they're willing to do it, and continue to listen and summarize and repeat and wish them well. And, try to make it so that you can have more than one conversation.
So, people do often--not often. People do experience 180s in these moments sometimes. But, usually what happens is it's by degrees, by increments. And, at the end of the day, the street epistemology people, they'll tell you, 'We're not interested in changing people's minds. We want people to just be critical thinkers. We want them to have more robust epistemologies.' Which is sort of an even deeper way of changing a person's mind. Getting a person to change their epistemological approach to the world is even more powerful than getting them to change just one belief, or attitude, or value. And, that's the whole thing in a nutshell.
Russ Roberts: So, let me share some skepticism. Now, you didn't talk about what deep canvassing is, but deep canvassing is when you show up at somebody's door, and you're canvassing on a particular issue. One of the examples you use is gay marriage. And, they might say, 'How do you feel about gay marriage?' 'I'm against it.' And then you ask them for a number, and they might say, 'I'm an eight. I'm against it,' on a scale of one to 10, where 10 is 100% against it and one is: I'm in favor of it.
So, say I'm an eight, or a nine--whatever it is. And, instead of arguing with them, you have this sort of open-ended, more exploratory conversation. And they then have a lower number later on. And, they might even say, 'I've changed my mind,' which they did in some of the research on this topic.
So, I have two issues here with this. First of all, I love the emotional openness of it; and it makes for a much more pleasant dinner. On this program, we've talked about a number of dinners. I've written about a number of dinners in my past, where they got ruined--where I said something that was very clever in my mind, but was not the right thing to say at a dinner. It was a great thing in a debate, but dinner is not a debate; and it's taken me a long time to realize that. And, it's not about winning.
So, I understand the argument that if you explore this together with another person without judgment and without combativeness, it's more pleasant and it possibly could open the person's mind in a way--you know, it's like the sun and the wind having the debate about the guy walking along and the wind says, 'I can make that guy take off that coat.' And, the sun says, 'I think I can do better.' And, the wind blows really hard and the guy pulls his coat ever tighter. And, then the sun comes out and the guy takes his coat off.
So, in a way it's playing the sun rather than the wind, which I'm a big fan of. I think it's the right way to live and I think it's the right way to proselytize, if you're going to proselytize.
The problem I have is that, of course, what some people say about these situations isn't what they really feel. Or--they're trying to be nice. A nice guy comes to your door or stands on the street and interviews you and you're like, 'Oh yeah, he's a nice guy. He didn't agree with me, but I'll say I moved a little bit. It'll make him happy.'
So, that's one worry I have.
The other worry I have is that most of the examples in your book--I don't know if you did this on purpose or not--but some of them are just factual, like is the earth flat or not? To the extent they have cultural resonance, they tend to be Right-to-Left arguments. They're trying to convince people who believe in God that maybe there isn't one, or it's trying to convince people that gay marriage is good when they don't believe that.
Can these techniques work in the opposite direction? Could I convince an atheist to consider the possibility of God? Could I turn a person who believes in gay marriage to be against it? I don't know the answer to that, but you don't talk about it in the book.
David McRaney: These are good, great questions. I love these questions. I will say, I brought all the people together who are sort of the spokespeople for these techniques. And, I did a roundtable video with them. I have it on the page for the book; and we did it about an hour and a half together over Zoom. And, I asked that question of them as well, like, 'Could you use this to go the other way with these issues?' So, I have an answer for that.
Russ Roberts: And, people could say, 'Well, of course not, because they're wrong,' but that would kind of go against the whole--
David McRaney: That's right. So, that's one thing. I'll ask that second.
The first question was: could it be performative[?], what they're doing? In street epistemology, deep canvassing, all these groups, and also in therapeutic models like motivational interviewing, that's baked into it. They will often say: when you ask people their reasons, know that the first thing they say probably will just be an attempt to make themselves seem like a nice person, based off what they know so far about what you're saying and what you're asking. All of us do this. When you explain to another person why you feel the way you feel--why you hold a certain position on something, or a certain level of certainty--oftentimes, depending on the individual, that first volley, that first attempt to create some introspection and present it to you, will be for the sake of, 'I want to make sure this person doesn't walk away thinking I'm vile.'
But, some people do. Some people have no problem telling you how they really feel. And, some people are more sensitive. It's nuanced for everybody. There's a gradient--there's a lot of nature and nurture in whether a person is going to give you a purely, 'This really is why I feel this way.' It's also: how much has that person considered that issue? How much do they know about it? How much have they previously introspected?
So, most of the people who practice this, they do acknowledge that is part of the process. And, they even will say things in training like, 'But, you don't say that to the other person. You want them to save face.' The goal here is for them to have a private epiphany if there's going to be one or to feel cognitive dissonance privately and then resolve it more in the direction of one way or the other privately, so that they'd never feel like they got called out in some way.
So, if you ask, 'How do you feel?'--like, if you know that person, if you're looking at the flags they have surrounding their house and you ask them, 'How do you feel about this political issue?' and they say something that you know is not what they think--you just let them do that. You don't say, 'Well, what about all these flags? It doesn't seem like--' You let them do it. Because the object here is that you want to give them--it's a service, in a way. It's free therapy, in a way, where you want to give them a chance to introspect. And, if they feel like being more honest with themselves and with you in the conversation, let them come to that on their own.
So, that's addressed.
Also, the scientists who've been studying this, who have gone out and studied it--especially deep canvassing. They've done an incredible amount of research into deep canvassing. They've done all sorts of field studies, both in phone-bank and in-person versions of it, and across many different issues: same-sex marriage, transgender bathroom rights, just voting for one candidate or the other. Because deep canvassing has been used in a lot of places these days.
Not only have they found that it has an incredible success rate, it seems very sticky in the sense that even when people return to their social networks, the social networks don't reassert the same amount of influence on that particular topic.
It's more powerful than canvassing, TV ads, flyers, a candidate coming to your town--all of that combined. This seems to be the thing that works more powerfully than any other version of political persuasion--though they wouldn't want to call it persuasion. They would want to call it something else. I'm calling it persuasion. If you're coming into this thinking this is going to be a slam dunk every time--like, you're going to 100% flip a person, it's going to be a 180--that's still something that is rare.
It's usually just by steps, by increments. They move a point or two in some direction or another. Which is enough to win elections and change attitudes and create cascades that create social change. So, if I'm hearing you correctly, those are the answers to those questions.
The other thing though, was: Could this be used the other way? I love this question. I've asked this question several different times.
Russ Roberts: And, just to clarify: A lot of the persuasion that you're talking about that works with deep canvassing--and I know it's different with political candidates--but I would call it encouraging tolerance, whether it's about sexual preferences or religious differences, or even crazy views about, say, whether the earth is flat. It opens your mind a little bit. It does raise the question of whether if you wanted to, could you increase intolerance? There's a lot of that going on in the world.
Russ Roberts: Part of my skepticism, by the way, about deep canvassing or street epistemology: I stand at your door for 20 minutes or half an hour or whatever the length is, and I've changed your mind. That's a little like the YouTube video. We would normally not find that plausible. And, out in the world, there have been enormous increases in all kinds of social phenomena. We can debate whether they're good or bad, but the world has changed. There are enormous cultural changes across the spectrum, which we would identify with Left and Right. So, something is going on in the air, in the atmosphere, in the Zeitgeist. And, it's a bunch of things. But, I wouldn't think it normally would be somebody standing on your stoop, or across from me in the street, for 20 minutes and saying, 'Hey, have you ever thought about how horrible immigration is? You're pro-immigration, but let me tell you: it's awful.'
I don't see that working. I understand there might be more stances[?] about immigration than it used to be. But, I don't think it's coming from this kind of technique. It seems unlikely that it could do that.
David McRaney: I hear you there. Dave Fleischer at the Leadership Lab, that's what he told me. He was, like, 'There are other techniques that are much better for that: fear-based techniques and fear-mongering. All the toolkits of the demagogue are things that will work better than this.' Plus, this is very difficult.
When I asked the street epistemology people, 'Could this cause a person to have a weaker or stronger view about something that we wouldn't consider okay?' they said, 'Sure.' But, only--but that's because street epistemology is absolutely neutral.
The idea behind street epistemology is: Do you have a robust, complex, nuanced view of this issue? Do you have a robust, complex, nuanced understanding of the epistemology that would be required for you to have a concrete belief about this thing?
The answer is, 'No'--no one does, no matter how much you've considered the issue. But, for many of the people who are confronted with this for the first time, it's really their first chance to start going, 'What do I think, and why do I think it?'
So, necessarily, you will walk away with a more complex and nuanced view of the thing, which will mean some of the things you thought about it will be questioned. Some of the things will be strengthened. But it's different: I do understand, like, 'Ooh, I don't want people to have a slightly different view if they're already on my side of the issue, or they're already on the side of something that I consider good and moral and ethical.' Of course, none of us want that. But--we should never be afraid of somebody having a more complex and nuanced view of anything. That's one way of seeing it.
When it comes to fact-based things: No, there's no danger in that whatsoever. If it's actually a true thing, if it's actually something that the evidence will support over and over again, if it's actually an actual, fundamental STEM-based truth, a stronger epistemology will get you closer to that STEM-based truth and that's what happens in a street epistemology thing.
On the other hand, if it's an attitude-based issue, the thing that makes it less likely that a person is going to use this to turn people into Nazis, or to get them to hate people they don't currently hate, or to support things that harm people who are marginalized, is that in deep canvassing there's a step toward the end where they ask people about their real lived experiences with the issue. For instance, if it's transgender bathroom rights, or just transgender rights in general, they'll ask, 'What experiences do you have with transgender people?' Often, it'll go one of two ways. They'll say, 'I have no experiences with them.' Like, 'Everything I know is received wisdom.'
So, very rarely is that going to put a person more in the camp of, 'And so, now I really hate them,' or, 'Now I really don't support this.' What you realize is, 'Oh, I'm quite ignorant on this, and everything I know has been told to me by somebody else.' Which creates an urge to go, 'Oh, maybe my position is strong on this and it shouldn't be.'
And, the other thing that could happen is they do have experiences with those people.
And, rarely is the case that the person has had a negative experience with such a person, or a negative experience with the issue and that supports their attitude in that regard.
So, in that question, what it comes down to is: The person who is at your door, that's when the conflict arises where if it is a member of the SS [Schutzstaffel, political soldiers of the German Nazi Party, 1925-1945] at your door, knocking on your door in 1944, and they're like, 'We'll talk to you a little bit about our policies.' And, they get to that stage of, 'Tell me a little bit about your experiences with Nazis.' It's very likely[?] they're going to say, 'I've had a lot of negative experiences with them.' So, it's going to be very difficult for them to use one of these techniques that allows you to metacognate and introspect where they're going to go into some domain of going like, 'Oh, I never really considered though that they're good people who are trying to do good things in the world.'
I'm sorry--I will say one last thing. I'm speaking on behalf of deep canvassers here; I'm sure they would have a much better way of answering that question. But, I have noticed that toward the end of their technique, they tend to ask people about their personal experiences on the matter. And, the point of that has always been so that the person doing the canvassing--it's almost a moment to say, 'Am I on the right or wrong side of this issue by my own standards?' Because if the other person routinely tells me they have negative experiences with this issue, then we should reevaluate what we're doing out here. And, so far as it goes, that's not something that's taken place with most of the issues.
Russ Roberts: Well, I don't think it's much different from education in general. You started off earlier in the conversation pointing out that education often prepares people to be better at self-rationalizing and self-justification. I meant to point out then, by the way, that economists have always believed that education should be subsidized because there's a positive externality: the more educated you are, the better the world is going to be, because you'll have all this productivity and people will be able to talk about things and they'll be better citizens. It could go the other way. So, just ask Bryan Caplan[?].
David McRaney: But, I do agree. I don't suggest, like, 'These techniques are the greatest thing that's ever happened. They can never go the other way, or they'd never be used by nefarious entities.' At the end of the day, these are all neutral tools, and I'm not a person who--
Russ Roberts: I'm not accusing--
David McRaney: At the end of the day, a person who wanted to--I do agree. There's almost nothing that's come along like this that bad actors couldn't attempt to use in bad ways.
Russ Roberts: No. My point would be that in 1933, when that SS person came to your door--I picked 1933 rather than 1944--they come to your door in 1933 and they say, 'Do you know any Jews?' 'Well, yeah, I know one.' 'Do you like them?' 'Nah, not so much.' 'Okay. Well, let me--'.
I think there's plenty of room. We like to believe that if you have experience with the other, it turns out okay. And that a lot of the intolerance in the world comes from people not knowing the other and only hearing about it from those third parties. But, unfortunately, there's other--it's complicated. That's all I meant to say.
David McRaney: I retract my example. You did a great job there, which is: You're right. There are existing prejudices in the world, and a person can have a negative perception about a group of people, and that can be inflamed by just asking that person, 'Yeah. Tell me more about why you don't like those people,' and just letting them talk, and talk, and talk, and talk, and then walking away. And what have you done? You've given them more arguments in favor of being prejudiced.
I feel like I stand corrected there. So, I don't want to suggest that that's not something somebody can do. That is something someone can do.
Russ Roberts: Yeah; okay. The book is not a defense of this as a magic potion that will inevitably make the world a better place. But, I do think it has--I'm more interested in it not as a proselytizing tool, but as a form of human interaction. We don't know each other. You just retracted a claim. You didn't double down. We've established in an hour and 25 minutes enough rapport that you and I can have--I read your book, which is a sign of love and affection, as every author knows. So, I just think that's--I'm less interested in it as a tool for political change. And, the book has issues and examples of social change that you're in favor of, and that's fine.
But, I just think, for me as a reader, I found it equally, if not more, interesting as a tool for having better conversations. In our day and time, when we're trying to save democracy in a lot of places, that's really powerful. I don't underestimate just that piece of it. I think it's phenomenal.
David McRaney: Yeah. In the book, that's the arc I finally get to at the end, like: I have a dalliance with, 'Oh wow, let's persuade people.' And then at the end of the book, I'm given two opportunities to use the techniques. And, in both cases, I don't--because I realized at some point in writing the book, and spending so much time with it, and then actually going, like, 'Ooh, I've got all these new tricks. I'm going to go out and use them,' that sometimes it's not about changing the person's mind. Sometimes the question is, 'Why would you want to?'
And, I found myself in a couple of situations where I was like, 'What good is there in that, in this particular situation?' I'm actually gaining more from hearing this person talk about this and seeing how they see the world than I ever would--I'm not going to put any more good in the world by taking that away from that individual.
And so, it became more about exactly what you just pointed out. It's more about having better conversations and about trying our best to actually have democracy in a way where my experiences matter and your experiences matter. And, my values matter and your values matter. What if we all got shoulder to shoulder on certain things and said, 'That is a problem. And, I have a shared goal with you'?
So, how do we go about, like, Venn diagramming ourselves in a way that doesn't let either one of us get away with being wrong, but does help us take what's right about each of our viewpoints and mush them together and magnify that? So, that's a place I get to. It sounds very Pollyanna, but boy, is there a lot of science that supports it.
Russ Roberts: My guest today has been David McRaney. His book is How Minds Change. David, thanks for being part of EconTalk.
David McRaney: I really appreciated this and I'm glad we went over time, to tell you the truth. This has been one of my favorite conversations about this so far. Something happened in this conversation that I wish happened more often in podcast conversations, about this book, especially. You changed my mind about something. The next time I have an interview and someone gets around to that question of, 'Could these things be used for different purposes?' I'm going to remember this conversation that we just had. And, the way I've shifted my perspective will be the way I push that conversation forward with another person, which may also change their perspective. And, that's what all this is about. So, I appreciate that very much.