Jennifer Doleac on Crime
Jan 21 2019

Economist Jennifer Doleac of Texas A&M University talks with EconTalk host Russ Roberts about her research on crime, police, and the unexpected consequences of the criminal justice system. Topics discussed include legislation banning asking job applicants if they've been in prison, body cameras for police, the use of DNA databases, the use of Naloxone to prevent death from opioid overdose, and the challenges of being an economist who thinks about crime using the economist's toolkit.

RELATED EPISODE
David Skarbek on Prison Gangs and the Social Order of the Underworld
David Skarbek of King's College London and author of The Social Order of the Underworld: How Prison Gangs Govern the American Penal System talks with EconTalk host Russ Roberts about the written and unwritten rules in America's prisons for the...
RELATED EPISODE
Becky Pettit on the Prison Population, Survey Data and African-American Progress
Becky Pettit of the University of Washington and author of Invisible Men talks with EconTalk host Russ Roberts about the growth of the prison population in the United States in recent decades. Pettit describes the magnitude of the increase particularly...

READER COMMENTS

Joe
Jan 23 2019 at 9:17am

Around the 60-minute mark, Professor Doleac cites a study (but does not identify it) finding that some prisoner reentry programs actually increase recidivism. Is there a link for that? This review suggests reentry programs do tend to work, though quite modestly.

https://www.heritage.org/crime-and-justice/report/studies-cast-doubt-effectiveness-prisoner-reentry-programs

The Original CC
Jan 23 2019 at 4:55pm

What a great episode!  I also think it’s worth reading the article about opioids that is linked in the “delve deeper” box.

Conor Lennon
Jan 24 2019 at 8:45am

I enjoyed this episode, particularly the discussion on the privacy concerns surrounding DNA databases. Russ, of course, is right that we might get as much of an effect if people just believed their DNA was stored, rather than it actually being stored, avoiding the potential pitfalls of a data breach or misuse of the data.

What wasn’t discussed was the effects (and ethics!) of a voluntary DNA database that could be offered as an option to those being released from jail. I wonder how many criminals would voluntarily choose to “tie themselves to the mast.”

Brian
Jan 24 2019 at 2:40pm

Interesting discussion. I would like to know more about the RCT related to police use of body cameras, because the discussion suggested that this was a poorly designed study. The description indicated that half the officers were randomly assigned to wear body cameras and the other half were not. This is not the right level at which to randomize, because there are potential spillover effects from the treated to the untreated officers. The idea behind an RCT is that conditions are the same except for the intervention. That's not the case in an organization in which there is so much collective effort and interaction. A cluster randomized trial, in which separate groups of officers (e.g., precincts) or separate police forces are randomized, would have been better.

andy mcgill
Jan 31 2019 at 10:50am

I think police shootings of unarmed suspects are a big problem, but the issue is hopelessly debated as a racial matter. The city I live in had an unusually high number of police shootings for a couple of years, before the Ferguson fiasco, and it simply changed some police policies. After that, shootings went way down, to below national levels.

The policy changes were things such as not engaging in most high-speed chases, not considering moving vehicles to be a deadly weapon in most situations, and changes to how officers deal with mentally disturbed people.

Too bad it has become such a racial issue.





AUDIO TRANSCRIPT
Time | Podcast Episode Highlights
0:33

Intro. [Recording date: December 13, 2018.]

Russ Roberts: I want to remind listeners to go to EconTalk.org. In the upper left-hand corner you'll find our Survey, where you can vote for your favorite episodes of the year and tell us about yourself and your listening experience.

0:50

Russ Roberts: And now for today's guest, Economics Professor Jennifer Doleac of Texas A&M University, where she is the Director of the Justice Tech Lab. She has done extensive research on crime, which is our topic for today.... The home page of the Justice Tech Lab, which you are the Director of, says, 'Technology is transforming the criminal justice system: Let's make sure it's for the better.' I want to take each of those one at a time. 'Technology is transforming the criminal justice system': In what ways? What are some of the ways that's happening?

Jennifer Doleac: So, yeah. There are a lot of ways. I think people probably watch shows like CSI [Crime Scene Investigation, television show], with all the super-high-tech advances that could be making police investigations better or just making crime-fighting more efficient. A lot of those TV depictions of how technology is affecting things aren't entirely accurate, but there have been a lot of advances in recent decades. And we don't fully understand them. Think about things like just having cameras everywhere--I think we are used to, or almost expect, that there will be surveillance cameras in most public places now. GPS [Global Positioning System] monitors for people who are on probation or have been arrested and are waiting for their trial. DNA [deoxyribonucleic acid] databases. A whole bunch of things that are all relatively new advances, and that can make our crime-fighting more efficient.

Russ Roberts: And what are the worries about those technologies being used poorly? Why do you have the tag-line 'Let's make sure it's for the better'? Because we would hope--naively, of course--that adding technology would make it better. But it doesn't always.

Jennifer Doleac: Right. It doesn't always. And for some of the same reasons that all government programs don't always do what we hope they will do, right? It's often hard to predict how all the players will adapt to the implementation of a new technology. And so measuring those effects in the real world, to make sure we are actually getting the benefits we hoped we would, I think is important. But in addition, a lot of these tools have substantial costs--partly financial costs. These are just really expensive, and we could imagine spending our money in more productive ways if they are not that useful. But a lot of the tools are really, at their core, surveillance tools. They could make policing and crime-fighting more efficient and productive by increasing the likelihood that you get caught for your crime. Which means, at its core, it's keeping better tabs on everybody. And so that has privacy costs. And those privacy costs are extremely difficult to quantify. Right? That's something I still haven't figured out a good way to measure. But I think people like me, economists, can add to the conversation by at least calculating what the benefits are, so that we can have a more informed conversation about whether they are worth the potential costs.

4:19

Russ Roberts: Yeah. My view on that--we'll talk some more about this later--is that economics has something to contribute in measuring costs and benefits. But there are certain things that are inherently--they are not unquantifiable, but they are not easily quantified. And we are going to have to make a tradeoff, then, between, say, the financial costs and benefits and some other costs and benefits. Unless you are a utilitarian. I'm not. We'll come back and talk more about that later. But I think that's the issue. And exactly as you say: you don't know how to quantify privacy loss. I don't think that's something economics is particularly good at. You might get a good paper out of it, potentially, if you can take a stab at it--

Jennifer Doleac: Sure.

Russ Roberts: But I'm not sure it will be very accurate.

5:01

Russ Roberts: One thing you did not mention in your list of technology was artificial intelligence. We had Cathy O'Neil on the program quite a while back, and she voiced concerns that artificial intelligence and machine learning are being used in the criminal justice system without full regard for what their impact might be. Do you think that's correct?

Jennifer Doleac: Um, yes, with some caveats. I think a lot of people are thinking very deeply about this. I actually have some ongoing work with Megan Stevenson, who is an economist at George Mason Law, looking at the impact of risk assessment in sentencing. And risk assessment as implemented in a lot of places is not a fancy machine-learning tool. It's, you know, a checklist. And you add up the points. But it's based on a regression on existing data that comes up with a prediction of your likelihood of re-offending; and then they apply the coefficients from that regression to come up with a risk score for future defendants. Part of the reason we are interested in studying this question is that there's a lot of hand-wringing on both sides about the potential of this type of technology--in criminal justice, but in a whole lot of other contexts, too. One potential downside--one reason it might make things worse--is that you could imagine these tools baking in existing biases: doing a really good job of predicting the biased behavior that is currently exhibited, and then locking it in, accompanied by a veneer of science that gives it more credibility than it would have in the current courtroom. At the same time, there's tremendous potential. The way I push back on those concerns is usually: We don't have to come up with a tool that's perfect; we just have to come up with a tool that's better than the status quo. Right? And so, if what these machine learning algorithms--or any algorithms--are doing is removing some of the bias that we know exists in the current system--police officers operating on their hunches; they don't like the way that guy looks; he seems suspicious; a judge whose football team lost this weekend is in a bad mood, and the kid in front of him gets a long sentence--if we can get rid of those biases, then we could be moved in the right direction. Even if we don't wind up where we want to be in the end.
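
A minimal sketch of the checklist-style risk tool Doleac describes: fit a regression on historical data, then round the coefficients into points a court officer can add up. All data, variable names, and effect sizes below are invented for illustration; this is not the tool from any study discussed here.

```python
# A hypothetical checklist-style risk score: fit a logistic regression on
# historical data, then round the coefficients into integer "points."
# All data and variable names here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(18, 60, n)          # invented predictor: age at arrest
priors = rng.poisson(1.5, n)           # invented predictor: prior convictions
felony = rng.integers(0, 2, n)         # invented predictor: felony charge
X = np.column_stack([age, priors, felony])

# Simulated "ground truth": younger defendants with more priors re-offend more.
logit = -1.0 - 0.04 * (age - 18) + 0.5 * priors + 0.3 * felony
reoffend = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, reoffend)

# Turn the coefficients into a checklist by scaling and rounding to whole points.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
print(dict(zip(["age", "priors", "felony"], points)))

def risk_score(age, priors, felony):
    """Add up the points for a new defendant, worksheet-style."""
    return int(np.dot(points, [age, priors, felony]))

print(risk_score(age=19, priors=3, felony=1))
```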

Russ Roberts: It's kind of a perfect example of all that's wonderful about social science and all that's horrific, for me. I'm more of a skeptic about empirical work than most economists. And I look at that and I think, the average person--excuse me--the average economist will say, 'It's better than nothing.' Because the judge just has a gut instinct, as you say, or a hunch. Or worse, is affected by some life event that is way outside of the life of this poor person standing before the judge. And yet, as you say, the challenge is, if you are not careful you'll over-convince yourself that you've done something scientific that is actually not, and is merely what Hayek called scientism: it looks like science but it's not really science. For me, it's very similar to the example we just talked about with costs and benefits: It's a useful tool. It would be good for a judge to be aware of that analysis. I just would never want it to be the determinant by itself.

Jennifer Doleac: Mmmhmmm. And in practice it isn't the determinant by itself. Right? And that's actually, in some ways, part of the problem, I think. One of the reasons Megan and I think this is an interesting topic for economists to study in particular is that what we really want to know is: when this tool is implemented, what are the impacts on things like recidivism, or racial disparities in sentencing? Do we have the sense that this is making our decisions more accurate, or more efficient, or better for society in some way? The challenge here, right, is that we never know, in an individual case, what the right answer was--whether the person was actually guilty or innocent, or what the optimal sentence should be in a particular case. And so we want to know: when you consider all of the potential pros and cons of this policy, and how it's implemented in practice through the judge--the judge is going to look at the score in addition to everything else they are looking at, and they have a whole bunch of incentives that they are weighing--how does this affect their decisions in the end? That's ultimately going to have an impact on social outcomes that we care about. And so, what economists can contribute here is measuring the impacts on those ultimate outcomes, rather than getting really bogged down in the details of whether the algorithm is accurate.

10:18

Russ Roberts: It reminds me a little bit of the attempts we've made in the past, as economists, to measure the impact of schooling on wages and various other outcomes--but typically wages, some form of income. At some point, somebody realized that 'years of schooling' is not a really precise variable. Some schools were awful. And a year of having your rear end in the seat is not necessarily the accumulation of human capital that occurs at a different school or a different university. It depends on what you do--your grade point, a thousand things. And yet we spent an enormous amount of effort trying to measure "the return to a year of schooling," with all the flaws that entails. And I'd like to get your thoughts on what a year in prison means. Obviously, we care about a lot of things; different people care about different things. One of the things that would be important for me would be the post-prison life of a prisoner: whether prison reduced the chances of that person committing future crime. And equally important to me would be whether their life outside of prison can be a life of some potential and flourishing, or whether that prison sentence has a lifelong effect that's destructive, way beyond the physical time spent behind bars. What are your thoughts on that?

Jennifer Doleac: Yeah, I agree that this is a really important question, and it's definitely the research frontier in economics. There are a lot of things to consider here. As you know, there are a lot of factors that judges weigh, and that we as citizens consider, when we are thinking about the right sentence for somebody. I think economists in particular take the view that if we want to allocate incarceration efficiently, we're most focused on what the impact of incarcerating someone is on social costs and benefits; and that would include any sort of rehabilitation effect on the individual who is being incarcerated. And ideally we would love to move to a world, I think, where we can implement programs and policies that really transform the person's life for the better, give them the best chance to have a good life when they get out; and that might mean very short sentences but a whole lot more investment in mental health programs and substance abuse treatment, and education, and all those things. In terms of the research that we have: the challenge is we don't have great data on any of this. In the United States we don't have any big surveys--the census, for instance, doesn't ask whether you have a criminal record. And so it's really hard, actually, to get the data you need to study these questions: to look at the population that has a criminal record or is incarcerated, and link that data with their educational outcomes, their employment outcomes down the road--whether they get married, whether they have kids, what the kids are doing. Researchers are currently working to link those kinds of data in the United States. In Scandinavian countries that linking happens a lot more often, so our best evidence comes from that context; but it's unclear how relevant the Scandinavian experience in prison is to the United States. In the United States, Mike Mueller-Smith, at Michigan, has a nice paper on people who are on the margin of incarceration. If you get randomly assigned to one courtroom, you have a judge who is really harsh; if you are randomly assigned to another courtroom, you have a judge who is really lenient. That's essentially equivalent to being randomized to incarceration or not, because if you are unlucky and get the harsh judge you are more likely to be sentenced. And for people in that situation, who are really on the margin of being incarcerated--where it really comes down to which judge you saw--incarceration harms their outcomes. Employment decreases; he's able to look at a bunch of different effects on social services and so on. So, in general, the punchline seems to be that here in the United States we're incarcerating too many people. Right? We could scale it back a little bit. That said, we've got other evidence on juveniles: Aizer and Doyle have a paper showing, again, detrimental effects of juvenile incarceration; but a more recent paper that looked at juvenile incarceration in Indiana found that the marginal kids who were sentenced and incarcerated were actually better off than the kids who weren't. And so it really does come down to the context: what the programs are in the prison, what the outside options are, what the alternative is for the kids or adults who aren't incarcerated.
And I think we're just beginning to have enough studies that are really good at identifying this causal effect to be able to start thinking about how much the estimates differ across contexts, and what could explain that.
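
The judge-leniency design Doleac describes can be sketched in a few lines: because cases are randomly assigned to courtrooms, a judge's overall stringency can serve as an instrument for incarceration in a two-stage least squares regression. A minimal sketch on simulated data; every number and name below is invented, and this is not the code from any paper discussed here.

```python
# A minimal 2SLS sketch of the judge-leniency design, on simulated data.
# Random courtroom assignment makes a judge's stringency a valid instrument
# for incarceration. All numbers here are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_cases, n_judges = 20000, 50
judge = rng.integers(0, n_judges, n_cases)       # random courtroom assignment
stringency = rng.uniform(0.2, 0.6, n_judges)     # each judge's incarceration propensity

# Unobserved "riskiness" drives both incarceration and employment, which is
# exactly why a naive comparison of incarcerated vs. not would be biased.
risk = rng.normal(0, 1, n_cases)
incarcerated = (rng.random(n_cases) < stringency[judge] + 0.1 * risk).astype(float)
true_effect = -0.10                              # incarceration lowers employment
employed = (rng.random(n_cases) <
            0.6 + true_effect * incarcerated - 0.05 * risk).astype(float)

# First stage: predict incarceration from the instrument (judge stringency).
Z = np.column_stack([np.ones(n_cases), stringency[judge]])
d_hat = Z @ np.linalg.lstsq(Z, incarcerated, rcond=None)[0]

# Second stage: regress the outcome on predicted incarceration.
X2 = np.column_stack([np.ones(n_cases), d_hat])
beta = np.linalg.lstsq(X2, employed, rcond=None)[0]
print(f"2SLS estimate: {beta[1]:.3f} (truth: {true_effect})")
```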

15:49

Russ Roberts: Do you have any thoughts on this issue of a year in prison? Obviously prisons have different characteristics. We did an episode with David Skarbek that I found very thought-provoking, on the norms that emerge within prisons for how prisoners treat each other. Along the way in that book he highlighted how street crime culture extends into the prison and then back out again. I think the average non-criminal--which I think includes me; I've never been in prison--has a weird and unrepresentative view of prison life that comes from TV and movies. Overwhelmingly those two. And some of them may be accurate; I assume most of them are not. But do you think there's a lot of variety in what a year in prison means, depending on what prison you're in? Forget the judge and the length of the sentence--a three-year sentence in one place must be different from a three-year sentence somewhere else.

Jennifer Doleac: Yes. Absolutely. I think especially a lot of programming winds up being tied to local community services--people come in and volunteer and do tutoring and hold yoga classes and all that kind of stuff--and that's going to really depend on the local environment. Criminal justice policy in the United States is primarily state and local, and that's a huge benefit for researchers like myself, because it means there's a lot of variety on a whole bunch of different dimensions. It also means that you're not dependent on the Federal government for a lot of policy changes; when changes happen, it's up to state and local officials. So that definitely means there's a lot of variety across the United States. But it also helps to think about what the impact of prison is in different contexts. One of the most striking examples--speaking of Scandinavian countries--is a nice paper from a couple of years ago finding that people who were sent to prison in Norway are much better off when they get out. It's essentially going to, like, job-training camp for a year or two: you get all these incredible services, and you come out a much better person, much more ready to take on the world. And it's a great paper, but it says more about the potential of prison than anything we have in the United States right now.

Russ Roberts: Well, we just need to send our prisoners to Norway.

Jennifer Doleac: Right; exactly. They've obviously figured it out.

Russ Roberts: That's the policy lesson.

Jennifer Doleac: Export. Outsource to Norway.

Russ Roberts: Yeah.

18:26

Russ Roberts: Let's talk about a particular study of yours on body-worn cameras [BWCs]--which is an awkward phrase, but it means police wandering around, I assume, videoing their every move. First, tell us how widespread that is. And is it every move? When are they allowed to turn them on and off? Some places it's mandatory; some places, I assume, it's maybe voluntary? I don't know. But tell us about that landscape; and then, what are some of the effects of that technology? I think most of us would assume it's a good thing: 'It adds to transparency, so it must be good.' Talk about the landscape of usage of the technology and then what you found when you studied it.

Jennifer Doleac: Yeah. And just to clarify--I actually have a couple of ongoing projects on body-worn cameras, but I think the study you are referring to is by The Lab @ DC, in Washington, D.C. I was not an author on that study. I've written about this--

Russ Roberts: Yeah; I think maybe you blogged on it.

Jennifer Doleac: I've blogged on it. Yes. So, I've paid attention to this literature. Body-worn cameras are at this point increasingly common. It depends on local rules and guidelines when officers are required to turn the camera off and on: different places have different standards for that. Some places it's voluntary--it's entirely up to the officer's discretion. Other places, as soon as you get out of your car you have to have it on. I think the reason it's not the default for the camera to be on all the time is partly just financial: the storage costs for all the video that racks up get to be pretty high. But it's also, I think, a reasonable reaction to the knowledge that a lot of the interactions police have with citizens over the course of the day come during the worst moments of those people's lives, and they don't necessarily want that recorded. Think about domestic violence situations, or someone who is having a mental break of some sort. There might be times when you would want to turn the camera off. But that's just in terms of whether the camera is on or off. There are also policies about when the footage is available: Is it public record? Does any citizen have the right to go in and ask for the camera footage from Officer Jones from the past year? And again, there are very strict guidelines and differences across jurisdictions about what's available and what isn't. The goal of all of this is to increase transparency and accountability for police. We've been having a lot of very difficult but important conversations in this country over the last couple of years about unnecessary use of force by officers, and the extent to which that force is disproportionately used against African Americans. And there have been a lot of viral videos going around--usually caught by cell phones, not body-worn cameras--showing officers using force when it seemed very clear that they should not be. And so this has prompted a lot of discussion about what should be done: How should we reform the rules governing police behavior, or training, or whatever else, to try to reduce the number of these incidents? One potential solution is to have officers wear cameras all the time, so that we can see for ourselves, in any interaction, whether the facts were on their side or on the side of the person they were interacting with. My take is that this became so popular because we don't know what else to do. We don't have any better ideas. And there was one study, from Rialto, California, showing that the use of body-worn cameras in that city dramatically reduced the number of complaints from citizens against the police department. And people thought, 'Oh, good. Fewer complaints means they must be doing fewer bad things, so this is the solution to all of our complicated societal problems: we'll just have all these police officers wear cameras.' And, of course, there are firms that are happy to sell the cameras, and the data storage that comes with them. And they're very, very expensive. But in terms of what we know, what's been done: there have now been a number of really good randomized control trials looking at the impacts of body-worn cameras.
The one that I think I wrote about in that blog post you saw was, I believe, the first in a major U.S. city: Washington, D.C., where they randomly assigned cameras to some officers and not others. And then they waited, and saw what the impact was on the officers' behavior. They had a zillion different potential measures they could look at to get a sense of whether officers' behavior and interactions changed in any way when they were wearing cameras versus when they weren't. And the end result was that nothing seemed to change. There was just no result, all across the board. This, it turns out, is in line with previous studies, more or less. There have been other studies in the United States and in Europe: some find that body-worn cameras actually increase use of force; some find that they decrease use of force. On average, there's no effect. In the Washington, D.C. context, my hunch is that most of those officers just assumed they were on camera all of the time anyway, because it's Washington, D.C. and there are cameras everywhere. So that seems like an easy way to explain why there's no effect in Washington, D.C. But, yeah: the punchline here is that policies we implement for good reason don't necessarily have the impacts we think they will have. And it becomes critical that we go in with particular goals in mind, so that we can be honest with ourselves about whether it's working.

24:57

Russ Roberts: What do you think we would expect those cameras to do? I mean, I think the average person's first thought is: 'Well, if you know you are on camera'--the joke--it's not a joke; the saying--is that morality is how you behave when no one's watching.

Jennifer Doleac: Mmmhmm.

Russ Roberts: So, the idea of a body-worn camera is that someone is always watching.

Jennifer Doleac: Right.

Russ Roberts: Of course, there was always somebody watching before. There was the person you were interacting with. There were your colleagues, often--your fellow police officers. So, it's a little more complicated. But you'd think it would add to a feeling of being responsible, and reduce the worst kinds of reactions in those kinds of high-stress situations. That's the first thing. But it's not the end. Of course there's the classic economics question: And then what? How does it change the behavior of potential victims? Of criminals? You name it. And then, how does it change where the police go and how they spend their time? If you know you are on camera, you might avoid certain places or settings that you thought might lead to stress and put you in an awkward situation being recorded. So, those are the things I think economists look at; and if I've left any off, I'd love your reaction. But the last one I'd add is that, in many ways, one unnecessary death because an officer overreacted is one too many. On the other hand, there aren't very many of these. They matter a lot--horrifyingly--for the people whose lives are lost and their families and loved ones; I'm not minimizing that at all. It's a horrible, horrible thing, and it seems to be on the rise. But my point, as an economist, is that it's a very infrequent event. And it's not likely that the cameras are going to pick those up. And so, in any one study, we might see no effect--but that's just because it's rare. And what the camera does is reduce the odds of a rare, horrifying event. So the cameras could be really important, even though in the sample period there's "no effect." What do you think of those arguments?
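
Russ's point about rare events is, at bottom, a statistical-power point. A back-of-the-envelope sketch, with invented rates (none of these numbers come from the studies discussed), of how large an RCT would have to be to detect even a big change in a very rare event:

```python
# Back-of-the-envelope power calculation for a rare event.
# The rates below are invented purely for illustration.
from scipy.stats import norm

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-proportion z-test."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_a + z_b) ** 2 * variance / (p1 - p2) ** 2

# Suppose 1 in 1,000 officer-years involves a fatal shooting, and cameras
# cut that in half -- a huge effect in policy terms.
print(f"{n_per_arm(0.001, 0.0005):,.0f} officers needed per arm")
# => roughly 47,000 per arm, against a big-city force of a few thousand.
```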

Jennifer Doleac: Yeah. Absolutely. Just to start, the one thing I would add to your list of potential behavioral effects: the reason we might expect these cameras to actually increase use of force is that you could imagine most officers try very hard, all the time, not to make the front page of the newspaper. And when in doubt--whenever they think there might be any question--they back off. You could imagine that when they are wearing a camera, and it would be clear to anyone watching the video that the facts were on their side, they might use force more often. And so that might be how you get this kind of policy backfire. But, yeah: your point about the rare events here I think is crucial to interpreting these studies. When D.C.'s results came out, it was, I think, somewhat frustrating to the people who ran the study, because the city then said, 'Well, even though there are no results, and we said that was what we cared about, we're going to keep the cameras anyway.' And then you have to ask: 'Well, what do you really care about, then? Because you told us these were the outcomes you cared about.' And city officials were certainly not the only ones who went down this road--basically saying, 'What we really care about is the possibility that we could find out: when one of these terrible situations happens and someone dies and we are suspicious about the circumstances, we want to be able to go look at the footage.' And that is a completely reasonable goal, and a completely reasonable thing to pay a lot of money for. But it will never be picked up in an RCT [Randomized Control Trial]. Right?

Russ Roberts: Randomized Control Trial--

Jennifer Doleac: Randomized Control Trial. Right. These studies would never have been able to measure the impacts on the one very rare death, or to quantify in any way what the impact is on community trust of police, if it doesn't show up in day-to-day behavior. And so this is where you really have to, again, be clear on what the goals are, and make sure that the studies you are doing are actually informing you about whether those goals are met or not.

Russ Roberts: It's a really interesting example of policy, because when I think about it--I mean, there's obviously some kind of virtue signaling going on here by the politician, saying, 'Hey, look: we're letting you look.' There's some amount of money that would cost that would not be worth paying, because you could do something better with the money, in theory at least. But it also strikes me that, given the power of the state to use force and the risk of a very tragic outcome at times--which we know happens--this just seems like a good idea. I wouldn't say at any cost: obviously there is a cost that would be too high. But I think it's worth quite a bit even if it doesn't show up in the data.

30:25

Russ Roberts: The other thing that strikes me, which I find strange just about the nature of life: You could imagine a school putting cameras in every classroom to make sure there isn't an incident where a student is, say, bullied by a teacher or by other students, or humiliated, or a teacher loses it--which, of course, happens even in first-rate schools. After a long day, there's a student that gets under a teacher's skin and the teacher just explodes; and that's a horrible thing. It's often very destructive to the teacher, to the relationship with the other kids, etc. But think how weird it would be to put a video camera in every classroom and only use it for disasters. So it seems to me--tell me if I'm wrong--these cameras are capturing lots of data that have nothing to do with the tragic encounters, often with innocent people, that end in terrible tragedy. They are capturing every day-to-day moment. And there's an enormous opportunity to improve the teaching--the style, the coaching, the way that police behave on the job. And I doubt that's happening. Or am I wrong?

Jennifer Doleac: I think that people are trying. It is definitely a logistical challenge to think about how to do this, though. My sense is that police departments are using footage in training. It's unclear if that is helping in any way. But there are researchers who are trying hard to think about how to study the footage captured from all these zillions of hours of police interactions with citizens, to say something useful about what police behavior or policies or practices are beneficial--you could even imagine seeing, if there's a policy change or a new training program, whether that improves the interactions with people. To use the footage in any productive way is going to require running it through machine learning algorithms. People have done this and just listened for words like 'Please' or 'Thank you'--to measure whether police officers seem polite. Right? But when you think about what's required, and what you'd be trying to train the algorithm to look for, it gets very complicated. To some extent we kind of know, when we see an interaction, that it maybe could have been de-escalated; but training an algorithm to detect that seems harder. Still, there are people thinking about it. Certainly, it is not possible in these departments for people to just, like, watch all of the video. Right? That is not going to happen. The storage costs for the video will fall dramatically as the years progress; but the human labor costs of having someone sit there and watch all the hours of footage to see if anything went wrong--that is only going to get more expensive. And so that is just not on the table.

Russ Roberts: I'm just struck by the similarity between being a good police officer and being a good teacher. We often judge teaching based on outcomes, not on what happens day to day on the ground. But a good principal knows who the good teachers are, is often in those classrooms, and knows the techniques that work well. Doug Lemov, a previous EconTalk guest, creates all these videos online to help people see how to do these techniques well. And, to pick an unattractive analogy: a lot of what being a good teacher is, is keeping order in the classroom. That's an art. First of all, most 24-year-olds straight out of school, or 26-year-olds out of grad school, don't know how to do it. They learn from experience. They learn from other teachers. I'm sure it's true for police officers, too. And I think about the work of the great Jane Jacobs, and how in older times police were walking the streets rather than driving in cars, and had an intimacy and familiarity with the neighborhood, and vice versa, that must have been, I think, much healthier than the current world. So, the idea that somebody comes out of police school ready to be a police officer--there's a thousand things they don't know how to do in various encounters. There's just an art here, and it seems it would be a useful thing to think about. There's an opportunity there. That's all.

Jennifer Doleac: No, I totally agree. And I'll add to the complicated situation: when someone comes straight out of training to be a police officer and is put on the force, typically the rookies wind up in the worst neighborhoods, because as you gain seniority you get yourself out of there. That is surely not the most efficient way to run things; but you understand how it happens.

Russ Roberts: Same analogy with students, right? In some schools you give the new teacher the worst classroom. That's a terrible thing to do.

Jennifer Doleac: Absolutely. Yes. So, there is currently some active academic research, but I also know that police departments are working very hard in-house to come up with ways to identify the problem officers early, so that there can be some sort of intervention. It's yet to be determined what that intervention is, and whether it has any effect. But at least it means identifying the problem officers first, before they get to a point where they've killed somebody. So there is active work there. But this is all related to the black box of policing, which we know much less about. One area that I think we've settled, in the economics literature at least, is that more police equals less crime. We know from study after study after study, when you have good causal identification, that hiring more police officers reduces crime rates. But that is on average. Right? And the best evidence suggests that most American cities are under-policed, not over-policed. But that is, again, on average. And certainly, we know from all the conversations we've been having in the last couple of years, not every officer is doing good things; and not everything that police officers do over the course of the day is beneficial and productive. I think we're just at the point where people are starting to figure out ways to get inside that black box of how police officers spend their days, and the variation across police officers, and to say something useful about how we could reform training, or the incentive structure for police officers, to get better outcomes.

Russ Roberts: Yeah. And again, it's the same issue as in teaching: you don't really want teachers to be rewarded based on the grades of their students or their test scores--you get all kinds of perverse effects there. Similarly, this is a big challenge. But it's one that I don't think government does very well. It struggles, I would say, to customize the way it treats either employees or situations. So, I'm not sure it's going to get better any time soon.

38:28

Russ Roberts: Let's talk about another piece of your work--your actual work, I think, I hope--which is on DNA [deoxyribonucleic acid] databases. Again, I have no idea how DNA databases work. I assume I can't get at one. I don't even know. Every once in a while you hear, 'Oh, somebody was checked against a DNA database and they were freed.' But I don't know how they work. So, tell us how they actually work, in a thumbnail; and then what you've studied and what you found.

Jennifer Doleac: Sure. So, DNA databases in a criminal justice context are computer databases, as you might expect, where you store identifying profiles of offenders along with DNA profiles from crime scenes; and then, on a regular basis, those databases are scanned to find matches--to match known offenders with evidence from unsolved crimes. Who is required to be added to the database depends on state law in the United States--national law in other countries, for the most part. But these databases are at this point extremely common: every U.S. state has one. We've had them for a long time. They gradually expand to include additional categories of offenders. Most of them started out with the worst of the worst--people convicted of homicide, or rape. Burglars actually tended to be next, because people thought of rape as often being a crime of opportunity, and it was burglars who broke into a house where someone happened to be home who would be most likely to commit a rape offense. Then we added more property crimes and less severe crimes. At this point, the policy frontier in the United States is states adding arrestees: if you are arrested for a felony, or convicted of a misdemeanor, you'd be added. So, basically, we keep adding groups that are plausibly at high risk of committing a crime; and then we collect the crime scene evidence and look for matches. The hope is that the technology will reduce crime in two ways. The main mechanism is that it increases the likelihood that you'll get caught for a future offense if you weren't already a suspect. If you're already a suspect, the police can get a warrant for your DNA and compare it to the crime scene evidence themselves; they don't need the database. The database matters especially for crimes committed by strangers, where someone wouldn't have been on the police's radar as a suspect. If you are that offender, once you are in the database you are much more likely to get caught for a future crime than you were before. And if you think about the standard Becker model of crime--a rational offender deciding whether to commit a crime considers the likelihood of getting caught and the potential punishment they would receive, and weighs that against the benefits to themselves of committing the crime--if we increase the probability of getting caught, that should deter crime. So one way these databases could reduce crime is by deterring the people who are added to them from committing more crime. But in addition, anyone who isn't deterred--who says, 'Yeah, I'm more likely to get caught, but who cares? This is what I do,' or whose crimes are crimes of passion, or whatever else--if the database doesn't deter them, we can at least get those repeat offenders off the streets more quickly. They're still more likely to get caught, and we'll incapacitate them and make places safer. So, those were the hopes as these policies went into effect. And I have studied the effects of DNA databases in the United States, as well as in a more recent paper in Denmark, where we have much better data.
And in both cases, what we look at is a database expansion: you've got an existing database, and suddenly the law expands it, so that if you were charged or convicted of, say, robbery on April 30th you are not going to go in the database, but if you were charged or convicted on May 1st you are. That sets up a nice natural experiment, where we can compare people whose DNA was eligible to be added to the database just before and just after the law goes into effect. And what we find, in both the United States and Denmark, is that people who are added to the database are dramatically less likely to re-offend. Recidivism falls a lot--much bigger effects than I ever would have expected going into these projects. And this tells us a lot, I think, about the power of increasing the probability of getting caught versus increasing the sentence--which has traditionally been the way U.S. policy tries to deter crime. But it also says a lot, especially in this context, about how powerful potential offenders think the technology is. I kind of think there's a bit of a CSI [Crime Scene Investigation, TV show] effect going on for them: they think that because they are in the database, they will instantly be picked up if they re-offend--
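
The deterrence logic invoked here is the standard Becker calculus. A minimal statement, with notation assumed for illustration rather than taken from the conversation:

```latex
% A minimal Becker-style deterrence condition; the notation is assumed for
% illustration. An offender commits a crime when the expected benefit
% exceeds the expected punishment:
\[
  b \;>\; p \cdot f,
\]
% where $b$ is the offender's benefit from the crime, $p$ the probability
% of being caught and punished, and $f$ the severity of the punishment.
% A DNA database raises $p$ for everyone added to it, so the condition
% fails for more crimes even when sentences $f$ are left unchanged --
% the margin Doleac contrasts with the traditional U.S. focus on raising $f$.
```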

Russ Roberts: There will be a helicopter hovering--

Jennifer Doleac: and it doesn't work quite that quickly in reality--right--but it certainly raises their chances of getting caught.

Russ Roberts: Shhhh! Don't tell 'em. Don't tell anybody.

Jennifer Doleac: Exactly. As long as the technology keeps advancing faster than offenders learn, I think we will be okay.

44:44

Russ Roberts: So, what are some of the downsides of those databases?

Jennifer Doleac: Hmm. The biggest one is the potential privacy costs--or the perceived privacy costs. The information that's actually in the database contains nothing sensitive. It's essentially a string of numbers; you can think of it as akin to a Social Security number. It doesn't really contain any information in and of itself. And the way they collect the DNA is a saliva swab; at this point it's a very mechanical process--they put it in a machine, and it prints out the string of numbers that goes in the database. So the government is not using people's genetic material to determine all kinds of health information--your predisposition to schizophrenia, or anything like that. But people worry about that. Right. People do worry about a slippery slope--

Russ Roberts: I worry about it--

Jennifer Doleac: Yeah. And so the way I frame the results of my research in this area is: all of those costs are real, and people are going to perceive them however they perceive them. Right? It's going to differ from person to person, how much you worry about this. But at the very least, let's think about what we're getting in exchange for that privacy. If you know that recidivism falls--I think in Denmark we find recidivism falls by about 40%--well, there are a whole lot of tools that are a lot more invasive than this one. And so, if you're okay with having cameras everywhere but not okay with this, I think there's a scare factor here because it's genetic material. But at the very least, it gives you a benchmark to think about. If we're saving $X every year due to reduced crime, and you look at that number and you say, 'I don't know--it's still not worth it,' then that just means your perceived privacy cost is higher than that. And that is useful information.

Russ Roberts: Yeah. It opens up a bunch of interesting things. One, of course, is to take the DNA swab and throw it away, but not tell the person. That's dishonest, of course, and you would think a bad policy in a democracy. But it might be a lot better than storing genetic material on people. If you watch the movie Gattaca, which I recommend--it's a fantastic film--it's about the awareness of how inevitably you leave genetic material around. And I think it's somewhat true--whether it's a strand of hair or a fleck of skin or whatever it is. And it's great, because, 'Hey, we're going to get him.' And it's horrible, because it means there's a level of potential control by the state that's not healthy. So.

Jennifer Doleac: But, yeah, I think in some ways one of the benefits of this technology is spreading the risk of wrongful conviction, if you want to think about it that way, to a broader group. I think it's healthy that we are all now thinking, 'Well, what if I get caught up in the DNA database?'

Russ Roberts: Good point.

Jennifer Doleac: Like, you know--and right now, again, the comparison is not some ideal system where we know the truth. The comparison should be the status quo, where police officers arrest the wrong guy all the time. There are extremely high privacy costs for certain communities in this country who are targeted as the usual suspects for police. And these advances, where you've got scientific matches, should make most arrests more accurate. In cases where you happened to be in the wrong place at the wrong time and they found your DNA there--or where there's human error, or active human malfeasance, which is occasionally reported in these crime labs--the rest of us could get caught up in that sometimes. But that's actually somewhat useful, in making us think about the problems that others have already been dealing with.

Russ Roberts: But, as you say, it can also prove someone's innocence. Which is wonderful and glorious.

Jennifer Doleac: Absolutely.

49:35

Russ Roberts: Besides errors like coding errors--and there's human imperfection, obviously, in entering the database and so on--are there any strange things? I remember a story about a cousin getting tangled up in something because they shared DNA with somebody. Am I remembering that correctly?

Jennifer Doleac: So, I don't know about the specific story, but I know that one of the more controversial policies that states can implement is allowing familial DNA searches, where they basically look for partial matches that could lead them to the family member of someone who is in the database.

Russ Roberts: Yeah; that's probably what I'm thinking of.

Jennifer Doleac: Yeah. And so, you can imagine--it essentially expands the database to include not just people who have been convicted or arrested in the past, but basically everyone in their family. Which--I mean, you still have to then demonstrate that the person was actually there and committed the crime; the partial match is certainly not going to solve the case by itself. What's really interesting to me in this space right now is all the recent stories about how they caught that serial rapist in California by uploading DNA samples to a private DNA database used for genealogy. A lot of these databases--23andMe, and those types--don't allow you to upload your own string of numbers; you have to actually send them the DNA sample, which the police would not have. But some databases do. And so they caught that one offender in a case that had been cold for a long time; and then other police departments said, 'Whoa, that's a great idea.' So other police departments are now doing this. And it's just been fascinating to think about. I suspect most of these databases will change their privacy practices to not allow this, because they imagine most people are going to stop paying for the service if they know they could be pulled into investigations. But it is really interesting to think about: if you don't perceive this as having big privacy costs--if we have all the safeguards in place to make sure they are not coding other stuff from your genome, or sensitive information, and you trust the process in terms of there being a high burden of proof to show that you did it, and all that--what an incredibly cheap way to catch offenders and reduce crime. It's one of those advances that is both terrifying and really exciting. Because if this type of tool can dramatically reduce criminal offending, it is surely--well, I suppose this is somewhat controversial--but I think it is a lot less socially costly than putting lots of people in prison for decades on end.
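
For intuition on the "partial match" idea behind familial searches: forensic profiles are, as Doleac says, essentially strings of numbers--allele values at a set of loci--and relatives share more of them than strangers do. A toy sketch with invented profiles (the locus names are real CODIS markers; everything else is made up):

```python
# A toy sketch of "partial match" scoring behind familial DNA searches.
# Real forensic profiles record two allele values at each of ~20 STR loci;
# relatives share more alleles by descent than strangers do. The profiles
# below are invented for illustration.
from collections import Counter

def shared_alleles(profile_a, profile_b):
    """Total alleles shared across loci (multiset intersection per locus)."""
    return sum(
        sum((Counter(profile_a[locus]) & Counter(profile_b[locus])).values())
        for locus in profile_a
    )

crime_scene  = {"D8S1179": (12, 14), "D21S11": (28, 30), "TH01": (6, 9.3)}
database_hit = {"D8S1179": (12, 13), "D21S11": (30, 30), "TH01": (6, 7)}

print(shared_alleles(crime_scene, database_hit), "alleles shared")
# An exact match shares every allele; a partial-but-elevated count is what
# prompts investigators to look at the database person's relatives.
```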

52:45

Russ Roberts: Yeah. I'm going to put on my Left-wing hat--which I have to reach for; it's not readily available, but I think I can find it here in my office. I want to mention, I think, another cost of this kind of improvement or solution, which is: it diverts us from thinking about the underlying reasons that people commit crimes and have miserable lives and do irreparable harm to people around them. I think for most wealthy, successful, un-imprisoned people like me, it's tempting to say, 'We care about it; it's just, you know, catching criminals.' But we don't, really. What we really care about--I hope--is creating an opportunity for human beings to flourish. And this kind of solution addresses that kind of problem in a way-too-late way that is seductive. So, that's my worry.

Jennifer Doleac: Yeah. That's totally fair. Crime is a costly outcome in itself, but it is also a proxy for other things that are going wrong in someone's life. And if you think of it as more like a symptom than the underlying illness, then you are totally right. So, a lot of my work right now--I've been doing all this tech-and-crime stuff, but I wrote a paper a couple of years ago on Ban the Box policies, which we can talk more about if you'd like--

Russ Roberts: Explain what that is.

Jennifer Doleac: Sure. So, Ban the Box policies have become really popular in the United States as a way to try to improve access to employment for people who have criminal records. The motivation is that we know employers discriminate against people with criminal records. The name of the policy comes from the box that you are asked to check if you've ever been convicted of a crime; employers would just throw out the applications that had the box checked. And so a lot of very well-meaning people said, 'Well, we can solve this problem by just banning the box. We'll just tell employers they can't ask any more.' And then, as you noted earlier, the question that economists love to ask is, 'And then what?' It turns out that's really important here, because the policy doesn't do anything to address why employers were worried about hiring people with criminal records. And it turns out that when they are not allowed to ask any more, they try to guess. And they reduce their hiring of black men. You see a net reduction for young black men who don't have a college degree--the group most likely to be helped by this policy if it's helpful, but also the most likely to be statistically discriminated against if employers are just trying to guess who has a criminal record. Their employment rates fall by 5% in the years after Ban the Box goes into effect. So, a great example of unintended consequences of well-meaning policies. In the process of writing the paper, and in the years since, I've become very interested in prisoner re-entry: what do we know about what would be better than this? Ultimately, I do research because I care about making policy better, and trying to achieve the outcomes these policies had in mind; and I want to be able to recommend alternatives to Ban the Box for people who really just want to help people with criminal records build better lives. And unfortunately, we just know so little about how to do this well. So that is another area where, as you said, we want to help people have opportunities and build better lives, so that they don't need to be involved in crime or don't wind up going down that path--but also so they can get out of it if they want to change their lives. And figuring out ways to help that process along, and to make the opportunities available to people who want to seize them, is something we just don't know much about yet, as researchers.
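
The statistical-discrimination mechanism can be made concrete with a tiny numerical example. All rates below are invented for illustration; they are not the estimates from Doleac's paper.

```python
# A tiny invented example of statistical discrimination under Ban the Box.
# When employers can't see who has a record, they fall back on the group
# base rate they believe applies. All numbers are made up for illustration.

def callback(p_record, rate_if_record, rate_if_clean):
    """Employer's callback rate given the perceived chance of a record."""
    return p_record * rate_if_record + (1 - p_record) * rate_if_clean

# Before the ban: applicants the employer knows are record-free get 0.30;
# known records get 0.05. After the ban, everyone in a group the employer
# associates with a 30% record rate is treated at the pooled rate:
pooled = callback(0.30, 0.05, 0.30)
print(f"pooled callback rate: {pooled:.3f}")   # 0.225

# The 70% of the group without records drop from 0.30 to 0.225, while the
# 30% with records rise from 0.05 to 0.225 -- the ban helps its targets by
# taxing everyone who statistically resembles them.
```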

57:05

Russ Roberts: I want to come back to this parallel, which has been haunting me in our conversation, between education and crime. You know, you say we don't know much about it. I'd say we don't know much about what makes a good school teacher, in the sense that it's not easily quantified. Like: Have a Master's degree. That's not an important part of being a good teacher.

Jennifer Doleac: Sure.

Russ Roberts: And so, as researchers, we inevitably look where the light is. Like the drunk stumbling around for his keys under the lamppost when they were lost farther away. And so, we're always looking for measurable, quantifiable things that can enter into a regression. When in fact I think we ought to be looking at videos and intangibles and subtle things, and enhancing the opportunities for people who we know are good at this even though we can't explicitly measure why. And getting the people who aren't good at it out. So, for example, it seems to me this is an example where you want 500, or 500,000, non-profits funded by voluntary contributions to help transform people's lives after prison. Most of them will not be good at it. But there will be a handful that are good at it. And we want to scale those up. And, again, I'd get the government out of this for a lot of reasons. But one reason would be that they are not going to be able to discriminate in any way. They are just going to have to fund--they are not going to be able to make the judgment call, the way the head of a really first-rate non-profit could, about who is doing a good job and who isn't. They are going to need measurable things; we're back to square one again. So it just strikes me that, just as people are doing really fabulous things trying to figure out the subtle things that make better teachers: The subtle things we can train people to do after prison to have better lives is not going to be a scientific enterprise. It's going to be an artful enterprise, and we ought to let a thousand flowers bloom and let the good ones thrive.

Jennifer Doleac: So, I'm with you in spirit, to a certain extent. I think the challenge is that we know that well-meaning policies and programs backfire all the time. There are unintended consequences all the time. A great example of this in the re-entry space is the real emphasis recently on holistic wraparound services. The idea is that people coming out of prison have just a tremendous number of needs. They generally have higher rates of substance abuse and mental illness, no work history. You know, on and on and on. Nowhere to live--

Russ Roberts: A full-court press--

Jennifer Doleac: Yeah, a full-court press. So, we give people intensive case management. They have someone they can go to 24 hours a day who will help connect them with all the services they need. These types of programs are highly praised in communities across the country; they are typically the most common type of intervention. And then it turns out that when you actually do a Randomized Control Trial of them, they don't have any benefits. In fact--thinking of your example of just giving funding to local communities to do whatever needs to be done there: The funding was coming from the government, which you might not like, but basically the idea was, in recent years the government has poured a ton of money into funding non-profits that focus especially on these more holistic types of treatments. It wasn't just job-training or something like that. They just gave them money and said, 'Keep doing what you're doing.' But, again, it was implemented as an RCT, so that we could see what the impacts were in these local communities. And, on average, people in the treatment group--who got access to the services in these programs that got all the money--were more likely to wind up with another conviction down the road. Not less. And it just highlights how hard this is. Right? Of course you are trying to address needs and trying to do good things, but there's also the potential that the full-court press itself could actually do more harm than good, in the sense that maybe it just takes up time the person could have been spending some other way--you know, looking for a job or something. Alternatively, there could be something about holding someone's hand through all of these different facets of their life that reduces their own sense of agency.
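For readers unfamiliar with how an evaluation like this is read, here is a minimal sketch of the difference-in-means logic behind the kind of RCT Doleac describes. The sample size, reconviction probabilities, and effect size below are invented for illustration only:

```python
# A minimal sketch of reading an RCT: compare reconviction rates between
# randomly assigned treatment and control groups. All numbers are made up.
import random

random.seed(0)
n = 1000
# Hypothetical reconviction probabilities. In the study Doleac describes,
# the treatment group did *worse*, so the treatment rate is set higher here.
p_control, p_treatment = 0.30, 0.34

control = [random.random() < p_control for _ in range(n)]
treatment = [random.random() < p_treatment for _ in range(n)]

rate_c = sum(control) / n
rate_t = sum(treatment) / n
# Because assignment was random, the difference in means estimates the
# causal effect of being offered the wraparound services.
print(f"control: {rate_c:.3f}, treatment: {rate_t:.3f}, "
      f"effect: {rate_t - rate_c:+.3f}")
```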

Russ Roberts: Yup.

Jennifer Doleac: And a more targeted intervention--just cognitive behavioral therapy and then leave it at that, or something like that--could give the person the freedom to rack up wins on their own, without help. And that could be just incredibly beneficial. So this is something where, you know, some things are not going to be measured; some things are just going to be really hard to measure--we'll never have all the answers. But I do believe in the power of good research to be able to point us in the right direction better than our guts are able to do.

1:02:32

Russ Roberts: I agree with that 100%. In particular, while you were talking, I was thinking: But what about agency? What about this feeling that someone's constantly hovering over you? So, again, there's an artful way to do it. And your other point, which I think is crucial, is: It's really hard to do. And so, I would expect the modal impact to be zero. But that doesn't mean there aren't 5 or 10 or maybe even 20% of the people who are helping others who do it really well. And they get lost in the noise, because it's hard to measure. Or, even though the assignment is random, they just happen to draw particularly difficult people--the process is random, but the actual outcomes, the people they are dealing with, are not. And so, I just think--anyway, well said.

Jennifer Doleac: Yeah. I'm totally with you. And when I go around and talk to policy-makers about this, my last line is always: We should assume everything we try will fail. And it's not coming from a place of pessimism. I think of myself as a very optimistic person, in the sense that I think there are answers out there. I think we will figure out what works. But we are slowing ourselves down by becoming invested in certain programs before we know that they are working. The best way to figure out quickly what's going to have the biggest impact is to implement things, try new things, and do it in a way that allows us to test what the impact is. And then, as soon as we figure out something's not working, move on--try something else. Right? If we go in with the expectation of failure, then we are much less likely to wind up in the situation I see all the time in government agencies--who care deeply about this, but they have a pet program that has been running for years, and they just don't want to know if it doesn't work, frankly, because they believe in it so strongly. And that's just so detrimental to figuring out what can be beneficial.

Russ Roberts: Just to make one last--at least for now--analogy to education: I think there's a terribly mistaken belief that we need to figure out what is The Solution to education: What's the right way? What's the correct curriculum? And it comes back, in a way, to the Norway example you mentioned--training in prison--

Jennifer Doleac: Mmmhmmm--

Russ Roberts: 'Well, that's the solution. That's what we need to do. Let's take everything they do; we'll just do it here.' When in fact the real solution is going to be multivariate, manifoldly different, depending on the individual, depending on the location, depending on the neighborhood, etc.--and that's the artfulness part. And we need 10 different solutions, not one.

Jennifer Doleac: Yes.

Russ Roberts: And we ought to be ready for that.

Jennifer Doleac: Yeah.

1:05:26

Russ Roberts: Let's close with some bigger-picture issues. You are an economist. Economists have been studying crime probably since Gary Becker, which is about 50 years now--which is pretty amazing. And when Becker came into the field, he was ridiculed by sociologists and others who said, 'People--criminals--aren't rational. By definition.' And his response, of course, was, 'It's useful to treat them as if they are, and incentives matter even for criminals.' And, of course, your work is, in many ways--not 'in many ways'--your work is in that tradition. And I'm curious what you think the challenges are facing economists in interacting with a policy space full of non-economists who don't always respect what you are doing.

Jennifer Doleac: Yeah. Great question.

Russ Roberts: What's your personal experience on that?

Jennifer Doleac: Yeah. So, you're right that economists don't view the world the same way non-economists do. For better or worse. When people ask me how an economist would end up studying crime, my first answer is that we're interested in incentives, and incentives are relevant here not just for rational offenders but also for police officers and judges and all the rest. But we're also really focused on weighing costs and benefits. And, in order to do that, we are much more focused than any other social science discipline, I think, on identifying the causal effect of a policy. You have to know you've nailed down what the impact of the program is in order to calculate what the benefits of the program are. And so we have developed, as a discipline, a toolkit that allows us to focus on measuring that causal effect. That distinguishes us from researchers in other disciplines. We have discussions within economics, ourselves, about the extent to which our focus on causal identification distracts us from the important questions, and those are useful discussions to have. But I do think measuring the causal impact is important, and that is definitely one of our big contributions. In the crime space, the pushback most often tends to be around how you can possibly place a value on, say, a human life. Right? If we are thinking about whether some crime-reduction policy could reduce the homicide rate, some people will say, 'Well, then, we obviously have to do it, no matter what--if we save even one life, it's worth it.' And economists, because we are used to thinking about trade-offs and thinking in terms of quantifiable numbers, will react, I think for the most part, to that sort of reasoning and say, 'Well, the way we are all living our lives suggests that we don't place infinite value on other people's lives.' Right? We all drive cars. We do things all day long that might potentially have negative impacts. So obviously we are weighing tradeoffs, and we place some non-infinite value on these things. But in order to weigh those tradeoffs, we have to put numbers on them. It's just the nature of the game. I have other work on opioids, and I've dipped my toe into that debate a couple of times now, looking at the unintended consequences of policies that reduced the risk associated with using opioids. In that debate, I wind up interacting with the public health research community. And they also, I think, have a really hard time with the way economists approach things. I have not figured out how to have productive discussions with that group, unfortunately. I think there's just a lot of talking past each other about what economists are bringing to the table: the value of causal inference, and just the possibility of tradeoffs. I think economists are much more used to thinking about tradeoffs than most other disciplines are. And comfortable with it. No policy is all benefits, right? That doesn't mean it's not worth doing. If we know what the costs are, maybe we can mitigate those costs. And that's useful.
And I think a lot of my interactions with those in the public health space suggest that they view any mention of a cost or potential tradeoff as a threat to the possibility that the program will happen. And I think that's unfortunate.
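As a worked illustration of the "non-infinite value on lives" point, here is a back-of-the-envelope cost-benefit sketch. The value-of-statistical-life figure, the lives saved, and the program cost are all hypothetical placeholders, not numbers from Doleac's research:

```python
# A minimal sketch of the cost-benefit logic described above, with made-up
# numbers. The "value of a statistical life" (VSL) is a standard tool for
# pricing mortality risk; the figure used here is illustrative, not official.
VSL = 10_000_000                    # hypothetical dollar value per statistical life
lives_saved_per_year = 12           # hypothetical effect of a crime-reduction policy
program_cost_per_year = 90_000_000  # hypothetical annual cost

benefits = lives_saved_per_year * VSL
net = benefits - program_cost_per_year
print(f"benefits: ${benefits:,}  costs: ${program_cost_per_year:,}  net: ${net:,}")
# Placing a finite number on a life is what makes the tradeoff weighable at
# all; with an infinite value, every life-saving policy passes regardless of cost.
```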

Russ Roberts: That's the--that's the 'something better than nothing' impulse I think we all have. Which is a human impulse. And I think economists are fairly well inoculated against that, because we are trained relentlessly in unintended consequences. Which is a beautiful thing. And sometimes very difficult, as you say, for other disciplines, other people, to listen to. Either they don't want to hear it, or they think we're just wrong.

Jennifer Doleac: Which maybe we are.

Russ Roberts: Right--well, one of the challenges--

Jennifer Doleac: Ultimately, most of these questions are empirical questions. And then you can take it to the data. One thing I love about being an economist is that, as a group, we rarely let each other coast on our priors. Right? Almost no hypothesis is off limits in an economics seminar room. You can think that the person asking the question is dead wrong and their hypothesis is crazy, but you need to be able to show it in the data. Or in your model, or something. You need some evidence that they are crazy. You can't just call them a bad person; that's not going to end the argument. And so, for better or worse, I think economists are used to being called bad people--heh, heh, heh--and it's not persuasive to us. So, I think that makes our research better.

1:11:38

Russ Roberts: I think before we started recording our conversation, you and I talked about what I would call one of the costs of being an economist--a cost of acquiring this inoculation and our natural tendency to worry about unintended consequences. Which is: We can fall in love with contrarian results--results that show that the public health profession's priors are wrong. Just to take a famous example: Sam Peltzman is not so popular in some circles because he showed that many safety measures encourage people to drive more recklessly, say, or to behave more recklessly. And I think he's right. I think he's onto something. I think the--

Jennifer Doleac: Mmmhmm.

Russ Roberts: And when people say, 'Well, people don't take those into account,' I just ask them: 'If football players didn't wear helmets, do you think they would play the same way?' And the answer is: They would not. That doesn't mean that every football player, when they make a play, is thinking, 'Well, at least I have a helmet on.' And, tragically, we've been forced to come to grips with this very unpleasant, unintended consequence of safer helmets: They've led to less-safe play. And it has done terrible damage to some people. But, if we're not careful, we fall in love with that.

Jennifer Doleac: Mmmhmm.

Russ Roberts: And those are the things--those unexpected results, those contrarian results, those unintended-consequence results--we're a little bit--I think, not a little bit--I think we're over-enamored of them. It tends to push us toward research findings that are more publishable, and cleverer, and maybe not always true.

Jennifer Doleac: Yeahhh. I do think that there's a reason we are enamored with them. Right? This is the unique perspective that economists bring to the table. If you just want good program evaluation, there are plenty of people who can do that. Heh, heh. So, it's going to be economists who say, 'Oh, well, if you make it safer to do that, people will probably do more of it. I wonder if I can measure that.' You are right that that is the sort of natural experiment, the sort of analysis, that economists just love. It puts a smile on our faces to hear about that sort of thing. But that's partly because it reveals something about human behavior that wouldn't be revealed otherwise. And in many cases it's important. Right? One thing I've learned to do is to highlight up front that recognizing these unintended consequences doesn't necessarily mean we shouldn't implement the policy. We just need to recognize that there are tradeoffs. And knowing what the tradeoffs are--having a full understanding of all of the costs and benefits in a situation--allows us to implement other policies that maybe can help mitigate those costs. Or find a better policy that doesn't have the costs. If we just settle for the first thing that seems to work, regardless of all the downsides, we're missing out on opportunities to do even better things. And so, yeah. Economists are always bringing the bad news to the conversation. But I think--I hope--I mean, the reason I'm an economist is that I think it ultimately leads us to a better place.

1:14:59

Russ Roberts: I'll play public health researcher for a minute.

Jennifer Doleac: Heh, heh, heh--

Russ Roberts: and point out that I think we have flaw, as economists, to assume that, 'Oh, we'll save this money because it will be spent more wisely elsewhere.' And you could argue that the public health people really have the right view. Which is: That money is not going to be spent at all in this area. That, the public reacts to certain research findings with, in a non-rational, irrational ways. And we're better off at least with some money going there. And so, I think that's the downside of our engineering focus: that we'll just reallocate the money more efficiently. Because it often doesn't happen. It's-- So, that's relevant--

Jennifer Doleac: Yeahhh. I do hear that from public health groups a lot. But when I talk with actual policy-makers and practitioners, who are on the ground trying to figure out what to do about a certain policy, or how much to invest: They are very thoughtful about this. My sense is that none of them are saying, 'Okay, end all the safe injection sites,' or something. No one's behaving that irrationally. I think people on the ground making these sorts of decisions are very aware that there are tradeoffs. They are the ones who have to balance their budgets every year. Right? They are fully aware of this stuff. And I think we should just give people more credit for being able to handle complexity. You're right that a lot of this is voters and all kinds of people responding to things without full knowledge. But I do think that we're capable of having more complex conversations than we often give ourselves credit for.

Russ Roberts: That was very well said. Safe injection sites is a reference to?

Jennifer Doleac: So, I recently wrote a Brookings blog post reviewing recent research--mostly by economists, though not all--about what to do about the opioid crisis. And we got a lot of pushback on our review of the evidence on harm-reduction policies. Mostly around the moral hazard concerns that Sam Peltzman made so famous--and safe injection sites and syringe exchange programs are in that category. You can imagine them having huge benefits, on net, but coming with the unintended consequence of leading to more drug use, because it's safer. And so thinking about that kind of trade-off, and what we can do to try to mitigate it, I think is really important.

1:17:28

Russ Roberts: I want to close with one other thing I worry about. I think you are a very careful scholar. You are working in an area that's extremely important--an area where people's lives are going to be affected by what you find. But you are a human being; and you obviously are going to be drawn to the contrarian result when you see it. And one of the challenges I know we have in our profession is that when you go out to 'measure' the impact of a policy--which is a phrase that makes it sound like you've got a ruler or a calorimeter or some kind of measuring device--what we actually do in our field is run maybe 500 or 5,000 regressions. Econometric analyses. And we figure out--we convince ourselves--which is the right one. So, reflect a little on the humility that should engender. And I'm curious whether you worry about that, as your career advances, and about the incentives you face for dramatic findings as much as accurate ones.

Jennifer Doleac: Yeah. So, it's definitely something that I think about. I spend a lot of time with academic psychologists who have been dealing with the replication crisis there, and it seems like the--

Russ Roberts: We've talked about it a lot on the program. We've had Brian Nosek on twice.

Jennifer Doleac: Yeah. And I think the conclusion in that field has been to really go all in on things like pre-analysis plans, and pre-registering experiments, and so on. And I just saw that the AER [American Economic Review] now requires that you pre-register any RCT that you are going to--

Russ Roberts: American Economic Review--

Jennifer Doleac: Yeah, it's one of our top-5 journals. If you want to publish your randomized control trial in their journal, you'd better have registered it up front, before you ran it. I actually really like--so, I also hear plenty about people gaming that system: running a zillion pilots before they figure out the one that works--

Russ Roberts: Huhh.

Jennifer Doleac: And then they register that one. Right? Yeah. You could figure out ways to game it. But I actually really like that economists have gotten to the point where the standard empirical paper now is, like, 30 pages of text and a 100-page Appendix, with every robustness check under the sun. Especially in observational data analyses, where you are not running an experiment--you are dealing with a natural experiment you are trying to evaluate--you are learning so much along the way that the idea that your first regression is somehow going to be the best one, or the right one, or the most uncontaminated, just strikes me as uninformed by how much learning is involved in the process. What we've come to do as a profession is expect that, for every judgment call you make, you'd better show me in the Appendix a table where you do it every other way, so I can see that you didn't just choose the outlier. What happens if you drop one state at a time, or choose different thresholds, or all these different things? And ultimately, this is a profession where we have to trust one another. Our reputations are everything. So the risk for someone like me--or anyone who is actively publishing academic papers--is tremendous: if you get a reputation as someone who is cherry-picking results, or who is not being transparent and scientific about the process, people aren't going to trust you any more. And that defeats the whole point of doing this. So, I do think that ultimately there is not going to be an easy policy fix for this. Trust is going to be a big component of it. And the best researchers are really careful. And, also, ultimately every paper contributes to a literature, and we interpret the literature as a whole instead of just looking at a single paper. So, yeah. It's not a super-direct, easy answer. But that's how I think about this issue.
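As a concrete illustration of the "drop one state at a time" robustness exercise Doleac mentions, here is a minimal sketch. The state-level numbers are stand-ins, and a real analysis would re-estimate the full model on each leave-one-out subsample rather than averaging precomputed effects:

```python
# A minimal sketch of a leave-one-out robustness check: recompute the
# headline estimate on every subsample that drops one state, and confirm
# no single state is driving the result. All estimates here are invented.
from statistics import mean

# Hypothetical state-level treatment-effect estimates.
estimates_by_state = {
    "TX": 0.04, "VA": 0.06, "WA": 0.05, "NY": 0.05, "CA": 0.07,
}

print(f"full sample: {mean(estimates_by_state.values()):.3f}")

for dropped in estimates_by_state:
    subsample = [v for s, v in estimates_by_state.items() if s != dropped]
    # If dropping any one state moves the estimate dramatically, that state
    # is an outlier doing the work -- the kind of thing appendix tables expose.
    print(f"drop {dropped}: {mean(subsample):.3f}")
```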