Episode 88: Sara Heller & Max Kapustin
Sara Heller is an Assistant Professor of Economics at the University of Michigan.
Max Kapustin is an Assistant Professor of Economics and Public Policy at Cornell University.
Date: February 28, 2023
A transcript of this episode is available here.
Episode Details:
In this episode, we discuss their work on how to reduce gun violence:
“Predicting and Preventing Gun Violence: An Experimental Evaluation of READI Chicago” by Monica P. Bhatt, Sara B. Heller, Max Kapustin, Marianne Bertrand, and Christopher Blattman.
OTHER RESEARCH WE DISCUSS IN THIS EPISODE:
“Cure Violence: A Public Health Model to Reduce Gun Violence” by Jeffrey Butts, Caterina Gouvis Roman, Lindsay Bostwick, and Jeremy R. Porter.
“Machine Learning Can Predict Shooting Victimization Well Enough to Help Prevent It” by Sara B. Heller, Benjamin Jakubowski, Zubin Jelveh, and Max Kapustin.
“The Enhanced Transitional Jobs Demonstration: Implementation and Early Impacts of the Next Generation of Subsidized Employment Programs” by Cindy Redcross, Bret Barden, Dan Bloom, Joseph Broadus, Jennifer Thompson, Sonya Williams, Sam Elkin, Randall Juras, Janae Bond, Ada Tso, et al.
“Thinking, Fast and Slow? Some Field Experiments to Reduce Crime and Dropout in Chicago” by Sara B. Heller, Anuj K. Shah, Jonathan Guryan, Jens Ludwig, Sendhil Mullainathan, and Harold A. Pollack.
“Reducing Crime and Violence: Experimental Evidence from Cognitive Behavioral Therapy in Liberia” by Christopher Blattman, Julian C. Jamison, and Margaret Sheridan.
“Reducing Violence Without Police: A Review of Research Evidence” by Charles Branas, Shani Buggs, Jeffrey A. Butts, Anna Harvey, and Erin M. Kerrison.
“Advance Peace Stockton, 2018-20 Evaluation Report” by Jason Corburn and Amanda Fukutome.
“Implementation Evaluation of Roca, Inc.” by Abt Associates.
“Reaching and Connecting: Preliminary Results from Chicago CRED’s Impact on Gun Violence Involvement” by Northwestern Neighborhood & Network Initiative.
TRANSCRIPT OF THIS EPISODE:
Jennifer [00:00:08] Hello and welcome to Probable Causation, a show about law, economics, and crime. I'm your host, Jennifer Doleac of Texas A&M University, where I'm an economics professor and the director of the Justice Tech Lab. I have two guests this week. Guest number one is Sara Heller. Sara is an assistant professor of economics at the University of Michigan. Sara, welcome.
Sara [00:00:28] Thank you so much.
Jennifer [00:00:30] And guest number two is Max Kapustin. Max is an assistant professor of economics and public policy at Cornell University. Max, welcome to the show.
Max [00:00:40] Thanks so much, Jen.
Jennifer [00:00:41] Well, very happy to have both of you here. We're going to talk today about your research on how to reduce gun violence. Big question. But before we get into that, could you tell us about your research expertise and how you became interested in this topic? Sara, why don't you go first?
Sara [00:00:56] Sure. So broadly, I'm a labor economist interested in how young people make choices and then how policy can help shape those choices to make people better off, and in particular those at the bottom of the income distribution. I actually didn't start out studying gun violence. I started out interested in education policy, but I realized as I started reading about and talking to young people that their decisions about education don't happen in a vacuum, right? They're making choices about education and work and crime all at the same time. And so that sort of got me more interested in the economics of crime, along with labor economics. And then, you know, anyone studying crime in the U.S. knows just how much of an outlier we are in terms of gun violence and just the incredibly tragic toll that it can take, especially on communities that are already marginalized in lots of other ways. And so I think I came to this gun violence prevention topic motivated by how much harm is being done every day and a desire to try to use my own expertise, which is in designing and running experiments to rigorously test the effects of social programs, to try to do something to help.
Jennifer [00:01:58] Awesome. And Max, how about you?
Max [00:01:59] Yeah. So my background is also kind of broadly similar to Sara's. I'm also a labor economist and I'm interested in studying ways to improve the life outcomes of young people and adults, particularly in underserved communities. And I previously studied questions in health and housing, but I kind of found my way to crime and to gun violence specifically when I spent a few years at the University of Chicago Crime Lab, where both Sara and I are now affiliates and which also played an important role in getting READI started. I think it was my time in Chicago and at the lab that really kind of focused my attention on how ruinous gun violence is to neighborhoods and how it shapes, and also too often tragically cuts short, people's lives. So, you know, like Sara, I believe there's a role for social science in developing and identifying social programs or other policies that can help reduce gun violence.
Jennifer [00:02:52] Your paper is titled "Predicting and Preventing Gun Violence: An Experimental Evaluation of READI Chicago." It's coauthored with Monica Bhatt, Marianne Bertrand and Chris Blattman. So, Sara, tell us a bit about the context here. What does gun violence look like in Chicago and who's affected by it?
Sara [00:03:08] So if all you did was read the newspaper headlines, you might think that Chicago is basically the murder capital of the U.S. It turns out that's not true. Chicago is actually kind of in the middle of the pack. So it certainly has higher homicide rates than the two bigger cities, New York and L.A., but much lower homicide rates than places like Baltimore and St. Louis. So it's sort of more in the middle and more comparable to places like Philly or Detroit. That said, Chicago still has a shocking and devastating amount of gun violence. So if you include non-fatal shootings and you look back to the year this project went into the field, which was 2017, there were almost 3,400 people shot or killed in the city. And, you know, your listeners might have a sense from watching things like The Wire that most of that violence is related to organized drug markets and gangs, but that actually tends to be less true now than it was back in the nineties, partly because law enforcement made a really purposeful decision to dismantle the hierarchical gang structure in Chicago.
Sara [00:04:08] And so some of the violence is certainly still about selling, but it's become now more about personal altercations, small, very local cliques and crews, the beefs that they have with each other, and then the escalating cycles of reciprocal violence that occur when something happens. In terms of who's affected by it, gun violence itself tends to be very concentrated in a small number of neighborhoods that have a long history of marginalization and among a small number of people within those neighborhoods, and just like elsewhere in the country, young black and Hispanic men are disproportionately the ones involved in violence. So nationally, homicide kills more young black men than the nine other leading causes of death combined. But I would argue that we really shouldn't think of the victims and the offenders as the only people who are affected by gun violence. Gun violence is pretty much always at the top of every policymaker's agenda in the city because it shapes the lives not just of the people who are directly behind and in front of the gun, but also, you know, their families, their communities and really the city as a whole.
Jennifer [00:05:13] So this is obviously a research topic that has gotten lots of attention over decades. So, Max, I'm going to ask you the unfair question of briefly summarizing all of that research, or at least the research that you all were most focused on as you were going into this study. So what did we previously know about how to reduce gun violence?
Max [00:05:34] It's not an unfair question. We know a lot less than we ought to. That's why we're doing a study. And there's a lot of ways to characterize it, but most of the prior research on interventions specifically meant to reduce gun violence has really focused, I think, on two broad areas. One is policing and the other is just sort of a bunch of different community-led efforts. So on the policing side, we have pretty strong evidence that, you know, for example, hiring more officers or concentrating officers in areas where crime is likely to happen, so, you know, hot spots policing, those kinds of things can reduce serious violence, and that includes homicides and gun violence.
Max [00:06:11] There are other strategies like focused deterrence that your listeners might know, right, in which police and community groups try to deter violence among specific people at high risk of it. Those are harder to study in ways that can really reliably estimate their causal impact. So I just think the evidence there is not quite as clear. But one thing about the policing research that we have: I think equally as important as how many police there are or where they focus their attention is what they're actually doing, which is both where we know a lot less and where there's also a lot of concern. Right. Because if we think about aggressive strategies that prioritize things like street stops and low-level arrests, for example, you know, those might have very different benefits and carry very different costs than, say, you know, improving how well detectives can investigate shootings.
Max [00:06:55] Right. So I think at least partly for that reason, there's a lot of interest in finding alternative ways to reduce shootings that don't involve law enforcement and that are community led. And when we look at that side of the picture, I think perhaps the most widely adopted and studied community violence intervention is something known as violence interruption, and this is where outreach workers try to kind of mediate active disputes in a neighborhood and establish a norm of nonviolence, but that, too, is hard to study causally in a reliable way. And what evidence exists has been described, not by us, but by others, as being mixed at best. So it's precisely this lack of very reliable evidence, particularly on community-led gun violence interventions, that made us so excited to study READI.
Jennifer [00:07:39] So, Sara, why don't we know more than we do? What makes this topic so difficult to study?
Sara [00:07:44] So I think there are a couple of reasons. The first is one that Max kind of alluded to, which is just the challenge of evaluation. Right, so a lot of interventions you might think of try to intervene at either the state or the community level. So, you know, gun policy changes tend to happen at the state level; a number of interventions, including the ones Max mentioned, you know, tend to try to change social norms or intervene within a geographic area for the whole community.
Sara [00:08:09] There's lots of reasons to think that might be important to do, but it also makes it really hard to isolate the effects of those interventions from anything else that's happening at the state or the community level. Right. You have to find other states or other communities that are good comparisons for the place you're intervening to see what would have happened otherwise. Right. They have to look like the place you're doing the intervention in, in everything but the intervention itself, and that's just a really tall order, right? You probably picked the communities you're intervening in because of how bad violence was getting there. And so it's just not clear how to find other places that didn't get that bad, but look like that place might have, that could help us understand what might have happened otherwise. And so, you know, I think the first reason is gun violence interventions are just hard to evaluate well.
Sara [00:08:55] The second reason I would say is that, you know, working with a population that's at really high risk of gun violence involvement is hard. They tend to be very disconnected. And, you know, after what's often a lifetime of institutions repeatedly letting them down and a lot of trauma exposure over many years, they can be pretty skeptical of offers to help. And so, you know, the organizations that do this kind of work are often really deeply rooted in the community and have spent a long time building trust, but they often tend to struggle to get and keep funding, and they face a lot of staff turnover. Right. It turns out it's just really draining to do this work, especially when the people you're trying to serve keep dying. And so evaluating the effects of these programs means finding enough of these organizations that are stable enough to keep operating well, that are large enough that there's enough capacity to study at the scale where we can, you know, sort of get the sample sizes as researchers that we would need to analyze, and that are implementing similar enough programs that it makes any kind of sense to compare them, and those things together are just not an easy task.
Jennifer [00:09:59] So in this paper, you study the READI program, Max, what is READI? What does this program involve?
Max [00:10:06] Sure. Well, the idea behind READI, which is an acronym that stands for Rapid Employment and Development Initiative, is to provide the men who are at highest risk of being involved in gun violence, either as victims or as offenders, with services that are designed to help keep them and those around them safe. So maybe the easiest way to kind of understand READI and describe it is just thinking about it from the perspective of a participant. So let's say, Jen, you know, let's say you've been identified as someone at very high risk of gun violence. Right. An outreach worker from your neighborhood, with a similar set of life experiences perhaps to yours, you know, might reach out to you and tell you about READI. And that outreach worker is going to continue to engage you until you're interested in and sort of ready to actually join the program.
Max [00:10:53] And once that happens, they'll connect you with a local READI employment organization, and that organization is going to provide you with a job for up to 18 months. Now, you might start off doing things like outdoor cleanup in a park, you might pack meals at a food pantry, but over time, you know, you also might have the opportunity to take on jobs with higher pay and with more responsibilities, such as, you know, working in a manufacturing plant. And recognizing, by the way, that you might face obstacles to participating. Right. Like you might have unstable housing or be involved in the criminal legal system. These are jobs you can come back to if you have to drop them for a while.
Max [00:11:33] Now, crucially, in order to actually work the job, and this is the other big component of READI, you have to participate in morning sessions of group cognitive behavioral therapy, or CBT. So CBT is designed to help you identify and change thoughts that adversely affect your behavior. So, you know, in some situations the thoughts that occur automatically to you could lead you to take certain actions that are counterproductive. Right, like, say you escalate an argument with someone because you thought they were being disrespectful to you, but it turns out that they have a gun and that situation does not end well for you or for them. And so CBT is really meant to give you tools to help slow down in those moments, restructure your thoughts and go a different way.
Max [00:12:21] And as you're sort of working the job and attending CBT for up to 18 months, your outreach worker continues to help you. Right. So they might help de-escalate disputes that you're having with another participant or with the program staff, or they may connect you with other services, you know, as necessary and subject to availability, things like substance abuse treatment or housing support.
Jennifer [00:12:43] Okay. So, Sara, why might a program like this be effective? What are the main mechanisms that you all had in mind as you were implementing and evaluating this program?
Sara [00:12:55] So READI is a bundled intervention. It's all of the things Max talked about at once. And the intent is that it might address a whole set of different proximal causes of gun violence. So on the work side, a job is going to provide a stable source of income and maybe a place to build and reinforce new skills and norms. The stable income itself might deter people from working in illegal markets that might involve violence. Money might also be an incentive to participate in the CBT or just to show up at all. Right. So if participants are criminally active enough, just keeping them busy for 18 months at a time could make a difference. You might think about that as sort of an incapacitation effect, but with productive work instead of prison doing the incapacitation.
Sara [00:13:36] On the CBT side, like Max described, right, CBT is designed to help people think about their own thinking, to slow down in those key moments of conflict, to make more reflective decisions, to practice behavioral strategies that might help de-escalate some dangerous situations, and to help adapt their behavior to a legal workplace and a less violent identity. And in fact, you know, one of the things we did was collect some qualitative data, so we did interviews with participants. And we did hear mention of all of these mechanisms. Right. So we heard guys saying that the money mattered, that it either got them in the door or they liked using it to support their families.
Sara [00:14:13] We heard that the CBT mattered, that it was teaching them things that helped them walk away from a situation or at least pause before they engaged in an angry way instead of always blowing up at every slight. Some of them even said, you know, it helped in other things, like their relationships with their girlfriends. And then we heard others talk about how both the things they were learning in the program and especially the relationships with the really dedicated staff at READI helped them envision a new identity or a future for themselves that they didn't think was possible. And so we think probably, you know, all of these mechanisms are going on to some extent.
Jennifer [00:14:48] So it's interesting, I think you all mentioned in the paper, you know, there's this transitional jobs literature before this, where researchers randomized access to a temporary job, the idea being that it gives people practice working a regular job so they can transition to the private sector. And those have generally not been very successful in the RCTs that have been done of them. So as I've talked to you all about this program in the past, I think the way I've come to think of it is that the job is sort of a carrot to get them in the door to do the CBT. And I realize you're not going to be able to distinguish among these different components of the package as you're implementing it, but is that how you think about it now? Or have you come to think of the employment piece as being more than just a carrot?
Sara [00:15:33] I don't know if we have consensus about that. I think that's probably how we started out thinking about it for sure. You know, I think we also hear stories from the crew chiefs or the program staff who are running the worksites that talk about how they sort of helped the guys practice the skills they were learning in the CBT. And, you know, we've heard the guys talk about how important the income was for them, that that's what allowed them, you know, space to sort of not do some of the, let's say, you know, drug market work that they were in. And so, you know, if it's substituting for illegal work, maybe that's the carrot; if it's helping them practice those skills, it might be a little bit more substantive.
Sara [00:16:11] I think what I took away from our qualitative evidence is that it might be hard to operate or succeed in the program without both pieces. That is, you know, maybe we're not sure which piece is doing what, but it seems like you might need both of them in order to get guys there and keep them engaged and teach them what you need to teach.
Max [00:16:29] Yeah, I would agree with that. And I would just remind us all that carrots can be independently nutritious, not just an incentive; the job itself could be doing something. And the interaction between the job and the CBT, as, you know, a place to practice some of the lessons learned in CBT in a somewhat controlled environment where people are all sort of on the same page about what the mission of the intervention is, I think that can also be quite helpful.
Jennifer [00:16:52] Yeah, and I guess these jobs are different from the transitional jobs programs that have been studied in the past, in that this is 18 months versus like six months on average. And I agree, the interaction with the CBT does seem really interesting here, and I guess there have been some of those that have been tested before, but not a lot, so.
Sara [00:17:11] And it's also just a very different population than has been tested in most of those transitional jobs programs as well. So that might make a difference.
Jennifer [00:17:17] Yeah, great point. Okay. So, Max, a primary challenge here is finding people who would, one, benefit from this program and who would, two, actually show up and take part in it. So how are people actually recruited for READI?
Max [00:17:32] Yeah. So that was really the first big challenge that READI had to solve, because as Sara mentioned earlier, and this is something that affects gun violence interventions more generally, the men who are at the very highest risk of gun violence and who could theoretically benefit the most from something like READI are also often among the most isolated from a lot of social institutions.
Max [00:17:54] So the first thing that READI had to do was decide where they were going to focus geographically within the city of Chicago, and so they chose five neighborhoods that have among the highest rates of gun violence in the city. So we're talking something on the order of 10% of the population of Chicago, but accounting for about a third of the homicides. So these are very concentrated neighborhoods with very concentrated rates of gun violence. And then within each of those neighborhoods, READI used three different, complementary referral pathways to actually find potential participants. So the first of those was actually something that we, the research team, were involved in, where we referred specific men who were predicted to be at highest risk of becoming either a victim of or being arrested for an act of gun violence in the near future, and we did that using a machine learning algorithm trained on administrative data.
Max [00:18:49] And so this approach can actually find people at remarkably high risk of becoming a shooting victim. Actually, Sara and I, along with our coauthors Zubin Jelveh and Ben Jakubowski, have a working paper out about that if your listeners are interested, and it's also kind of a scalable way of doing this, right, but it could also miss someone who is, for example, at very high risk for being involved in gun violence, but for whom their risk factors are not in the administrative data. So maybe something recently happened to them and the administrative data hasn't quite caught up. And I should also just note here that although this algorithm was trained on data from the Chicago Police Department, neither the algorithm itself nor the identities of any of the men who were referred by it were ever shared with police.
Max [00:19:32] This was just used to refer men to READI. So because of those limitations of the algorithm, we have a second pathway that involved having local outreach organizations with deep knowledge of gun violence in their neighborhoods tap into their social networks and refer men whom they thought would be at highest risk, and the idea was to have these outreach workers make referrals on the basis, in part, of information that was unobservable to the algorithm, and also refer men who weren't just at high risk of being involved in gun violence, but also whom outreach thought might be particularly responsive to READI.
Max [00:20:10] And so the last pathway I'll just briefly mention here is that we referred men leaving jail or on parole who were at particularly high risk and who might otherwise have been missed by both the outreach and the algorithm pathways. This pathway took a bit longer to get started, and recruitment was unfortunately cut short due to the pandemic. So most of the men in the study really came through the algorithm and the outreach pathways.
Jennifer [00:20:34] For the algorithm piece can you say a little bit more about what the main predictors were in that model? Is it like being arrested for a gun related crime in the past, something like that?
Max [00:20:43] Yeah, I mean, we adopted, and this is the technical term, a sort of kitchen sink approach. So I think there were something like 1,400 different predictors in the model, including the things you mentioned. Right. So were you a victim of gun violence in the past? Were you arrested for an incident of gun violence in the past? But we were also a bit agnostic; we didn't know which things would predict involvement in gun violence.
Max [00:21:06] So we kind of gave the model access to a lot of information and let it sort of find the relationships that best predicted involvement in gun violence. And, you know, as we'll probably talk about a bit later in the conversation, it did a very good job, but it's also the case that, as we'll also see, the outreach workers also did a very good job. So there's more than one way to identify people who are at high risk of gun violence, and there are certain trade-offs and considerations when considering different approaches.
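To make that prediction step a bit more concrete, here is a minimal sketch of the kind of kitchen-sink approach Max describes: a large matrix of features built from administrative records and a flexible model that ranks people by predicted risk of shooting victimization. This is an illustration only, not the study's actual pipeline; the file name, column names, model choice, and referral cutoff are all hypothetical.

```python
# A minimal sketch of a "kitchen sink" risk-prediction approach like the one
# described above; not the study's actual pipeline. File name, column names,
# model choice, and the referral cutoff are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per person, many columns built from
# administrative records (arrest and victimization counts, recency, crime
# types, etc.), plus a label for shooting/homicide victimization later on.
df = pd.read_csv("admin_features.csv")
X = df.drop(columns=["person_id", "shot_in_followup"])
y = df["shot_in_followup"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Let a flexible model find whichever relationships best predict the outcome.
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# Rank the held-out sample by predicted risk; referrals would come from the
# top of this ranking.
risk = model.predict_proba(X_test)[:, 1]
print("Held-out AUC:", roc_auc_score(y_test, risk))
top_referrals = X_test.assign(predicted_risk=risk).nlargest(500, "predicted_risk")
```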
Jennifer [00:21:33] Okay. So once you have your participant sample, you then have to implement the program in a way that actually enables you to test whether it works, which, as longtime listeners will know, is easier said than done. So, Sara, how did your team do this? What was your research strategy here?
Sara [00:21:49] Yeah, so, you know, the reality of operating these kinds of programs is that you just don't ever have enough money to serve everyone you think could benefit, right? There's always more people eligible than we could possibly serve. And so what that does is let us set up our study sort of like a clinical trial in medicine, where we have an eligible sample, in our case about 2,500 men. And then we randomly assigned who gets offered a READI slot and who doesn't, and those who don't get offered one are free to pursue any other available programming, just not READI. And so one benefit of this kind of random allocation process is that it's fair, right? So it avoids nepotism in allocating a limited resource. Whether you get into the program is not about who you know.
Sara [00:22:36] It avoids serving only the most motivated people, which is what you might get with a sort of first come, first served process, and who might not be the people you most want in the program. The second big benefit is that it sets us up to really isolate the causal effect of the program itself, right. We have these two groups, the people the coin flip said to offer READI and the people the coin flip said not to offer READI, and the only difference between them is literally the flip of a coin, right. It's random. And so that means that on average, the outcomes of those two groups should be equal unless the program does anything, right? And so that's what we test. We measure the outcomes across the two groups, the treatment group offered READI and the control group not offered READI. And we know with confidence that any difference between those two groups is really due to READI itself.
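As a rough illustration of the design Sara describes, here is a minimal sketch of the random-assignment step and the intent-to-treat comparison that follows it. The variable names and placeholder outcome data are hypothetical assumptions; the study's actual randomization and estimation are more involved.

```python
# A minimal sketch of random assignment plus an intent-to-treat comparison,
# in the spirit of the design described above. Variable names and the
# placeholder outcome are hypothetical; this is not the study's actual code.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(2017)

# Eligible sample of roughly 2,500 men; a fair "coin flip" decides who is
# offered a READI slot (treatment) and who is not (control).
eligible = pd.DataFrame({"person_id": range(2500)})
eligible["offered_readi"] = rng.permutation([1] * 1250 + [0] * 1250)

# Placeholder outcome measured 20 months later (e.g., a serious-violence
# index built from administrative records); random numbers stand in here.
eligible["violence_index"] = rng.normal(size=len(eligible))

# Intent-to-treat effect: compare mean outcomes across the two arms.
treated = eligible.loc[eligible["offered_readi"] == 1, "violence_index"]
control = eligible.loc[eligible["offered_readi"] == 0, "violence_index"]
itt_effect = treated.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"ITT estimate: {itt_effect:.3f} (p = {p_value:.3f})")
```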
Jennifer [00:23:20] So running a randomized controlled trial like this, this fair lottery, especially when you're capacity constrained, is sort of a researcher's dream, but often hard to convince the stakeholders on the ground to go along with. So I'm super curious how this research partnership came about. Can you tell us a little bit of the backstory here?
Sara [00:23:41] Yeah. So I mean, the idea started back in 2016, and that year there was just an unprecedented year-over-year spike in gun violence in Chicago, something like a 60% increase in homicide and a similar rise in nonfatal shootings. And one of the things that did was really motivate a broad set of people to try to marshal resources to do more. Right. So that included government and nonprofits and, importantly, the philanthropic community. And so our research team, along with the executive director of the University of Chicago Crime Lab, Roseanna Ander, and the amazing team at Heartland Alliance, which is a nonprofit that's housed in Chicago, although it has a global presence, and which was led by Evelyn Diaz and Eddie Bocanegra, we all started talking together about how to best deploy these resources in a way that we could actually learn from, right. So we're not just trying to prevent violence with those resources.
Sara [00:24:37] We're trying to learn from what we do so that people can sort of benefit from that in the future. And so as we started this discussion and process of developing the program, you know, one thing we did was look at the homicide data and realize that almost 90% of victims were over age 18, but almost all of Chicago's prevention programs at the time were for students and juveniles, and so we sort of documented the fact that there was this big service gap for adults that we wanted to help fill. And then we looked at the literature. Right. And as you said, the evidence on jobs programs alone is mixed, but I think, you know, we've seen that CBT can reduce at least less serious violence in other contexts. So we've seen that in my own work on a CBT program for high school boys. We've seen it in my coauthor Chris Blattman's work on CBT-influenced programming for men in Liberia.
Sara [00:25:26] So there was a lot of excitement among our partners, I think, about the CBT. And there are some hints in both the CBT and the jobs literature that the two strategies might be more effective together, right. And so I think everyone was sort of on board with the idea of trying something like this, but there are only a few programs across the country that have been combining these kinds of strategies for a population that faces as much risk of serious gun violence as the one we aimed to serve, and really, none of them have been rigorously evaluated yet. And so I think it was a pretty compelling case to make, given the situation in Chicago, the resources we had and what was in the literature.
Sara [00:26:05] And so we set out to, you know, to evaluate this CBT plus jobs idea. And then I think, you know, it was really Heartland Alliance that made it happen. They started recruiting the local community organizations, selling them on, you know, sort of what the idea was. And I think everyone was very excited about the amount of resources that the program was bringing. And I think that helped sort of convince people to go along with the randomization. And it was those local community organizations who were the ones to actually implement the program. Right. So in partnership with Heartland Alliance, they just worked out a huge number of logistical and safety challenges. They hired and trained staff. They did really all the hard work of implementation. And, you know, we were there along the way from the beginning, working with them to figure out how a randomized controlled trial could work in this kind of setting, so we could learn from all of the incredible work that they did on the ground.
Jennifer [00:26:56] Awesome. And I'm also curious to hear more about, I mean, just from the research side, the IRB process, which is perhaps not that sexy to people outside of university settings, but you're working with a super high risk population where, you know, it's not implausible that some of them might wind up actually getting shot while they're engaged in this program. What was sort of the ethics conversation here, and how difficult was that hoop to get through? Max, you want to take that one?
Max [00:27:30] Yeah, sure. I was at the University of Chicago, you know, as one of the co-PIs at the time, so it might have been my name on the protocol, I can't remember, but no, that was certainly a huge concern for us. And I have to say, you know, all credit to the staff at the IRB in Chicago, they were exceedingly professional and supportive and helpful in answering our myriad questions about this and our concerns. I mean, I think they really understood what we were trying to do and they understood the population that we were working with and the risks involved. And I actually still recall, now that you mention it, I had sort of buried this memory a little bit, but I forget how far into the study it was, and it probably wasn't very long, maybe a few months in, when, you know, I got a call from Eddie Bocanegra, who was the director of READI at the time at Heartland Alliance, letting me know that a participant had been killed.
Max [00:28:17] That was the first one, but not the last, sadly. And I, you know, immediately went to inform the IRB, and I was sitting on the edge of my seat, worried that they would, you know, tell me to stop the study. But again, to their credit, they looked at the situation. I forget the exact circumstances of that particular participant's tragic death, but it was not directly due to the program. It was, you know, maybe during off hours, whatever the circumstances were, again, I can't recall them here, but they understood that this was a population where that would happen. And they said, no, we don't see any reason why the study should stop. You know, the intervention may still be helping, so proceed, and keep us posted. And that's kind of how we had to approach it.
Sara [00:28:59] And I would say, you know, we also tried to build in a lot of things to make sure that nothing was coercive. Right. I mean, that's one of the main concerns you have working with vulnerable populations, that you don't want the research to coerce them to do anything they don't otherwise want to do. And so, you know, I think we tried to set up recruitment and discussions with the outreach workers and everything to make sure that was the case. And then we also, you know, tried to do our best, I think, to give everybody a voice in the process. So, you know, once the program started operating, we formed a participant advisory committee for when we weren't sure sort of how things were working or how people might respond to particular interview questions or anything else.
Sara [00:29:39] You know, the participants themselves had a say, along with, you know, the staff who had lived in the communities and were working in the community outreach organizations and who had a lot of history in the neighborhoods. And so, you know, I think we really didn't want to be the kinds of researchers who just walk in and say, we're here and we're going to solve this problem, because we didn't believe that, right? We wanted to learn from them and their experience. And so, you know, we as economists are not really trained to do this, and I think our coauthor, Monica Bhatt, has more experience than Max and I do in doing this, but I think we really sort of tried to go out of our way to make sure everyone felt included in the research part of it, as well as the implementation.
Jennifer [00:30:18] Okay. Let's talk about data. Max, what data are you using in your analysis both for that fancy machine learning algorithm and then everything else you've got?
Max [00:30:29] So thinking about what's at the core of the study, the impact evaluation, the kind of outcome that we're most interested in is whether READI reduced how often participants are involved in acts of serious violence, either as victims or as perpetrators. But, you know, we lack data on every act of serious violence in which these men are actually involved, since we can't observe them 24/7, and so instead we use police records on reported victimization and arrests as an admittedly very imperfect proxy. Now, police data have some advantages and some pretty large disadvantages. I think the main advantage of police data is that we can use them to measure outcomes for a large sample of people, which just wouldn't be feasible if we actually had to track down the men in the study one by one and administer them surveys at regular intervals.
Max [00:31:23] One of the big disadvantages of police data is that if you think about an outcome like victimization, right, that's only going to appear when incidents are actually reported to the police. And so for this reason, we focus on only the most serious kinds of violent victimization, shootings and homicides in particular. And the reason we do this is because in Illinois and a number of other states, if you show up at a hospital with a gunshot wound, medical providers are required by state law to report that to the police. And so combined with the fact that we think most homicides come to the attention of police as well, these types of victimization suffer much less from underreporting than do other types of victimization like aggravated assault.
Max [00:32:06] Another disadvantage of police data is that, on the arrest side of things, arrests don't capture all offenses; in Chicago in particular, most violent crimes, including homicides, don't result in an arrest, and there are other arrests that are wrongful or mistaken. So arrests are a pretty noisy measure of whether or not someone actually committed an offense, but as long as arrests are an equally noisy measure of offending for men in the treatment group, the ones who were offered READI, and for men in the control group who were not, and we think that is likely to be true for serious offenses like shootings and aggravated assaults, then they don't undermine the research design.
Max [00:32:46] So we end up relying on police data for these reasons, and to your point about the model, it's trained exclusively on data from the police department. And the other thing I would just mention here, in addition to all of this administrative data that we have to wade through, is that to get a bit more insight into how participants and staff experienced READI, two of our colleagues, Monica Bhatt and Chris Blattman, led a team that actually collected a wealth of qualitative data. So that includes things like field observations, focus groups, interviews, surveys. So we have a lot of qualitative data that we're thinking about how best to sort of process and analyze and report out to show what we've learned there.
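To make the outcome construction concrete, here is a minimal sketch of how outcomes like the ones Max describes could be built from event-level police records, counting each person's incidents in the 20 months after their randomization date. The file names, table layout, and column names are hypothetical placeholders, not the study's actual data structure.

```python
# A minimal sketch of building 20-month outcomes from event-level police
# records; table layout, file names, and column names are hypothetical.
import pandas as pd

participants = pd.read_csv("participants.csv", parse_dates=["randomization_date"])
events = pd.read_csv("police_events.csv", parse_dates=["event_date"])
# events: one row per incident with person_id, event_date, and event_type
# (e.g., "shooting_homicide_victimization", "shooting_homicide_arrest",
#  "other_violent_arrest").

merged = events.merge(participants, on="person_id")

# Keep only incidents within 20 months of each person's randomization date.
in_window = merged["event_date"].between(
    merged["randomization_date"],
    merged["randomization_date"] + pd.DateOffset(months=20),
)

# Count incidents by person and type, filling zeros for people with none.
counts = (
    merged[in_window]
    .groupby(["person_id", "event_type"])
    .size()
    .unstack(fill_value=0)
    .reset_index()
)
outcomes = participants.merge(counts, on="person_id", how="left").fillna(0)
```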
Jennifer [00:33:24] Okay. And so you wind up grouping a few different outcomes together into some indices for your main outcome measures in your analysis. So what are the outcomes you're most interested in when you actually go to run your regressions?
Max [00:33:39] Yeah, that's right.
Max [00:33:40] It's almost like you've seen the paper, Jen. Well, I should say, prior to analyzing any outcomes from the study, we actually wrote and posted a pre-analysis plan that lays out the approach we planned to take. And in that pre-analysis plan we specified that the study's primary outcome is going to be this measure of overall serious violence involvement. And that, as you point out, is going to be an index of three things. One is shooting and homicide victimization, as I mentioned earlier; two is shooting and homicide arrests; and three is arrests for other serious violent crimes like armed robbery and aggravated assault. And all of these are going to be measured over the 20 months after an individual is randomized.
Max [00:34:24] Now, you might notice that that list of outcomes I mentioned is not just focused on shootings and homicides; it also includes arrests for these other serious violent crimes. And the reason for this, and this is just part of how the research process occurred, is that before READI was launched, we actually weren't sure whether it would find enough men who were at high enough risk of being involved in shootings and homicides specifically. And so we were worried that the study would be underpowered to detect an effect on involvement in just those incidents. And as we'll talk about in a bit, you know, that worry turned out to be misplaced, because READI actually found men at extraordinarily high risk of involvement in lethal violence, but that's just sort of an insight into our thinking at the time.
Max [00:35:09] And in addition, in our pre-analysis plan, we also said that we'd estimate READI's impact on each of those components of the main outcome separately, and we would adjust the inferences we draw from this to account for the fact that we're asking, you know, more questions of the data, and so that increases our chances of making a type one error. And we did this because, you know, here too, we really weren't sure how READI would affect participants' behavior and whether it would affect these different types of violence involvement differently. READI could have moved them all in the same direction, or it could have moved some of them in different directions. So we just didn't know what it would do, you know, at the outset. And so we pre-specified that we would look into the matter and sort of adjust our analysis accordingly.
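Here is a minimal sketch of the two pieces Max describes: combining standardized components into a single index, and adjusting per-component p-values for the fact that three hypotheses are being tested. The adjustment method shown (Holm) and all of the numbers are illustrative assumptions, not necessarily what the paper uses.

```python
# A minimal sketch of (1) a standardized index built from several outcome
# components and (2) a multiple-testing adjustment for the per-component
# tests. The adjustment method and numbers are illustrative, not necessarily
# those used in the paper.
import pandas as pd
from statsmodels.stats.multitest import multipletests

def standardized_index(df, components, control_mask):
    """Average the components after standardizing each one against the
    control group's mean and standard deviation."""
    z = pd.DataFrame({
        c: (df[c] - df.loc[control_mask, c].mean()) / df.loc[control_mask, c].std()
        for c in components
    })
    return z.mean(axis=1)

# Hypothetical usage, where df holds one row per study participant:
# components = ["shot_or_killed", "shooting_homicide_arrest", "other_violent_arrest"]
# df["violence_index"] = standardized_index(df, components, df["offered_readi"] == 0)

# Per-component p-values from separate treatment-effect estimates
# (placeholder numbers), adjusted because three hypotheses are being tested.
p_values = [0.04, 0.35, 0.60]
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="holm")
print(p_adjusted)
```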
Jennifer [00:35:49] Okay, so let's talk about the results. Sara, let's talk first about READI's ability to identify and engage people who are at high risk of gun violence. What did you find on that front?
Sara [00:36:00] So I would say that READI was sort of shockingly successful, both at finding and engaging people at strikingly high risk of gun violence, or at least I was shocked by it. So to start, if you look just at the control group, so no treatment effects, just sort of characterizing how much violence this population would have been involved in in the absence of the program: for every 100 people in the control group, there were about 11 shooting and homicide victimizations over the next 20 months. That's just a stunningly high rate of being shot or killed. It's 54 times higher than the average Chicagoan, but even if you zoom in on young men in the same neighborhoods where READI was operating, the study control group still had almost three times more shooting and homicide victimization.
Sara [00:36:46] Now, interestingly, and Max sort of alluded to this earlier, when we look across the three different pathways, all three of them found people with similar risk of being shot, but importantly, they were different kinds of people. Right. So the folks referred by the algorithm look pretty different from those referred by outreach on all sorts of observable characteristics like age and arrest or victimization history, but they all showed similar risk of gun victimization. And so what this tells us is that you can find people with similar risk of gun victimization using any one of the referral mechanisms, but if you use only one of them, you might miss other people who are also facing high risk. So I think that's one of the lessons that comes out of the success of READI in finding a population that really faces high risk.
Sara [00:37:31] In terms of engaging this population, I think READI was also really successful. So the take-up rate, the proportion of people offered the program who started it, was about 55%. And we think that's successful partly because it's even a little bit higher than in a CBT intervention for a much less disconnected population that we've done with high school boys in Chicago. And that's just initial take-up; retention was also pretty high, right, so it's comparable to some of those transitional jobs programs you mentioned earlier that operate over, you know, sometimes even weeks or a couple of months rather than a full 18 months. And so I think the success at recruiting and retaining the participants speaks partly to the demand for this kind of programming among this population, that they just don't have any other option like this, and partly really to the success of the outreach workers and their sort of relentless engagement in finding these guys and getting them and keeping them in the program.
Jennifer [00:38:28] Okay. So, Max, turning next to whether READI had meaningful benefits for those who participated. What was the causal effect of READI on involvement in violent crime?
Max [00:38:37] Yeah. So on the study's primary outcome that I mentioned earlier, right, this index that combines reported victimization and arrests for serious violence, we find no statistically significant change between the men who were offered READI and the men in the control group. However, when we look at the components of the index separately, we find that READI participants saw a 64% drop in arrests for shootings and homicides. And that decline is statistically significant on its own, but not after we make those adjustments that I just talked about for inference, you know, for the fact that we're testing, in this case, three different hypotheses.
Max [00:39:18] So in contrast, if you look at the other measures in our index, like shooting and homicide victimization, there READI participants saw a more modest decline, and on arrests for these other non-shooting or homicide violent crimes, things like aggravated assault and armed robbery, they actually saw a small increase, although neither of these changes is statistically significant. So overall, it seems like READI had different impacts, perhaps, on different forms of violence, with a potentially large but still suggestive reduction in arrests for shootings and homicides specifically.
Jennifer [00:39:53] Sara, when you look at different subgroups here, different subsets of the overall sample, does it look like some people benefit more from this program than others, or is everyone pretty much the same?
Sara [00:40:06] It's definitely different. So I think probably our clearest result is that the group referred by the outreach workers had a bigger decline in violence. So they have a statistically significant decline in our overall violence index that remains there even after we adjust for the three different tests we're doing across the different referral pathways.
Sara [00:40:25] And that drop is driven by huge declines in both arrests and victimization for shootings and homicides. So it's about an 80% drop for shooting and homicide arrests and about a 45% drop for victimization. And again, both of those results are accounting for the three different outcome tests we use to break the index into its components. And so it looks like the outreach workers are doing a good job at anticipating who is going to respond to READI. Interestingly, though, we can break things up even further, and we see that not everyone the outreach workers refer is responsive. It's the subset of outreach referrals who also have a high predicted risk, that is, whom the algorithm anticipates are going to be at high future risk of gun violence, who are really driving the violence decline.
Sara [00:41:13] Now, to be transparent, this is not entirely conclusive; these are relatively small groups. And unlike the differences by pathway, we didn't prespecify that we were going to look at this sort of combination of the groups, but I think it's still a really interesting result for future study, that it seems like the combination of outreach and algorithmic referral works better than either by itself, right. It's the subset that both the outreach worker and the algorithm would refer that look to be the most responsive to READI.
Jennifer [00:41:41] So I think you all do a really nice job in the paper of stepping readers through these somewhat mixed and nuanced results. Right. It's not a very clear "this program works" or "it doesn't work." It's more suggestive, like it seems like maybe it works for some people, and that's just kind of complicated to explain, but you do a good job of helping us think through the uncertainty involved here with the various outcomes. And one way you approach this is by calculating the social costs and benefits implied by the various estimates. Some of these outcomes that you're measuring are simply more costly than others and so should probably get more weight as we think about whether this program is cost effective. So, Max, could you walk us through what you do there and what you find?
Max [00:42:25] Yeah, and thanks, Jen, we really appreciate that. It's quite hard, as we've learned firsthand on this project, to convey nuanced results, you know, in a clear and hopefully compelling way. For the cost-benefit analysis you mentioned, just taking as a starting point what you said a second ago, it's fairly intuitive that different acts of crime and violence carry different costs to society. So, you know, a simple assault is less costly than an aggravated assault, and both are less costly than a homicide. Prior researchers have actually gone further and, for each of these acts, they provide ranges of their estimated cost to society, so to victims, to offenders, to taxpayers and so on.
Max [00:43:05] And so we're going to build on that and use those estimated costs to calculate READI's impact on society from changes in participants' involvement in crime and violence. So I should be clear here that these calculations that we present in the paper don't capture READI's full impact, right? Because they leave out, for example, the value of the work the participants perform and the benefits of investing in underserved communities, but they still give us a sense of READI's benefit in dollar terms, which we can then compare with its costs. And so what we find is that for each READI participant, society saves between, let's say, $174,000 and $860,000 or so, depending on how conservative or inclusive your approach is to estimating the social cost of crime.
Max [00:43:53] And those reductions are about 50% relative to the control complier group, and they're highly statistically significant. And part of the reason why these savings are so large and more precise than our estimates for the impact on the index is that they place greater weight on those more serious acts of violence, right, the shootings and the homicides, which carry the highest social costs, and that's where READI appears to be having the greatest impact. So when you look at these reductions in social costs that are caused by READI and you compare them to the cost of the intervention, you're getting benefits that are anywhere between 3.8 and 18.8 times as large as what the READI program costs.
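As a quick back-of-the-envelope check on those figures, the arithmetic below divides the per-participant social-cost savings Max cites by an assumed per-participant program cost. The cost figure here is only inferred from the stated ratios for illustration; it is not a number quoted in this episode.

```python
# Back-of-the-envelope check of the benefit-cost ratios discussed above.
# The per-participant program cost below is an assumption inferred from the
# stated ratios, not a figure quoted in the episode.
savings_low, savings_high = 174_000, 860_000  # social-cost savings per participant ($)
program_cost = 45_800                         # assumed cost per participant ($), illustrative

print(f"benefit-cost ratio: {savings_low / program_cost:.1f}x "
      f"to {savings_high / program_cost:.1f}x")
# -> roughly 3.8x to 18.8x, matching the range Max describes
```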
Jennifer [00:44:36] Great. That seems like a much clearer answer, which is very helpful. So that's your study, that's the most recent study. Sara, have any other papers related to this topic come out since you all first started working on this project?
Sara [00:44:49] Well, you know, when you work on something for seven years, it turns out that's a lot of time for new research to come out. There's been a recent review of the community violence intervention literature, which I think shows fairly mixed results. And then, you know, I think there's a handful of groups that have been trying programs that have some overlapping features with READI. So, you know, organizations like Advance Peace, Roca, CRED in Chicago, Turn90 in South Carolina. I think, you know, some of them have released quasi-experimental evaluations or implementation evaluations, and others have different kinds of evaluations that we know are still in progress. At this point, I think, you know, it's still not particularly clear how to interpret the evidence. I think what's clear is that people have independently come to the idea that combining work and either, you know, CBT or some other kind of social-emotional intervention is a promising approach to gun violence. You know, but like I said earlier, evaluating these kinds of programs is hard, and so I don't think there's yet a clear and convincing conclusion.
Jennifer [00:45:50] Okay. So just to clarify, those other studies, are any of them randomized in the same way that yours is?
Sara [00:45:56] So I think that Roca has a randomized evaluation in progress, but the results, as far as I know, are not out yet. The rest of them are not.
Jennifer [00:46:03] Okay, great. I'll keep an eye out for that Roca study. Okay. So, Max, what are the policy implications of these results and the other work in this area? What should policymakers and practitioners in Chicago, and I imagine you've talked with them, and in other places take away from all this?
Max [00:46:21] Yeah. We have talked with folks in Chicago and in a number of other jurisdictions across the U.S. It's actually been really heartening to see policymakers express as much interest as they have in this. So one set of policy implications here concerns the promise of approaches like READI to try to reduce gun violence by targeting their efforts at really specific people. This is not a new idea, and it's not unique to READI; a lot of cities are pursuing strategies like this. And if you look across the range of referral methods, READI was able to find men who were at extraordinarily high risk of being involved in shootings. And a large share of these men were willing to engage in a program that offered them supportive services and kind of met them where they were at.
Max [00:47:04] So that, on its own, is really encouraging for at least the possibility of dedicating resources to helping this small, underserved and extremely vulnerable group. And because of how much risk they are carrying, it could potentially really make a dent in gun violence citywide, at least theoretically. Now, on whether an intervention like READI itself can reduce violence, the results are less definitive, as we talked about, but I think they're still encouraging. You know, despite program interruptions and having a smaller sample size than we expected due to the pandemic, we're still finding these really large reductions in shooting and homicide arrests among participants, and for those referred through the outreach pathway, you know, even larger reductions that are quite precisely measured.
Max [00:47:48] Now, overall, if you think about the whole sample, right, those reductions in shooting and homicide arrests, the fact that they're not statistically significant means we can't be as confident as we'd like in whether these reductions were due to READI. And also, as is the norm, social science sets a really high bar for overturning hypotheses, as it should, but if you think about, you know, a policymaker faced with a decision about whether to invest in READI-like programming, you know, that decision probably doesn't and shouldn't, I think, hinge on whether a single estimate is statistically significant or not. Right. It should probably hinge on how this evidence compares to other available evidence, you know, the benefits and costs of READI compared to alternatives and the immense social cost of gun violence itself, right, which is borne mostly by underserved communities. So we think the evidence here, you know, again, while not as clear and definitive as we'd like, is still going to be helpful to policymakers.
Jennifer [00:48:46] Sara, anything to add?
Sara [00:48:48] I might just emphasize, I guess, that gun violence is a complicated problem that is probably going to take a whole toolbox of complicated solutions. So, you know, I think it was always incredibly unlikely from the start that we'd end up saying, look at this program we worked on, isn't it the best thing ever, and it's now the solution to gun violence. Right. That's not how problems this complicated work. And so, you know, READI might be one tool or one part of the package, a hopeful one, I think, and one that merits further study, but this is a problem that's so incredibly costly that we should be trying and, importantly, testing a whole range of ideas. Right. So individual programming in the spirit of READI, but also continuing to work on the more systemic issues like concentrated poverty and gun access, trying to improve policing so the police can do a better job of reducing gun violence in ways that don't generate their own social costs. And then, you know, as we're doing those things and evaluating them, we should pay attention to the evaluations. Right. We should abandon ideas that seem good in theory but don't work in practice, and we should invest more in things that are making a cost-effective difference.
Jennifer [00:49:54] Yeah, and I'll just piggyback off of that a bit. I mean, often when I talk with policymakers about why they should evaluate what they're doing, they're worried that an evaluation is going to give some sort of up or down vote as to whether this thing is working or not. And in practice, I think what we should be aiming for is to iterate on our approaches. So I read the study and think the way READI is currently constructed seems really promising. And maybe it had these benefits, and maybe it was due to chance, but more likely than not, it does seem to be doing something. But are there ways we can think of to improve it even further and have a more conclusive benefit the next time around, right? And so I think just thinking of everything that we're trying as being a process, rather than, you know, you solved the problem or you didn't, is likely to be our best path.
Sara [00:50:47] I agree. Science, you know, science is iterative and--
Jennifer [00:50:49] Yeah.
Sara [00:50:49] It's supposed to be iterative. And, you know, maybe every once in a while we find something that we really think doesn't work, but I would also say, isn't that great? Because now we can stop wasting our money on--
Jennifer [00:51:02] Right.
Sara [00:51:03] Something that we know doesn't work and keep iterating on the things that seem more promising.
Jennifer [00:51:06] Absolutely. Amen to that. So, Sara, on that front, what's the research frontier here? What are the next big questions in this area that you and others are going to be thinking about in the years ahead?
Sara [00:51:17] Well, you know, to be fair, I think there's still more work to do on this question. So how to structure a jobs and CBT program that can reduce gun violence, who's going to respond the most, whether this is going to scale and replicate. And so one of the things that our group is doing is continuing to pursue more data on this READI group: statewide arrest data, which might pick up some of the events outside of Chicago that our current data missed, and data from hospital admissions to try to get at other kinds of violent injury along with, you know, substance abuse and mental health crises. I have some other work that's going to look at spillovers onto different kinds of peers. And like Max said, I think there's been a lot of policy interest in other jurisdictions in trying programs like READI.
Sara [00:51:59] So I think people are pursuing these kinds of questions on the implementation side, like I said, you know, I hope they'll also partner with researchers to answer the evaluation questions, but, you know, maybe another way to say sort of what you said in terms of of research being iterative is something my coauthor, Chris Blattman, likes to say, which is that we shouldn't believe papers we should believe literatures.
Jennifer [00:52:21] Mm hmm.
Sara [00:52:22] And so I think we just need a lot more work to kind of build out this piece of the literature.
Jennifer [00:52:27] Yeah. Max, anything to add there?
Max [00:52:29] I completely agree with Sara on there being a lot more work to do, you know, just on this question of whether and how behavioral interventions like READI can reduce involvement in gun violence. If they're having an impact, can we better target them? Can we isolate whatever the active ingredient is and find a way to deliver it more cost effectively and at a larger scale, and so on. I also have some separate work that I'm just trying to get off the ground about improving the capacity of institutions that are tasked with being on the frontlines of providing public safety and reducing gun violence, right, things like police departments and community anti-violence organizations like the ones that implemented READI. So there's more work to be done. I guess stay tuned.
Jennifer [00:53:09] Awesome.
Sara [00:53:10] If I can just add one more thing, you know, because I know you have a lot of young scholars who listen to your podcast. This is a hard problem to work on. They're hard studies to do. You know, Max talked about that first participant we lost; the staff went through a lot of trauma. It's very difficult, but I hope that all the young scholars who are interested in crime will realize there's depressingly little good evidence about solutions to gun violence, and it's so important. And so I just would encourage people to not be afraid of the hard parts, because it's also a really rewarding topic to work on, because you feel like you're contributing to something that's important.
Jennifer [00:53:49] Saving lives. Yeah. On that note. Thank you both for doing this.
Jennifer [00:53:53] My guests today have been Sara Heller from the University of Michigan and Max Kapustin from Cornell University. Sara and Max, thanks so much for talking with me.
Sara [00:54:01] Thanks for having us.
Max [00:54:02] Thanks so much.
Jennifer [00:54:08] You can find links to all the research we discussed today on our website, probablecausation.com. You can also subscribe to the show there or wherever you get your podcasts to make sure you don't miss a single episode. Big thanks to Emergent Ventures for supporting the show, and thanks also to our Patreon subscribers and other contributors. Probable Causation is produced by Doleac Initiatives, a 501(c)(3) nonprofit, so all contributions are tax deductible. If you enjoy the podcast, please consider supporting us via Patreon or with a one-time donation on our website. Please also consider leaving us a rating and review on Apple Podcasts. This helps others find the show, which we very much appreciate. Our sound engineer is Jon Keur, with production assistance from Nefertari Elshiekh. Our music is by Werner, and our logo was designed by Carrie Throckmorton. Thanks for listening, and I'll talk to you in two weeks.