In 2007, I became the Attorney General of the State of New Jersey. Before that, I'd been a criminal prosecutor, first in the Manhattan District Attorney's office, and then at the United States Department of Justice. But when I became the Attorney General, two things happened that changed the way I see criminal justice.

The first is that I asked what I thought were really basic questions. I wanted to understand who we were arresting, who we were charging, and who we were putting in our nation's jails and prisons. I also wanted to understand whether we were making decisions in a way that made us safer. And I couldn't get this information out. It turned out that most big criminal justice agencies, like my own, didn't track the things that matter.

So after about a month of being incredibly frustrated, I walked down into a conference room that was filled with detectives and stacks and stacks of case files, and the detectives were sitting there with yellow legal pads, taking notes. They were trying to get the information I was looking for by going through the files, case by case, for the past five years. And as you can imagine, when we finally got the results, they weren't good. It turned out that we were doing a lot of low-level drug cases on the streets just around the corner from our office in Trenton.

The second thing that happened is that I spent a day in the Camden, New Jersey, Police Department. At that time, Camden was the most dangerous city in America, and because of that, I ran its police department. During that day, I was taken into a room with senior police officials, all of whom were working hard and trying very hard to reduce crime in Camden. And what I saw in that room, as we talked about how to reduce crime, was a series of officers with a lot of little yellow sticky notes.
And they would take a yellow sticky, write something on it, and put it up on a board. One of them said, "We had a robbery two weeks ago. We have no suspects." Another said, "We had a shooting in this neighborhood last week. We have no suspects." We weren't using data-driven policing. We were essentially trying to fight crime with yellow Post-it notes.

Now, both of these things made me realize that we were fundamentally failing. We didn't even know who was in our criminal justice system, we didn't have any data about the things that mattered, and we didn't share data or use analytics or tools to help us make better decisions and reduce crime.

And for the first time, I started to think about how we made decisions. When I was an assistant D.A., and when I was a federal prosecutor, I looked at the cases in front of me, and I generally made decisions based on my instinct and my experience. When I became Attorney General, I could look at the system as a whole, and what surprised me is that I found that that was exactly how we were doing it across the entire system: in police departments, in prosecutors' offices, in courts, and in jails. And what I learned very quickly is that we weren't doing a good job.

So I wanted to do things differently. I wanted to introduce data and analytics and rigorous statistical analysis into our work. In short, I wanted to moneyball criminal justice.
Now, moneyball, as many of you know, is what the Oakland A's did: they used smart data and statistics to figure out how to pick players who would help them win games. They went from a system based on baseball scouts, who used to go out and watch players and use their instinct and experience, the scouts' instinct and experience, to pick players, to one that used smart data and rigorous statistical analysis to figure out how to pick players who would help them win games.

It worked for the Oakland A's, and it worked in the State of New Jersey. We took Camden off the top of the list as the most dangerous city in America. We reduced murders there by 41 percent, which actually means 37 lives were saved. And we reduced all crime in the city by 26 percent.

We also changed the way we did criminal prosecutions. We went from doing the low-level drug crimes that were outside our building to doing cases of statewide importance: reducing violence with the most violent offenders, prosecuting street gangs, gun and drug trafficking, and political corruption.

And all of this matters greatly, because public safety, to me, is the most important function of government. If we're not safe, we can't be educated, we can't be healthy, we can't do any of the other things we want to do in our lives. And we live in a country today where we face serious criminal justice problems.

We have 12 million arrests every single year. The vast majority of those arrests, 70 to 80 percent, are for low-level crimes, like misdemeanors. Less than five percent of all arrests are for violent crime. Yet we spend 75 billion, that's b for billion, dollars a year on state and local corrections costs. Right now, today, we have 2.3 million people in our jails and prisons.
And we face unbelievable public safety challenges, because we have a situation in which two thirds of the people in our jails are there waiting for trial. They haven't yet been convicted of a crime. They're just waiting for their day in court. And 67 percent of people come back. Our recidivism rate is among the highest in the world: almost seven in 10 people who are released from prison will be rearrested, in a constant cycle of crime and incarceration.

So when I started my job at the Arnold Foundation, I came back to looking at a lot of these questions, and I came back to thinking about how we had used data and analytics to transform the way we did criminal justice in New Jersey. And when I look at the criminal justice system in the United States today, I feel the exact same way that I did about the State of New Jersey when I started there, which is that we absolutely have to do better, and I know that we can do better.

So I decided to focus on using data and analytics to help make the most critical decision in public safety: the determination, when someone has been arrested, of whether they pose a risk to public safety and should be detained, or whether they don't pose a risk to public safety and should be released. Everything that happens in criminal cases comes out of this one decision. It impacts everything. It impacts sentencing. It impacts whether someone gets drug treatment. It impacts crime and violence.

And when I talk to judges around the United States, which I do all the time now, they all say the same thing, which is that we put dangerous people in jail, and we let non-dangerous, non-violent people out. They mean it and they believe it.
But when we start to look at the data, which, by the way, the judges don't have, what we find time and time again is that this isn't the case. We find that low-risk offenders, who make up 50 percent of our entire criminal justice population, are in jail.

Take Leslie Chew, a Texas man who stole four blankets on a cold winter night. He was arrested, and he was kept in jail on 3,500 dollars bail, an amount that he could not afford to pay. And he stayed in jail for eight months, until his case came up for trial, at a cost to taxpayers of more than 9,000 dollars.

And at the other end of the spectrum, we're doing an equally terrible job. The people we find are the highest-risk offenders, the people who we think have the highest likelihood of committing a new crime if they're released: we see nationally that 50 percent of those people are being released.

The reason for this is the way we make decisions. Judges have the best intentions when they make these decisions about risk, but they're making them subjectively. They're like the baseball scouts 20 years ago, using their instinct and their experience to try to decide what risk someone poses. They're being subjective, and we know what happens with subjective decision-making, which is that we are often wrong.

What we need in this space are strong data and analytics. What I decided to look for was a strong data-and-analytics risk assessment tool, something that would let judges actually understand, in a scientific and objective way, the risk posed by someone in front of them. I looked all over the country, and I found that between five and 10 percent of all U.S. jurisdictions actually use any type of risk assessment tool, and when I looked at these tools, I quickly realized why.
They were unbelievably expensive to administer, they were time-consuming, and they were limited to the local jurisdiction in which they'd been created. So, basically, they couldn't be scaled or transferred to other places.

So I went out and built a phenomenal team of data scientists and researchers and statisticians to build a universal risk assessment tool, so that every single judge in the United States of America can have an objective, scientific measure of risk.

For the tool that we've built, we collected 1.5 million cases from all around the United States: from cities, from counties, from every single state in the country, and from the federal districts. And with those 1.5 million cases, which is the largest data set on pretrial in the United States today, we were able to find that there were 900-plus risk factors we could look at to try to figure out what mattered most. And we found that there were nine specific things that mattered all across the country, and that those were the most highly predictive of risk. And so we built a universal risk assessment tool.

And it looks like this. As you'll see, we put some information in, but most of it is incredibly simple, it's easy to use, and it focuses on things like the defendant's prior convictions, whether they've been sentenced to incarceration, whether they've engaged in violence before, and whether they've failed to come back to court.

And with this tool, we can predict three things. First, whether or not someone will commit a new crime if they're released. Second, for the first time, and I think this is incredibly important, we can predict whether someone will commit an act of violence if they're released. And that's the single most important thing that judges say when you talk to them. And third, we can predict whether someone will come back to court.
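[Editor's note: to make the shape of this concrete, here is a minimal sketch, in Python, of what a points-based tool with these three outputs might look like. Everything in it is a hypothetical stand-in: the talk says the real tool distilled nine factors from 1.5 million cases, but it does not list those factors or their weights, so the inputs, point values, and cutoffs below are illustrative only.]

```python
# A minimal, illustrative sketch of a points-based pretrial risk tool.
# Factor names, point values, and cutoffs are hypothetical stand-ins,
# NOT the nine factors or weights of the actual tool described in the talk.

from dataclasses import dataclass


@dataclass
class Defendant:
    prior_convictions: int          # number of prior convictions
    prior_violent_convictions: int  # prior convictions for violent offenses
    prior_failures_to_appear: int   # prior failures to come back to court
    prior_incarceration: bool       # previously sentenced to incarceration
    pending_charge: bool            # had a charge pending at time of arrest


def to_scale(points: int) -> int:
    """Clamp raw points onto a 1-6 scale, 6 being the highest risk."""
    return min(6, 1 + points)


def assess(d: Defendant) -> dict:
    """Return the three predictions the talk describes, as dashboard fields."""
    # New criminal activity: will this person commit a new crime if released?
    nca = (min(d.prior_convictions, 3)
           + (1 if d.prior_incarceration else 0)
           + (2 if d.pending_charge else 0))

    # New violent criminal activity: will they commit an act of violence?
    nvca = (2 * min(d.prior_violent_convictions, 2)
            + (1 if d.pending_charge else 0))

    # Failure to appear: will they fail to come back to court?
    fta = (2 * min(d.prior_failures_to_appear, 2)
           + (1 if d.pending_charge else 0))

    return {
        "new_criminal_activity": to_scale(nca),
        "elevated_risk_of_violence": nvca >= 3,   # a flag, not a score
        "failure_to_appear": to_scale(fta),
    }


if __name__ == "__main__":
    # A defendant with repeat failures to appear but no history of violence.
    example = Defendant(prior_convictions=2,
                        prior_violent_convictions=0,
                        prior_failures_to_appear=2,
                        prior_incarceration=True,
                        pending_charge=False)
    print(assess(example))
    # {'new_criminal_activity': 4, 'elevated_risk_of_violence': False,
    #  'failure_to_appear': 5}
```

The design point the sketch tries to capture is the one in the talk: a handful of simple, history-based inputs that are cheap to collect in any jurisdiction, mapped to a small set of scores a judge can read at a glance.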
And every single judge in the United States of America can use it, because it's been created on a universal data set. What judges see if they run the risk assessment tool is this: it's a dashboard. At the top, you see the new criminal activity score, six, of course, being the highest. In the middle, you see "elevated risk of violence." What that says is that this person has an elevated risk of violence and the judge should look twice at them. And then, towards the bottom, you see the "failure to appear" score, which again is the likelihood that someone will fail to come back to court.

Now, I want to say something really important. It's not that I think we should be eliminating the judge's instinct and experience from this process. I don't. I actually believe that the problem we see, and the reason we have these incredible system errors, where we're incarcerating low-level, nonviolent people and releasing high-risk, dangerous people, is that we don't have an objective measure of risk. But what I believe should happen is that we should take that data-driven risk assessment and combine it with the judge's instinct and experience to lead us to better decision-making.

The tool went statewide in Kentucky on July 1st, and we're about to go up in a number of other U.S. jurisdictions. Our goal, quite simply, is that every single judge in the United States will use a data-driven risk tool within the next five years. We're now working on risk tools for prosecutors and for police officers as well, to try to take a system that runs today in America the same way it did 50 years ago, based on instinct and experience, and make it into one that runs on data and analytics.

Now, we have a ton of work left to do, and we have a lot of culture to change, but the great news about all of it is that we know it works.
It's why Google is Google, and it's why all these baseball teams use moneyball to win games. The great news for us as well is that it's the way we can transform the American criminal justice system. It's how we can make our streets safer, reduce our prison costs, and make our system much fairer and more just.

Some people call it data science. I call it moneyballing criminal justice.

Thank you.

(Applause)