So, I started my first job as a computer programmer in my very first year of college -- basically, as a teenager.

Soon after I started working, writing software in a company, a manager who worked at the company came down to where I was, and he whispered to me, "Can he tell if I'm lying?"

There was nobody else in the room.

"Can who tell if you're lying? And why are we whispering?"

The manager pointed at the computer in the room. "Can he tell if I'm lying?"

Well, that manager was having an affair with the receptionist.

(Laughter)

And I was still a teenager. So I whisper-shouted back to him, "Yes, the computer can tell if you're lying."

(Laughter)

Well, I laughed, but actually, the laugh's on me. Nowadays, there are computational systems that can suss out emotional states and even lying from processing human faces. Advertisers and even governments are very interested.

I had become a computer programmer because I was one of those kids crazy about math and science. But somewhere along the line I'd learned about nuclear weapons, and I'd gotten really concerned with the ethics of science. I was troubled. However, because of family circumstances, I also needed to start working as soon as possible. So I thought to myself, hey, let me pick a technical field where I can get a job easily and where I don't have to deal with any troublesome questions of ethics. So I picked computers.

(Laughter)

Well, ha, ha, ha! All the laughs are on me. Nowadays, computer scientists are building platforms that control what a billion people see every day. They're developing cars that could decide who to run over. They're even building machines, weapons, that might kill human beings in war. It's ethics all the way down.

Machine intelligence is here. We're now using computation to make all sorts of decisions, but also new kinds of decisions. We're asking computation questions that have no single right answers, that are subjective and open-ended and value-laden. We're asking questions like, "Who should the company hire?" "Which update from which friend should you be shown?" "Which convict is more likely to reoffend?" "Which news item or movie should be recommended to people?"
Look, yes, we've been using computers for a while, but this is different. This is a historical twist, because we cannot anchor computation for such subjective decisions the way we can anchor computation for flying airplanes, building bridges, going to the moon.

Are airplanes safer? Did the bridge sway and fall? There, we have agreed-upon, fairly clear benchmarks, and we have laws of nature to guide us. We have no such anchors and benchmarks for decisions in messy human affairs.

To make things more complicated, our software is getting more powerful, but it's also getting less transparent and more complex. Recently, in the past decade, complex algorithms have made great strides. They can recognize human faces. They can decipher handwriting. They can detect credit card fraud and block spam and they can translate between languages. They can detect tumors in medical imaging. They can beat humans in chess and Go.

Much of this progress comes from a method called "machine learning." Machine learning is different than traditional programming, where you give the computer detailed, exact, painstaking instructions. It's more like you take the system and you feed it lots of data, including unstructured data, like the kind we generate in our digital lives. And the system learns by churning through this data.

And also, crucially, these systems don't operate under a single-answer logic. They don't produce a simple answer; it's more probabilistic: "This one is probably more like what you're looking for."

Now, the upside is: this method is really powerful. The head of Google's AI systems called it "the unreasonable effectiveness of data." The downside is, we don't really understand what the system learned. In fact, that's its power. This is less like giving instructions to a computer; it's more like training a puppy-machine-creature we don't really understand or control.

So this is our problem. It's a problem when this artificial intelligence system gets things wrong. It's also a problem when it gets things right, because we don't even know which is which when it's a subjective problem. We don't know what this thing is thinking.

So, consider a hiring algorithm -- a system used to hire people, using machine-learning systems.
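To make that "feed it data, get a probability" idea concrete, here is a minimal sketch of such a pipeline, assuming Python and the scikit-learn library. Every feature, number and label here is invented for illustration; this is not any real hiring product.

```python
# A toy "learn from past examples, score new ones" pipeline (invented data).
from sklearn.ensemble import RandomForestClassifier

# Each row describes a past candidate (made-up features):
# [years_of_experience, coding_test_score, resume_keyword_count]
X_train = [
    [1, 62, 3],
    [4, 81, 7],
    [7, 90, 9],
    [2, 55, 2],
    [6, 78, 8],
    [3, 70, 5],
]
# 1 = later rated a "high performer", 0 = not (made-up labels)
y_train = [0, 1, 1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)   # the system "churns through" the examples

new_candidate = [[5, 74, 6]]
print(model.predict_proba(new_candidate))
# Prints a pair of probabilities (not-high-performer, high-performer),
# not a rule you can read off -- just a score the model arrived at.
```

The point is the shape of the output: a probability produced by hundreds of learned decision trees that nobody hand-wrote and nobody can easily read.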
Such a system would have been trained on previous employees' data and instructed to find and hire people like the existing high performers in the company. Sounds good. I once attended a conference that brought together human resources managers and executives, high-level people, using such systems in hiring. They were super excited. They thought that this would make hiring more objective, less biased, and give women and minorities a better shot against biased human managers.

And look -- human hiring is biased. I know. I mean, in one of my early jobs as a programmer, my immediate manager would sometimes come down to where I was really early in the morning or really late in the afternoon, and she'd say, "Zeynep, let's go to lunch!"

I'd be puzzled by the weird timing. It's 4pm. Lunch? I was broke, so free lunch. I always went.

I later realized what was happening. My immediate manager had not confessed to her higher-ups that the programmer they hired for a serious job was a teen girl who wore jeans and sneakers to work. I was doing a good job, I just looked wrong and was the wrong age and gender.

So hiring in a gender- and race-blind way certainly sounds good to me. But with these systems, it is more complicated, and here's why: Currently, computational systems can infer all sorts of things about you from your digital crumbs, even if you have not disclosed those things. They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy. Remember -- for things you haven't even disclosed. This is inference.

I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data. The results are impressive. Her system can predict the likelihood of depression months before the onset of any symptoms -- months before. No symptoms, there's prediction. She hopes it will be used for early intervention. Great!

But now put this in the context of hiring.
So at this human resources managers conference, I approached a high-level manager in a very large company, and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with a high future likelihood of depression? They're not depressed now, just maybe in the future, more likely. What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now? What if it's hiring aggressive people because that's your workplace culture?"

You can't tell this by looking at gender breakdowns. Those may be balanced. And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale." Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box. It has predictive power, but you don't understand it.

"What safeguards," I asked, "do you have to make sure that your black box isn't doing something shady?"

She looked at me as if I had just stepped on 10 puppy tails.

(Laughter)

She stared at me and she said, "I don't want to hear another word about this." And she turned around and walked away.

Mind you -- she wasn't rude. It was clearly: what I don't know isn't my problem, go away, death stare.

(Laughter)

Look, such a system may even be less biased than human managers in some ways. And it could make monetary sense. But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression. Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making over to machines we don't totally understand?

Another problem is this: these systems are often trained on data generated by our actions, human imprints. Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us, while we're telling ourselves, "We're just doing objective, neutral computation."

Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs.
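Here is that pick-up-and-amplify mechanism in a deliberately tiny, synthetic sketch, assuming Python with NumPy and scikit-learn; every feature, number and name is invented. Even when the sensitive attribute is never given to the model, biased historical labels can be rediscovered through a proxy feature that merely correlates with it.

```python
# Synthetic illustration: biased past decisions leak back in through a proxy.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 2000

group = rng.integers(0, 2, n)            # 0 or 1 -- never shown to the model
skill = rng.normal(0.0, 1.0, n)          # genuinely job-relevant signal
proxy = rng.normal(0.0, 1.0, n) + 2.0 * group   # invented career-gap-like
                                                # feature: tracks group, not skill

# Historical "hired" labels: driven by skill, but also penalizing group 1.
hired = (skill - 1.5 * group + rng.normal(0.0, 0.5, n)) > 0

X = np.column_stack([skill, proxy])      # note: 'group' itself is excluded
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hired)

picks = model.predict(X)
for g in (0, 1):
    print(f"selection rate for group {g}: {picks[group == g].mean():.2f}")
# The two rates come out far apart even though no "group" column exists:
# skill is independent of group, so the gap can only enter via the proxy.
```

And because the model only ever saw columns named "skill" and "proxy," nothing inside it is labeled bias -- which is exactly why you don't even know where to begin to look.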
Searching for African-American names is also more likely to bring up ads suggesting criminal history, even when there is none.

Such hidden biases and black-box algorithms, the kind researchers sometimes uncover and sometimes don't even know about, can have life-altering consequences.

In Wisconsin, a defendant was sentenced to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a commercial black box. The company refused to have its algorithm be challenged in open court. But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.

So, consider this case: This woman was late picking up her godsister from a school in Broward County, Florida, and was running down the street with a friend of hers. They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it. As they were speeding off, a woman came out and said, "Hey! That's my kid's bike!" They dropped it, they walked away, but they were arrested. She was wrong, she was foolish, but she was also just 18. She had a couple of juvenile misdemeanors.

Meanwhile, that man had been arrested for shoplifting in Home Depot -- 85 dollars' worth of stuff, a similar petty crime. But he had two prior armed robbery convictions. Yet the algorithm scored her as high risk, and not him.

Two years later, ProPublica found that she had not reoffended. It was just hard for her to get a job with her record. He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime.

Clearly, we need to audit our black boxes and not have them have this kind of unchecked power.

(Applause)

Audits are great and important, but they don't solve all our problems. Take Facebook's powerful news feed algorithm -- you know, the one that ranks everything and decides what to show you from all the friends and pages you follow. Should you be shown another baby picture?

(Laughter)

A sullen note from an acquaintance?
An important but difficult news item? There's no right answer. Facebook optimizes for engagement on the site: likes, shares, comments.

In August of 2014, protests broke out in Ferguson, Missouri, after the killing of an African-American teenager by a white police officer, under murky circumstances. The news of the protests was all over my algorithmically unfiltered Twitter feed, but nowhere on my Facebook. Was it my Facebook friends? I disabled Facebook's algorithm, which is hard because Facebook keeps wanting to make you come under the algorithm's control, and saw that my friends were talking about it. It's just that the algorithm wasn't showing it to me. I researched this and found this was a widespread problem.

The story of Ferguson wasn't algorithm-friendly. It's not "likable." Who's going to click on "like"? It's not even easy to comment on. Without likes and comments, the algorithm was likely showing it to even fewer people, so we didn't get to see this.

Instead, that week, Facebook's algorithm highlighted this, which is the ALS Ice Bucket Challenge. Worthy cause; dump ice water, donate to charity, fine. But it was super algorithm-friendly. The machine made this decision for us. A very important but difficult conversation might have been smothered, had Facebook been the only channel.

Now, finally, these systems can also be wrong in ways that don't resemble human systems. Do you guys remember Watson, IBM's machine-intelligence system that wiped the floor with human contestants on Jeopardy? It was a great player. But then, for Final Jeopardy, Watson was asked this question: "Its largest airport is named for a World War II hero, its second-largest for a World War II battle."

(Hums Final Jeopardy music)

Chicago. The two humans got it right. Watson, on the other hand, answered "Toronto" -- for a US city category!

The impressive system also made an error that a human would never make, a second-grader wouldn't make. Our machine intelligence can fail in ways that don't fit error patterns of humans, in ways we won't expect and be prepared for.
It'd be lousy not to get a job one is qualified for, but it would triple suck if it was because of a stack overflow in some subroutine.

(Laughter)

In May of 2010, a flash crash on Wall Street fueled by a feedback loop in Wall Street's "sell" algorithm wiped a trillion dollars of value in 36 minutes. I don't even want to think what "error" means in the context of lethal autonomous weapons.

So yes, humans have always had biases. Decision makers and gatekeepers, in courts, in news, in war ... they make mistakes; but that's exactly my point. We cannot escape these difficult questions. We cannot outsource our responsibilities to machines.

(Applause)

Artificial intelligence does not give us a "Get out of ethics free" card.

Data scientist Fred Benenson calls this math-washing. We need the opposite. We need to cultivate algorithm suspicion, scrutiny and investigation. We need to make sure we have algorithmic accountability, auditing and meaningful transparency. We need to accept that bringing math and computation to messy, value-laden human affairs does not bring objectivity; rather, the complexity of human affairs invades the algorithms.

Yes, we can and we should use computation to help us make better decisions. But we have to own up to our moral responsibility to judgment, and use algorithms within that framework, not as a means to abdicate and outsource our responsibilities to one another as human to human.

Machine intelligence is here. That means we must hold on ever tighter to human values and human ethics.

Thank you.

(Applause)