Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Camera. I've got a face. Can you see my face? No-glasses face. You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face.
I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity. (A small code sketch of this training step, and of what a narrow training set does, follows this passage.)

Now you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives.
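The training step described above -- labeled examples of "this is a face" and "this is not a face," and a detector that struggles with faces far from the training norm -- can be illustrated with a small sketch. This is a hypothetical toy example, not the software from the talk: the "images" are synthetic feature vectors, and scikit-learn's logistic regression stands in for a real face detector.

```python
# Toy illustration of training a "face / not a face" classifier, and of what
# happens when one kind of face is barely represented in the training set.
# All data here is synthetic (random feature vectors standing in for images).

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_examples(n, center, label):
    """Generate n feature vectors clustered around `center`, with a class label."""
    X = rng.normal(loc=center, scale=1.0, size=(n, 16))
    y = np.full(n, label)
    return X, y

# Training set: many "faces" drawn from one narrow region of feature space
# (the "established norm"), very few from a different region, plus non-faces.
X_majority, y_majority = make_examples(1000, center=2.0, label=1)   # well-represented faces
X_minority, y_minority = make_examples(20, center=-2.0, label=1)    # under-represented faces
X_nonface, y_nonface = make_examples(1000, center=0.0, label=0)     # not faces

X = np.vstack([X_majority, X_minority, X_nonface])
y = np.concatenate([y_majority, y_minority, y_nonface])

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Evaluate detection separately on faces like the bulk of the training set
# and on faces that deviate from that norm.
X_test_majority, _ = make_examples(500, center=2.0, label=1)
X_test_minority, _ = make_examples(500, center=-2.0, label=1)

print("detection rate, well-represented faces:", clf.predict(X_test_majority).mean())
print("detection rate, under-represented faces:", clf.predict(X_test_minority).mean())
```

With this setup, the toy detector typically flags nearly all of the well-represented test faces and very few of the under-represented ones, which is the failure mode the talk describes.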
So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. (A sketch of one such audit follows this passage.) We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.
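One concrete form of the software auditing mentioned above is a disaggregated audit: rather than reporting a single overall accuracy, measure how often a detector succeeds for each group in a labeled benchmark. The sketch below is a hypothetical illustration of that idea; the `detector` callable and the benchmark of (image, group) pairs are placeholders that a real audit would supply, and this is not the Algorithmic Justice League's actual tooling.

```python
# Minimal sketch of a disaggregated audit: per-group detection rates for a
# face detector on a labeled benchmark, instead of one overall number.
# `detector` and the benchmark records are hypothetical placeholders.

from collections import defaultdict
from typing import Callable, Iterable, Tuple

def audit_detection_rates(
    detector: Callable[[object], bool],
    benchmark: Iterable[Tuple[object, str]],
) -> dict:
    """Return the fraction of faces detected for each group label.

    `benchmark` yields (image, group_label) pairs where every image is known
    to contain a face; `detector(image)` returns True if a face was found.
    """
    found = defaultdict(int)
    total = defaultdict(int)
    for image, group in benchmark:
        total[group] += 1
        if detector(image):
            found[group] += 1
    return {group: found[group] / total[group] for group in total}

def fake_detector(image) -> bool:
    # Toy detector that only "finds" one kind of image, to show the audit output.
    return image == "img_a"

if __name__ == "__main__":
    # Hypothetical usage: a real audit would load benchmark photos and call a
    # real detection API here.
    fake_benchmark = [("img_a", "group_1")] * 95 + [("img_b", "group_2")] * 80
    print(audit_detection_rates(fake_detector, fake_benchmark))
    # {'group_1': 1.0, 'group_2': 0.0} -- a large gap between groups is a red flag
```

A large gap between the per-group rates is the kind of evidence a bias report or audit request is meant to surface.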
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)