Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Camera. I've got a face. Can you see my face? No-glasses face. You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition.
The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. (A rough code sketch of this training-set idea follows at the end of this passage.) But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how it was through social robots that I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices. Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook.
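As referenced above, here is a minimal, hypothetical sketch of the "this is a face / this is not a face" training-set idea: a classifier learns only from its labeled examples, so the notion of "face" it picks up is only as broad as those examples. The file names patches.npy and labels.npy are placeholders, and the simple linear model stands in for whatever detector a given piece of generic facial recognition software actually uses.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import LinearSVC

    # Hypothetical training set: each row of X is a flattened image patch,
    # and y labels it 1 ("this is a face") or 0 ("this is not a face").
    # patches.npy and labels.npy are placeholder file names.
    X = np.load("patches.npy")
    y = np.load("labels.npy")

    # The model only learns the notion of "face" contained in these examples.
    # If the labeled faces cluster around one narrow norm, faces that deviate
    # from that norm fall near the decision boundary and are more easily missed.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    detector = LinearSVC().fit(X_train, y_train)

    print("held-out accuracy:", detector.score(X_test, y_test))

Note that a single overall accuracy number like this can look high even when the detector misses most faces that sit outside the training norm, which is exactly the failure described in the talk.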
My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties. Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision.

In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters.
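One concrete way to "factor in fairness" and to audit existing software, as the talk urges, is to break a detector's accuracy down by demographic group instead of reporting only a single overall number. The sketch below assumes a hypothetical audit log with a detection result and a self-reported group label for each test image; it is an illustration of the idea, not the actual methodology of any audit mentioned in the talk.

    import pandas as pd

    # Hypothetical audit log: one row per test image, recording whether the
    # detector found a face and a self-reported group label for the subject.
    results = pd.DataFrame({
        "group":    ["A", "A", "A", "B", "B", "B"],
        "detected": [True, True, True, True, False, False],
    })

    # The overall rate can look acceptable while one group fares far worse,
    # so report the detection rate per group as well.
    print("overall detection rate:", results["detected"].mean())
    print(results.groupby("group")["detected"].mean())

A gap like the one between groups A and B in this toy log is the kind of disparity an audit should surface before such software enters a crime-fighting arsenal.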
So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)