Hello, I'm Joy, a poet of code on a mission to stop an unseen force that's rising, a force that I call "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Camera. I've got a face. Can you see my face? No-glasses face. You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found that it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.
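[The white-mask demo above comes down to a detector returning no face boxes. The talk never names the library the Aspire Mirror used, so the following is only a minimal sketch, assuming OpenCV and its stock Haar-cascade frontal-face model as a stand-in for the "generic facial recognition software" described: a webcam loop that draws a rectangle only when the detector sees a face. When detectMultiScale returns nothing, the camera simply "can't see" you, which is the failure the white mask works around.]

    # Minimal sketch: generic face detection on a cheap webcam (assumes OpenCV).
    import cv2

    # Stock, pre-trained frontal-face detector shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # the webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Returns a list of (x, y, w, h) boxes; an empty list means "no face seen".
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("coded gaze demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()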
Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the Internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

But don't worry, there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias, but algorithmic bias can also lead to discriminatory practices. Across the U.S., police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the U.S. -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy.
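[The training-set explanation above ("this is a face, this is not a face") can be made concrete with a toy sketch. Everything here is illustrative: synthetic feature vectors stand in for real images, and scikit-learn's LogisticRegression stands in for a production face detector. The point is only the mechanism: labeled examples in, a learned decision rule out, and a rule that works worst for whatever the examples under-represent.]

    # Toy sketch of "teaching a computer" faces vs. not-faces (assumes NumPy + scikit-learn).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Stand-in "images": flattened 8x8 patches. Hypothetical data, not real faces.
    face_images = rng.normal(loc=0.6, scale=0.1, size=(200, 64))      # labeled "face"
    not_face_images = rng.normal(loc=0.3, scale=0.1, size=(200, 64))  # labeled "not a face"

    X = np.vstack([face_images, not_face_images])
    y = np.array([1] * 200 + [0] * 200)

    # Fit a classifier on the labeled examples: this is the "teaching" step.
    clf = LogisticRegression(max_iter=1000).fit(X, y)

    # A new patch similar to the training faces is recognized easily...
    new_patch = rng.normal(loc=0.6, scale=0.1, size=(1, 64))
    print("face?", bool(clf.predict(new_patch)[0]))
    # ...but patches that deviate from the training "norm" score poorly, which is
    # how a narrow training set turns into a detector that misses some faces.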
Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious, and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college that you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?

Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.
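[The fairness questions above -- who gets the loan, how long someone spends in prison, "are they fair?" -- are often checked by disaggregating a model's error rates by group. Below is a minimal sketch on made-up data with a hypothetical risk score; it is not any real scoring system, only an illustration of one common check: comparing false-positive rates across two groups at the same decision threshold.]

    # Sketch of a disaggregated fairness check on a hypothetical risk score (assumes NumPy).
    import numpy as np

    rng = np.random.default_rng(1)

    # Made-up scores in [0, 1] and made-up true outcomes (1 = actually reoffended).
    scores_a = rng.uniform(0.0, 1.0, 1000)   # group A
    scores_b = rng.uniform(0.1, 1.0, 1000)   # group B happens to score higher
    truth_a = rng.integers(0, 2, 1000)
    truth_b = rng.integers(0, 2, 1000)

    def false_positive_rate(scores, truth, threshold=0.7):
        """Share of people who did NOT reoffend but were still flagged high-risk."""
        flagged = scores >= threshold
        negatives = truth == 0
        return (flagged & negatives).sum() / negatives.sum()

    print("FPR group A:", round(false_positive_rate(scores_a, truth_a), 3))
    print("FPR group B:", round(false_positive_rate(scores_b, truth_b), 3))
    # A large gap means one group pays more for the model's mistakes -- the kind
    # of disparity an unaudited score can quietly encode.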
And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters, and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester, and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)
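[The auditing idea in the closing section -- report bias, request audits, become a tester -- can also be sketched as code: run an existing detector over a demographically tagged benchmark (for example, images gathered through the kind of "Selfies for Inclusion" campaign imagined above) and report detection rates per group. Every name and number below is hypothetical; the detector and benchmark are placeholders you would supply.]

    # Sketch of a per-group detection-rate audit of an existing face detector.
    from collections import defaultdict

    def audit(detector, benchmark):
        """benchmark: iterable of (image, group_label); detector(image) -> True/False."""
        hits = defaultdict(int)
        totals = defaultdict(int)
        for image, group in benchmark:
            totals[group] += 1
            if detector(image):
                hits[group] += 1
        return {group: hits[group] / totals[group] for group in totals}

    # Stand-ins for illustration only: a fake benchmark and a fake detector.
    fake_benchmark = (
        [("img", "lighter-skinned")] * 95 + [("miss", "lighter-skinned")] * 5
        + [("img", "darker-skinned")] * 80 + [("miss", "darker-skinned")] * 20
    )

    def fake_detector(image):
        return image == "img"  # pretends to detect a face only in "img" frames

    print(audit(fake_detector, fake_benchmark))
    # -> {'lighter-skinned': 0.95, 'darker-skinned': 0.8}: a gap worth reporting.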