I write fiction: sci-fi thrillers. So if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots: autonomous combat drones.

Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.

Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality. These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer. Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice.
And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape. And this has always been the case, throughout history.

For example, these were state-of-the-art weapons systems in 1400 A.D. Now, they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top.

And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.

Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor. Seventy nations are developing remotely-piloted combat drones of their own, and as you'll see, remotely-piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely-piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.

The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all, but even that number is about to go up drastically.
The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.

But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now, we saw an example of this in 2011, when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely-piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now, we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment, it is very likely that a successful drone design will be knocked off in contract factories and proliferate in the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.

This raises the very real possibility of anonymous war. This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society.
Now, if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.

Now, you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data -- it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.

What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind.
Now, it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns. Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.

Now, in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or trans-national criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about.
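The hub-identification idea is simple enough to sketch in a few lines. This is a minimal illustration only -- the names and the communication log are hypothetical, and real link-analysis tools use far richer centrality measures than raw contact counts:

```python
from collections import defaultdict

def find_hubs(contacts, top_n=3):
    """Rank people by how many distinct others they communicate with
    (degree centrality) -- the 'hubs' of the social graph."""
    # Build an undirected graph: person -> set of distinct contacts.
    neighbors = defaultdict(set)
    for a, b in contacts:
        neighbors[a].add(b)
        neighbors[b].add(a)
    # The people with the most distinct contacts sit at the center.
    return sorted(neighbors, key=lambda p: len(neighbors[p]), reverse=True)[:top_n]

# Hypothetical communication log: (sender, receiver) pairs.
log = [("ana", "bo"), ("ana", "cy"), ("ana", "dee"),
       ("bo", "cy"), ("dee", "ed"), ("ana", "ed")]
print(find_hubs(log, top_n=1))  # ana talks to the most distinct people
```

The unsettling point of the talk is precisely that something this mechanical, fed with the data trail we all leave, is enough to surface organizers and opinion-makers automatically.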
Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.

And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now, we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.

Now, in November 2012, the U.S. Department of Defense issued a directive requiring that a human being be present in all lethal decisions. This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent. And it could set the stage for global action. Because we need an international legal framework for robotic weapons.
And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.

Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.

(Applause)

Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different.
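The signed-I.D. idea can be sketched roughly as follows. This is a toy illustration, not a design: a real registry would use public-key signatures (e.g. Ed25519), so that anyone could verify an I.D. without holding the signing key; here a keyed HMAC stands in, and every name, serial, and key is made up:

```python
import hashlib
import hmac

# Stand-in secret; a real scheme would use an asymmetric private signing key.
FACTORY_KEY = b"demo-factory-key"

def issue_id(serial: str) -> str:
    """'Burn in' a tamper-evident I.D. at the factory: serial + signature."""
    tag = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return f"{serial}:{tag}"

def verify_id(drone_id: str) -> bool:
    """Check that a sighted I.D. was genuinely issued and not forged."""
    serial, tag = drone_id.rsplit(":", 1)
    expected = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

genuine = issue_id("DRONE-0042")
print(verify_id(genuine))                    # a factory-issued I.D. checks out
print(verify_id("DRONE-9999:" + "0" * 64))   # a forged tag does not
```

The point is the property, not the mechanism: an I.D. that can be checked against a signature makes a drone as accountable in public space as a license plate makes a car.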
And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically. And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should notify humans of their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility.

But notice, this is more an immune system than a weapons system. It would allow us to take advantage of autonomous vehicles and drones while still preserving our open, civil society. We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government.
Let's make sure that, for democracies at least, killer robots remain fiction.

Thank you.

(Applause)

Thank you.

(Applause)