The kill decision shouldn't belong to a robot

  • 0:01 - 0:04
    I write fiction, sci-fi thrillers,
  • 0:04 - 0:06
    so if I say "killer robots,"
  • 0:06 - 0:08
    you'd probably think something like this.
  • 0:08 - 0:11
    But I'm actually not here to talk about fiction.
  • 0:11 - 0:14
    I'm here to talk about very real killer robots,
  • 0:14 - 0:17
    autonomous combat drones.
  • 0:17 - 0:21
    Now, I'm not referring to Predator and Reaper drones,
  • 0:21 - 0:24
    which have a human making targeting decisions.
  • 0:24 - 0:27
    I'm talking about fully autonomous robotic weapons
  • 0:27 - 0:30
    that make lethal decisions about human beings
  • 0:30 - 0:32
    all on their own.
  • 0:32 - 0:36
    There's actually a technical term for this: lethal autonomy.
  • 0:36 - 0:39
    Now, lethally autonomous killer robots
  • 0:39 - 0:42
    would take many forms -- flying, driving,
  • 0:42 - 0:45
    or just lying in wait.
  • 0:45 - 0:48
    And actually, they're very quickly becoming a reality.
  • 0:48 - 0:50
    These are two automatic sniper stations
  • 0:50 - 0:55
    currently deployed in the DMZ between North and South Korea.
  • 0:55 - 0:57
    Both of these machines are capable of automatically
  • 0:57 - 1:00
    identifying a human target and firing on it,
  • 1:00 - 1:05
    the one on the left at a distance of over a kilometer.
  • 1:05 - 1:08
    Now, in both cases, there's still a human in the loop
  • 1:08 - 1:10
    to make that lethal firing decision,
  • 1:10 - 1:16
    but it's not a technological requirement. It's a choice.
  • 1:16 - 1:19
    And it's that choice that I want to focus on,
  • 1:19 - 1:22
    because as we migrate lethal decision-making
  • 1:22 - 1:25
    from humans to software,
  • 1:25 - 1:28
    we risk not only taking the humanity out of war,
  • 1:28 - 1:32
    but also changing our social landscape entirely,
  • 1:32 - 1:34
    far from the battlefield.
  • 1:34 - 1:38
    That's because the way humans resolve conflict
  • 1:38 - 1:40
    shapes our social landscape.
  • 1:40 - 1:43
    And this has always been the case, throughout history.
  • 1:43 - 1:46
    For example, these were state-of-the-art weapons systems
  • 1:46 - 1:48
    in 1400 A.D.
  • 1:48 - 1:51
    Now they were both very expensive to build and maintain,
  • 1:51 - 1:54
    but with these you could dominate the populace,
  • 1:54 - 1:58
    and the distribution of political power in feudal society reflected that.
  • 1:58 - 2:01
    Power was focused at the very top.
  • 2:01 - 2:04
    And what changed? Technological innovation.
  • 2:04 - 2:06
    Gunpowder, cannon.
  • 2:06 - 2:10
    And pretty soon, armor and castles were obsolete,
  • 2:10 - 2:12
    and it mattered less who you brought to the battlefield
  • 2:12 - 2:16
    than how many people you brought to the battlefield.
  • 2:16 - 2:20
    And as armies grew in size, the nation-state arose
  • 2:20 - 2:23
    as a political and logistical requirement of defense.
  • 2:23 - 2:26
    And as leaders had to rely on more of their populace,
  • 2:26 - 2:28
    they began to share power.
  • 2:28 - 2:30
    Representative government began to form.
  • 2:30 - 2:33
    So again, the tools we use to resolve conflict
  • 2:33 - 2:37
    shape our social landscape.
  • 2:37 - 2:41
    Autonomous robotic weapons are such a tool,
  • 2:41 - 2:46
    except that, by requiring very few people to go to war,
  • 2:46 - 2:51
    they risk re-centralizing power into very few hands,
  • 2:51 - 2:57
    possibly reversing a five-century trend toward democracy.
  • 2:57 - 2:59
    Now, I think, knowing this,
  • 2:59 - 3:03
    we can take decisive steps to preserve our democratic institutions,
  • 3:03 - 3:07
    to do what humans do best, which is adapt.
  • 3:07 - 3:09
    But time is a factor.
  • 3:09 - 3:12
    Seventy nations are developing remotely-piloted
  • 3:12 - 3:14
    combat drones of their own,
  • 3:14 - 3:17
    and as you'll see, remotely-piloted combat drones
  • 3:17 - 3:22
    are the precursors to autonomous robotic weapons.
  • 3:22 - 3:24
    That's because once you've deployed remotely-piloted drones,
  • 3:24 - 3:28
    there are three powerful factors pushing decision-making
  • 3:28 - 3:32
    away from humans and on to the weapon platform itself.
  • 3:32 - 3:38
    The first of these is the deluge of video that drones produce.
  • 3:38 - 3:41
    For example, in 2004, the U.S. drone fleet produced
  • 3:41 - 3:47
    a grand total of 71 hours of video surveillance for analysis.
  • 3:47 - 3:51
    By 2011, this had gone up to 300,000 hours,
  • 3:51 - 3:54
    outstripping human ability to review it all,
  • 3:54 - 3:58
    but even that number is about to go up drastically.
  • 3:58 - 4:01
    The Pentagon's Gorgon Stare and Argus programs
  • 4:01 - 4:04
    will put up to 65 independently operated camera eyes
  • 4:04 - 4:06
    on each drone platform,
  • 4:06 - 4:09
    and this would vastly outstrip human ability to review it.
  • 4:09 - 4:11
    And that means visual intelligence software will need
  • 4:11 - 4:15
    to scan it for items of interest.
  • 4:15 - 4:17
    And that means very soon
  • 4:17 - 4:19
    drones will tell humans what to look at,
  • 4:19 - 4:22
    not the other way around.
  • 4:22 - 4:24
    But there's a second powerful incentive pushing
  • 4:24 - 4:28
    decision-making away from humans and onto machines,
  • 4:28 - 4:31
    and that's electromagnetic jamming,
  • 4:31 - 4:33
    severing the connection between the drone
  • 4:33 - 4:36
    and its operator.
  • 4:36 - 4:38
    Now we saw an example of this in 2011
  • 4:38 - 4:41
    when an American RQ-170 Sentinel drone
  • 4:41 - 4:46
    got a bit confused over Iran due to a GPS spoofing attack,
  • 4:46 - 4:51
    but any remotely-piloted drone is susceptible to this type of attack,
  • 4:51 - 4:53
    and that means drones
  • 4:53 - 4:56
    will have to shoulder more decision-making.
  • 4:56 - 4:59
    They'll know their mission objective,
  • 4:59 - 5:04
    and they'll react to new circumstances without human guidance.
  • 5:04 - 5:07
    They'll ignore external radio signals
  • 5:07 - 5:09
    and send very few of their own.
  • 5:09 - 5:11
    Which brings us to, really, the third
  • 5:11 - 5:15
    and most powerful incentive pushing decision-making
  • 5:15 - 5:18
    away from humans and onto weapons:
  • 5:18 - 5:22
    plausible deniability.
  • 5:22 - 5:25
    Now we live in a global economy.
  • 5:25 - 5:29
    High-tech manufacturing is occurring on most continents.
  • 5:29 - 5:32
    Cyber espionage is spiriting away advanced designs
  • 5:32 - 5:34
    to parts unknown,
  • 5:34 - 5:36
    and in that environment, it is very likely
  • 5:36 - 5:40
    that a successful drone design will be knocked off in contract factories,
  • 5:40 - 5:43
    and proliferate on the gray market.
  • 5:43 - 5:45
    And in that situation, sifting through the wreckage
  • 5:45 - 5:48
    of a suicide drone attack, it will be very difficult to say
  • 5:48 - 5:52
    who sent that weapon.
  • 5:52 - 5:55
    This raises the very real possibility
  • 5:55 - 5:58
    of anonymous war.
  • 5:58 - 6:01
    This could turn the geopolitical balance on its head,
  • 6:01 - 6:04
    make it very difficult for a nation to turn its firepower
  • 6:04 - 6:07
    against an attacker, and that could shift the balance
  • 6:07 - 6:11
    in the 21st century away from defense and toward offense.
  • 6:11 - 6:14
    It could make military action a viable option
  • 6:14 - 6:16
    not just for small nations,
  • 6:16 - 6:19
    but criminal organizations, private enterprise,
  • 6:19 - 6:21
    even powerful individuals.
  • 6:21 - 6:25
    It could create a landscape of rival warlords
  • 6:25 - 6:28
    undermining rule of law and civil society.
  • 6:28 - 6:32
    Now if responsibility and transparency
  • 6:32 - 6:34
    are two of the cornerstones of representative government,
  • 6:34 - 6:39
    autonomous robotic weapons could undermine both.
  • 6:39 - 6:40
    Now you might be thinking that
  • 6:40 - 6:42
    citizens of high-tech nations
  • 6:42 - 6:45
    would have the advantage in any robotic war,
  • 6:45 - 6:49
    that citizens of those nations would be less vulnerable,
  • 6:49 - 6:53
    particularly compared with developing nations.
  • 6:53 - 6:57
    But I think the truth is the exact opposite.
  • 6:57 - 6:59
    I think citizens of high-tech societies
  • 6:59 - 7:03
    are more vulnerable to robotic weapons,
  • 7:03 - 7:07
    and the reason can be summed up in one word: data.
  • 7:07 - 7:11
    Data powers high-tech societies.
  • 7:11 - 7:14
    Cell phone geolocation, telecom metadata,
  • 7:14 - 7:17
    social media, email, text, financial transaction data,
  • 7:17 - 7:21
    transportation data, it's a wealth of real-time data
  • 7:21 - 7:24
    on the movements and social interactions of people.
  • 7:24 - 7:28
    In short, we are more visible to machines
  • 7:28 - 7:30
    than any people in history,
  • 7:30 - 7:36
    and this perfectly suits the targeting needs of autonomous weapons.
  • 7:36 - 7:37
    What you're looking at here
  • 7:37 - 7:41
    is a link analysis map of a social group.
  • 7:41 - 7:44
    Lines indicate social connectedness between individuals.
  • 7:44 - 7:47
    And these types of maps can be automatically generated
  • 7:47 - 7:52
    based on the data trail modern people leave behind.
  • 7:52 - 7:54
    Now it's typically used to market goods and services
  • 7:54 - 7:59
    to targeted demographics, but it's a dual-use technology,
  • 7:59 - 8:02
    because targeting is used in another context.
  • 8:02 - 8:05
    Notice that certain individuals are highlighted.
  • 8:05 - 8:08
    These are the hubs of social networks.
  • 8:08 - 8:12
    These are organizers, opinion-makers, leaders,
  • 8:12 - 8:14
    and these people also can be automatically identified
  • 8:14 - 8:17
    from their communication patterns.
  • 8:17 - 8:19
    Now, if you're a marketer, you might then target them
  • 8:19 - 8:21
    with product samples, try to spread your brand
  • 8:21 - 8:24
    through their social group.
  • 8:24 - 8:26
    But if you're a repressive government
  • 8:26 - 8:31
    searching for political enemies, you might instead remove them,
  • 8:31 - 8:34
    eliminate them, disrupt their social group,
  • 8:34 - 8:37
    and those who remain behind lose social cohesion
  • 8:37 - 8:39
    and organization.
  • 8:39 - 8:43
    Now in a world of cheap, proliferating robotic weapons,
  • 8:43 - 8:45
    borders would offer very little protection
  • 8:45 - 8:47
    to critics of distant governments
  • 8:47 - 8:51
    or trans-national criminal organizations.
  • 8:51 - 8:55
    Popular movements agitating for change
  • 8:55 - 8:58
    could be detected early and their leaders eliminated
  • 8:58 - 9:01
    before their ideas achieve critical mass.
  • 9:01 - 9:04
    And ideas achieving critical mass
  • 9:04 - 9:08
    is what political activism in popular government is all about.
  • 9:08 - 9:12
    Anonymous lethal weapons could make lethal action
  • 9:12 - 9:15
    an easy choice for all sorts of competing interests.
  • 9:15 - 9:19
    And this would put a chill on free speech
  • 9:19 - 9:24
    and popular political action, the very heart of democracy.
  • 9:24 - 9:27
    And this is why we need an international treaty
  • 9:27 - 9:31
    on robotic weapons, and in particular a global ban
  • 9:31 - 9:35
    on the development and deployment of killer robots.
  • 9:35 - 9:38
    Now we already have international treaties
  • 9:38 - 9:41
    on nuclear and biological weapons, and, while imperfect,
  • 9:41 - 9:44
    these have largely worked.
  • 9:44 - 9:47
    But robotic weapons might be every bit as dangerous,
  • 9:47 - 9:51
    because they will almost certainly be used,
  • 9:51 - 9:56
    and they would also be corrosive to our democratic institutions.
  • 9:56 - 9:59
    Now in November 2012 the U.S. Department of Defense
  • 9:59 - 10:02
    issued a directive requiring that
  • 10:02 - 10:06
    a human being be present in all lethal decisions.
  • 10:06 - 10:11
    This effectively, if temporarily, banned autonomous weapons in the U.S. military,
  • 10:11 - 10:15
    but that directive needs to be made permanent.
  • 10:15 - 10:19
    And it could set the stage for global action.
  • 10:19 - 10:23
    Because we need an international legal framework
  • 10:23 - 10:25
    for robotic weapons.
  • 10:25 - 10:28
    And we need it now, before there's a devastating attack
  • 10:28 - 10:31
    or a terrorist incident that causes nations of the world
  • 10:31 - 10:33
    to rush to adopt these weapons
  • 10:33 - 10:37
    before thinking through the consequences.
  • 10:37 - 10:40
    Autonomous robotic weapons concentrate too much power
  • 10:40 - 10:46
    in too few hands, and they would imperil democracy itself.
  • 10:46 - 10:49
    Now, don't get me wrong, I think there are tons
  • 10:49 - 10:51
    of great uses for unarmed civilian drones:
  • 10:51 - 10:55
    environmental monitoring, search and rescue, logistics.
  • 10:55 - 10:58
    If we have an international treaty on robotic weapons,
  • 10:58 - 11:02
    how do we gain the benefits of autonomous drones
  • 11:02 - 11:04
    and vehicles while still protecting ourselves
  • 11:04 - 11:08
    against illegal robotic weapons?
  • 11:08 - 11:13
    I think the secret will be transparency.
  • 11:13 - 11:16
    No robot should have an expectation of privacy
  • 11:16 - 11:20
    in a public place.
  • 11:20 - 11:25
    (Applause)
  • 11:25 - 11:27
    Each robot and drone should have
  • 11:27 - 11:30
    a cryptographically signed I.D. burned in at the factory
  • 11:30 - 11:33
    that can be used to track its movement through public spaces.
  • 11:33 - 11:36
    We have license plates on cars, tail numbers on aircraft.
  • 11:36 - 11:38
    This is no different.
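
(Editor's note: a minimal sketch of what a factory-signed, verifiable drone I.D. could look like, using Ed25519 signatures from Python's cryptography package. The manufacturer key handling, serial format, and verification flow are assumptions for illustration, not an existing standard.)

```python
# Sketch: a cryptographically signed drone I.D., verifiable by
# anyone holding the manufacturer's published public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)

# At the factory: sign the serial number and burn the
# (serial, signature) pair into read-only hardware.
factory_key = Ed25519PrivateKey.generate()
serial = b"ACME-DRONE-000042"          # hypothetical serial format
signature = factory_key.sign(serial)   # the "burned-in" I.D.

# In the field: verify a broadcast I.D. before trusting it.
public_key = factory_key.public_key()
try:
    public_key.verify(signature, serial)
    print("registered drone:", serial.decode())
except InvalidSignature:
    print("rogue drone: unsigned or forged I.D.")
```

Like a license plate, the I.D. reveals nothing private by itself; it simply lets the public tell registered machines from rogue ones, which is what the tracking app and civic sensors described below would rely on.
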
  • 11:38 - 11:40
    And every citizen should be able to download an app
  • 11:40 - 11:43
    that shows the population of drones and autonomous vehicles
  • 11:43 - 11:45
    moving through public spaces around them,
  • 11:45 - 11:48
    both right now and historically.
  • 11:48 - 11:52
    And civic leaders should deploy sensors and civic drones
  • 11:52 - 11:54
    to detect rogue drones,
  • 11:54 - 11:57
    and instead of sending killer drones of their own up to shoot them down,
  • 11:57 - 12:00
    they should alert humans to their presence.
  • 12:00 - 12:03
    And in certain very high-security areas,
  • 12:03 - 12:05
    perhaps civic drones would snare them
  • 12:05 - 12:07
    and drag them off to a bomb disposal facility.
  • 12:07 - 12:11
    But notice, this is more an immune system
  • 12:11 - 12:12
    than a weapons system.
  • 12:12 - 12:14
    It would allow us to avail ourselves of the use
  • 12:14 - 12:16
    of autonomous vehicles and drones
  • 12:16 - 12:21
    while still preserving our open, civil society.
  • 12:21 - 12:24
    We must ban the deployment and development
  • 12:24 - 12:26
    of killer robots.
  • 12:26 - 12:30
    Let's not succumb to the temptation to automate war.
  • 12:30 - 12:33
    Autocratic governments and criminal organizations
  • 12:33 - 12:36
    undoubtedly will, but let's not join them.
  • 12:36 - 12:38
    Autonomous robotic weapons
  • 12:38 - 12:40
    would concentrate too much power
  • 12:40 - 12:43
    in too few unseen hands,
  • 12:43 - 12:46
    and that would be corrosive to representative government.
  • 12:46 - 12:49
    Let's make sure, for democracies at least,
  • 12:49 - 12:51
    killer robots remain fiction.
  • 12:51 - 12:52
    Thank you.
  • 12:52 - 12:57
    (Applause)
  • 12:57 - 13:02
    Thank you. (Applause)
Speaker:
Daniel Suarez
Description:

As a novelist, Daniel Suarez spins dystopian tales of the future. But on the TEDGlobal stage, he talks us through a real-life scenario we all need to know more about: the rise of autonomous robotic weapons of war. Advanced drones, automated weapons and AI-powered intelligence-gathering tools, he suggests, could take the decision to make war out of the hands of humans.
