Can we prevent the end of the world?

  • 0:00 - 0:03
    Ten years ago, I wrote a book which I entitled
  • 0:03 - 0:06
    "Our Final Century?" Question mark.
  • 0:06 - 0:10
    My publishers cut out the question mark. (Laughter)
  • 0:10 - 0:12
    The American publishers changed our title
  • 0:12 - 0:15
    to "Our Final Hour."
  • 0:15 - 0:18
    Americans like instant gratification and the reverse.
  • 0:18 - 0:21
    (Laughter)
  • 0:21 - 0:22
    And my theme was this:
  • 0:22 - 0:27
    our Earth has existed for 45 million centuries,
  • 0:27 - 0:28
    but this one is special:
  • 0:28 - 0:31
    it's the first where one species, ours,
  • 0:31 - 0:34
    has the planet's future in its hands.
  • 0:34 - 0:36
    Over nearly all of Earth's history,
  • 0:36 - 0:38
    threats have come from nature
  • 0:38 - 0:42
    — disease, earthquakes, asteroids, and so forth —
  • 0:42 - 0:47
    but from now on, the worst dangers come from us.
  • 0:47 - 0:51
    And it's now not just the nuclear threat:
  • 0:51 - 0:52
    in our interconnected world,
  • 0:52 - 0:56
    network breakdowns can cascade globally;
  • 0:56 - 1:00
    air travel can spread pandemics
    worldwide within days;
  • 1:00 - 1:03
    and social media can spread panic and rumor
  • 1:03 - 1:06
    literally at the speed of light.
  • 1:06 - 1:09
    We fret too much about minor hazards
  • 1:09 - 1:13
    — improbable air crashes, carcinogens in food,
  • 1:13 - 1:16
    low radiation doses, and so forth —
  • 1:16 - 1:18
    but we and our political masters
  • 1:18 - 1:23
    are in denial about catastrophic scenarios.
  • 1:23 - 1:26
    The worst have thankfully not yet happened.
  • 1:26 - 1:28
    Indeed, they probably won't.
  • 1:28 - 1:31
    But if an event is potentially devastating,
  • 1:31 - 1:34
    it's worth paying a substantial premium
  • 1:34 - 1:38
    to safeguard against it, even if it's unlikely,
  • 1:38 - 1:42
    just as we take out fire insurance on our house.
  • 1:42 - 1:48
    And as science offers greater power and promise,
  • 1:48 - 1:51
    the downside gets scarier too.
  • 1:51 - 1:53
    We get ever more vulnerable.
  • 1:53 - 1:55
    Within a few decades,
  • 1:55 - 1:57
    millions will have the capability
  • 1:57 - 2:01
    to misuse rapidly advancing biotech,
  • 2:01 - 2:04
    just as they misuse cybertech today.
  • 2:04 - 2:07
    Freeman Dyson, in a TEDTalk,
  • 2:07 - 2:11
    foresaw that children will design
    and create new organisms
  • 2:11 - 2:15
    just as routinely as his generation
    played with chemistry sets.
  • 2:15 - 2:18
    Well, this may be on the science fiction fringe,
  • 2:18 - 2:21
    but were even part of his scenario to come about,
  • 2:21 - 2:24
    our ecology and even our species
  • 2:24 - 2:28
    would surely not survive long unscathed.
  • 2:28 - 2:31
    For instance, there are some eco-extremists
  • 2:31 - 2:34
    who think that it would be better for the planet,
  • 2:34 - 2:37
    for Gaia, if there were far fewer humans.
  • 2:37 - 2:40
    What happens when such people have mastered
  • 2:40 - 2:42
    synthetic biology techniques
  • 2:42 - 2:45
    that will be widespread by 2050?
  • 2:45 - 2:48
    And by then, other science fiction nightmares
  • 2:48 - 2:50
    may transition to reality:
  • 2:50 - 2:52
    dumb robots going rogue,
  • 2:52 - 2:55
    or a network that develops a mind of its own
  • 2:55 - 2:57
    could threaten us all.
  • 2:57 - 3:00
    Well, can we guard against such risks by regulation?
  • 3:00 - 3:03
    We must surely try, but these enterprises
  • 3:03 - 3:06
    are so competitive, so globalized,
  • 3:06 - 3:08
    and so driven by commercial pressure,
  • 3:08 - 3:11
    that anything that can be done
    will be done somewhere,
  • 3:11 - 3:13
    whatever the regulations say.
  • 3:13 - 3:17
    It's like the drug laws: we try to regulate, but can't.
  • 3:17 - 3:20
    And the global village will have its village idiots,
  • 3:20 - 3:24
    and they'll have a global range.
  • 3:24 - 3:26
    So as I said in my book,
  • 3:26 - 3:29
    we'll have a bumpy ride through this century.
  • 3:29 - 3:32
    There may be setbacks to our society,
  • 3:32 - 3:37
    indeed, a 50 percent chance of a severe setback.
  • 3:37 - 3:39
    But are there conceivable events
  • 3:39 - 3:41
    that could be even worse,
  • 3:41 - 3:45
    events that could snuff out all life?
  • 3:45 - 3:48
    When a new particle accelerator came online,
  • 3:48 - 3:49
    some people anxiously asked,
  • 3:49 - 3:52
    could it destroy the Earth or, even worse,
  • 3:52 - 3:55
    rip apart the fabric of space?
  • 3:55 - 3:58
    Well luckily, reassurance could be offered.
  • 3:58 - 4:00
    I and others pointed out that nature
  • 4:00 - 4:02
    has done the same experiments
  • 4:02 - 4:04
    zillions of times already,
  • 4:04 - 4:06
    via cosmic ray collisions.
  • 4:06 - 4:09
    But scientists should surely be precautionary
  • 4:09 - 4:11
    about experiments that generate conditions
  • 4:11 - 4:14
    without precedent in the natural world.
  • 4:14 - 4:17
    Biologists should avoid release
    of potentially devastating
  • 4:17 - 4:20
    genetically modified pathogens.
  • 4:20 - 4:24
    And by the way, our special aversion
  • 4:24 - 4:28
    to the risk of truly existential disasters
  • 4:28 - 4:31
    depends on a philosophical and ethical question,
  • 4:31 - 4:32
    and it's this:
  • 4:32 - 4:35
    consider two scenarios.
  • 4:35 - 4:40
    Scenario A wipes out 90 percent of humanity.
  • 4:40 - 4:44
    Scenario B wipes out a hundred percent.
  • 4:44 - 4:47
    How much worse is B than A?
  • 4:47 - 4:50
    Some would say 10 percent worse.
  • 4:50 - 4:53
    The body count is 10 percent higher.
  • 4:53 - 4:56
    But I claim that B is incomparably worse.
  • 4:56 - 4:58
    As an astronomer, I can't believe
  • 4:58 - 5:01
    that humans are the end of the story.
  • 5:01 - 5:04
    It is five billion years before the sun flares up,
  • 5:04 - 5:07
    and the universe may go on forever,
  • 5:07 - 5:09
    so post-human evolution,
  • 5:09 - 5:11
    here on Earth and far beyond,
  • 5:11 - 5:14
    could be as prolonged as the Darwinian process
  • 5:14 - 5:17
    that's led to us, and even more wonderful.
  • 5:17 - 5:20
    And indeed, future evolution
    will happen much faster,
  • 5:20 - 5:22
    on a technological timescale,
  • 5:22 - 5:25
    not a natural selection timescale.
  • 5:25 - 5:28
    So we surely, in view of those immense stakes,
  • 5:28 - 5:32
    shouldn't accept even a one in a billion risk
  • 5:32 - 5:34
    that human extinction would foreclose
  • 5:34 - 5:37
    this immense potential.
  • 5:37 - 5:38
    Some scenarios that have been envisaged
  • 5:38 - 5:40
    may indeed be science fiction,
  • 5:40 - 5:44
    but others may be disquietingly real.
  • 5:44 - 5:46
    It's an important maxim that the unfamiliar
  • 5:46 - 5:49
    is not the same as the improbable,
  • 5:49 - 5:52
    and in fact, that's why we at Cambridge University
  • 5:52 - 5:55
    are setting up a center to study how to mitigate
  • 5:55 - 5:57
    these existential risks.
  • 5:57 - 6:00
    It seems it's worthwhile just for a few people
  • 6:00 - 6:02
    to think about these potential disasters,
  • 6:02 - 6:05
    and we need all the help we can get from others,
  • 6:05 - 6:08
    because we are stewards of a precious
  • 6:08 - 6:11
    pale blue dot in a vast cosmos,
  • 6:11 - 6:14
    a planet with 50 million centuries ahead of it.
  • 6:14 - 6:17
    And so let's not jeopardize that future.
  • 6:17 - 6:19
    And I'd like to finish with a quote
  • 6:19 - 6:22
    from a great scientist called Peter Medawar.
  • 6:22 - 6:26
    I quote, "The bells that toll for mankind
  • 6:26 - 6:28
    are like the bells of Alpine cattle:
  • 6:28 - 6:31
    they are attached to our own necks,
  • 6:31 - 6:33
    and it must be our fault if they do not make
  • 6:33 - 6:36
    a tuneful and melodious sound."
  • 6:36 - 6:38
    Thank you very much.
  • 6:38 - 6:40
    (Applause)
Title:
Can we prevent the end of the world?
Speaker:
Sir Martin Rees
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
06:52
