WEBVTT

00:00:00.485 --> 00:00:02.707
Ten years ago, I wrote a book which I entitled

00:00:02.707 --> 00:00:05.800
"Our Final Century?" Question mark.

00:00:05.800 --> 00:00:09.377
My publishers cut out the question mark. (Laughter)

00:00:09.377 --> 00:00:11.259
The American publishers changed our title

00:00:11.259 --> 00:00:15.168
to "Our Final Hour."

00:00:15.168 --> 00:00:18.660
Americans like instant gratification and the reverse.

00:00:18.660 --> 00:00:20.368
(Laughter)

NOTE Paragraph

00:00:20.368 --> 00:00:22.118
And my theme was this:

00:00:22.118 --> 00:00:26.284
Our Earth has existed for 45 million centuries,

00:00:26.284 --> 00:00:28.297
but this one is special —

00:00:28.297 --> 00:00:31.313
it's the first where one species, ours,

00:00:31.313 --> 00:00:34.115
has the planet's future in its hands.

00:00:34.115 --> 00:00:36.105
Over nearly all of Earth's history,

00:00:36.105 --> 00:00:38.041
threats have come from nature —

00:00:38.041 --> 00:00:41.537
disease, earthquakes, asteroids and so forth —

00:00:41.537 --> 00:00:47.209
but from now on, the worst dangers come from us.

00:00:47.209 --> 00:00:50.480
And it's now not just the nuclear threat;

00:00:50.480 --> 00:00:52.231
in our interconnected world,

00:00:52.231 --> 00:00:55.394
network breakdowns can cascade globally;

00:00:55.394 --> 00:00:59.350
air travel can spread pandemics worldwide within days;

00:00:59.350 --> 00:01:02.677
and social media can spread panic and rumor

00:01:02.677 --> 00:01:05.894
literally at the speed of light.

00:01:05.894 --> 00:01:09.119
We fret too much about minor hazards —

00:01:09.119 --> 00:01:13.150
improbable air crashes, carcinogens in food,

00:01:13.150 --> 00:01:15.376
low radiation doses, and so forth —

00:01:15.376 --> 00:01:18.201
but we and our political masters

00:01:18.201 --> 00:01:22.404
are in denial about catastrophic scenarios.

00:01:22.404 --> 00:01:25.442
The worst have thankfully not yet happened.

00:01:25.442 --> 00:01:27.638
Indeed, they probably won't.

00:01:27.638 --> 00:01:30.823
But if an event is potentially devastating,

00:01:30.823 --> 00:01:33.691
it's worth paying a substantial premium

00:01:33.691 --> 00:01:37.527
to safeguard against it, even if it's unlikely,

00:01:37.527 --> 00:01:42.040
just as we take out fire insurance on our house.

NOTE Paragraph

00:01:42.040 --> 00:01:47.037
And as science offers greater power and promise,

00:01:47.037 --> 00:01:50.903
the downside gets scarier too.

00:01:50.903 --> 00:01:53.142
We get ever more vulnerable.

00:01:53.142 --> 00:01:54.980
Within a few decades,

00:01:54.980 --> 00:01:57.210
millions will have the capability

00:01:57.210 --> 00:02:00.331
to misuse rapidly advancing biotech,

00:02:00.331 --> 00:02:03.884
just as they misuse cybertech today.

00:02:03.884 --> 00:02:07.083
Freeman Dyson, in a TED Talk,

00:02:07.083 --> 00:02:10.679
foresaw that children will design and create new organisms

00:02:10.679 --> 00:02:15.190
just as routinely as his generation played with chemistry sets.

00:02:15.190 --> 00:02:17.718
Well, this may be on the science fiction fringe,

00:02:17.718 --> 00:02:20.901
but were even part of his scenario to come about,

00:02:20.901 --> 00:02:23.638
our ecology and even our species

00:02:23.638 --> 00:02:27.627
would surely not survive long unscathed.

00:02:27.627 --> 00:02:31.490
For instance, there are some eco-extremists

00:02:31.490 --> 00:02:33.999
who think that it would be better for the planet,

00:02:33.999 --> 00:02:37.402
for Gaia, if there were far fewer humans.
00:02:37.402 --> 00:02:40.119
What happens when such people have mastered

00:02:40.119 --> 00:02:42.256
synthetic biology techniques

00:02:42.256 --> 00:02:45.108
that will be widespread by 2050?

00:02:45.108 --> 00:02:48.150
And by then, other science fiction nightmares

00:02:48.150 --> 00:02:49.860
may transition to reality:

00:02:49.860 --> 00:02:51.930
dumb robots going rogue,

00:02:51.930 --> 00:02:54.347
or a network that develops a mind of its own

00:02:54.347 --> 00:02:56.936
threatens us all.

NOTE Paragraph

00:02:56.936 --> 00:03:00.206
Well, can we guard against such risks by regulation?

00:03:00.206 --> 00:03:02.613
We must surely try, but these enterprises

00:03:02.613 --> 00:03:06.142
are so competitive, so globalized,

00:03:06.142 --> 00:03:08.122
and so driven by commercial pressure,

00:03:08.122 --> 00:03:11.407
that anything that can be done will be done somewhere,

00:03:11.407 --> 00:03:13.443
whatever the regulations say.

00:03:13.443 --> 00:03:16.930
It's like the drug laws — we try to regulate, but can't.

00:03:16.930 --> 00:03:19.974
And the global village will have its village idiots,

00:03:19.974 --> 00:03:23.470
and they'll have a global range.

NOTE Paragraph

00:03:23.470 --> 00:03:25.761
So as I said in my book,

00:03:25.761 --> 00:03:28.650
we'll have a bumpy ride through this century.

00:03:28.650 --> 00:03:32.140
There may be setbacks to our society —

00:03:32.140 --> 00:03:36.255
indeed, a 50 percent chance of a severe setback.

00:03:36.255 --> 00:03:39.169
But are there conceivable events

00:03:39.169 --> 00:03:41.330
that could be even worse,

00:03:41.330 --> 00:03:44.760
events that could snuff out all life?

00:03:44.760 --> 00:03:47.686
When a new particle accelerator came online,

00:03:47.686 --> 00:03:49.475
some people anxiously asked,

00:03:49.475 --> 00:03:51.725
could it destroy the Earth or, even worse,

00:03:51.725 --> 00:03:54.384
rip apart the fabric of space?

00:03:54.384 --> 00:03:57.927
Well luckily, reassurance could be offered.

00:03:57.927 --> 00:03:59.971
I and others pointed out that nature

00:03:59.971 --> 00:04:01.904
has done the same experiments

00:04:01.904 --> 00:04:04.090
zillions of times already,

00:04:04.090 --> 00:04:05.855
via cosmic ray collisions.

00:04:05.855 --> 00:04:08.909
But scientists should surely be precautionary

00:04:08.909 --> 00:04:11.489
about experiments that generate conditions

00:04:11.489 --> 00:04:13.972
without precedent in the natural world.

00:04:13.972 --> 00:04:17.395
Biologists should avoid release of potentially devastating

00:04:17.395 --> 00:04:20.110
genetically modified pathogens.

NOTE Paragraph

00:04:20.110 --> 00:04:23.627
And by the way, our special aversion

00:04:23.627 --> 00:04:27.088
to the risk of truly existential disasters

00:04:27.088 --> 00:04:30.363
depends on a philosophical and ethical question,

00:04:30.363 --> 00:04:32.033
and it's this:

00:04:32.033 --> 00:04:34.341
Consider two scenarios.

00:04:34.341 --> 00:04:39.577
Scenario A wipes out 90 percent of humanity.

00:04:39.577 --> 00:04:43.473
Scenario B wipes out 100 percent.

00:04:43.473 --> 00:04:46.391
How much worse is B than A?

00:04:46.391 --> 00:04:49.414
Some would say 10 percent worse.

00:04:49.414 --> 00:04:52.564
The body count is 10 percent higher.

00:04:52.564 --> 00:04:55.470
But I claim that B is incomparably worse.

00:04:55.470 --> 00:04:58.099
As an astronomer, I can't believe

00:04:58.099 --> 00:05:00.566
that humans are the end of the story.
00:05:00.566 --> 00:05:03.889
There are five billion years before the sun flares up,

00:05:03.889 --> 00:05:06.600
and the universe may go on forever,

00:05:06.600 --> 00:05:08.892
so post-human evolution,

00:05:08.892 --> 00:05:11.082
here on Earth and far beyond,

00:05:11.082 --> 00:05:13.796
could be as prolonged as the Darwinian process

00:05:13.796 --> 00:05:17.077
that's led to us, and even more wonderful.

00:05:17.077 --> 00:05:19.741
And indeed, future evolution will happen much faster,

00:05:19.741 --> 00:05:21.940
on a technological timescale,

00:05:21.940 --> 00:05:24.239
not a natural selection timescale.

NOTE Paragraph

00:05:24.239 --> 00:05:28.434
So we surely, in view of those immense stakes,

00:05:28.434 --> 00:05:31.820
shouldn't accept even a one in a billion risk

00:05:31.820 --> 00:05:34.049
that human extinction would foreclose

00:05:34.049 --> 00:05:36.359
this immense potential.

00:05:36.359 --> 00:05:38.131
Some scenarios that have been envisaged

00:05:38.131 --> 00:05:39.950
may indeed be science fiction,

00:05:39.950 --> 00:05:43.336
but others may be disquietingly real.

00:05:43.336 --> 00:05:46.210
It's an important maxim that the unfamiliar

00:05:46.210 --> 00:05:48.907
is not the same as the improbable,

00:05:48.907 --> 00:05:51.305
and in fact, that's why we at Cambridge University

00:05:51.305 --> 00:05:54.680
are setting up a center to study how to mitigate

00:05:54.680 --> 00:05:56.712
these existential risks.

00:05:56.712 --> 00:05:59.775
It seems it's worthwhile just for a few people

00:05:59.775 --> 00:06:02.091
to think about these potential disasters.

00:06:02.091 --> 00:06:05.104
And we need all the help we can get from others,

00:06:05.104 --> 00:06:07.583
because we are stewards of a precious

00:06:07.583 --> 00:06:11.066
pale blue dot in a vast cosmos,

00:06:11.066 --> 00:06:14.444
a planet with 50 million centuries ahead of it.

00:06:14.444 --> 00:06:17.000
And so let's not jeopardize that future.

NOTE Paragraph

00:06:17.000 --> 00:06:18.795
And I'd like to finish with a quote

00:06:18.795 --> 00:06:22.296
from a great scientist called Peter Medawar.

00:06:22.296 --> 00:06:25.569
I quote, "The bells that toll for mankind

00:06:25.569 --> 00:06:28.213
are like the bells of Alpine cattle.

00:06:28.213 --> 00:06:30.499
They are attached to our own necks,

00:06:30.499 --> 00:06:33.174
and it must be our fault if they do not make

00:06:33.174 --> 00:06:35.305
a tuneful and melodious sound."

NOTE Paragraph

00:06:35.305 --> 00:06:37.572
Thank you very much.

NOTE Paragraph

00:06:37.572 --> 00:06:39.685
(Applause)