Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification, and the reverse. (Laughter)

And my theme was this: Our Earth has existed for 45 million centuries, but this one is special. It's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature: disease, earthquakes, asteroids and so forth. But from now on, the worst dangers come from us. And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light.

We fret too much about minor hazards: improbable air crashes, carcinogens in food, low radiation doses, and so forth. But we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.

And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed.

For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050?
And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own and threatens us all.

Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws: we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society; indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life? When a new particle accelerator came online, some people anxiously asked: could it destroy the Earth or, even worse, rip apart the fabric of space? Well, luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale.
So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential. Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote: "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."

Thank you very much.

(Applause)