Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)

And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us.

And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light.

We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.

And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed.
For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own and threatens us all.

Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life?

When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse.
The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale. So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential.

Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."

Thank you very much.

(Applause)