1
00:00:00,365 --> 00:00:01,747
Ten years ago, I wrote a book

2
00:00:01,747 --> 00:00:05,240
called "Our Final Century?" Question mark.

3
00:00:05,840 --> 00:00:08,550
My publishers cut out the question mark.

4
00:00:08,550 --> 00:00:09,727
(Laughter)

5
00:00:09,727 --> 00:00:11,919
The American publishers changed the title to

6
00:00:11,919 --> 00:00:13,509
"Our Final Hour."

7
00:00:13,749 --> 00:00:15,278
(Laughter)

8
00:00:15,278 --> 00:00:18,200
Americans like instant gratification and the reverse.

9
00:00:18,200 --> 00:00:20,368
(Laughter)

10
00:00:20,368 --> 00:00:22,118
My theme was this:

11
00:00:22,118 --> 00:00:26,284
our Earth has existed for 4.5 billion years,

12
00:00:26,284 --> 00:00:28,323
but the past hundred years have been special.

13
00:00:28,323 --> 00:00:31,313
For the first time in that history, one species, ours,

14
00:00:31,313 --> 00:00:34,115
holds the planet's future in its own hands.

15
00:00:34,115 --> 00:00:36,105
Over most of Earth's history,

16
00:00:36,105 --> 00:00:38,041
threats came overwhelmingly from nature:

17
00:00:38,041 --> 00:00:41,537
disease, earthquakes, asteroid impacts and so on.

18
00:00:41,537 --> 00:00:47,129
But from now on, the greatest threats come from ourselves.

19
00:00:47,129 --> 00:00:50,480
And now it is not just the nuclear threat:

20
00:00:50,480 --> 00:00:52,231
in our globalized world,

21
00:00:52,231 --> 00:00:55,394
a network breakdown can paralyse the entire globe,

22
00:00:55,394 --> 00:00:59,350
air travel can spread a pandemic worldwide within days,

23
00:00:59,350 --> 00:01:02,857
and social media can spread panic and rumour

24
00:01:02,857 --> 00:01:04,904
at something close to the speed of light.

25
00:01:05,894 --> 00:01:08,489
We fret over minor hazards:

26
00:01:09,359 --> 00:01:13,000
improbable air crashes, carcinogens in food,

27
00:01:13,000 --> 00:01:15,686
low-level radiation and the like.

28
00:01:15,686 --> 00:01:21,631
But we and our political leaders refuse to face catastrophic scenarios.

29
00:01:22,404 --> 00:01:25,442
Fortunately, the worst has not yet happened.

30
00:01:25,442 --> 00:01:27,638
Indeed, it probably never will.

31
00:01:27,638 --> 00:01:30,823
But if an event is potentially devastating,

32
00:01:30,823 --> 00:01:33,691
it is worth a sustained outlay

33
00:01:33,691 --> 00:01:37,357
to guard against it, however improbable it may be,

34
00:01:37,357 --> 00:01:40,140
just as we take out fire insurance on our house.

35
00:01:42,040 --> 00:01:44,037
The greater the power, the greater the peril:

36
00:01:44,037 --> 00:01:47,467
even as science gives us more power and more promise,

37
00:01:47,467 --> 00:01:50,903
it can also be very dangerous,

38
00:01:50,903 --> 00:01:53,142
and so we become ever more vulnerable.

39
00:01:53,142 --> 00:01:57,000
Within a few decades, millions of people will be able

40
00:01:57,000 --> 00:02:00,331
to misuse rapidly advancing biotechnology,

41
00:02:00,331 --> 00:02:03,094
just as people today misuse cyber technology.

42
00:02:04,064 --> 00:02:07,083
In a TED talk, Freeman Dyson

43
00:02:07,083 --> 00:02:10,679
foresaw children designing and creating new organisms

44
00:02:10,679 --> 00:02:14,570
as casually as his generation played with chemistry sets.

45
00:02:15,190 --> 00:02:17,718
Of course, that may live only in science fiction,

46
00:02:17,718 --> 00:02:20,901
but if even part of the scenario he foresaw came about,

47
00:02:20,901 --> 00:02:26,248
our ecology, and even our species, would surely not survive long unscathed.

48
00:02:27,627 --> 00:02:31,400
For instance, there are some eco-extremists

49
00:02:31,400 --> 00:02:33,389
who think, for the sake of Gaia,

50
00:02:33,389 --> 00:02:37,322
that the fewer humans there are, the better it is for our planet.

51
00:02:37,322 --> 00:02:42,259
If such people master the synthetic biology techniques that will be in widespread use worldwide by 2050,

52
00:02:42,259 --> 00:02:45,108
what happens then?

53
00:02:45,108 --> 00:02:46,108
By then,

54
00:02:46,108 --> 00:02:48,150
other nightmares from science fiction

55
00:02:48,150 --> 00:02:49,860
may have become reality:

56
00:02:49,860 --> 00:02:51,930
dumb robots going berserk,

57
00:02:51,930 --> 00:02:55,547
or a network that develops a mind of its own and threatens us all.

58
00:02:56,936 --> 00:02:59,726
So, can we guard against these risks through regulation?

59
00:03:00,206 --> 00:03:01,703
We must certainly try,

60
00:03:01,703 --> 00:03:06,142
but these enterprises are so competitive, with business interests worldwide

61
00:03:06,142 --> 00:03:08,122
and so driven by the market,

62
00:03:08,122 --> 00:03:09,453
that whatever the law says,

63
00:03:09,453 --> 00:03:13,443
anything that can be done will simply be done somewhere else.

64
00:03:13,443 --> 00:03:16,860
It is like the drug laws: we try to regulate, but we keep failing.

65
00:03:16,860 --> 00:03:22,414
And the global village will have its village idiots, spread across the whole world.

66
00:03:23,850 --> 00:03:25,761
So, as I say in my book,

67
00:03:25,761 --> 00:03:28,650
we will have a bumpy ride through this century.

68
00:03:28,650 --> 00:03:31,240
Our society may suffer setbacks;

69
00:03:31,930 --> 00:03:35,875
in fact, there is a fifty-fifty chance that we face a serious one.

70
00:03:36,495 --> 00:03:39,169
But are there conceivable events

71
00:03:39,169 --> 00:03:43,920
that would be even worse, events that would wipe out every species?

72
00:03:45,000 --> 00:03:47,566
When news of a new particle accelerator appeared online,

73
00:03:47,566 --> 00:03:49,295
some people anxiously asked

74
00:03:49,295 --> 00:03:50,555
whether it could destroy the Earth,

75
00:03:50,555 --> 00:03:54,394
or, even worse, rip apart the fabric of space.
76
00:03:54,394 --> 00:03:57,377
Thankfully, we are still safe.

77
00:03:58,197 --> 00:03:59,611
I and others have pointed out

78
00:03:59,611 --> 00:04:05,814
that nature has already run the same experiments countless times, through cosmic-ray collisions.

79
00:04:06,244 --> 00:04:08,159
But scientists, in their research,

80
00:04:08,159 --> 00:04:13,599
must take care not to create conditions that have never occurred anywhere in the universe.

81
00:04:14,202 --> 00:04:16,635
Biologists should avoid releasing

82
00:04:16,635 --> 00:04:20,380
genetically modified pathogens with potentially devastating consequences.

83
00:04:20,380 --> 00:04:27,427
Incidentally, our special aversion to truly catastrophic disasters

84
00:04:27,427 --> 00:04:30,363
stems from a question of philosophy and ethics,

85
00:04:30,363 --> 00:04:31,923
and the question is this:

86
00:04:31,923 --> 00:04:34,341
imagine two scenarios.

87
00:04:34,341 --> 00:04:39,577
Scenario One wipes out 90 percent of humanity.

88
00:04:39,577 --> 00:04:43,473
Scenario Two wipes out all of humanity.

89
00:04:43,473 --> 00:04:46,391
How much worse is Scenario Two than Scenario One?

90
00:04:46,391 --> 00:04:49,414
Some would say ten percent worse:

91
00:04:49,414 --> 00:04:51,874
the death toll is ten percent higher.

92
00:04:52,564 --> 00:04:55,470
But I would say Scenario Two is incomparably worse.

93
00:04:55,470 --> 00:04:57,299
As an astronomer,

94
00:04:57,299 --> 00:05:00,496
I cannot believe that humans are merely the final chapter of Earth's whole story.

95
00:05:00,976 --> 00:05:02,326
It will be another five billion years

96
00:05:02,326 --> 00:05:04,329
before the Sun flares up and enters its decline,

97
00:05:04,329 --> 00:05:06,600
and the universe will carry on beyond that.

98
00:05:06,600 --> 00:05:09,052
So post-human evolution,

99
00:05:09,052 --> 00:05:11,372
here on Earth and far beyond,

100
00:05:11,372 --> 00:05:16,606
could be as prolonged as the Darwinian process that led to us, and even more wonderful.

101
00:05:17,117 --> 00:05:20,181
Indeed, future evolution will happen much faster,

102
00:05:20,181 --> 00:05:21,940
on a technological timescale,

103
00:05:21,940 --> 00:05:24,239
not a natural-selection timescale.

104
00:05:24,239 --> 00:05:28,434
So, with stakes as high as this,

105
00:05:28,434 --> 00:05:31,820
we should not accept even a one-in-a-billion risk,

106
00:05:31,820 --> 00:05:35,469
because we must bear in mind the possibility of human extinction.

107
00:05:36,729 --> 00:05:39,941
Some of the scenarios that have been raised may indeed come straight from science fiction,

108
00:05:39,950 --> 00:05:42,686
but others may prove disturbingly real.

109
00:05:43,336 --> 00:05:46,180
There is an important maxim:

110
00:05:46,180 --> 00:05:48,397
the unknown is not the same as the impossible.

111
00:05:48,937 --> 00:05:50,025
In fact, that is why

112
00:05:50,025 --> 00:05:53,030
we have set up a centre at Cambridge University

113
00:05:53,030 --> 00:05:57,212
to study how to reduce these existential risks.

114
00:05:57,212 --> 00:06:02,415
It seems well worthwhile for a few people to think about these potential catastrophes,

115
00:06:02,415 --> 00:06:04,604
but we need many more hands to help,

116
00:06:05,104 --> 00:06:11,083
because we are the stewards of a blue planet in a vast universe,

117
00:06:11,083 --> 00:06:14,444
a planet that had already existed for billions of years before humans appeared.

118
00:06:14,444 --> 00:06:17,000
Let us not squander this planet's future.

119
00:06:17,000 --> 00:06:22,335
I would like to close with a line from the famous scientist Peter Medawar:

120
00:06:22,335 --> 00:06:25,569
"The bells that toll for mankind

121
00:06:25,569 --> 00:06:28,213
are like the bells of Alpine cattle;

122
00:06:28,213 --> 00:06:30,679
they are attached to our own necks,

123
00:06:30,679 --> 00:06:33,174
and if they do not make a tuneful and melodious sound,

124
00:06:33,174 --> 00:06:35,305
it must surely be our own fault."

125
00:06:35,305 --> 00:06:36,332
Thank you.

126
00:06:36,332 --> 00:06:38,745
(Applause)