So we humans have an extraordinary potential for goodness, but also an immense power to do harm. Any tool can be used to build or to destroy. That all depends on our motivation. Therefore, it is all the more important to foster an altruistic motivation rather than a selfish one.

So now we indeed are facing many challenges in our times. Those could be personal challenges. Our own mind can be our best friend or our worst enemy. There's also societal challenges: poverty in the midst of plenty, inequalities, conflict, injustice. And then there are the new challenges, which we don't expect. Ten thousand years ago, there were about five million human beings on Earth. Whatever they could do, the Earth's resilience would soon heal human activities. After the Industrial and Technological Revolutions, that's not the same anymore. We are now the major agent of impact on our Earth. We enter the Anthropocene, the era of human beings.

So in a way, if we were to say we need to continue this endless growth, endless use of material resources, it's like if this man was saying -- and I heard a former head of state, I won't mention who, saying -- "Five years ago, we were at the edge of the precipice. Today we made a big step forward."

So this edge is the same that has been defined by scientists as the planetary boundaries. And within those boundaries, they can carry a number of factors. We can still prosper, humanity can still prosper for 150,000 years if we keep the same stability of climate as in the Holocene for the last 10,000 years. But this depends on choosing a voluntary simplicity, growing qualitatively, not quantitatively.

So in 1900, as you can see, we were well within the limits of safety. Now, in 1950 came the great acceleration. Now hold your breath, not too long, to imagine what comes next. Now we have vastly overrun some of the planetary boundaries. Just to take biodiversity, at the current rate, by 2050, 30 percent of all species on Earth will have disappeared. Even if we keep their DNA in some fridge, that's not going to be reversible.

So here I am sitting in front of a 7,000-meter-high, 21,000-foot glacier in Bhutan.
At the Third Pole, 2,000 glaciers are melting fast, faster than the Arctic. So what can we do in that situation?

Well, however complex politically, economically, scientifically the question of the environment is, it simply boils down to a question of altruism versus selfishness. I'm a Marxist of the Groucho tendency.

(Laughter)

Groucho Marx said, "Why should I care about future generations? What have they ever done for me?"

(Laughter)

Unfortunately, I heard the billionaire Steve Forbes, on Fox News, saying exactly the same thing, but seriously. He was told about the rise of the ocean, and he said, "I find it absurd to change my behavior today for something that will happen in a hundred years." So if you don't care for future generations, just go for it.

So one of the main challenges of our times is to reconcile three time scales: the short term of the economy, the ups and downs of the stock market, the end-of-the-year accounts; the midterm of the quality of life -- what is the quality every moment of our life, over 10 years and 20 years? -- and the long term of the environment. When the environmentalists speak with economists, it's like a schizophrenic dialogue, completely incoherent. They don't speak the same language.

Now, for the last 10 years, I went around the world meeting economists, scientists, neuroscientists, environmentalists, philosophers, thinkers in the Himalayas, all over the place. It seems to me, there's only one concept that can reconcile those three time scales. It is simply having more consideration for others. If you have more consideration for others, you will have a caring economics, where finance is at the service of society and not society at the service of finance. You will not play at the casino with the resources that people have entrusted you with. If you have more consideration for others, you will make sure that you remedy inequality, that you bring some kind of well-being within society, in education, at the workplace. Otherwise, a nation that is the most powerful and the richest but everyone is miserable, what's the point?
And if you have more consideration for others, you are not going to ransack that planet that we have, and at the current rate, we don't have three planets to continue that way.

So the question is, okay, altruism is the answer, it's not just a novel ideal, but can it be a real, pragmatic solution? And first of all, does it exist, true altruism, or are we so selfish? So some philosophers thought we were irredeemably selfish. But are we really all just like rascals? That's good news, isn't it? Many philosophers, like Hobbes, have said so. But not everyone looks like a rascal. Or is man a wolf for man? But this guy doesn't seem too bad. He's one of my friends in Tibet. He's very kind.

So now, we love cooperation. There's no better joy than working together, is there? And then not only humans. Then, of course, there's the struggle for life, the survival of the fittest, social Darwinism. But in evolution, cooperation -- though competition exists, of course -- cooperation has to be much more creative to go to increased levels of complexity. We are super-cooperators and we should even go further.

So now, on top of that, the quality of human relationships. The OECD did a survey among 10 factors, including income, everything. The first one that people said, that's the main thing for my happiness, is quality of social relationships. Not only in humans. And look at those great-grandmothers.

So now, this idea that if we go deep within, we are irredeemably selfish, this is armchair science. There is not a single sociological study, psychological study, that's ever shown that. Rather, the opposite. My friend, Daniel Batson, spent a whole life putting people in the lab in very complex situations. And of course we are sometimes selfish, and some people more than others. But he found that systematically, no matter what, there's a significant number of people who do behave altruistically, no matter what.
If you see someone deeply wounded, great suffering, you might just help out of empathic distress -- you can't stand it, so it's better to help than to keep on looking at that person. So we tested all that, and in the end, he said, clearly people can be altruistic. So that's good news.

And even further, we should look at the banality of goodness. Now look at here. When we come out, we aren't going to say, "That's so nice. There was no fistfight while this mob was thinking about altruism." No, that's expected, isn't it? If there was a fistfight, we would speak of that for months. So the banality of goodness is something that doesn't attract your attention, but it exists. Now, look at this.

So some psychologists said, when I tell them I run 140 humanitarian projects in the Himalayas that give me so much joy, they said, "Oh, I see, you work for the warm glow. That is not altruistic. You just feel good." You think this guy, when he jumped in front of the train, he thought, "I'm going to feel so good when this is over?"

(Laughter)

But that's not the end of it. They say, well, but when you interviewed him, he said, "I had no choice. I had to jump, of course." He has no choice. Automatic behavior. It's neither selfish nor altruistic. No choice? Well of course, this guy's not going to think for half an hour, "Should I give my hand? Not give my hand?" He does it. There is a choice, but it's obvious, it's immediate. And then, also, there he had a choice.

(Laughter)

There are people who had choice, like Pastor André Trocmé and his wife, and the whole village of Le Chambon-sur-Lignon in France. For the whole Second World War, they saved 3,500 Jews, gave them shelter, brought them to Switzerland, against all odds, at the risk of their lives and those of their family. So altruism does exist.

So what is altruism? It is the wish: May others be happy and find the cause of happiness. Now, empathy is the affective resonance or cognitive resonance that tells you, this person is joyful, this person suffers. But empathy alone is not sufficient.
If you keep on being confronted with suffering, you might have empathic distress, burnout, so you need the greater sphere of loving-kindness. With Tania Singer at the Max Planck Institute of Leipzig, we showed that the brain networks for empathy and loving-kindness are different. Now, that's all well done, so we got that from evolution, from maternal care, parental love, but we need to extend that. It can be extended even to other species.

Now, if we want a more altruistic society, we need two things: individual change and societal change. So is individual change possible? Two thousand years of contemplative study said yes, it is. Now, 15 years of collaboration with neuroscience and epigenetics said yes, our brains change when you train in altruism. So I spent 120 hours in an MRI machine. This is the first time I went after two and a half hours. And then the result has been published in many scientific papers. It shows without ambiguity that there is structural change and functional change in the brain when you train the altruistic love.

Just to give you an idea: this is the meditator at rest on the left, meditator in compassion meditation, you see all the activity, and then the control group at rest, nothing happened, in meditation, nothing happened. They have not been trained. So do you need 50,000 hours of meditation? No, you don't. Four weeks, 20 minutes a day, of caring, mindfulness meditation already brings a structural change in the brain compared to a control group. That's only 20 minutes a day for four weeks.

Even with preschoolers -- Richard Davidson did that in Madison. An eight-week program: gratitude, loving-kindness, cooperation, mindful breathing. You would say, "Oh, they're just preschoolers." Look after eight weeks, the pro-social behavior, that's the blue line. And then comes the ultimate scientific test, the stickers test. Before, you determine for each child who is their best friend in the class, their least favorite child, an unknown child, and the sick child, and they have to give stickers away. So before the intervention, they give most of it to their best friend.
Four, five years old, 20 minutes three times a week. After the intervention, no more discrimination: the same amount of stickers to their best friend and the least favorite child. That's something we should do in all the schools in the world. Now where do we go from there?

(Applause)

When the Dalai Lama heard that, he told Richard Davidson, "You go to 10 schools, 100 schools, the U.N., the whole world." So now where do we go from there? Individual change is possible. Now do we have to wait for an altruistic gene to be in the human race? That will take 50,000 years, too much for the environment. Fortunately, there is the evolution of culture. Cultures, as specialists have shown, change faster than genes. That's the good news. Look, attitude towards war has dramatically changed over the years. So now individual change and cultural change mutually fashion each other, and yes, we can achieve a more altruistic society.

So where do we go from there? Myself, I will go back to the East. Now we treat 100,000 patients a year in our projects. We have 25,000 kids in school, four percent overhead. Some people say, "Well, your stuff works in practice, but does it work in theory?" There's always positive deviance. So I will also go back to my hermitage to find the inner resources to better serve others. But on the more global level, what can we do?

We need three things. Enhancing cooperation: cooperative learning in the school instead of competitive learning, unconditional cooperation within corporations -- there can be some competition between corporations, but not within. We need sustainable harmony. I love this term. Not sustainable growth anymore. Sustainable harmony means now we will reduce inequality. In the future, we do more with less, and we continue to grow qualitatively, not quantitatively. We need caring economics. The Homo economicus cannot deal with poverty in the midst of plenty, cannot deal with the problem of the common goods of the atmosphere, of the oceans. We need a caring economics.
If you say economics should be compassionate, they say, "That's not our job." But if you say they don't care, that looks bad. We need local commitment, global responsibility. We need to extend altruism to the other 1.6 million species. Sentient beings are co-citizens in this world, and we need to dare altruism.

So, long live the altruistic revolution. Viva la revolución de altruismo. (Long live the altruism revolution.)

(Applause)

Thank you.

(Applause)