So earlier this year, I was informed that I would be doing a TED Talk. So I was excited, then I panicked, then I was excited, then I panicked, and in between the excitement and the panicking, I started to do my research, and my research primarily consisted of Googling how to give a great TED Talk.

(Laughter)

And interspersed with that, I was Googling Chimamanda Ngozi Adichie. How many of you know who that is?

(Cheers)

So I was Googling her because I always Google her because I'm just a fan, but also because she always has important and interesting things to say. And the combination of those searches kept leading me to her talk on the dangers of a single story, on what happens when we have a solitary lens through which to understand certain groups of people, and it is the perfect talk.

It's the talk that I would have given if I had been famous first.

(Laughter)

You know, and you know, like, she's African and I'm African, and she's a feminist and I'm a feminist, and she's a storyteller and I'm a storyteller, so I really felt like it's my talk.

(Laughter)

So I decided that I was going to learn how to code, and then I was going to hack the internet and I would take down all the copies of that talk that existed, and then I would memorize it, and then I would come here and deliver it as if it was my own speech. So that plan was going really well, except the coding part, and then one morning a few months ago, I woke up to the news that the wife of a certain presidential candidate had given a speech that --

(Laughter)

(Applause)

that sounded eerily like a speech given by one of my other faves, Michelle Obama.

(Cheers)

And so I decided that I should probably write my own TED Talk, and so that is what I am here to do. I'm here to talk about my own observations about storytelling.
I want to talk to you about the power of stories, of course, but I also want to talk about their limitations, particularly for those of us who are interested in social justice.

So since Adichie gave that talk seven years ago, there has been a boom in storytelling. Stories are everywhere, and if there was a danger in the telling of one tired old tale, then I think there has got to be lots to celebrate about the flourishing of so many stories and so many voices. Stories are the antidote to bias.

In fact, today, if you are middle class and connected via the internet, you can download stories at the touch of a button or the swipe of a screen. You can listen to a podcast about what it's like to grow up Dalit in Kolkata. You can hear an indigenous man in Australia talk about the trials and triumphs of raising his children in dignity and in pride. Stories make us fall in love. They heal rifts and they bridge divides. Stories can even make it easier for us to talk about the deaths of people in our societies who don't matter, because they make us care. Right?

I'm not so sure, and I actually work for a place called the Centre for Stories. And my job is to help to tell stories that challenge mainstream narratives about what it means to be black or a Muslim or a refugee or any of those other categories that we talk about all the time. But I come to this work after a long history as a social justice activist, and so I'm really interested in the ways that people talk about nonfiction storytelling as though it's about more than entertainment, as though it's about being a catalyst for social action. It's not uncommon to hear people say that stories make the world a better place.

Increasingly, though, I worry that even the most poignant stories, particularly the stories about people who no one seems to care about, can often get in the way of action towards social justice. Now, this is not because storytellers mean any harm. Quite the contrary. Storytellers are often do-gooders like me and, I suspect, yourselves.
And the audiences of storytellers are often deeply compassionate and empathetic people. Still, good intentions can have unintended consequences, and so I want to propose that stories are not as magical as they seem. So three -- because it's always got to be three -- three reasons why I think that stories don't necessarily make the world a better place.

Firstly, stories can create an illusion of solidarity. There is nothing like that feel-good factor you get from listening to a fantastic story where you feel like you climbed that mountain, right, or that you befriended that death row inmate. But you didn't. You haven't done anything. Listening is an important but insufficient step towards social action.

Secondly, I think often we are drawn towards characters and protagonists who are likable and human. And this makes sense, of course, right? Because if you like someone, then you care about them. But the inverse is also true. If you don't like someone, then you don't care about them. And if you don't care about them, you don't have to see yourself as having a moral obligation to think about the circumstances that shaped their lives.

I learned this lesson when I was 14 years old. I learned that actually, you don't have to like someone to recognize their wisdom, and you certainly don't have to like someone to take a stand by their side. So my bike was stolen while I was riding it --

(Laughter)

which is possible if you're riding slowly enough, which I was.

(Laughter)

So one minute I'm cutting across this field in the Nairobi neighborhood where I grew up, and it's like a very bumpy path, and so when you're riding a bike, you don't want to be like, you know --

(Laughter)

And so I'm going like this, slowly pedaling, and all of a sudden, I'm on the floor.
I'm on the ground, and I look up, and there's this kid pedaling away in the getaway vehicle, which is my bike, and he's about 11 or 12 years old, and I'm on the floor, and I'm crying because I saved a lot of money for that bike, and I'm crying and I stand up and I start screaming. Instinct steps in, and I start screaming, "Mwizi, mwizi!" which means "thief" in Swahili.

And out of the woodwork, all of these people come out and they start to give chase. This is Africa, so mob justice in action. Right? And I round the corner, and they've captured him, they've caught him. The suspect has been apprehended, and they make him give me my bike back, and they also make him apologize. Again, you know, typical African justice, right? And so they make him say sorry. And so we stand there facing each other, and he looks at me, and he says sorry, but he looks at me with this unbridled fury. He is very, very angry.

And it is the first time that I have been confronted with someone who doesn't like me simply because of what I represent. He looks at me with this look as if to say, "You, with your shiny skin and your bike, you're angry at me?"

So it was a hard lesson that he didn't like me, but you know what, he was right. I was a middle-class kid living in a poor country. I had a bike, and he barely had food. Sometimes, it's the messages that we don't want to hear, the ones that make us want to crawl out of ourselves, that we need to hear the most. For every lovable storyteller who steals your heart, there are hundreds more whose voices are slurred and ragged, who don't get to stand up on a stage dressed in fine clothes like this. There are a million angry-boy-on-a-bike stories and we can't afford to ignore them simply because we don't like their protagonists or because that's not the kid that we would bring home with us from the orphanage.
The third reason that I think that stories don't necessarily make the world a better place is that too often we are so invested in the personal narrative that we forget to look at the bigger picture. And so we applaud someone when they tell us about their feelings of shame, but we don't necessarily link that to oppression. We nod understandingly when someone says they felt small, but we don't link that to discrimination. The most important stories, especially for social justice, are those that do both, that are both personal and allow us to explore and understand the political.

But it's not just about the stories we like versus the stories we choose to ignore. Increasingly, we are living in a society where there are larger forces at play, where stories are actually for many people beginning to replace the news. Yeah? We live in a time where we are witnessing the decline of facts, when emotions rule and analysis, it's kind of boring, right? Where we value what we feel more than what we actually know.

A recent report by the Pew Center on trends in America indicates that only 10 percent of young adults under the age of 30 "place a lot of trust in the media." Now, this is significant. It means that storytellers are gaining trust at precisely the same moment that many in the media are losing the confidence of the public.

This is not a good thing, because while stories are important and they help us to have insights in many ways, we need the media. From my years as a social justice activist, I know very well that we need credible facts from media institutions combined with the powerful voices of storytellers. That's what pushes the needle forward in terms of social justice.

In the final analysis, of course, it is justice that makes the world a better place, not stories. Right? And so if it is justice that we are after, then I think we mustn't focus on the media or on storytellers.
We must focus on audiences, on anyone who has ever turned on a radio or listened to a podcast, and that means all of us. So a few concluding thoughts on what audiences can do to make the world a better place.

So firstly, the world would be a better place, I think, if audiences were more curious and more skeptical and asked more questions about the social context that created those stories that they love so much.

Secondly, the world would be a better place if audiences recognized that storytelling is intellectual work. And I think it would be important for audiences to demand more buttons on their favorite websites, buttons for example that say, "If you liked this story, click here to support a cause your storyteller believes in." Or "click here to contribute to your storyteller's next big idea." Often, we are committed to the platforms, but not necessarily to the storytellers themselves.

And then lastly, I think that audiences can make the world a better place by switching off their phones, by stepping away from their screens and stepping out into the real world beyond what feels safe.

Alice Walker has said, "Look closely at the present you are constructing. It should look like the future you are dreaming." Storytellers can help us to dream, but it's up to all of us to have a plan for justice.

Thank you.

(Applause)