Our emotions influence every aspect of our lives, from our health and how we learn, to how we do business and make decisions, big ones and small. Our emotions also influence how we connect with one another. We've evolved to live in a world like this, but instead, we're living more and more of our lives like this (this is the text message from my daughter last night), in a world that's devoid of emotion. So I'm on a mission to change that. I want to bring emotions back into our digital experiences.

I started on this path 15 years ago. I was a computer scientist in Egypt, and I had just gotten accepted to a Ph.D. program at Cambridge University. So I did something quite unusual for a young newlywed Muslim Egyptian wife: with the support of my husband, who had to stay in Egypt, I packed my bags and I moved to England. At Cambridge, thousands of miles away from home, I realized I was spending more hours with my laptop than I did with any other human. Yet despite this intimacy, my laptop had absolutely no idea how I was feeling. It had no idea if I was happy, having a bad day, stressed, or confused, and that got frustrating.

Even worse, as I communicated online with my family back home, I felt that all my emotions disappeared in cyberspace. I was homesick, I was lonely, and on some days I was actually crying, but all I had to communicate these emotions was this.

(Laughter)

Today's technology has lots of I.Q. but no E.Q.: lots of cognitive intelligence, but no emotional intelligence. So that got me thinking: what if our technology could sense our emotions? What if our devices could sense how we felt and react accordingly, just the way an emotionally intelligent friend would?

Those questions led me and my team to create technologies that can read and respond to our emotions, and our starting point was the human face. Our human face happens to be one of the most powerful channels we all use to communicate social and emotional states, everything from enjoyment and surprise to empathy and curiosity. In emotion science, we call each facial muscle movement an action unit. So for example, action unit 12: it's not a Hollywood blockbuster, it's actually a lip corner pull, which is the main component of a smile. Try it, everybody. Let's get some smiles going on.
Another example is action unit 4, the brow furrow. It's when you draw your eyebrows together and create all these textures and wrinkles. We don't like them, but it's a strong indicator of a negative emotion. We have about 45 of these action units, and they combine to express hundreds of emotions.

Teaching a computer to read these facial expressions of emotion is hard, because action units can be fast, they're subtle, and they combine in many different ways. So take, for example, the smile and the smirk. They look somewhat similar, but they mean very different things.

(Laughter)

The smile is positive; a smirk is often negative. Sometimes a smirk can make you famous. But seriously, it's important for a computer to be able to tell the difference between the two expressions.

So how do we do that? We give our algorithms tens of thousands of examples of people we know to be smiling, from different ethnicities, ages, and genders, and we do the same for smirks. Then, using deep learning, the algorithm looks for all these textures and wrinkles and shape changes on our faces, and it learns that all smiles have common characteristics, while all smirks have subtly different ones. And the next time it sees a new face with the same characteristics as a smile, it says, "Aha, I recognize this. This is a smile expression."

So the best way to demonstrate how this technology works is to try a live demo, so I need a volunteer, preferably somebody with a face.

(Laughter)

Cloe's going to be our volunteer today.

Over the past five years, we've moved from being a research project at MIT to a company, where my team has worked really hard to make this technology work, as we like to say, in the wild. And we've also shrunk it so that the core emotion engine works on any mobile device with a camera, like this iPad. So let's give this a try.

As you can see, the algorithm has essentially found Cloe's face (that's this white bounding box), and it's tracking the main feature points on her face: her eyebrows, her eyes, her mouth and her nose. The question is, can it recognize her expression? So we're going to test the machine. First of all, give me your poker face. Yep, awesome.
(Laughter)

And then as she smiles, and this is a genuine smile, it's great, you can see the green bar go up. Now that was a big smile. Can you try a subtle smile to see if the computer can recognize it? It does recognize subtle smiles as well; we've worked really hard to make that happen. And then: eyebrow raised, an indicator of surprise. Brow furrow, an indicator of confusion. Frown. Yes, perfect. So these are all the different action units; there are many more of them. This is just a slimmed-down demo. We call each reading an emotion data point, and the data points can then fire together to portray different emotions (there's a code sketch of these readings below). So on the right side of the demo: look like you're happy. That's joy; joy fires up. And then give me a disgust face. Try to remember what it was like when Zayn left One Direction.

(Laughter)

Yeah, wrinkle your nose. Awesome. And the valence is actually quite negative, so you must have been a big fan. Valence is how positive or negative an experience is, and engagement is how expressive she is. So imagine if Cloe had access to this real-time emotion stream, and she could share it with anybody she wanted to. Thank you.

(Applause)

So far, we have amassed 12 billion of these emotion data points. It's the largest emotion database in the world. We've collected it from 2.9 million face videos, from people in 75 countries around the world who have agreed to share their emotions with us, and it's growing every day. It blows my mind that we can now quantify something as personal as our emotions, and that we can do it at this scale.

So what have we learned to date? Gender: our data confirms something you might suspect. Women are more expressive than men. Not only do they smile more, their smiles last longer, and we can now really quantify what it is that men and women respond to differently. Let's do culture: in the United States, women are 40 percent more expressive than men, but curiously, we don't see any difference between men and women in the U.K.

(Laughter)

Age: people who are 50 years and older are 25 percent more emotive than younger people. Women in their 20s smile a lot more than men the same age, perhaps a necessity for dating.
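Roughly, one of these emotion data points might look like the following minimal Python sketch, which combines per-frame action-unit intensities into readings like joy, valence, and engagement. The weight table, class, and numbers here are hypothetical illustrations, not our actual engine, which learns these mappings from data rather than using fixed weights.

```python
from dataclasses import dataclass

# Hypothetical weights linking a few of the ~45 action units to emotions.
# AU12 = lip corner pull, AU6 = cheek raise, AU4 = brow furrow,
# AU7 = lid tighten, AU9 = nose wrinkle.
EMOTION_WEIGHTS = {
    "joy":       {12: 1.0, 6: 0.5},   # smile: lip corner pull + cheek raise
    "confusion": {4: 0.8, 7: 0.4},    # brow furrow + lid tighten
    "disgust":   {9: 1.0, 4: 0.3},    # nose wrinkle + brow furrow
}

@dataclass
class EmotionDataPoint:
    """One frame's reading: action-unit intensities in [0, 1], keyed by AU number."""
    au_intensities: dict  # e.g. {12: 0.9, 6: 0.4}

    def score(self, emotion: str) -> float:
        """Weighted, normalized combination of the action units for one emotion."""
        weights = EMOTION_WEIGHTS[emotion]
        raw = sum(w * self.au_intensities.get(au, 0.0) for au, w in weights.items())
        return min(raw / sum(weights.values()), 1.0)

    def valence(self) -> float:
        """How positive or negative the expression is, in [-1, 1]."""
        return self.score("joy") - max(self.score("disgust"), self.score("confusion"))

    def engagement(self) -> float:
        """How expressive the face is overall: mean intensity of the firing units."""
        return sum(self.au_intensities.values()) / max(len(self.au_intensities), 1)

# A big genuine smile: strong lip corner pull, raised cheeks.
frame = EmotionDataPoint({12: 0.9, 6: 0.6})
print(f"joy={frame.score('joy'):.2f}  valence={frame.valence():+.2f}  "
      f"engagement={frame.engagement():.2f}")
```

In practice, the engine learns these combinations from the 2.9 million face videos rather than from a hand-written table.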
But perhaps what surprised us the most about this data is that we happen to be expressive all the time, even when we are sitting in front of our devices alone, and it's not just when we're watching cat videos on Facebook. We are expressive when we're emailing, texting, shopping online, or even doing our taxes.

Where is this data used today? In understanding how we engage with media, such as virality and voting behavior, and also in empowering, or emotion-enabling, technology. I want to share some examples that are especially close to my heart.

Emotion-enabled wearable glasses can help individuals who are visually impaired read the faces of others, and they can help individuals on the autism spectrum interpret emotion, something they really struggle with.

In education, imagine if your learning apps sensed that you're confused and slowed down, or that you're bored and sped up, just like a great teacher would in a classroom (sketched in code below). What if your wristwatch tracked your mood, or your car sensed that you're tired, or perhaps your fridge knew that you're stressed, so it auto-locks to prevent you from binge eating?

(Laughter)

I would like that, yeah.

What if, when I was in Cambridge, I had had access to my real-time emotion stream, and I could have shared that with my family back home in a very natural way, just like I would have if we were all in the same room together?

I think five years down the line, all our devices are going to have an emotion chip, and we won't remember what it was like when we couldn't just frown at our device and have it say, "Hmm, you didn't like that, did you?"

Our biggest challenge is that there are so many applications of this technology that my team and I realize we can't build them all ourselves, so we've made this technology available so that other developers can get building and get creative. We recognize that there are potential risks and potential for abuse, but personally, having spent many years doing this, I believe that the benefits to humanity from having emotionally intelligent technology far outweigh the potential for misuse. And I invite you all to be part of the conversation. The more people who know about this technology, the more we can all have a voice in how it's being used.
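The learning-app idea above is simple to sketch: given a real-time stream of confusion and boredom scores (hypothetical stand-ins for what an emotion engine would output), the app adjusts its playback speed. A minimal sketch, assuming such a stream exists:

```python
def adjust_speed(speed: float, confusion: float, boredom: float) -> float:
    """Slow down when the learner looks confused, speed up when bored,
    keeping playback within a sane 0.5x-2.0x range."""
    if confusion > 0.6:
        speed -= 0.1
    elif boredom > 0.6:
        speed += 0.1
    return max(0.5, min(speed, 2.0))

# Mock per-second readings from a hypothetical emotion stream.
speed = 1.0
for confusion, boredom in [(0.8, 0.1), (0.7, 0.0), (0.1, 0.9)]:
    speed = adjust_speed(speed, confusion, boredom)
    print(f"playback: {speed:.1f}x")
```

The thresholds and step sizes here are arbitrary; a real app would tune them to the individual learner.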
So as more and more of our lives become digital, we are fighting a losing battle trying to curb our usage of devices in order to reclaim our emotions. What I'm trying to do instead is to bring emotions into our technology and make our technologies more responsive. I want those devices that have separated us to bring us back together. And by humanizing technology, we have this golden opportunity to reimagine how we connect with machines, and therefore, how we, as human beings, connect with one another.

Thank you.

(Applause)