This app knows how you feel — from the look on your face
-
0:01 - 0:05Our emotions influence
every aspect of our lives, -
0:05 - 0:08from our health and how we learn,
to how we do business and make decisions, -
0:08 - 0:10big ones and small.
-
0:11 - 0:14Our emotions also influence
how we connect with one another. -
0:15 - 0:19We've evolved to live
in a world like this, -
0:19 - 0:23but instead, we're living
more and more of our lives like this -- -
0:23 - 0:27this is the text message
from my daughter last night -- -
0:27 - 0:29in a world that's devoid of emotion.
-
0:29 - 0:31So I'm on a mission to change that.
-
0:31 - 0:35I want to bring emotions
back into our digital experiences. -
0:36 - 0:39I started on this path 15 years ago.
-
0:39 - 0:41I was a computer scientist in Egypt,
-
0:41 - 0:46and I had just gotten accepted to
a Ph.D. program at Cambridge University. -
0:46 - 0:48So I did something quite unusual
-
0:48 - 0:52for a young newlywed Muslim Egyptian wife:
-
0:54 - 0:57With the support of my husband,
who had to stay in Egypt, -
0:57 - 1:00I packed my bags and I moved to England.
-
1:00 - 1:03At Cambridge, thousands of miles
away from home, -
1:03 - 1:06I realized I was spending
more hours with my laptop -
1:06 - 1:08than I did with any other human.
-
1:08 - 1:13Yet despite this intimacy, my laptop
had absolutely no idea how I was feeling. -
1:13 - 1:17It had no idea if I was happy,
-
1:17 - 1:20having a bad day, or stressed, confused,
-
1:20 - 1:22and so that got frustrating.
-
1:24 - 1:29Even worse, as I communicated
online with my family back home, -
1:29 - 1:33I felt that all my emotions
disappeared in cyberspace. -
1:33 - 1:38I was homesick, I was lonely,
and on some days I was actually crying, -
1:38 - 1:43but all I had to communicate
these emotions was this. -
1:43 - 1:45(Laughter)
-
1:45 - 1:50Today's technology
has lots of I.Q., but no E.Q.; -
1:50 - 1:53lots of cognitive intelligence,
but no emotional intelligence. -
1:53 - 1:55So that got me thinking,
-
1:55 - 1:59what if our technology
could sense our emotions? -
1:59 - 2:03What if our devices could sense
how we felt and react accordingly, -
2:03 - 2:06just the way an emotionally
intelligent friend would? -
2:07 - 2:10Those questions led me and my team
-
2:10 - 2:15to create technologies that can read
and respond to our emotions, -
2:15 - 2:18and our starting point was the human face.
-
2:19 - 2:22So our human face happens to be
one of the most powerful channels -
2:22 - 2:26that we all use to communicate
social and emotional states, -
2:26 - 2:29everything from enjoyment, surprise,
-
2:29 - 2:33empathy and curiosity.
-
2:33 - 2:38In emotion science, we call each
facial muscle movement an action unit. -
2:38 - 2:41So for example, action unit 12,
-
2:41 - 2:43it's not a Hollywood blockbuster,
-
2:43 - 2:46it is actually a lip corner pull,
which is the main component of a smile. -
2:46 - 2:49Try it everybody. Let's get
some smiles going on. -
2:49 - 2:52Another example is action unit 4.
It's the brow furrow. -
2:52 - 2:54It's when you draw your eyebrows together
-
2:54 - 2:56and you create all
these textures and wrinkles. -
2:56 - 3:01We don't like them, but it's
a strong indicator of a negative emotion. -
3:01 - 3:03So we have about 45 of these action units,
-
3:03 - 3:06and they combine to express
hundreds of emotions. -
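The idea that a handful of action units combine into many emotions can be sketched as a toy lookup. This is illustrative only: the AU numbers for joy (6+12) and surprise (1+2+5+26) follow standard FACS conventions, but the rule table and function here are invented for the sketch, not the speaker's actual system.

```python
# Toy mapping from detected action units (AUs) to emotion labels,
# loosely following FACS-style combinations. Detection of the AUs
# themselves is assumed to happen elsewhere.

EMOTION_RULES = {
    "joy":      {6, 12},         # cheek raiser + lip corner puller (the smile)
    "surprise": {1, 2, 5, 26},   # brow raisers, upper lid raiser, jaw drop
    "anger":    {4, 5, 7, 23},   # brow furrow, lid tightener, lip tightener
}

def label_emotions(active_aus):
    """Return every emotion whose required AUs are all active."""
    return [name for name, required in EMOTION_RULES.items()
            if required <= set(active_aus)]

print(label_emotions({6, 12}))        # -> ['joy']
print(label_emotions({1, 2, 5, 26}))  # -> ['surprise']
```

With ~45 units, even simple subset rules like these yield a combinatorially large space of expressions, which is her point about "hundreds of emotions."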
3:06 - 3:10Teaching a computer to read
these facial emotions is hard, -
3:10 - 3:13because these action units,
they can be fast, they're subtle, -
3:13 - 3:16and they combine in many different ways.
-
3:16 - 3:20So take, for example,
the smile and the smirk. -
3:20 - 3:23They look somewhat similar,
but they mean very different things. -
3:23 - 3:25(Laughter)
-
3:25 - 3:28So the smile is positive,
-
3:28 - 3:29a smirk is often negative.
-
3:29 - 3:33Sometimes a smirk
can make you become famous. -
3:33 - 3:36But seriously, it's important
for a computer to be able -
3:36 - 3:39to tell the difference
between the two expressions. -
3:39 - 3:41So how do we do that?
-
3:41 - 3:42We give our algorithms
-
3:42 - 3:47tens of thousands of examples
of people we know to be smiling, -
3:47 - 3:50from different ethnicities, ages, genders,
-
3:50 - 3:52and we do the same for smirks.
-
3:52 - 3:54And then, using deep learning,
-
3:54 - 3:57the algorithm looks for all these
textures and wrinkles -
3:57 - 3:59and shape changes on our face,
-
3:59 - 4:03and basically learns that all smiles
have common characteristics, -
4:03 - 4:06all smirks have subtly
different characteristics. -
4:06 - 4:08And the next time it sees a new face,
-
4:08 - 4:10it essentially learns that
-
this face has the same
characteristics as a smile, -
4:13 - 4:18and it says, "Aha, I recognize this.
This is a smile expression." -
4:18 - 4:21So the best way to demonstrate
how this technology works -
4:21 - 4:23is to try a live demo,
-
4:23 - 4:27so I need a volunteer,
preferably somebody with a face. -
4:27 - 4:30(Laughter)
-
4:30 - 4:32Cloe's going to be our volunteer today.
-
4:33 - 4:38So over the past five years, we've moved
from being a research project at MIT -
4:38 - 4:39to a company,
-
4:39 - 4:42where my team has worked really hard
to make this technology work, -
4:42 - 4:45as we like to say, in the wild.
-
4:45 - 4:47And we've also shrunk it so that
the core emotion engine -
4:47 - 4:51works on any mobile device
with a camera, like this iPad. -
4:51 - 4:53So let's give this a try.
-
4:55 - 4:59As you can see, the algorithm
has essentially found Cloe's face, -
4:59 - 5:00so it's this white bounding box,
-
5:00 - 5:03and it's tracking the main
feature points on her face, -
5:03 - 5:06so her eyebrows, her eyes,
her mouth and her nose. -
5:06 - 5:09The question is,
can it recognize her expression? -
5:09 - 5:10So we're going to test the machine.
-
5:10 - 5:15So first of all, give me your poker face.
Yep, awesome. (Laughter) -
5:15 - 5:17And then as she smiles,
this is a genuine smile, it's great. -
5:17 - 5:20So you can see the green bar
go up as she smiles. -
5:20 - 5:21Now that was a big smile.
-
5:21 - 5:24Can you try a subtle smile
to see if the computer can recognize it? -
5:24 - 5:26It does recognize subtle smiles as well.
-
5:26 - 5:28We've worked really hard
to make that happen. -
5:28 - 5:31And then eyebrow raised,
indicator of surprise. -
5:31 - 5:36Brow furrow, which is
an indicator of confusion. -
5:36 - 5:40Frown. Yes, perfect.
-
5:40 - 5:43So these are all the different
action units. There's many more of them. -
5:43 - 5:45This is just a slimmed-down demo.
-
5:45 - 5:48But we call each reading
an emotion data point, -
5:48 - 5:51and then they can fire together
to portray different emotions. -
5:51 - 5:56So on the right side of the demo --
look like you're happy. -
5:56 - 5:57So that's joy. Joy fires up.
-
5:57 - 5:59And then give me a disgust face.
-
5:59 - 6:04Try to remember what it was like
when Zayn left One Direction. -
6:04 - 6:05(Laughter)
-
6:05 - 6:09Yeah, wrinkle your nose. Awesome.
-
6:09 - 6:13And the valence is actually quite
negative, so you must have been a big fan. -
6:13 - 6:16So valence is how positive
or negative an experience is, -
6:16 - 6:19and engagement is how
expressive she is as well. -
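The two summary numbers in the demo, valence and engagement, can be sketched from per-expression scores. This is a hypothetical formulation: the expression names, weights, and formulas below are invented for illustration, not the product's actual metrics.

```python
# Sketch of the two demo metrics: valence (positive vs. negative)
# and engagement (overall expressiveness). Scores assumed in [0, 1].

POSITIVE = {"joy", "smile"}
NEGATIVE = {"disgust", "anger", "sadness"}

def valence(scores):
    """Positive expressions push valence up, negative ones push it down."""
    pos = sum(v for k, v in scores.items() if k in POSITIVE)
    neg = sum(v for k, v in scores.items() if k in NEGATIVE)
    return pos - neg

def engagement(scores):
    """How expressive the face is overall, regardless of sign."""
    return max(scores.values(), default=0.0)

frame = {"joy": 0.05, "disgust": 0.8}  # e.g. the wrinkled-nose disgust face
print(valence(frame))     # -> -0.75 (quite negative, as in the demo)
print(engagement(frame))  # -> 0.8 (strongly expressive)
```

This matches the demo's behavior: a strong disgust face is highly engaged but negatively valenced.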
6:19 - 6:22So imagine if Cloe had access
to this real-time emotion stream, -
6:22 - 6:25and she could share it
with anybody she wanted to. -
6:25 - 6:28Thank you.
-
6:28 - 6:32(Applause)
-
6:34 - 6:39So, so far, we have amassed
12 billion of these emotion data points. -
6:39 - 6:42It's the largest emotion
database in the world. -
6:42 - 6:45We've collected it
from 2.9 million face videos, -
6:45 - 6:47people who have agreed
to share their emotions with us, -
6:47 - 6:50and from 75 countries around the world.
-
6:50 - 6:52It's growing every day.
-
6:53 - 6:55It blows my mind
-
6:55 - 6:58that we can now quantify something
as personal as our emotions, -
6:58 - 7:00and we can do it at this scale.
-
7:00 - 7:02So what have we learned to date?
-
7:03 - 7:05Gender.
-
7:05 - 7:09Our data confirms something
that you might suspect. -
7:09 - 7:11Women are more expressive than men.
-
7:11 - 7:14Not only do they smile more,
their smiles last longer, -
7:14 - 7:16and we can now really quantify
what it is that men and women -
7:16 - 7:19respond to differently.
-
7:19 - 7:21Let's do culture: So in the United States,
-
7:21 - 7:24women are 40 percent
more expressive than men, -
7:24 - 7:28but curiously, we don't see any difference
in the U.K. between men and women. -
7:28 - 7:30(Laughter)
-
7:31 - 7:35Age: People who are 50 years and older
-
7:35 - 7:39are 25 percent more emotive
than younger people. -
7:40 - 7:44Women in their 20s smile a lot more
than men the same age, -
7:44 - 7:48perhaps a necessity for dating.
-
7:48 - 7:50But perhaps what surprised us
the most about this data -
7:50 - 7:53is that we happen
to be expressive all the time, -
7:53 - 7:56even when we are sitting
in front of our devices alone, -
7:56 - 8:00and it's not just when we're watching
cat videos on Facebook. -
8:00 - 8:03We are expressive when we're emailing,
texting, shopping online, -
8:03 - 8:06or even doing our taxes.
-
8:06 - 8:08Where is this data used today?
-
8:08 - 8:11In understanding how we engage with media,
-
8:11 - 8:13so understanding virality
and voting behavior; -
8:13 - 8:16and also empowering
or emotion-enabling technology, -
8:16 - 8:21and I want to share some examples
that are especially close to my heart. -
8:21 - 8:24Emotion-enabled wearable glasses
can help individuals -
8:24 - 8:27who are visually impaired
read the faces of others, -
8:27 - 8:32and it can help individuals
on the autism spectrum interpret emotion, -
8:32 - 8:34something that they really struggle with.
-
8:36 - 8:39In education, imagine
if your learning apps -
8:39 - 8:42sense that you're confused and slow down,
-
8:42 - 8:43or that you're bored, and speed up,
-
8:43 - 8:46just like a great teacher
would in a classroom. -
8:47 - 8:50What if your wristwatch tracked your mood,
-
8:50 - 8:52or your car sensed that you're tired,
-
8:52 - 8:55or perhaps your fridge
knows that you're stressed, -
8:55 - 9:01so it auto-locks to prevent you
from binge eating. (Laughter) -
9:01 - 9:04I would like that, yeah.
-
9:04 - 9:06What if, when I was in Cambridge,
-
9:06 - 9:08I had access to my real-time
emotion stream, -
9:08 - 9:11and I could share that with my family
back home in a very natural way, -
9:11 - 9:15just like I would've if we were all
in the same room together? -
9:15 - 9:19I think five years down the line,
-
9:19 - 9:21all our devices are going
to have an emotion chip, -
9:21 - 9:25and we won't remember what it was like
when we couldn't just frown at our device -
9:25 - 9:29and our device would say, "Hmm,
you didn't like that, did you?" -
9:29 - 9:33Our biggest challenge is that there are
so many applications of this technology, -
9:33 - 9:36my team and I realize that we can't
build them all ourselves, -
9:36 - 9:39so we've made this technology available
so that other developers -
9:39 - 9:41can get building and get creative.
-
9:41 - 9:46We recognize that
there are potential risks -
9:46 - 9:48and potential for abuse,
-
9:48 - 9:51but personally, having spent
many years doing this, -
9:51 - 9:54I believe that the benefits to humanity
-
9:54 - 9:56from having emotionally
intelligent technology -
9:56 - 9:59far outweigh the potential for misuse.
-
9:59 - 10:02And I invite you all to be
part of the conversation. -
10:02 - 10:04The more people who know
about this technology, -
10:04 - 10:08the more we can all have a voice
in how it's being used. -
10:09 - 10:14So as more and more
of our lives become digital, -
10:14 - 10:17we are fighting a losing battle
trying to curb our usage of devices -
10:17 - 10:19in order to reclaim our emotions.
-
10:21 - 10:25So what I'm trying to do instead
is to bring emotions into our technology -
10:25 - 10:27and make our technologies more responsive.
-
10:27 - 10:29So I want those devices
that have separated us -
10:29 - 10:32to bring us back together.
-
10:32 - 10:36And by humanizing technology,
we have this golden opportunity -
10:36 - 10:40to reimagine how we
connect with machines, -
10:40 - 10:44and therefore, how we, as human beings,
-
10:44 - 10:46connect with one another.
-
10:46 - 10:48Thank you.
-
10:48 - 10:52(Applause)
Title: This app knows how you feel — from the look on your face
Speaker: Rana el Kaliouby
Description: Our emotions influence every aspect of our lives – how we learn, how we communicate, how we make decisions. Yet they're absent from our digital lives; the devices and apps we interact with have no way of knowing how we feel. Scientist Rana el Kaliouby aims to change that. She demos a powerful new technology that reads your facial expressions and matches them to corresponding emotions. This "emotion engine" has big implications, she says, and could change not just how we interact with machines — but with each other.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 11:04