The moral bias behind your search results | Andreas Ekstrøm | TEDxOslo
0:04 - 0:07 So whenever I visit a school and talk to students,
0:07 - 0:09 I always ask them the same thing:
0:10 - 0:12 Why do you Google?
0:12 - 0:15 Why is Google the search engine of choice for you?
0:16 - 0:19 Strangely enough, I always get the same three answers.
0:19 - 0:21 One, "Because it works,"
0:21 - 0:23 which is a great answer; that's why I Google, too.
0:23 - 0:26 Two, somebody will say,
0:26 - 0:29 "I really don't know of any alternatives."
0:29 - 0:32 It's not an equally great answer, and my reply to that is usually,
0:32 - 0:34 "Try to Google the word 'search engine,'
0:34 - 0:37 you may find a couple of interesting alternatives."
0:37 - 0:39 And last but not least, thirdly,
0:39 - 0:43 inevitably, one student will raise her or his hand and say,
0:43 - 0:48 "With Google, I'm certain to always get the best, unbiased search result."
0:49 - 0:55 Certain to always get the best, unbiased search result.
0:57 - 0:59 Now, as a man of the humanities,
0:59 - 1:01 albeit a digital humanities man,
1:01 - 1:03 that just makes my skin curl,
1:03 - 1:08 even if I, too, realize that that trust, that idea of the unbiased search result
1:08 - 1:12 is a cornerstone in our collective love for and appreciation of Google.
1:12 - 1:16 I will show you why that, philosophically, is almost an impossibility.
1:16 - 1:19 But let me first elaborate, just a little bit, on a basic principle
1:19 - 1:23 behind each search query that we sometimes seem to forget.
1:23 - 1:25 So whenever you set out to Google something,
1:25 - 1:29 start by asking yourself this: "Am I looking for an isolated fact?"
1:30 - 1:33 What is the capital of France?
1:33 - 1:35 What are the building blocks of a water molecule?
1:35 - 1:38 Great -- Google away.
1:38 - 1:41 There's not a group of scientists who are this close to proving
1:41 - 1:43 that it's actually London and H3O.
1:43 - 1:45 You don't see a big conspiracy among those things.
1:45 - 1:47 We agree, on a global scale,
1:47 - 1:50 what the answers are to these isolated facts.
1:50 - 1:55 But if you complicate your question just a little bit and ask something like,
1:55 - 1:58 "Why is there an Israeli-Palestinian conflict?"
1:58 - 2:01 You're not exactly looking for a singular fact anymore,
2:01 - 2:03 you're looking for knowledge,
2:03 - 2:06 which is something way more complicated and delicate.
2:06 - 2:08 And to get to knowledge,
2:08 - 2:11 you have to bring 10 or 20 or 100 facts to the table
2:11 - 2:13 and acknowledge them and say, "Yes, these are all true."
2:13 - 2:15 But because of who I am,
2:15 - 2:18 young or old, black or white, gay or straight,
2:18 - 2:19 I will value them differently.
2:19 - 2:21 And I will say, "Yes, this is true,
2:21 - 2:23 but this is more important to me than that."
2:23 - 2:25 And this is where it becomes interesting,
2:25 - 2:27 because this is where we become human.
2:27 - 2:30 This is when we start to argue, to form society.
2:30 - 2:33 And to really get somewhere, we need to filter all our facts here,
2:33 - 2:36 through friends and neighbors and parents and children
2:36 - 2:38 and coworkers and newspapers and magazines,
2:38 - 2:41 to finally be grounded in real knowledge,
2:41 - 2:46 which is something that a search engine is a poor help to achieve.
2:47 - 2:52 So, I promised you an example just to show you why it's so hard
2:53 - 2:57 to get to the point of true, clean, objective knowledge --
2:57 - 2:58 as food for thought.
2:58 - 3:02 I will conduct a couple of simple queries, search queries.
3:02 - 3:05 We'll start with "Michelle Obama,"
3:06 - 3:08 the First Lady of the United States.
3:08 - 3:10 And we'll click for pictures.
3:11 - 3:13 It works really well, as you can see.
3:13 - 3:16 It's a perfect search result, more or less.
3:16 - 3:19 It's just her in the picture, not even the President.
3:19 - 3:21 How does this work?
3:21 - 3:23 Quite simple.
3:23 - 3:26 Google uses a lot of smartness to achieve this, but quite simply,
3:26 - 3:29 they look at two things more than anything.
3:29 - 3:34 First, what does it say in the caption under the picture on each website?
3:34 - 3:36 Does it say "Michelle Obama" under the picture?
3:36 - 3:38 Pretty good indication it's actually her on there.
3:38 - 3:41 Second, Google looks at the picture file,
3:41 - 3:44 the name of the file as such, uploaded to the website.
3:44 - 3:46 Again, is it called "MichelleObama.jpeg"?
3:46 - 3:49 Pretty good indication it's not Clint Eastwood in the picture.
3:49 - 3:53 So, you've got those two and you get a search result like this -- almost.
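[Editor's illustration] The two signals described above — the caption text on the page and the uploaded file name — can be sketched as a toy ranking function. This is purely illustrative, not Google's actual algorithm: the image list, the equal weighting of the two signals, and all names here are invented for the example.

```python
# Toy sketch of caption- and filename-based image ranking.
# Not Google's real scoring; both signals are weighted equally here.

def score_image(query: str, caption: str, filename: str) -> int:
    """+1 if the query appears in the caption, +1 if it appears
    in the file name (case-insensitive, separators ignored)."""
    squeezed_query = query.lower().replace(" ", "")
    squeezed_name = filename.lower().replace("-", "").replace("_", "")
    score = 0
    if query.lower() in caption.lower():
        score += 1
    if squeezed_query in squeezed_name:
        score += 1
    return score

# Invented example data: one matching image, one unrelated image.
images = [
    {"caption": "Michelle Obama at the White House",
     "filename": "MichelleObama.jpeg"},
    {"caption": "A sunset in Oslo",
     "filename": "sunset.jpeg"},
]

# Rank highest-scoring images first for the query.
ranked = sorted(
    images,
    key=lambda im: score_image("Michelle Obama",
                               im["caption"], im["filename"]),
    reverse=True,
)
```

The same two signals are exactly what the campaigns described next exploit: anyone who controls captions and file names on enough pages can push an arbitrary image up this kind of ranking.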
3:53 - 3:59 Now, in 2009, Michelle Obama was the victim of a racist campaign,
4:00 - 4:04 where people set out to insult her through her search results.
4:05 - 4:08 There was a picture distributed widely over the Internet
4:08 - 4:10 where her face was distorted to look like a monkey.
4:10 - 4:13 And that picture was published all over.
4:14 - 4:17 And people published it very, very purposefully,
4:17 - 4:19 to get it up there in the search results.
4:19 - 4:22 They made sure to write "Michelle Obama" in the caption
4:22 - 4:26 and they made sure to upload the picture as "MichelleObama.jpeg," or the like.
4:26 - 4:29 You get why -- to manipulate the search result.
4:29 - 4:30 And it worked, too.
4:30 - 4:33 So when you picture-Googled for "Michelle Obama" in 2009,
4:33 - 4:36 that distorted monkey picture showed up among the first results.
4:36 - 4:40 Now, the results are self-cleansing,
4:40 - 4:42 and that's sort of the beauty of it,
4:42 - 4:45 because Google measures relevance every hour, every day.
4:45 - 4:48 However, Google didn't settle for that this time,
4:48 - 4:51 they just thought, "That's racist and it's a bad search result
4:51 - 4:54 and we're going to go back and clean that up manually.
4:54 - 4:57 We are going to write some code and fix it,"
4:57 - 4:58 which they did.
4:59 - 5:03 And I don't think anyone in this room thinks that was a bad idea.
5:03 - 5:05 Me neither.
5:06 - 5:09 But then, a couple of years go by,
5:09 - 5:12 and the world's most-Googled Anders,
5:12 - 5:15 Anders Behring Breivik,
5:15 - 5:16 did what he did.
5:16 - 5:19 This is July 22 in 2011,
5:19 - 5:21 and a terrible day in Norwegian history.
5:21 - 5:25 This man, a terrorist, blew up a couple of government buildings
5:25 - 5:28 walking distance from where we are right now in Oslo, Norway
5:28 - 5:30 and then he traveled to the island of Utøya
5:30 - 5:32 and shot and killed a group of kids.
5:32 - 5:34 Almost 80 people died that day.
5:36 - 5:41 And a lot of people would describe this act of terror as two steps,
5:41 - 5:44 that he did two things: he blew up the buildings and he shot those kids.
5:44 - 5:46 It's not true.
5:46 - 5:48 It was three steps.
5:48 - 5:50 He blew up those buildings, he shot those kids,
5:50 - 5:54 and he sat down and waited for the world to Google him.
5:55 - 5:57 And he prepared all three steps equally well.
5:58 - 6:01 And if there was somebody who immediately understood this,
6:01 - 6:03 it was a Swedish web developer,
6:03 - 6:06 a search engine optimization expert in Stockholm, named Nikke Lindqvist.
6:06 - 6:08 He's also a very political guy
6:08 - 6:11 and he was right out there in social media, on his blog and Facebook.
6:11 - 6:13 And he told everybody,
6:13 - 6:15 "If there's something that this guy wants right now,
6:15 - 6:18 it's to control the image of himself.
6:18 - 6:20 Let's see if we can distort that.
6:21 - 6:25 Let's see if we, in the civilized world, can protest against what he did
6:25 - 6:29 through insulting him in his search results."
6:29 - 6:30 And how?
6:30 - 6:33 He told all of his readers the following,
6:33 - 6:34 "Go out there on the Internet,
6:34 - 6:37 find pictures of dog poop on sidewalks --
6:38 - 6:41 find pictures of dog poop on sidewalks --
6:41 - 6:44 publish them in your feeds, on your websites, on your blogs.
6:44 - 6:47 Make sure to write the terrorist's name in the caption,
6:47 - 6:51 make sure to name the picture file "Breivik.jpeg."
6:51 - 6:55 Let's teach Google that that's the face of the terrorist."
6:57 - 6:59 And it worked.
7:00 - 7:02 Two years after that campaign against Michelle Obama,
7:02 - 7:06 this manipulation campaign against Anders Behring Breivik worked.
7:06 - 7:10 If you picture-Googled for him weeks after the July 22 events from Sweden,
7:10 - 7:14 you'd see that picture of dog poop high up in the search results,
7:14 - 7:16 as a little protest.
7:17 - 7:21 Strangely enough, Google didn't intervene this time.
7:22 - 7:26 They did not step in and manually clean those search results up.
7:27 - 7:29 So the million-dollar question,
7:29 - 7:33 is there anything different between these two happenings here?
7:33 - 7:36 Is there anything different between what happened to Michelle Obama
7:36 - 7:38 and what happened to Anders Behring Breivik?
7:38 - 7:40 Of course not.
7:40 - 7:42 It's the exact same thing,
7:42 - 7:45 yet Google intervened in one case and not in the other.
7:45 - 7:46 Why?
7:47 - 7:50 Because Michelle Obama is an honorable person, that's why,
7:50 - 7:53 and Anders Behring Breivik is a despicable person.
7:54 - 7:55 See what happens there?
7:55 - 7:58 An evaluation of a person takes place
7:58 - 8:02 and there's only one power-player in the world
8:02 - 8:05 with the authority to say who's who.
8:06 - 8:08 "We like you, we dislike you.
8:08 - 8:10 We believe in you, we don't believe in you.
8:10 - 8:12 You're right, you're wrong. You're true, you're false.
8:12 - 8:14 You're Obama, and you're Breivik."
8:14 - 8:17 That's power if I ever saw it.
8:19 - 8:23 So I'm asking you to remember that behind every algorithm
8:23 - 8:24 is always a person,
8:24 - 8:27 a person with a set of personal beliefs
8:27 - 8:29 that no code can ever completely eradicate.
8:29 - 8:32 And my message goes out not only to Google,
8:32 - 8:35 but to all believers in the faith of code around the world.
8:35 - 8:38 You need to identify your own personal bias.
8:38 - 8:40 You need to understand that you are human
8:40 - 8:43 and take responsibility accordingly.
8:44 - 8:46 And I say this because I believe we've reached a point in time
8:46 - 8:48 when it's absolutely imperative
8:48 - 8:51 that we tie those bonds together again, tighter:
8:51 - 8:54 the humanities and the technology.
8:54 - 8:56 Tighter than ever.
8:56 - 8:59 And, if nothing else, to remind us that that wonderfully seductive idea
8:59 - 9:02 of the unbiased, clean search result
9:02 - 9:05 is, and is likely to remain, a myth.
9:05 - 9:07 Thank you for your time.
9:07 - 9:09 (Applause)
Title: The moral bias behind your search results | Andreas Ekstrøm | TEDxOslo
Description: Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.
This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
Video Language: English
Team: closed TED
Project: TEDxTalks
Duration: 09:16
TED Translators admin edited English subtitles for The myth of the unbiased search result | Andreas Ekstrøm | TEDxOslo