How to separate fact and fiction online
I've been a journalist now since I was about 17, and it's an interesting industry to be in at the moment, because as you all know, there's a huge amount of upheaval going on in media, and most of you probably know this from the business angle, which is that the business model is pretty screwed, and as my grandfather would say, the profits have all been gobbled up by Google. So it's a really interesting time to be a journalist, but the upheaval that I'm interested in is not on the output side. It's on the input side. It's concerned with how we get information and how we gather the news.

And that's changed, because we've had a huge shift in the balance of power from the news organizations to the audience. And the audience for such a long time was in a position where they didn't have any way of affecting news or making any change. They couldn't really connect. And that's changed irrevocably.

My first connection with the news media was in 1984, when the BBC had a one-day strike. I wasn't happy. I was angry. I couldn't see my cartoons. So I wrote a letter. And it's a very effective way of ending your hate mail: "Love Markham, Aged 4." Still works. I'm not sure if I had any impact on the one-day strike, but what I do know is that it took them three weeks to get back to me. And that was the round journey. It took that long for anyone to have any impact and get some feedback.

And that's changed now because, as journalists, we interact in real time. We're not in a position where the audience is reacting to news. We're reacting to the audience, and we're actually relying on them. They're helping us find the news. They're helping us figure out what is the best angle to take and what is the stuff that they want to hear. So it's a real-time thing. It's much quicker. It's happening on a constant basis, and the journalist is always playing catch-up.

To give an example of how we rely on the audience: on the 5th of September, an earthquake hit Costa Rica. It was a 7.6 magnitude. It was fairly big. And 60 seconds is the amount of time it took for it to travel 250 kilometers to Managua. So the ground shook in Managua 60 seconds after it hit the epicenter. Thirty seconds later, the first message went onto Twitter, and this was someone saying "temblor," which means earthquake. So 60 seconds was how long it took for the physical earthquake to travel. Thirty seconds later, news of that earthquake had traveled all around the world, instantly. Everyone in the world, hypothetically, had the potential to know that an earthquake was happening in Managua.
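Those two figures imply a rough wave speed, which a line of arithmetic can sanity-check. The 250 km and 60 second numbers are from the talk; nothing else is assumed.

```python
# Sanity check on the earthquake timing quoted above:
# the shaking covered 250 km in 60 seconds.
distance_km = 250
travel_time_s = 60

speed_km_per_s = distance_km / travel_time_s
print(round(speed_km_per_s, 1))  # roughly 4.2 km/s, a plausible seismic wave speed
```

The tweet, by contrast, reached the whole world in 30 seconds, which is why the audience now routinely outruns the wires.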
And that happened because this one person had a documentary instinct, which was to post a status update, which is what we all do now. If something happens, we put up our status update, or we post a photo, or we post a video, and it all goes up into the cloud in a constant stream.

And what that means is just constant, huge volumes of data going up. It's actually staggering. When you look at the numbers, every minute there are 72 more hours of video on YouTube. So that's, every second, more than an hour of video gets uploaded. And in photos, 58 photos are uploaded to Instagram a second. More than three and a half thousand photos a second go up onto Facebook. So by the time I'm finished talking here, there'll be 864 more hours of video on YouTube than there were when I started, and two and a half million more photos on Facebook and Instagram than when I started.
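Those totals follow directly from the per-minute rates. A back-of-envelope check, using the rates quoted above; the 12-minute talk length is my assumption, chosen because it reproduces the quoted totals:

```python
# Back-of-envelope check of the upload totals quoted in the talk.
# Rates from the talk: 72 hours of YouTube video per minute,
# 58 Instagram photos per second, ~3,500 Facebook photos per second.
# The 12-minute talk length is an assumption.
TALK_MINUTES = 12

youtube_hours = 72 * TALK_MINUTES
photos = (58 + 3500) * 60 * TALK_MINUTES

print(youtube_hours)                 # 864
print(round(photos / 1_000_000, 1))  # about 2.6, i.e. "two and a half million"
```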
So it's an interesting position to be in as a journalist, because we should have access to everything. Any event that happens anywhere in the world, I should be able to know about it pretty much instantaneously, as it happens, for free. And that goes for every single person in this room. The only problem is, when you have that much information, you have to find the good stuff, and that can be incredibly difficult when you're dealing with those volumes.

And nowhere was this brought home more than during Hurricane Sandy. So what you had in Hurricane Sandy was a superstorm, the likes of which we hadn't seen for a long time, hitting the iPhone capital of the universe -- (Laughter) -- and you got volumes of media like we'd never seen before. And that meant that journalists had to deal with fakes. We had to deal with old photos that were being reposted. We had to deal with composite images that merged photos from previous storms. We had to deal with images from films like "The Day After Tomorrow." (Laughter) And we had to deal with images that were so realistic it was nearly difficult to tell if they were real at all. (Laughter)

But joking aside, there were images like this one from Instagram which was subjected to a grilling by journalists. They weren't really sure. It was filtered in Instagram. The lighting was questioned. Everything was questioned about it. And it turned out to be true. It was from Avenue C in downtown Manhattan, which was flooded. And the reason that they could tell that it was real was because they could get to the source, and in this case, these guys were New York food bloggers. They were well respected. They were known. So this one wasn't a debunk; it was actually something that they could prove.

And that was the job of the journalist: filtering all this stuff. Instead of going out and finding the information and bringing it back to the reader, you were holding back the stuff that was potentially damaging.

And finding the source becomes more and more important -- finding the good source -- and Twitter is where most journalists now go. It's like the de facto real-time newswire, if you know how to use it, because there is so much on Twitter. And a good example of how useful it can be, but also how difficult, was the Egyptian revolution in 2011. As a non-Arabic speaker, as someone who was looking from the outside, from Dublin, Twitter lists, lists of good sources, people we could establish were credible, were really important. And how do you build a list like that from scratch? Well, it can be quite difficult, but you have to know what to look for.

This visualization was done by an Italian academic called André Panisson, and he basically took the Twitter conversation in Tahrir Square on the day that Hosni Mubarak would eventually resign. The dots you can see are retweets, so when someone retweets a message, a connection is made between two dots, and the more times that message is retweeted by other people, the more you get to see these nodes, these connections being made. And it's an amazing way of visualizing the conversation, but what you get is hints at who is more interesting and who is worth investigating.

And as the conversation grew and grew, it became more and more lively, and eventually you were left with this huge, big, rhythmic pointer of this conversation. You could find the nodes, though, and then you go, "Right, I've got to investigate these people. These are the ones that are obviously making sense. Let's see who they are."
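The core of that technique can be sketched in a few lines: treat each retweet as an edge toward the original author, and rank accounts by how many inbound edges they collect. A minimal sketch, with sample retweet data invented purely for illustration:

```python
# Minimal sketch of retweet mapping: each retweet is an edge from the
# retweeter to the original author; the accounts that accumulate the most
# inbound edges are the ones worth investigating first.
# The sample data below is invented for illustration.
from collections import Counter

retweets = [  # (retweeter, original_author)
    ("user1", "activist_a"), ("user2", "activist_a"), ("user3", "activist_a"),
    ("user1", "activist_b"), ("user4", "activist_b"),
    ("user2", "random_account"),
]

times_retweeted = Counter(author for _, author in retweets)
ranked = [account for account, _ in times_retweeted.most_common()]
print(ranked)  # ['activist_a', 'activist_b', 'random_account']
```

Panisson's visualization animated these connections in real time; the underlying ranking idea is the same.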
Now in the deluge of information, this is where the real-time web gets really interesting for a journalist like myself, because we have more tools than ever to do that kind of investigation. And when you start digging into the sources, you can go further and further than you ever could before. Sometimes you come across a piece of content that is so compelling, you want to use it, you're dying to use it, but you're not 100 percent sure if you can, because you don't know if the source is credible. You don't know if it's a scrape. You don't know if it's a re-upload. And you have to do that investigative work. And this video, which I'm going to let run through, was one we discovered a couple of weeks ago.

Video: Getting real windy in just a second. (Rain and wind sounds) (Explosion) Oh, shit!

Markham Nolan: Okay, so now if you're a news producer, this is something you'd love to run with, because obviously, this is gold. You know? This is a fantastic reaction from someone, a very genuine video that they've shot in their back garden. But how do you find out if this person is real, if the video is true, if it's faked, or if it's something old that's been reposted?

So we set about going to work on this video, and the only thing that we had to go on was the username on the YouTube account. There was only one video posted to that account, and the username was Rita Krill. And we didn't know if Rita existed or if it was a fake name.

But we started looking, and we used free Internet tools to do so. The first one was called Spokeo, which allowed us to look for Rita Krills. So we looked all over the U.S. We found them in New York, we found them in Pennsylvania, Nevada and Florida. So we went and we looked for a second free Internet tool called Wolfram Alpha, and we checked the weather reports for the day on which this video had been uploaded, and when we went through all those various cities, we found that in Florida, there were thunderstorms and rain on the day.
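That elimination step is just a filter over the candidate locations. A sketch of the logic, where the weather values are hard-coded stand-ins for what the real lookups returned:

```python
# Eliminate candidate locations by cross-referencing the weather on the
# upload date against what the video shows (rain and thunderstorms).
# The weather values here are hard-coded stand-ins for real lookups.
candidates = ["New York", "Pennsylvania", "Nevada", "Florida"]
weather_on_upload_day = {
    "New York": "clear",
    "Pennsylvania": "overcast",
    "Nevada": "clear",
    "Florida": "thunderstorms",
}

matches = [state for state in candidates
           if weather_on_upload_day[state] == "thunderstorms"]
print(matches)  # ['Florida']
```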
So we went to the white pages, and we looked through the Rita Krills in the phonebook, and we looked through a couple of different addresses, and that took us to Google Maps, where we found a house. And we found a house with a swimming pool that looked remarkably like Rita's. So we went back to the video, and we had to look for clues that we could cross-reference. If you look in the video, there's the big umbrella, there's a white lilo in the pool, there are some unusually rounded edges in the swimming pool, and there are two trees in the background. And we went back to Google Maps, and we looked a little bit closer, and sure enough, there's the white lilo, there are the two trees, there's the umbrella. It's actually folded in this photo. Little bit of trickery. And there are the rounded edges on the swimming pool. So we were able to call Rita, clear the video, make sure that it had been shot, and then our clients were delighted because they were able to run it without being worried.

Sometimes the search for truth, though, is a little bit less flippant, and it has much greater consequences. Syria has been really interesting for us, because obviously a lot of the time you're trying to debunk stuff that can be potentially war crime evidence, so this is where YouTube actually becomes the most important repository of information about what's going on in the world.

So this video, I'm not going to show you the whole thing, because it's quite gruesome, but you'll hear some of the sounds. This is from Hama.

Video: (Shouting)

And what this video shows, when you watch the whole thing through, is bloody bodies being taken out of a pickup truck and thrown off a bridge. The allegations were that these guys were Muslim Brotherhood and they were throwing Syrian Army officers' bodies off the bridge, and they were cursing and using blasphemous language, and there were lots of counterclaims about who they were, and whether or not they were who the video said they were.

So we talked to some sources in Hama who we had been back and forth with on Twitter, and we asked them about this, and the bridge was interesting to us because it was something we could identify. Three different sources said three different things about the bridge. One said the bridge doesn't exist. Another one said the bridge does exist, but it's not in Hama; it's somewhere else. And the third one said, "I think the bridge does exist, but the dam upstream of the bridge was closed, so the river should actually have been dry, so this doesn't make sense." So that was the only one that gave us a clue.

We looked through the video for other clues. We saw the distinctive railings, which we could use. We looked at the curbs. The curbs were throwing shadows south, so we could tell the bridge was running east-west across the river. It had black-and-white curbs. As we looked at the river itself, you could see there's a concrete stone on the west side. There's a cloud of blood. That's blood in the river. So the river is flowing south to north. That's what that tells me. And also, as you look away from the bridge, there's a divot on the left-hand side of the bank, and the river narrows.

So onto Google Maps we go, and we start looking through literally every single bridge. We go to the dam that we talked about, we start just literally going through every time that road crosses the river, crossing off the bridges that don't match. We're looking for one that crosses east-west.
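Conceptually, each clue from the video eliminates candidate bridges, like this. The candidate list and its attributes are invented; in practice the elimination was done by eye on the satellite view:

```python
# Each clue from the video rules out candidate bridges that don't match.
# The candidates and their attributes are invented for illustration; in
# practice this elimination was done by eye on Google Maps.
clues = {"orientation": "east-west", "curbs": "black-and-white", "lanes": 2}

candidate_bridges = [
    {"name": "bridge_1", "orientation": "north-south", "curbs": "plain", "lanes": 2},
    {"name": "bridge_2", "orientation": "east-west", "curbs": "plain", "lanes": 1},
    {"name": "bridge_3", "orientation": "east-west", "curbs": "black-and-white", "lanes": 2},
]

matches = [b["name"] for b in candidate_bridges
           if all(b[key] == value for key, value in clues.items())]
print(matches)  # ['bridge_3']
```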
And we get to Hama. We get all the way from the dam to Hama and there's no bridge. So we go a bit further. We switch to the satellite view, and we find another bridge, and everything starts to line up. The bridge looks like it's crossing the river east to west. So this could be our bridge. And we zoom right in. We start to see that it's got a median, so it's a two-lane bridge. And it's got the black-and-white curbs that we saw in the video, and as we click through it, you can see someone's uploaded photos to go with the map, which is very handy, so we click into the photos. And the photos start showing us more detail that we can cross-reference with the video. The first thing that we see is black-and-white curbing, which is handy because we've seen that before. We see the distinctive railing that we saw the guys throwing the bodies over. And we keep going through it until we're certain that this is our bridge.

So what does that tell me? I've got to go back now to my three sources and look at what they told me: the one who said the bridge didn't exist, the one who said the bridge wasn't in Hama, and the one guy who said, "Yes, the bridge does exist, but I'm not sure about the water levels." Number three is looking like the most truthful all of a sudden, and we've been able to find that out using some free Internet tools, sitting in a cubicle in an office in Dublin, in the space of 20 minutes.

And that's part of the joy of this. Although the web is running like a torrent, with so much information there that it's incredibly hard to sift, and getting harder every day, if you use it intelligently, you can find out incredible information. Given a couple of clues, I could probably find out a lot of things about most of you in the audience that you might not like me finding out.

But what it tells me is that, at a time when there's a greater abundance of information than there ever has been, and it's harder to filter, we have greater tools. We have free Internet tools that help us do this kind of investigation. We have algorithms that are smarter than ever before, and computers that are quicker than ever before.

But here's the thing. Algorithms are rules. They're binary. They're yes or no, they're black or white. Truth is never binary. Truth is a value. Truth is emotional, it's fluid, and above all, it's human. No matter how quick we get with computers, no matter how much information we have, you'll never be able to remove the human from the truth-seeking exercise, because in the end, it is a uniquely human trait.

Thanks very much. (Applause)
Title: How to separate fact and fiction online
Speaker: Markham Nolan
Description: By the end of this talk, there will be 864 more hours of video on YouTube and 2.5 million more photos on Facebook and Instagram. So how do we sort through the deluge? At the TEDSalon in London, Markham Nolan shares the investigative techniques he and his team use to verify information in real-time, to let you know if that Statue of Liberty image has been doctored or if that video leaked from Syria is legitimate.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 13:29