Reconceptualizing Security | Bruce Schneier | TEDxPSU
So security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge. And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.
So if you look at security in economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision -- whether you're going to invade some foreign country -- you're going to trade off something: either money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at any security measure is not whether it makes us safer, but whether it's worth the trade-off.
You've heard, in the past several years, that the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security -- in terms of the trade-off.
Now there's often no right or wrong here. Some of us have a burglar alarm system at home, and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.
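That economic framing can be made concrete with a toy expected-value calculation. Here is a minimal sketch in Python; the burglary probability, average loss, and alarm figures are invented for illustration and do not come from the talk:

```python
# Toy expected-value model of a security trade-off.
# All numbers are hypothetical, chosen only to illustrate the framing.

def expected_annual_cost(p_event, avg_loss, measure_cost, risk_reduction):
    """Expected yearly loss from the risk plus what the countermeasure costs."""
    residual_p = p_event * (1 - risk_reduction)
    return residual_p * avg_loss + measure_cost

# Hypothetical household: 2% yearly burglary chance, $5,000 average loss,
# and a $300/year alarm assumed to prevent 60% of burglaries.
without_alarm = expected_annual_cost(0.02, 5000, measure_cost=0, risk_reduction=0.0)
with_alarm = expected_annual_cost(0.02, 5000, measure_cost=300, risk_reduction=0.6)

print(f"without alarm: ${without_alarm:.0f}/year")  # $100/year
print(f"with alarm:    ${with_alarm:.0f}/year")     # $340/year
```

By these made-up numbers the alarm loses on dollars alone; a household with more at stake, or one that prices in peace of mind, could reasonably decide otherwise. The point of the exercise is the question it forces: not "does this make us safer?" but "is it worth it?"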
Now, people have a natural intuition about these trade-offs. We make them every day: last night in my hotel room, when I decided to double-lock the door; or you in your car when you drove here; or when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often don't even notice them. They're just part of being alive; we all do it. Every species does it.
Imagine a rabbit in a field, eating grass, and the rabbit's going to see a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve.
So you'd think that we, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question. I'll give you the short answer: we respond to the feeling of security and not the reality.
Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African highlands in 100,000 B.C. 2010 New York, not so much.
Now, there are several biases in risk perception. There are a lot of good experiments on this, and you can see certain biases that come up again and again. I'll give you four.
First, we tend to exaggerate spectacular and rare risks and downplay common risks -- so flying versus driving. Second, the unknown is perceived to be riskier than the familiar. One example would be that people fear kidnapping by strangers, when the data shows that kidnapping by relatives is much more common. This is for children.
Third, personified risks are perceived to be greater than anonymous risks -- so Bin Laden is scarier because he has a name. And fourth, people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control. There are a bunch of other cognitive biases like these that affect our risk decisions.
There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. If you don't hear about lion attacks, there aren't a lot of lions around. This works until you invent newspapers.
Because what newspapers do is repeat rare risks again and again. I tell people: if it's in the news, don't worry about it. Because by definition, news is something that almost never happens. (Laughter) When something is so common it's no longer news -- car crashes, domestic violence -- those are the risks you worry about.
We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "one, two, three, many" is kind of right. We're really good at small numbers: one mango, two mangoes, three mangoes. 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with risks that aren't very common.
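A quick back-of-the-envelope calculation shows why "one in a million" and "one in a billion" both register as "almost never," even though they differ by a factor of a thousand. A minimal sketch, using rough figures for lifespan and population that are assumptions rather than data from the talk:

```python
# Why a 1-in-a-million and a 1-in-a-billion daily risk both feel like "never".
# Lifespan and population figures are rough assumptions, for illustration only.

LIFETIME_DAYS = 80 * 365      # ~29,200 days in an 80-year life
POPULATION = 300_000_000      # a US-sized population, for scale

for p in (1e-6, 1e-9):
    per_person = p * LIFETIME_DAYS  # expected events per person, exposed daily
    per_society = p * POPULATION    # expected events in one society-wide exposure
    print(f"p={p:g}: {per_person:.5f} per lifetime, {per_society:,.2f} across society")

# p=1e-06: 0.02920 per lifetime, 300.00 across society
# p=1e-09: 0.00003 per lifetime, 0.30 across society
```

To any individual, both rates round to "never happens," which is exactly the trouble; at the scale of a society, they are the difference between 300 events and a fraction of one.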
And what these cognitive biases do is act as filters between us and reality. And the result is that feeling and reality get out of whack; they get different. Now, either you feel more secure than you are -- that's a false sense of security -- or the other way around, and that's a false sense of insecurity. I write a lot about "security theater," which is products that make people feel secure but don't actually do anything. There's no real word for stuff that makes us secure but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.
So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do, given the economic incentives, is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice.
So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, and how they work. If you know stuff, you're more likely to have your feelings match reality. Enough real-world examples help. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.
Okay, so what makes people not notice? Well, a poor understanding. If you don't understand the risks and you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures.
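The statistics behind that blindness are easy to sketch. Assuming purely hypothetical attack rates and a crude two-sigma rule for Poisson counts (an illustration of the general point, not an analysis from the talk), you can estimate how long you would have to watch before the data could distinguish a working countermeasure from a useless one:

```python
# How many years of observation before rare-event data can show that a
# countermeasure works? Rates below are hypothetical, for illustration only.

base_rate = 0.30     # attacks per year without the countermeasure
reduced_rate = 0.15  # attacks per year if the measure really halves the risk

# Crude two-sigma rule for Poisson counts: the rates become distinguishable
# once (base - reduced) * T exceeds ~2 * sqrt(base * T), which rearranges to
#   T > 4 * base / (base - reduced) ** 2
years_needed = 4 * base_rate / (base_rate - reduced_rate) ** 2
print(f"~{years_needed:.0f} years of data needed")  # ~53 years
```

Decades of observation before the numbers can speak -- which is why the virgin sacrifices and the unicorn defenses keep "working."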
Also, there are feelings clouding the issues: the cognitive biases I talked about earlier, fears, folk beliefs -- basically, an inadequate model of reality.
So let me complicate things. I have feeling and reality. I want to add a third element: I want to add model. Feeling and model are in our heads; reality is the outside world. It doesn't change; it's real. So feeling is based on our intuition. Model is based on reason. That's basically the difference.
In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them.
So this model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see germs. It's limited by our cognitive biases. But it has the ability to override our feelings.
Where do we get these models? We get them from others. We get them from religion, from culture, from teachers and elders. A couple of years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive, and they depended on whether you were attacked by a lion or a leopard or a rhino or an elephant -- when you had to run away, when you couldn't run away, when you had to climb a tree, and when you could never climb a tree. I would have died in a day, but he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day. (Laughter) Because we had different models based on our different experiences.
Models can come from the media, from our elected officials. Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras and ID cards; quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, of bird flu, swine flu, SARS.
All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media. So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings.
An example might be: if you go back 100 years, to when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it.
Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for "familiar." So as your model gets close to reality and converges with your feelings, you often don't know it's there.
A nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now it had a name, which made it scarier than the regular flu, even though the regular flu was more deadly. And people thought doctors should be able to deal with it, so there was that feeling of lack of control. And those two things made the risk seem greater than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear.
By autumn, people thought the doctors should have solved this already. And there was kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it. It's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.
I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security: I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. I call that their agenda. And you see agendas -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.
An example, a great example, is the risk of smoking. The history of the smoking risk over the past 50 years shows how a model changes, and it also shows how an industry fights a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate -- probably about 30 years behind. All examples of models changing.
What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So we're likely to ignore evidence against our model, even if it's compelling. It has to get very compelling before we'll pay attention.
New models that extend over long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do the next harvest. We can often do until our kids grow up. But 80 years, we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding two conflicting beliefs at once, the cognitive dissonance. Eventually, the new model will replace the old model.
Strong feelings can create a model. September 11th created a security model in a lot of people's heads. Personal experiences with crime can do it too, or a personal health scare, or a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.
So in the technological world, we don't have the experience to judge models. And we rely on others. We rely on proxies. And this works, as long as the others are correct. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, and none of us fear the roof is going to collapse on us -- not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's okay.
Now, what we want is for people to get familiar enough with better models -- to have them reflected in their feelings -- to allow them to make security trade-offs. When feeling and model go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model. Change happens slowly. The smoking debate took 40 years, and that was an easy one.
Some of this stuff is hard. But really, information seems like our best hope.
And I lied. Remember I said feeling, model, reality? I said reality doesn't change. It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species' history, feeling chasing model, model chasing reality, and reality moving -- they might never catch up. We don't know.
But in the long term, both feeling and reality are important. And I want to close with two quick stories to illustrate this.
1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story: someone took a bottle of Tylenol, put poison in it, closed it up, and put it back on the shelf; someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security better match the reality.
Last story: a few years ago, a friend of mine gave birth, and I visited her in the hospital. It turns out that when a baby's born now, they put an RFID bracelet on the baby and a corresponding one on the mother, so that if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off.
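The matching logic being described is tiny. A minimal sketch follows; the tag IDs and data model are hypothetical, and real hospital systems are of course more involved:

```python
from typing import Optional

# Sketch of the mother-baby RFID pairing described above (hypothetical IDs).
PAIRS = {"baby-4711": "mom-0815"}  # bracelet IDs issued together at birth

def ward_exit_alarms(baby_tag: str, escort_tag: Optional[str]) -> bool:
    """Return True if the maternity-ward exit gate should sound the alarm."""
    # Alarm unless the tag read alongside the baby is its paired mother's.
    return escort_tag != PAIRS.get(baby_tag)

assert not ward_exit_alarms("baby-4711", "mom-0815")  # mother leaving: quiet
assert ward_exit_alarms("baby-4711", "mom-9999")      # any other tag: alarm
assert ward_exit_alarms("baby-4711", None)            # baby alone: alarm
```

The check itself is trivial; as the rest of the story shows, its value lies as much in the feeling of security it produces as in the vanishingly rare threat it addresses.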
I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I went home and looked it up. It basically never happens. But if you think about it, if you are a hospital and you need to take a baby away from its mother, out of the room to run some tests, you'd better have some good security theater, or she's going to rip your arm off. (Laughter)
So it's important for those of us who design security, who look at security policy, or even at public policy in ways that affect security, to remember: it's not just reality; it's feeling and reality. What's important is that they be about the same, because if our feelings match reality, we make better security trade-offs. Thank you. (Applause)
Title: Reconceptualizing Security | Bruce Schneier | TEDxPSU
Description: Bruce Schneier is an internationally renowned security technologist and author. Described by The Economist as a "security guru," he is best known as a refreshingly candid and lucid security critic and commentator. When people want to know how security really works, they turn to Schneier.
Video Language: English
Team: closed TED
Project: TEDxTalks
Duration: 21:14