So, security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge. And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

So if you look at security in economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether it's a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision -- whether you're going to invade a foreign country -- you're going to trade off something: money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether it makes us safer, but whether it's worth the trade-off. You've heard, in the past several years, that the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: Was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security: in terms of the trade-off.

Now, there's often no right or wrong here. Some of us have a burglar alarm system at home and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics, also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

Now, people have a natural intuition about these trade-offs. We make them every day: last night in my hotel room, when I decided to double-lock the door; or you in your car when you drove here; when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.
Imagine a rabbit in a field, eating grass. And the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve.

So you'd think that we, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question. I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality. Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African Highlands in 100,000 BC. 2010 New York, not so much.

Now, there are several biases in risk perception. There are a lot of good experiments on this, and you can see certain biases that come up again and again. I'll give you four. First, we tend to exaggerate spectacular and rare risks and downplay common risks -- so, flying versus driving. Second, the unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data show that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks. So, Bin Laden is scarier because he has a name. And fourth, people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control. There are a bunch of other cognitive biases like these that affect our risk decisions.
There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. If you don't hear about lion attacks, there aren't a lot of lions around. This works, until you invent newspapers, because what newspapers do is repeat rare risks again and again. I tell people: if it's in the news, don't worry about it, because by definition, news is something that almost never happens.

(Laughter)

When something is so common, it's no longer news. Car crashes, domestic violence -- those are the risks you worry about.

We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes; 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.

And what these cognitive biases do is act as filters between us and reality. And the result is that feeling and reality get out of whack; they get different. Now, either you feel more secure than you are -- that's a false sense of security -- or it's the other way around, and that's a false sense of insecurity. I write a lot about "security theater," which is products that make people feel secure but don't actually do anything. There's no real word for stuff that makes us secure but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do, given the economic incentives, is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice. Right?
So what makes people notice? Well, a couple of things: understanding of the security -- of the risks, the threats, the countermeasures, how they work. If you know stuff, you're more likely to have your feelings match reality. Enough real-world examples help. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

OK. So what makes people not notice? Well, a poor understanding. If you don't understand the risks, you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures. Also, feelings that cloud the issues -- the cognitive biases I talked about earlier: fears, folk beliefs -- basically, an inadequate model of reality.

So let me complicate things. I have feeling and reality. I want to add a third element. I want to add "model." Feeling and model are in our head; reality is the outside world. It doesn't change; it's real. Feeling is based on our intuition; model is based on reason. That's basically the difference. In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them.

This model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings. Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders. A couple of years ago, I was in South Africa on safari.
0:10:08.488,0:10:11.250 The tracker I was with grew up[br]in Kruger National Park. 0:10:11.274,0:10:14.027 He had some very complex[br]models of how to survive. 0:10:14.540,0:10:18.453 And it depended on if you were attacked[br]by a lion, leopard, rhino, or elephant -- 0:10:18.477,0:10:21.211 and when you had to run away,[br]when you couldn't run away, 0:10:21.235,0:10:24.318 when you had to climb a tree,[br]when you could never climb a tree. 0:10:24.342,0:10:25.691 I would have died in a day. 0:10:26.900,0:10:30.682 But he was born there,[br]and he understood how to survive. 0:10:31.230,0:10:32.826 I was born in New York City. 0:10:32.850,0:10:36.101 I could have taken him to New York,[br]and he would have died in a day. 0:10:36.125,0:10:37.126 (Laughter) 0:10:37.150,0:10:41.294 Because we had different models[br]based on our different experiences. 0:10:43.031,0:10:45.500 Models can come from the media, 0:10:45.524,0:10:47.287 from our elected officials ... 0:10:47.974,0:10:51.055 Think of models of terrorism, 0:10:51.079,0:10:53.276 child kidnapping, 0:10:53.300,0:10:55.625 airline safety, car safety. 0:10:56.279,0:10:58.272 Models can come from industry. 0:10:59.088,0:11:02.306 The two I'm following[br]are surveillance cameras, 0:11:02.330,0:11:03.826 ID cards, 0:11:03.850,0:11:06.980 quite a lot of our computer[br]security models come from there. 0:11:07.004,0:11:09.231 A lot of models come from science. 0:11:09.255,0:11:11.092 Health models are a great example. 0:11:11.116,0:11:14.142 Think of cancer, bird flu,[br]swine flu, SARS. 0:11:14.682,0:11:19.552 All of our feelings of security[br]about those diseases 0:11:19.576,0:11:24.231 come from models given to us, really,[br]by science filtered through the media. 0:11:25.778,0:11:27.498 So models can change. 0:11:28.222,0:11:30.325 Models are not static. 0:11:30.349,0:11:33.589 As we become more comfortable[br]in our environments, 0:11:33.613,0:11:37.215 our model can move closer to our feelings. 0:11:38.705,0:11:41.045 So an example might be, 0:11:41.069,0:11:42.665 if you go back 100 years ago, 0:11:42.689,0:11:46.117 when electricity was first[br]becoming common, 0:11:46.141,0:11:47.844 there were a lot of fears about it. 0:11:47.868,0:11:50.346 There were people who were afraid[br]to push doorbells, 0:11:50.370,0:11:53.375 because there was electricity[br]in there, and that was dangerous. 0:11:53.399,0:11:56.268 For us, we're very facile[br]around electricity. 0:11:56.292,0:11:59.110 We change light bulbs[br]without even thinking about it. 0:11:59.688,0:12:05.851 Our model of security around electricity[br]is something we were born into. 0:12:06.475,0:12:08.989 It hasn't changed as we were growing up. 0:12:09.013,0:12:10.578 And we're good at it. 0:12:12.120,0:12:16.619 Or think of the risks on the Internet[br]across generations -- 0:12:16.643,0:12:18.740 how your parents approach[br]Internet security, 0:12:18.764,0:12:20.380 versus how you do, 0:12:20.404,0:12:21.946 versus how our kids will. 0:12:23.040,0:12:25.590 Models eventually fade[br]into the background. 0:12:27.167,0:12:29.660 "Intuitive" is just[br]another word for familiar. 0:12:30.627,0:12:34.477 So as your model is close to reality[br]and it converges with feelings, 0:12:34.501,0:12:36.419 you often don't even know it's there. 0:12:37.979,0:12:41.796 A nice example of this came[br]from last year and swine flu. 0:12:43.021,0:12:45.021 When swine flu first appeared, 0:12:45.045,0:12:47.663 the initial news caused[br]a lot of overreaction. 
Now, it had a name, which made it scarier than the regular flu, even though the regular flu was deadlier. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk seem greater than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it. And it's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. I call that their agenda. And you see agendas -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

An example, a great example, is the risk of smoking. Over the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate -- probably about 30 years behind. All examples of models changing.

What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model.
And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention. New models that extend over long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do "to the next harvest." We can often do "until our kids grow up." But "80 years" -- we're just not good at that. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we hold two conflicting beliefs at the same time: cognitive dissonance. Eventually, the new model will replace the old model.

Strong feelings can create a model. September 11 created a security model in a lot of people's heads. Also, personal experiences with crime can do it, a personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

So in the technological world, we don't have the experience to judge models. And we rely on others. We rely on proxies. And this works, as long as it's the correct others. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, and none of us fears the roof is going to collapse on us -- not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept, pretty much by faith. And that's OK.

Now, what we want is for people to get familiar enough with better models -- have it reflected in their feelings -- to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings -- directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model. Change happens slowly. The smoking debate took 40 years -- and that was an easy one. Some of this stuff is hard.
Really, though, information seems like our best hope. And I lied. Remember I said feeling, model, reality -- and that reality doesn't change? It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in the history of our species, feeling chasing model, model chasing reality, and reality moving -- they might never catch up. We don't know. But in the long term, both feeling and reality are important. And I want to close with two quick stories to illustrate this.

1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, and someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security better match the reality.

Last story: a few years ago, a friend of mine gave birth. I visited her in the hospital. It turns out, when a baby's born now, they put an RFID bracelet on the baby and a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I went home and looked it up. It basically never happens.

(Laughter)

But if you think about it, if you are a hospital and you need to take a baby away from its mother, out of the room to run some tests, you'd better have some good security theater, or she's going to rip your arm off.

(Laughter)

So it's important for us -- those of us who design security, who look at security policy, or even look at public policy in ways that affect security. It's not just reality; it's feeling and reality. What's important is that they be about the same.
Because if our feelings match reality, we make better security trade-offs.

Thank you.

(Applause)