I'm a computer science professor, and my area of expertise is computer and information security. When I was in graduate school, I had the opportunity to overhear my grandmother describing to one of her fellow senior citizens what I did for a living. Apparently, I was in charge of making sure that no one stole the computers from the university. (Laughter) And, you know, that's a perfectly reasonable thing for her to think, because I told her I was working in computer security, and it was interesting to get her perspective.

But that's not the most ridiculous thing I've ever heard anyone say about my work. The most ridiculous thing I ever heard is, I was at a dinner party, and a woman heard that I work in computer security, and she asked me -- she said her computer had been infected by a virus, and she was very concerned that she might get sick from it, that she could get this virus. (Laughter) And I'm not a doctor, but I reassured her that it was very, very unlikely that this would happen, but if she felt more comfortable, she could feel free to use latex gloves when she was on the computer, and there would be no harm whatsoever in that.

I'm going to get back to this notion of being able to get a virus from your computer, in a serious way. What I'm going to talk to you about today are some hacks, some real world cyberattacks that people in my community, the academic research community, have performed, which I don't think most people know about, and I think they're very interesting and scary, and this talk is kind of a greatest hits of the academic security community's hacks. None of the work is my work. It's all work that my colleagues have done, and I actually asked them for their slides and incorporated them into this talk.

So the first one I'm going to talk about is implanted medical devices. Now medical devices have come a long way technologically. You can see in 1926 the first pacemaker was invented. 1960, the first internal pacemaker was implanted, hopefully a little smaller than that one that you see there, and the technology has continued to move forward. In 2006, we hit an important milestone from the perspective of computer security. And why do I say that? Because that's when implanted devices inside of people started to have networking capabilities.
One thing that brings this close to home is to look at Dick Cheney's device. He had a device that pumped blood from an aorta to another part of the heart, and as you can see at the bottom there, it was controlled by a computer controller, and if you ever thought that software liability was very important, get one of these inside of you.

Now what a research team did was they got their hands on what's called an ICD. This is a defibrillator, and this is a device that goes into a person to control their heart rhythm, and these have saved many lives. Well, in order to not have to open up the person every time you want to reprogram their device or do some diagnostics on it, they made the thing be able to communicate wirelessly, and what this research team did is they reverse engineered the wireless protocol, and they built the device you see pictured here, with a little antenna, that could talk the protocol to the device, and thus control it.

In order to make their experience real -- they were unable to find any volunteers, and so they went and they got some ground beef and some bacon and they wrapped it all up to about the size of a human being's area where the device would go, and they stuck the device inside it to perform their experiment somewhat realistically. They launched many, many successful attacks. One that I'll highlight here is changing the patient's name. I don't know why you would want to do that, but I sure wouldn't want that done to me. And they were able to change therapies, including disabling the device -- and this is with a real, commercial, off-the-shelf device -- simply by performing reverse engineering and sending wireless signals to it.

There was a piece on NPR that some of these ICDs could actually have their performance disrupted simply by holding a pair of headphones onto them. Now, wireless and the Internet can improve health care greatly. There's several examples up on the screen of situations where doctors are looking to implant devices inside of people, and all of these devices now, it's standard that they communicate wirelessly, and I think this is great, but without a full understanding of trustworthy computing, and without understanding what attackers can do and the security risks from the beginning, there's a lot of danger in this.
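To make "talking the protocol" concrete, here is a minimal Python sketch of what assembling a reverse-engineered command frame can look like. Everything specific in it, the sync word, the opcode, and the CRC choice, is a made-up placeholder; the actual ICD protocol the researchers recovered is not public, and really transmitting the bytes would take a software-defined radio front end.

```python
import struct
import binascii

# Hypothetical frame layout for an implanted-device command, loosely modeled
# on the idea of a reverse-engineered wireless protocol:
#   2-byte sync word | 1-byte opcode | 1-byte payload length | payload | CRC-32
SYNC_WORD = b"\xAA\x55"          # assumed preamble, not the real one
OPCODE_SET_PATIENT_NAME = 0x21   # made-up opcode for illustration

def build_frame(opcode: int, payload: bytes) -> bytes:
    """Assemble a command frame and append a CRC over the body."""
    body = SYNC_WORD + struct.pack("BB", opcode, len(payload)) + payload
    crc = binascii.crc32(body) & 0xFFFFFFFF
    return body + struct.pack(">I", crc)

if __name__ == "__main__":
    frame = build_frame(OPCODE_SET_PATIENT_NAME, b"JOHN DOE")
    # A real attack would hand these bytes to an SDR transmitter tuned to the
    # device's frequency; here we only print them.
    print(frame.hex())
```

The point of the sketch is only that once the framing and checksum are known, forging a valid-looking command is a few lines of code; the hard part the researchers did was recovering that framing in the first place.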
Okay, let me shift gears and show you another target. I'm going to show you a few different targets like this, and that's my talk. So we'll look at automobiles. This is a car, and it has a lot of components, a lot of electronics in it today. In fact, it's got many, many different computers inside of it, more Pentiums than my lab did when I was in college, and they're connected by a wired network. There's also a wireless network in the car, which can be reached in many different ways. So there's Bluetooth, there's the FM and XM radio, there's actually wi-fi, there's sensors in the wheels that wirelessly communicate the tire pressure to a controller on board. The modern car is a sophisticated multi-computer device.

And what happens if somebody wanted to attack this? Well, that's what the researchers that I'm going to talk about today did. They basically stuck an attacker on the wired network and on the wireless network. Now, they have two areas they can attack. One is short-range wireless, where you can actually communicate with the device from nearby, either through Bluetooth or wi-fi, and the other is long-range, where you can communicate with the car through the cellular network, or through one of the radio stations. Think about it. When a car receives a radio signal, it's processed by software. That software has to receive and decode the radio signal, and then figure out what to do with it, even if it's just music that it needs to play on the radio, and that software that does that decoding, if it has any bugs in it, could create a vulnerability for somebody to hack the car.

The way that the researchers did this work is, they read the software in the computer chips that were in the car, and then they used sophisticated reverse engineering tools to figure out what that software did, and then they found vulnerabilities in that software, and then they built exploits to exploit those. They actually carried out their attack in real life. They bought two cars, and I guess they have better budgets than I do.
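"Reading the software in the computer chips" usually begins with something unglamorous: dump the firmware image off the flash and look for printable strings and other recognizable patterns before moving into a disassembler. Here is a minimal sketch of that first pass, with the file name and the fallback bytes as placeholders rather than anything from the actual study.

```python
import re
import sys

def extract_strings(blob: bytes, min_len: int = 6):
    """Yield printable ASCII runs, a common first pass over a firmware dump."""
    for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, blob):
        yield match.start(), match.group().decode("ascii")

if __name__ == "__main__":
    # Placeholder path: a raw dump read off an ECU's flash chip.
    path = sys.argv[1] if len(sys.argv) > 1 else "ecu_firmware.bin"
    try:
        blob = open(path, "rb").read()
    except FileNotFoundError:
        # Fall back to a tiny synthetic blob so the sketch runs anywhere.
        blob = b"\x00\x01DIAG MODE ENABLED\x00\xff\xfeUnlockDoors v1.2\x00"
    for offset, text in extract_strings(blob):
        print(f"0x{offset:08x}  {text}")
```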
The first threat model was to see what someone could do if an attacker actually got access to the internal network on the car. Okay, so think of that as, someone gets to go to your car, they get to mess around with it, and then they leave, and now, what kind of trouble are you in? The other threat model is that they contact you in real time over one of the wireless networks like the cellular, or something like that, never having actually gotten physical access to your car.

This is what their setup looks like for the first model, where you get to have access to the car. They put a laptop in the car, connected it to the diagnostic unit on the in-car network, and they did all kinds of silly things, like here's a picture of the speedometer showing 140 miles an hour when the car's in park. Once you have control of the car's computers, you can do anything. Now you might say, "Okay, that's silly." Well, what if you make the car always say it's going 20 miles an hour slower than it's actually going? You might produce a lot of speeding tickets.

Then they went out to an abandoned airstrip with two cars, the target victim car and the chase car, and they launched a bunch of other attacks. One of the things they were able to do from the chase car is apply the brakes on the other car, simply by hacking the computer. They were able to disable the brakes. They also were able to install malware that wouldn't kick in and wouldn't trigger until the car was doing something like going over 20 miles an hour, or something like that.

The results are astonishing, and when they gave this talk, even though they gave this talk at a conference to a bunch of computer security researchers, everybody was gasping. They were able to take over a bunch of critical computers inside the car: the brakes computer, the lighting computer, the engine, the dash, the radio, etc., and they were able to perform these attacks on real commercial cars that they purchased, using the radio network. They were able to compromise every single one of the pieces of software that controlled every single one of the wireless capabilities of the car. All of these were implemented successfully.

How would you steal a car in this model? Well, you compromise the car through a buffer overflow vulnerability in the software, something like that. You use the GPS in the car to locate it. You remotely unlock the doors through the computer that controls that, start the engine, bypass anti-theft, and you've got yourself a car.
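The silly speedometer demonstration comes down to injecting frames onto the car's internal CAN bus through the diagnostic port. Here is a rough sketch of packing one raw CAN frame, assuming a Linux SocketCAN interface; the arbitration ID and the payload scaling are invented for illustration, since the real ones vary by manufacturer and are exactly what the researchers had to reverse engineer.

```python
import socket
import struct

# Entirely assumed values: every manufacturer uses different CAN IDs and
# payload encodings, which is what the researchers reverse engineered.
CLUSTER_SPEED_ID = 0x3E9
SPEED_MPH = 140

def pack_can_frame(can_id: int, data: bytes) -> bytes:
    """Pack a classic CAN frame the way Linux SocketCAN expects it."""
    return struct.pack("<IB3x8s", can_id, len(data), data.ljust(8, b"\x00"))

if __name__ == "__main__":
    payload = struct.pack(">H6x", SPEED_MPH * 100)  # hypothetical scaling
    frame = pack_can_frame(CLUSTER_SPEED_ID, payload)
    print("frame:", frame.hex())
    # On a Linux machine with a (virtual) CAN interface you could actually
    # put it on the bus; guarded so the sketch still runs elsewhere.
    try:
        s = socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW)
        s.bind(("vcan0",))
        s.send(frame)
    except (AttributeError, OSError):
        print("no CAN interface available; frame not sent")
```

On a real car the hard part is not packing the frame but learning which IDs and encodings the instrument cluster, brakes, and engine controllers listen for.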
Surveillance was really interesting. The authors of the study have a video where they show themselves taking over a car and then turning on the microphone in the car, and listening in on the car while tracking it via GPS on a map, and so that's something that the drivers of the car would never know was happening. Am I scaring you yet?

I've got a few more of these interesting ones. These are ones where I went to a conference, and my mind was just blown, and I said, "I have to share this with other people." This was Fabian Monrose's lab at the University of North Carolina, and what they did was something intuitive once you see it, but kind of surprising. They videotaped people on a bus, and then they post-processed the video. What you see here in number one is a reflection in somebody's glasses of the smartphone that they're typing on. They wrote software to stabilize -- even though they were on a bus and maybe someone's holding their phone at an angle -- to stabilize the phone, process it, and you may know on your smartphone, when you type a password, the keys pop out a little bit, and they were able to use that to reconstruct what the person was typing, and they had a language model for detecting typing.

What was interesting is, by videotaping on a bus, they were able to produce exactly what people on their smartphones were typing, and then they had a surprising result, which is that their software had not only done it for their target, but also for other people who accidentally happened to be in the picture: they were able to produce what those people had been typing, and that was kind of an accidental artifact of what their software was doing.
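The last stage is where the language model earns its keep: the stabilized video only narrows each tap down to a few plausible keys, and the model picks the text that best explains those noisy observations. Here is a toy version of that idea, with a tiny hard-coded word list standing in for a real language model and hand-made candidate sets standing in for the vision output; it is not the researchers' pipeline, just the shape of the reasoning.

```python
# Toy stand-in for the language-model stage: the vision pipeline gives a set
# of plausible keys for each tap (neighboring keys, blur, etc.), and we score
# dictionary words against those candidate sets.
WORDS = ["password", "passport", "passenger", "hello", "parkland"]

def score(word: str, candidates: list) -> float:
    """Fraction of positions where the word's letter is among the candidates."""
    if len(word) != len(candidates):
        return 0.0
    hits = sum(ch in cand for ch, cand in zip(word, candidates))
    return hits / len(word)

if __name__ == "__main__":
    # Hypothetical vision output for an 8-tap sequence: each tap could be one
    # of a few neighboring keys.
    taps = [{"p", "o"}, {"a", "s"}, {"s", "d"}, {"s", "a"},
            {"w", "e"}, {"o", "p"}, {"r", "t"}, {"d", "f"}]
    ranked = sorted(WORDS, key=lambda w: score(w, taps), reverse=True)
    for w in ranked:
        print(f"{score(w, taps):.2f}  {w}")
```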
I'll show you two more. One is P25 radios. P25 radios are used by law enforcement and all kinds of government agencies and people in combat to communicate, and there's an encryption option on these phones. This is what the phone looks like. It's not really a phone. It's more of a two-way radio. Motorola makes the most widely used one, and you can see that they're used by Secret Service, they're used in combat, it's a very, very common standard in the U.S. and elsewhere. So one question the researchers asked themselves is, could you block this thing, right? Could you run a denial-of-service, because these are first responders? So, would a terrorist organization want to black out the ability of police and fire to communicate at an emergency? They found that there's this GirlTech device used for texting that happens to operate at the same exact frequency as the P25, and they built what they called My First Jammer. (Laughter)

If you look closely at this device, it's got a switch for encryption or cleartext. Let me advance the slide, and now I'll go back. You see the difference? This is plain text. This is encrypted. There's one little dot that shows up on the screen, and one little tiny turn of the switch. And so the researchers asked themselves, "I wonder how many times very secure, important, sensitive conversations are happening on these two-way radios where they forget to encrypt and they don't notice that they didn't encrypt?"

So they bought a scanner. These are perfectly legal and they run at the frequency of the P25, and what they did is they hopped around frequencies and they wrote software to listen in. If they found encrypted communication, they stayed on that channel and they wrote down, that's a channel that these people communicate in, these law enforcement agencies, and they went to 20 metropolitan areas and listened in on conversations that were happening at those frequencies. They found that in every metropolitan area, they would capture over 20 minutes a day of cleartext communication.

And what kind of things were people talking about? Well, they found the names and information about confidential informants. They found information that was being recorded in wiretaps, a bunch of crimes that were being discussed, sensitive information. It was mostly law enforcement and criminal. They went and reported this to the law enforcement agencies, after anonymizing it, and the vulnerability here is simply the user interface wasn't good enough. If you're talking about something really secure and sensitive, it should be really clear to you that this conversation is encrypted. That one's pretty easy to fix.
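In software terms, the survey is just a loop: hop to a frequency, decide whether the traffic there is encrypted, and tally the cleartext you hear. Here is a skeletal sketch of that loop, with the radio capture and the encrypted-or-not test replaced by random stand-ins, since the real versions sit on top of scanner hardware and P25 framing; the frequencies listed are arbitrary examples.

```python
import random
from collections import defaultdict

# Stand-ins for the pieces that need real hardware and P25 framing:
# capture() would pull a burst of traffic from the scanner, and
# is_encrypted() would inspect the P25 header bits.
FREQUENCIES_MHZ = [851.0125, 852.3375, 853.7625, 854.9875]

def capture(freq_mhz: float, seconds: int = 30) -> dict:
    return {"freq": freq_mhz, "seconds": seconds,
            "active": random.random() < 0.4,
            "encrypted": random.random() < 0.7}

def is_encrypted(burst: dict) -> bool:
    return burst["encrypted"]

if __name__ == "__main__":
    cleartext_seconds = defaultdict(int)
    for _ in range(200):                       # one monitoring session
        freq = random.choice(FREQUENCIES_MHZ)  # hop around the band
        burst = capture(freq)
        if burst["active"] and not is_encrypted(burst):
            cleartext_seconds[freq] += burst["seconds"]
    for freq, secs in sorted(cleartext_seconds.items()):
        print(f"{freq:.4f} MHz: {secs/60:.1f} min of cleartext")
```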
The last one I thought was really, really cool, and I just had to show it to you. It's probably not something that you're going to lose sleep over like the cars or the defibrillators, but it's stealing keystrokes. Now, we've all looked at smartphones upside down. Every security expert wants to hack a smartphone, and we tend to look at the USB port, the GPS for tracking, the camera, the microphone, but no one up till this point had looked at the accelerometer. The accelerometer is the thing that determines the vertical orientation of the smartphone.

And so they had a simple setup. They put a smartphone next to a keyboard, and they had people type, and then their goal was to use the vibrations that were created by typing to measure the change in the accelerometer reading to determine what the person had been typing. Now, when they tried this on an iPhone 3GS, this is a graph of the perturbations that were created by the typing, and you can see that it's very difficult to tell when somebody was typing or what they were typing, but the iPhone 4 greatly improved the accelerometer, and so the same measurement produced this graph. Now that gave you a lot of information while someone was typing, and what they did then is used advanced artificial intelligence techniques called machine learning to have a training phase, and so they got most likely grad students to type in a whole lot of things, and to learn, to have the system use the machine learning tools that were available to learn what it is that the people were typing and to match that up with the measurements in the accelerometer. And then there's the attack phase, where you get somebody to type something in, you don't know what it was, but you use your model that you created in the training phase to figure out what they were typing.

They had pretty good success. This is an article from the USA Today. They typed in, "The Illinois Supreme Court has ruled that Rahm Emanuel is eligible to run for Mayor of Chicago" -- see, I tied it in to the last talk -- "and ordered him to stay on the ballot." Now, the system is interesting, because it produced "Illinois Supreme" and then it wasn't sure. The model produced a bunch of options, and this is the beauty of some of the A.I. techniques, is that computers are good at some things, humans are good at other things, take the best of both and let the humans solve this one. Don't waste computer cycles. A human's not going to think it's the "Supreme might." It's the "Supreme Court," right? And so, together we're able to reproduce typing simply by measuring the accelerometer.
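A stripped-down version of that training-then-attack pipeline is easy to sketch. The one below uses scikit-learn and synthetic accelerometer windows as a stand-in for real recordings, and only classifies which side of the keyboard a keystroke came from; the published work was considerably more elaborate, pairing key-region classification with a language model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the real data: each "keystroke" is a short window of
# accelerometer samples (x, y, z), labeled by which side of the keyboard it
# came from. The vibration asymmetry below is invented for illustration.
rng = np.random.default_rng(0)

def synth_window(side: str, n: int = 32) -> np.ndarray:
    bias = -0.3 if side == "left" else 0.3
    return rng.normal(loc=bias, scale=1.0, size=(n, 3)).ravel()

def make_dataset(per_class: int = 200):
    X, y = [], []
    for side in ("left", "right"):
        for _ in range(per_class):
            X.append(synth_window(side))
            y.append(side)
    return np.array(X), np.array(y)

if __name__ == "__main__":
    # Training phase: volunteers (or grad students) type known text.
    X_train, y_train = make_dataset()
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    # Attack phase: classify windows captured while the victim types.
    X_attack, y_true = make_dataset(per_class=50)
    print("accuracy:", (clf.predict(X_attack) == y_true).mean())
```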
Why does this matter? Well, in the Android platform, for example, the developers have a manifest where every device on there, the microphone, etc., has to register if you're going to use it so that hackers can't take over it, but nobody controls the accelerometer. So what's the point? You can leave your iPhone next to someone's keyboard, and just leave the room, and then later recover what they did, even without using the microphone. If someone is able to put malware on your iPhone, they could then maybe get the typing that you do whenever you put your iPhone next to your keyboard.

There's several other notable attacks that unfortunately I don't have time to go into, but the one that I wanted to point out was a group from the University of Michigan which was able to take voting machines, the Sequoia AVC Edge DREs that were going to be used in New Jersey in the election and that were left in a hallway, and put Pac-Man on them. So they ran the Pac-Man game.

What does this all mean? Well, I think that society tends to adopt technology really quickly. I love the next coolest gadget. But it's very important, and these researchers are showing, that the developers of these things need to take security into account from the very beginning, and need to realize that they may have a threat model, but the attackers may not be nice enough to limit themselves to that threat model, and so you need to think outside of the box. What we can do is be aware that devices can be compromised, and anything that has software in it is going to be vulnerable. It's going to have bugs.

Thank you very much. (Applause)