All your devices can be hacked
I'm a computer science professor, and my area of expertise is computer and information security. When I was in graduate school, I had the opportunity to overhear my grandmother describing to one of her fellow senior citizens what I did for a living. Apparently, I was in charge of making sure that no one stole the computers from the university. (Laughter) And, you know, that's a perfectly reasonable thing for her to think, because I told her I was working in computer security, and it was interesting to get her perspective.

But that's not the most ridiculous thing I've ever heard anyone say about my work. The most ridiculous thing I ever heard was at a dinner party, where a woman heard that I work in computer security. She said her computer had been infected by a virus, and she was very concerned that she might get sick from it, that she could catch this virus. (Laughter) And I'm not a doctor, but I reassured her that it was very, very unlikely that this would happen, and that if she felt more comfortable, she was free to wear latex gloves when she was on the computer, and there would be no harm whatsoever in that.

I'm going to get back to this notion of being able to get a virus from your computer, in a serious way. What I'm going to talk to you about today are some hacks, some real-world cyberattacks that people in my community, the academic research community, have performed, which I don't think most people know about. I think they're very interesting and scary, and this talk is kind of a greatest hits of the academic security community's hacks. None of the work is my own; it's all work that my colleagues have done, and I actually asked them for their slides and incorporated them into this talk.
-
So the first one I'm going to talk about is implanted medical devices. Medical devices have come a long way technologically. You can see that in 1926 the first pacemaker was invented, and in 1960 the first internal pacemaker was implanted, hopefully a little smaller than the one you see there, and the technology has continued to move forward. In 2006, we hit an important milestone from the perspective of computer security. And why do I say that? Because that's when implanted devices inside of people started to have networking capabilities. One thing that brings this close to home is Dick Cheney's device: he had a device that pumped blood from the aorta to another part of the heart, and as you can see at the bottom there, it was controlled by a computer. If you ever thought that software liability was very important, get one of these inside of you.

Now, what a research team did was get their hands on what's called an ICD. This is a defibrillator, a device that goes into a person to control their heart rhythm, and these have saved many lives. Well, in order not to have to open up the person every time you want to reprogram their device or run some diagnostics on it, they made the thing able to communicate wirelessly. What this research team did was reverse engineer the wireless protocol, and they built the device you see pictured here, with a little antenna, that could speak the protocol to the device and thus control it. In order to make their experiment realistic (they were unable to find any volunteers), they got some ground beef and some bacon, wrapped it all up to about the size of the area of a human being where the device would go, and stuck the device inside it to perform their experiment somewhat realistically.

They launched many, many successful attacks. One that I'll highlight here is changing the patient's name. I don't know why you would want to do that, but I sure wouldn't want it done to me. They were also able to change therapies, including disabling the device, and this is with a real, commercial, off-the-shelf device, simply by performing reverse engineering and sending wireless signals to it. There was a piece on NPR reporting that some of these ICDs could actually have their performance disrupted simply by holding a pair of headphones up to them.

Now, wireless and the Internet can improve health care greatly. There are several examples up on the screen of situations where doctors are looking to implant devices inside of people, and for all of these devices, it's now standard that they communicate wirelessly, and I think this is great. But without a full understanding of trustworthy computing, and without understanding what attackers can do and considering the security risks from the beginning, there's a lot of danger in this.
-
Okay, let me shift gears and show you another target. I'm going to show you a few different targets like this, and that's my talk. So we'll look at automobiles. This is a car, and it has a lot of components, a lot of electronics in it today. In fact, it's got many, many different computers inside of it, more Pentiums than my lab had when I was in college, and they're connected by a wired network. There's also a wireless network in the car, which can be reached in many different ways: there's Bluetooth, there's FM and XM radio, there's actually Wi-Fi, and there are sensors in the wheels that wirelessly communicate the tire pressure to a controller on board. The modern car is a sophisticated multi-computer device.

And what happens if somebody wanted to attack this? Well, that's what the researchers I'm going to talk about today did. They basically put an attacker on the wired network and on the wireless network. Now, they have two areas they can attack. One is short-range wireless, where you can actually communicate with the device from nearby, either through Bluetooth or Wi-Fi, and the other is long-range, where you can communicate with the car through the cellular network, or through one of the radio stations. Think about it: when a car receives a radio signal, it's processed by software. That software has to receive and decode the radio signal and then figure out what to do with it, even if it's just music that it needs to play on the radio, and if that decoding software has any bugs in it, that could create a vulnerability for somebody to hack the car.

The way the researchers did this work is that they read the software off the computer chips that were in the car, used sophisticated reverse engineering tools to figure out what that software did, found vulnerabilities in that software, and then built exploits for them. They actually carried out their attack in real life. They bought two cars, so I guess they have better budgets than I do.

The first threat model was to see what someone could do if an attacker actually got access to the internal network of the car. Think of that as: someone gets to go to your car, they get to mess around with it, and then they leave, and now, what kind of trouble are you in? The other threat model is that they contact you in real time over one of the wireless networks, like the cellular network or something like that, never having actually gotten physical access to your car.

This is what their setup looks like for the first model, where you get to have access to the car. They took a laptop and connected it to the diagnostic unit on the in-car network, and they did all kinds of silly things. Here's a picture of the speedometer showing 140 miles an hour while the car's in park. Once you have control of the car's computers, you can do anything. Now you might say, "Okay, that's silly." Well, what if you make the car always report that it's going 20 miles an hour slower than it's actually going? You might produce a lot of speeding tickets.

Then they went out to an abandoned airstrip with two cars, the target victim car and the chase car, and they launched a bunch of other attacks. One of the things they were able to do from the chase car was apply the brakes on the other car, simply by hacking its computer. They were able to disable the brakes. They were also able to install malware that wouldn't kick in, wouldn't trigger, until the car was doing something like going over 20 miles an hour.

The results are astonishing. When they gave this talk, even though they gave it at a conference to a bunch of computer security researchers, everybody was gasping. They were able to take over a bunch of critical computers inside the car: the brakes computer, the lighting computer, the engine, the dash, the radio, and so on, and they were able to perform these attacks on real commercial cars that they purchased, using the radio network. They were able to compromise every single one of the pieces of software that controlled every single one of the wireless capabilities of the car. All of these attacks were implemented successfully.

How would you steal a car in this model? Well, you compromise the car through a buffer overflow vulnerability in the software, something like that. You use the GPS in the car to locate it. You remotely unlock the doors through the computer that controls them, start the engine, bypass the anti-theft system, and you've got yourself a car.

Surveillance was really interesting. The authors of the study have a video where they show themselves taking over a car, turning on the microphone in the car, and listening in on it while tracking it via GPS on a map, and that's something the drivers of the car would never know was happening.
-
Am I scaring you yet? I've got a few more of these interesting ones. These are ones where I went to a conference, and my mind was just blown, and I said, "I have to share this with other people." This was Fabian Monrose's lab at the University of North Carolina, and what they did was something intuitive once you see it, but kind of surprising. They videotaped people on a bus, and then they post-processed the video. What you see here in number one is the reflection, in somebody's glasses, of the smartphone they're typing on. They wrote software to stabilize the image of the phone, even though they were on a bus and maybe someone's holding their phone at an angle, and to process it. You may know that on your smartphone, when you type a password, the keys pop out a little bit, and they were able to use that to reconstruct what the person was typing, together with a language model for detecting typing.

What was interesting is that, by videotaping on a bus, they were able to reproduce exactly what people on their smartphones were typing. Then they had a surprising result: their software had done it not only for their target, but also for other people who accidentally happened to be in the picture, and they were able to reproduce what those people had been typing too. That was kind of an accidental artifact of what their software was doing.
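The researchers' actual pipeline isn't reproduced here, but the core idea of combining noisy per-key detections with a language model can be sketched. In this toy version (the word list, candidate characters, and confidence numbers are all invented for illustration), each key press yields a set of possible characters with confidences, and a dictionary word is scored by how well its letters match:

```python
# Toy sketch: recover a typed word from ambiguous per-position character
# guesses by scoring dictionary words against the detections. The word
# list and confidence values are invented stand-ins for what the video
# analysis would produce.

WORDS = ["illinois", "supreme", "court", "might", "mayor"]

def best_word(candidates):
    """candidates: one {char: confidence} dict per key press.
    Return the dictionary word whose letters best match the detections."""
    def score(word):
        if len(word) != len(candidates):
            return float("-inf")  # rule out words of the wrong length
        return sum(pos.get(ch, 0.0) for pos, ch in zip(candidates, word))
    return max(WORDS, key=score)

# Five ambiguous key presses: each position has the likely character
# plus a visually confusable alternative.
detections = [{"c": 0.6, "m": 0.4}, {"o": 0.7, "i": 0.3},
              {"u": 0.5, "g": 0.5}, {"r": 0.8, "h": 0.2}, {"t": 0.9}]
print(best_word(detections))  # the language model prefers "court"
```

The same scoring idea is what lets a dictionary resolve detections that a per-key classifier alone cannot.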
-
I'll show you two more. One is P25 radios. P25 radios are used by law enforcement, all kinds of government agencies, and people in combat to communicate, and there's an encryption option on these radios. This is what the device looks like. It's not really a phone; it's more of a two-way radio. Motorola makes the most widely used one, and you can see that they're used by the Secret Service and in combat; it's a very, very common standard in the U.S. and elsewhere.

So one question the researchers asked themselves is, could you block this thing? Could you run a denial of service, given that these are first responders? Would a terrorist organization want to black out the ability of police and fire to communicate at an emergency? They found that there's this GirlTech device used for texting that happens to operate at the exact same frequency as the P25, and they built what they called My First Jammer. (Laughter)

If you look closely at this device, it's got a switch for encryption or cleartext. Let me advance the slide, and now I'll go back. Do you see the difference? This is plain text; this is encrypted. There's one little dot that shows up on the screen, and one little tiny turn of the switch. And so the researchers asked themselves, "I wonder how many times very secure, important, sensitive conversations are happening on these two-way radios where they forget to encrypt and don't notice that they didn't encrypt?"

So they bought a scanner. These are perfectly legal, and they run at the frequency of the P25. What they did is hop around frequencies, and they wrote software to listen in. If they found encrypted communication, they stayed on that channel and wrote it down as a channel that these people, these law enforcement agencies, communicate on, and then they went to 20 metropolitan areas and listened in on conversations happening at those frequencies. They found that in every metropolitan area, they would capture over 20 minutes a day of cleartext communication.

And what kind of things were people talking about? Well, they found the names of, and information about, confidential informants. They found information that was being recorded in wiretaps, a bunch of crimes that were being discussed, sensitive information. It was mostly law enforcement and criminal matters. They went and reported this to the law enforcement agencies, after anonymizing it. The vulnerability here is simply that the user interface wasn't good enough: if you're talking about something really secure and sensitive, it should be really clear to you whether the conversation is encrypted. That one's pretty easy to fix.
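The study itself used commercial scanners and real airwaves; as a purely simulated sketch of the hop-listen-log loop described above, here is the bookkeeping side of it. The frequencies, capture records, and their fields are invented stand-ins:

```python
# Simulated sketch: sweep captured traffic, mark channels where encrypted
# P25 traffic appears (i.e., channels sensitive agencies use), then tally
# how much cleartext was heard on those same channels. All data is made up.

# Each capture record: (frequency_MHz, encrypted?, seconds_of_audio)
CAPTURES = [
    (851.0125, True, 40), (851.0125, False, 90),
    (852.5000, False, 30), (853.7375, True, 120),
    (851.0125, False, 75),
]

def survey(captures):
    # Channels that carried encrypted traffic at least once.
    active = {f for f, enc, _ in captures if enc}
    # Cleartext airtime observed on those "secure" channels: the slip-ups.
    cleartext = {}
    for f, enc, secs in captures:
        if f in active and not enc:
            cleartext[f] = cleartext.get(f, 0) + secs
    return active, cleartext

active, leaks = survey(CAPTURES)
print(sorted(active))  # channels known to carry encrypted traffic
print(leaks)           # seconds of cleartext overheard on those channels
```

Aggregating per channel like this is what let the researchers quantify the "over 20 minutes a day" figure per metropolitan area.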
-
The last one I thought was really, really cool, and I just had to show it to you. It's probably not something you're going to lose sleep over like the cars or the defibrillators, but it's stealing keystrokes. Now, we've all looked at smartphones upside down. Every security expert wants to hack a smartphone, and we tend to look at the USB port, the GPS for tracking, the camera, the microphone, but no one up to this point had looked at the accelerometer. The accelerometer is the sensor that determines the orientation of the smartphone.

So they had a simple setup. They put a smartphone next to a keyboard and had people type, and their goal was to use the vibrations created by typing, as measured by changes in the accelerometer reading, to determine what the person had been typing. Now, when they tried this on an iPhone 3GS, this is a graph of the perturbations created by the typing, and you can see that it's very difficult to tell when somebody was typing or what they were typing. But the iPhone 4 greatly improved the accelerometer, and the same measurement produced this graph. That gave you a lot of information while someone was typing, and what they did then was use machine learning techniques to set up a training phase. They got, most likely, grad students to type in a whole lot of things, and had the system use available machine learning tools to learn what it was the people were typing and to match that up with the measurements from the accelerometer. And then there's the attack phase, where you get somebody to type something in, you don't know what it was, but you use the model you created in the training phase to figure out what they were typing.

They had pretty good success. This is an article from USA Today. They typed in, "The Illinois Supreme Court has ruled that Rahm Emanuel is eligible to run for Mayor of Chicago" (see, I tied it in to the last talk) "and ordered him to stay on the ballot." Now, the system is interesting, because it produced "Illinois Supreme" and then it wasn't sure. The model produced a bunch of options, and this is the beauty of some of these A.I. techniques: computers are good at some things, humans are good at other things, so take the best of both and let the humans solve this one. Don't waste computer cycles. A human's not going to think it's the Supreme "might"; it's the Supreme Court, right? And so, together, they were able to reproduce typing simply by measuring the accelerometer.

Why does this matter? Well, on the Android platform, for example, developers have a manifest where every device on the phone, the microphone and so on, has to be registered if you're going to use it, so that hackers can't take it over, but nobody controls the accelerometer. So what's the point? You could leave your iPhone next to someone's keyboard, just leave the room, and then later recover what they typed, even without using the microphone. If someone is able to put malware on your iPhone, they could then maybe capture the typing that you do whenever you put your iPhone next to your keyboard.
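The two-phase structure described above (a training phase that learns what each key's vibration looks like, then an attack phase that matches unknown readings against those profiles) can be sketched with a deliberately minimal nearest-centroid classifier. The two-number "feature vectors" here are invented; the real work extracted far richer features from the accelerometer stream and used more capable models:

```python
# Minimal sketch of the train/attack split: learn a mean feature vector
# ("profile") per key from labeled samples, then classify a fresh reading
# by its nearest profile. Feature values are invented for illustration.

def train(samples):
    """samples: list of (key, feature_vector). Return mean vector per key."""
    sums, counts = {}, {}
    for key, vec in samples:
        acc = sums.setdefault(key, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[key] = counts.get(key, 0) + 1
    return {k: [v / counts[k] for v in vec] for k, vec in sums.items()}

def classify(model, vec):
    """Attack phase: return the key whose profile is nearest (squared
    Euclidean distance) to the observed reading."""
    def dist(key):
        return sum((a - b) ** 2 for a, b in zip(model[key], vec))
    return min(model, key=dist)

# Training phase: labeled vibration features, e.g. from grad students typing.
training = [("q", [0.1, 0.9]), ("q", [0.2, 1.0]),
            ("p", [0.9, 0.1]), ("p", [1.0, 0.2])]
model = train(training)

# Attack phase: an unlabeled reading close to the "q" profile.
print(classify(model, [0.15, 0.95]))
```

Per-key classification like this only gets you noisy character guesses; it's the language model and, as the talk notes, a human in the loop that turn those guesses into readable text.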
-
There are several other notable attacks that unfortunately I don't have time to go into, but the one I wanted to point out was a group from the University of Michigan that was able to take voting machines, the Sequoia AVC Edge DREs that were going to be used in New Jersey in the election and had been left in a hallway, and put Pac-Man on them. So they ran the Pac-Man game.

What does this all mean? Well, I think that society tends to adopt technology really quickly. I love the next coolest gadget. But it's very important, and these researchers are showing this, that the developers of these things need to take security into account from the very beginning, and need to realize that they may have a threat model, but the attackers may not be nice enough to limit themselves to that threat model, and so you need to think outside of the box. What we can do is be aware that devices can be compromised, and that anything that has software in it is going to be vulnerable: it's going to have bugs.

Thank you very much. (Applause)
Title: All your devices can be hacked
Speaker: Avi Rubin
Description: Could someone hack your pacemaker? At TEDxMidAtlantic, Avi Rubin explains how hackers are compromising cars, smartphones and medical devices, and warns us about the dangers of an increasingly hack-able world. (Filmed at TEDxMidAtlantic.)
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 16:36