
All your devices can be hacked

  • 0:01 - 0:04
    I'm a computer science professor,
  • 0:04 - 0:06
    and my area of expertise is
  • 0:06 - 0:08
    computer and information security.
  • 0:08 - 0:10
    When I was in graduate school,
  • 0:10 - 0:13
    I had the opportunity to overhear my grandmother
  • 0:13 - 0:17
    describing to one of her fellow senior citizens
  • 0:17 - 0:20
    what I did for a living.
  • 0:20 - 0:23
    Apparently, I was in charge of making sure that
  • 0:23 - 0:27
    no one stole the computers from the university. (Laughter)
  • 0:27 - 0:30
    And, you know, that's a perfectly reasonable thing
  • 0:30 - 0:32
    for her to think, because I told her I was working
  • 0:32 - 0:33
    in computer security,
  • 0:33 - 0:37
    and it was interesting to get her perspective.
  • 0:37 - 0:39
    But that's not the most ridiculous thing I've ever heard
  • 0:39 - 0:41
    anyone say about my work.
  • 0:41 - 0:44
    The most ridiculous thing I ever heard is,
  • 0:44 - 0:47
    I was at a dinner party, and a woman heard
  • 0:47 - 0:49
    that I work in computer security,
  • 0:49 - 0:52
    and she asked me if -- she said her computer had been
  • 0:52 - 0:56
    infected by a virus, and she was very concerned that she
  • 0:56 - 1:00
    might get sick from it, that she could get this virus. (Laughter)
  • 1:00 - 1:02
    And I'm not a doctor, but I reassured her
  • 1:02 - 1:06
    that it was very, very unlikely that this would happen,
  • 1:06 - 1:08
    but if she felt more comfortable, she should feel free to use
  • 1:08 - 1:10
    latex gloves when she was on the computer,
  • 1:10 - 1:14
    and there would be no harm whatsoever in that.
  • 1:14 - 1:16
    I'm going to get back to this notion of being able to get
  • 1:16 - 1:20
    a virus from your computer, in a serious way.
  • 1:20 - 1:21
    What I'm going to talk to you about today
  • 1:21 - 1:26
    are some hacks, some real world cyberattacks that people
  • 1:26 - 1:29
    in my community, the academic research community,
  • 1:29 - 1:32
    have performed, which I don't think
  • 1:32 - 1:33
    most people know about,
  • 1:33 - 1:36
    and I think they're very interesting and scary,
  • 1:36 - 1:38
    and this talk is kind of a greatest hits
  • 1:38 - 1:41
    of the academic security community's hacks.
  • 1:41 - 1:43
    None of the work is my work. It's all work
  • 1:43 - 1:45
    that my colleagues have done, and I actually asked them
  • 1:45 - 1:48
    for their slides and incorporated them into this talk.
  • 1:48 - 1:50
    So the first one I'm going to talk about
  • 1:50 - 1:52
    are implanted medical devices.
  • 1:52 - 1:55
    Now medical devices have come a long way technologically.
  • 1:55 - 1:59
    You can see in 1926 the first pacemaker was invented.
  • 1:59 - 2:03
    1960, the first internal pacemaker was implanted,
  • 2:03 - 2:05
    hopefully a little smaller than that one that you see there,
  • 2:05 - 2:08
    and the technology has continued to move forward.
  • 2:08 - 2:13
    In 2006, we hit an important milestone from the perspective
  • 2:13 - 2:16
    of computer security.
  • 2:16 - 2:17
    And why do I say that?
  • 2:17 - 2:20
    Because that's when implanted devices inside of people
  • 2:20 - 2:23
    started to have networking capabilities.
  • 2:23 - 2:25
    One thing that brings us close to home is we look
  • 2:25 - 2:28
    at Dick Cheney's device. He had a device that
  • 2:28 - 2:32
    pumped blood from his heart into his aorta,
  • 2:32 - 2:33
    and as you can see at the bottom there,
  • 2:33 - 2:36
    it was controlled by a computer controller,
  • 2:36 - 2:38
    and if you ever thought that software liability
  • 2:38 - 2:42
    was very important, get one of these inside of you.
  • 2:42 - 2:45
    Now what a research team did was they got their hands
  • 2:45 - 2:47
    on what's called an ICD.
  • 2:47 - 2:49
    This is a defibrillator, and this is a device
  • 2:49 - 2:53
    that goes into a person to control their heart rhythm,
  • 2:53 - 2:56
    and these have saved many lives.
  • 2:56 - 2:58
    Well, in order to not have to open up the person
  • 2:58 - 3:00
    every time you want to reprogram their device
  • 3:00 - 3:03
    or do some diagnostics on it, they made the thing be able
  • 3:03 - 3:06
    to communicate wirelessly, and what this research team did
  • 3:06 - 3:08
    is they reverse engineered the wireless protocol,
  • 3:08 - 3:10
    and they built the device you see pictured here,
  • 3:10 - 3:13
    with a little antenna, that could talk the protocol
  • 3:13 - 3:18
    to the device, and thus control it.
  • 3:18 - 3:20
    In order to make their experiment realistic -- they were unable
  • 3:20 - 3:23
    to find any volunteers, and so they went
  • 3:23 - 3:25
    and they got some ground beef and some bacon
  • 3:25 - 3:27
    and they wrapped it all up to about the size
  • 3:27 - 3:29
    of a human being's area where the device would go,
  • 3:29 - 3:31
    and they stuck the device inside it
  • 3:31 - 3:34
    to perform their experiment somewhat realistically.
  • 3:34 - 3:37
    They launched many, many successful attacks.
  • 3:37 - 3:40
    One that I'll highlight here is changing the patient's name.
  • 3:40 - 3:41
    I don't know why you would want to do that,
  • 3:41 - 3:43
    but I sure wouldn't want that done to me.
  • 3:43 - 3:46
    And they were able to change therapies,
  • 3:46 - 3:48
    including disabling the device -- and this is with a real,
  • 3:48 - 3:50
    commercial, off-the-shelf device --
  • 3:50 - 3:52
    simply by performing reverse engineering and sending
  • 3:52 - 3:55
    wireless signals to it.
  • 3:55 - 3:59
    There was a piece on NPR reporting that some of these ICDs
  • 3:59 - 4:01
    could actually have their performance disrupted
  • 4:01 - 4:05
    simply by holding a pair of headphones onto them.
  • 4:05 - 4:06
    Now, wireless and the Internet
  • 4:06 - 4:08
    can improve health care greatly.
  • 4:08 - 4:10
    There's several examples up on the screen
  • 4:10 - 4:13
    of situations where doctors are looking to implant devices
  • 4:13 - 4:16
    inside of people, and all of these devices now,
  • 4:16 - 4:19
    it's standard that they communicate wirelessly,
  • 4:19 - 4:20
    and I think this is great,
  • 4:20 - 4:23
    but without a full understanding of trustworthy computing,
  • 4:23 - 4:26
    and without understanding what attackers can do
  • 4:26 - 4:28
    and the security risks from the beginning,
  • 4:28 - 4:30
    there's a lot of danger in this.
  • 4:30 - 4:32
    Okay, let me shift gears and show you another target.
  • 4:32 - 4:34
    I'm going to show you a few different targets like this,
  • 4:34 - 4:37
    and that's my talk. So we'll look at automobiles.
  • 4:37 - 4:40
    This is a car, and it has a lot of components,
  • 4:40 - 4:41
    a lot of electronics in it today.
  • 4:41 - 4:46
    In fact, it's got many, many different computers inside of it,
  • 4:46 - 4:49
    more Pentiums than my lab did when I was in college,
  • 4:49 - 4:53
    and they're connected by a wired network.
  • 4:53 - 4:56
    There's also a wireless network in the car,
  • 4:56 - 4:59
    which can be reached in many different ways.
  • 4:59 - 5:03
    So there's Bluetooth, there's the FM and XM radio,
  • 5:03 - 5:06
    there's actually wi-fi, there's sensors in the wheels
  • 5:06 - 5:08
    that wirelessly communicate the tire pressure
  • 5:08 - 5:10
    to a controller on board.
  • 5:10 - 5:15
    The modern car is a sophisticated multi-computer device.
  • 5:15 - 5:18
    And what happens if somebody wanted to attack this?
  • 5:18 - 5:19
    Well, that's what the researchers
  • 5:19 - 5:21
    that I'm going to talk about today did.
  • 5:21 - 5:24
    They basically stuck an attacker on the wired network
  • 5:24 - 5:26
    and on the wireless network.
  • 5:26 - 5:29
    Now, they have two areas they can attack.
  • 5:29 - 5:31
    One is short-range wireless, where you can actually
  • 5:31 - 5:33
    communicate with the device from nearby,
  • 5:33 - 5:35
    either through Bluetooth or wi-fi,
  • 5:35 - 5:37
    and the other is long-range, where you can communicate
  • 5:37 - 5:39
    with the car through the cellular network,
  • 5:39 - 5:41
    or through one of the radio stations.
  • 5:41 - 5:44
    Think about it. When a car receives a radio signal,
  • 5:44 - 5:46
    it's processed by software.
  • 5:46 - 5:49
    That software has to receive and decode the radio signal,
  • 5:49 - 5:50
    and then figure out what to do with it,
  • 5:50 - 5:53
    even if it's just music that it needs to play on the radio,
  • 5:53 - 5:56
    and that software that does that decoding,
  • 5:56 - 5:59
    if it has any bugs in it, could create a vulnerability
  • 5:59 - 6:02
    for somebody to hack the car.
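
The point here is that any code that parses an over-the-air signal is attack surface. As a minimal sketch of that idea (hypothetical, not any real head unit's firmware), consider a decoder that trusts a length field embedded in the broadcast; in the memory-unsafe C of an embedded receiver, the same unchecked copy is a classic buffer overflow.

```python
FIXED_BUFFER_SIZE = 64  # pretend the firmware reserves 64 bytes for station text

def parse_station_text(frame: bytes) -> bytes:
    """Naive decoder for a made-up 'station text' frame: [1-byte length][text].

    The length byte arrives over the air, so an attacker controls it. Nothing
    below checks it against FIXED_BUFFER_SIZE -- in C firmware the equivalent
    memcpy overruns the buffer, which is the kind of decoding bug that turns a
    radio receiver into a way into the car's computers."""
    length = frame[0]                      # attacker-controlled
    buffer = bytearray(FIXED_BUFFER_SIZE)
    buffer[:length] = frame[1:1 + length]  # missing: if length > FIXED_BUFFER_SIZE: reject
    return bytes(buffer)
```
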
  • 6:02 - 6:05
    The way that the researchers did this work is,
  • 6:05 - 6:09
    they read the software in the computer chips
  • 6:09 - 6:12
    that were in the car, and then they used sophisticated
  • 6:12 - 6:14
    reverse engineering tools
  • 6:14 - 6:16
    to figure out what that software did,
  • 6:16 - 6:19
    and then they found vulnerabilities in that software,
  • 6:19 - 6:22
    and then they built exploits to exploit those.
  • 6:22 - 6:24
    They actually carried out their attack in real life.
  • 6:24 - 6:26
    They bought two cars, and I guess
  • 6:26 - 6:29
    they have better budgets than I do.
  • 6:29 - 6:31
    The first threat model was to see what someone could do
  • 6:31 - 6:33
    if an attacker actually got access
  • 6:33 - 6:35
    to the internal network on the car.
  • 6:35 - 6:38
    Okay, so think of that as, someone gets to go to your car,
  • 6:38 - 6:41
    they get to mess around with it, and then they leave,
  • 6:41 - 6:43
    and now, what kind of trouble are you in?
  • 6:43 - 6:46
    The other threat model is that they contact you
  • 6:46 - 6:49
    in real time over one of the wireless networks
  • 6:49 - 6:51
    like the cellular, or something like that,
  • 6:51 - 6:55
    never having actually gotten physical access to your car.
  • 6:55 - 6:57
    This is what their setup looks like for the first model,
  • 6:57 - 6:59
    where you get to have access to the car.
  • 6:59 - 7:03
    They took a laptop and connected it to the diagnostic unit
  • 7:03 - 7:05
    on the in-car network, and they did all kinds of silly things,
  • 7:05 - 7:08
    like here's a picture of the speedometer
  • 7:08 - 7:11
    showing 140 miles an hour when the car's in park.
  • 7:11 - 7:13
    Once you have control of the car's computers,
  • 7:13 - 7:14
    you can do anything.
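
Part of why this works is that in-car networks like CAN have no sender authentication: whatever shows up on the bus, the gauges and controllers act on. Here is a hedged sketch of the speedometer stunt using the python-can library on a Linux virtual CAN interface; the arbitration ID and payload layout are made up for illustration, not taken from the researchers' cars.

```python
import can

# Virtual CAN interface (set up with: ip link add dev vcan0 type vcan; ip link set up vcan0).
# On a real vehicle, this would be the diagnostic port the researchers plugged into.
bus = can.interface.Bus(channel="vcan0", bustype="socketcan")

# Hypothetical frame: ID 0x244 carrying vehicle speed in byte 3, in km/h.
# Real cars use manufacturer-specific IDs and encodings; these values are illustrative only.
spoofed_speed = can.Message(arbitration_id=0x244,
                            data=[0x00, 0x00, 0x00, 225, 0x00, 0x00, 0x00, 0x00],
                            is_extended_id=False)

# Any node on the bus will believe this frame -- there is no notion of "who sent it" --
# which is why a laptop on the diagnostic port can make a parked car read 140 mph.
bus.send(spoofed_speed)
```
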
  • 7:14 - 7:16
    Now you might say, "Okay, that's silly."
  • 7:16 - 7:18
    Well, what if you make the car always say
  • 7:18 - 7:20
    it's going 20 miles an hour slower than it's actually going?
  • 7:20 - 7:23
    You might produce a lot of speeding tickets.
  • 7:23 - 7:27
    Then they went out to an abandoned airstrip with two cars,
  • 7:27 - 7:30
    the target victim car and the chase car,
  • 7:30 - 7:32
    and they launched a bunch of other attacks.
  • 7:32 - 7:35
    One of the things they were able to do from the chase car
  • 7:35 - 7:37
    is apply the brakes on the other car,
  • 7:37 - 7:39
    simply by hacking the computer.
  • 7:39 - 7:41
    They were able to disable the brakes.
  • 7:41 - 7:44
    They also were able to install malware that wouldn't kick in
  • 7:44 - 7:47
    and wouldn't trigger until the car was doing something like
  • 7:47 - 7:50
    going over 20 miles an hour, or something like that.
  • 7:50 - 7:53
    The results are astonishing, and when they gave this talk,
  • 7:53 - 7:55
    even though they gave this talk at a conference
  • 7:55 - 7:57
    to a bunch of computer security researchers,
  • 7:57 - 7:58
    everybody was gasping.
  • 7:58 - 8:02
    They were able to take over a bunch of critical computers
  • 8:02 - 8:06
    inside the car: the brakes computer, the lighting computer,
  • 8:06 - 8:09
    the engine, the dash, the radio, etc.,
  • 8:09 - 8:11
    and they were able to perform these attacks on real commercial
  • 8:11 - 8:14
    cars that they purchased using the radio network.
  • 8:14 - 8:17
    They were able to compromise every single one of the
  • 8:17 - 8:19
    pieces of software that controlled every single one
  • 8:19 - 8:22
    of the wireless capabilities of the car.
  • 8:22 - 8:25
    All of these were implemented successfully.
  • 8:25 - 8:27
    How would you steal a car in this model?
  • 8:27 - 8:31
    Well, you compromise the car by a buffer overflow
  • 8:31 - 8:33
    vulnerability in the software, something like that.
  • 8:33 - 8:36
    You use the GPS in the car to locate it.
  • 8:36 - 8:38
    You remotely unlock the doors through the computer
  • 8:38 - 8:41
    that controls that, start the engine, bypass anti-theft,
  • 8:41 - 8:43
    and you've got yourself a car.
  • 8:43 - 8:45
    Surveillance was really interesting.
  • 8:45 - 8:48
    The authors of the study have a video where they show
  • 8:48 - 8:51
    themselves taking over a car and then turning on
  • 8:51 - 8:54
    the microphone in the car, and listening in on the car
  • 8:54 - 8:57
    while tracking it via GPS on a map,
  • 8:57 - 8:59
    and so that's something that the drivers of the car
  • 8:59 - 9:01
    would never know was happening.
  • 9:01 - 9:03
    Am I scaring you yet?
  • 9:03 - 9:05
    I've got a few more of these interesting ones.
  • 9:05 - 9:07
    These are ones where I went to a conference,
  • 9:07 - 9:09
    and my mind was just blown, and I said,
  • 9:09 - 9:11
    "I have to share this with other people."
  • 9:11 - 9:12
    This was Fabian Monrose's lab
  • 9:12 - 9:16
    at the University of North Carolina, and what they did was
  • 9:16 - 9:18
    something intuitive once you see it,
  • 9:18 - 9:19
    but kind of surprising.
  • 9:19 - 9:22
    They videotaped people on a bus,
  • 9:22 - 9:25
    and then they post-processed the video.
  • 9:25 - 9:27
    What you see here in number one is a
  • 9:27 - 9:31
    reflection in somebody's glasses of the smartphone
  • 9:31 - 9:33
    that they're typing in.
  • 9:33 - 9:35
    They wrote software to stabilize --
  • 9:35 - 9:36
    even though they were on a bus
  • 9:36 - 9:39
    and maybe someone's holding their phone at an angle --
  • 9:39 - 9:42
    to stabilize the phone, process it, and
  • 9:42 - 9:44
    you may know on your smartphone, when you type
  • 9:44 - 9:47
    a password, the keys pop out a little bit, and they were able
  • 9:47 - 9:49
    to use that to reconstruct what the person was typing,
  • 9:49 - 9:54
    and had a language model for detecting typing.
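
One way to picture the key-detection step: once the reflected phone has been stabilized in each frame, the enlarged key pop-up is a small, high-contrast pattern you can search for. This is a hypothetical sketch using OpenCV template matching, not the researchers' actual pipeline; their system combined stabilization, detection, and a language model over the per-frame guesses.

```python
import cv2

def detect_key_popups(frames, key_templates, threshold=0.8):
    """For each stabilized video frame, guess which key (if any) is popped up.

    frames: iterable of BGR images of the stabilized phone region.
    key_templates: dict mapping a key label ('a', 'b', ...) to a small grayscale
                   image of that key's enlarged pop-up. Both inputs are assumed
                   to come from earlier stabilization and cropping steps."""
    guesses = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        best_key, best_score = None, threshold
        for key, template in key_templates.items():
            scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
            _, score, _, _ = cv2.minMaxLoc(scores)
            if score > best_score:
                best_key, best_score = key, score
        guesses.append(best_key)   # None means no key was confidently detected
    return guesses
```
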
  • 9:54 - 9:56
    What was interesting is, by videotaping on a bus,
  • 9:56 - 9:58
    they were able to produce exactly what people
  • 9:58 - 10:00
    on their smartphones were typing,
  • 10:00 - 10:03
    and then they had a surprising result, which is that
  • 10:03 - 10:05
    their software had not only done it for their target,
  • 10:05 - 10:07
    but other people who accidentally happened
  • 10:07 - 10:09
    to be in the picture, they were able to produce
  • 10:09 - 10:12
    what those people had been typing, and that was kind of
  • 10:12 - 10:15
    an accidental artifact of what their software was doing.
  • 10:15 - 10:19
    I'll show you two more. One is P25 radios.
  • 10:19 - 10:22
    P25 radios are used by law enforcement
  • 10:22 - 10:26
    and all kinds of government agencies
  • 10:26 - 10:27
    and people in combat to communicate,
  • 10:27 - 10:30
    and there's an encryption option on these phones.
  • 10:30 - 10:33
    This is what the phone looks like. It's not really a phone.
  • 10:33 - 10:34
    It's more of a two-way radio.
  • 10:34 - 10:37
    Motorola makes the most widely used one, and you can see
  • 10:37 - 10:40
    that they're used by Secret Service, they're used in combat,
  • 10:40 - 10:43
    it's a very, very common standard in the U.S. and elsewhere.
  • 10:43 - 10:46
    So one question the researchers asked themselves is,
  • 10:46 - 10:48
    could you block this thing, right?
  • 10:48 - 10:50
    Could you run a denial-of-service,
  • 10:50 - 10:52
    because these are first responders?
  • 10:52 - 10:53
    So, would a terrorist organization want to black out the
  • 10:53 - 10:58
    ability of police and fire to communicate at an emergency?
  • 10:58 - 11:01
    They found that there's this GirlTech device used for texting
  • 11:01 - 11:04
    that happens to operate at the same exact frequency
  • 11:04 - 11:06
    as the P25, and they built what they called
  • 11:06 - 11:10
    My First Jammer. (Laughter)
  • 11:10 - 11:13
    If you look closely at this device,
  • 11:13 - 11:16
    it's got a switch for encryption or cleartext.
  • 11:16 - 11:19
    Let me advance the slide, and now I'll go back.
  • 11:19 - 11:22
    You see the difference?
  • 11:22 - 11:25
    This is plain text. This is encrypted.
  • 11:25 - 11:27
    There's one little dot that shows up on the screen,
  • 11:27 - 11:29
    and one little tiny turn of the switch.
  • 11:29 - 11:31
    And so the researchers asked themselves, "I wonder how
  • 11:31 - 11:35
    many times very secure, important, sensitive conversations
  • 11:35 - 11:37
    are happening on these two-way radios where they forget
  • 11:37 - 11:40
    to encrypt and they don't notice that they didn't encrypt?"
  • 11:40 - 11:43
    So they bought a scanner. These are perfectly legal
  • 11:43 - 11:47
    and they run at the frequency of the P25,
  • 11:47 - 11:48
    and what they did is they hopped around frequencies
  • 11:48 - 11:51
    and they wrote software to listen in.
  • 11:51 - 11:54
    If they found encrypted communication, they stayed
  • 11:54 - 11:55
    on that channel and they wrote down, that's a channel
  • 11:55 - 11:57
    that these people communicate in,
  • 11:57 - 11:59
    these law enforcement agencies,
  • 11:59 - 12:02
    and they went to 20 metropolitan areas and listened in
  • 12:02 - 12:06
    on conversations that were happening at those frequencies.
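
The survey logic is simple to sketch. The `scanner` object below is a hypothetical wrapper around a commodity receiver, not a real library API; the idea is just to hop, note which channels carry P25 traffic, and tally how much of it arrives with the encryption flag unset.

```python
import time

def survey_p25(scanner, frequencies, dwell_seconds=2.0):
    """Hop across candidate P25 frequencies and tally cleartext traffic.

    `scanner` is assumed to expose tune(freq) and read_frame(), where a frame
    carries .encrypted and .duration fields decoded from the P25 header --
    a hypothetical interface standing in for the researchers' own software."""
    active_channels = set()
    cleartext_seconds = 0.0
    for freq in frequencies:
        scanner.tune(freq)
        time.sleep(dwell_seconds)
        frame = scanner.read_frame()
        if frame is None:          # nothing heard on this frequency
            continue
        active_channels.add(freq)  # remember: somebody communicates here
        if not frame.encrypted:    # the operator forgot, or didn't notice, the switch
            cleartext_seconds += frame.duration
    return active_channels, cleartext_seconds
```
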
  • 12:06 - 12:09
    They found that in every metropolitan area,
  • 12:09 - 12:11
    they would capture over 20 minutes a day
  • 12:11 - 12:13
    of cleartext communication.
  • 12:13 - 12:15
    And what kind of things were people talking about?
  • 12:15 - 12:17
    Well, they found the names and information
  • 12:17 - 12:20
    about confidential informants. They found information
  • 12:20 - 12:22
    that was being recorded in wiretaps,
  • 12:22 - 12:25
    a bunch of crimes that were being discussed,
  • 12:25 - 12:26
    sensitive information.
  • 12:26 - 12:29
    It was mostly law enforcement and criminal matters.
  • 12:29 - 12:31
    They went and reported this to the law enforcement
  • 12:31 - 12:33
    agencies, after anonymizing it,
  • 12:33 - 12:36
    and the vulnerability here is simply that the user interface
  • 12:36 - 12:37
    wasn't good enough. If you're talking
  • 12:37 - 12:40
    about something really secure and sensitive, it should
  • 12:40 - 12:43
    be really clear to you that this conversation is encrypted.
  • 12:43 - 12:45
    That one's pretty easy to fix.
  • 12:45 - 12:47
    The last one I thought was really, really cool,
  • 12:47 - 12:50
    and I just had to show it to you. It's probably not something
  • 12:50 - 12:51
    that you're going to lose sleep over
  • 12:51 - 12:53
    like the cars or the defibrillators,
  • 12:53 - 12:56
    but it's stealing keystrokes.
  • 12:56 - 12:58
    Now, we've all looked at smartphones upside down.
  • 12:58 - 13:01
    Every security expert wants to hack a smartphone,
  • 13:01 - 13:05
    and we tend to look at the USB port, the GPS for tracking,
  • 13:05 - 13:08
    the camera, the microphone, but no one up till this point
  • 13:08 - 13:10
    had looked at the accelerometer.
  • 13:10 - 13:12
    The accelerometer is the thing that determines
  • 13:12 - 13:15
    the vertical orientation of the smartphone.
  • 13:15 - 13:17
    And so they had a simple setup.
  • 13:17 - 13:19
    They put a smartphone next to a keyboard,
  • 13:19 - 13:22
    and they had people type, and then their goal was
  • 13:22 - 13:25
    to use the vibrations that were created by typing
  • 13:25 - 13:29
    to measure the change in the accelerometer reading
  • 13:29 - 13:32
    to determine what the person had been typing.
  • 13:32 - 13:35
    Now, when they tried this on an iPhone 3GS,
  • 13:35 - 13:38
    this is a graph of the perturbations that were created
  • 13:38 - 13:41
    by the typing, and you can see that it's very difficult
  • 13:41 - 13:44
    to tell when somebody was typing or what they were typing,
  • 13:44 - 13:47
    but the iPhone 4 greatly improved the accelerometer,
  • 13:47 - 13:50
    and so the same measurement
  • 13:50 - 13:52
    produced this graph.
  • 13:52 - 13:55
    Now that gave you a lot of information while someone
  • 13:55 - 13:58
    was typing, and what they did then was use advanced
  • 13:58 - 14:01
    artificial intelligence techniques called machine learning
  • 14:01 - 14:02
    to have a training phase,
  • 14:02 - 14:05
    and so they got most likely grad students
  • 14:05 - 14:08
    to type in a whole lot of things, and to learn,
  • 14:08 - 14:11
    to have the system use the machine learning tools that
  • 14:11 - 14:14
    were available to learn what it is that the people were typing
  • 14:14 - 14:17
    and to match that up
  • 14:17 - 14:19
    with the measurements in the accelerometer.
  • 14:19 - 14:21
    And then there's the attack phase, where you get
  • 14:21 - 14:24
    somebody to type something in, you don't know what it was,
  • 14:24 - 14:25
    but you use your model that you created
  • 14:25 - 14:29
    in the training phase to figure out what they were typing.
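
A minimal sketch of that train-then-attack split, assuming accelerometer samples have already been cut into per-keypress windows. The feature set and the use of scikit-learn are illustrative assumptions; the original work built its own pipeline and layered a language model on top to rank word candidates.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def features(window):
    """Summarize one accelerometer window (an N x 3 array of x, y, z samples)."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def train(windows, labels):
    """Training phase: windows recorded while known text was being typed."""
    X = np.array([features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

def attack(clf, unknown_windows):
    """Attack phase: guess keys for typing you did not observe; a language
    model would then rank whole-word candidates (e.g. 'court' vs. 'might')."""
    X = np.array([features(w) for w in unknown_windows])
    return clf.predict(X)
```
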
  • 14:29 - 14:32
    They had pretty good success. This is an article from USA Today.
  • 14:32 - 14:35
    They typed in, "The Illinois Supreme Court has ruled
  • 14:35 - 14:38
    that Rahm Emanuel is eligible to run for Mayor of Chicago"
  • 14:38 - 14:39
    — see, I tied it in to the last talk —
  • 14:39 - 14:41
    "and ordered him to stay on the ballot."
  • 14:41 - 14:44
    Now, the system is interesting, because it produced
  • 14:44 - 14:47
    "Illinois Supreme" and then it wasn't sure.
  • 14:47 - 14:49
    The model produced a bunch of options,
  • 14:49 - 14:51
    and this is the beauty of some of the A.I. techniques,
  • 14:51 - 14:54
    is that computers are good at some things,
  • 14:54 - 14:55
    humans are good at other things,
  • 14:55 - 14:57
    take the best of both and let the humans solve this one.
  • 14:57 - 14:59
    Don't waste computer cycles.
  • 14:59 - 15:01
    A human's not going to think it's the Supreme might.
  • 15:01 - 15:02
    It's the Supreme Court, right?
  • 15:02 - 15:05
    And so, together we're able to reproduce typing
  • 15:05 - 15:08
    simply by measuring the accelerometer.
  • 15:08 - 15:11
    Why does this matter? Well, in the Android platform,
  • 15:11 - 15:16
    for example, the developers have a manifest
  • 15:16 - 15:18
    where every device on there, the microphone, etc.,
  • 15:18 - 15:20
    has to be registered if you're going to use it
  • 15:20 - 15:22
    so that hackers can't take over it,
  • 15:22 - 15:26
    but nobody controls the accelerometer.
  • 15:26 - 15:28
    So what's the point? You can leave your iPhone next to
  • 15:28 - 15:30
    someone's keyboard, and just leave the room,
  • 15:30 - 15:31
    and then later recover what they did,
  • 15:31 - 15:33
    even without using the microphone.
  • 15:33 - 15:35
    If someone is able to put malware on your iPhone,
  • 15:35 - 15:38
    they could then maybe get the typing that you do
  • 15:38 - 15:41
    whenever you put your iPhone next to your keyboard.
  • 15:41 - 15:43
    There's several other notable attacks that unfortunately
  • 15:43 - 15:45
    I don't have time to go into, but the one that I wanted
  • 15:45 - 15:47
    to point out was a group from the University of Michigan
  • 15:47 - 15:50
    which was able to take voting machines,
  • 15:50 - 15:52
    the Sequoia AVC Edge DREs that
  • 15:52 - 15:54
    were going to be used in New Jersey in the election
  • 15:54 - 15:56
    that were left in a hallway, and put Pac-Man on them.
  • 15:56 - 16:00
    So they ran the Pac-Man game.
  • 16:00 - 16:01
    What does this all mean?
  • 16:01 - 16:05
    Well, I think that society tends to adopt technology
  • 16:05 - 16:08
    really quickly. I love the next coolest gadget.
  • 16:08 - 16:10
    But it's very important, and these researchers are showing,
  • 16:10 - 16:12
    that the developers of these things
  • 16:12 - 16:15
    need to take security into account from the very beginning,
  • 16:15 - 16:17
    and need to realize that they may have a threat model,
  • 16:17 - 16:20
    but the attackers may not be nice enough
  • 16:20 - 16:22
    to limit themselves to that threat model,
  • 16:22 - 16:24
    and so you need to think outside of the box.
  • 16:24 - 16:26
    What we can do is be aware
  • 16:26 - 16:28
    that devices can be compromised,
  • 16:28 - 16:30
    and anything that has software in it
  • 16:30 - 16:33
    is going to be vulnerable. It's going to have bugs.
  • 16:33 - 16:36
    Thank you very much. (Applause)
Title:
All your devices can be hacked
Speaker:
Avi Rubin
Description:

Could someone hack your pacemaker? At TEDxMidAtlantic, Avi Rubin explains how hackers are compromising cars, smartphones and medical devices, and warns us about the dangers of an increasingly hackable world. (Filmed at TEDxMidAtlantic.)

Video Language:
English
Team:
TED
Project:
TEDTalks
Duration:
16:36