
OEB 2015 - Opening Plenary - Cory Doctorow

  • 0:00 - 0:01
    - Thank you very much.
  • 0:01 - 0:05
    So I'd like to start with something of a
    benediction or permission.
  • 0:05 - 0:08
    I am one of nature's fast talkers
  • 0:08 - 0:11
    and many of you are not
    native English speakers,
  • 0:11 - 0:13
    or maybe not accustomed
    to my harsh Canadian accent.
  • 0:13 - 0:16
    In addition, I've just come in
    from Australia
  • 0:16 - 0:19
    and so like many of you, I am horribly
    jetlagged and have drunk enough coffee
  • 0:19 - 0:21
    this morning to kill a rhino.
  • 0:22 - 0:24
    When I used to be at the United Nations
  • 0:24 - 0:27
    I was known as the scourge of the
simultaneous translation corps
  • 0:27 - 0:30
    I would stand up and speak
    as slowly as I could
  • 0:30 - 0:32
    and turn around, and there they
would be in their booths doing this
  • 0:32 - 0:35
    (laughter)
    When I start to speak too fast,
  • 0:35 - 0:38
    this is the universal symbol --
    my wife invented it
  • 0:38 - 0:42
    for "Cory, you are talking too fast."
    Please, don't be shy.
  • 0:42 - 0:46
    So, I'm a parent, like many of you,
    and like I'm sure all of you
  • 0:46 - 0:49
    who are parents, parenting kicks my ass
    all the time.
  • 0:50 - 0:55
    And there are many regrets I have
about the mere seven and a half years
  • 0:55 - 0:58
    that I've been a parent
    but none are so keenly felt
  • 0:58 - 1:01
    as my regrets over what's happened
    when I've been wandering
  • 1:01 - 1:05
    around the house and seen my
    daughter working on something
  • 1:05 - 1:10
    that was beyond her abilities, that was
    right at the edge of what she could do
  • 1:10 - 1:13
    and where she was doing something
    that she didn't have competence in yet
  • 1:13 - 1:17
    and you know it's that amazing thing
    to see that frown of concentration,
  • 1:17 - 1:20
    tongue stuck out. As a parent, your
    heart swells with pride
  • 1:20 - 1:21
    and you can't help but go over
  • 1:21 - 1:24
    and sort of peer over their shoulder
    at what they are doing
  • 1:24 - 1:28
    and those of you who are parents know
    what happens when you look too closely
  • 1:28 - 1:31
    at someone who is working
beyond the edge of their competence.
  • 1:31 - 1:33
    They go back to doing something
    they're already good at.
  • 1:33 - 1:35
    You interrupt a moment
    of genuine learning
  • 1:35 - 1:39
    and you replace it with
    a kind of embarrassment
  • 1:39 - 1:43
    about what you're good at
    and what you're not.
  • 1:43 - 1:48
    So, it matters a lot that our schools are
    increasingly surveilled environments,
  • 1:48 - 1:53
    environments in which everything that
    our kids do is watched and recorded.
  • 1:53 - 1:57
    Because when you do that, you interfere
    with those moments of real learning.
  • 1:57 - 2:01
    Our ability to do things that we are not
    good at yet, that we are not proud of yet,
  • 2:01 - 2:04
    is negatively impacted
    by that kind of scrutiny.
  • 2:04 - 2:07
    And that scrutiny comes
    from a strange place.
  • 2:07 - 2:11
    We have decided that there are
    some programmatic means
  • 2:11 - 2:14
    by which we can find all the webpages
    children shouldn't look at
  • 2:14 - 2:19
    and we will filter our networks
    to be sure that they don't see them.
  • 2:19 - 2:22
    Anyone who has ever paid attention
    knows that this doesn't work.
  • 2:22 - 2:26
    There are more webpages
    that kids shouldn't look at
  • 2:26 - 2:29
    than can ever be catalogued,
    and any attempt to catalog them
  • 2:29 - 2:32
    will always catch pages that kids
should be able to look at.
  • 2:32 - 2:35
    Any of you who have ever taught
    a unit on reproductive health
  • 2:35 - 2:39
    knows the frustration of trying
    to get round a school network.
  • 2:39 - 2:43
    Now, this is done in the name of
    digital protection
  • 2:43 - 2:47
    but it flies in the face of digital
    literacy and of real learning.
  • 2:47 - 2:50
    Because the only way to stop kids
    from looking at web pages
  • 2:50 - 2:52
    they shouldn't be looking at
  • 2:52 - 2:56
    is to take all of the clicks that they
    make, all of the messages that they send,
  • 2:56 - 3:00
    all of their online activity
    and offshore it to a firm
  • 3:00 - 3:04
    that has some nonsensically arrived at
    list of the bad pages.
  • 3:04 - 3:08
    And so, what we are doing is that we're
    exfiltrating all of our students' data
  • 3:08 - 3:10
    to unknown third parties.
  • 3:10 - 3:13
    Now, most of these firms,
    their primary business isn't
  • 3:13 - 3:14
    serving the education sector.
  • 3:14 - 3:17
    Most of them service
the government sector.
  • 3:17 - 3:21
    They primarily service governments in
    repressive autocratic regimes.
  • 3:21 - 3:24
    They help them make sure that
    their citizens aren't looking at
  • 3:24 - 3:26
    Amnesty International web pages.
  • 3:26 - 3:30
    They repackage those tools
    and sell them to our educators.
  • 3:30 - 3:34
    So we are offshoring our children's clicks
    to war criminals.
  • 3:34 - 3:37
    And what our kids do, we know,
    is they just get around it,
  • 3:37 - 3:39
    because it's not hard to get around it.
  • 3:39 - 3:44
    You know, never underestimate the power
    of a kid who is time-rich and cash-poor
  • 3:44 - 3:48
    to get around our
    technological blockades.
  • 3:48 - 3:51
    But when they do this, they don't acquire
    the kind of digital literacy
  • 3:51 - 3:54
that we want them to have, they don't
    acquire real digital agency
  • 3:54 - 3:58
    and moreover, they risk exclusion
    and in extreme cases,
  • 3:58 - 4:00
    they risk criminal prosecution.
  • 4:00 - 4:04
    So what if instead, those of us who are
    trapped in this system of teaching kids
  • 4:04 - 4:08
    where we're required to subject them
    to this kind of surveillance
  • 4:08 - 4:10
    that flies in the face
    of their real learning,
  • 4:10 - 4:13
    what if instead, we invented
    curricular units
  • 4:13 - 4:16
that made them real first-class
    digital citizens,
  • 4:16 - 4:20
    in charge of trying to influence
    real digital problems?
  • 4:20 - 4:23
    Like what if we said to them:
    "We want you to catalog the web pages
  • 4:23 - 4:26
    that this vendor lets through
    that you shouldn't be seeing.
  • 4:26 - 4:29
    We want you to catalog those pages that
    you should be seeing, that are blocked.
  • 4:29 - 4:32
    We want you to go and interview
    every teacher in the school
  • 4:32 - 4:35
    about all those lesson plans that were
    carefully laid out before lunch
  • 4:35 - 4:38
    with a video and a web page,
    and over lunch,
  • 4:38 - 4:41
the unaccountable distant censor
    blocked those critical resources
  • 4:41 - 4:45
    and left them handing out photographed
    worksheets in the afternoon
  • 4:45 - 4:47
    instead of the unit that they'd prepared.
  • 4:47 - 4:51
We want you to learn how to file Freedom
    of Information Act requests
  • 4:51 - 4:53
    and find out what your
    school authority is spending
  • 4:53 - 4:56
    to censor your internet access
    and surveil your activity.
  • 4:56 - 5:00
    We want you to learn to use the internet
    to research these companies
  • 5:00 - 5:04
    and we want you to present this
    to your parent-teacher association,
  • 5:04 - 5:07
    to your school authority,
    to your local newspaper.
  • 5:07 - 5:09
    Because that's the kind
    of digital literacy
  • 5:09 - 5:11
    that makes kids into first-class
    digital citizens,
  • 5:11 - 5:16
    that prepares them for a future
    in which they can participate fully
  • 5:16 - 5:19
    in a world that's changing.
  • 5:19 - 5:23
    Kids are the beta-testers
    of the surveillance state.
  • 5:23 - 5:27
    The path of surveillance technology
    starts with prisoners,
  • 5:27 - 5:30
    moves to asylum seekers,
    people in mental institutions
  • 5:30 - 5:34
    and then to its first non-incarcerated
    population, children.
  • 5:34 - 5:37
    It then moves to blue-collar workers,
    government workers
  • 5:37 - 5:38
    and white-collar workers.
  • 5:38 - 5:42
    And so, what we do to kids today
    is what we did to prisoners yesterday
  • 5:42 - 5:44
    and what we're going to be doing
    to you tomorrow.
  • 5:44 - 5:47
    And so it matters, what we teach our kids.
  • 5:47 - 5:51
    If you want to see where this goes, this
    is a kid named Blake Robbins
  • 5:51 - 5:55
    and he attended Lower Merion High School
in Lower Merion, Pennsylvania
  • 5:55 - 5:56
    outside of Philadelphia.
  • 5:56 - 6:00
    It's the most affluent public school
    district in America, so affluent
  • 6:00 - 6:03
    that all the kids were issued Macbooks
    at the start of the year
  • 6:03 - 6:05
    and they had to do their homework on
    their Macbooks,
  • 6:05 - 6:08
    and they had to bring them to school every
    day and bring them home every night.
  • 6:08 - 6:11
    And the Macbooks had been fitted with
    Laptop Theft Recovery Software,
  • 6:11 - 6:16
which is a fancy word for a rootkit, that
    let the school administration
  • 6:16 - 6:20
    covertly operate the cameras
    and microphones on these computers
  • 6:20 - 6:24
    and harvest files off
of their hard drives,
  • 6:24 - 6:26
    view all their clicks, and so on.
  • 6:26 - 6:31
    Now Blake Robbins found out
    that the software existed
  • 6:31 - 6:34
    and how it was being used
    because he and the head teacher
  • 6:34 - 6:37
    had been knocking heads for years,
    since he'd first got into the school,
  • 6:37 - 6:40
    and one day, the head teacher
    summoned him to his office
  • 6:40 - 6:42
    and said: "Blake, I've got you now."
  • 6:42 - 6:46
    and handed him a print-out of Blake
    in his bedroom the night before,
  • 6:46 - 6:49
    taking what looked like a pill,
    and he said: "You're taking drugs."
  • 6:49 - 6:54
    And Blake Robbins said: "That's a candy,
it's a Mike and Ike's candy, I eat them
  • 6:54 - 6:56
    "when I, I eat them when I'm studying.
  • 6:56 - 6:59
    "How did you get a picture
    of me in my bedroom?"
  • 6:59 - 7:03
    This head teacher had taken
    over 6,000 photos of Blake Robbins:
  • 7:03 - 7:07
    awake and asleep, dressed and undressed,
    in the presence of his family.
  • 7:07 - 7:10
    And in the ensuing lawsuit, the school
    settled for a large amount of money
  • 7:10 - 7:13
    and promised that
    they wouldn't do it again
  • 7:13 - 7:16
    without informing the students
    that it was going on.
  • 7:16 - 7:19
    And increasingly, the practice is now
  • 7:19 - 7:23
    that school administrations hand out
    laptops, because they're getting cheaper,
  • 7:23 - 7:25
    with exactly the same kind of software,
  • 7:25 - 7:28
    but they let the students know and
    they find that that works even better
  • 7:28 - 7:30
    at curbing the students' behavior,
  • 7:30 - 7:34
    because the students know that
    they're always on camera.
  • 7:34 - 7:38
    Now, the surveillance state is moving
    from kids to the rest of the world.
  • 7:38 - 7:40
    It's metastasizing.
  • 7:40 - 7:44
    Our devices are increasingly designed
    to treat us as attackers,
  • 7:44 - 7:47
    as suspicious parties
    who can't be trusted
  • 7:47 - 7:51
    because our devices' job is to do things
    that we don't want them to do.
  • 7:51 - 7:54
    Now that's not because the vendors
    who make our technology
  • 7:54 - 7:56
    want to spy on us necessarily,
  • 7:56 - 8:01
    but they want to take
    the ink-jet printer business model
  • 8:01 - 8:04
    and bring it into every other realm
    of the world.
  • 8:04 - 8:09
    So the ink-jet printer business model
    is where you sell someone a device
  • 8:09 - 8:12
    and then you get a continuing
    revenue stream from that device
  • 8:12 - 8:16
    by making sure that competitors can't make
    consumables or parts
  • 8:16 - 8:19
    or additional features
    or plugins for that device,
  • 8:19 - 8:22
    without paying rent
    to the original manufacturer.
  • 8:22 - 8:26
    And that allows you to maintain
    monopoly margins on your devices.
  • 8:26 - 8:31
    Now, in 1998, the American government
    passed a law called
  • 8:31 - 8:32
    the Digital Millennium Copyright Act.
  • 8:32 - 8:35
    In 2001 the European Union
    introduced its own version,
  • 8:35 - 8:38
    the European Union Copyright Directive.
  • 8:38 - 8:40
    And these two laws, along with laws
    all around the world,
  • 8:40 - 8:46
    in Australia, Canada and elsewhere,
    these laws prohibit removing digital locks
  • 8:46 - 8:49
    that are used to restrict
    access to copyrighted works
  • 8:49 - 8:52
and they were originally envisioned as a way
    of making sure that Europeans didn't
  • 8:52 - 8:55
    bring cheap DVDs in from America,
  • 8:55 - 8:58
    or making sure that Australians didn't
    import cheap DVDs from China.
  • 8:58 - 9:04
    And so you have a digital work, a DVD,
    and it has a lock on it and to unlock it,
  • 9:04 - 9:05
    you have to buy an authorized player
  • 9:05 - 9:07
    and the player checks to make sure
    you are in region
  • 9:07 - 9:10
    and making your own player
    that doesn't make that check
  • 9:10 - 9:12
    is illegal because you'd have
    to remove the digital lock.
  • 9:12 - 9:14
    And that was the original intent,
  • 9:14 - 9:19
    it was to allow high rents to be
    maintained on removable media,
  • 9:19 - 9:21
    DVDs and other entertainment content.
  • 9:21 - 9:25
    But it very quickly spread
    into new realms.
  • 9:25 - 9:28
    So, for example, auto manufacturers
    now lock up
  • 9:28 - 9:31
    all of their cars' telemetry
    with digital locks.
  • 9:31 - 9:33
    If you're a mechanic
    and you want to fix a car,
  • 9:33 - 9:37
    you have to get a reader
    from the manufacturer
  • 9:37 - 9:40
    to make sure that you can
    see the telemetry
  • 9:40 - 9:43
    and then know what parts to order
    and how to fix it.
  • 9:43 - 9:46
    And in order to get this reader,
    you have to promise the manufacturer
  • 9:46 - 9:50
    that you will only buy parts
    from that manufacturer
  • 9:50 - 9:51
    and not from third parties.
  • 9:51 - 9:54
    So the manufacturers can keep
    the repair costs high
  • 9:54 - 9:58
    and get a secondary revenue stream
    out of the cars.
  • 9:58 - 10:05
This year, the Chrysler Corporation filed
    comments with the US Copyright Office,
  • 10:05 - 10:08
    to say that they believed that
    this was the right way to do it
  • 10:08 - 10:10
    and that it should be a felony,
    punishable by 5 years in prison
  • 10:10 - 10:12
    and a $500,000 fine,
  • 10:12 - 10:17
    to change the locks on a car that you own,
    so that you can choose who fixes it.
  • 10:17 - 10:20
    It turned out that when they advertised,
  • 10:20 - 10:22
    oh, where is my slide here?
    Oh, there we go.
  • 10:22 - 10:26
    When they advertised that
    it wasn't your father's Oldsmobile,
  • 10:26 - 10:29
    they weren't speaking metaphorically,
    they literally meant
  • 10:29 - 10:30
    that even though your father
    bought the Oldsmobile,
  • 10:30 - 10:34
    it remained their property in perpetuity.
  • 10:34 - 10:36
    And it's not just cars,
    it's every kind of device,
  • 10:36 - 10:40
    because every kind of device today
    has a computer in it.
  • 10:40 - 10:43
    The John Deere Company, the world's
    leading seller of heavy equipment
  • 10:43 - 10:46
    and agricultural equipment technologies,
  • 10:46 - 10:49
    they now view their tractors as
    information gathering platforms
  • 10:49 - 10:51
    and they view the people who use them
  • 10:51 - 10:56
    as the kind of inconvenient gut flora
    of their ecosystem.
  • 10:56 - 10:58
    So if you are a farmer
    and you own a John Deere tractor,
  • 10:58 - 11:03
    when you drive it around your fields,
    the torque sensors on the wheels
  • 11:03 - 11:08
    conduct a centimeter-accurate soil
    density survey of your agricultural land.
  • 11:08 - 11:11
    That would be extremely useful to you
    when you're planting your seed
  • 11:11 - 11:13
    but that data is not available to you
  • 11:13 - 11:15
    unless you remove the digital lock
    from your John Deere tractor
  • 11:15 - 11:18
    which again, is against the law
    everywhere in the world.
  • 11:18 - 11:20
    Instead, in order to get that data
  • 11:20 - 11:23
    you have to buy it bundled with seed
    from Monsanto,
  • 11:23 - 11:25
    who are John Deere's seed partners.
  • 11:25 - 11:29
    John Deere then takes this data that they
    aggregate across whole regions
  • 11:29 - 11:31
    and they use it to gain insight
    into regional crop yields
  • 11:31 - 11:34
    that they use to play the futures market.
  • 11:34 - 11:37
    John Deere's tractors are really just
    a way of gathering information
  • 11:37 - 11:39
    and the farmers are secondary to it.
  • 11:39 - 11:42
    Just because you own it
    doesn't mean it's yours.
  • 11:42 - 11:46
    And it's not just the computers
    that we put our bodies into
  • 11:46 - 11:47
    that have this business model.
  • 11:47 - 11:50
    It's the computers that we put
    inside of our bodies.
  • 11:50 - 11:51
    If you're someone who is diabetic
  • 11:51 - 11:55
    and you're fitted with a continuous
    glucose-measuring insulin pump,
  • 11:55 - 11:58
    that insulin pump is designed
    with a digital lock
  • 11:58 - 12:02
    that makes sure that your doctor
    can only use the manufacturer's software
  • 12:02 - 12:04
    to read the data coming off of it
  • 12:04 - 12:07
    and that software is resold
    on a rolling annual license
  • 12:07 - 12:10
    and it can't be just bought outright.
  • 12:10 - 12:12
    And the digital locks are also
    used to make sure
  • 12:12 - 12:14
    that you only buy the insulin
    that the vendors approved
  • 12:14 - 12:17
    and not generic insulin
    that might be cheaper.
  • 12:17 - 12:21
    We've literally turned human beings
    into ink-jet printers.
  • 12:21 - 12:28
    Now, this has really deep implications
    beyond the economic implications.
  • 12:28 - 12:31
    Because the rules that prohibit
    breaking these digital locks
  • 12:31 - 12:35
    also prohibit telling people
    about flaws that programmers made
  • 12:35 - 12:38
    because if you know about a flaw
    that a programmer made,
  • 12:38 - 12:40
    you can use it to break the digital lock.
  • 12:40 - 12:44
    And that means that the errors,
    the vulnerabilities,
  • 12:44 - 12:50
    the mistakes in our devices, they fester
    in them, they go on and on and on
  • 12:50 - 12:54
and our devices become these long-lived
    reservoirs of digital pathogens.
  • 12:54 - 12:56
    And we've seen how that plays out.
  • 12:56 - 12:59
    One of the reasons that Volkswagen
    was able to get away
  • 12:59 - 13:01
with their diesel cheating for so long
  • 13:01 - 13:05
    is because no one could independently
    audit their firmware.
  • 13:05 - 13:08
    It's happening all over the place.
  • 13:08 - 13:11
    You may have seen --
    you may have seen this summer
  • 13:11 - 13:15
    that Chrysler had to recall
    1.4 million Jeeps
  • 13:15 - 13:19
    because it turned out that they could be
    remotely controlled over the internet
  • 13:19 - 13:22
    while driving down a motorway
and have their brakes and steering
  • 13:22 - 13:27
    commandeered by anyone, anywhere
    in the world, over the internet.
  • 13:27 - 13:32
    We only have one methodology
    for determining whether security works
  • 13:32 - 13:34
    and that's to subject it
    to public scrutiny,
  • 13:34 - 13:38
    to allow for other people to see
    what assumptions you've made.
  • 13:38 - 13:40
    Anyone can design a security system
  • 13:40 - 13:43
    that he himself can't think
    of a way of breaking,
  • 13:43 - 13:45
    but all that means is that you've
    designed a security system
  • 13:45 - 13:48
    that works against people
    who are stupider than you.
  • 13:48 - 13:50
    And in this regard, security
    is no different
  • 13:50 - 13:52
    from any other kind of knowledge creation.
  • 13:52 - 13:55
    You know, before we had
    contemporary science and scholarship,
  • 13:55 - 13:58
    we had something that looked
    a lot like it, called alchemy.
  • 13:58 - 14:02
    And for 500 years, alchemists kept
    what they thought they knew a secret.
  • 14:02 - 14:06
    And that meant that every alchemist
    was capable of falling prey
  • 14:06 - 14:13
    to that most urgent of human frailties,
    which is our ability to fool ourselves.
  • 14:13 - 14:16
    And so, every alchemist discovered
    for himself in the hardest way possible
  • 14:16 - 14:19
    that drinking mercury was a bad idea.
  • 14:19 - 14:22
    We call that 500-year period the Dark Ages
  • 14:22 - 14:25
    and we call the moment at which
    they started publishing
  • 14:25 - 14:28
    and subjecting themselves
    to adversarial peer review,
  • 14:28 - 14:31
    which is when your friends tell you
    about the mistakes that you've made
  • 14:31 - 14:33
    and your enemies call you an idiot
    for having made them,
  • 14:33 - 14:37
    we call that moment the Enlightenment.
  • 14:37 - 14:40
    Now, this has profound implications.
  • 14:40 - 14:46
    The restriction of our ability to alter
    the security of our devices
  • 14:46 - 14:49
has consequences for our surveillance society,
  • 14:49 - 14:52
    for our ability to be free people
    in society.
  • 14:52 - 14:56
    At the height of the GDR, in 1989,
  • 14:56 - 15:01
    the Stasi had one snitch for every
    60 people in East Germany,
  • 15:01 - 15:04
    in order to surveil the entire country.
  • 15:04 - 15:07
    A couple of decades later, we found out
    through Edward Snowden
  • 15:07 - 15:10
    that the NSA was spying
    on everybody in the world.
  • 15:10 - 15:13
    And the ratio of people who work
    at the NSA to people that they're spying on
  • 15:13 - 15:15
    is more like 1 in 10,000.
  • 15:15 - 15:18
    They've achieved a two and a half
    order of magnitude
  • 15:18 - 15:20
    productivity gain in surveillance.
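    (Editor's note: a rough arithmetic check of that figure, taking only
    the 1-in-60 Stasi ratio and 1-in-10,000 NSA ratio quoted above at
    face value:)

    \[
      \frac{10\,000}{60} \approx 167,
      \qquad
      \log_{10}(167) \approx 2.2
    \]

    (So the gain works out to a bit over two orders of magnitude; the
    speaker's "two and a half" is a generous round-up of the same
    comparison.)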
  • 15:20 - 15:23
    And the way that they got there
    is in part by the fact that
  • 15:23 - 15:26
    we use devices that
    we're not allowed to audit,
  • 15:26 - 15:28
    that are designed to treat us as attackers
  • 15:28 - 15:32
    and that gather an enormous
    amount of information on us.
  • 15:32 - 15:34
    If the government told you that you
    were required to carry around
  • 15:34 - 15:38
    a small electronic rectangle that
    recorded all of your social relationships,
  • 15:38 - 15:39
    all of your movements,
  • 15:39 - 15:43
    all of your transient thoughts that
    you made note of or looked up,
  • 15:43 - 15:46
    and would make that
    available to the state,
  • 15:46 - 15:49
    and you would have to pay for it,
    you would revolt.
  • 15:49 - 15:52
    But the phone companies have
    managed to convince us,
  • 15:52 - 15:54
    along with the mobile vendors,
  • 15:54 - 15:57
    that we should foot the bill
    for our own surveillance.
  • 15:57 - 15:59
    It's a bit like during
    the Cultural Revolution,
  • 15:59 - 16:01
    where, after your family members
    were executed,
  • 16:01 - 16:04
    they sent you a bill for the bullet.
  • 16:05 - 16:11
    So, this has big implications, as I said,
    for where we go as a society.
  • 16:11 - 16:15
    Because just like our kids have
    a hard time functioning
  • 16:15 - 16:17
    in the presence of surveillance
    and learning,
  • 16:17 - 16:19
    and advancing their own knowledge,
  • 16:19 - 16:21
    we as a society have a hard time
    progressing
  • 16:21 - 16:23
    in the presence of surveillance.
  • 16:23 - 16:29
    In our own living memory, people who are
    today thought of as normal and right
  • 16:29 - 16:31
    were doing something that
    a generation ago
  • 16:31 - 16:34
    would have been illegal
    and landed them in jail.
  • 16:34 - 16:35
    For example, you probably know someone
  • 16:35 - 16:38
    who's married to a partner
    of the same sex.
  • 16:38 - 16:41
    If you live in America, you may know
    someone who takes medical marijuana,
  • 16:41 - 16:43
    or if you live in the Netherlands.
  • 16:43 - 16:48
    And not that long ago, people
    who undertook these activities
  • 16:48 - 16:49
    could have gone to jail for them,
  • 16:49 - 16:52
    could have faced enormous
    social exclusion for them.
  • 16:52 - 16:56
    The way that we got from there to here
    was by having a private zone,
  • 16:56 - 16:58
    a place where people weren't surveilled,
  • 16:58 - 17:01
    in which they could advance
    dangerous ideas,
  • 17:01 - 17:04
    do things that were thought of as
    socially unacceptable
  • 17:04 - 17:07
    and slowly change our social attitudes.
  • 17:07 - 17:09
    And unless you think
    that in 50 years,
  • 17:09 - 17:13
    your grandchildren will sit around
    the Christmas table, in 2065, and say,
  • 17:13 - 17:15
    "How was it, Grandma,
    how was it, Grandpa,
  • 17:15 - 17:17
    that in 2015, you got it all right,
  • 17:17 - 17:20
    and we haven't had
    any social changes since then?"
  • 17:20 - 17:22
    Then you have to ask yourself
how, in a world
  • 17:22 - 17:25
    in which we are all
    under continuous surveillance,
  • 17:25 - 17:28
    we are going to find a way
    to improve this.
  • 17:28 - 17:30
    So, our kids need ICT literacy,
  • 17:30 - 17:35
    but ICT literacy isn't just typing skills
    or learning how to use PowerPoint.
  • 17:35 - 17:37
    It's learning how to think critically
  • 17:37 - 17:40
    about how they relate
    to the means of information,
  • 17:40 - 17:44
    about whether they are its masters
    or servants.
  • 17:44 - 17:47
    Our networks are not
    the most important issue that we have.
  • 17:47 - 17:52
    There are much more important issues
    in society and in the world today.
  • 17:52 - 17:55
    The future of the internet is
    way less important
  • 17:55 - 17:58
    than the future of our climate,
    the future of gender equity,
  • 17:58 - 18:00
    the future of racial equity,
  • 18:00 - 18:03
    the future of the wage gap
    and the wealth gap in the world,
  • 18:03 - 18:07
    but every one of those fights is going
    to be fought and won or lost
  • 18:07 - 18:11
    on the internet.
It's our most foundational fight.
  • 18:11 - 18:16
So, computers
can make us more free
  • 18:16 - 18:18
    or they can take away our freedom.
  • 18:18 - 18:21
    It all comes down to how we regulate them
    and how we use them.
  • 18:21 - 18:25
    And it's our job, as people who are
    training the next generation,
  • 18:25 - 18:29
    and whose next generation
    is beta-testing
  • 18:29 - 18:31
    the surveillance technology
    that will be coming to us,
  • 18:31 - 18:35
    it's our job to teach them to seize
    the means of information,
  • 18:35 - 18:39
    to make themselves self-determinant
    in the way that they use their networks
  • 18:39 - 18:43
    and to find ways to show them
    how to be critical and how to be smart
  • 18:43 - 18:45
    and how to be, above all, subversive
  • 18:45 - 18:49
    in how to use the technology around them.
    Thank you.
  • 18:49 - 18:56
    (Applause)
  • 18:57 - 18:59
    (Moderator) Cory, thank you very much
    indeed.
  • 18:59 - 19:00
    (Doctorow) Thank you.
    (Moderator) And I've got a bundle of
  • 19:00 - 19:05
    points which you've stimulated
    from many in the audience, which sent...
  • 19:05 - 19:07
    (Doctorow) I'm shocked to hear that
    that was at all controversial,
  • 19:07 - 19:08
    but go on.
    (Moderator) I didn't say "controversial,"
  • 19:08 - 19:11
    you stimulated thinking, which is great.
    (Doctorow laughs)
  • 19:11 - 19:17
    But a lot of them resonate around
    violation of secrecy and security.
  • 19:17 - 19:21
    And this, for example,
    from Annika Burgess
  • 19:21 - 19:23
    "Is there a way for students
    to protect themselves
  • 19:23 - 19:27
    from privacy violations by institutions
    they are supposed to trust?"
  • 19:27 - 19:30
    I think this is probably a question
for Ian Goldin as well,
  • 19:30 - 19:34
    as someone who is a senior figure
    in a major university, but
  • 19:34 - 19:37
    this issue of privacy violations and trust.
  • 19:37 - 19:40
    (Doctorow) Well, I think that computers
    have a curious dual nature.
  • 19:40 - 19:44
    So on the one hand, they do expose us
    to an enormous amount of scrutiny,
  • 19:44 - 19:46
    depending on how they are configured.
  • 19:46 - 19:49
    But on the other hand, computers
    have brought new powers to us
  • 19:49 - 19:52
    that are literally new
    on the face of the world, right?
  • 19:52 - 19:56
    We have never had a reality in which
    normal people could have secrets
  • 19:56 - 19:58
    from powerful people.
  • 19:58 - 20:01
    But with the computer in your pocket,
    with that, [shows a smartphone]
  • 20:01 - 20:03
    you can encrypt a message so thoroughly
  • 20:03 - 20:06
    that if every hydrogen atom in the
    universe were turned into a computer
  • 20:06 - 20:09
    and it did nothing until
    the heat death of the universe,
  • 20:09 - 20:10
    but try to guess what your key was,
  • 20:10 - 20:14
    we would run out of universe
    before we ran out of possible keys.
  • 20:14 - 20:17
    So, computers do give us
    the power to have secrets.
  • 20:17 - 20:21
    The problem is that institutions
    prohibit the use of technology
  • 20:21 - 20:24
that allows you to take back
    your own privacy.
  • 20:24 - 20:26
    It's funny, right? Because we take kids
    and we say to them,
  • 20:26 - 20:29
    "Your privacy is like your virginity:
  • 20:29 - 20:32
    "once you've lost it,
    you'll never get it back.
  • 20:32 - 20:34
    "Watch out what you're
    putting on Facebook."
  • 20:34 - 20:36
    And I think they should watch
    what they're putting on Facebook.
  • 20:36 - 20:40
I'm a Facebook vegan, I don't even use it,
but we say,
  • 20:40 - 20:42
    "Watch what you're putting on Facebook.
  • 20:42 - 20:45
    "Don't send out dirty pictures
    of yourself on SnapChat."
  • 20:45 - 20:46
    All good advice.
  • 20:46 - 20:50
    But we do it while we are taking away
    all the private information
  • 20:50 - 20:54
    that they have, all of their privacy
    and all of their agency.
  • 20:54 - 20:56
    You know, if a parent says to a kid,
  • 20:56 - 20:58
    "You mustn't smoke
    because you'll get sick"
  • 20:58 - 21:00
    and the parent says it
    while lighting a new cigarette
  • 21:00 - 21:03
    off the one that she's just put down
    in the ashtray,
  • 21:03 - 21:06
    the kid knows that what you're doing
    matters more than what you're saying.
  • 21:06 - 21:07
(Moderator) The point is a deficit of trust.
  • 21:07 - 21:09
It builds on the kind of work that
    David has been doing as well,
  • 21:09 - 21:12
    this deficit of trust and privacy.
  • 21:12 - 21:13
    And there is another point here.
  • 21:13 - 21:15
    Is the battle for privacy already lost?
  • 21:15 - 21:19
    Are we already too comfortable
    with giving away our data?
  • 21:19 - 21:20
    (Doctorow) No, I don't think so at all.
  • 21:20 - 21:23
    In fact, I think that if anything,
    we've reached
  • 21:23 - 21:25
    peak indifference to surveillance, right?
  • 21:25 - 21:29
The surveillance race isn't over,
not by a long shot.
  • 21:29 - 21:32
    There will be more surveillance
    before there is less.
  • 21:32 - 21:34
    But there'll never be fewer people
    who care about surveillance
  • 21:34 - 21:36
    than there are today,
  • 21:36 - 21:39
    because, as privacy advocates,
    we spectacularly failed,
  • 21:39 - 21:43
    over the last 20 years, to get people
    to take privacy seriously
  • 21:43 - 21:46
    and now we have firms that are
    frankly incompetent,
  • 21:46 - 21:50
    retaining huge amounts of our personally
    identifying sensitive information
  • 21:50 - 21:53
    and those firms are leaking
    that information at speed.
  • 21:53 - 21:58
    So, it started this year with things like...
    (phone rings) Is that me? That's me.
  • 21:58 - 22:01
    It started this year with things like
    Ashley Madison
  • 22:01 - 22:02
    (Moderator) Do you want to take it?
  • 22:02 - 22:05
    (Doctorow) No no, that was my timer going
    off, telling me I've hit my 22 minutes.
  • 22:05 - 22:06
    (Moderator) It may be someone who
    doesn't trust my....
  • 22:06 - 22:08
    (Doctorow) No,
    that was my 22 minutes.
  • 22:08 - 22:14
    So, it started with Ashley Madison,
    the Office of Personnel Management,
  • 22:14 - 22:18
    everyone who ever applied for security
    clearance in America
  • 22:18 - 22:21
    had all of their sensitive information,
    everything you have to give them
  • 22:21 - 22:23
    about why you shouldn't have
    security clearance,
  • 22:23 - 22:25
    everything that's potentially
    compromising about you,
  • 22:25 - 22:30
    all of it exfiltrated by what's believed
    to have been a Chinese spy ring.
  • 22:30 - 22:32
    Something like
one in twenty Americans now
  • 22:32 - 22:36
    have had their data captured and
    exfiltrated from the United States.
  • 22:36 - 22:40
    This week, VTech, the largest electronic
    toy manufacturer in the world,
  • 22:40 - 22:44
    leaked the personal information of
    at least five million children,
  • 22:44 - 22:47
    including potentially photos that
    they took with their electronic toys,
  • 22:47 - 22:52
    as well as their parents' information,
    their home addresses, their passwords,
  • 22:52 - 22:54
    their parents' passwords
    and their password hints.
  • 22:54 - 22:57
    So every couple of weeks,
    from now on,
  • 22:57 - 22:59
    a couple of million people
    are going to show up
  • 22:59 - 23:01
    on the door of people who
    care about privacy and say,
  • 23:01 - 23:04
    "You were right all along,
    what do we do?"
  • 23:04 - 23:06
    And the challenge is to give them
    something useful
  • 23:06 - 23:08
    they can do about privacy.
  • 23:08 - 23:10
    And there are steps that
    you can take personally.
  • 23:10 - 23:13
    If you go to the Surveillance
    Self-Defense Kit
  • 23:13 - 23:15
    at the Electronic Frontier
    Foundation's website,
  • 23:15 - 23:17
    you'll find a set of tools you can use.
  • 23:17 - 23:20
    But ultimately, it's not an individual
    choice, it's a social one.
  • 23:20 - 23:24
    Privacy is a team sport, because if you
    keep your information private
  • 23:24 - 23:27
    and you send it to someone else
    who is in your social network,
  • 23:27 - 23:30
    who doesn't keep it private, well, then
    it leaks out their back door.
  • 23:30 - 23:35
    And so we need social movements
    to improve our privacy.
  • 23:35 - 23:37
    We also need better tools.
  • 23:37 - 23:41
    It's really clear that our privacy tools
    have fallen short of the mark
  • 23:41 - 23:44
    in terms of being accessible
    to normal people.
  • 23:44 - 23:45
(Moderator) Cory --
    (Doctorow) I sense you want
  • 23:45 - 23:46
    to say something, go on
    (Moderator) Yes.
  • 23:46 - 23:49
    Yes, but what I want to do is keep pushing
    this down the route of education as well.
  • 23:49 - 23:54
    There are two or three questions here
    which are very specific about that.
  • 23:54 - 23:56
    Could you tell us more about
    how we help students
  • 23:56 - 23:58
    become digital citizens?
  • 23:58 - 24:02
    It links in important ways to the
    German and Nordic pre-digital concept
  • 24:02 - 24:04
    of Bildung,
that's from Gerud Fullend.
  • 24:04 - 24:08
    And if you could give your daughter
    just one piece of advice for living
  • 24:08 - 24:12
    in an age of digital surveillance,
    what would it be?
  • 24:12 - 24:15
    (Doctorow) So, the one piece of advice
    I would give her is
  • 24:15 - 24:18
    "Don't settle for anything less
    than total control
  • 24:18 - 24:20
    "over the means of information."
  • 24:20 - 24:23
    Anytime someone tells you that
    the computer says you must do something,
  • 24:23 - 24:27
    you have to ask yourself why the computer
    isn't doing what you want it to do.
  • 24:27 - 24:30
    Who is in charge here?
  • 24:30 - 24:31
    Thank you.
  • 24:31 - 24:35
    In terms of how we improve
    digital citizenship,
  • 24:35 - 24:38
    I think that we are at odds with
    our institutions in large part here
  • 24:38 - 24:42
    and I think that our institutions
    have decreed that privacy
  • 24:42 - 24:44
    is a luxury we can't afford
    to give our students
  • 24:44 - 24:46
    because if we do, they might do
    something we wouldn't like.
  • 24:46 - 24:51
    And so, as teachers, the only thing that
    we can do without risking our jobs,
  • 24:51 - 24:54
    without risking our students' education
    and their ability to stay in school
  • 24:54 - 24:57
    is to teach them how to do judo,
    to teach them
  • 24:57 - 25:01
    how to ask critical questions,
    to gather data,
  • 25:01 - 25:03
    to demand evidence-based policy.
  • 25:03 - 25:05
    And ultimately, I actually think
    that's better.
  • 25:05 - 25:07
    I think that teaching kids
    how to evade firewalls
  • 25:07 - 25:10
    is way less interesting than
    teaching them how to identify
  • 25:10 - 25:12
    who is providing that firewall,
  • 25:12 - 25:15
    who made the decision
    to buy that firewall,
  • 25:15 - 25:16
    how much they're spending,
  • 25:16 - 25:19
    and what the process is
    for getting that person fired.
  • 25:19 - 25:23
    That's a much more interesting
    piece of digital citizenship
  • 25:23 - 25:24
    than any piece of firewall hacking.
  • 25:24 - 25:26
(Moderator) But Cory, you're telling
    kids to get hold of their information,
  • 25:26 - 25:29
    but picking up on your analogy
    with the auto
  • 25:29 - 25:33
where there's very clear lock-in,
so much of what has to be done
  • 25:33 - 25:37
    has to be done through the manufacturers.
Martin Siebel picks up on that.
  • 25:37 - 25:40
    How busy is the hacker community
    with breaking digital locks?
  • 25:40 - 25:44
    In other words, digital locks
    are being put on so much,
  • 25:44 - 25:47
    can the next generation, can the kids
    work out where these locks are,
  • 25:47 - 25:49
    to keep control of their information?
  • 25:49 - 25:52
    (Doctorow) So, getting the digital locks
    off of our devices
  • 25:52 - 25:56
    is not just a matter of technology,
    it's also a matter of policy.
  • 25:56 - 25:59
    Laws like the EUCD and new treaty
    instruments like TTIP
  • 25:59 - 26:01
and the Trans-Pacific Partnership
  • 26:01 - 26:05
    specify that these digital locks
    can't be removed legally,
  • 26:05 - 26:09
    that firms can use them to extract
    monopoly rents, to harm our security.
  • 26:09 - 26:13
    And these treaties will make it impossible
    to remove these locks legally.
  • 26:13 - 26:16
    It's not enough for there to be
    a demi-monde, in which
  • 26:16 - 26:19
    people who are technologically savvy
    know how to break their iPhones.
  • 26:19 - 26:23
    The problem with that is that it produces
    this world in which you can't know
  • 26:23 - 26:28
    whether the software that you're using
    has anything bad lurking in it, right?
  • 26:28 - 26:33
To say to people who have iPhones and
want them reconfigured
  • 26:33 - 26:35
    to accept software from parties that
    aren't approved by Apple,
  • 26:35 - 26:39
    well, all you need to do is find a random
    piece of software on the internet
  • 26:39 - 26:42
    that claims it will allow you to do it,
    get root on your phone,
  • 26:42 - 26:45
    and then run this software on it
    is like saying to people,
  • 26:45 - 26:48
    "Well, all you need to do is just
    find some heroin lying around
  • 26:48 - 26:52
    and put it in your arm. I'm sure
    it'll work out fine." Right?
  • 26:52 - 26:56
    The ways in which a phone that is
    compromised can harm you
  • 26:56 - 26:58
    are really without limit, right?
  • 26:58 - 27:01
    If you think about this, this is a
    rectangle with a camera and a microphone
  • 27:01 - 27:03
    that you take into the bedroom
and the toilet,
  • 27:03 - 27:06
    and the only way you know whether
    that camera and microphone are on or off
  • 27:06 - 27:10
    is if the software is doing
    what it says it's doing, right?
  • 27:10 - 27:13
    So, I think that we have this policy
    dimension that's way more important
  • 27:13 - 27:15
    than the technical dimension,
  • 27:15 - 27:17
    and that the only way we're going to
    change that policy
  • 27:17 - 27:20
    is by making kids know what they can do
  • 27:20 - 27:23
and then making them know
    why they're not allowed to do it,
  • 27:23 - 27:26
    and then setting them loose.
    (Moderator) Two final points then,
  • 27:26 - 27:28
    building on that: Ian Harreatman.
  • 27:28 - 27:30
    But what do you think of recent laws,
    say, in the Netherlands,
  • 27:30 - 27:34
    allowing police to spy on criminals
    and terrorists with webcams
  • 27:34 - 27:36
    on their own laptops?
  • 27:36 - 27:39
    In other words, the extension implicit
    in this question is,
  • 27:39 - 27:43
    what about keeping an eye on others,
    including the next generation coming up,
  • 27:43 - 27:46
    and this from Peter Andreesen,
    which is slightly connected.
  • 27:46 - 27:49
    But when the predominant management
    tool in our business is still based
  • 27:49 - 27:55
    on the assumption of control, how do we
    change that psychology and that mindset?
  • 27:55 - 27:59
    And David is nodding for those of you
    who can't see it at the back.
  • 27:59 - 28:02
    (Doctorow) You know, there's not an
    operating system or a suite of applications
  • 28:02 - 28:06
    that bad guys use, and another set
    that the rest of us use.
  • 28:06 - 28:09
    And so, when our security services
    discover vulnerabilities
  • 28:09 - 28:13
    in the tools that we rely on, and then
    hoard those vulnerabilities,
  • 28:13 - 28:16
    so that they can weaponize them
    to attack criminals or terrorists
  • 28:16 - 28:19
    or whoever they're after,
  • 28:19 - 28:23
they ensure that those vulnerabilities
    remain unpatched on our devices too.
  • 28:23 - 28:25
    So that the people who are
    our adversaries,
  • 28:25 - 28:26
    the criminals who want to attack us,
  • 28:26 - 28:31
    the spies who want to exfiltrate
    our corporate secrets to their countries,
  • 28:31 - 28:36
    the griefers, the autocratic regimes
    who use those vulnerabilities
  • 28:36 - 28:37
to spy on their population --
  • 28:37 - 28:40
    at Electronic Frontier Foundation,
    we have a client from Ethiopia,
  • 28:40 - 28:42
Mr. Kadani, who is a dissident journalist.
  • 28:42 - 28:46
    Ethiopia imprisons more journalists
    than any other country in the world.
  • 28:46 - 28:50
    He fled to America, where the Ethiopian
    government hacked his computer
  • 28:50 - 28:53
    with a bug in Skype that they had
    discovered and not reported.
  • 28:53 - 28:56
    They'd bought the tool to do this from
    Italy, from a company called Hacking Team,
  • 28:56 - 29:00
    they hacked his computer, they found out
    who his colleagues were in Ethiopia
  • 29:00 - 29:03
    and they arrested them and
    subjected them to torture.
  • 29:03 - 29:07
    So, when our security services take a bug
    and weaponize it
  • 29:07 - 29:11
    so they can attack their adversaries,
    they leave that bug intact
  • 29:11 - 29:13
    so that our adversaries can attack us.
  • 29:13 - 29:16
You know, I'm giving a talk
    later today,
  • 29:16 - 29:20
    no matter what side you're on in the war
    on general-purpose computing,
  • 29:20 - 29:21
    you're losing,
  • 29:21 - 29:24
    because all we have right now
    is attack and not defense.
  • 29:24 - 29:28
    Our security services are so concerned
    with making sure that their job is easy
  • 29:28 - 29:31
    when they attack their adversaries,
    that they are neglecting the fact
  • 29:31 - 29:34
    that they are making their adversaries'
    job easy in attacking us.
  • 29:34 - 29:36
    If this were a football match,
  • 29:36 - 29:39
    the score would be tied
    400 to 400 after 10 minutes.
  • 29:39 - 29:41
(Moderator) Cory,
    I've got to stop you there.
  • 29:41 - 29:43
    Thank you very much indeed.
    (Doctorow) Thank you very much.
  • 29:43 - 29:46
    (Applause)
Description:

Cory Doctorow - Writer, Blogger, Activist - USA

The Opening Plenary session of OEB 2015 looked at the challenges of modernity and identified how people, organisations, institutions and societies can make technology and knowledge work together to accelerate the shift to a new age of opportunity.

More info: http://bit.ly/1lugQWX

Video Language:
English
Duration:
29:46