OEB 2015 - Opening Plenary - Cory Doctorow

  • 0:00 - 0:01
    (Cory Doctorow) Thank you very much
  • 0:01 - 0:05
    So I'd like to start with something of a
    benediction or permission.
  • 0:05 - 0:08
    I am one of nature's fast talkers
  • 0:08 - 0:11
    and many of you are not
    native English speakers, or
  • 0:11 - 0:13
    maybe not accustomed
    to my harsh Canadian accent
  • 0:13 - 0:16
    in addition I've just come in
    from Australia
  • 0:16 - 0:19
    and so like many of you I am horribly
    jetlagged and have drunk enough coffee
  • 0:19 - 0:21
    this morning to kill a rhino.
  • 0:22 - 0:24
    When I used to be at the United Nations
  • 0:24 - 0:27
    I was known as the scourge of the
    simultaneous translation corps
  • 0:27 - 0:30
    I would stand up and speak
    as slowly as I could
  • 0:30 - 0:32
    and turn around, and there they
    would be in their booths doing this
  • 0:32 - 0:35
    (laughter)
    When I start to speak too fast,
  • 0:35 - 0:38
    this is the universal symbol --
    my wife invented it --
  • 0:38 - 0:42
    for "Cory, you are talking too fast".
    Please, don't be shy.
  • 0:42 - 0:46
    So, I'm a parent, like many of you,
    and, like I'm sure all of you
  • 0:46 - 0:49
    who are parents, parenting kicks my ass
    all the time.
  • 0:50 - 0:55
    And there are many regrets I have
    about the mere seven and half years
  • 0:55 - 0:57
    that I've been a parent
    but none are so keenly felt
  • 0:58 - 1:01
    as my regrets over what's happened
    when I've been wandering
  • 1:01 - 1:05
    around the house and seen my
    daughter working on something
  • 1:05 - 1:09
    that was beyond her abilities, that was
    right at the edge of what she could do
  • 1:09 - 1:13
    and where she was doing something
    that she didn't have competence in yet
  • 1:13 - 1:17
    and you know it's that amazing thing
    to see that frowning concentration,
  • 1:17 - 1:20
    tongue stuck out: as a parent, your
    heart swells with pride
  • 1:20 - 1:21
    and you can't help but go over
  • 1:21 - 1:24
    and sort of peer over their shoulder
    at what they are doing
  • 1:24 - 1:27
    and those of you who are parents know
    what happens when you look too closely
  • 1:28 - 1:30
    at someone who is working
    beyond the edge of their competence.
  • 1:31 - 1:33
    They go back to doing something
    they're already good at.
  • 1:33 - 1:35
    You interrupt a moment
    of genuine learning
  • 1:35 - 1:38
    and you replace it with
    a kind of embarrassment
  • 1:39 - 1:42
    about what you're good at
    and what you're not.
  • 1:43 - 1:48
    So, it matters a lot that our schools are
    increasingly surveilled environments,
  • 1:48 - 1:52
    environments in which everything that
    our kids do is watched and recorded.
  • 1:53 - 1:56
    Because when you do that, you interfere
    with those moments of real learning.
  • 1:57 - 2:01
    Our ability to do things that we are not
    good at yet, that we are not proud of yet,
  • 2:01 - 2:04
    is negatively impacted
    by that kind of scrutiny.
  • 2:04 - 2:06
    And that scrutiny comes
    from a strange place.
  • 2:07 - 2:11
    We have decided that there are
    some programmatic means
  • 2:11 - 2:14
    by which we can find all the web pages
    that children shouldn't look at
  • 2:14 - 2:18
    and we will filter our networks
    to be sure that they don't see them.
  • 2:19 - 2:22
    Anyone who has ever paid attention
    knows that this doesn't work.
  • 2:22 - 2:26
    There are more web pages
    that kids shouldn't look at
  • 2:26 - 2:29
    than can ever be cataloged,
    and any attempt to catalog them
  • 2:29 - 2:32
    will always catch pages that kids
    should be looking at.
  • 2:32 - 2:35
    Any of you who have ever taught
    a unit on reproductive health
  • 2:35 - 2:38
    know the frustration of trying
    to get round a school network.
  • 2:39 - 2:42
    Now, this is done in the name of
    digital protection
  • 2:43 - 2:46
    but it flies in the face of digital
    literacy and of real learning.
  • 2:47 - 2:50
    Because the only way to stop kids
    from looking at web pages
  • 2:50 - 2:52
    they shouldn't be looking at
  • 2:52 - 2:56
    is to take all of the clicks that they
    make, all of the messages that they send,
  • 2:56 - 3:00
    all of their online activity
    and offshore it to a firm
  • 3:00 - 3:04
    that has some nonsensically arrived at
    list of the bad pages.
  • 3:04 - 3:08
    And so, what we are doing is that we're
    exfiltrating all of our students' data
  • 3:08 - 3:10
    to unknown third parties.
  • 3:10 - 3:13
    Now, most of these firms,
    their primary business isn't
  • 3:13 - 3:14
    serving the education sector.
  • 3:14 - 3:16
    Most of them service
    the government sector.
  • 3:17 - 3:21
    They primarily service governments in
    repressive autocratic regimes.
  • 3:21 - 3:24
    They help them ensure that
    their citizens aren't looking at
  • 3:24 - 3:25
    Amnesty International web pages.
  • 3:26 - 3:29
    They repackage those tools
    and sell them to our educators.
  • 3:30 - 3:33
    So we are offshoring our children's clicks
    to war criminals.
  • 3:34 - 3:37
    And what our kids do, now,
    is they just get around it,
  • 3:37 - 3:39
    because it's not hard to get around it.
  • 3:39 - 3:44
    You know, never underestimate the power
    of a kid who is time-rich and cash-poor
  • 3:44 - 3:46
    to get around our
    technological blockades.
  • 3:48 - 3:51
    But when they do this, they don't acquire
    the kind of digital literacy
  • 3:51 - 3:54
    that we want them to have; they don't
    acquire real digital agency
  • 3:54 - 3:58
    and moreover, they risk exclusion
    and in extreme cases,
  • 3:58 - 3:59
    they risk criminal prosecution.
  • 4:00 - 4:04
    So what if instead, those of us who are
    trapped in this system of teaching kids
  • 4:04 - 4:08
    where we're required to subject them
    to this kind of surveillance
  • 4:08 - 4:10
    that flies in the face
    of their real learning,
  • 4:10 - 4:13
    what if instead, we invented
    curricular units
  • 4:13 - 4:16
    that made them real first-class
    digital citizens,
  • 4:16 - 4:20
    in charge of trying to influence
    real digital problems?
  • 4:20 - 4:23
    Like what if we said to them:
    "We want you to catalog the web pages
  • 4:23 - 4:25
    that this vendor lets through
    that you shouldn't be seeing.
  • 4:25 - 4:29
    We want you to catalog those pages that
    you should be seeing, that are blocked.
  • 4:29 - 4:32
    We want you to go and interview
    every teacher in the school
  • 4:32 - 4:35
    about all those lesson plans that were
    carefully laid out before lunch
  • 4:35 - 4:38
    with a video and a web page,
    and over lunch,
  • 4:38 - 4:41
    the unaccountable, distant center
    blocked these critical resources
  • 4:41 - 4:45
    and left them handing out photocopied
    worksheets in the afternoon
  • 4:45 - 4:47
    instead of the unit they prepared.
  • 4:47 - 4:51
    We want you to learn how to do Freedom
    of Information Act requests
  • 4:51 - 4:53
    and find out what your
    school authority is spending
  • 4:53 - 4:56
    to censor your internet access
    and surveil your activity.
  • 4:56 - 5:00
    We want you to learn to use the internet
    to research these companies
  • 5:00 - 5:04
    and we want you to present this
    to your parent-teacher association,
  • 5:04 - 5:07
    to your school authority,
    to your local newspaper."
  • 5:07 - 5:09
    Because that's the kind
    of digital literacy
  • 5:09 - 5:11
    that makes kids into first-class
    digital citizens,
  • 5:11 - 5:16
    that prepares them for a future
    in which they can participate fully
  • 5:16 - 5:18
    in a world that's changing.
  • 5:19 - 5:23
    Kids are the beta-testers
    of the surveillance state.
  • 5:23 - 5:27
    The path of surveillance technology
    starts with prisoners,
  • 5:27 - 5:30
    moves to asylum seekers,
    people in mental institutions
  • 5:30 - 5:34
    and then to its first non-incarcerated
    population: children
  • 5:34 - 5:37
    and then moves to blue-collar workers,
    government workers
  • 5:37 - 5:38
    and white-collar workers.
  • 5:38 - 5:42
    And so, what we do to kids today
    is what we did to prisoners yesterday
  • 5:42 - 5:44
    and what we're going to be doing
    to you tomorrow.
  • 5:44 - 5:47
    And so it matters, what we teach our kids.
  • 5:47 - 5:51
    If you want to see where this goes, this
    is a kid named Blake Robbins
  • 5:51 - 5:55
    and he attended Lower Merion High School
    in Lower Merion Pennsylvania
  • 5:55 - 5:56
    outside of Philadelphia.
  • 5:56 - 6:00
    It's the most affluent public school
    district in America, so affluent
  • 6:00 - 6:03
    that all the kids were issued MacBooks
    at the start of the year
  • 6:03 - 6:05
    and they had to do their homework on
    their MacBooks,
  • 6:05 - 6:08
    and bring them to school every day
    and bring them home every night.
  • 6:08 - 6:11
    And the MacBooks had been fitted with
    laptop theft-recovery software,
  • 6:11 - 6:16
    which is a fancy word for a rootkit, that
    let the school administration
  • 6:16 - 6:20
    covertly operate the cameras
    and microphones on these computers
  • 6:20 - 6:23
    and harvest files off
    of their hard drives
  • 6:24 - 6:26
    view all their clicks, and so on.
  • 6:26 - 6:31
    Now Blake Robbins found out
    that the software existed
  • 6:31 - 6:34
    and how it was being used
    because he and the head teacher
  • 6:34 - 6:37
    had been knocking heads for years,
    since he first got into the school,
  • 6:37 - 6:40
    and one day, the head teacher
    summoned him to his office
  • 6:40 - 6:41
    and said: "Blake, I've got you now."
  • 6:42 - 6:45
    and handed him a print-out of Blake
    in his bedroom the night before,
  • 6:46 - 6:49
    taking what looked like a pill,
    and he said: "You're taking drugs."
  • 6:49 - 6:54
    And Blake Robbins said: "That's a candy,
    it's a Mike and Ike's candy, I take them
  • 6:54 - 6:56
    when I -- I eat them when I'm studying.
  • 6:56 - 6:58
    How did you get a picture
    of me in my bedroom?"
  • 6:59 - 7:03
    This head teacher had taken
    over 6000 photos of Blake Robbins:
  • 7:03 - 7:06
    awake and asleep, dressed and undressed,
    in the presence of his family.
  • 7:07 - 7:10
    And in the ensuing lawsuit, the school
    settled for a large amount of money
  • 7:10 - 7:12
    and promised that
    they wouldn't do it again
  • 7:13 - 7:16
    without informing the students
    that it was going on.
  • 7:16 - 7:19
    And increasingly, the practice is now
  • 7:19 - 7:22
    that school administrations hand out
    laptops, because they're getting cheaper,
  • 7:23 - 7:25
    with exactly the same kind of software,
  • 7:25 - 7:28
    but they let the students know and
    they find that it works even better
  • 7:28 - 7:30
    at curbing the students' behavior,
  • 7:30 - 7:33
    because the students know that
    they're always on camera.
  • 7:34 - 7:38
    Now, the surveillance state is moving
    from kids to the rest of the world.
  • 7:38 - 7:40
    It's metastasizing.
  • 7:40 - 7:44
    Our devices are increasingly designed
    to treat us as attackers,
  • 7:44 - 7:47
    as suspicious parties
    who can't be trusted
  • 7:47 - 7:51
    because our devices' job is to do things
    that we don't want them to do.
  • 7:51 - 7:54
    Now that's not because the vendors
    who make our technology
  • 7:54 - 7:56
    want to spy on us necessarily,
  • 7:56 - 8:01
    but they want to take
    the ink-jet printer business model
  • 8:01 - 8:04
    and bring it into every other realm
    of the world.
  • 8:04 - 8:09
    So the ink-jet printer business model
    is where you sell someone a device
  • 8:09 - 8:12
    and then you get a continuing
    revenue stream from that device
  • 8:12 - 8:16
    by making sure that competitors can't make
    consumables or parts
  • 8:16 - 8:19
    or additional features
    or plugins for that device,
  • 8:19 - 8:22
    without paying rent
    to the original manufacturer.
  • 8:22 - 8:26
    And that allows you to maintain
    monopoly margins on your devices.
  • 8:26 - 8:31
    Now, in 1998, the American government
    passed a law called
  • 8:31 - 8:32
    the Digital Millennium Copyright Act,
  • 8:32 - 8:35
    in 2001 the European Union
    introduced its own version,
  • 8:35 - 8:37
    the European Union Copyright Directive.
  • 8:38 - 8:40
    And these two laws, along with laws
    all around the world,
  • 8:40 - 8:46
    in Australia, Canada and elsewhere,
    these laws prohibit removing digital locks
  • 8:46 - 8:49
    that are used to restrict
    access to copyrighted works
  • 8:49 - 8:52
    and they were originally envisioned as a way
    of making sure that Europeans didn't
  • 8:52 - 8:55
    bring cheap DVDs in from America,
  • 8:55 - 8:58
    or making sure that Australians didn't
    import cheap DVDs from China.
  • 8:58 - 9:04
    And so you have a digital work, a DVD,
    and it has a lock on it and to unlock it,
  • 9:04 - 9:05
    you have to buy an authorized player
  • 9:05 - 9:07
    and the player checks to make sure
    you are in region
  • 9:07 - 9:10
    and making your own player
    that doesn't make that check
  • 9:10 - 9:12
    is illegal because you'd have
    to remove the digital lock.
  • 9:12 - 9:14
    And that was the original intent,
  • 9:14 - 9:19
    it was to allow high rents to be
    maintained on removable media,
  • 9:19 - 9:21
    DVDs and other entertainment content.
  • 9:21 - 9:24
    But it very quickly spread
    into new realms.
  • 9:25 - 9:28
    So, for example, auto manufacturers
    now lock up
  • 9:28 - 9:31
    all of their cars' telemetry
    with digital locks.
  • 9:31 - 9:33
    If you're a mechanic
    and want to fix a car,
  • 9:33 - 9:37
    you have to get a reader
    from the manufacturer
  • 9:37 - 9:40
    to make sure that you can
    see the telemetry
  • 9:40 - 9:43
    and then know what parts to order
    and how to fix it.
  • 9:43 - 9:46
    And in order to get this reader,
    you have to promise the manufacturer
  • 9:46 - 9:50
    that you will only buy parts
    from that manufacturer
  • 9:50 - 9:51
    and not from third parties.
  • 9:51 - 9:54
    So the manufacturers can keep
    the repair costs high
  • 9:54 - 9:57
    and get a secondary revenue stream
    out of the cars.
  • 9:57 - 10:05
    This year, the Chrysler Corporation filed
    comments with the US Copyright Office,
  • 10:05 - 10:08
    to say that they believed that
    this was the right way to do it
  • 10:08 - 10:10
    and that it should be a felony,
    punishable by 5 years in prison
  • 10:10 - 10:12
    and a $500,000 fine,
  • 10:12 - 10:16
    to change the locks on a car that you own,
    so that you can choose who fixes it.
  • 10:17 - 10:20
    It turned out that when they advertised
  • 10:20 - 10:22
    -- well, where is my slide here?
    Oh, there we go --
  • 10:22 - 10:25
    when they advertised that
    it wasn't your father's Oldsmobile,
  • 10:26 - 10:29
    they weren't speaking metaphorically,
    they literally meant
  • 10:29 - 10:30
    that even though your father
    bought the Oldsmobile,
  • 10:30 - 10:33
    it remained their property in perpetuity.
  • 10:34 - 10:36
    And it's not just cars,
    it's every kind of device,
  • 10:36 - 10:39
    because every kind of device today
    has a computer in it.
  • 10:40 - 10:43
    The John Deere Company, the world's
    leading seller of heavy equipment
  • 10:43 - 10:45
    and agricultural equipment technologies,
  • 10:46 - 10:49
    they now view their tractors as
    information gathering platforms
  • 10:49 - 10:51
    and they view the people who use them
  • 10:51 - 10:56
    as the kind of inconvenient gut flora
    of their ecosystem.
  • 10:56 - 10:58
    So if you are a farmer
    and you own a John Deere tractor,
  • 10:58 - 11:03
    when you drive it around your fields,
    the torque sensors on the wheels
  • 11:03 - 11:08
    conduct a centimeter-accurate soil
    density survey of your agricultural land.
  • 11:08 - 11:11
    That would be extremely useful to you
    when you're planting your seed
  • 11:11 - 11:13
    but that data is not available to you
  • 11:13 - 11:15
    unless you remove the digital lock
    from your John Deere tractor,
  • 11:15 - 11:18
    which again, is against the law
    everywhere in the world.
  • 11:18 - 11:20
    Instead, in order to get that data
  • 11:20 - 11:23
    you have to buy a bundle with seeds
    from Monsanto,
  • 11:23 - 11:25
    who are John Deere's seed partners.
  • 11:25 - 11:29
    John Deere then takes this data that they
    aggregate across whole regions
  • 11:29 - 11:31
    and they use it to gain insight
    into regional crop yields
  • 11:31 - 11:33
    that they use to play the futures market.
  • 11:34 - 11:37
    John Deere's tractors are really just
    a way of gathering information
  • 11:37 - 11:39
    and the farmers are secondary to it.
  • 11:39 - 11:42
    Just because you own it
    doesn't mean it's yours.
  • 11:42 - 11:45
    And it's not just the computers
    that we put our bodies into
  • 11:46 - 11:47
    that have this business model.
  • 11:47 - 11:49
    It's the computers that we put
    inside of our bodies.
  • 11:50 - 11:51
    If you're someone who is diabetic
  • 11:51 - 11:55
    and you're fitted with a continuous
    glucose-measuring insulin pump,
  • 11:55 - 11:58
    that insulin pump is designed
    with a digital lock
  • 11:58 - 12:02
    that makes sure that your doctor
    can only use the manufacturer's software
  • 12:02 - 12:04
    to read the data coming off of it
  • 12:04 - 12:07
    and that software is resold
    on a rolling annual license
  • 12:07 - 12:10
    and it can't be just bought outright.
  • 12:10 - 12:12
    And the digital locks are also
    used to make sure
  • 12:12 - 12:14
    that you only buy the insulin
    that the vendor has approved
  • 12:14 - 12:17
    and not generic insulin
    that might be cheaper.
  • 12:17 - 12:21
    We've literally turned human beings
    into ink-jet printers.
  • 12:21 - 12:28
    Now, this has really deep implications
    beyond the economic implications.
  • 12:28 - 12:31
    Because the rules that prohibit
    breaking these digital locks
  • 12:31 - 12:35
    also prohibit telling people
    about flaws that programmers made
  • 12:35 - 12:37
    because if you know about a flaw
    that a programmer made,
  • 12:38 - 12:40
    you can use it to break the digital lock.
  • 12:40 - 12:44
    And that means that the errors,
    the vulnerabilities,
  • 12:44 - 12:50
    the mistakes in our devices, they fester
    in them, they go on and on and on
  • 12:50 - 12:54
    and our devices become these long-lived
    reservoirs of digital pathogens.
  • 12:54 - 12:56
    And we've seen how that plays out.
  • 12:56 - 12:59
    One of the reasons that Volkswagen
    was able to get away
  • 12:59 - 13:01
    with their Diesel cheating for so long
  • 13:01 - 13:04
    is because no one could independently
    audit their firmware.
  • 13:05 - 13:07
    It's happening all over the place.
  • 13:08 - 13:11
    You may have seen --
    you may have seen this summer
  • 13:11 - 13:15
    that Chrysler had to recall
    1.4 million Jeeps
  • 13:15 - 13:19
    because it turned out that they could be
    remotely controlled over the internet
  • 13:19 - 13:22
    while driving down a motorway
    and have their brakes and steering
  • 13:22 - 13:26
    commandeered by anyone, anywhere
    in the world, over the internet.
  • 13:27 - 13:31
    We only have one methodology
    for determining whether security works
  • 13:31 - 13:34
    and that's to subject it
    to public scrutiny,
  • 13:34 - 13:38
    to allow for other people to see
    what assumptions you've made.
  • 13:38 - 13:40
    Anyone can design a security system
  • 13:40 - 13:42
    that he himself can't think
    of a way of breaking,
  • 13:43 - 13:45
    but all that means is that you've
    designed a security system
  • 13:45 - 13:47
    that works against people
    who are stupider than you.
  • 13:48 - 13:50
    And in this regard, security
    is no different
  • 13:50 - 13:52
    from any other kind of knowledge creation.
  • 13:52 - 13:55
    You know, before we had
    contemporary science and scholarship,
  • 13:55 - 13:58
    we had something that looked
    a lot like it, called alchemy.
  • 13:58 - 14:02
    And for 500 years, alchemists kept
    what they thought they knew a secret.
  • 14:02 - 14:06
    And that meant that every alchemist
    was capable of falling prey
  • 14:06 - 14:13
    to that most urgent of human frailties,
    which is our ability to fool ourselves.
  • 14:13 - 14:16
    And so, every alchemist discovered
    for himself in the hardest way possible
  • 14:16 - 14:19
    that drinking mercury was a bad idea.
  • 14:19 - 14:22
    We call that 500-year period the Dark Ages
  • 14:22 - 14:25
    and we call the moment at which
    they started publishing
  • 14:25 - 14:28
    and subjecting themselves
    to adversarial peer review,
  • 14:28 - 14:31
    which is when your friends tell you
    about the mistakes that you've made
  • 14:31 - 14:33
    and your enemies call you an idiot
    for having made them,
  • 14:33 - 14:36
    we call that moment the Enlightenment.
  • 14:37 - 14:40
    Now, this has profound implications.
  • 14:40 - 14:46
    The restriction of our ability to alter
    the security of our devices
  • 14:46 - 14:49
    has consequences for our surveillance society,
  • 14:49 - 14:52
    for our ability to be free people
    in society.
  • 14:52 - 14:56
    At the height of the GDR, in 1989,
  • 14:56 - 15:01
    the STASI had one snitch for every
    60 people in East Germany,
  • 15:01 - 15:03
    in order to surveil the entire country.
  • 15:04 - 15:07
    A couple of decades later, we found out
    through Edward Snowden
  • 15:07 - 15:09
    that the NSA was spying
    on everybody in the world.
  • 15:10 - 15:13
    And the ratio of people who work
    at the NSA to people they are spying on
  • 15:13 - 15:15
    is more like 1 in 10,000.
  • 15:15 - 15:18
    They've achieved a two-and-a-half
    order-of-magnitude
  • 15:18 - 15:20
    productivity gain in surveillance.
  • 15:20 - 15:23
    And the way that they got there
    is in part by the fact that
  • 15:23 - 15:26
    we use devices that
    we're not allowed to alter,
  • 15:26 - 15:28
    that are designed to treat us as attackers
  • 15:28 - 15:31
    and that gather an enormous
    amount of information on us.
  • 15:32 - 15:34
    If the government told you that you're
    required to carry around
  • 15:34 - 15:38
    a small electronic rectangle that
    recorded all of your social relationships,
  • 15:38 - 15:39
    all of your movements,
  • 15:39 - 15:42
    all of your transient thoughts that
    you made known or ever looked up,
  • 15:43 - 15:46
    and would make that
    available to the state,
  • 15:46 - 15:48
    and you would have to pay for it,
    you would revolt.
  • 15:49 - 15:52
    But the phone companies have
    managed to convince us,
  • 15:52 - 15:54
    along with the mobile vendors,
  • 15:54 - 15:57
    that we should foot the bill
    for our own surveillance.
  • 15:57 - 15:59
    It's a bit like during
    the Cultural Revolution,
  • 15:59 - 16:01
    where, after your family members
    were executed,
  • 16:01 - 16:03
    they sent you a bill for the bullet.
  • 16:05 - 16:11
    So, this has big implications, as I said,
    for where we go as a society.
  • 16:11 - 16:15
    Because just as our kids have
    a hard time functioning
  • 16:15 - 16:17
    in the presence of surveillance,
    and learning,
  • 16:17 - 16:19
    and advancing their own knowledge,
  • 16:19 - 16:21
    we as a society have a hard time
    progressing
  • 16:21 - 16:23
    in the presence of surveillance.
  • 16:23 - 16:29
    In our own living memory, people who are
    today thought of as normal and right
  • 16:29 - 16:31
    were doing something that
    a generation ago
  • 16:31 - 16:34
    would have been illegal
    and landed them in jail.
  • 16:34 - 16:35
    For example, you probably know someone
  • 16:35 - 16:38
    who's married to a partner
    of the same sex.
  • 16:38 - 16:41
    If you live in America, you may know
    someone who takes medical marijuana,
  • 16:41 - 16:43
    or if you live in the Netherlands.
  • 16:43 - 16:48
    And not that long ago, people
    who undertook these activities
  • 16:48 - 16:49
    could have gone to jail for them,
  • 16:49 - 16:52
    could have faced enormous
    social exclusion for them.
  • 16:52 - 16:56
    The way that we got from there to here
    was by having a private zone,
  • 16:56 - 16:58
    a place where people weren't surveilled,
  • 16:58 - 17:01
    in which they could advance
    their interests and ideas,
  • 17:01 - 17:04
    do things that were thought of as
    socially unacceptable
  • 17:04 - 17:06
    and slowly change our social attitudes.
  • 17:07 - 17:09
    And unless you think that,
    in 50 years,
  • 17:09 - 17:13
    your grandchildren will sit around
    the Christmas table, in 2065, and say:
  • 17:13 - 17:15
    "How was it, grandma,
    how was it, grandpa,
  • 17:15 - 17:17
    that in 2015, you got it all right,
  • 17:17 - 17:20
    and we haven't had
    any social changes since then?"
  • 17:20 - 17:22
    Then you have to ask yourself
    how, in a world
  • 17:22 - 17:24
    in which we are all
    under continuous surveillance,
  • 17:24 - 17:27
    we are going to find a way
    to improve this.
  • Not Synced
    So, our kids need ICT literacy,
  • Not Synced
    but ICT literacy isn't just typing skills
    or learning how to use PowerPoint.
  • Not Synced
    It's learning how to think critically
  • Not Synced
    about how they relate
    to the means of information,
  • Not Synced
    about whether they are its masters
    or servants.
  • Not Synced
    Our networks are not
    the most important issue that we have.
  • Not Synced
    There are much more important issues
    in society and in the world today.
  • Not Synced
    The future of the internet is
    way less important
  • Not Synced
    than the future of our climate,
    the future of gender equity,
  • Not Synced
    the future of racial equity,
  • Not Synced
    the future of the wage gap
    and the wealth gap in the world,
  • Not Synced
    but every one of those fights is going
    to be fought and won or lost
  • Not Synced
    on the internet:
    it's our most foundational fight.
  • Not Synced
    So, computers
    can make us more free
  • Not Synced
    or they can take away our freedom.
  • Not Synced
    It all comes down to how we regulate them
    and how we use them.
  • Not Synced
    And it's our job, as people who are
    training the next generation,
  • Not Synced
    and whose next generation
    is beta-testing
  • Not Synced
    the surveillance technology
    that will be coming to us,
  • Not Synced
    it's our job to teach them to seize
    the means of information,
  • Not Synced
    to make themselves self-determinant
    in the way that they use their networks
  • Not Synced
    and to find ways to show them
    how to be critical and how to be smart
  • Not Synced
    and how to be, above all, subversive
    and how to use the technology around them.
  • Not Synced
    Thank you.
  • Not Synced
    (Applause)
  • Not Synced
    (Moderator) Cory, thank you very much
    indeed.
  • Not Synced
    (Doctorow) Thank you
    (Moderator) And I've got a bundle
  • Not Synced
    of points which you've stimulated
    from many in the audience, which sent --
  • Not Synced
    (Doctorow) I'm shocked to hear that
    that was at all controversial,
  • Not Synced
    but go on.
    (Moderator) I didn't say "controversial",
  • Not Synced
    you stimulated thinking, which is great.
  • Not Synced
    But a lot of them resonate around
    violation of secrecy and security.
  • Not Synced
    And this, for example,
    from Anneke Burgess:
  • Not Synced
    "Is there a way for students
    to protect themselves
  • Not Synced
    from privacy violations by institutions
    they are supposed to trust?"
  • Not Synced
    I think this is probably a question
    for William Golding as well,
  • Not Synced
    someone who is a senior figure
    in a major university, but
  • Not Synced
    this issue of privacy violations and trust.
  • Not Synced
    (Doctorow) Well, I think that computers
    have a curious dual nature.
  • Not Synced
    So on the one hand, they do expose us
    to an enormous amount of scrutiny,
  • Not Synced
    depending on how they are configured.
  • Not Synced
    But on the other hand, computers
    have brought new powers to us
  • Not Synced
    that are literally new
    on the face of the world, right?
  • Not Synced
    We have never had a reality in which
    normal people could have secrets
  • Not Synced
    from powerful people.
  • Not Synced
    But with the computer in your pocket,
    with that,
  • Not Synced
    you can encrypt a message so thoroughly
  • Not Synced
    that if every hydrogen atom in the
    universe were turned into a computer
  • Not Synced
    and it did nothing until
    the heat death of the universe,
  • Not Synced
    but try to guess what your key was,
  • Not Synced
    we would run out of universe
    before we ran out of possible keys.
  • Not Synced
    So, computers do give us
    the power to have secrets.
  • Not Synced
    The problem is that institutions
    prohibit the use of technology
  • Not Synced
    that allows you to take back
    your own privacy.
  • Not Synced
    It's funny, right? Because we take kids
    and we say to them:
  • Not Synced
    "Your privacy is like your virginity:
  • Not Synced
    once you've lost it,
    you'll never get it back.
  • Not Synced
    Watch out what you're
    putting on Facebook"
  • Not Synced
    -- and I think they should watch
    what they're putting on Facebook,
  • Not Synced
    I'm a Facebook vegan, I don't even --
    I don't use it, but we say:
  • Not Synced
    "Watch what you're putting on Facebook,
  • Not Synced
    don't send out dirty pictures
    of yourself on SnapChat."
  • Not Synced
    All good advice.
  • Not Synced
    But we do it while we are taking away
    all the private information
  • Not Synced
    that they have, all of their privacy
    and all of their agency.
  • Not Synced
    You know, if a parent says to a kid:
  • Not Synced
    "You mustn't smoke
    because you'll get sick"
  • Not Synced
    and the parent says it
    while lighting a new cigarette
  • Not Synced
    off the one that she's just put down
    in the ashtray,
  • Not Synced
    the kid knows that what you're doing
    matters more than what you're saying.
  • Not Synced
    (Moderator) The point is a deficit of trust.
  • Not Synced
    It builds on the kind of work that
    David has been doing as well,
  • Not Synced
    this deficit of trust and privacy.
  • Not Synced
    And there is another point here:
  • Not Synced
    "Is the battle for privacy already lost?
  • Not Synced
    Are we already too comfortable
    with giving away our data?"
  • Not Synced
    (Doctorow) No, I don't think so at all.
  • Not Synced
    In fact, I think that if anything,
    we've reached
  • Not Synced
    peak indifference to surveillance, right?
  • Not Synced
    We are not at peak surveillance,
    not by a long shot.
  • Not Synced
    There will be more surveillance
    before there is less.
  • Not Synced
    But there'll never be fewer people
    who care about surveillance
  • Not Synced
    than there are today,
  • Not Synced
    because, as privacy advocates,
    we spectacularly failed,
  • Not Synced
    over the last 20 years, to get people
    to take privacy seriously
  • Not Synced
    and now we have firms that are
    frankly incompetent,
  • Not Synced
    retaining huge amounts of our personally
    identifying sensitive information
  • Not Synced
    and those firms are leaking
    that information at speed.
  • Not Synced
    So, it started this year with things like
    (phone rings) -- is that me? it's me! --
  • Not Synced
    It started this year with things like
    Ashley Madison
  • Not Synced
    (Moderator) Do you want to take it?
  • Not Synced
    (Doctorow) No no, that was my timer going
    off telling me I've had my 22 minutes
  • Not Synced
    (Moderator) It may be someone who
    doesn't trust my....
  • Not Synced
    (Doctorow) No,no, that was 22 minutes.
  • Not Synced
    So, it started with Ashley Madison,
    the Office of Personnel Management,
  • Not Synced
    everyone who ever applied for security
    clearance in America
  • Not Synced
    had all of their sensitive information,
    everything you had to give them
  • Not Synced
    about why you shouldn't have
    security clearance,
  • Not Synced
    everything that's potentially
    compromising about you,
  • Not Synced
    all of it exfiltrated by what's believed
    to have been a Chinese spy ring.
  • Not Synced
    Something like
    one in twenty Americans now,
  • Not Synced
    have had their data captured and
    exfiltrated from the United States.
  • Not Synced
    This week, VTech, the largest electronic
    toy manufacturer in the world,
  • Not Synced
    leaked the personal information of
    at least five million children,
  • Not Synced
    including potentially photos that
    they took with their electronic toys,
  • Not Synced
    as well as their parents' information,
    their home addresses, their passwords,
  • Not Synced
    their parents' passwords
    and their password hints.
  • Not Synced
    So every couple of weeks,
    from now on,
  • Not Synced
    a couple of million people
    are going to show up
  • Not Synced
    on the door of people who
    care about privacy and say:
  • Not Synced
    "You were right all along,
    what do we do?"
  • Not Synced
    And the challenge is to give them
    something useful
  • Not Synced
    they can do about privacy.
  • Not Synced
    And there are steps that
    you can take personally.
  • Not Synced
    If you go to the Surveillance Self-Defense kit
  • Not Synced
    at the Electronic Frontier Foundation's
    website,
  • Not Synced
    you'll find a set of tools you can use.
  • Not Synced
    But ultimately, it's not
    an individual choice.
  • Not Synced
    It's a social one.
  • Not Synced
    Privacy is a team sport, because if you
    keep your information private
  • Not Synced
    and you send it to someone else
    who is in your social network,
  • Not Synced
    who doesn't keep it private, well, then
    it leaks out their back door.
  • Not Synced
    And so we need social movements
    to improve our privacy.
  • Not Synced
    We also need better tools.
  • Not Synced
    It's really clear that our privacy tools
    have fallen short of the mark
  • Not Synced
    in terms of being accessible
    to normal people.
  • Not Synced
    (Moderator) Cory --
    (Doctorow) I sense you want
  • Not Synced
    to say something: go on
    (Moderator) Yes.
  • Not Synced
    But what I want to do is keep pushing this
    down the route of education as well.
  • Not Synced
    There are two or three questions here
    which are very specific about that.
  • Not Synced
    Could you tell us more about
    how we help students
  • Not Synced
    become digital citizens?
  • Not Synced
    It links in important ways to the
    German and Nordic pre-digital concept
  • Not Synced
    of Bildung (inaudible)
  • Not Synced
    and if you could give your daughter
    just one piece of advice for living
  • Not Synced
    in an age of digital surveillance,
    what would it be?
  • Not Synced
    (Doctorow) So, the one piece of advice
    I would give her is
  • Not Synced
    "Don't settle for anything less
    than total control
  • Not Synced
    of the means of information.
  • Not Synced
    Anytime someone tells you that
    the computer says you must do something,
  • Not Synced
    you have to ask yourself why the computer
    isn't doing what you want it to do.
  • Not Synced
    Who is in charge here?"
  • Not Synced
    In terms of how we improve
    digital citizenship,
  • Not Synced
    I think we are at odds with
    our institutions in large part here.
  • Not Synced
    I think that our institutions
    have decreed that privacy
  • Not Synced
    is a luxury we can't afford
    to give our students
  • Not Synced
    because if we do, they might do
    something we wouldn't like.
  • Not Synced
    And so, as teachers, the only thing that
    we can do without risking our jobs,
  • Not Synced
    without risking our students' education
    and their ability to stay in school
  • Not Synced
    is to teach them how to do judo,
    to teach them
  • Not Synced
    how to ask critical questions,
    to gather data,
  • Not Synced
    to demand evidence-based policy.
  • Not Synced
    And ultimately, I actually think
    that's better.
  • Not Synced
    I think that teaching kids
    how to evade firewalls
  • Not Synced
    is way less interesting than
    teaching them how to identify
  • Not Synced
    who is providing that firewall,
  • Not Synced
    who made the decision
    to buy that firewall,
  • Not Synced
    how much they're spending,
  • Not Synced
    and what the process is
    for getting that person fired.
  • Not Synced
    That's a much more interesting
    piece of digital citizenship
  • Not Synced
    than any piece of firewall hacking.
  • Not Synced
    (Moderator) But Corey, you're teaching
    kids to get hold of their information,
  • Not Synced
    but picking up on your analogy
    with the auto,
  • Not Synced
    where there's a very clear lock on,
    so much of what has to be done
  • Not Synced
    has to be done through the manufacturers
    -- Martin Siebel picking up --
  • Not Synced
    How busy is the hacker community
    with breaking digital locks?
  • Not Synced
    In other words, digital locks
    are being put on so much,
  • Not Synced
    can the next generation, can the kids
    work out where these locks are,
  • Not Synced
    to keep control of their information?
  • Not Synced
    (Doctorow) So, getting the digital locks
    off of our devices
  • Not Synced
    is not just a matter of technology,
    it's also a matter of policy.
  • Not Synced
    Laws like the EUCD and new treaty
    instruments like TTIP
  • Not Synced
    and the Trans-Pacific Partnership
  • Not Synced
    specify that these digital locks
    can't be removed legally,
  • Not Synced
    that firms can use them to extract
    monopoly rents, and to harm our security.
  • Not Synced
    And these treaties will make it impossible
    to remove these locks legally.
  • Not Synced
    It's not enough for there to be
    a demi-monde, in which
  • Not Synced
    people who are technologically savvy
    know how to break their iPhones.
  • Not Synced
    The problem with that is that it produces
    this world in which you can't know
  • Not Synced
    whether the software that you're using
    has anything bad lurking in it, right?
  • Not Synced
    To say to people who have iPhones that
    they want reconfigured
  • Not Synced
    to accept software from parties that
    aren't approved by Apple,
  • Not Synced
    well all you need to do is find a random
    piece of software on the internet
  • Not Synced
    that claims it will allow you to do it,
    get it on your phone,
  • Not Synced
    and then run this software on it
    is like saying to people:
  • Not Synced
    "Well, all you need to do is just
    find some heroin lying around
  • Not Synced
    and put it in your arm: I'm sure
    it'll work out fine." Right?
  • Not Synced
    The ways in which a phone that is
    compromised can harm you
  • Not Synced
    are really without limits, right?
  • Not Synced
    If you think about this, this is a
    rectangle with a camera and a microphone
  • Not Synced
    that you take into the bedroom
    and the toilet.
  • Not Synced
    And the only way you know whether
    that camera or microphone are on or off
  • Not Synced
    is if the software is doing
    what it says it's doing, right?
  • Not Synced
    So, I think that we have this policy
    dimension that's way more important
  • Not Synced
    than the technical dimension,
  • Not Synced
    and that the only way we're going to
    change that policy
  • Not Synced
    is by making kids know what they can't do
  • Not Synced
    and then making them know
    why they're not allowed to do it,
  • Not Synced
    and then setting them loose.
    (Moderator) Two final points then,
  • Not Synced
    building on (inaudible),
  • Not Synced
    but what do you think of recent laws,
    say, in the Netherlands,
  • Not Synced
    allowing police to spy on criminals
    and terrorists with webcams
  • Not Synced
    on their own laptops?
  • Not Synced
    In other words, the extension implicit
    in this question is,
  • Not Synced
    what about keeping an eye on others,
    including the next generation coming up,
  • Not Synced
    and this from Peter Andreesen (check),
    which is slightly connected:
  • Not Synced
    When the predominant management tool
    in our business is still based
  • Not Synced
    on the assumption of control, how do we
    change that psychology and that mindset?
  • Not Synced
    And David is nodding for those views
    (inaudible) at the back.
  • Not Synced
    (Doctorow) You know, there's not an
    operating system or suite of applications
  • Not Synced
    that bad guys use, and another set
    that the rest of us use.
  • Not Synced
    And so, when our security services
    discover vulnerabilities
  • Not Synced
    in the tools that we rely on, and then
    hoard those vulnerabilities,
  • Not Synced
    so that they can weaponize them
    to attack criminals or terrorists
  • Not Synced
    or whoever they're after,
  • Not Synced
    they ensure that those vulnerabilities
    remain unpatched on our devices too.
  • Not Synced
    So that the people who are
    our adversaries,
  • Not Synced
    the criminals who want to attack us,
  • Not Synced
    the spies who want to exfiltrate
    our corporate secrets to their countries,
  • Not Synced
    the griefers, the autocratic regimes
    who use those vulnerabilities
  • Not Synced
    to spy on their population --
  • Not Synced
    at Electronic Frontier Foundation,
    we have a client from Ethiopia,
  • Not Synced
    Mr. Kidane, who is a dissident journalist
  • Not Synced
    -- Ethiopia imprisons more journalists
    than any other country in the world --
  • Not Synced
    he fled to America, where the Ethiopian
    government hacked his computer
  • Not Synced
    with a bug in Skype that they had
    discovered and not reported.
  • Not Synced
    They'd bought the tools to do this from
    Italy, from a company called Hacking Team,
  • Not Synced
    they hacked his computer, they found out
    who his colleagues were in Ethiopia
  • Not Synced
    and they arrested them and
    subjected them to torture.
  • Not Synced
    So, when our security services take a bug
    and weaponize it
  • Not Synced
    so they can attack their adversaries,
    they leave that bug intact
  • Not Synced
    so that our adversaries can attack us.
  • Not Synced
    You know, this is -- I'm giving a talk
    later today,
  • Not Synced
    no matter what side you're on in the war
    on general-purpose computing,
  • Not Synced
    you're losing,
  • Not Synced
    because all we have right now
    is attack and not defense.
  • Not Synced
    Our security services are so concerned
    with making sure that their job is easy
  • Not Synced
    when they attack their adversaries,
    that they are neglecting the fact
  • Not Synced
    that they are making their adversaries'
    job easy in attacking us.
  • Not Synced
    If this were a football match,
  • Not Synced
    the score would be tied
    400 to 400 after 10 minutes.
  • Not Synced
    (Moderator) Cory,
    I've got to stop you there.
  • Not Synced
    Thank you very much indeed.
    (Doctorow) Thank you very much.
Title:
OEB 2015 - Opening Plenary - Cory Doctorow
Description:

Cory Doctorow - Writer, Blogger, Activist - USA

The Opening Plenary session of OEB 2015 looked at the challenges of modernity and identified how people, organisations, institutions and societies can make technology and knowledge work together to accelerate the shift to a new age of opportunity.

More info: http://bit.ly/1lugQWX

Video Language:
English
Duration:
29:46
