
Why Freedom of Thought Requires Free Media and Why Free Media Require Free Technology

  • 0:15 - 0:21
    Good morning, it's a pleasure to be here and an honor to be at re: publica.
  • 0:21 - 0:34
    For the last thousand years, we, our mothers, and our fathers, have been struggling for freedom of thought.
  • 0:34 - 0:49
    We have sustained many horrible losses and some immense victories, and we are now at a very serious time.
  • 0:49 - 0:59
    From the adoption of printing by Europeans in the 15th century, we began to be concerned primarily
  • 0:59 - 1:03
    with access to printed material.
  • 1:03 - 1:10
    The right to read and the right to publish were the central subjects of our struggle for freedom of
  • 1:10 - 1:16
    thought for most of the last half millennium.
  • 1:16 - 1:32
    The basic concern was for the right to read in private and to think and speak and act on the basis of
  • 1:32 - 1:33
    a free and uncensored will.
  • 1:33 - 1:47
    The primary antagonist for freedom of thought in the beginning of our struggle was the universal Catholic
  • 1:47 - 1:54
    church, an institution directed at the control of thought in the European world.
  • 1:54 - 2:06
    Based around weekly surveillance of the conduct and thoughts of every human being.
  • 2:06 - 2:14
    Based around the censorship of all reading material, and, in the end, based upon the ability to predict
  • 2:14 - 2:21
    and to punish unorthodox thought.
  • 2:21 - 2:30
    The tools available for thought control in early modern Europe were poor, even by 20th-century standards.
  • 2:30 - 2:32
    But they worked.
  • 2:32 - 2:41
    And for hundreds of years, the struggle primarily centered around that increasingly important first
  • 2:41 - 2:49
    mass-manufactured article in western culture: the book.
  • 2:49 - 3:00
    Whether you could print them, possess them, traffic in them, read them, teach from them without the permission
  • 3:00 - 3:10
    or control of an entity empowered to punish thought.
  • 3:10 - 3:21
    By the end of the 17th century, censorship of written material in Europe had begun to break down.
  • 3:21 - 3:31
    First in the Netherlands, then in the UK, then afterwards in waves throughout the European world.
  • 3:31 - 3:43
    And the book became an article of subversive commerce and began eating away at the control of thought.
  • 3:43 - 3:53
    By the late 18th century, that struggle for the freedom of reading had begun to attack the substance
  • 3:53 - 4:08
    of Christianity itself.
  • 4:08 - 4:17
    And the European world trembled on the brink of the first great revolution of the mind. It spoke of "Liberté, égalité, fraternité,"
  • 4:17 - 4:18
    but actually it meant freedom to think differently.
  • 4:18 - 4:26
    The Ancien Régime began to struggle against thinking and we moved into the next phase of the struggle
  • 4:26 - 4:36
    for freedom of thought, which presumed the possibility of unorthodox thinking and revolutionary acting.
  • 4:36 - 4:44
    And for 200 years we struggled with the consequences of those changes.
  • 4:44 - 4:47
    That was then, and this is now.
  • 4:47 - 4:57
    Now we begin a new phase in the history of the human race. We are building a single nervous system which
  • 4:57 - 5:01
    will embrace every human mind.
  • 5:01 - 5:12
    We are less than two generations now from the moment at which every human being will be connected to a single
  • 5:12 - 5:23
    network in which all thoughts, plans, dreams, and actions will flow as nervous impulses in the network.
  • 5:23 - 5:30
    And the fate of freedom of thought, indeed the fate of human freedom altogether, everything that we
  • 5:30 - 5:39
    have fought for for a thousand years will depend upon the neuroanatomy of that network.
  • 5:39 - 5:50
    Ours is the last generation of human brains that will be formed without contact with the net.
  • 5:50 - 5:58
    From here on out, every human brain, by two generations from now, every single human brain will be formed
  • 5:58 - 6:03
    from early life in direct connection to the network.
  • 6:03 - 6:16
    Humanity will become a super-organism in which each of us is but a neuron in the brain and we are describing
  • 6:16 - 6:24
    now, now, all of us now, this generation, unique in the history of the human race, in this generation
  • 6:24 - 6:29
    we will decide how that network is organized.
  • 6:29 - 6:33
    Unfortunately, we are beginning badly.
  • 6:33 - 6:40
    Here's the problem: we grew up to be consumers of media. That's what they taught us. We are consumers
  • 6:40 - 6:41
    of media. That's what they taught us.
  • 6:41 - 6:50
    Now media is consuming us.
  • 6:50 - 6:59
    The things we read watch us read them. The things we listen to listen to us listen to them.
  • 6:59 - 7:08
    We are tracked, we are monitored, we are predicted by the media we use.
  • 7:08 - 7:18
    The process of the building of the network institutionalizes basic principles of information flow.
  • 7:18 - 7:25
    It determines whether there is such a thing as anonymous reading.
  • 7:25 - 7:32
    And it is determining against anonymous reading.
  • 7:32 - 7:42
    20 years ago I began working as a lawyer for a man called Philip Zimmermann, who had created a form of
  • 7:42 - 7:47
    public key encryption for mass use called "Pretty Good Privacy".
  • 7:47 - 7:53
    The effort to create Pretty Good Privacy was the effort to retain the possibility of secrets in the late
  • 7:53 - 8:00
    20th century. Phil was trying to prevent government from reading everything.
  • 8:00 - 8:07
    And, as a result, he was at least threatened with prosecution by the United States government for sharing
  • 8:07 - 8:13
    military secrets, which is what we called public key encryption back then.
  • 8:13 - 8:18
    We said you shouldn't do this, there will be trillions of dollars of electronic commerce if everybody
  • 8:18 - 8:22
    has strong encryption. Nobody was interested.
  • 8:22 - 8:31
    But what was important about Pretty Good Privacy, about the struggle for freedom that public key encryption
  • 8:31 - 8:39
    in civil society represented, what was crucial, became clear when we began to win.
  • 8:39 - 8:48
    In 1995 there was a debate at Harvard Law School. Four of us discussing the future of public key encryption
  • 8:48 - 8:51
    and its control.
  • 8:51 - 8:59
    I was on the side, I suppose, of freedom. It is where I try to be. With me at that debate was a man called
  • 8:59 - 9:06
    Daniel Weitzner, who now works in the White House making internet policy for the Obama Administration.
  • 9:06 - 9:14
    On the other side were the then Deputy Attorney General of the United States and a lawyer in private
  • 9:14 - 9:21
    practice named Stewart Baker, who had been chief counsel to the National Security Agency, our listeners,
  • 9:21 - 9:30
    and who was then in private life helping businesses to deal with the listeners. He later became
  • 9:30 - 9:36
    the deputy for policy planning in the Department of Homeland Security in the United States, and has
  • 9:36 - 9:43
    had much to do with what happened in our network after 2001. At any rate, the four of us spent two pleasant
  • 9:43 - 9:51
    hours debating the right to encrypt and at the end there was a little dinner party in the Harvard faculty
  • 9:51 - 9:56
    club and at the end, after all the food had been taken away and just the port and the walnuts were left
  • 9:56 - 10:03
    on the table, Stewart said, "All right, among us, now that we are all in private, just us girls, I'll let
  • 10:03 - 10:11
    our hair down." He didn't have much hair even then, but he let it down. "We're not going to prosecute
  • 10:11 - 10:20
    your client, Mr. Zimmermann," he said. "Public key encryption will become available. We fought a long losing
  • 10:20 - 10:29
    battle against it but it was just a delaying tactic." And then he looked around the room and he said
  • 10:29 - 10:35
    "But nobody cares about anonymity, do they?"
  • 10:35 - 10:40
    And a cold chill went up my spine and I thought, all right, Stewart, now I know: you're going to spend the
  • 10:40 - 10:47
    next 20 years trying to eliminate anonymity in human society and I'm going to try and stop you and we'll
  • 10:47 - 10:49
    see how it goes.
  • 10:49 - 10:53
    And it is going badly.
  • 10:53 - 10:59
    We didn't build the net with anonymity built in. That was a mistake.
  • 10:59 - 11:02
    Now we're paying for it.
  • 11:02 - 11:08
    Our network assumes that you can be tracked everywhere.
  • 11:08 - 11:15
    And we've taken the web and we've made Facebook out of it.
  • 11:15 - 11:20
    We put one man in the middle of everything.
  • 11:20 - 11:27
    We live our social lives, our private lives, in the web and we share everything with our friends, and
  • 11:27 - 11:38
    also with our super-friend, the one who reports to anybody who makes him, who pays him, who helps him
  • 11:38 - 11:45
    or who gives him the hundred billion dollars he desires.
  • 11:45 - 11:53
    We are creating media that consume us, and media loves it.
  • 11:53 - 12:03
    The primary purpose of 21st-century commerce is to predict how we can be made to buy.
  • 12:03 - 12:09
    And the thing that people most want us to buy is debt.
  • 12:09 - 12:20
    So we are going into debt. We're getting heavier, heavier with debt, heavier with doubt, heavier with
  • 12:20 - 12:27
    all the things we didn't know we needed until they told us we were thinking about it.
  • 12:27 - 12:33
    Because they own the search box and we put our dreams in it.
  • 12:33 - 12:40
    Everything we want, everything we hope, everything we'd like, everything we wish we knew about is in
  • 12:40 - 12:45
    the search box, and they own it.
  • 12:45 - 12:49
    We are reported everywhere, all the time.
  • 12:49 - 12:56
    In the 20th century you had to build Lubyanka, you had to torture people, you had to threaten people,
  • 12:56 - 13:06
    you had to press people to inform on their friends. I don't need to talk about that in Berlin.
  • 13:06 - 13:09
    In the 21st century, why bother?
  • 13:09 - 13:15
    You just build social networking and everybody informs on everybody else for you.
  • 13:15 - 13:21
    Why waste time and money having buildings full of little men who check who is in which photographs?
  • 13:21 - 13:28
    Just tell everybody to tag their friends and bing, you're done.
  • 13:28 - 13:32
    Oh, did I use that word, "bing?" You're done.
  • 13:32 - 13:40
    There's a search box, and they own it, and we put our dreams in it, and they eat them.
  • 13:40 - 13:44
    And they tell us who we are right back.
  • 13:44 - 13:47
    "If you like that, you'll love this"
  • 13:47 - 13:51
    And we do.
  • 13:51 - 13:56
    They figure us out, the machines do.
  • 13:56 - 14:00
    Every time you make a link, you're teaching the machine.
  • 14:00 - 14:06
    Every time you make a link about someone else, you're teaching the machine about someone else.
  • 14:06 - 14:11
    We need to build that network, we need to make that brain.
  • 14:11 - 14:19
    This is humanity's highest purpose; we're fulfilling it. But we mustn't do it wrong.
  • 14:19 - 14:25
    Once upon a time the technological mistakes were mistakes, we made them.
  • 14:25 - 14:34
    They were the unintended consequences of our thoughtful behavior. That's not the way it is right now.
  • 14:34 - 14:38
    The things that are going on are not mistakes, they're designs.
  • 14:38 - 14:50
    They have purpose, and the purpose is to make the human population readable.
  • 14:50 - 14:54
    I was talking to a senior government official in the United States a few weeks ago;
  • 14:54 - 14:59
    our government has been misbehaving.
  • 14:59 - 15:09
    We had rules. We made them after 9/11. They said, "We will keep databases about people, and some of those people will be innocent;
  • 15:09 - 15:15
    they won't be suspected of anything." The rules we made in 2001 said
  • 15:15 - 15:23
    "We will keep information about people not suspected of anything for a maximum of 180 days,
  • 15:23 - 15:27
    then we will discard it."
  • 15:27 - 15:37
    In March, in the middle of the night, on a Wednesday, after everything shut down, when it was raining,
  • 15:37 - 15:42
    the Department of Justice and the director of National Intelligence in the United States said "Oh, we're
  • 15:42 - 15:52
    changing those rules. This small change: we used to say we would keep information on people not suspected
  • 15:52 - 16:00
    of anything for only 180 days maximum, we're changing that a little bit to five years."
  • 16:00 - 16:02
    Which is infinity.
  • 16:02 - 16:08
    I joked with the lawyers I work with in New York that they only wrote five years in the press release because they couldn't get the
  • 16:08 - 16:12
    sideways 8 into the font for the press release.
  • 16:12 - 16:16
    Otherwise they'd have just said "infinity," which is what they mean.
  • 16:16 - 16:22
    So I was having a conversation with a senior government official I have known all these many years who
  • 16:22 - 16:31
    works in the White House, and I said, "You're changing American society." He said, "Well, we realized that
  • 16:31 - 16:38
    we need a robust social graph of the United States."
  • 16:38 - 16:42
    I said "You need a robust social graph of the United States?"
  • 16:42 - 16:43
    "Yes" he said.
  • 16:43 - 16:50
    I said "You mean the United States government is, from now on, going to keep a list of everybody every
  • 16:50 - 16:59
    American knows? Do you think, by any chance, that should require a law?"
  • 16:59 - 17:02
    And he just laughed.
  • 17:02 - 17:09
    Because they did it in a press release in the middle of the night, on Wednesday when it was raining.
  • 17:09 - 17:18
    We're going to live in a world, unless we do something quickly, in which our media consume us and spit
  • 17:18 - 17:22
    in the government's cup.
  • 17:22 - 17:29
    There will never have been any place like it before.
  • 17:29 - 17:36
    And if we let it happen, there will never ever be any place different from it again.
  • 17:36 - 17:47
    Humanity will all have been wired together and media will consume us and spit in the government's cup.
  • 17:47 - 17:51
    And the state will own our minds.
  • 17:51 - 18:00
    The soon-to-be-ex-president of France campaigned last month, as you will recall, on the proposition that
  • 18:00 - 18:09
    there should be criminal penalties for repeat visiting of jihadi websites. That was a threat to
  • 18:09 - 18:15
    criminalize reading in France.
  • 18:15 - 18:28
    Well, he will soon be the ex-president of France, but that doesn't mean it will be an ex-idea in France at all.
  • 18:28 - 18:34
    The criminalization of reading is well advanced.
  • 18:34 - 18:41
    In the United States, in what we call terrorism prosecutions, we now routinely see evidence of people's
  • 18:41 - 18:48
    Google searches submitted as proof of their conspiratorial behavior.
  • 18:48 - 18:57
    The act of seeking knowledge has become an overt act in conspiracy prosecutions.
  • 18:57 - 19:04
    We are criminalizing thinking, reading, and research.
  • 19:04 - 19:10
    We are doing this in so-called free societies. We are doing this in a place with the First Amendment.
  • 19:10 - 19:20
    We are doing this despite everything our history teaches us because we are forgetting even as we learn.
  • 19:20 - 19:24
    We don't have much time.
  • 19:24 - 19:35
    The generation that grew up outside the net is the last generation that can fix it without force.
  • 19:35 - 19:45
    Governments all over the world are falling in love with the idea of data-mining their populations.
  • 19:45 - 19:53
    I used to think that we were going to be fighting the Chinese Communist Party in the third decade
  • 19:53 - 19:56
    of the 21st century.
  • 19:56 - 20:03
    I didn't anticipate that we were going to be fighting the United States government and the government
  • 20:03 - 20:08
    of the People's Republic of China.
  • 20:08 - 20:17
    And when Ms. Kroes is here on Friday, perhaps you'll ask her whether we're going to be fighting her too.
  • 20:17 - 20:21
    Governments are falling in love with data-mining because it really, really works.
  • 20:21 - 20:26
    It's good. It's good for good things as well as evil things.
  • 20:26 - 20:32
    It's good for helping government understand how to deliver services; it is good for government to understand
  • 20:32 - 20:42
    what the problems are going to be; it is good for politicians to understand how voters are going to think.
  • 20:42 - 20:48
    But it creates the possibility of kinds of social control that were previously very difficult, very
  • 20:48 - 20:56
    expensive, and very cumbersome, in very simple and efficient ways.
  • 20:56 - 21:03
    It is no longer necessary to maintain enormous networks of informants, as I have pointed out.
  • 21:03 - 21:12
    The Stasi gets a bargain now, if it comes back, because Zuckerberg does its work for it.
  • 21:12 - 21:20
    But it is more than just the ease of surveillance, it is more than just the permanence of data,
  • 21:20 - 21:25
    it's the relentlessness of living after the end of forgetting.
  • 21:25 - 21:30
    Nothing ever goes away any more.
  • 21:30 - 21:38
    What isn't understood today will be understood tomorrow. The encrypted traffic you use today in relative
  • 21:38 - 21:44
    security is simply waiting until there is enough of it for the cryptanalysis to work.
  • 21:44 - 21:48
    For the breakers to succeed in breaking it.
  • 21:48 - 21:56
    We're going to have to re-do all our security all the time forever because no encrypted packet is ever
  • 21:56 - 21:59
    lost again.
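    [Editor's aside for the technically inclined: the point about redoing security forever is, in essence, the argument for what cryptographers call forward secrecy — rotate keys through a one-way function and destroy the old ones, so a recorder who later steals your current key still cannot unlock past traffic. A minimal generic sketch of a hash ratchet follows; it is an illustration of the principle, not any particular protocol.]

```python
import hashlib

def ratchet(state: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key and the next chain state.

    The chain state advances through a one-way hash and the old state
    is discarded, so compromising today's state does not reveal the
    keys that protected yesterday's recorded packets.
    """
    message_key = hashlib.sha256(state + b"msg").digest()
    next_state = hashlib.sha256(state + b"chain").digest()
    return message_key, next_state

state = b"initial shared secret"   # invented placeholder secret
keys = []
for _ in range(3):
    key, state = ratchet(state)    # the old state is overwritten here
    keys.append(key)

# Three packets, three unrelated keys: recording the ciphertext and
# later stealing the current state yields none of the earlier keys.
assert len(set(keys)) == 3
```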
  • 21:59 - 22:08
    Nothing is unconnected forever, only for a while. Every piece of information can be retained and everything
  • 22:08 - 22:12
    eventually gets linked to something else.
  • 22:12 - 22:19
    That's the rationale for the government official who says we need a robust social graph
  • 22:19 - 22:20
    of the United States.
  • 22:20 - 22:22
    Why do you need it?
  • 22:22 - 22:30
    So the dots you don't connect today, you can connect tomorrow, or next year, or the year after next.
  • 22:30 - 22:39
    Nothing is ever lost, nothing ever goes away, nothing is forgotten any more.
  • 22:39 - 22:49
    So the primary form of collection that should concern us most is media that spy on us while we use them.
  • 22:49 - 22:57
    Books that watch us read them; music that listens to us listen to it; search boxes that report what we
  • 22:57 - 23:04
    are searching for to whoever is searching for us and doesn't know us yet.
  • 23:04 - 23:10
    There is a lot of talk about data coming out of Facebook.
  • 23:10 - 23:16
    Is it coming to me? Is it coming to him? Is it coming to them?
  • 23:16 - 23:21
    They want you to think that the threat is data coming out.
  • 23:21 - 23:27
    You should know that the threat is code going in.
  • 23:27 - 23:34
    For the last 15 years what has been happening in enterprise computing
  • 23:34 - 23:43
    is the addition of that layer of analytics on top of the data warehouse that mostly goes, in enterprise computing,
  • 23:43 - 23:46
    by the name of "business intelligence."
  • 23:46 - 23:54
    What it means is you've been building these vast data warehouses in your company for a decade or two now,
  • 23:54 - 24:01
    you have all the information about your own operations, your suppliers, your competitors, your customers,
  • 24:01 - 24:09
    now you want to make that data start to do tricks by adding it to all the open source data
  • 24:09 - 24:15
    out there in the world and using it to tell you the answers to questions you didn't know you had.
  • 24:15 - 24:18
    That's business intelligence.
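    [Editor's aside: the mechanics can be shown in a few lines. Everything below is invented for illustration — toy tables, made-up numbers, no real warehouse — but it has the shape of a BI query: join your private records against open data you never collected, and get an answer neither dataset held alone.]

```python
import sqlite3

# A toy "warehouse" held entirely in memory.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE sales (customer TEXT, city TEXT, amount REAL);
    CREATE TABLE open_city_data (city TEXT, median_income REAL);
    INSERT INTO sales VALUES ('a', 'Berlin', 120.0), ('b', 'Hamburg', 40.0);
    INSERT INTO open_city_data VALUES ('Berlin', 45000.0), ('Hamburg', 39000.0);
""")

# "Answers to questions you didn't know you had": correlate your own
# operations with outside open data.
rows = db.execute("""
    SELECT s.city, SUM(s.amount) AS revenue, o.median_income
    FROM sales s JOIN open_city_data o ON s.city = o.city
    GROUP BY s.city ORDER BY revenue DESC
""").fetchall()
print(rows)  # each row: (city, revenue, median_income)
```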
  • 24:18 - 24:23
    The real threat of Facebook is the BI layer on top of the Facebook warehouse.
  • 24:23 - 24:27
    The Facebook data warehouse contains the behavior,
  • 24:27 - 24:31
    not just the thinking but also the behavior,
  • 24:31 - 24:36
    of somewhere nearing a billion people.
  • 24:36 - 24:43
    The business intelligence layer on top of it is just all that code they get to run,
  • 24:43 - 24:47
    covered by the terms of service that say they can run any code they want
  • 24:47 - 24:51
    for "improvement of the experience."
  • 24:51 - 24:59
    The business intelligence layer on top of Facebook is where every intelligence service in the world wants to go.
  • 24:59 - 25:07
    Imagine that you're a tiny little secret police organization in some not very important country.
  • 25:07 - 25:14
    Let's put ourselves in their position. Let's call them, I don't know, Kyrgyzstan.
  • 25:14 - 25:21
    You're the secret police; you're in the people business. Secret policing is the people business.
  • 25:21 - 25:29
    You have classes of people that you want. You want agents, you want sources, you have adversaries,
  • 25:29 - 25:33
    and you have "influenceables," that is, people that you can torture who are related to adversaries:
  • 25:33 - 25:39
    wives, husbands, fathers, daughters, you know, those people.
  • 25:39 - 25:46
    So you're looking for classes of people. You don't know their names but you know what they're like.
  • 25:46 - 25:51
    You know who is recruitable for you as an agent, you know who are likely sources.
  • 25:51 - 25:56
    You can give the social characteristics of your adversaries.
  • 25:56 - 26:00
    And once you know your adversaries, you can find the influenceables.
  • 26:00 - 26:04
    So what you want to do is run code inside Facebook.
  • 26:04 - 26:07
    It will help you find the people that you want.
  • 26:07 - 26:14
    It will show you the people whose behavior and whose social circles tell you that they are what you want
  • 26:14 - 26:22
    by way of agents, sources, what the adversaries are and who you can torture to get to them.
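    [Editor's aside: the kind of code such a service wants to run is not exotic. A toy sketch of the last step described above — all names invented — finding the "influenceables" as the direct contacts of known adversaries in an edge list:]

```python
# A hypothetical social-graph edge list; every name is made up.
edges = [
    ("adversary_1", "spouse_1"),
    ("adversary_1", "daughter_1"),
    ("colleague_1", "adversary_1"),
    ("bystander_1", "bystander_2"),
]

adversaries = {"adversary_1"}

# The "influenceables": everyone directly connected to an adversary.
influenceables = set()
for a, b in edges:
    if a in adversaries:
        influenceables.add(b)
    if b in adversaries:
        influenceables.add(a)

print(sorted(influenceables))
```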
  • 26:22 - 26:27
    So you don't want data out of Facebook; the minute you take data out of Facebook, it is dead.
  • 26:27 - 26:32
    You want to put code into Facebook, and run it there and get the results.
  • 26:32 - 26:35
    You want to cooperate.
  • 26:35 - 26:39
    Facebook wants to be a media company.
  • 26:39 - 26:41
    It wants to own the web.
  • 26:41 - 26:45
    It wants you to punch "like" buttons.
  • 26:45 - 26:51
    "Like" buttons are terrific even if you don't punch them because they're web bugs,
  • 26:51 - 26:57
    because they show Facebook every other webpage that you touch that has a "like" button on it,
  • 26:57 - 27:00
    whether you punch it or you don't, they still get a record.
  • 27:00 - 27:09
    The record is you read a page which had a "like" button on it and either you said yes or you said no,
  • 27:09 - 27:16
    and either way you made data, you taught the machine.
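    [Editor's aside: the web-bug mechanism described here can be modeled in a few lines. This is a toy model, not Facebook's actual endpoint: it shows what any third party whose widget is embedded in a page learns from the ordinary headers of the automatic fetch, click or no click.]

```python
# A "like" button is just a resource loaded from a third-party server.
tracking_log = []

def serve_like_button(request_headers: dict) -> bytes:
    """Serve the button image; as a side effect, log who read what."""
    tracking_log.append({
        # The Referer header names the page the button was embedded in.
        "page_read": request_headers.get("Referer"),
        # The cookie ties that page view to a known account.
        "reader": request_headers.get("Cookie"),
    })
    return b"<button image bytes>"

# Your browser fetches the button automatically while rendering a page,
# whether or not you ever click it:
serve_like_button({"Referer": "https://example.org/unorthodox-essay",
                   "Cookie": "user=12345"})

print(tracking_log)
```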
  • 27:16 - 27:25
    So media want to know you better than you know yourself and we shouldn't let anybody do that.
  • 27:25 - 27:28
    We fought for a thousand years for the internal space,
  • 27:28 - 27:36
    the space where we read, think, reflect, and become unorthodox,
  • 27:36 - 27:39
    inside our own minds.
  • 27:39 - 27:44
    That's the space that everybody wants to take away.
  • 27:44 - 27:47
    Tell us your dreams.
  • 27:47 - 27:49
    Tell us your thoughts.
  • 27:49 - 27:51
    Tell us what you hope.
  • 27:51 - 27:52
    Tell us what you fear.
  • 27:52 - 28:01
    This is not weekly auricular confession, this is confession 24 by 7.
  • 28:01 - 28:03
    The mobile robot that you carry around with you,
  • 28:03 - 28:07
    the one that knows where you are all the time and listens to all your conversations.
  • 28:07 - 28:12
    The one that you hope isn't reporting in at headquarters, but it is only a hope?
  • 28:12 - 28:20
    The one that runs all that software you can't read, can't study, can't see, can't modify, and can't understand?
  • 28:20 - 28:21
    That one.
  • 28:21 - 28:27
    That one is taking your confession all the time.
  • 28:27 - 28:31
    When you hold it up to your face from now on it is going to know your heartbeat.
  • 28:31 - 28:34
    That's an Android app right now.
  • 28:34 - 28:40
    Micro-changes in the color of your face reveal your heart rate.
  • 28:40 - 28:45
    That's a little lie detector you're carrying around with you.
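    [Editor's aside: no special hardware is needed for this trick; the technique is known as remote photoplethysmography. A self-contained sketch, with a fabricated 72-beats-per-minute signal standing in for the camera, recovers the pulse from periodic brightness changes by scanning the plausible heart-rate band.]

```python
import math

fps = 30.0          # camera frames per second
true_bpm = 72.0
# Fabricated mean green-channel values of the face, 10 seconds' worth:
# a steady skin tone plus a faint oscillation at the pulse rate.
samples = [0.5 + 0.01 * math.sin(2 * math.pi * (true_bpm / 60.0) * n / fps)
           for n in range(300)]

# Remove the constant skin tone so only the pulse remains.
mean = sum(samples) / len(samples)
centered = [s - mean for s in samples]

def power_at(bpm: int) -> float:
    """Spectral power of the signal at the given heart rate (naive DFT)."""
    f = bpm / 60.0
    re = sum(s * math.cos(2 * math.pi * f * n / fps) for n, s in enumerate(centered))
    im = sum(s * math.sin(2 * math.pi * f * n / fps) for n, s in enumerate(centered))
    return re * re + im * im

# Pick the strongest frequency in the plausible band (40-180 bpm).
estimated_bpm = max(range(40, 181), key=power_at)
print(estimated_bpm)
```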
  • 28:45 - 28:53
    Pretty soon I'll be able to sit in a classroom and watch the blood pressure of my students go up and down.
  • 28:53 - 29:01
    In a law school classroom in the United States, that is really important information.
  • 29:01 - 29:03
    But it is not just me of course, it's everybody right?
  • 29:03 - 29:07
    Because it's just data and people will have access to it.
  • 29:07 - 29:12
    The inside of your head becomes the outside of your face, becomes the inside of your smart phone,
  • 29:12 - 29:22
    becomes the inside of the network, becomes the front of the file at headquarters.
  • 29:22 - 29:28
    So we need free media or we lose freedom of thought. It's that simple.
  • 29:28 - 29:30
    What does free media mean?
  • 29:30 - 29:34
    Media that you can read, that you can think about, that you can add to,
  • 29:34 - 29:40
    that you can participate in without being monitored.
  • 29:40 - 29:47
    Without being surveilled, without being reported on. That's free media.
  • 29:47 - 29:57
    If we don't have it, we lose freedom of thought, possibly forever.
  • 29:57 - 30:07
    Having free media means having a network that behaves according to the needs of the people at the edge,
  • 30:07 - 30:15
    not according to the needs of the servers in the middle.
  • 30:15 - 30:22
    Making free media requires a network of peers, not a network of masters and servants,
  • 30:22 - 30:34
    not a network of clients and servers, not a network where network operators control all the packets they move.
  • 30:34 - 30:41
    This is not simple, but it is still possible.
  • 30:41 - 30:47
    We require free technology.
  • 30:47 - 30:56
    The last time I gave a political speech in Berlin it was in 2004. It was called "Die Gedanken sind frei."
  • 30:56 - 31:04
    I said we need three things: free software, free hardware, free bandwidth. Now we need them more.
  • 31:04 - 31:09
    It is eight years later; we've made some mistakes; we're in more trouble.
  • 31:09 - 31:14
    We haven't come forward, we've gone back.
  • 31:14 - 31:20
    We need free software; that means software you can copy, modify, and redistribute.
  • 31:20 - 31:32
    We need that because we need the software that runs the network to be modifiable by the people the network embraces.
  • 31:32 - 31:39
    The death of Mr. Jobs is a positive event. I am sorry to break it to you like that.
  • 31:39 - 31:43
    He was a great artist and a moral monster
  • 31:43 - 31:53
    and he brought us closer to the end of freedom every single time he put something out because he hated sharing.
  • 31:53 - 31:56
    It wasn't his fault, he was an artist.
  • 31:56 - 32:03
    He hated sharing because he believed he invented everything, even though he didn't.
  • 32:03 - 32:07
    Inside those fine little boxes with the lit up apples on them I see all around the room,
  • 32:07 - 32:14
    is a bunch of free software, tailored to give him control.
  • 32:14 - 32:18
    Nothing illegal, nothing wrong, he obeyed the licenses.
  • 32:18 - 32:23
    He screwed us every time he could and he took everything we gave him,
  • 32:23 - 32:29
    and he made beautiful stuff that controlled its users.
  • 32:29 - 32:36
    Once upon a time there was a man here who built stuff in Berlin, for Albert Speer;
  • 32:36 - 32:41
    his name was Philip Johnson and he was a wonderful artist and a moral monster.
  • 32:41 - 32:49
    And he said he went to work building buildings for the Nazis because they had all the best graphics.
  • 32:49 - 32:53
    And he meant it, because he was an artist.
  • 32:53 - 32:57
    As Mr. Jobs was an artist.
  • 32:57 - 33:01
    But artistry is no guarantee of morality.
  • 33:01 - 33:05
    We need free software.
  • 33:05 - 33:12
    The tablets that you use that Mr. Jobs designed are made to control you.
  • 33:12 - 33:19
    You can't change the software. It's hard even to do ordinary programming.
  • 33:19 - 33:27
    It doesn't really matter, they're just tablets, we just use them, we're just consuming the glories of what they give us.
  • 33:27 - 33:32
    But they're consuming you too.
  • 33:32 - 33:39
    We live, as the science fiction we read when we were children suggested we would, among robots now.
  • 33:39 - 33:46
    We live commensally with robots. But they don't have hands and feet, we're their hands and feet.
  • 33:46 - 33:52
    We carry the robots around with us; they know everywhere we go, they see everything we see;
  • 33:52 - 33:59
    everything we say they listen to and there is no first law of robotics.
  • 33:59 - 34:05
    They hurt us every day and there is no programming to prevent it.
  • 34:05 - 34:08
    So we need free software.
  • 34:08 - 34:17
    Unless we control the software in the network, the network will, in the end, control us.
  • 34:17 - 34:18
    We need free hardware.
  • 34:18 - 34:27
    What that means is that when we buy an electronic something, it should be ours, not someone else's.
  • 34:27 - 34:32
    We should be free to change it, to use it our way,
  • 34:32 - 34:37
    to assure that it is not working for anyone other than ourselves.
  • 34:37 - 34:42
    Of course most of us will never change anything.
  • 34:42 - 34:49
    But the fact that we can change it will keep us safe.
  • 34:49 - 34:56
    Of course we will never be the people that they most want to surveil.
  • 34:56 - 35:05
    The man who will not be president of France, for sure, but who thought he would, now says that he was
  • 35:05 - 35:12
    trapped and his political career was destroyed, not because he raped a hotel housekeeper,
  • 35:12 - 35:17
    but because he was set up by spying inside his smart phone.
  • 35:17 - 35:21
    Maybe he's telling the truth and maybe he isn't.
  • 35:21 - 35:24
    But he's not wrong about the smart phone.
  • 35:24 - 35:29
    Maybe it happened, maybe it didn't, but it will.
  • 35:29 - 35:32
    We carry dangerous stuff around with us everywhere we go.
  • 35:32 - 35:35
    It doesn't work for us, it works for someone else.
  • 35:35 - 35:40
    We put up with it, we have to stop.
  • 35:40 - 35:42
    We need free bandwidth.
  • 35:42 - 35:47
    That means we need network operators who are common carriers,
  • 35:47 - 35:50
    whose only job is to move the packet from A to B.
  • 35:50 - 35:56
    They're merely pipes; they're not allowed to get involved.
  • 35:56 - 35:59
    It used to be that when you shipped a thing from point A to point B,
  • 35:59 - 36:05
    if the guy in the middle opened it up and looked inside it, he was committing a crime.
  • 36:05 - 36:07
    Not any more.
  • 36:07 - 36:11
    In the United States, the House of Representatives voted last week
  • 36:11 - 36:18
    that the network operators in the United States should be completely immunized against lawsuits
  • 36:18 - 36:28
    for cooperating with illegal government spying so long as they do it "in good faith."
  • 36:28 - 36:33
    And capitalism means never having to say you're sorry; you're always doing it in good faith.
  • 36:33 - 36:38
    In good faith all we wanted to do was make money, your honor, let us out.
  • 36:38 - 36:40
    Ok, you're gone.
  • 36:40 - 36:43
    We must have free bandwidth.
  • 36:43 - 36:49
    We still own the electromagnetic spectrum; it still belongs to all of us.
  • 36:49 - 36:55
    It doesn't belong to anyone else. Government is a trustee, not an owner.
  • 36:55 - 37:01
    We have to have spectrum we control, equal for everybody.
  • 37:01 - 37:10
    Nobody's allowed to listen to anybody else, no inspecting, no checking, no record keeping.
  • 37:10 - 37:13
    Those have to be the rules.
  • 37:13 - 37:19
    Those have to be the rules in the same way that censorship had to go.
  • 37:19 - 37:24
    If we don't have rules for free communication, we are re-introducing censorship,
  • 37:24 - 37:28
    whether we know it or not.
  • 37:28 - 37:31
    So we have very little choice now,
  • 37:31 - 37:40
    our space has gotten smaller, our opportunity for change has gotten less.
  • 37:40 - 37:48
    We have to have free software. We have to have free hardware. We have to have free bandwidth.
  • 37:48 - 37:53
    Only from them can we make free media.
  • 37:53 - 37:57
    But we have to work on media too, directly.
  • 37:57 - 38:02
    Not intermitently, not off-hand.
  • 38:02 - 38:11
    We need to demand of media organizations that they obey primary ethics, a first law of media robotics:
  • 38:11 - 38:13
    do no harm.
  • 38:13 - 38:20
    The first rule is: "do not surveil the reader."
  • 38:20 - 38:24
    We can't live in a world where every book reports every reader.
  • 38:24 - 38:31
    If we are, we're living in libraries operated by the KGB.
  • 38:31 - 38:35
    Well, amazon.com
  • 38:35 - 38:40
    Or the KGB, or both, you'll never know.
  • 38:40 - 38:47
    The book, that wonderful printed article, that first commodity of mass capitalism,
  • 38:47 - 38:49
    the book is dying.
  • 38:49 - 38:52
    It's a shame, but it's dying.
  • 38:52 - 38:59
    And the replacement is a box which either surveils the reader or it doesn't.
  • 38:59 - 39:02
    You will remember that amazon.com decided
  • 39:02 - 39:09
    that a book by George Orwell could not be distributed in the United States for copyright reasons.
  • 39:09 - 39:14
    They went and erased it out of all the little amazon book reading devices
  • 39:14 - 39:18
    where customers had "purchased" copies of Animal Farm.
  • 39:18 - 39:25
    Oh, you may have bought it but that doesn't mean that you're allowed to read it.
  • 39:25 - 39:28
    That's censorship.
  • 39:28 - 39:31
    That's book burning.
  • 39:31 - 39:37
    That's what we all lived through in the 20th century.
  • 39:37 - 39:41
    We burned people, places, and art.
  • 39:41 - 39:42
    We fought.
  • 39:42 - 39:50
    We killed tens of millions of people to bring an end to a world in which the state would burn books.
  • 39:50 - 39:53
    And then we watched as it was done again and again.
  • 39:53 - 39:58
    And now we are preparing to allow it to be done without matches.
  • 39:58 - 40:02
    Everywhere, any time.
  • 40:02 - 40:09
    We must have media ethics, and we have the power to enforce those ethics
  • 40:09 - 40:12
    because we're still the people who pay the freight.
  • 40:12 - 40:18
    We should not deal with people who sell surveilled books.
  • 40:18 - 40:25
    We should not deal with people who sell surveilled music.
  • 40:25 - 40:34
    We should not deal with movie companies that sell surveilled movies.
  • 40:34 - 40:39
    We are going to have to say that, even as we work on the technology,
  • 40:39 - 40:48
    because otherwise capitalism will move as fast as possible to make our efforts at freedom irrelevant
  • 40:48 - 40:55
    and there are children growing up who will never know what freedom means.
  • 40:55 - 40:58
    So we have to make a point about it.
  • 40:58 - 41:01
    It will cost us a little bit.
  • 41:01 - 41:03
    Not much, but a little bit.
  • 41:03 - 41:11
    We will have to forgo a few things and make a few sacrifices in our lives to enforce ethics on media.
  • 41:11 - 41:14
    But that's our role.
  • 41:14 - 41:17
    Along with making free technology, that's our role.
  • 41:17 - 41:23
    We are the last generation capable of understanding directly what the changes are
  • 41:23 - 41:28
    because we have lived on both sides of them and we know.
  • 41:28 - 41:31
    So we have a responsibility.
  • 41:31 - 41:36
    You understand that.
  • 41:36 - 41:39
    It's always a surprise to me, though it is deeply true,
  • 41:39 - 41:44
    that of all the cities in the world I travel to, Berlin is the freest.
  • 41:44 - 41:48
    You cannot wear a hat in the Hong Kong airport any more,
  • 41:48 - 41:52
    I found out last month trying to wear my hat in the Hong Kong airport.
  • 41:52 - 41:58
    You're not allowed, it disrupts the facial recognition.
  • 41:58 - 42:03
    There will be a new airport here. Will it be so heavily surveilled
  • 42:03 - 42:09
    that you won't be allowed to wear a hat because it disrupts the facial recognition?
  • 42:09 - 42:13
    We have a responsibility. We know.
  • 42:13 - 42:16
    That's how Berlin became the freest city that I go to.
  • 42:16 - 42:20
    Because we know. Because we have a responsibility.
  • 42:20 - 42:27
    Because we remember, because we've been on both sides of the wall.
  • 42:27 - 42:30
    That must not be lost now.
  • 42:30 - 42:35
    If we forget, no other forgetting will ever happen.
  • 42:35 - 42:37
    Everything will be remembered.
  • 42:37 - 42:42
    Everything you read, all through life, everything you listened to,
  • 42:42 - 42:46
    everything you watched, everything you searched for.
  • 42:46 - 42:54
    Surely we can pass along to the next generation a world freer than that.
  • 42:54 - 42:56
    Surely we must.
  • 42:56 - 42:59
    What if we don't?
  • 42:59 - 43:08
    What will they say when they realize that we lived at the end of a thousand years
  • 43:08 - 43:12
    of struggling for freedom of thought, at the end.
  • 43:12 - 43:19
    When we had almost everything we gave it away.
  • 43:19 - 43:24
    For convenience. For social networking.
  • 43:24 - 43:28
    Because Mr. Zuckerberg asked us to.
  • 43:28 - 43:33
    Because we couldn't find a better way to talk to our friends.
  • 43:33 - 43:41
    Because we loved the beautiful pretty things that felt so warm in the hand.
  • 43:41 - 43:47
    Because we didn't really care about the future of freedom of thought.
  • 43:47 - 43:50
    Because we considered that to be someone else's business.
  • 43:50 - 43:52
    Because we thought it was over.
  • 43:52 - 43:56
    Because we believed we were free.
  • 43:56 - 43:59
    Because we didn't think there was any struggling left to do.
  • 43:59 - 44:01
    That's why we gave it all away.
  • 44:01 - 44:04
    Is that what we're going to tell them?
  • 44:04 - 44:07
    Is that what we're going to tell them?
  • 44:07 - 44:13
    Free thought requires free media.
  • 44:13 - 44:19
    Free media requires free technology.
  • 44:19 - 44:30
    We require ethical treatment when we go to read, to write, to listen, and to watch.
  • 44:30 - 44:38
    Those are the hallmarks of our politics. We need to keep those politics until we die.
  • 44:38 - 44:43
    Because, if we don't, something else will die,
  • 44:43 - 44:50
    something so precious that many many many of our fathers and mothers gave their lives for it.
  • 44:50 - 44:56
    Something so precious that we understood it to define what it meant to be human.
  • 44:56 - 45:01
    It will die if we don't keep those politics for the rest of our lives.
  • 45:01 - 45:09
    And if we do, then all the things we struggled for, we'll get.
  • 45:09 - 45:15
    Because everywhere on earth, everybody will be able to read freely.
  • 45:15 - 45:22
    Because all the Einsteins in the street will be allowed to learn.
  • 45:22 - 45:28
    Because all the Stravinskys will become composers.
  • 45:28 - 45:32
    Because all the Salks will become research physicians.
  • 45:32 - 45:37
    Because humanity will be connected and every brain will be allowed to learn
  • 45:37 - 45:43
    and no brain will be crushed for thinking wrong.
  • 45:43 - 45:47
    We're at the moment where we get to pick.
  • 45:47 - 45:52
    Whether we carry through that great revolution we've been making
  • 45:52 - 45:57
    bit by bloody bit for a thousand years,
  • 45:57 - 46:03
    or whether we give it away for convenience,
  • 46:03 - 46:07
    for simplicity of talking to our friends, for speed in search,
  • 46:07 - 46:13
    and other really important stuff.
  • 46:13 - 46:20
    I said in 2004, when I was here, and I say now, we can win.
  • 46:20 - 46:28
    We can be the generation of people who completed the work of building freedom of thought.
  • 46:28 - 46:35
    I didn't say then, and I must say now, that we are also potentially the generation that can lose.
  • 46:35 - 46:42
    We can slip back into an inquisition worse than any inquisition that ever existed.
  • 46:42 - 46:50
    It may not use as much torture, it may not be as bloody, but it will be more effective.
  • 46:50 - 46:53
    And we mustn't, mustn't let that happen.
  • 46:53 - 46:58
    Too many people fought for us. Too many people died for us.
  • 46:58 - 47:02
    Too many people hoped and dreamed for what we can still make possible.
  • 47:02 - 47:06
    We must not fail.
  • 47:06 - 47:07
    Thank you very much.
  • 47:07 - 48:00
    [Applause]
  • 48:00 - 48:03
    Let's learn how to take questions here.
  • 48:03 - 48:08
    It's not going to be simple but let's set a good example.
  • 48:08 - 48:21
    [pause]
  • 48:21 - 48:22
    [Questioner 1] Thank you.
  • 48:22 - 48:26
    [Questioner 1] You put forward a very gruesome picture of the possible future.
  • 48:26 - 48:29
    [Questioner 1] Could you name some organizations or groups
  • 48:29 - 48:36
    [Questioner 1] in the United States that put forward actions in your way,
  • 48:36 - 48:41
    [Questioner 1] in your positive way of transforming society?
  • 48:41 - 48:45
    Not only in the United States, but around the world we have organizations
  • 48:45 - 48:48
    that are concerned with electronic civil liberties.
  • 48:48 - 48:51
    The EFF, the Electronic Frontier Foundation in the United States.
  • 48:51 - 48:54
    La Quadrature du Net in France
  • 48:54 - 48:57
    Bits of Freedom in the Netherlands and so on.
  • 48:57 - 49:02
    Electronic civil liberties agitation is extraordinarily important.
  • 49:02 - 49:05
    Pressure on governments to obey rules that came down from
  • 49:05 - 49:10
    the 18th century regarding protection of human dignity
  • 49:10 - 49:15
    and the prevention of state surveillance is crucially important.
  • 49:15 - 49:20
    Unfortunately, electronic civil liberties work against governments is not enough.
  • 49:20 - 49:26
    The free software movement, the FSF, the Free Software Foundation in the United States,
  • 49:26 - 49:29
    and the Free Software Foundation Europe, headquartered in Germany,
  • 49:29 - 49:36
    are working in an important way to maintain that system of
  • 49:36 - 49:42
    the anarchistic creation of software which has brought us so much technology we can control.
  • 49:42 - 49:44
    That's crucially important.
  • 49:44 - 49:48
    The Creative Commons movement, which is strongly entrenched,
  • 49:48 - 49:53
    not only in the United States and Germany, but in more than 40 countries around the world
  • 49:53 - 50:00
    is also extraordinarily important because Creative Commons gives to creative workers
  • 50:00 - 50:06
    alternatives to the kind of massive over control in the copyright system
  • 50:06 - 50:11
    which makes surveillance media profitable.
  • 50:11 - 50:15
    The Wikipedia is an extraordinarily important human institution,
  • 50:15 - 50:21
    and we need to continue to support the Wikimedia Foundation as deeply as we can.
  • 50:21 - 50:26
    Of the 100 most visited websites in the United States,
  • 50:26 - 50:29
    in a study conducted by the Wall Street Journal,
  • 50:29 - 50:33
    of the 100 most visited websites in the United States,
  • 50:33 - 50:37
    only one does not surveil its users.
  • 50:37 - 50:41
    You can guess which one it is, it's Wikipedia.
  • 50:41 - 50:48
    We have enormously important developments now going on throughout the world of higher education
  • 50:48 - 50:54
    as universities begin to realize that the costs of higher education must come down
  • 50:54 - 50:58
    and that brains will grow in the web.
  • 50:58 - 51:04
    The Universitat Oberta de Catalunya, the UOC, is the most extraordinary
  • 51:04 - 51:08
    online only university in the world right now.
  • 51:08 - 51:14
    It will soon be competing with more extraordinary universities still.
  • 51:14 - 51:19
    MITx, the Massachusetts Institute of Technology's new program
  • 51:19 - 51:24
    for web education will provide the highest quality technical education on earth
  • 51:24 - 51:30
    for free to everybody everywhere all the time, building on existing MIT OpenCourseWare.
  • 51:30 - 51:36
    Stanford is about to spin off a proprietary web learning structure
  • 51:36 - 51:40
    which will be the Google of higher education if Stanford gets lucky.
  • 51:40 - 51:44
    We need to support free higher education on the web.
  • 51:44 - 51:49
    Every European national Ministry of Education should be working on it.
  • 51:49 - 51:57
    There are many places to look for free software, free hardware, free bandwidth, and free media.
  • 51:57 - 52:02
    There's no better place to look for free media right now on earth than this room.
  • 52:02 - 52:06
    Everybody knows what they can do, they're doing it.
  • 52:06 - 52:12
    We just have to make everybody else understand that if we stop, or if we fail,
  • 52:12 - 52:16
    freedom of thought will be the casualty and we will regret it forever.
  • 52:18 - 52:22
    [Organizer] We've had three more questions in the meantime.
  • 52:22 - 52:24
    [Organizer] The gentleman with the microphone over here will begin.
  • 52:24 - 52:27
    [Organizer] And then one, two, I'm sure there are more of you in the back
  • 52:27 - 52:29
    [Organizer] so raise your hands high.
  • 52:29 - 52:32
    [Organizer] We'll take maybe your first please.
  • 52:32 - 52:36
    [Questioner 2] Thank you very much, I just wanted to ask a short question.
  • 52:36 - 52:42
    [Questioner 2] Can Facebook, can iPhone, and free media coexist in the long run?
  • 52:42 - 52:44
    Probably not.
  • 52:44 - 52:47
    But we don't have to worry too much.
  • 52:47 - 52:53
    iPhone is just a product and Facebook is just a commercial version of a service.
  • 52:53 - 52:56
    I said recently to a newspaper in New York that I thought Facebook
  • 52:56 - 53:00
    would continue to exist for somewhere between 12 and 120 months.
  • 53:00 - 53:02
    I still think that is correct.
  • 53:02 - 53:06
    Federated social networking will become available.
  • 53:06 - 53:11
    Federated social networking in a form which allows you to leave Facebook
  • 53:11 - 53:15
    without leaving your friends, will become available.
  • 53:15 - 53:20
    Better forms of communication without a man in the middle will become available.
  • 53:20 - 53:22
    The question will be will people use them?
  • 53:22 - 53:27
    FreedomBox is an attempt to produce a stack of software
  • 53:27 - 53:31
    that will fit in a new generation of low power, low cost hardware servers
  • 53:31 - 53:35
    the size of mobile phone chargers.
  • 53:35 - 53:37
    And if we do that work right, we will be able to give
  • 53:37 - 53:40
    billions of web servers to the net.
  • 53:40 - 53:45
    which will serve the purpose of providing competing services
  • 53:45 - 53:50
    that don't invade privacy and are compatible with existing services.
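The architecture described here, a small always-on personal server that hosts your own services and keeps no logs, can be sketched minimally in Python. Everything below (the handler, the page text, the no-logging choice) is an invented illustration of the idea, not actual FreedomBox code.

```python
# A minimal sketch of the "personal server" idea: a tiny HTTP service
# you run on low-power hardware you own, so requests never pass through
# a third party that could record them. Illustrative only.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class PersonalPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a page directly from hardware the owner controls.
        body = b"Served from my own box; no third party sees this request."
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Deliberately keep no access logs: "no record keeping".
        pass

def start_personal_server():
    # Port 0 asks the OS for any free port; server_port reports it.
    server = HTTPServer(("127.0.0.1", 0), PersonalPageHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = start_personal_server()
    print(f"Personal server listening on port {srv.server_port}")
```

The design point is that a service like this, running on a plug-sized device behind a home connection, answers from hardware you own, so no intermediary needs to see who reads what.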
  • 53:50 - 53:54
    But mobile phones get changed very frequently, so if the iPhone goes away
  • 53:54 - 53:56
    it's no big deal.
  • 53:56 - 53:58
    And web services are much less unique
  • 53:58 - 54:00
    than they appear right now.
  • 54:00 - 54:04
    Facebook's a brand, it's not a thing that we need to worry about
  • 54:04 - 54:05
    in any great particular.
  • 54:05 - 54:08
    We just have to do it as quickly as possible.
  • 54:08 - 54:10
    Co-existence?
  • 54:10 - 54:12
    Well, all I have to say about that is that
  • 54:12 - 54:16
    they're not going to co-exist with freedom
  • 54:16 - 54:18
    so I'm not sure why I should co-exist with them.
  • 54:18 - 54:26
    [Applause]
  • 54:26 - 54:28
    [Questioner 3] Hi, I'm Trey Gulalm(sp) from Bangladesh.
  • 54:28 - 54:32
    [Questioner 3] Thank you for that wonderfully lucid, scintillating,
  • 54:32 - 54:35
    [Questioner 3] and hugely informative presentation.
  • 54:35 - 54:39
    [Questioner 3] I was involved in introducing email to Bangladesh
  • 54:39 - 54:44
    [Questioner 3] in the early 90s and, at that time, connectivity was very expensive:
  • 54:44 - 54:47
    [Questioner 3] we were spending 30 US cents per kilobyte,
  • 54:47 - 54:51
    [Questioner 3] so a megabyte of data would be $300.
  • 54:51 - 54:55
    [Questioner 3] It's changed since then, but it is still very tightly constrained
  • 54:55 - 54:59
    [Questioner 3] by the regulatory bodies. So we on the ground find it
  • 54:59 - 55:01
    [Questioner 3] very difficult because the powers that be,
  • 55:01 - 55:05
    [Questioner 3] the gatekeepers, have a vested interest in maintaining that.
  • 55:05 - 55:10
    [Questioner 3] But in that gatekeeper nexus there is also a nexus
  • 55:10 - 55:14
    [Questioner 3] between governments in my country and governments in yours,
  • 55:14 - 55:19
    [Questioner 3] and right now the largest biometric database in the world
  • 55:19 - 55:22
    [Questioner 3] is the census of Bangladesh
  • 55:22 - 55:25
    [Questioner 3] and the company that's providing it is a company
  • 55:25 - 55:28
    [Questioner 3] that's directly linked to the CIA.
  • 55:28 - 55:30
    [Questioner 3] So what do we as practitioners do
  • 55:30 - 55:33
    [Questioner 3] to overcome very, very powerful entities?
  • 55:33 - 55:39
    This is why I began by speaking about the United States Government's recent behaviors.
  • 55:39 - 55:44
    My colleagues at the Software Freedom Law Center in India
  • 55:44 - 55:49
    have been spending a lot of time this past month trying to get a motion
  • 55:49 - 55:52
    through the upper house of the Indian Parliament
  • 55:52 - 55:57
    to nullify Department of IT regulations on the censorship of the Indian net.
  • 55:57 - 56:00
    And of course the good news is that the largest biometric database in the world
  • 56:00 - 56:04
    will soon be the retinal scans that the Indian government
  • 56:04 - 56:08
    are going to require if you want to have a propane gas cylinder,
  • 56:08 - 56:11
    or anything else like energy for your home.
  • 56:11 - 56:17
    And the difficulty that we've been having in talking to Indian government officials this month
  • 56:17 - 56:22
    is that they say "Well if the Americans can do it, why can't we?"
  • 56:22 - 56:25
    Which is, unfortunately, true.
  • 56:25 - 56:29
    The United States government has this winter lowered the bar around the world
  • 56:29 - 56:35
    on internet freedom in the sense of datamining your society to the Chinese level.
  • 56:35 - 56:37
    They've fundamentally agreed.
  • 56:37 - 56:41
    They're going to datamine the hell out of their populations
  • 56:41 - 56:45
    and they're going to encourage every other state on earth to do the same.
  • 56:45 - 56:49
    So I'm entirely with you about the definition of the problem.
  • 56:49 - 56:55
    We are not now any longer living in a place, in a stage in our history
  • 56:55 - 56:58
    where we can think in terms of a country at a time.
  • 56:58 - 57:01
    Globalization has reached the point at which these questions
  • 57:01 - 57:05
    of surveillance of society are now global questions
  • 57:05 - 57:10
    and we have to work on them under the assumption that no government
  • 57:10 - 57:15
    will decide to be more virtuous than the super powers.
  • 57:15 - 57:18
    I don't know how we're going to deal with the Chinese Communist Party.
  • 57:18 - 57:20
    I do not know.
  • 57:20 - 57:22
    I know how we're going to deal with the American government.
  • 57:22 - 57:26
    We're going to insist on our rights.
  • 57:26 - 57:30
    We're going to do what it makes sense to do in the United States.
  • 57:30 - 57:32
    We're going to litigate about it.
  • 57:32 - 57:33
    We're going to push.
  • 57:33 - 57:35
    We're going to shove.
  • 57:35 - 57:39
    We're going to be everywhere, including in the street about it.
  • 57:39 - 57:43
    And I suspect that's what's going to happen here too.
  • 57:43 - 57:46
    Unless we move the biggest of the societies on earth,
  • 57:46 - 57:50
    we will have no hope of convincing smaller governments
  • 57:50 - 57:53
    that they have to let go of their controls.
  • 57:53 - 57:55
    So far as bandwidth is concerned, of course,
  • 57:55 - 57:58
    we're going to have to use unregulated bandwidth.
  • 57:58 - 58:01
    That is, we're going to have to build around 802.11 and wifi
  • 58:01 - 58:07
    and any other thing that the rules don't prevent us from using.
  • 58:07 - 58:11
    And how is that going to reach the poorest of the poor?
  • 58:11 - 58:14
    When the mobile phone system can be shaped to reach
  • 58:14 - 58:17
    the poorest of the poor? I don't know.
  • 58:17 - 58:21
    But I've got a little project with street children in Bangalore trying to figure it out.
  • 58:21 - 58:22
    We have to.
  • 58:22 - 58:24
    We have to work everywhere.
  • 58:24 - 58:30
    If we don't, we're going to screw it up for humanity and we can't afford the risk.
  • 58:30 - 58:32
    [Organizer]Thank you. The gentleman over here please.
  • 58:32 - 58:35
    [Questioner 4] Yes, Professor Moglen, I also want to thank you.
  • 58:35 - 58:40
    [Questioner 4] I can tell you that I'm from transformingfreedom.org in Vienna
  • 58:40 - 58:45
    [Questioner 4] and some years ago I saw you talking on a web video
  • 58:45 - 58:49
    [Questioner 4] at FOSDEM, and there I saw you pointing out
  • 58:49 - 58:54
    [Questioner 4] the role of Zimmerman, Fredrick, and we tried to help him as well.
  • 58:54 - 58:59
    [Questioner 4] But listening to you today, I see that this is just too slow,
  • 58:59 - 59:04
    [Questioner 4] too little, and I'm a bit amazed at two things.
  • 59:04 - 59:09
    [Questioner 4] The first is the academic system, let's say the European one
  • 59:09 - 59:13
    [Questioner 4] was founded by Plato and was closed down by force
  • 59:13 - 59:16
    [Questioner 4] about a thousand years later.
  • 59:16 - 59:20
    [Questioner 4] The second start of the European university was
  • 59:20 - 59:23
    [Questioner 4] in the last few centuries, and let's see if we get there,
  • 59:23 - 59:28
    [Questioner 4] to have it running as long as a thousand years.
  • 59:28 - 59:32
    [Questioner 4] So my question is why is it not deeply in the cell structure
  • 59:32 - 59:37
    [Questioner 4] of academia to help the cause that you have talked about today,
  • 59:37 - 59:42
    [Questioner 4] and why don't we have philanthropists helping
  • 59:42 - 59:46
    [Questioner 4] our little projects, running for three or five
  • 59:46 - 59:49
    [Questioner 4] thousand euros here and there, much more, let's say,
  • 59:49 - 59:53
    [Questioner 4] efficiently, like maybe you would agree that Mr. Soros
  • 59:53 - 59:56
    [Questioner 4] tries to do?
  • 59:56 - 60:02
    Some years ago, at Columbia, we tried to interest faculty
  • 60:02 - 60:07
    in the state of preservation of the libraries and I saw
  • 60:07 - 60:13
    more distinguished scholars at my own university than at any other time
  • 60:13 - 60:17
    in my 25 years there, engaged politically.
  • 60:17 - 60:22
    Their primary concern was the aging of the paper
  • 60:22 - 60:26
    on which were printed the 19th-century German doctorates
  • 60:26 - 60:32
    that conserved more philological research than any other literature on earth
  • 60:32 - 60:33
    right?
  • 60:33 - 60:37
    But it was 19th century books that they needed to preserve.
  • 60:37 - 60:41
    The problem with academic life is that it is inherently conservative
  • 60:41 - 60:44
    because it preserves the wisdom of the old.
  • 60:44 - 60:46
    And that is a good thing to do
  • 60:46 - 60:48
    but the wisdom of the old is old,
  • 60:48 - 60:53
    and it doesn't necessarily embrace the issues of the moment perfectly.
  • 60:53 - 60:56
    I mentioned the UOC because I think it is so important
  • 60:56 - 61:02
    to support the university as it maneuvers itself towards the net
  • 61:02 - 61:06
    and away from the forms of learning that have characterized
  • 61:06 - 61:10
    the matriculating university of the past.
  • 61:10 - 61:16
    For the last thousand years mostly we moved scholars to books,
  • 61:16 - 61:19
    and the university grew up around that principle.
  • 61:19 - 61:22
    It grew up around the principle that books are hard to move
  • 61:22 - 61:23
    and people are easy.
  • 61:23 - 61:26
    So you bring everybody to it.
  • 61:26 - 61:29
    Now we live in a world in which it is much simpler
  • 61:29 - 61:31
    to move knowledge to people.
  • 61:31 - 61:37
    But the continuance of ignorance is the desire of businesses that sell knowledge.
  • 61:37 - 61:40
    What we really need is to bring ourselves to help
  • 61:40 - 61:43
    to turn the university system into something else.
  • 61:43 - 61:46
    The something which allows everybody to learn
  • 61:46 - 61:50
    and which demands unsurveilled learning.
  • 61:50 - 61:52
    The Commissioner for the Information Society will be here;
  • 61:52 - 61:57
    she should speak to that.
  • 61:57 - 62:01
    That should be the great question of the European Commission.
  • 62:01 - 62:04
    They know, they printed a report 18 months ago,
  • 62:04 - 62:06
    that said for the cost of 100km of road you
  • 62:06 - 62:10
    can scan 1/6 of all the books in European libraries.
  • 62:10 - 62:15
    That means for the cost of 600km of road we could get 'em all.
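The arithmetic in the claim above is worth making explicit: if the money for 100 km of road scans one sixth of the books, then scanning all of them costs six times as much, the 600 km figure quoted. A one-line sanity check, using only the numbers given in the talk:

```python
# Back-of-envelope check of the quoted EU report figures:
# the cost of 100 km of road scans 1/6 of the books in European libraries,
# so scanning all of them costs the equivalent of 6 x 100 km of road.
fraction_scanned_per_100km = 1 / 6
road_km_to_scan_everything = 100 / fraction_scanned_per_100km
print(road_km_to_scan_everything)  # 600.0
```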
  • 62:15 - 62:17
    We built a lot of roads in a lot of places,
  • 62:17 - 62:19
    including Greece, in the last 10 years,
  • 62:19 - 62:22
    and we could've scanned all the books in Europe at the same time,
  • 62:22 - 62:28
    and made them available to all humanity on an unsurveilled basis.
  • 62:28 - 62:32
    If Mrs. Kroes wants to build a monument to herself,
  • 62:32 - 62:36
    it isn't going to be as a fief-a-day politician.
  • 62:36 - 62:39
    She's going to do it this way, and you're going to ask her.
  • 62:39 - 62:41
    I'm going to be on a plane on my way back across the Atlantic,
  • 62:41 - 62:44
    or I promise you I'd ask her myself.
  • 62:44 - 62:46
    Ask her for me.
  • 62:46 - 62:51
    Say "It's not our fault, Eben wants to know. If you want to hurt somebody, hurt him. You should be changing
  • 62:51 - 62:56
    the European university. You should be breaking it up into unsurveilled reading. You should be putting
  • 62:56 - 63:04
    Google Books and Amazon out of business. That's some North American, Anglo-Saxon, elbow-capitalism thing.
  • 63:04 - 63:07
    Why aren't we making knowledge free in Europe
  • 63:07 - 63:10
    and ensuring that it's unsurveilled?"
  • 63:10 - 63:16
    That would be the biggest step possible and it is within their power.
  • 63:16 - 63:28
    [Organizer] Thank you so much. [Applause] Brilliant. Thank you.
Title:
Why Freedom of Thought Requires Free Media and Why Free Media Require Free Technology
Description:

Media that spy on and data-mine the public are capable of destroying humanity's most precious freedom: freedom of thought. Ensuring that media remain structured to support rather than suppress individual freedom and civic virtue requires us to achieve specific free technology and free culture goals. Our existing achievements in these directions are under assault from companies trying to bottleneck human communications or own our common culture, and states eager to control their subjects' minds. In this talk--one of a series beginning with "The dotCommunist Manifesto" and "Die Gedanken Sind Frei"--I offer some suggestions about how the Free World should meet the challenges of the next decade.

Video Language:
English
Duration:
01:03:42
