Presentation of the FreedomBox by James Vasile

  • 0:08 - 0:11
    I'm very proud to have as a guest here from the United States
  • 0:11 - 0:15
    coming to Elevate, James Vasile of the FreedomBox Foundation
  • 0:15 - 0:21
    James Vasile is working on a multitude of projects
  • 0:21 - 0:24
    like Apache, I think, Joomla and many others. He is also a lawyer,
  • 0:24 - 0:31
    and he's also working with the FreedomBox Foundation and the Free Software Foundation.
  • 0:31 - 0:38
    He's going to present one of the, in my opinion, most revolutionary projects I've seen in recent years
  • 0:38 - 0:43
    as we can see here, a small box: the FreedomBox.
  • 0:43 - 0:48
    Yeah, erm, James is going to do a presentation and then we're going to
  • 0:48 - 0:50
    be open for questions and then sit down for a talk
  • 0:50 - 0:54
    so James, I give the floor to you.
  • 0:54 - 0:57
    Thank you, Daniel.
  • 0:57 - 1:03
    I've been here at the Elevate festival for a few days now
  • 1:03 - 1:10
    I've been attending the talks and the films and the music
  • 1:10 - 1:16
    and this has been an amazing place to see all these different ideas coming together
  • 1:16 - 1:21
    I want to say thank you to Daniel for organizing so much
  • 1:21 - 1:24
    of this. To Joseph as well.
  • 1:24 - 1:30
    To Daniel especially for making a big effort to get me out here,
  • 1:30 - 1:33
    making it possible for me to come out here and being such a gracious host.
  • 1:33 - 1:36
    Thank you Dan, I really appreciate it.
  • 1:36 - 1:43
    APPLAUSE
  • 1:43 - 1:53
    A long time ago, in the beginning of the internet
  • 1:53 - 1:57
    When we first started using the internet as a way to talk to each other
  • 1:57 - 2:01
    We mostly talked directly to each other, right?
  • 2:01 - 2:05
    Think about how email works, on a technical level
  • 2:05 - 2:10
    You take a message, you hand it off to your mail transport agent
  • 2:10 - 2:15
    It sends it through a network, directly to the recipient.
  • 2:15 - 2:17
    It hops through some other computers, but fundamentally
  • 2:17 - 2:21
    you use the network to talk directly to the other computer,
  • 2:21 - 2:26
    the other computer where the recipient gets his or her mail
  • 2:26 - 2:30
    It was a direct communication medium.
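
A minimal sketch of that direct hand-off, using Python's standard smtplib and assuming a mail transport agent listening on localhost (the addresses are hypothetical):

```python
# Hand a message to the local mail transport agent, which relays it
# toward the recipient's machine: host to host, no third-party service.
# Assumes an MTA is listening on localhost (hypothetical setup).
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"   # hypothetical sender
msg["To"] = "bob@example.net"       # hypothetical recipient
msg["Subject"] = "Hello"
msg.set_content("Sent directly, with no centralised service in between.")

with smtplib.SMTP("localhost") as mta:
    mta.send_message(msg)
```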
  • 2:30 - 2:33
    If you're old enough to remember a program called 'talk'
  • 2:33 - 2:37
    Talk was the first, sort of, interactive you type, they see it, they type, you see it
  • 2:37 - 2:40
    instant message application.
  • 2:40 - 2:43
    This again, was direct.
  • 2:43 - 2:48
    You would put their name and address into your program,
  • 2:48 - 2:51
    they would put yours into theirs, and you would just talk directly to each other
  • 2:51 - 2:57
    You didn't send this message through servers, that centralised technology.
  • 2:57 - 3:02
    From there, from those beginnings of talking directly to each other
  • 3:02 - 3:08
    we started to build communities, emailing directly to people.
  • 3:08 - 3:11
    But that was relatively inefficient.
  • 3:11 - 3:17
    Talking directly to people, one-to-one, works very well for one-to-one conversations.
  • 3:17 - 3:20
    But as soon as you want a group conversation
  • 3:20 - 3:22
    as soon as you want to find people reliably who you haven't
  • 3:22 - 3:27
    already set up contacts for, exchanged email addresses and such
  • 3:27 - 3:29
    you run into friction, you run into problems
  • 3:29 - 3:34
    So the solution to that, was to create more centralised structures
  • 3:34 - 3:38
    and we did this with IRC
  • 3:38 - 3:41
    IRC is a place where instead of talking directly to the people we're trying to reach
  • 3:41 - 3:45
    we take a message, and we send it to an IRC server
  • 3:45 - 3:47
    a third party
  • 3:47 - 3:48
    and the IRC server then copies that message
  • 3:48 - 3:51
    to all the people who we might want to talk to.
  • 3:51 - 3:54
    We developed mailing lists, listservs
  • 3:54 - 3:58
    And again, this was a way where we would take our message
  • 3:58 - 3:59
    and hand it to a third party
  • 3:59 - 4:03
    A mail server, that is not us and not the person we're trying to talk to
  • 4:03 - 4:06
    and that mail server would then echo our communication to
  • 4:06 - 4:08
    all the people we want to talk to
  • 4:08 - 4:10
    and this was great, because you didn't have to know the
  • 4:10 - 4:13
    addresses of all the people you wanted to talk to
  • 4:13 - 4:15
    You could just all 'meet' in a common place
  • 4:15 - 4:20
    We all meet in an IRC chatroom, we all meet on a listserv
  • 4:20 - 4:24
    And there were a lot of IRC channels, and a lot of IRC servers
  • 4:24 - 4:25
    and a lot of mail servers
  • 4:25 - 4:27
    all across the internet
  • 4:27 - 4:29
    A lot of places to do this communication.
  • 4:29 - 4:32
    And if you didn't like the policies or the structures or the technology
  • 4:32 - 4:34
    of any one of these service providers
  • 4:34 - 4:37
    these IRC servers, or these list servers
  • 4:37 - 4:38
    you could just switch, you could choose to run your own.
  • 4:38 - 4:40
    It was very simple.
  • 4:40 - 4:47
    This infrastructure is not hard to create, it's not hard to run, it's not hard to install.
  • 4:47 - 4:50
    And so a lot of people did run, create and install it.
  • 4:50 - 4:53
    There were a bunch of IRC servers, there were a bunch of different listserv packages
  • 4:53 - 4:58
    But as we've moved forward in time,
  • 4:58 - 5:01
    we've started to centralise even more.
  • 5:01 - 5:05
    And, you can fast-forward to today
  • 5:05 - 5:07
    where we're channeling our communication
  • 5:07 - 5:11
    through fewer and fewer places.
  • 5:11 - 5:14
    And we are making structures that are more and more central
  • 5:14 - 5:16
    and more and more over-arching
  • 5:16 - 5:21
    So, from the, the IRC way of talking to each other
  • 5:21 - 5:25
    we moved to instant messaging applications.
  • 5:25 - 5:28
    AOL Instant Messenger, ICQ,
  • 5:28 - 5:31
    those were the early ways to do it
  • 5:31 - 5:33
    and there were only a few of them
  • 5:33 - 5:37
    MSN had its messaging system, Yahoo had its messaging system
  • 5:37 - 5:39
    and when people wanted to talk to each other now,
  • 5:39 - 5:41
    they were using third-parties again.
  • 5:41 - 5:43
    But they were only using a few third parties.
  • 5:43 - 5:47
    And if you wanted to switch providers,
  • 5:47 - 5:49
    you would leave almost everyone you knew behind,
  • 5:49 - 5:51
    your entire community behind.
  • 5:51 - 5:53
    And so it becomes harder to switch.
  • 5:53 - 5:55
    There are fewer options
  • 5:55 - 5:58
    and the cost of switching leaves more and more people behind
  • 5:58 - 6:01
    So you started to have lock-in.
  • 6:01 - 6:06
    You started to have people who were chained to their methods of communication
  • 6:06 - 6:08
    because the cost of losing your community is too high.
  • 6:08 - 6:10
    And so if you don't like the technology, or you don't like the policy
  • 6:10 - 6:12
    or you don't like the politics
  • 6:12 - 6:13
    or if they're trying to filter you
  • 6:13 - 6:15
    or censor you
  • 6:15 - 6:16
    you don't have a lot of options.
  • 6:16 - 6:19
    The cost of leaving is so high that you might stay.
  • 6:19 - 6:21
    People do stay. And they accept it.
  • 6:21 - 6:25
    And we went from that small basket of providers of this kind
  • 6:25 - 6:27
    of communication technology
  • 6:27 - 6:29
    to an even more centralised structure
  • 6:29 - 6:34
    where there is effectively only one way to reach all our friends,
  • 6:34 - 6:36
    in each mode of communication,
  • 6:36 - 6:38
    Facebook.
  • 6:38 - 6:39
    And Twitter.
  • 6:39 - 6:41
    These two services rule everything.
  • 6:41 - 6:43
    And I'm not going to stand here and say Facebook is evil
  • 6:43 - 6:45
    and that Twitter is evil
  • 6:45 - 6:49
    What I want to say is that having one place
  • 6:49 - 6:51
    where we do all our communication
  • 6:51 - 6:53
    leaves us at the mercy of the policies of the people
  • 6:53 - 6:56
    that control the infrastructure that we are chained to,
  • 6:56 - 6:58
    that we are stuck using, that we are locked into.
  • 6:58 - 7:02
    You can't leave Facebook without leaving everybody you know
  • 7:02 - 7:06
    because everybody you know is on Facebook.
  • 7:06 - 7:10
    I was not a Facebook user.
  • 7:10 - 7:11
    I was against Facebook.
  • 7:11 - 7:14
    I thought it was bad to centralise all our communication in one place.
  • 7:14 - 7:16
    I didn't like the privacy implications,
  • 7:16 - 7:18
    I didn't like Facebook's censorship
  • 7:18 - 7:22
    of things like pictures of nursing mothers.
  • 7:22 - 7:23
    I don't think that kind of thing is obscene,
  • 7:23 - 7:25
    and I don't think Facebook should have the ability to tell us
  • 7:25 - 7:28
    what we can share with our friends.
  • 7:28 - 7:29
    So I thought those were bad policies,
  • 7:29 - 7:32
    and I reacted to that by not joining Facebook. For years.
  • 7:32 - 7:36
    All my friends were on Facebook.
  • 7:36 - 7:42
    I joined Facebook late last year. November.
  • 7:42 - 7:48
    Because in November, a friend of mine passed away.
  • 7:48 - 7:50
    His name was Chuck. He was a brilliant man.
  • 7:50 - 7:55
    And he lived a lot of his life online.
  • 7:55 - 7:58
    He was on Facebook, and he shared things with friends on Facebook.
  • 7:58 - 8:01
    When he passed away I realised I hadn't communicated with him in a while,
  • 8:01 - 8:03
    I hadn't really talked to him in a while.
  • 8:03 - 8:06
    And the reason I hadn't was because I wasn't
  • 8:06 - 8:08
    communicating with him in the place he communicates.
  • 8:08 - 8:10
    I wasn't meeting him where he was, I wasn't on Facebook.
  • 8:10 - 8:12
    I was missing out on something huge.
  • 8:12 - 8:16
    That's the cost of not being there.
  • 8:16 - 8:17
    And so I joined.
  • 8:17 - 8:19
    Because I decided that as strong as my beliefs were,
  • 8:19 - 8:21
    it was more important to me to be there with my friends and
  • 8:21 - 8:23
    to talk to my friends.
  • 8:23 - 8:25
    That's the power of lock-in.
  • 8:25 - 8:27
    Me, a person who cares, as much as I do,
  • 8:27 - 8:31
    who cares enough about these issues that I do something like this
  • 8:31 - 8:33
    I got locked into Facebook. I'm there now.
  • 8:33 - 8:35
    That's how I talk to a lot of my friends, whether I like it or not
  • 8:35 - 8:39
    I am locked into Facebook.
  • 8:39 - 8:43
    You know, I'm also on Diaspora. But my friends aren't on Diaspora.
  • 8:43 - 8:47
    This sort of lock-in creates a sort of situation where
  • 8:47 - 8:51
    we have one arbiter of what is acceptable speech,
  • 8:51 - 8:53
    whether we like it or not.
  • 8:53 - 8:55
    If they're free, we're free to the extent,
  • 8:55 - 8:56
    only to the extent,
  • 8:56 - 8:57
    that they give us freedom.
  • 8:57 - 8:59
    And that to me isn't freedom.
  • 8:59 - 9:01
    That to me is accepting what you're given.
  • 9:01 - 9:04
    It's the exact opposite of making your own choices.
  • 9:04 - 9:09
    The exact opposite of self-determination.
  • 9:09 - 9:14
    All of our problems in communication can be traced
  • 9:14 - 9:17
    to centralized communications infrastructure.
  • 9:17 - 9:23
    Now, I've sort of told this story at the social level,
  • 9:23 - 9:26
    in the way that we're talking about how to talk to your peers
  • 9:26 - 9:29
    and your friends on the internet.
  • 9:29 - 9:34
    But this story also exists when we think about relying on the pipes,
  • 9:34 - 9:38
    relying on the hardware, the technical infrastructure behind the software.
  • 9:38 - 9:43
    We rely on internet backbones,
  • 9:43 - 9:46
    we rely on centralized cellphone networks,
  • 9:46 - 9:48
    we rely on centralized telephone networks.
  • 9:48 - 9:52
    The people that control these networks have the ability
  • 9:52 - 9:55
    to tell us what we're allowed to say,
  • 9:55 - 9:57
    when we're allowed to say it.
  • 9:57 - 10:00
    They have the ability to filter us, to censor us, to influence us.
  • 10:00 - 10:03
    Sometimes they use that ability, and sometimes they don't,
  • 10:03 - 10:05
    and sometimes by law they're not allowed to.
  • 10:05 - 10:06
    But at the end of the day
  • 10:06 - 10:09
    the power doesn't rest in our hands.
  • 10:09 - 10:12
    The power, from a technological perspective,
  • 10:12 - 10:14
    rests in the hands of the people that operate the
  • 10:14 - 10:16
    networks.
  • 10:16 - 10:20
    Centralization doesn't just allow this sort of filtering and censorship.
  • 10:20 - 10:24
    There's another big problem with centralization.
  • 10:24 - 10:26
    The other big problem with centralization is that by
  • 10:26 - 10:30
    gathering all of our data in one place
  • 10:30 - 10:34
    it becomes easy
  • 10:34 - 10:37
    to spy on us.
  • 10:37 - 10:39
    So every time you go to a website
  • 10:39 - 10:41
    pretty much
  • 10:41 - 10:45
    the website includes, at the bottom of the page
  • 10:45 - 10:50
    a little graphic or invisible Javascript thing
  • 10:50 - 10:53
    that tells Google that you came to visit the page.
  • 10:53 - 10:56
    Eva goes to a website, and the website says
  • 10:56 - 10:59
    "Hey Google! Eva just came to my website!"
  • 10:59 - 11:01
    Every time she goes to a website, that happens.
  • 11:01 - 11:05
    And so Google effectively sits next to her and watches,
  • 11:05 - 11:07
    while she uses the internet.
  • 11:07 - 11:08
    Watches everything she does,
  • 11:08 - 11:09
    and everything she enters,
  • 11:09 - 11:12
    everything she looks at, and knows it.
  • 11:12 - 11:15
    It's not just her search data, it's not just her Gmail.
  • 11:15 - 11:19
    It's the entire picture of her digital life.
  • 11:19 - 11:22
    In one place.
  • 11:22 - 11:24
    That's a pretty complete profile.
  • 11:24 - 11:25
    If you were able...
  • 11:25 - 11:28
    ...imagine if somebody could sit next to you and watch
  • 11:28 - 11:29
    everything you did online,
  • 11:29 - 11:31
    imagine how much they would know about you.
  • 11:31 - 11:33
    That's how much Google knows about you.
  • 11:33 - 11:36
    Google knows more about you than you know about yourself,
  • 11:36 - 11:40
    because Google never forgets.
  • 11:40 - 11:43
    Google knows more about you than your parents,
  • 11:43 - 11:44
    than your partner,
  • 11:44 - 11:47
    Google knows your secrets, your worst secrets,
  • 11:47 - 11:49
    Google knows if you're cheating on your spouse
  • 11:49 - 11:50
    because they saw you do the Google search for the
  • 11:50 - 11:55
    sexually-transmitted disease.
  • 11:55 - 11:57
    Google knows your hopes and your dreams.
  • 11:57 - 11:58
    Because the things we hope and dream about,
  • 11:58 - 11:59
    we look for more information about.
  • 11:59 - 12:01
    We're natural information seekers.
  • 12:01 - 12:02
    We think about something, it fascinates us,
  • 12:02 - 12:05
    we go and look it up online. We search around.
  • 12:05 - 12:07
    We look around the internet, and we think about it.
  • 12:07 - 12:11
    And Google is right there. Following our thought process,
  • 12:11 - 12:15
    the thought process in our click trail.
  • 12:15 - 12:19
    That is an intimate relationship.
  • 12:19 - 12:21
    Right? Do you want an intimate relationship with Google?
  • 12:21 - 12:22
    Maybe you do.
  • 12:22 - 12:26
    I personally, don't.
  • 12:26 - 12:29
    But that's it, Google sits next to us and watches us use
  • 12:29 - 12:30
    our computers.
  • 12:30 - 12:35
    And if anyone actually did... if you had a friend who wanted
  • 12:35 - 12:37
    to sit next to you, or a stranger said I want to sit next to you
  • 12:37 - 12:39
    and just watch you use your computer all day,
  • 12:39 - 12:41
    you would use that computer very differently to the way you do now.
  • 12:41 - 12:44
    But because Google doesn't physically sit next to you,
  • 12:44 - 12:49
    Google sits invisibly in the box, you don't know Google is there.
  • 12:49 - 12:51
    But you do know, right?
  • 12:51 - 12:53
    We're all aware of this. I'm not saying any of you don't know,
  • 12:53 - 12:56
    especially in a room like this.
  • 12:56 - 12:57
    But we don't think about it.
  • 12:57 - 12:59
    We try not to think about it.
  • 12:59 - 13:02
    We are locked in to the internet.
  • 13:02 - 13:04
    We can't stop using it.
  • 13:04 - 13:05
    And the structures that exist,
  • 13:05 - 13:07
    the infrastructure that exists,
  • 13:07 - 13:09
    has been slowly turned from
  • 13:09 - 13:13
    a means to allow us to communicate with each other
  • 13:13 - 13:16
    to a means of allowing us to access web services
  • 13:16 - 13:20
    in return for all our personal information so we can be bought and sold
  • 13:20 - 13:22
    like products.
  • 13:22 - 13:25
    That is the problem. That is the problem of centralization, of having one structure.
  • 13:25 - 13:27
    As soon as we put all that information in one place
  • 13:27 - 13:32
    they get complete profiles of us, complete pictures of you.
  • 13:32 - 13:33
    And that is a lot of information.
  • 13:33 - 13:35
    It's valuable information.
  • 13:35 - 13:39
    It's information that is used, right now, mostly to sell you things.
  • 13:39 - 13:42
    And that, you might find objectionable.
  • 13:42 - 13:43
    Maybe you don't.
  • 13:43 - 13:47
    Maybe you don't believe the studies that say you can't ignore advertising.
  • 13:47 - 13:52
    Maybe you think that you are smart and special, and advertising doesn't affect you.
  • 13:52 - 13:53
    You're wrong.
  • 13:53 - 13:56
    But maybe you believe that.
  • 13:56 - 14:02
    But that information, that same infrastructure, that same technology that allows them
  • 14:02 - 14:06
    to know you well enough to sell you soap
  • 14:06 - 14:12
    allows them to know you well enough to decide how much of a credit risk you are,
  • 14:12 - 14:14
    how much of a health risk you are,
  • 14:14 - 14:17
    and what your insurance premiums should look like.
  • 14:17 - 14:19
    In America we have a big problem right now.
  • 14:19 - 14:23
    Insurance costs are out of control. Health insurance. We're having a lot of difficulty paying for it.
  • 14:23 - 14:29
    Insurance companies would like to respond to this problem
  • 14:29 - 14:32
    by knowing better who's a good risk and who's a bad risk
  • 14:32 - 14:36
    so they can lower prices for the good risk and raise prices for the bad risk.
  • 14:36 - 14:41
    Essentially, they want to make people who are going to get sick uninsurable.
  • 14:41 - 14:45
    And if you could know enough about a person to know what their risk factors are based on
  • 14:45 - 14:49
    what their digital life is, if you can get just a little bit of information about them,
  • 14:49 - 14:53
    maybe you can figure out who their parents are and what hereditary diseases they might be subject to,
  • 14:53 - 14:56
    you can start to understand these things.
  • 14:56 - 14:59
    You can start to figure out who's a good risk and who's a bad risk.
  • 14:59 - 15:04
    You can use this information for ends that seem reasonable if you're a health insurance
  • 15:04 - 15:07
    company, but probably don't seem reasonable if you're
  • 15:07 - 15:10
    the kind of person sitting in this room, the kind of person that I talk to.
  • 15:10 - 15:17
    And that's the problem. The innocuous use. The use that seems kind of icky, but not truly evil,
  • 15:17 - 15:20
    which is advertising.
  • 15:20 - 15:25
    It's the same mechanism, the same data, that then gets used for other purposes.
  • 15:25 - 15:33
    It's the same data that then gets turned over to a government who wants to oppress you
  • 15:33 - 15:37
    because you are supporting WikiLeaks.
  • 15:37 - 15:40
    And that's not a fantasy, that's what happened.
  • 15:40 - 15:49
    It's the same information that anybody who wants to know something about you for an evil end would use.
  • 15:49 - 15:57
    We have a saying in the world of information, that if the data exists, you can't decide what it gets
  • 15:57 - 15:58
    used for.
  • 15:58 - 16:03
    Once data exists, especially data in the hands of the government, of officials,
  • 16:03 - 16:06
    once that data exists, it's a resource.
  • 16:06 - 16:10
    And the use of that resource has its own energy, its own logic.
  • 16:10 - 16:15
    Once a resource is there begging to be used, it's very hard to stop it from being used.
  • 16:15 - 16:23
    Because it's so attractive, it's so efficient, it would solve so many problems to use the data.
  • 16:23 - 16:29
    And so once you collect the data, once the data exists in one centralized place,
  • 16:29 - 16:35
    for anybody to come and get it with a warrant, or maybe no warrant, or maybe some money...
  • 16:35 - 16:41
    somebody is going to come with a warrant, or no warrant, and they are going to get that data.
  • 16:41 - 16:43
    And they will use it for whatever they want to use it for.
  • 16:43 - 16:47
    Once it's out of the hands of the first person who collected it, who maybe you trust,
  • 16:47 - 16:53
    who maybe has good privacy policies, who maybe has no intention to do anything with your data
  • 16:53 - 16:59
    other than use it for diagnostic purposes, once it's out of that person's hands it's gone.
  • 16:59 - 17:01
    You never know where it goes after that.
  • 17:01 - 17:03
    It is completely uncontrolled and unchecked
  • 17:03 - 17:06
    and there is no ability to restrain what happens to that data.
  • 17:06 - 17:14
    So all of this is my attempt to convince you that privacy is a real value in our society,
  • 17:14 - 17:18
    and that the danger of losing privacy is a real problem.
  • 17:18 - 17:21
    It's not just the censorship, it's not just the filtering,
  • 17:21 - 17:27
    it's not just the propaganda, the influencing of opinion, that's one aspect of it,
  • 17:27 - 17:35
    it's not just the free speech. It's also the privacy, because privacy goes to the heart of our autonomy.
  • 17:35 - 17:43
    About a year and a half to two years ago, at the Software Freedom Law Center,
  • 17:43 - 17:48
    a man named Ian Sullivan, who's a co-worker of mine,
  • 17:48 - 17:50
    he bought a bunch of plug servers,
  • 17:50 - 17:54
    because he was really excited at the thought of using them as print servers, and media servers,
  • 17:54 - 17:59
    and he started tinkering with them in our office.
  • 17:59 - 18:03
    My boss Eben Moglen, who is a long-time activist in the Free Software movement,
  • 18:03 - 18:15
    fought very hard for Phil Zimmermann and PGP when that was a big issue,
  • 18:15 - 18:24
    he looked at this technology and he immediately realised that several streams had come together in one
  • 18:24 - 18:25
    place.
  • 18:25 - 18:28
    There's a lot of really good technology to protect your privacy right now.
  • 18:28 - 18:31
    In fact that's the stuff we're putting on the FreedomBox.
  • 18:31 - 18:33
    We're not writing new software.
  • 18:33 - 18:37
    We are gathering stuff, and putting it in one place.
  • 18:37 - 18:41
    Stuff that other people did because there are people who are better at writing software, and security,
  • 18:41 - 18:43
    than we are. We're software integrators.
  • 18:43 - 18:47
    And he realised there was all this software out there, and suddenly there was a box to put it on.
  • 18:47 - 18:53
    You could put all that software in one place, make it easy, and give it to people in one neat package.
  • 18:53 - 18:57
    Pre-installed, pre-configured, or as close to it as we can get.
  • 18:57 - 19:03
    And that, was the vision for the FreedomBox.
  • 19:03 - 19:08
    The FreedomBox is a tiny computer. Look at this.
  • 19:08 - 19:11
    That's small, it's unobtrusive.
  • 19:11 - 19:12
    So it's a small computer.
  • 19:12 - 19:16
    And we don't just mean small in size... it doesn't take a lot of energy.
  • 19:16 - 19:23
    I could be running this box on a couple of AA batteries for the life of this presentation.
  • 19:23 - 19:25
    You could run it on a solar panel.
  • 19:25 - 19:28
    It's very lightweight infrastructure.
  • 19:28 - 19:33
    You plug it into your home network, and when I say home network,
  • 19:33 - 19:35
    (I'm going to pass this around)
  • 19:35 - 19:38
    When I say home network, I mean home network.
  • 19:38 - 19:43
    This is technology we are designing for individuals to use to talk to their friends.
  • 19:43 - 19:48
    Our use-case, the thing we're trying to protect is you guys, as individuals in your communities.
  • 19:48 - 19:52
    This isn't a small-business appliance, it's not a large corporate appliance, this is a thing
  • 19:52 - 19:59
    that we are truly aiming at the home market, and people who care about privacy on an individual level.
  • 19:59 - 20:06
    You plug it into your home network to protect your privacy, your freedom, your anonymity and your security.
  • 20:06 - 20:10
    That is our mission statement, I guess. Unofficially.
  • 20:10 - 20:17
    That is what we believe we are trying to do with this device.
  • 20:17 - 20:22
    So, what privacy means in this context, the way we're going to go about trying to protect your privacy
  • 20:22 - 20:28
    is to connect you directly with other people and take everything you do and try to encrypt it
  • 20:28 - 20:31
    so that only you and the person you are talking to can see it. This is not a new idea.
  • 20:31 - 20:36
    We can do encrypted messaging, and we can do encrypted browsing.
  • 20:36 - 20:44
    Now there are problems with encrypted browsing. Right now if you want to have secure browsing you generally
  • 20:44 - 20:46
    use something called SSL.
  • 20:46 - 20:58
    SSL is a system of certificates that allows a web server to say to you "we can talk privately".
  • 20:58 - 21:02
    That's the first guarantee: (A) a secure cryptographic connection,
  • 21:02 - 21:06
    and (B) I can authenticate to you that I am who I say I am.
  • 21:06 - 21:11
    So not only can nobody listen, but you know who you're talking to.
  • 21:11 - 21:18
    You're not secretly talking to the government when you think you're really talking to me.
  • 21:18 - 21:24
    The problem with SSL, the big problem with SSL, is that the system for signing certificates relies
  • 21:24 - 21:28
    on a trust hierarchy that goes back to a cartel of companies who have the server certificates,
  • 21:28 - 21:36
    who have the ability to do this "guarantee". So when the website says to you "I guarantee I am who I
  • 21:36 - 21:43
    am", you say "I don't know you, I don't trust you". And they say "Oh, but this other company, I paid
  • 21:43 - 21:47
    them money, and so they'll guarantee that I am me."
  • 21:47 - 21:53
    Which is a really interesting idea - because I also don't know this company, why would I trust that company?
  • 21:53 - 21:57
    I mean, the company is just old enough and influential enough that they could actually get their
  • 21:57 - 22:04
    authority into my browser. So really my browser is willing to accept at face-value that this website
  • 22:04 - 22:07
    is who it says it is, but I don't necessarily accept that.
  • 22:07 - 22:13
    And then we have the problem of self-signed certificates, where they say, none of those authorities
  • 22:13 - 22:18
    in your browser trust me, I trust myself and look, I've signed a piece of paper -
  • 22:18 - 22:21
    I swear I am who I say I am.
  • 22:21 - 22:24
    And that, is not trustworthy at all, right?
  • 22:24 - 22:28
    That's just him saying again "No, really! I'm me!".
  • 22:28 - 22:34
    So this is a problem, because the FreedomBoxes are not going to trust the SSL cartel,
  • 22:34 - 22:37
    and they are not going to trust each other, so they can't just sort of swear to each other that
  • 22:37 - 22:40
    they are who they are.
  • 22:40 - 22:45
    So we think we've solved this. I'm not going to say we've solved it, because we're just starting to tell
  • 22:45 - 22:52
    people about this idea, and I'm sure people will have reasons why the idea can be improved.
  • 22:52 - 22:58
    But there is a technology called MonkeySphere, that allows you to take an SSH key and wrap it around a
  • 22:58 - 23:03
    PGP key, and use a PGP key to authenticate SSH connections.
  • 23:03 - 23:10
    It's really neat technology that allows you to replace SSH trust with PGP trust.
  • 23:10 - 23:14
    And we looked at that, and we thought, why can't we do that with SSL?
  • 23:14 - 23:21
    So one thing we're going to do with browsing is take an SSL certificate, an X.509 certificate,
  • 23:21 - 23:25
    and wrap it around a PGP key and send it through the normal SSL layer mechanisms
  • 23:25 - 23:32
    but when it gets to the other end, smart servers and smart browsers will open it up and use PGP mechanisms
  • 23:32 - 23:40
    to figure out how to trust people, to verify the connections, to sign the authentication of the identity
  • 23:40 - 23:43
    of the browser, of the server.
  • 23:43 - 23:48
    This allows us to replace the SSL cartel with the web of trust, the keyservers.
  • 23:48 - 23:57
    We're replacing a tiny group of companies that control everything with keyservers, community infrastructure.
  • 23:57 - 24:01
    Anyone can set up a keyserver, and you can decide which one you want to trust.
  • 24:01 - 24:03
    They share information.
  • 24:03 - 24:06
    The web of trust is built on people, telling each other that they trust each other.
  • 24:06 - 24:10
    Again, you can decide who to trust and how much you want to trust them.
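
A toy illustration of the web-of-trust idea (not the project's actual code): treat signatures as a graph and trust any key reachable from your own through a short chain of vouches. The graph and names here are invented.

```python
# Web of trust as graph reachability: I trust a key if a chain of
# signatures leads from my own key to it within a few hops.
from collections import deque

# Hypothetical signature graph: signer -> keys that signer vouches for.
signatures = {
    "me": {"alice", "bob"},
    "alice": {"carol"},
    "bob": {"carol", "dave"},
    "carol": {"eve"},
}

def is_trusted(target, root="me", max_depth=3):
    """Breadth-first search for a chain of vouches from root to target."""
    queue, seen = deque([(root, 0)]), {root}
    while queue:
        key, depth = queue.popleft()
        if key == target:
            return True
        if depth < max_depth:
            for signed in signatures.get(key, set()) - seen:
                seen.add(signed)
                queue.append((signed, depth + 1))
    return False

print(is_trusted("eve"))      # True: me -> alice -> carol -> eve
print(is_trusted("mallory"))  # False: nobody vouches for this key
```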
  • 24:10 - 24:16
    This is emblematic of our approach. We've identified structures that are unreliable because
  • 24:16 - 24:20
    they are centralized, because they are controlled by interests that are not the same interests
  • 24:20 - 24:23
    as our interests.
  • 24:23 - 24:30
    And we've decided to replace them wherever we can with structures that rely on people,
  • 24:30 - 24:38
    that rely on human relationships, that rely less on the notion that you can buy trust, and more on the
  • 24:38 - 24:42
    notion that you earn trust, by being trustworthy, by having people vouch for you over time.
  • 24:42 - 24:50
    So that's our approach to encrypted browsing. It's also our approach to encrypted messaging.
  • 24:50 - 24:58
    We're doing Jabber for a lot of message passing, XMPP, and we're securing that again with PGP.
  • 24:58 - 25:02
    Everywhere we can we're going to try to use the PGP network, because it already exists...
  • 25:02 - 25:04
    as I said, we're not trying to invent anything new.
  • 25:04 - 25:11
    PGP already exists and it does a really good job. So we're taking the PGP trust system and we're
  • 25:11 - 25:17
    going to apply it to things like XMPP and make sure that we can do message passing in a way
  • 25:17 - 25:19
    that we can trust.
  • 25:19 - 25:26
    Once we have XMPP we have a way to send text, a way to send audio, sure...
  • 25:26 - 25:29
    but also you can send structured data.
  • 25:29 - 25:33
    Through that same channel. And you can send that data to buddy lists.
  • 25:33 - 25:39
    So the system starts to look like a way to pass data in a social way. And we think this is the
  • 25:39 - 25:42
    beginning of the social layer of the box.
  • 25:42 - 25:47
    At the bottom of the box we have a belief that the technology should be social
  • 25:47 - 25:48
    from the ground up.
  • 25:48 - 25:51
    And so we're building structures that allow it to be social,
  • 25:51 - 25:56
    that assume you want to connect with friends in a network of freedom,
  • 25:56 - 26:01
    perhaps FreedomBoxes, perhaps other kinds of software, other kinds of technology.
  • 26:01 - 26:04
    And we're designing with that in mind.
  • 26:04 - 26:09
    With that in mind, we think we get certain benefits technologically which I'll get into later.
  • 26:09 - 26:13
    We think we can simplify things like key management, through methods like this.
  • 26:13 - 26:19
    By privacy I also mean that we can install a proxy server, privoxy,
  • 26:19 - 26:21
    we think the answer is privoxy here,
  • 26:21 - 26:27
    privoxy on the box, so you can point your browser at the box, surf the web through the box,
  • 26:27 - 26:34
    and strip ads, strip cookies, stop Google from tracking you from website to website to website,
  • 26:34 - 26:43
    to remove the constant person sitting at your side, spying, recording, listening to everything you do.
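
For a concrete sense of "point your browser at the box": Privoxy listens on port 8118 by default, so any HTTP client can route through it. A minimal sketch using the requests library; the hostname freedombox.local is hypothetical.

```python
# Route web traffic through a Privoxy instance on the box, which can
# strip ads, cookies, and trackers before pages reach the browser.
# "freedombox.local" is a hypothetical name; 8118 is Privoxy's default port.
import requests

proxies = {
    "http": "http://freedombox.local:8118",
    "https": "http://freedombox.local:8118",
}

response = requests.get("http://example.org/", proxies=proxies)
print(response.status_code)
```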
  • 26:43 - 26:47
    In that vein, we don't just want to block ads and reject cookies,
  • 26:47 - 26:50
    we want to do something new, relatively new.
  • 26:50 - 27:03
    We think we want to munge your browser fingerprint, that unique pattern of data that is captured by your
  • 27:03 - 27:04
    user-agent string and what plugins you have, and all that stuff
  • 27:04 - 27:08
    that forms a unique profile of you that allows people to track your browser, companies to track your
  • 27:08 - 27:10
    browser as you hop along the web, even if they don't know anything about you.
  • 27:10 - 27:13
    It can sort of tie you to the browser, make profiles about your browser.
  • 27:13 - 27:16
    And that turns out to be a very effective way of figuring out who you are.
  • 27:16 - 27:24
    So even without a cookie, even without serving you with an ad, once they're talking to you they can
  • 27:24 - 27:26
    uniquely identify you, or relatively uniquely.
  • 27:26 - 27:33
    But it's relatively early in the browser fingerprint arms race.
  • 27:33 - 27:38
    We think that with a very little bit of change, we can foil the recording,
  • 27:38 - 27:41
    and win this round at least.
  • 27:41 - 27:47
    And instead of having one profile where they gather all of your data, you will present to services
  • 27:47 - 27:51
    as a different person every time you use the service. So they cannot build profiles of you over time.
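
One way the munging could work, sketched as an assumption rather than the project's stated design: a filtering proxy rewrites or drops the identifying headers so each session presents differently.

```python
# Scrub identifying request headers so profiles can't accumulate.
import random

# A pool of widely used values; blending in beats standing out.
COMMON_USER_AGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20100101 Firefox/10.0",
    "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0",
]

def munge_headers(headers: dict) -> dict:
    """Return a copy of the headers with identifying fields replaced."""
    scrubbed = dict(headers)
    scrubbed["User-Agent"] = random.choice(COMMON_USER_AGENTS)
    scrubbed.pop("Cookie", None)   # no stored identifiers
    scrubbed.pop("Referer", None)  # no click trail
    return scrubbed

print(munge_headers({"User-Agent": "MyRealBrowser/1.0", "Cookie": "id=42"}))
```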
  • 27:52 - 27:53
    That's what privacy looks like in our context. We're looking for cheap ways to foil the tracking.
  • 27:55 - 28:02
    We're looking for easy things we can do, because we believe there's a lot of low-hanging fruit.
  • 28:02 - 28:06
    And we'll talk about that more in a minute.
  • 28:06 - 28:10
    Freedom is our value, freedom is the thing we are aiming for,
  • 28:10 - 28:13
    freedom from centralized structures like the pipes.
  • 28:13 - 28:19
    Now mesh networking, I have mesh networking in my slides. That is a lie.
  • 28:19 - 28:21
    We are not doing mesh networking.
  • 28:21 - 28:27
    The reason we are not doing mesh networking is because I do not know anything about mesh networking
  • 28:27 - 28:32
    and one of the reasons I came here was to meet people who know a lot about mesh networking
  • 28:32 - 28:34
    and I see people in this audience who know a lot about mesh networking.
  • 28:34 - 28:41
    If you want to turn that lie into the truth, the way you do that
  • 28:41 - 28:44
    is by continuing on your projects, making mesh networking awesome,
  • 28:44 - 28:46
    to the point where I can say yes, we're going to put that in this box.
  • 28:46 - 28:49
    Then eventually, by the time this box is ready to do real
  • 28:49 - 28:53
    things for real people, we're really hoping that the mesh story
  • 28:53 - 28:57
    coheres, where we've identified the protocol and the technology and the people who are going to help
  • 28:57 - 29:00
    us. If you think you might be one of those people, we want to talk to you.
  • 29:00 - 29:03
    So yes, we are going to do mesh networking,
  • 29:03 - 29:06
    and that might be a lie
  • 29:06 - 29:08
    but I hope not.
  • 29:08 - 29:11
    We want you to have the freedom to own your data
  • 29:11 - 29:17
    that means data portability, that means that your data sits on your box and never goes to a third party.
  • 29:17 - 29:19
    It only goes to the people you want it to go to.
  • 29:19 - 29:24
    Fine-grained access control. Your data, your structures, you decide where it goes.
  • 29:24 - 29:25
    That's a user-interface problem,
  • 29:25 - 29:27
    that's a user permission problem,
  • 29:27 - 29:29
    an access control problem.
  • 29:29 - 29:33
    Access control is a solved problem.
  • 29:33 - 29:38
    Doing it through a convenient user-interface, that's not solved... so that's work to be done.
  • 29:38 - 29:42
    That's a big chunk of our todo list.
  • 29:42 - 29:44
    We want you to own your social network
  • 29:44 - 29:50
    Before Facebook there was a thing called MySpace, which was... I'm not even sure it exists anymore.
  • 29:50 - 29:54
    Before MySpace there was Tribe.
  • 29:54 - 29:57
    Before Tribe there was Friendster.
  • 29:57 - 30:00
    Friendster is now like a... "gaming network".
  • 30:00 - 30:03
    I don't know what it is but they still send me email
  • 30:03 - 30:06
    Which is the only reason I know they're still alive.
  • 30:06 - 30:11
    Before Friendster was the original social network.
  • 30:11 - 30:16
    We called this social network "the internet".
  • 30:16 - 30:17
    We talked directly to each other,
  • 30:17 - 30:21
    we used email, an instant messenger and IRC.
  • 30:21 - 30:24
    We talked to people using the structures that were out there.
  • 30:24 - 30:28
    It wasn't centralized in one service, we had a lot of ways of meeting each other
  • 30:28 - 30:29
    and passing messages.
  • 30:29 - 30:32
    What we lacked was a centralized interface.
  • 30:32 - 30:36
    So when we say "own your social network" we mean use the services of the internet,
  • 30:36 - 30:38
    own the pieces that talk to each other.
  • 30:38 - 30:41
    Hopefully we'll provide you with a convenient interface to do that.
  • 30:41 - 30:44
    But the actual structures, the places where your data live,
  • 30:44 - 30:48
    that is just the same pieces that we know how to use already.
  • 30:48 - 30:51
    We are not going to try to reinvent how you talk to people,
  • 30:51 - 30:56
    we're just going to make it so that the pipes are secure.
  • 30:56 - 30:59
    A big part of freedom, a big part of privacy,
  • 30:59 - 31:02
    is anonymity.
  • 31:02 - 31:06
    Tor can provide anonymity.
  • 31:06 - 31:09
    But we don't have to go all the way to Tor.
  • 31:09 - 31:12
    Tor is expensive, in terms of latency.
  • 31:12 - 31:17
    Tor is difficult to manage...
  • 31:17 - 31:21
    I don't know how many people have tried to use Tor, to run all their traffic through Tor.
  • 31:21 - 31:24
    It's hard. For two reasons.
  • 31:24 - 31:27
    For one, the latency... it takes a very long time to load a web page.
  • 31:27 - 31:32
    And two, you look like a criminal. To every website that you go to.
  • 31:32 - 31:39
    My bank shut down my account when I used Tor.
  • 31:39 - 31:45
    Because suddenly, I was coming from an IP address in Germany that they had, in the past, detected
  • 31:45 - 31:49
    efforts to hack them from.
  • 31:49 - 31:52
    So they closed my account, well I had to talk to them about it,
  • 31:52 - 31:54
    it did all get solved in the end.
  • 31:54 - 31:58
    PayPal closed my account down as well.
  • 31:58 - 31:59
    So that was the end of my ability to use Tor.
  • 31:59 - 32:01
    So we can't just run all our traffic through Tor.
  • 32:01 - 32:07
    It's too slow, and the network has weird properties in terms of how you present to websites,
  • 32:07 - 32:09
    that frankly, are scary.
  • 32:09 - 32:17
    Because if I look like a criminal to the bank, I don't want to imagine what I look like to my own government.
  • 32:17 - 32:19
    But we can do privacy in other ways.
  • 32:19 - 32:25
    If you are a web user, in China, and you want to surf the internet,
  • 32:25 - 32:31
    with full access to every website you might go to, and with privacy from your government,
  • 32:31 - 32:35
    so that you don't get a knock on your door from visiting those websites,
  • 32:35 - 32:37
    we can do that without Tor.
  • 32:37 - 32:39
    We don't need Tor to do that. We can do that cheaply.
  • 32:39 - 32:46
    Because all you need to do in that situation is get your connection out of China.
  • 32:46 - 32:54
    Send your request for a web page through an encrypted connection to a FreedomBox in...
  • 32:54 - 32:58
    Austria, America, who knows?
  • 32:58 - 33:06
    Just get the request away from the people who physically have the power to control you.
  • 33:06 - 33:09
    And we can do that cheaply, that's just SSH port forwarding.
  • 33:09 - 33:14
    That's just a little bit of tunneling, that's just a little bit of VPN.
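
A minimal sketch of that kind of tunnel, driving OpenSSH's standard -D (dynamic SOCKS forwarding) and -N (no remote command) flags from Python; the remote box name is hypothetical.

```python
# Open an encrypted tunnel to a trusted box abroad; the browser then
# uses the local SOCKS proxy at 127.0.0.1:1080 and all traffic leaves
# the local network inside the tunnel.
import subprocess

tunnel = subprocess.Popen(
    ["ssh", "-N", "-D", "1080", "user@friend.example.org"]
)
# ... browse via socks5://127.0.0.1:1080, then shut the tunnel down:
# tunnel.terminate()
```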
  • 33:14 - 33:16
    There's a lot of ways to do that sort of thing,
  • 33:16 - 33:21
    to give you anonymity and privacy in your specific context
  • 33:21 - 33:23
    without going all the way into something like Tor.
  • 33:23 - 33:26
    Now there are people who are going to need Tor.
  • 33:26 - 33:28
    They will need it for their use case.
  • 33:28 - 33:33
    But not every use case faces that level of attack.
  • 33:33 - 33:38
    And so one of the things we're trying to do is figure out how much privacy and anonymity you need,
  • 33:38 - 33:40
    and from whom you need it.
  • 33:40 - 33:43
    If we can do that effectively we can give people solutions
  • 33:43 - 33:46
    that actually work for them. Because if we just tell people
  • 33:46 - 33:50
    to use Tor, we're going to have a problem.
  • 33:50 - 33:53
    They're not going to use it, and they won't get any privacy at all.
  • 33:53 - 33:55
    And that's bad.
  • 33:55 - 33:57
    So we want to allow people to do anonymous publishing,
  • 33:57 - 34:00
    and file-sharing, and web-browsing and email.
  • 34:00 - 34:02
    All the communications you want to do.
  • 34:02 - 34:04
    The technology to do that already exists,
  • 34:04 - 34:06
    we could do all of that with Tor.
  • 34:06 - 34:09
    The next piece of our challenge is to figure out how to do it without Tor.
  • 34:09 - 34:12
    To figure out what pieces we need Tor for, and to figure out
  • 34:12 - 34:18
    what pieces we can do a little bit more cheaply.
  • 34:18 - 34:20
    Security.
  • 34:20 - 34:24
    Without security, you don't have freedom and privacy and anonymity.
  • 34:24 - 34:26
    If the box isn't secure,
  • 34:26 - 34:28
    you lose.
  • 34:28 - 34:32
    We're going to encrypt everything.
  • 34:32 - 34:36
    We're going to do something that's called social key management, which I'm going to talk about.
  • 34:36 - 34:39
    I do want to talk about the Debian-based bit.
  • 34:39 - 34:43
    We are based on a distribution of Linux called Debian,
  • 34:43 - 34:46
    because it is a community-based distribution.
  • 34:46 - 34:48
    It is made by people who care a lot about your
  • 34:48 - 34:52
    freedom, your privacy, and your ability to speak anonymously.
  • 34:52 - 34:56
    And we really believe that the best way to distribute this
  • 34:56 - 34:58
    software is to hand it to the Debian mirror network and let
  • 34:58 - 35:00
    them distribute it. Because they have mechanisms
  • 35:00 - 35:02
    to make sure that nobody changes it.
  • 35:02 - 35:05
    If we were to distribute the software to you directly, we
  • 35:05 - 35:09
    would become a target. People would want to change the
  • 35:09 - 35:12
    software as we distribute it on our website.
  • 35:12 - 35:13
    They would want to crack our website and distribute their
  • 35:13 - 35:16
    version of the package.
  • 35:16 - 35:18
    We don't want to be a target, so we're not going to give you software.
  • 35:18 - 35:22
    We're going to give it to Debian, and let them give you the software.
  • 35:22 - 35:26
    And at the same time you get all of the Debian guarantees about freedom.
  • 35:26 - 35:29
    The Debian Free Software Guidelines.
  • 35:29 - 35:32
    They're not going to give you software unless it comes
  • 35:32 - 35:37
    with all of the social guarantees that are required to participate in the Debian community.
  • 35:37 - 35:40
    So we're very proud to be using Debian in this manner,
  • 35:40 - 35:42
    and working with Debian in this manner.
  • 35:42 - 35:45
    And we think that's the most effective way we can guarantee that we're going to live up to
  • 35:45 - 35:52
    our promises to you, because it provides a mechanism whereby if we fail to live up to our promises,
  • 35:52 - 35:56
    we cannot give you something that is broken. Because Debian won't let us,
  • 35:56 - 36:00
    they just won't distribute it.
  • 36:00 - 36:02
    There are problems with security.
  • 36:02 - 36:04
    There are things we can't solve.
  • 36:04 - 36:05
    One...
  • 36:05 - 36:09
    Physical security of the box.
  • 36:09 - 36:14
    We haven't really talked much internally about whether we can encrypt the filesystem on this box.
  • 36:14 - 36:17
    I don't quite see a way to do it.
  • 36:17 - 36:20
    It doesn't have an interface for you to enter a password effectively.
  • 36:20 - 36:23
    By the time you've brought an interface up you'd be running untrusted code.
  • 36:23 - 36:25
    I don't know a way to do it.
  • 36:25 - 36:30
    If anyone can think of a way that we can effectively encrypt the filesystem, I'd love to hear it.
  • 36:30 - 36:35
    But, on top of that, if we do encrypt the filesystem,
  • 36:35 - 36:39
    then the thing cannot be rebooted remotely, which is a downside.
  • 36:39 - 36:41
    So there are trade-offs at every step of the way.
  • 36:41 - 36:45
    If we can figure out some of these security issues, then we can be ahead of the game.
  • 36:45 - 36:50
    But I think encrypting the filesystem is the only way to guarantee the box is secure, even if it's
  • 36:50 - 36:52
    not physically secure.
  • 36:52 - 36:54
    So I think that's a big one.
  • 36:54 - 36:58
    If you have ideas about that, please come and talk to me after the talk.
  • 36:58 - 37:01
    I promised I would talk about social key management, and here it is.
  • 37:01 - 37:06
    So we're building the idea of knowing who your friends are
  • 37:06 - 37:08
    into the box at a somewhat low level.
  • 37:08 - 37:13
    To the point where things that are on the box can assume it is there,
  • 37:13 - 37:18
    or ask you if it's there, or rely on it as a matter of course in some cases.
  • 37:18 - 37:22
    So we can do things with keys that make your keys unlosable.
  • 37:22 - 37:25
    Right now a PGP key is a hard thing to manage.
  • 37:25 - 37:27
    Key management is terrible.
  • 37:27 - 37:30
    Do you guys like PGP? PGP is good.
  • 37:30 - 37:35
    Does anyone here like key management?
  • 37:35 - 37:36
    We have one guy who likes key management.
  • 37:36 - 37:39
    LAUGHTER
  • 37:39 - 37:41
    He's going to do it for all of you!
  • 37:41 - 37:44
    So, none of us like key management.
  • 37:44 - 37:46
    Key management doesn't work, especially if your use-case is home users, naive end-users.
  • 37:46 - 37:48
    Nobody wants to do key management.
  • 37:48 - 37:52
    Writing their key down and putting it in a safety deposit box is ludicrous.
  • 37:52 - 37:54
    It's a very difficult thing to actually convince people to do.
  • 37:54 - 38:00
    Sticking it on a USB key, putting it in a zip-lock bag and burying it in your backyard is paranoid.
  • 38:00 - 38:03
    I can't believe I just told you what I do with my key.
  • 38:03 - 38:05
    LAUGHTER
  • 38:05 - 38:07
    No, you can't ask people to do that.
  • 38:07 - 38:08
    They won't do it.
  • 38:08 - 38:10
    You can't protect keys in this manner.
  • 38:10 - 38:13
    You have to have a system that allows them to sort of, not ever know they have a key.
  • 38:13 - 38:16
    To not think about their key unless they really want to.
  • 38:16 - 38:19
    We think we've come up with something that might work.
  • 38:19 - 38:21
    You take the key,
  • 38:21 - 38:22
    or a subkey,
  • 38:22 - 38:25
    you chop it into little bits
  • 38:25 - 38:25
    and you give that key...
  • 38:25 - 38:31
    and we're talking about a key of a very long length, so there's a giant attack space
  • 38:31 - 38:36
    and you can chop it into bits and hand it to people without reducing the search space for a key.
  • 38:36 - 38:39
    You chop it into bits and hand all the bits to your friends.
  • 38:39 - 38:42
    Now all your friends have your key, as a group.
  • 38:42 - 38:44
    Individually, none of them can attack you.
  • 38:44 - 38:48
    Individually, none of them has the power to come root your box,
  • 38:48 - 38:50
    to access your services and pretend to be you.
  • 38:50 - 38:54
    As a group, they can do this.
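
A minimal sketch of that splitting idea, assuming a simple XOR-based scheme in which every share is required; a threshold scheme such as Shamir's secret sharing would let a subset of friends suffice.

```python
# Split a key into n random-looking shares. No single share reveals
# anything; XORing all of them together restores the key.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list:
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for share in shares:
        last = xor_bytes(last, share)
    return shares + [last]

def recover_key(shares: list) -> bytes:
    """The 'intervention': friends pool their pieces, the key comes back."""
    key = shares[0]
    for share in shares[1:]:
        key = xor_bytes(key, share)
    return key

secret = secrets.token_bytes(32)      # stand-in for the private key
pieces = split_key(secret, 5)         # one piece per friend
assert recover_key(pieces) == secret  # together, they restore it
```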
  • 38:54 - 39:04
    We trust our friends, as a group, more than we trust them as individuals.
  • 39:04 - 39:09
    Any single one of your friends, if you gave them the key to your financial data and your private online
  • 39:09 - 39:11
    life, that would make you very nervous.
  • 39:11 - 39:14
    You would worry that they would succumb to temptation to peek,
  • 39:14 - 39:17
    fall on hard times and want to attack you in some way,
  • 39:17 - 39:20
    fall out with you, get mad at you.
  • 39:20 - 39:23
    As individuals, people are sort of fallible in this sense.
  • 39:23 - 39:26
    But as a group of friends who would have to get together
  • 39:26 - 39:30
    and affirmatively make a decision to attack you,
  • 39:30 - 39:33
    we think that's extremely unlikely.
  • 39:33 - 39:38
    It's so unlikely that there are only a few scenarios where we think it might happen.
  • 39:38 - 39:40
    One...
  • 39:40 - 39:43
    if you are ill, and unable to access your box
  • 39:43 - 39:44
    or you're in jail
  • 39:44 - 39:46
    or you've passed away
  • 39:46 - 39:49
    or you've disappeared.
  • 39:49 - 39:52
    Or... you've gone crazy.
  • 39:52 - 39:58
    We call this type of event, where all your friends get together and help you,
  • 39:58 - 40:00
    even if you don't ask them for help,
  • 40:00 - 40:03
    we call that an intervention.
  • 40:03 - 40:06
    When your friends sit you down and say,
  • 40:06 - 40:09
    "you need our help, you can't ask us for it because you're not in a position to ask us for it",
  • 40:09 - 40:11
    that's an intervention.
  • 40:11 - 40:17
    If you have a moment in your life, a crisis in your life that is an intervention level event,
  • 40:17 - 40:19
    that's when you can go to your friends.
  • 40:19 - 40:22
    If your house burns down, you lose your key and all your data
  • 40:22 - 40:26
    You go to your friends, and you say "can I have part of my key back?"
  • 40:26 - 40:30
    "Oh, and give me that data that you have in a cryptographically-sealed box that you can't read."
  • 40:30 - 40:31
    To all your friends...
  • 40:31 - 40:32
    "My data please, my key please, ..."
  • 40:32 - 40:33
    "My data please, my key please, ..."
  • 40:33 - 40:34
    "My data please, my key please, ..."
  • 40:34 - 40:40
    You take all those pieces, you get a new box,
  • 40:40 - 40:42
    you load it all onto the new box.
  • 40:42 - 40:47
    You have the key, you have your entire key, and now you can read your data.
  • 40:47 - 40:49
    And you haven't lost your digital life.
  • 40:49 - 40:54
    You have a key that is now unlosable.
  • 40:54 - 40:59
    Even if you never wrote it down, even if you never buried it in the backyard.
  • 40:59 - 41:01
    This is a hard problem in key management.
  • 41:01 - 41:04
    People lose their keys and their passwords to services all the time.
  • 41:04 - 41:09
    The only way we can think of to make that impossible, is this mechanism.
  • 41:09 - 41:10
    And of course it's optional.
  • 41:10 - 41:14
    If you're a person who doesn't trust your friends, even as a group,
  • 41:14 - 41:17
    or if you're a person who just doesn't have a lot of friends
  • 41:17 - 41:21
    (let me finish!)
  • 41:21 - 41:25
    ...who doesn't have a lot of friends with FreedomBoxes who can be the backend for this,
  • 41:25 - 41:27
    you don't have to trust this mechanism.
  • 41:27 - 41:30
    You can do something else to make your key unforgettable.
  • 41:30 - 41:32
    But for a lot of naive end-users,
  • 41:32 - 41:35
    this is the mechanism.
  • 41:35 - 41:37
    This is the way they are going to never
  • 41:37 - 41:38
    lose their keys
  • 41:38 - 41:42
    Because the first time a user gets irretrievably locked out of his FreedomBox,
  • 41:42 - 41:44
    we lose that user forever.
  • 41:44 - 41:46
    And we lose all his friends forever.
  • 41:46 - 41:52
    Because it would scare you to lose such an important group of information.
  • 41:52 - 41:54
    Social key management.
  • 41:54 - 41:59
    This is the benefit of building social, of building knowledge
  • 41:59 - 42:04
    of who your friends are, into the box, at a deep level.
  • 42:04 - 42:06
    We have never done that before, with a technology
  • 42:06 - 42:08
    as a community project.
  • 42:08 - 42:11
    And it opens up new possibilities. This is just one.
  • 42:11 - 42:13
    There are others.
  • 42:13 - 42:15
    But it's a field we haven't really thought a lot about.
  • 42:15 - 42:20
    I think once we get out there and we start doing this kind of
  • 42:20 - 42:25
    construction, a lot of new uses are going to be found for this architecture.
  • 42:25 - 42:29
    I encourage you all to think about what changes,
  • 42:29 - 42:35
    when you can assume that the box has people you can trust, just a little bit,
  • 42:35 - 42:38
    because right now we live in a world where we are asked
  • 42:38 - 42:43
    to trust third party services like Facebook with all our photos,
  • 42:43 - 42:46
    or Flickr with all our photos, or Gmail with all our email.
  • 42:46 - 42:48
    We are asked to trust them.
  • 42:48 - 42:50
    We have no reason to trust them.
  • 42:50 - 42:55
    I mean, we expect that they'll act all right, because they have no reason to destroy us.
  • 42:55 - 42:57
    But we don't know what's going to happen.
  • 42:57 - 43:02
    We're effectively giving all our information to people we don't trust at all right now.
  • 43:02 - 43:05
    How does a network of people we trust, just a little bit,
  • 43:05 - 43:07
    change the landscape?
  • 43:07 - 43:09
    I think that's a really interesting question.
  • 43:09 - 43:10
    This box explores that question,
  • 43:10 - 43:16
    this box creates new solutions to old problems that previously seemed intractable.
  • 43:16 - 43:20
    So, I encourage everybody to think about how that might
  • 43:20 - 43:27
    change the solution to a problem they have with a technological architecture as it exists today.
  • 43:27 - 43:32
    Here's another problem...
  • 43:32 - 43:35
    Boxes that know who you are, and know who your friends are,
  • 43:35 - 43:38
    and know how your friends normally act,
  • 43:38 - 43:42
    can also know when your friends are acting weird.
  • 43:42 - 43:50
    If you have a friend who sends you one email a year, who suddenly sends you ten emails in a day,
  • 43:50 - 43:52
    that look like spam,
  • 43:52 - 43:53
    you know that box is rooted.
  • 43:53 - 43:55
    You know that box is weird.
  • 43:55 - 43:59
    Or if you are using the FreedomBox as your gateway to the internet,
  • 43:59 - 44:05
    and a box it is serving downstream starts sending a bunch of spam through it, it knows.
  • 44:05 - 44:09
    It can say "Oh no! You're acting like a zombie."
  • 44:09 - 44:10
    "You should get a check-up."
  • 44:10 - 44:16
    It can shut off mail service to that box, and not let the messages out.
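
An illustrative sketch of that behavioural check, with invented names and thresholds: compare a sender's outbound volume today against its historical baseline and hold mail when it spikes.

```python
# Flag a downstream box that suddenly sends far more mail than usual.
from collections import Counter

baseline_per_day = {"alice@example.org": 0.003}  # ~one mail a year
sent_today = Counter()

def notify_owner(sender: str) -> None:
    print(f"{sender}: unusual volume; mail held. Your box may be rooted.")

def on_outbound_mail(sender: str) -> bool:
    """Return True to let the mail pass, False to hold it for review."""
    sent_today[sender] += 1
    usual = baseline_per_day.get(sender, 1.0)
    if sent_today[sender] > max(10 * usual, 5):  # sudden burst
        notify_owner(sender)                     # e.g. via SMS, as above
        return False
    return True
```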
  • 44:16 - 44:22
    It can make that decision to protect the wider internet, to make you a better citizen in the world.
  • 44:22 - 44:28
    If suddenly your computer starts saying "Hey, I'm in Scotland and I need $5000"...
  • 44:28 - 44:30
    but we know you're not in Scotland
  • 44:30 - 44:33
    Maybe this box, because it has contact information,
  • 44:33 - 44:36
    maybe this box sends you an SMS.
  • 44:36 - 44:41
    And says "Dude, you've been hacked, go do something about your box."
  • 44:41 - 44:44
    So the types of things we can do once we assume we have
  • 44:44 - 44:49
    close relations as opposed to arms-length relations,
  • 44:49 - 44:51
    the types of things we can do when we trust each other a little bit
  • 44:51 - 44:54
    and we trust our boxes a little bit, go way up.
  • 44:54 - 44:56
    Way up.
  • 44:56 - 44:59
    And by bringing that infrastructure closer to us,
  • 44:59 - 45:03
    I mean Gmail is too far away to play that role from a network perspective.
  • 45:03 - 45:09
    But if the box is on our LAN, we can do that.
  • 45:09 - 45:12
    These boxes will only work if they are convenient.
  • 45:12 - 45:15
    There's an old punk-rock slogan, from the Dead Kennedys,
  • 45:15 - 45:19
    "Give me convenience, or give me death."
  • 45:19 - 45:25
    We laugh at that, but that's a belief users have,
  • 45:25 - 45:27
    and I deduce that based on their behaviour,
  • 45:27 - 45:30
    because every time there is a convenient web service,
  • 45:30 - 45:31
    people use it.
  • 45:31 - 45:35
    Even if it's not very good with privacy, a lot of people are going to use it.
  • 45:35 - 45:41
    And conversely, whenever we have web services that are very good at privacy, but aren't very convenient,
  • 45:41 - 45:44
    comparatively fewer people use them.
  • 45:44 - 45:48
    We don't think this box works without convenience.
  • 45:48 - 45:51
    If we don't get the user-interface right then this project
  • 45:51 - 45:53
    will probably fall over.
  • 45:53 - 45:56
    It will never gain any sort of critical mass.
  • 45:56 - 45:58
    So we need a simple interface,
  • 45:58 - 46:01
    we need a way for users to interact with this box in a minimal way.
  • 46:01 - 46:03
    They should think about it as little as possible.
  • 46:03 - 46:06
    That's the hardest problem we face.
  • 46:06 - 46:07
    Quite frankly.
  • 46:07 - 46:10
    The technology to do private communication, that exists.
  • 46:10 - 46:14
    A lot of the people in this room helped to build that infrastructure and technology.
  • 46:14 - 46:17
    We can put it on the box.
  • 46:17 - 46:21
    Making it easy and accessible for users, that's hard.
  • 46:21 - 46:23
    And right now we're trying to figure out what that looks like,
  • 46:23 - 46:25
    who the designers are going to be.
  • 46:25 - 46:31
    If you have user-interface or user-experience design skills that you want to bring to a project like this,
  • 46:31 - 46:34
    please, please, come find me.
  • 46:34 - 46:39
    In order to have convenience, we need to have the thing provide services that are not just
  • 46:39 - 46:45
    freedom-oriented, we need to use its position in your network as a trusted device
  • 46:45 - 46:48
    to do things for you that aren't just about privacy.
  • 46:48 - 46:51
    It needs to do backups.
  • 46:51 - 46:52
    This is important.
  • 46:52 - 46:57
    Right now the way people back up their photos is by giving them to Flickr.
  • 46:57 - 47:00
    The way they back up their email is by giving it to Gmail.
  • 47:00 - 47:06
    If we don't provide backups, we can never be an effective replacement
  • 47:06 - 47:09
    for the services that store your data somewhere else.
  • 47:09 - 47:15
    Even though they're storing it out there in the cloud for their purposes, you get a benefit from it.
  • 47:15 - 47:17
    We have to replicate that benefit.
  • 47:17 - 47:20
    So things that we don't think of as privacy features have to
  • 47:20 - 47:22
    be in the box.
  • 47:22 - 47:26
    The backups, the passwords, and the keys, you can't forget them.
  • 47:26 - 47:29
    We would like it to be a music, a video, a photo server,
  • 47:29 - 47:34
    all the kinds of things you might expect from a convenient box on your network.
  • 47:34 - 47:38
    All the things that you want to share with other people, this box has to do those things.
  • 47:38 - 47:45
    And these aren't privacy features, but without them we won't be able to give people privacy.
  • 47:45 - 47:49
    Our first feature, the thing we are working towards
  • 47:49 - 47:50
    is Jabber.
  • 47:50 - 47:53
    It's secure encrypted chat, point-to-point.
  • 47:53 - 47:58
    That's the thing we are working on right now.
  • 47:58 - 48:02
    But in order to do that we need to solve this Monkeysphere-ish SSL problem that I described.
  • 48:02 - 48:07
    We have code, it needs to get packaged and all that.
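    To illustrate the "secure encrypted chat, point-to-point" idea, here is a minimal sketch of one box talking to another over an encrypted channel. It uses a plain TLS socket rather than the real XMPP/Prosody stack the project plans to ship; the host name and port are stand-ins.

```python
# Minimal illustration of point-to-point encrypted chat between two
# boxes over a raw TLS socket. The actual FreedomBox plan is XMPP via
# Prosody; "friend.box.example" and port 5223 are invented for the sketch.
import socket
import ssl

context = ssl.create_default_context()  # verifies the peer's certificate

with socket.create_connection(("friend.box.example", 5223)) as raw_sock:
    with context.wrap_socket(raw_sock,
                             server_hostname="friend.box.example") as chan:
        # Everything written here is encrypted on the wire; only the
        # box at the other end can read it.
        chan.sendall(b"hello from my box\n")
        reply = chan.recv(1024)
        print(reply.decode())
```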
  • 48:07 - 48:10
    Our development strategy, the way we are going to do all the things we said,
  • 48:10 - 48:15
    because the list of things I have said we're going to do...
  • 48:15 - 48:19
    I can't believe you're not throwing things at me.
  • 48:19 - 48:22
    Because it's ludicrous to believe that we can actually do all these things by ourselves.
  • 48:22 - 48:24
    And we're not.
  • 48:24 - 48:26
    We're going to let other people make the software.
  • 48:26 - 48:28
    As much as possible we're going to encourage other people
  • 48:28 - 48:32
    to build stuff. We're going to use stuff that already exists.
  • 48:32 - 48:35
    We're going to use Privoxy, we're going to use Prosody, we're going to use Apache.
  • 48:35 - 48:39
    We're not going to reinvent the web server, we're not going to reinvent protocols.
  • 48:39 - 48:46
    I really hope that by the time this project is mature, we haven't invented any new protocols.
  • 48:46 - 48:49
    Maybe we'll use new protocols, but I don't want to be
  • 48:49 - 48:53
    generating new things that haven't been tested, and then putting them in FreedomBox.
  • 48:53 - 48:58
    I want to see things in the real world, tested, gain credibility and take them.
  • 48:58 - 49:02
    The less we invent, the better.
  • 49:02 - 49:08
    As far as timelines go, by the time we have it ready, you'll know why you need it.
  • 49:08 - 49:11
    People right now are figuring out that privacy is important.
  • 49:11 - 49:13
    They're seeing it over and over again.
  • 49:13 - 49:18
    In Egypt, at the start of the Arab Spring, one of the things the government did to try to
  • 49:18 - 49:23
    tamp down the organisation was to convince companies to shut off cell networks,
  • 49:23 - 49:25
    to prevent people from talking to each other.
  • 49:25 - 49:28
    In America they did the same thing in San Francisco I hear.
  • 49:28 - 49:36
    Turned off the cell towers to prevent people from organising to meet for a protest.
  • 49:36 - 49:42
    With Occupy Wall Street, you're starting to see infiltration,
  • 49:42 - 49:46
    you're starting to see people going and getting information
  • 49:46 - 49:49
    that Occupy Wall Street is talking about and turning it over
  • 49:49 - 49:52
    to the authorities, the police, the FBI.
  • 49:52 - 49:59
    So as we enter a new age of increased activism, we hope,
  • 49:59 - 50:02
    of increased social activity,
  • 50:02 - 50:06
    I think the need for a lot of this privacy stuff is going to become clear.
  • 50:06 - 50:11
    As the technology for invading your privacy improves,
  • 50:11 - 50:18
    the need for technology to protect your privacy will become stark and clear.
  • 50:18 - 50:23
    Our two big challenges as I said are user experience,
  • 50:23 - 50:28
    and the one I didn't say was paying for developers, paying for designers.
  • 50:28 - 50:32
    Those are the hard parts that we're working on.
  • 50:32 - 50:36
    And if we fail, we think that's where we fail.
  • 50:36 - 50:40
    Software isn't on that list, as I said software is already out there.
  • 50:40 - 50:42
    So you can have a FreedomBox.
  • 50:42 - 50:47
    If you like that box that we've been passing around the audience, you can buy one from Globalscale.
  • 50:47 - 50:51
    If you don't want the box, it's just Debian, it's just Linux, it's just packages.
  • 50:51 - 50:56
    Throw Debian on a box, we will have packages available through the normal Debian mechanisms.
  • 50:56 - 50:58
    You don't even have to use our repository.
  • 50:58 - 51:02
    In fact, I don't think we're going to have a repository.
  • 51:02 - 51:06
    You're just going to download it and install it the same way you normally do it if you're technologically
  • 51:06 - 51:09
    capable of doing that.
  • 51:09 - 51:10
    I grabbed a bunch of photos from Flickr,
  • 51:10 - 51:14
    my colleague Ian Sullivan took that awesome picture of the FreedomBox.
  • 51:14 - 51:17
    And that's how you reach me.
  • 51:19 - 51:31
    APPLAUSE
  • 51:39 - 51:45
    Thanks James, please sit down.
  • 51:45 - 51:49
    We are up for questions from the audience for James.
  • 51:49 - 52:04
    Please raise your hand if you have any questions about the FreedomBox.
  • 52:04 - 52:06
    Hello, thanks that was a very interesting presentation.
  • 52:06 - 52:07
    Thank you.
  • 52:07 - 52:10
    Your boss Eben Moglen has given a speech at a committee of the US Congress
  • 52:10 - 52:13
    I believe, which has received a lot of attention
  • 52:13 - 52:19
    and in Iran during the green movement the US state department
  • 52:19 - 52:24
    I believe has told Twitter to reschedule maintenance so that
  • 52:24 - 52:29
    the opposition could keep using Twitter during the attempted revolution
  • 52:29 - 52:33
    and Hillary Clinton has given a very popular speech about
  • 52:33 - 52:37
    how America would support the promotion of internet freedom
  • 52:37 - 52:41
    and I think things such as the New America Foundation are
  • 52:41 - 52:46
    funding and supporting projects such as the Commotion mesh networking project
  • 52:46 - 52:49
    that we've already heard about before.
  • 52:49 - 52:53
    So in other words there's a link between politics and technology sometimes,
  • 52:53 - 52:58
    and in the past I believe certain influential Americans such as
  • 52:58 - 53:04
    Rupert Murdoch or George W. Bush have viewed modern communication technologies as a way to
  • 53:04 - 53:09
    promote U.S. foreign policy and to spread democracy and freedom in the world.
  • 53:09 - 53:14
    So my question is, what is your relationship with your government?
  • 53:14 - 53:16
    That's a really good question.
  • 53:16 - 53:21
    So one of the things that we sort of figured out from the beginning was that
  • 53:21 - 53:26
    if we had close relationships with the U.S. government,
  • 53:26 - 53:30
    people outside of the U.S. might have difficulty trusting us,
  • 53:30 - 53:35
    because nobody wants to tell all their secrets to the American government.
  • 53:35 - 53:43
    So we were thinking about what that really looks like in the context of a box that could be used globally.
  • 53:43 - 53:49
    We are working very hard to engineer a device that does not require you to trust us.
  • 53:49 - 53:51
    I'm not asking for your trust.
  • 53:51 - 53:55
    I'm not asking for your trust, I'm asking for your help.
  • 53:55 - 53:59
    All the code we write you'll be able to see it, you'll be able to
  • 53:59 - 54:02
    audit it, you'll be able to make your own decisions about what it does,
  • 54:02 - 54:05
    you'll be able to test whether it is trustworthy or not,
  • 54:05 - 54:11
    and if you decide that it is not, you can tell everyone,
  • 54:11 - 54:12
    and they won't use it.
  • 54:12 - 54:17
    So from a trust perspective, it doesn't matter what our relationship is with anybody.
  • 54:17 - 54:18
    So that's the first thing.
  • 54:18 - 54:24
    The second thing is that right now we don't have much of a relationship with the U.S. government.
  • 54:24 - 54:33
    Jacob Appelbaum is somewhat famous for his work with Julian Assange on WikiLeaks,
  • 54:33 - 54:37
    and his work on Tor, and security in general,
  • 54:37 - 54:40
    his efforts to provide you with freedom and privacy.
  • 54:40 - 54:46
    He is a guy who, as the Wall Street Journal recently revealed, the U.S. government has been spying
  • 54:46 - 54:52
    on. And he is on our team, he's on our technical advisory committee.
  • 54:52 - 54:56
    He's one of the people we go to for help when we need to understand security on the box.
  • 54:56 - 55:03
    So right now our position with the American government is that we're not really related except in
  • 55:03 - 55:06
    so much that we are a bunch of people who really care about these issues,
  • 55:06 - 55:13
    which maybe occasionally makes us targets. Which gives us a reason to use a box like this.
  • 55:13 - 55:21
    Coupled with that, there is a program in America - you were talking about Hillary Clinton saying
  • 55:21 - 55:26
    she was going to encourage technologies that will spread democracy.
  • 55:26 - 55:30
    So the way America encourages things is by spending money on them.
  • 55:30 - 55:35
    That's our typical way to support programs. We fund different things.
  • 55:35 - 55:41
    We don't generally have feel-good campaigns, we just pay people to make good work, or try to.
  • 55:41 - 55:47
    So the U.S. state department has a program to provide funding for projects like the FreedomBox.
  • 55:47 - 55:49
    We have not applied for that funding.
  • 55:49 - 55:50
    I don't know if we will.
  • 55:50 - 55:56
    However I do know that they have given funding to some very good and genuine projects that are
  • 55:56 - 56:00
    run by people I trust, so I try not to be cynical about that.
  • 56:00 - 56:07
    I imagine at some point that through a direct grant or a sub-grant or something,
  • 56:07 - 56:11
    some state department money might support some aspect of work that is related to us.
  • 56:11 - 56:15
    I mean, we might take work from a project that is state department funded,
  • 56:15 - 56:18
    just because it's quick work.
  • 56:18 - 56:21
    Have I answered your question?
  • 56:21 - 56:22
    Yes, thanks.
  • 56:32 - 56:38
    Hi, well you always have tension if you talk about privacy
  • 56:38 - 56:41
    since 9/11 you know, I heard this in America very often,
  • 56:41 - 56:44
    "we have to be careful", every body is suspicious and stuff.
  • 56:44 - 56:48
    So how do you react when people like the government say well,
  • 56:48 - 56:55
    you are creating a way to support terrorism, whatever.
  • 56:55 - 57:00
    That's a good question, and it's a common question.
  • 57:00 - 57:05
    Frankly every time I do this talk, it's one of the first questions that come up.
  • 57:05 - 57:07
    The answer is really simple.
  • 57:07 - 57:12
    The fact is, this box doesn't create any new privacy technology.
  • 57:12 - 57:15
    It just makes it easier to use and easier to access.
  • 57:15 - 57:21
    People who are committed to terrorism or criminal activity, they have sufficient motivation that they
  • 57:21 - 57:24
    can use the technology that exists. Terrorists are already using PGP.
  • 57:24 - 57:27
    They're already using Tor.
  • 57:27 - 57:30
    They're already using stuff to hide their data.
  • 57:30 - 57:33
    At best we are helping stupid terrorists.
  • 57:33 - 57:36
    LAUGHTER
  • 57:36 - 57:43
    Granted, I'm not excited about that, but I don't think that's a sufficient reason to deny common people
  • 57:43 - 57:45
    access to these technologies.
  • 57:45 - 57:49
    And more importantly than the fact that terrorists and criminals have access to this technology,
  • 57:49 - 57:52
    governments have access to this technology.
  • 57:52 - 57:55
    The largest corporations have access to this technology.
  • 57:55 - 58:01
    Every bank uses them; the same encryption methods that we are using are what protect the trillions of dollars
  • 58:01 - 58:05
    in value that banks trade every day.
  • 58:05 - 58:13
    This is technology that is currently being used by everyone except us.
  • 58:13 - 58:15
    All we're doing is levelling the playing field.
  • 58:15 - 58:22
    The same technology that hides data from us, that causes a complete lack of transparency in a downward
  • 58:22 - 58:28
    direction, we can use to level the playing field a little bit.
  • 58:28 - 58:40
    More questions?
  • 58:40 - 58:44
    Thank you for your presentation.
  • 58:44 - 58:51
    Could we add to the challenges that maybe we could produce it outside a communist dictatorship?
  • 58:51 - 58:54
    Because I saw the label "Made in China", so I think it is just
  • 58:54 - 59:01
    a paradox to produce something like the FreedomBox in that country, and I would also like to be independent
  • 59:01 - 59:07
    from producing in China. So that's just something for a challenge I think.
  • 59:07 - 59:11
    That's a really good question and important point.
  • 59:11 - 59:16
    So, we're not a hardware project. Hardware is really really hard to do right and do well.
  • 59:16 - 59:19
    We have some hardware hackers on our project.
  • 59:19 - 59:25
    Our tech lead Bdale Garbee does amazing work with satellites and model rockets and altimeters,
  • 59:25 - 59:29
    and he's brilliant. But this is not a hardware project.
  • 59:29 - 59:32
    All we can do is use hardware that already exists.
  • 59:32 - 59:38
    When the world makes hardware in places other than China, we will use that hardware.
  • 59:38 - 59:41
    Right now, we don't have a lot of options.
  • 59:41 - 59:47
    And we're not going to deny everybody privacy because we don't have a lot of hardware options.
  • 59:47 - 59:48
    When we have those options we'll take them.
  • 59:48 - 59:52
    In the meantime, if you are a person who really cares about this issue,
  • 59:52 - 59:56
    don't buy a FreedomBox.
  • 59:56 - 59:59
    Take the software, go find a computer that isn't made in China,
  • 59:59 - 60:02
    LAUGHTER
  • 60:02 - 60:05
    and go put the software on that box.
  • 60:05 - 60:12
    If you want a solution that is run on computers that don't exist, I can't help you with that.
  • 60:12 - 60:16
    If you want a solution that runs, I might be able to help you with that.
  • 60:16 - 60:20
    But yes, I agree that that is a real issue, and we are thinking about that.
  • 60:20 - 60:25
    We believe that there is an open hardware project story here.
  • 60:25 - 60:29
    And one thing we've been doing is working with the manufacturer of the box,
  • 60:29 - 60:33
    to get the code free, to make sure we know what's in it,
  • 60:33 - 60:35
    so that there are no binary blobs in the box,
  • 60:35 - 60:38
    so we have some assurances that we actually do have freedom.
  • 60:38 - 60:46
    At some point though, we do believe that somebody will solve the open hardware problem for us.
  • 60:46 - 60:51
    We're not going to be the hardware project, but there are people trying to do this in an open way.
  • 60:51 - 60:54
    Raspberry Pi, for example. They're not quite right for our use-case, but those kinds of projects
  • 60:54 - 60:59
    are starting to exist, and they're starting to be really good.
  • 60:59 - 61:01
    In a few years, maybe that will be the thing we move onto.
  • 61:01 - 61:10
    Now, I'm guessing that even an open hardware project like Raspberry Pi does its manufacturing in
  • 61:10 - 61:15
    a place like China. And that's a big problem.
  • 61:15 - 61:19
    When the world is ready with a solution to that, we will be ready to accept that solution and adopt it
  • 61:19 - 61:23
    of course.
  • 61:23 - 61:31
    Any more questions for James? or statements?
  • 61:33 - 61:37
    This is more of a statement than a question I guess,
  • 61:37 - 61:43
    but should the FreedomBox start being made in China there will be a lot more of them coming out of
  • 61:43 - 61:46
    the back door and enabling privacy for people that don't get
  • 61:46 - 61:52
    it, but also as soon as it starts getting manufactured I'd imagine you may,
  • 61:52 - 61:55
    because you're not in it for the money as you told me last night,
  • 61:55 - 62:00
    you may be looking forward to how easy it will be to copy,
  • 62:00 - 62:06
    and with things like MakerBot, making a case, making a box is easy,
  • 62:06 - 62:09
    you can do it in your bedroom now with 3D printers.
  • 62:09 - 62:16
    So there will be a bag of components, a board, made by some online place that is really into this,
  • 62:16 - 62:18
    and you can assemble these at home.
  • 62:18 - 62:23
    So you've just got to get it out there first I think, and lead the way.
  • 62:23 - 62:30
    Yeah, I think that's quite right in that we are not the only place to get a box like this.
  • 62:30 - 62:35
    I mean, we're putting it on a specific box to make it easy, but there will be lots of places that make
  • 62:35 - 62:41
    boxes, and hopefully there will be places where working conditions are acceptable to everybody.
  • 62:41 - 62:44
    And at that point you can make your own boxes,
  • 62:44 - 62:44
    you can put the software on any box you can find.
  • 62:44 - 62:46
    The point of Free Software is not to lock you into a service,
  • 62:46 - 62:53
    a technology, a piece of software, a structure, or a box.
  • 62:53 - 62:54
    We're not going to lock you into anything, that's one thing we're extremely clear about.
  • 62:54 - 63:01
    If you manage to make a box like this at home, I would really love to hear about it.
  • 63:01 - 63:06
    If you can spin up a MakerBot to make a case,
  • 63:06 - 63:09
    and you have a friend who can etch boards,
  • 63:09 - 63:11
    and you make a box like this at home,
  • 63:11 - 63:14
    that would be big news and a lot of people would want to know about it.
  • 63:14 - 63:23
    More statements or questions? Yes...
  • 63:23 - 63:31
    So, if you lose your box and get a new one, how is it going to reauthenticate to the boxes of your friends?
  • 63:31 - 63:34
    I think I didn't get that one.
  • 63:34 - 63:39
    Yeah, so, the good thing about friends is that they don't actually know you by your PGP key.
  • 63:39 - 63:48
    Sorry, I didn't specify it: if you want strong security and you want distribution to more than 12 friends,
  • 63:48 - 63:54
    so let's say a hundred, and they're like, all over the world.
  • 63:54 - 64:00
    You are probably going to reach them through the internet to get your key parts back,
  • 64:00 - 64:05
    and you are probably not going to be able to use the FreedomBox to get a new one because
  • 64:05 - 64:06
    it has to be authenticated.
  • 64:06 - 64:09
    So how do you do?
  • 64:09 - 64:11
    Well, you at that point...
  • 64:11 - 64:15
    if you don't have a FreedomBox, the FreedomBox can't provide you with a solution to that problem.
  • 64:15 - 64:17
    What you're going to have to do,
  • 64:17 - 64:19
    is perhaps call your friends.
  • 64:19 - 64:21
    Have a conversation with them,
  • 64:21 - 64:23
    convince them that you are the person you say you are.
  • 64:23 - 64:27
    Reference your shared experiences, maybe they know your voice,
  • 64:27 - 64:34
    maybe they just know who you are by the way that you act and the way that you talk.
  • 64:34 - 64:37
    There's not going to be any one way that we get our keys back.
  • 64:37 - 64:41
    If you lose your key, yeah, we're not saying that's never going to be a problem.
  • 64:41 - 64:44
    And I wouldn't recommend splitting your key up among a hundred people,
  • 64:44 - 64:49
    because that's a lot of people to ask for your key back.
  • 64:49 - 64:54
    The mechanism I have in mind is not that you get a little bit of your key from
  • 64:54 - 64:56
    everyone you know, it's that you spread out the key among
  • 64:56 - 65:00
    a lot of people, and you need a certain number of those people.
  • 65:00 - 65:03
    So maybe it's five of seven of your friends.
  • 65:03 - 65:07
    So you give seven people pieces of the key, but any five of them could give you the whole key.
  • 65:07 - 65:10
    So in case you can't reach somebody you can still manage to do it.
  • 65:10 - 65:13
    And we can make that access control as fine-grained as we want,
  • 65:13 - 65:16
    but a hundred would be overwhelming.
  • 65:16 - 65:21
    We wouldn't do that. Sure, you could do it if you wanted,
  • 65:21 - 65:23
    but I don't think you'll have a hundred friends you could trust that much.
  • 65:23 - 65:27
    Maybe you do, I don't.
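    The "any five of seven" arrangement he describes matches a classic construction, Shamir's secret sharing: the key becomes the constant term of a random degree-(k-1) polynomial over a prime field, each friend gets one point on the curve, and any k points recover the key by interpolation. A minimal sketch, with an illustrative prime and secret (real code would use a cryptographic random source, not `random`):

```python
# Minimal k-of-n threshold secret sharing (Shamir's scheme). The secret
# must be an integer smaller than the prime modulus; values are examples.
import random

PRIME = 2**127 - 1  # a Mersenne prime, comfortably larger than the secret

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):       # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, k=5, n=7)
assert recover(shares[:5]) == 123456789  # any five of the seven suffice
```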
  • 65:27 - 65:34
    More questions, statements?
  • 65:34 - 65:39
    Yes?
  • 65:39 - 65:47
    Erm, it's just a wish... but have you thought about the idea of using the FreedomBox to create
  • 65:47 - 65:52
    a community where you can exchange not only data but like
  • 65:52 - 65:59
    products or services, so that would maybe like, change the system?
  • 65:59 - 66:05
    One of the things we want to do with the FreedomBox is
  • 66:05 - 66:10
    create a thing that looks a lot like your current social networking,
  • 66:10 - 66:13
    minus the advertising and the spying.
  • 66:13 - 66:16
    A way to talk to all your friends at once.
  • 66:16 - 66:20
    Once you have a place, a platform, where you can communicate
  • 66:20 - 66:23
    with your friends, you can build on that platform
  • 66:23 - 66:25
    and you can create structures like that.
  • 66:25 - 66:29
    If we make a thing that has programmable interfaces, so
  • 66:29 - 66:33
    you can make apps for it, you can make an app like that,
  • 66:33 - 66:34
    if that's important to you.
  • 66:34 - 66:38
    What people do with the communication once they have it,
  • 66:38 - 66:40
    we don't have any opinions about.
  • 66:40 - 66:43
    We want them to do everything that's important to them.
  • 66:43 - 66:46
    And I think something like that could be important,
  • 66:46 - 67:03
    and yeah, that would be amazing if that were to emerge.
  • 67:03 - 67:08
    Some things I believe are easier to do in a centralized architecture than a decentralized one,
  • 67:08 - 67:13
    for example search, or services that require a lot of bandwidth.
  • 67:13 - 67:16
    I don't see how you can run something like YouTube on the FreedomBox.
  • 67:16 - 67:18
    So is your utopian vision one where everything is decentralized,
  • 67:18 - 67:24
    or is it ok to have some centralized pieces in a future network?
  • 67:24 - 67:29
    Look, if you're going to grant me my utopia then of course everything is decentralized.
  • 67:29 - 67:32
    But we don't live in a utopia, I don't have magic.
  • 67:32 - 67:39
    We actually have in our flowchart a box labeled "magic routing",
  • 67:39 - 67:41
    because routing is hard to do in a decentralized way...
  • 67:41 - 67:44
    You need someone to tell you where the IPs are.
  • 67:44 - 67:47
    And that's hard to do in a decentralized way.
  • 67:47 - 67:52
    We haven't solved it, and we don't think we're going to fully solve it.
  • 67:52 - 67:55
    We hope someone else solves it first of all.
  • 67:55 - 67:57
    But second of all, we don't know where the compromises are.
  • 67:57 - 67:59
    Some things are not possible to decentralize.
  • 67:59 - 68:02
    We're going to decentralize as much as we can,
  • 68:02 - 68:04
    but we're not committing to doing anything impossible.
  • 68:04 - 68:06
    If you can't run YouTube off this box,
  • 68:06 - 68:08
    which I disagree with by the way,
  • 68:08 - 68:10
    then you won't, because it's impossible.
  • 68:10 - 68:12
    If you want to run YouTube on this box you turn all your
  • 68:12 - 68:14
    friends into your content delivery network,
  • 68:14 - 68:17
    and all your friends parallelize the distribution of the content,
  • 68:17 - 68:18
    you share the bandwidth.
  • 68:18 - 68:21
    It's ad-hoc, BitTorrent-like functionality.
  • 68:21 - 68:24
    Yes, that technology doesn't exist yet, I just made all that up,
  • 68:24 - 68:27
    but we can do it.
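    Since the speaker flags this as made up on the spot, the sketch below is purely speculative: it only shows how chunks of a large file might be assigned across friends' boxes so a viewer can fetch them in parallel, BitTorrent-style. Chunk size and hostnames are invented.

```python
# Purely speculative sketch of "friends as your content delivery network".
# Nothing here exists in FreedomBox; all names and numbers are invented.
def assign_chunks(file_size: int, chunk_size: int, friends: list[str]) -> dict:
    """Round-robin byte ranges of a video across friends' boxes."""
    offsets = range(0, file_size, chunk_size)
    return {off: friends[i % len(friends)] for i, off in enumerate(offsets)}

plan = assign_chunks(file_size=700_000_000, chunk_size=4_000_000,
                     friends=["alice.box", "bob.box", "carol.box"])
# A viewer would fetch different ranges from different boxes at once,
# so no single home connection has to carry the whole video.
print(plan[0], plan[4_000_000])  # alice.box bob.box
```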
  • 68:27 - 68:33
    The parts that are hard though, the things like the routing,
  • 68:33 - 68:35
    there will be real compromises.
  • 68:35 - 68:36
    There will be real trade-offs.
  • 68:36 - 68:40
    There will be places where we'll say, you know what, we have
  • 68:40 - 68:42
    to rely on the DNS system.
  • 68:42 - 68:45
    Everybody in this room knows that the DNS system has some
  • 68:45 - 68:48
    security problems, some architectural problems that make it
  • 68:48 - 68:52
    a thing we would ideally not have to rely on.
  • 68:52 - 68:56
    But you know what? This project is not going to be able to replace DNS.
  • 68:56 - 68:59
    There are plenty of alternate DNS proposals out there, but we are not going to
  • 68:59 - 69:03
    just chuck the old DNS system, because we want people
  • 69:03 - 69:06
    to be able to get to the box, even if they don't have a box.
  • 69:06 - 69:09
    We want you to be able to serve services to the public.
  • 69:09 - 69:14
    We are going to use a lot of structures that are less than ideal.
  • 69:14 - 69:16
    We're assuming that TCP/IP is there...
  • 69:16 - 69:19
    in the normal use case you're using the internet backbone
  • 69:19 - 69:23
    to do your communication.
  • 69:23 - 69:26
    The mesh routing story we talked about is not how you do
  • 69:26 - 69:30
    your normal use. That's an emergency mode if there's a crisis, political instability, a tsunami,
  • 69:30 - 69:35
    if you can't get to your regular internet because it has failed you in some way, because
  • 69:35 - 69:38
    it has become oppressive or inaccessible.
  • 69:38 - 69:41
    Then you would use something like the mesh network.
  • 69:41 - 69:44
    But in the normal course of business, you are using
  • 69:44 - 69:47
    a thing that is less than ideal, and that's a trade-off.
  • 69:47 - 69:50
    We can't as a project protect you from everything.
  • 69:50 - 69:51
    We are going to look for the places where we can make
  • 69:51 - 69:54
    effective protection. We are going to try to make clear
  • 69:54 - 69:58
    the limits of that protection. And we're going to give you
  • 69:58 - 69:59
    everything we can.
  • 69:59 - 70:05
    And then, as we move forward, when opportunities to solve new problems present themselves,
  • 70:05 - 70:09
    we'll take them.
  • 70:09 - 70:16
    Well, I have to add: the talk we had before was unfortunately in German, so you couldn't
  • 70:16 - 70:19
    understand a lot.
  • 70:19 - 70:23
    I didn't understand it but I could tell that it was occurring at a very high level of technical competence
  • 70:23 - 70:26
    and that there was a lot of good information there.
  • 70:26 - 70:29
    And I'm really hoping that you'll take the video of it and put it up on universalsubtitles.org, or some
  • 70:29 - 70:33
    other service where people can subtitle it. And hopefully there'll be an English version and I'll get
  • 70:33 - 70:36
    to see it. I think there was a lot of really good information in there.
  • 70:36 - 70:38
    What's universalsubtitles.org?
  • 70:38 - 70:46
    Universalsubtitles.org is a great website. It's kind of like, you put a video up, and anyone can
  • 70:46 - 70:49
    add subtitles to as much or as little as they want.
  • 70:49 - 70:54
    And then other people can change the subtitles, and you can do it in as many languages as you want.
  • 70:54 - 70:59
    So you don't have to ask someone for a favour, "hey, will you subtitle my video?"
  • 70:59 - 71:03
    that's 20 minutes long or an hour long. You tell a community of people "we need help subtitling",
  • 71:03 - 71:09
    and everyone goes and subtitles 3 minutes in their favourite languages.
  • 71:09 - 71:15
    It's a very effective way to crowdsource subtitling, and it's a very effective way to just share information.
  • 71:15 - 71:21
    We have a lot of videos with good information that are locked into languages that not everyone speaks.
  • 71:21 - 71:23
    So this is a way to get around that.
  • 71:23 - 71:25
    As the FreedomBox project, we use that service.
  • 71:25 - 71:28
    And I believe, if I'm not mistaken, I haven't looked in a while,
  • 71:28 - 71:33
    that it's all Free software that they are using. So you can download it and start your own if you want.
  • 71:33 - 71:42
    So back to my previous question - in the talk in the afternoon we heard about mesh networking
  • 71:42 - 71:45
    we talked about that, and it's actually not just being used in
  • 71:45 - 71:47
    emergency situations but people are really using it.
  • 71:47 - 71:53
    And especially, the philosophy that everyone becomes part of the net as not just a consumer
  • 71:53 - 71:59
    but also a provider of part of the net; and it is certainly the case that they
  • 71:59 - 72:01
    can share data among each other, they don't necessarily need
  • 72:01 - 72:03
    to go into the internet.
  • 72:03 - 72:07
    So, I would imagine the FreedomBox, with mesh networking,
  • 72:07 - 72:11
    we could essentially create a large network of many many
  • 72:11 - 72:12
    people using it.
  • 72:12 - 72:17
    We also talked about the mesh networking like FunkFeuer in Graz or Vienna
  • 72:17 - 72:21
    but it would be interesting to get them on mobile devices,
  • 72:21 - 72:23
    so that you could walk through the street,
  • 72:23 - 72:30
    theoretically people have these devices, and you could walk
  • 72:30 - 72:32
    through and it would automatically mesh and connect you.
  • 72:32 - 72:38
    So, FreedomBox applied to that: you told me this interesting example, that you could screw them to
  • 72:38 - 72:42
    light posts on the street, so maybe elaborate on that,
  • 72:42 - 72:44
    maybe it could have an effect and give a lot of coverage.
  • 72:44 - 72:49
    The reason why we currently envision mesh,
  • 72:49 - 72:51
    and no decisions have been made, right,
  • 72:51 - 72:54
    but just in the way we think about it when we talk to each other,
  • 72:54 - 72:58
    and the reason why we think mesh networking is not your daily
  • 72:58 - 73:03
    mode of use is that the performance degradation is not acceptable to most end-users.
  • 73:03 - 73:06
    If mesh networking reaches the point where it is acceptable
  • 73:06 - 73:10
    if you're in a place where there's enough nodes, and you
  • 73:10 - 73:13
    have enough density that you can move around, then sure, that
  • 73:13 - 73:16
    can make a lot of sense. But for a lot of people who
  • 73:16 - 73:19
    aren't near a lot of FreedomBoxes, they're
  • 73:19 - 73:22
    going to need the regular internet.
  • 73:22 - 73:26
    So yeah, we think mesh will be great where you have that
  • 73:26 - 73:29
    density, when the mesh technology is mature.
  • 73:29 - 73:34
    When that happens, we could have the easiest access
  • 73:34 - 73:38
    to municipal wifi by using the power in all the street
  • 73:38 - 73:43
    lights. Put a FreedomBox up in the top of every street lamp.
  • 73:43 - 73:48
    Unscrew the light bulb, screw in the FreedomBox, and screw the light bulb back on top.
  • 73:48 - 73:51
    So you still get light, we're not going to plunge you into darkness.
  • 73:51 - 73:56
    You still get light, but then you have a mesh node. Right there.
  • 73:56 - 74:01
    And you could do every 3rd or 4th street light downtown, and you could cover
  • 74:01 - 74:03
    an area rather effectively.
  • 74:03 - 74:07
    It is a way to get simple municipal wifi without running
  • 74:07 - 74:10
    any fibre. And every time you have fibre you can link to it.
  • 74:10 - 74:14
    Like any time you're near fibre you can link to it and you'll
  • 74:14 - 74:19
    get your information out of that little mesh and into the regular network.
  • 74:19 - 74:24
    We could have municipal wifi with much lower infrastructure costs than most people currently think of
  • 74:24 - 74:29
    when they think of municipal wifi. And we can do it through mesh nodes.
  • 74:29 - 74:34
    And if we did it through mesh nodes we would be providing that service not only to people who have
  • 74:34 - 74:39
    FreedomBoxes; to everyone else, that just looks like wifi, it just looks like a regular connection.
  • 74:39 - 74:46
    You might need to do some fancy hopping, but it's not...
  • 74:46 - 74:51
    the mesh boxes themselves will do the fancy hopping, your phone itself won't have to do it.
  • 74:51 - 74:54
    While we are talking about phones,
  • 74:54 - 74:59
    I want to say that I'm not sure how phones fit into the FreedomBox.
  • 74:59 - 75:02
    I'm pretty sure there is a way that phones fit into FreedomBoxes,
  • 75:02 - 75:06
    but you can't trust your phone.
  • 75:06 - 75:09
    With the so-called smartphones it's not a phone actually but a little computer, no?
  • 75:09 - 75:12
    Yes, your phone, a smartphone is a little computer but
  • 75:12 - 75:16
    it's not a computer that you can trust, because
  • 75:16 - 75:21
    even if you replace the software on your phone,
  • 75:21 - 75:27
    with Free software, it's almost impossible to actually replace all the binary drivers,
  • 75:27 - 75:30
    it's almost impossible to go all the way down to the metal.
  • 75:30 - 75:32
    It's very hard to get a phone that is completely trustworthy
  • 75:32 - 75:35
    all the way down to the bottom of the stack.
  • 75:35 - 75:37
    So that's a problem we haven't quite figured out how to solve.
  • 75:37 - 75:42
    And pretty soon it's going to be impossible to put Free software on phones.
  • 75:42 - 75:48
    The days of jailbreaking your iPhone and rooting your Android phone might
  • 75:48 - 75:55
    very well come to an end. There is a proposal right now called UEFI.
  • 75:55 - 76:01
    It's a standard. We currently use EFI, this would be UEFI.
  • 76:01 - 76:04
    I don't know what it stands for, it's a new thing.
  • 76:04 - 76:08
    And what this proposal is, is that before your computer,
  • 76:08 - 76:14
    before the BIOS will load a bootloader on your computer
  • 76:14 - 76:18
    that BIOS has to authenticate, sorry, that bootloader has
  • 76:18 - 76:20
    to authenticate to the BIOS. It has to be signed by someone
  • 76:20 - 76:23
    the BIOS trusts, someone the BIOS manufacturer trusts.
  • 76:23 - 76:26
    And the person who puts the BIOS in your phone can decide who it trusts,
  • 76:26 - 76:29
    and they can decide they don't trust anyone except themselves.
  • 76:29 - 76:37
    If Apple sells you an iPhone with a BIOS that requires a
  • 76:37 - 76:40
    signed operating system, it might be very hard for you to
  • 76:40 - 76:43
    get another version of the operating system on there.
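    What he is describing is, schematically, a signature check in the firmware: the bootloader only runs if it was signed by a key on the firmware's trust list, and the vendor decides what is on that list. A toy model of that gatekeeping logic, not real firmware code, with all names invented:

```python
# Toy model of the secure-boot gatekeeping described above. Real firmware
# verifies an asymmetric signature; this toy only shows who controls the
# trust list and why that matters for installing a free OS.
import hashlib

VENDOR_TRUSTED_KEYS = {"vendor-key-01"}  # set by whoever ships the device

def firmware_will_boot(image: bytes, signer_key_id: str,
                       claimed_digest: str) -> bool:
    # 1. The image must be intact (its hash matches what was signed)...
    if hashlib.sha256(image).hexdigest() != claimed_digest:
        return False
    # 2. ...and the signer must be on the vendor's list. If the vendor
    # trusts only itself, no third-party OS loader ever passes this gate.
    return signer_key_id in VENDOR_TRUSTED_KEYS

loader = b"\x7fELF...your-free-os-bootloader"
digest = hashlib.sha256(loader).hexdigest()
print(firmware_will_boot(loader, "community-linux-key", digest))  # False
```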
  • 76:43 - 76:50
    The proposals for this stuff are really in the realm of laptops and computers, that's where it's starting,
  • 76:50 - 76:53
    but believe me, technology spreads.
  • 76:53 - 76:59
    And if you want to be able to put Linux on a computer that you buy, on a laptop you buy,
  • 76:59 - 77:03
    very soon you might have a very difficult time doing that.
  • 77:03 - 77:05
    The standard is there, the companies paying attention to it
  • 77:05 - 77:08
    are not paying attention to it for our purposes.
  • 77:08 - 77:13
    They want to make sure that they can control what is on your computer.
  • 77:13 - 77:18
    So this is, you know, another political fight that we're going to engage in,
  • 77:18 - 77:20
    not the FreedomBox, but the community.
  • 77:20 - 77:26
    We're going to have to have this fight. UEFI. Look it up.
  • 77:26 - 77:33
    Start thinking about it. This is going to be a big piece of the puzzle for freedom in computing over
  • 77:33 - 77:34
    the next few years.
  • 77:34 - 77:39
    We're going to have some problems and we're going to have to find some solutions.
  • 77:39 - 77:45
    But wouldn't such an initiative, wouldn't that create a good market for companies who actually
  • 77:45 - 77:50
    would supply Linux on such devices, on the phone and on the laptop market?
  • 77:50 - 77:53
    I'm sure there are companies supplying that.
  • 77:53 - 77:55
    Absolutely.
  • 77:55 - 77:58
    And if the market in freedom were good enough to support
  • 77:58 - 78:03
    large-scale manufacturing and all that other stuff then we might get that.
  • 78:03 - 78:05
    And we might get that anyway.
  • 78:05 - 78:07
    I mean, the standard will include as many keys as you want,
  • 78:07 - 78:09
    so we might get the freedom.
  • 78:09 - 78:13
    But the manufacturers will have a really convenient way to turn the freedom off.
  • 78:13 - 78:17
    I think there will be a lot of boxes where you will have freedom.
  • 78:17 - 78:22
    But there will also be a lot where right now we think we can get Free software onto it,
  • 78:22 - 78:24
    where we won't be able to anymore.
  • 78:24 - 78:26
    It's going to be a narrowing of the market.
  • 78:26 - 78:29
    I don't think our freedom is going to completely disappear from devices.
  • 78:29 - 78:33
    But a lot of devices, if you buy the device without thinking about freedom, assuming you can have it,
  • 78:33 - 78:38
    you might get it home and discover that you can't.
  • 78:38 - 78:45
    Ok, we want to give the floor again to the audience for more questions or statements.
  • 78:45 - 78:52
    Ok, there in the back, one more.
  • 78:52 - 78:55
    Yeah, one more time, so...
  • 78:55 - 79:01
    Nowadays, when you can hardly really secure your PC, laptop, whatever, against malware...
  • 79:01 - 79:16
    Isn't it really a red carpet for hackers? If you have social networks and circles of friends,
  • 79:16 - 79:22
    one gets some malware on his PC, mobile device, whatever,
  • 79:22 - 79:27
    but has a FreedomBox and authenticates to his friends as if his machine were secure,
  • 79:27 - 79:32
    wouldn't that open doors?
  • 79:32 - 79:37
    Sure, well, human error is not something we can control for.
  • 79:37 - 79:45
    But someone who has a key that you trust is not necessarily someone who you let run arbitrary code
  • 79:45 - 79:48
    on your FreedomBox.
  • 79:48 - 79:53
    You might trust them to the point of having message passing with them, and trusting who they are
  • 79:53 - 79:56
    and what they say, but you don't necessarily trust the technology that they have and the
  • 79:56 - 79:59
    code that they have to be free of malware.
  • 79:59 - 80:01
    You'll still have to do all the things you currently do.
  • 80:01 - 80:04
    Right now if somebody sends you a file, it could have malware in it.
  • 80:04 - 80:08
    We're not making that easier, or better, or more likely to happen.
  • 80:08 - 80:15
    I think what we are doing is completely orthogonal to that problem.
  • 80:15 - 80:19
    At the same time, if we were to have email services on the box,
  • 80:19 - 80:23
    and you know we're not quite sure what the email story of a box like this looks like,
  • 80:23 - 80:27
    we probably would want to include some sort of virus scanning or spam catching,
  • 80:27 - 80:32
    all the usual filtering tools to give you whatever measure of protection might currently exist.
  • 80:32 - 80:35
    But the fact that someone has a key and you know who they are,
  • 80:35 - 80:39
    I don't think that will ever be the security hole.
  • 80:39 - 80:42
    Or at least we really hope we can make it so it's not.
  • 80:42 - 80:49
    If we fail in that then we've missed a trick.
  • 80:49 - 80:54
    Ok, any more statements or questions?
  • 80:54 - 80:57
    Ok, so, James, my last question would be...
  • 80:57 - 80:59
    You can actually buy the box right now?
  • 80:59 - 81:00
    Yes.
  • 81:00 - 81:02
    From a company?
  • 81:02 - 81:03
    Yes.
  • 81:03 - 81:06
    Maybe you can supply that information. But the software is being developed?
  • 81:06 - 81:07
    Yes.
  • 81:07 - 81:12
    Can you give an estimation about the timeline of your project, or the next milestones?
  • 81:12 - 81:13
    Sure.
  • 81:13 - 81:17
    So, the boxes are manufactured by a company called Globalscale,
  • 81:17 - 81:19
    they're about 140 US dollars.
  • 81:19 - 81:24
    There is a slightly older model called the SheevaPlug that is about $90.
  • 81:24 - 81:28
    It does pretty much everything the DreamPlug does.
  • 81:28 - 81:32
    It has some heat sinking issues, but it's a pretty good box as well,
  • 81:32 - 81:39
    so if the price point matters to you you can get last year's model and it'll serve you just fine.
  • 81:39 - 81:43
    The software, right now we have a bare Linux distribution.
  • 81:43 - 81:46
    We spent a lot of time getting the binary blobs out of the kernel
  • 81:46 - 81:50
    and making it installable onto this hardware target.
  • 81:50 - 81:55
    We have a Jabber server, Prosody, that we are modifying to suit our needs.
  • 81:55 - 82:01
    And that should be ready, time-frame, weeks.
  • 82:01 - 82:04
    Some short number of weeks.
  • 82:04 - 82:10
    The Privoxy server, the SSH forwarding, some short number of months.
  • 82:10 - 82:17
    That's our roadmap for the short-term future: Jabber, SSH forwarding, browser proxying.
  • 82:17 - 82:23
    We also are working on the interface, so we're going to have an interface that you can actually
  • 82:23 - 82:25
    control some of these services with.
  • 82:25 - 82:28
    And the first thing we're doing with that interface is probably allowing you to
  • 82:28 - 82:31
    configure this box as a wireless router.
  • 82:31 - 82:36
    So it can become your wireless access point if you want it to be.
  • 82:36 - 82:38
    And your gateway of course.
  • 82:38 - 82:40
    So user interface in one vertical,
  • 82:40 - 82:44
    SSH forwarding, browser proxying a little bit out there,
  • 82:44 - 82:48
    a little bit closer: Jabber, XMPP secure chat.
  • 82:48 - 82:53
    And once we have that stack, we believe that we're going to build upwards from XMPP towards
  • 82:53 - 82:56
    perhaps something like BuddyCloud.
  • 82:56 - 82:59
    We're seriously looking at BuddyCloud and seeing what problems it solves for us
  • 82:59 - 83:06
    in terms of actually letting users group themselves in ways that they can then do access control
  • 83:06 - 83:09
    and channels and things of that nature.
  • 83:09 - 83:14
    And are you actually in contact with the hardware company producing the servers?
  • 83:14 - 83:19
    Yeah, we've had a number of conversations with them.
  • 83:19 - 83:22
    They've agreed that when our code is ready this is something
  • 83:22 - 83:25
    they are very interested in distributing.
  • 83:25 - 83:27
    More importantly we've had a lot of conversations with
  • 83:27 - 83:29
    them about freedom.
  • 83:29 - 83:31
    About why we do what we do, the way we do.
  • 83:31 - 83:35
    And how they need to act if they want to distribute code for
  • 83:35 - 83:37
    us and work with our community.
  • 83:37 - 83:39
    And what that means is we're teaching them how to comply
  • 83:39 - 83:42
    with the GPL, and we're teaching them how to remove the binary drivers,
  • 83:42 - 83:46
    and in fact we're doing some of that for them.
  • 83:46 - 83:47
    But they're Chinese, right?
  • 83:47 - 83:49
    No. No, Globalscale is not a Chinese company.
  • 83:49 - 83:54
    Their manufacturing is in China, but they're not a Chinese company.
  • 83:54 - 83:58
    And we're also talking to Marvell. Marvell makes the system-on-a-chip that goes onto the boards
  • 83:58 - 84:01
    that Globalscale is integrating into their boxes.
  • 84:01 - 84:06
    But we're also talking to Marvell about what they can do to better serve the needs of our community.
  • 84:06 - 84:13
    So a large part of our efforts is to try to convince manufacturers to make
  • 84:13 - 84:15
    hardware that suits our needs.
  • 84:15 - 84:17
    This box is a thing that they developed, they invented,
  • 84:17 - 84:19
    before they ever met us, before they ever heard of us.
  • 84:19 - 84:24
    And if we can get them enough business,
  • 84:24 - 84:27
    if by making FreedomBoxes and by putting our software on the box,
  • 84:27 - 84:31
    we enable them to sell more boxes, they will be very happy
  • 84:31 - 84:34
    and when they design the next generation,
  • 84:34 - 84:39
    not the next generation of the DreamPlug, but the next generation after whatever they're designing now,
  • 84:39 - 84:42
    so we're talking a couple of years from now.
  • 84:42 - 84:45
    We can say to them, look, you're selling a lot of boxes
  • 84:45 - 84:49
    because you're making a thing that serves the free world very well.
  • 84:49 - 84:52
    Remove the eighth-inch audio jack because our people don't need it.
  • 84:52 - 84:56
    Add a second wifi radio. Put antenna ports on it.
  • 84:56 - 85:00
    This box can go from something that looks really good for our purpose to
  • 85:00 - 85:02
    being something that looks amazingly good for our purpose.
  • 85:02 - 85:05
    And that will require scale.
  • 85:05 - 85:07
    And what that means is that the FreedomBox becomes a wedge for
  • 85:07 - 85:13
    making better hardware for everyone.
  • 85:13 - 85:16
    But it's not just the FreedomBox. The Tor router project is
  • 85:16 - 85:21
    also focused on the DreamPlug. They've also decided this is a good box for their purpose.
  • 85:21 - 85:26
    If you are making a box that is kind of like a FreedomBox but isn't the FreedomBox because
  • 85:26 - 85:31
    it's more specialised to what you want it for, think about
  • 85:31 - 85:36
    the DreamPlug as a hardware target. And let us know,
  • 85:36 - 85:39
    so that when we go to the company, we can say look,
  • 85:39 - 85:42
    look at all the business you are getting by being people that serve the Free world.
  • 85:42 - 85:52
    And then, hopefully, we can convince them to make boxes that better serve the Free world.
  • 85:52 - 85:55
    And that's not a fantasy. We are having those conversations with them,
  • 85:55 - 85:58
    and they are very receptive.
  • 85:58 - 86:00
    So I am pretty happy about that aspect of what we do.
  • 86:00 - 86:03
    And my last question would be...
  • 86:03 - 86:05
    since now everything is turning mobile,
  • 86:05 - 86:07
    it's like we have these computers with an extra phone...
  • 86:07 - 86:09
    the phone is a small application on these devices.
  • 86:09 - 86:13
    Is there any plan or any idea or any project to say like, have
  • 86:13 - 86:18
    a FreedomPhone or Free mobile device?
  • 86:18 - 86:23
    So the way you connect to this box is kind of how you connect to your router,
  • 86:23 - 86:25
    port 80, browser.
  • 86:25 - 86:29
    But another way you could do it would be an app on your cellphone that bluetooths to the box.
  • 86:29 - 86:34
    I don't actually think the box has bluetooth, but you know,
  • 86:34 - 86:36
    an app on your cellphone that talks to the box over the network, say.
  • 86:36 - 86:38
    That's possible, we're thinking about that.
  • 86:38 - 86:41
    We're thinking about what that looks like for the large population
  • 86:41 - 86:44
    that exists out there that doesn't have computers.
  • 86:44 - 86:47
    There's an awful lot of people that only have cellphones, they don't have computers.
  • 86:47 - 86:49
    And we want them to have freedom too.
  • 86:49 - 86:51
    So figuring out how we can use a cellphone to talk to the box is a future problem.
  • 86:51 - 86:52
    We're not working on it right now, but we're certainly talking
  • 86:52 - 86:57
    about where it fits into the roadmap.
  • 86:57 - 87:01
    And that's why we are concerned about whether or not you
  • 87:01 - 87:05
    can trust your phone.
  • 87:05 - 87:07
    Because if you can trust your FreedomBox, but not the
  • 87:07 - 87:10
    thing you use to access it then you don't really have the privacy you think you have.
  • 87:10 - 87:13
    So figuring out whether you can trust your cellphone is a big part of the puzzle.
  • 87:13 - 87:18
    It's a big thing that we don't know how to do yet.
  • 87:18 - 87:21
    So let me make a little advertisement for another interesting project,
  • 87:21 - 87:25
    there is a Spanish development, I think it is also produced in China,
  • 87:25 - 87:27
    but it's called Geeksphone.
  • 87:27 - 87:31
    And they have a compatible Android installation by default,
  • 87:31 - 87:34
    and they probably have a similar philosophy of keeping the hardware open.
  • 87:34 - 87:37
    So maybe there is a new cooperation on the horizon.
  • 87:37 - 87:41
    Oh yeah, we love projects like that.
  • 87:41 - 87:41
    I don't know a lot about their project, but I have heard of it
  • 87:41 - 87:44
    and it is on my list of things to look into.
  • 87:44 - 87:48
    I would love to see that succeed, that would be excellent.
  • 87:48 - 87:50
    Well James, thank you for your presentation.
  • 87:50 - 87:55
    I think it was really interesting. And thank you for coming.
  • 87:55 - 87:58
    James will be back on this stage at 7pm when we have our final discussion on the 20 years of
  • 87:58 - 88:03
    the world wide web.
  • 88:03 - 88:05
    Thank you James for coming.
  • 88:05 - 88:13
    APPLAUSE