I'm very proud to have as a guest here, coming from the United States to Elevate, James Vasile of the FreedomBox Foundation. James Vasile is working on a multitude of projects like Apache, I think, Joomla and many others. He is also a lawyer, and he's also working with the FreedomBox Foundation and the Free Software Foundation. He's going to present one of the, in my opinion, most revolutionary projects I've seen in recent years, as we can see here: a little small box, the FreedomBox. Yeah, erm, James is going to do a presentation, then we're going to be open for questions, and then sit down for a talk. So James, I give the floor to you. Thank you, Daniel. I've been here at the Elevate festival for a few days now. I've been attending the talks and the films and the music, and this has been an amazing place to see all these different ideas coming together. I want to say thank you to Daniel for organizing so much of this. To Joseph as well. To Daniel especially for making a big effort to get me out here, making it possible for me to come out here and being such a gracious host. Thank you, Dan, I really appreciate it. APPLAUSE A long time ago, in the beginning of the internet, when we first started using the internet as a way to talk to each other, we mostly talked directly to each other, right? Think about how email works, on a technical level. You take a message, you hand it off to your mail transport agent. It sends it through a network, directly to the recipient. It hops through some other computers, but fundamentally you use the network to talk directly to the other computer, the computer where the recipient gets his or her mail. It was a direct communication medium. If you're old enough to remember a program called 'talk': talk was the first, sort of, interactive instant message application, where you type, they see it; they type, you see it. This again was direct. You would put their name and address into your program, they would put yours into theirs, and you would just talk directly to each other. You didn't send this message through servers that centralised the technology. From there, from those beginnings of talking directly to each other, we started to build communities, emailing directly to people. But that was relatively inefficient. Talking directly to people, one-to-one, works very well for one-to-one conversations. But as soon as you want a group conversation, as soon as you want to find people reliably who you haven't already set up contacts for, exchanged email addresses and such, you run into friction, you run into problems. So the solution to that was to create more centralised structures, and we did this with IRC. IRC is a place where, instead of talking directly to the people we're trying to reach, we take a message and we send it to an IRC server, a third party, and the IRC server then copies that message to all the people who we might want to talk to. 
We developed mailing lists, listservs And again, this was a way where we would take our message and hand it to a third party A mail server, that is not us and not the person we're trying to talk to and that mail server would then echo our communication to all the people we want to talk to and this was great, because you didn't have to know the addresses of all the people you wanted to talk to You could just all 'meet' in a common place We all meet in an IRC chatroom, we all meet on a listserv And there were a lot of IRC channels, and a lot of IRC servers and a lot of mail servers all across the internet A lot of places to do this communication. And if you didn't like the policies or the structures or the technology of any one of these service providers these IRC servers, or these list servers you could just switch, you could choose to run your own. It was very simple. This infrastructure is not hard to create, it's not hard to run, it's not hard to install. And so a lot of people did run, create and install it. There were a bunch of IRC servers, there were a bunch of different listserv packages But as we've moved forward in time, we've started to centralise even more. And, you can fast-forward to today where we're channeling our communication through fewer and fewer places. And we are making structures that are more and more central and more and more over-arching So, from the, the IRC way of talking to each other we moved to instant messaging applications. AOL Instant Messenger, ICQ, those were the early ways to do it and there were only a few of them MSN had its messaging system, Yahoo had its messaging system and when people wanted to talk to each other now, they were using third-parties again. But they were only using a few third parties. And if you wanted to switch providers, you would leave almost everyone you knew behind, your entire community behind. And so it becomes harder to switch. There are fewer options and the cost of switching leaves more and more people behind So you started to have lock-in. You started to have people who were chained to their methods of communication because the cost of losing your community is too high. And so if you don't like the technology, or you don't like the policy or you don't like the politics or if they're trying to filter you or censor you you don't have a lot of options. The cost of leaving is so high that you might stay. People do stay. And they accept it. And we went from that small basket of providers of this kind of communication technology to an even more centralised structure where there is effectively only one way to reach all our friends, in each mode of communication, Facebook. And Twitter. These two services rule everything. And I'm not going to stand here and say Facebook is evil and that Twitter is evil What I want to say is that having one place where we do all our communication leaves us at the mercy of the policies of the people that control the infrastructure that we are chained to, that we are stuck using, that we are locked into. You can't leave Facebook without leaving everybody you know because everybody you know is on Facebook. I was not a Facebook user. I was against Facebook. I thought it was bad to centralise all our communication in one place. I didn't like the privacy implications, I didn't like Facebook's censorship of things like pictures of nursing mothers. I don't think that kind of thing is obscene, and I don't think Facebook should have the ability to tell us what we can share with our friends. 
So I thought those were bad policies, and I reacted to that by not joining Facebook. For years. All my friends were on Facebook. I joined Facebook late last year. November. Because in November, a friend of mine passed away. His name was Chuck. He was a brilliant man. And he lived a lot of his life online. He was on Facebook, and he shared things with friends on Facebook. When he passed away I realised I hadn't communicated with him in a while, I hadn't really talked to him in a while. And the reason I hadn't was because I wasn't communicating with him in the place he communicates. I wasn't meeting him where he was, I wasn't on Facebook. I was missing out on something huge. That's the cost of not being there. And so I joined. Because I decided that as strong as my beliefs were, it was more important to me to be there with my friends and to talk to my friends. That's the power of lock-in. Me, a person who cares, as much as I do, who cares enough about these issues that I do something like this I got locked into Facebook. I'm there now. That's how I talk to a lot of my friends, whether I like it or not I am locked into Facebook. You know, I'm also on Diaspora. But my friends aren't on Diaspora. This sort of lock-in creates a sort of situation where we have one arbiter of what is acceptable speech, whether we like it or not. If they're free, we're free to the extent, only to the extent, that they give us freedom. And that to me isn't freedom. That to me is accepting what you're given. It's the exact opposite of making your own choices. The exact opposite of self-determination. All of our problems in communication can be traced to centralized communications infrastructure. Now, I've sort of told this story at the social level, in the way that we're talking about how to talk to your peers and your friends on the internet. But this story also exists when we think about relying on the pipes, relying on the hardware, the technical infrastructure behind the software. We rely on internet backbones, we rely on centralized cellphone networks, we rely on centralized telephone networks. The people that control these networks have the ability to tell us what we're allowed to say, when we're allowed to say it. They have the ability to filter us, to censor us, to influence us. Sometimes they use that ability, and sometimes they don't, and sometimes by law they're not allowed to. But at the end of the day the power doesn't rest in our hands. The power, from a technological perspective, rests in the hands of the people that operate the networks. Centralization doesn't just allow this sort of filtering and censorship. There's another big problem with centralization. The other big problem with centralization is that by gathering all of our data in one place it becomes easy to spy on us. So every time you go to a website pretty much the website includes, at the bottom of the page a little graphic or invisible Javascript thing that tells Google that you came to visit the page. Eva goes to a website, and the website says "Hey Google! Eva just came to my website!" Every time she goes to a website, that happens. And so Google effectively sits next to her and watches, while she uses the internet. Watches everything she does, and everything she enters, everything she looks at and knows. It's not just her search data, it's not just her Gmail. It's the entire picture of her digital life. In one place. That's a pretty complete profile. If you were able... 
...imagine if somebody could sit next to you and watch everything you did online, imagine how much they would know about you. That's how much Google knows about you. Google knows more about you than you know about yourself, because Google never forgets. Google knows more about you than your parents, than your partner, Google knows your secrets, your worst secrets, Google knows if you're cheating on your spouse because they saw you do the Google search for the sexually-transmitted disease. Google knows your hopes and your dreams. Because the things we hope and dream about, we look for more information about. We're natural information seekers. We think about something, it fascinates us, we go and look it up online. We search around. We look around the internet, and we think about it. And Google is right there. Following our thought process, the thought process in our click trail. That is an intimate relationship. Right? Do you want an intimate relationship with Google? Maybe you do. I personally, don't. But that's it, Google sits next to us and watches us use our computers. And if anyone actually did... if you had a friend who wanted to sit next to you, or a stranger said I want to sit next to you and just watch you use your computer all day, you would use that computer very differently to the way you do now. But because Google doesn't physically sit next to you, Google sits invisibly in the box, you don't know Google is there. But you do know, right? We're all aware of this. I'm not saying any of you don't know, especially in a room like this. But we don't think about it. We try not to think about it. We are locked in, to the internet. We can't stop using it. And the structures that exist, the infrastructure that exists, that has been slowly turned from a means to allow us to communicate with each other to a means of allowing us to access web services in return for all our personal information so we can be bought and sold like products. That is the problem. That is the problem of centralization, of having one structure. As soon as we put all that information in one place we get complete profiles of us, you get complete pictures of you. And that is a lot of information. It's valuable information. It's information that is used, right now, mostly to sell you things. And that, you might find objectionable. Maybe you don't. Maybe you don't believe the studies that say you can't ignore advertising. Maybe you think that you are smart and special, and advertising doesn't affect you. You're wrong. But maybe you believe that. But that information, that same infrastructure, that same technology that allows them to know you well enough to sell you soap allows them to know you well enough to decide how much of a credit risk you are, how much of a health risk you are, and what your insurance premiums should look like. In America we have a big problem right now. Insurance costs are out of control. Health insurance. We're having a lot of difficulty paying for it. Insurance companies would like to respond to this problem by knowing better who's a good risk and who's a bad risk so they can lower prices for the good risk and raise prices for the bad risk. Essentially they want to make people who are going to get sick, uninsurable. 
And if you could know enough about a person to know what their risk factors are based on what their digital life is, if you can get just a little bit of information about them, maybe you can figure out who their parents are and what hereditary diseases they might be subject to, you can start to understand these things. You can start to figure out who's a good risk and who's a bad risk. You can use this information for ends that seem reasonable if you're a health insurance company, but probably don't seem reasonable if you're the kind of person sitting in this room, the kind of person that I talk to. And that's the problem. The innocuous use. The use that seems kind of icky, but not truly evil, which is advertising. It's the same mechanism, the same data, that then gets used for other purposes. It's the same data that then gets turned over to a government who wants to oppress you because you are supporting WikiLeaks. And that's not a fantasy, that's what happened. It's the same information that anybody who wants to know something about you for an evil end would use. We have a saying in the world of information, that if the data exists, you can't decide what it gets used for. Once data exists, especially data in the hands of the government, of officials, once that data exists, it's a resource. And the use of that resource has its own energy, its own logic. Once a resource is there begging to be used, it's very hard to stop it from being used. Because it's so attractive, it's so efficient, it would solve so many problems to use the data. And so once you collect the data, once the data exists in one centralized place, for anybody to come and get it with a warrant, or maybe no warrant, or maybe some money... somebody is going to come with a warrant, or no warrant, and they are going to get that data. And they will use it for whatever they want to use it for. Once it's out of the hands of the first person who collected it, who maybe you trust, who maybe has good privacy policies, who maybe has no intention to do anything with your data other than use it for diagnostic purposes, once it's out of that person's hands it's gone. You never know where it goes after that. It is completely uncontrolled and unchecked and there is no ability to restrain what happens to that data. So all of this is my attempt to convince you that privacy is a real value in our society, and that the danger of losing privacy is a real problem. It's not just the censorship, it's not just the filtering, it's not just the propaganda, the influencing of opinion, that's one aspect of it, it's not just the free speech. It's also the privacy, because privacy goes to the heart of our autonomy. About a year and a half to two years ago at the Software Freedom Law Center a man named Ian Sullivan, who's a co-worker of mine, bought a bunch of plug servers, because he was really excited at the thought of using them as print servers and media servers, and he started tinkering with them in our office. My boss Eben Moglen, who is a long-time activist in the Free Software movement, who fought very hard for Phil Zimmermann and PGP when that was a big issue, looked at this technology and he immediately realised that several streams had come together in one place. There's a lot of really good technology to protect your privacy right now. In fact that's the stuff we're putting on the FreedomBox. We're not writing new software. We are gathering stuff, and putting it in one place. 
Stuff that other people did, because there are people who are better at writing software, and security, than we are. We're software integrators. And he realised there was all this software out there, and suddenly there was a box to put it on. You could put all that software in one place, make it easy, and give it to people in one neat package. Pre-installed, pre-configured, or as close to it as we can get. And that was the vision for the FreedomBox. The FreedomBox is a tiny computer. Look at this. That's small, it's unobtrusive. So it's a small computer. And we don't just mean small in size... it doesn't take a lot of energy. I could be running this box on a couple of AA batteries for the life of this presentation. You could run it on a solar panel. It's very lightweight infrastructure. You plug it into your home network, and when I say home network, (I'm going to pass this around) When I say home network, I mean home network. This is technology we are designing for individuals to use to talk to their friends. Our use-case, the thing we're trying to protect, is you guys, as individuals in your communities. This isn't a small-business appliance, it's not a large corporate appliance, this is a thing that we are truly aiming at the home market, and people who care about privacy on an individual level. You plug it into your home network to protect your privacy, your freedom, your anonymity and your security. That is our mission statement, I guess. Unofficially. That is what we believe we are trying to do with this device. So, what privacy means in this context, the way we're going to go about trying to protect your privacy, is to connect you directly with other people and take everything you do and try to encrypt it so that only you and the person you are talking to can see it. This is not a new idea. We can do encrypted messaging, and we can do encrypted browsing. Now there are problems with encrypted browsing. Right now if you want to have secure browsing you generally use something called SSL. SSL is a system of certificates that allow a web server to say to you "we can talk privately". That's the first guarantee, (A) a secure cryptographic connection, and (B) I can authenticate to you that I am who I say I am. So not only can nobody listen, but you know who you're talking to. You're not secretly talking to the government when you think you're really talking to me. The problem with SSL, the big problem with SSL, is that the system for signing certificates relies on a trust hierarchy that goes back to a cartel of companies who sign the server certificates, who have the ability to make this "guarantee". So when the website says to you "I guarantee I am who I am", you say "I don't know you, I don't trust you". And they say "Oh, but this other company, I paid them money, and so they'll guarantee that I am me." Which is a really interesting idea - because I also don't know this company, why would I trust that company? I mean, the company is just old enough and influential enough that they could actually get their authority into my browser. So really my browser is willing to accept at face-value that this website is who it says it is, but I don't necessarily accept that. And then we have the problem of self-signed certificates. Where they say, none of those authorities in your browser trust me, I trust myself and look, I've signed a piece of paper - I swear I am who I say I am. And that is not trustworthy at all, right? That's just him saying again "No, really! I'm me!". 
So this is a problem, because the FreedomBoxes are not going to trust the SSL cartel, and they are not going to trust each other, so they can't just sort of swear to each other that they are who they are. So we think we've solved this. I'm not going to say we've solved it, because we're just starting to tell people about this idea, and I'm sure people will have reasons why the idea can be improved. But there is a technology called Monkeysphere, that allows you to take an SSH key, wrap it in a PGP key, and use the PGP key to authenticate SSH connections. It's really neat technology that allows you to replace SSH trust with PGP trust. And we looked at that, and we thought, why can't we do that with SSL? So one thing we're going to do with browsing is take an SSL certificate, an X.509 certificate, and wrap it around a PGP key and send it through the normal SSL layer mechanisms, but when it gets to the other end, smart servers and smart browsers will open it up and use PGP mechanisms to figure out how to trust people, to verify the connections, to authenticate the identity of the browser, of the server. This allows us to replace the SSL cartel with the web of trust, the keyservers. We're replacing a tiny group of companies that control everything with keyservers, community infrastructure. Anyone can set up a keyserver, and you can decide which one you want to trust. They share information. The web of trust is built on people, telling each other that they trust each other. Again, you can decide who to trust and how much you want to trust them. This is emblematic of our approach. We've identified structures that are unreliable because they are centralized, because they are controlled by interests that are not the same interests as our interests. And we've decided to replace them wherever we can with structures that rely on people, that rely on human relationships, that rely less on the notion that you can buy trust, and more on the notion that you earn trust, by being trustworthy, by having people vouch for you over time. So that's our approach to encrypted browsing. It's also our approach to encrypted messaging. We're doing Jabber for a lot of message passing, XMPP, and we're securing that again with PGP. Everywhere we can we're going to try to use the PGP network, because it already exists... as I said, we're not trying to invent anything new. PGP already exists and it does a really good job. So we're taking the PGP trust system and we're going to apply it to things like XMPP and make sure that we can do message passing in a way that we can trust. Once we have XMPP we have a way to send text, a way to send audio, sure... but also you can send structured data. Through that same channel. And you can send that data to buddy lists. So the system starts to look like a way to pass data in a social way. And we think this is the beginning of the social layer of the box. At the bottom of the box we have a belief that the technology should be social from the ground up. And so we're building structures that allow it to be social, that assume you want to connect with friends in a network of freedom, perhaps FreedomBoxes, perhaps other kinds of software, other kinds of technology. And we're designing with that in mind. With that in mind, we think we get certain benefits technologically which I'll get into later. We think we can simplify things like key management, through methods like this. 
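To make that certificate idea concrete, here is a minimal sketch, assuming a hypothetical server operator who publishes a clearsigned PGP statement containing their server's certificate fingerprint. The host name, the statement, and the use of the python-gnupg wrapper are all assumptions for illustration; this is not the FreedomBox implementation, only the general "trust the web of trust instead of a CA" pattern.

```python
# Sketch: trust a TLS server because someone in your PGP web of trust
# signed a statement about its certificate, not because a CA vouched for it.
# Assumes a local GnuPG keyring and the python-gnupg wrapper.

import hashlib
import ssl

import gnupg  # pip install python-gnupg


def certificate_fingerprint(host: str, port: int = 443) -> str:
    """Fetch the server's certificate and return its SHA-256 fingerprint."""
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return hashlib.sha256(der).hexdigest()


def verify_against_web_of_trust(fingerprint: str,
                                signed_statement: str,
                                expected_signer: str) -> bool:
    """Check a clearsigned statement that should contain the fingerprint.

    `signed_statement` is a PGP-clearsigned text blob the server operator
    published; `expected_signer` is the fingerprint of a key we already
    trust, directly or through our friends' signatures.
    """
    gpg = gnupg.GPG()                     # uses the local keyring
    verified = gpg.verify(signed_statement)
    return (verified.valid
            and verified.fingerprint == expected_signer
            and fingerprint in signed_statement)


# Usage sketch: compare what the network shows us with what a friend vouches for.
# fp = certificate_fingerprint("box.example.org")
# ok = verify_against_web_of_trust(fp, open("statement.asc").read(), FRIEND_KEY_FPR)
```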
By privacy I also mean that we can install a proxy server, Privoxy, we think the answer is Privoxy here, Privoxy on the box, so you can point your browser at the box, surf the web through the box, and strip ads, strip cookies, stop Google from tracking you from website to website to website, to remove the constant person sitting at your side, spying, recording, listening to everything you do. In that vein, we don't just want to block ads and reject cookies, we want to do something new, relatively new. We think we want to munge your browser fingerprint, that unique pattern of data that is captured by your user-agent string and what plugins you have, and all that stuff that forms a unique profile of you that allows people to track your browser, companies to track your browser as you hop along the web, even if they don't know anything about you. It can sort of tie you to the browser, make profiles about your browser. And that turns out to be a very effective way of figuring out who you are. So even without a cookie, even without serving you with an ad, once they're talking to you they can uniquely identify you, or relatively uniquely. But it's relatively early in the browser fingerprint arms race. We think that with a very little bit of change we can foil the recording, and win this round at least. And instead of having one profile where they gather all of your data, you will present to services as a different person every time you use the service. So they cannot build profiles of you over time. That's what privacy looks like in our context. We're looking for cheap ways to foil the tracking. We're looking for easy things we can do, because we believe there's a lot of low-hanging fruit. And we'll talk about that more in a minute. Freedom is our value, freedom is the thing we are aiming for, freedom from centralized structures like the pipes. Now mesh networking, I have mesh networking in my slides. That is a lie. We are not doing mesh networking. The reason we are not doing mesh networking is because I do not know anything about mesh networking, and one of the reasons I came here was to meet people who know a lot about mesh networking, and I see people in this audience who know a lot about mesh networking. If you want to turn that lie into the truth, the way you do that is by continuing on your projects, making mesh networking awesome, to the point where I can say yes, we're going to put that in this box. Then eventually, by the time this box is ready to do real things for real people, we're really hoping that the mesh story coheres, where we've identified the protocol and the technology and the people who are going to help us. If you think you might be one of those people, we want to talk to you. So yes, we are going to do mesh networking, and that might be a lie but I hope not. We want you to have the freedom to own your data. That means data portability, that means that your data sits on your box and never goes to a third party. It only goes to the people you want it to go to. Fine-grained access control. Your data, your structures, you decide where it goes. That's a user-interface problem, that's a user permission problem, an access control problem. Access control is a solved problem. Doing it through a convenient user-interface, that's not solved... so that's work to be done. That's a big chunk of our todo list. We want you to own your social network. Before Facebook there was a thing called MySpace, which was... I'm not even sure it exists anymore. Before MySpace there was Tribe. 
Before Tribe there was Friendster. Friendster is now like a... "gaming network". I don't know what it is, but they still send me email, which is the only reason I know they're still alive. Before Friendster there was the original social network. We called this social network "the internet". We talked directly to each other, we used email, instant messenger and IRC. We talked to people using the structures that were out there. It wasn't centralized in one service, we had a lot of ways of meeting each other and passing messages. What we lacked was a centralized interface. So when we say "own your social network" we mean use the services of the internet, own the pieces that talk to each other. Hopefully we'll provide you with a convenient interface to do that. But the actual structures, the places where your data live, that is just the same pieces that we know how to use already. We are not going to try to reinvent how you talk to people, we're just going to make it so that the pipes are secure. A big part of freedom, a big part of privacy, is anonymity. Tor can provide anonymity. But we don't have to go all the way to Tor. Tor is expensive, in terms of latency. Tor is difficult to manage... I don't know how many people have tried to use Tor, to run all their traffic through Tor. It's hard. For two reasons. For one, the latency... it takes a very long time to load a web page. And two, you look like a criminal. To every website that you go to. My bank shut down my account when I used Tor. Because suddenly I was coming from an IP address in Germany from which they had detected past efforts to hack them. So they closed my account, well, I had to talk to them about it, it did all get solved in the end. PayPal as well closed my account down. So that was the end of my ability to use Tor. So we can't just run all our traffic through Tor. It's too slow, and the network has weird properties in terms of how you present to websites, that frankly, are scary. Because if I look like a criminal to the bank, I don't want to imagine what I look like to my own government. But we can do privacy in other ways. If you are a web user in China, and you want to surf the internet, with full access to every website you might go to, and with privacy from your government, so that you don't get a knock on your door for visiting those websites, we can do that without Tor. We don't need Tor to do that. We can do that cheaply. Because all you need to do in that situation is get your connection out of China. Send your request for a web page through an encrypted connection to a FreedomBox in... Austria, America, who knows? Just get the request away from the people who physically have the power to control you. And we can do that cheaply, that's just SSH port forwarding. That's just a little bit of tunneling, that's just a little bit of VPN. There's a lot of ways to do that sort of thing, to give you anonymity and privacy in your specific context without going all the way into something like Tor. Now there are people who are going to need Tor. They will need it for their use case. But not every use case faces that level of attack. And so one of the things we're trying to do is figure out how much privacy and anonymity you need, and from whom you need it. If we can do that effectively we can give people solutions that actually work for them. Because if we just tell people to use Tor, we're going to have a problem. They're not going to use it, and they won't get any privacy at all. And that's bad. 
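To make the "just get your connection out of the country" example concrete, here is a minimal sketch using OpenSSH's dynamic port forwarding. The account and host name are made up, and this is only one of the many tunneling approaches mentioned, not an official FreedomBox mechanism.

```python
# Sketch: open an encrypted SSH tunnel to a (hypothetical) FreedomBox abroad
# and use it as a local SOCKS proxy, so the local network only ever sees an
# encrypted connection to one machine. Assumes OpenSSH is installed and you
# have an account on box.example.org.

import subprocess

# -N: no remote command, just forward traffic
# -D 1080: dynamic port forwarding, i.e. a SOCKS5 proxy on localhost:1080
tunnel = subprocess.Popen(
    ["ssh", "-N", "-D", "1080", "user@box.example.org"]
)

# Point your browser (or any SOCKS-aware program) at localhost:1080.
# With the `requests[socks]` extra installed, for example:
#
#   import requests
#   proxies = {"http": "socks5h://127.0.0.1:1080",
#              "https": "socks5h://127.0.0.1:1080"}
#   requests.get("https://example.org", proxies=proxies)
#
# "socks5h" resolves DNS at the far end of the tunnel too, so the local
# network does not even see which names you look up.

# tunnel.terminate()  # close the tunnel when done
```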
So we want to allow people to do anonymous publishing, and file-sharing, and web-browsing and email. All the communications you want to do. The technology to do that already exists, we could do all of that with Tor. The next piece of our challenge is to figure out how to do it without Tor. To figure out what pieces we need Tor for, and to figure out what pieces we can do a little bit more cheaply. Security. Without security, you don't have freedom and privacy and anonymity. If the box isn't secure, you lose. We're going to encrypt everything. We're going to do something that's called social key management, which I'm going to talk about. I do want to talk about the Debian-based bit. We are based on a distribution of Linux called Debian, because it is a community-based distribution. It is made by people who care a lot about your freedom, your privacy, and your ability to speak anonymously. And we really believe that the best way to distribute this software is to hand it to the Debian mirror network and let them distribute it. Because they have mechanisms to make sure that nobody changes it. If we were to distribute the software to you directly, we would become a target. People would want to change the software as we distribute it on our website. They would want to crack our website and distribute their version of the package. We don't want to be a target, so we're not going to give you software. We're going to give it to Debian, and let them give you the software. And at the same time you get all of the Debian guarantees about freedom. The Debian Free Software Guidelines. They're not going to give you software unless it comes with all of the social guarantees that are required to participate in the Debian community. So we're very proud to be using Debian in this manner, and working with Debian in this manner. And we think that's the most effective way we can guarantee that we're going to live up to our promises to you, because it provides a mechanism whereby if we fail to live up to our promises, we cannot give you something that is broken. Because Debian won't let us, they just won't distribute it. There are problems with security. There are things we can't solve. One... Physical security of the box. We haven't really talked much internally about whether we can encrypt the filesystem on this box. I don't quite see a way to do it. It doesn't have an interface for you to enter a password effectively. By the time you've brought an interface up you'd be running untrusted code. I don't know a way to do it. If anyone can think of a way that we can effectively encrypt the filesystem, I'd love to hear it. But, on top of that, if we do encrypt the filesystem, then the thing cannot be rebooted remotely, which is a downside. So there are trade-offs at every step of the way. If we can figure out some of these security issues, then we can be ahead of the game. But I think the encrypting the filesystem is the only way to guarantee the box is secure, even if it's not physically secure. So I think that's a big one. If you have ideas about that, please come and talk to me after the talk. I promised I would talk about social key management, and here it is. So we're building the idea of knowing who your friends are into the box at a somewhat low level. To the point where things that are on the box can assume it is there, or ask you if it's there, or rely on it as a matter of course in some cases. So we can do things with keys that make your keys unlosable. Right now a PGP key is a hard thing to manage. 
Key management is terrible. Do you guys like PGP? PGP is good. Does anyone here like key management? We have one guy who likes key management. LAUGHTER He's going to do it for all of you! So, none of us like key management. Key management doesn't work, especially if your use-case is home users, naive end-users. Nobody wants to do key management. Writing their key down and putting it in a safety deposit box is ludicrous. It's a very difficult thing to actually convince people to do. Sticking it on a USB key, putting it in a zip-lock bag and burying it in your backyard is paranoid. I can't believe I just told you what I do with my key. LAUGHTER No, you can't ask people to do that. They won't do it. You can't protect keys in this manner. You have to have a system that allows them to sort of, not ever know they have a key. To not think about their key unless they really want to. We think we've come up with something that might work. You take the key, or a subkey, you chop it into little bits and you give that key... and we're talking about a key of a very long length, so there's a giant attack space and you can chop it into bits and hand it to people without reducing the search space for a key. You chop it into bits and hand all the bits to your friends. Now all your friends have your key, as a group. Individually, none of them can attack you. Individually, none of them has the power to come root your box, to access your services and pretend to be you. As a group, they can do this. We trust our friends, as a group, more than we trust them as individuals. Any single one of your friends, if you gave them the key to your financial data and your private online life, that would make you very nervous. You would worry that they would succumb to temptation to peek, fall on hard times and want to attack you in some way, fall out with you, get mad at you. As an individual, people are sort of fallible in this sense. But as a group of friends who would have to get together and affirmatively make a decision to attack you, we think that's extremely unlikely. It's so unlikely that there are only a few scenarios where we think it might happen. One... if you are ill and unable to access your box, or you're in jail, or you've passed away, or you've disappeared. Or... you've gone crazy. This type of event, where all your friends get together and help you even if you don't ask them for help, we call an intervention. When your friends sit you down and say, "you need our help, you can't ask us for it because you're not in a position to ask us for it", that's an intervention. If you have a moment in your life, a crisis in your life that is an intervention-level event, that's when you can go to your friends. If your house burns down, you lose your key and all your data. You go to your friends, and you say "can I have part of my key back?" "Oh, and give me that data that you have in a cryptographically-sealed box that you can't read." To all your friends... "My data please, my key please, ..." "My data please, my key please, ..." "My data please, my key please, ..." You take all those pieces, you get a new box, you load it all onto your box. You have the key, you have your entire key, and now you can read your data. And you haven't lost your digital life. You have a key that is now unlosable. Even if you never wrote it down, even if you never buried it in the backyard. This is a hard problem in key management. People lose their keys and their passwords to services all the time. 
The only way we can think of to make that impossible, is this mechanism. And of course it's optional. If you're a person who doesn't trust your friends, even as a group, or if you're a person who just doesn't have a lot of friends (let me finish!) ...who doesn't have a lot of friends with FreedomBoxes who can be the backend for this, you don't have to trust this mechanism. You can do something else to make your key unforgettable. But for a lot of naive end-users, this is the mechanism. This is the way they are going to never lose their keys Because the first time a user gets irretrievably locked out of his FreedomBox, we lose that user forever. And we lose all his friends forever. Because it would scare you to lose such an important group of information. Social key management. This is the benefit of building social, of building knowledge of who your friends are, into the box, at a deep level. We have never done that before, with a technology as a community project. And it opens up new possibilities. This is just one. There are others. But it's a field we haven't really thought a lot about. I think once we get out there and we start doing this kind of construction, a lot of new uses are going to be found for this architecture. I encourage you all to think about what changes, when you can assume that the box has people you can trust, just a little bit, because right now we live in a world where we are asked to trust third party services like Facebook with all our photos, or Flickr with all our photos, or Gmail with all our email. We are asked to trust them. We have no reason to trust them. I mean, we expect that they'll act all right, because they have no reason to destroy us. But we don't know what's going to happen. We're effectively giving all our information to people we don't trust at all right now. How does a network of people we trust, just a little bit, change the landscape? I think that's a really interesting question. This box explores that question, this box creates new solutions to old problems that previously seemed intractable. So, I encourage everybody to think about how that might change the solution to a problem they have with a technological architecture as it exists today. Here's another problem... Boxes that know who you are, and know who your friends are, and know how your friends normally act, can also know when your friends are acting weird. If you have a friend who sends you one email a year, who suddenly sends you ten emails in a day, that look like spam, you know that box is rooted. You know that box is weird. Or if you are using the FreedomBox as your gateway to the internet, and a box it is serving downstream, starts sending a bunch of spam through it, it knows. It can say "Oh no! You're acting like a zombie." "You should get a check-up." It can shut off mail service to that box, and not let the messages out. It can make that decision to protect the wider internet to make you a better citizen in the world. If suddenly your computer starts saying "Hey, I'm in Scotland and I need $5000"... but we know you're not in Scotland Maybe this box, because it has contact information, maybe this box sends you an SMS. And says "Dude, you've been hacked, go do something about your box." So the types of things we can do once we assume we have close relations as opposed to arms-length relations, the types of things we can do when we trust each other a little bit and we trust our boxes a little bit, goes way up. Way up. 
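As a rough illustration of the "your friend's box is acting weird" idea, here is a toy sketch of a rate-based check that a gateway box might apply to outgoing mail. The window, the threshold and the notification hook are invented for the example; this is only the heuristic described above, not FreedomBox code.

```python
# Sketch: refuse to relay mail (and warn the owner) when a sender's recent
# volume looks nothing like their normal behaviour, e.g. a spam-sending zombie.

from collections import defaultdict, deque
from time import time

WINDOW_SECONDS = 24 * 60 * 60   # look at the last day
MAX_PER_WINDOW = 50             # hypothetical "normal" ceiling per sender

recent = defaultdict(deque)     # sender -> timestamps of recent messages


def should_relay(sender: str, now: float = 0.0) -> bool:
    """Return True if relaying this message looks like normal behaviour."""
    now = now or time()
    q = recent[sender]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()             # drop messages older than the window
    if len(q) > MAX_PER_WINDOW:
        # Here a real box might send the owner an SMS or XMPP message
        # and stop relaying until someone checks the machine.
        print(f"{sender} sent {len(q)} messages today; box may be compromised")
        return False
    return True
```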
And by bringing that infrastructure closer to us, I mean Gmail is too far away to play that role from a network perspective. But if the box is in our land, we can do that. These boxes will only work if they are convenient. There's an old punk-rock slogan, from the Dead Kennedys, "Give me convenience, or give me death." We laugh at that, but that's a belief users have, and I deduce that based on their behaviour, because every time there is a convenient web service, people use it. Even if it's not very good with privacy, a lot of people are going to use it. And conversely, whenever we have web services that are very good at privacy, but aren't very convenient, comparatively fewer people use them. We don't think this box works without convenience. If we don't get the user-interface right then this project will probably fall over. It will never gain any sort of critical mass. So we need a simple interface, we need a way for users to interact with this box in a minimal way. They should think about it as little as possible. That's the hardest problem we face. Quite frankly. The technology to do private communication, that exists. A lot of the people in this room helped to build that infrastructure and technology. We can put it on the box. Making it easy and accessible for users, that's hard. And right now we're trying to figure out what that looks like, who the designers are going to be. If you have user interface or user experience design that you want to bring to a project like this, please, please, come find me. In order to have convenience, we need to have the thing provide services that are not just freedom-oriented, we need to use its position in your network as a trusted device to do things for you that aren't just about privacy. It needs to do backups. This is important. Right now the way people back up their photos is by giving them to Flickr. The way they back up their email is by giving it to Gmail. If we don't provide backups, we can never be an effective replacement for the services that store your data somewhere else. Even though they're storing it out there in the cloud for their purposes, you get a benefit from it. We have to replicate that benefit. So things that we don't think of as privacy features have to be in the box. The backups, the passwords, and the keys, you can't forget them. We would like it to be a music, a video, a photo server, all the kinds of things you might expect from a convenient box on your network. All the things that you want to share with other people, this box has to do those things. And these aren't privacy features, but without them we won't be able to give people privacy. Our first feature, the thing we are working towards is Jabber. It's secure encrypted chat, point-to-point. That will be the thing we are working on right now. But in order to do that we need to solve this monkey-spherish SSL problem that I described. We have code, it needs to get packaged and all that. Our development strategy, the way we are going to do all the things we said, because the list of things I have said we're going to do... I can't believe you're not throwing things at me. Because it's ludicrous to believe that we can actually do all these things by ourselves. And we're not. We're going to let other people make the software. As much as possible we're going to encourage other people to build stuff. We're going to use stuff that already exists. We're going to use Privoxy, we're going to use Prosody, we're going to use Apache. 
We're not going to reinvent the web server, we're not going to reinvent protocols. I really hope that by the time this project is mature, we haven't invented any new protocols. Maybe we'll use new protocols, but I don't want to be generating new things that haven't been tested, and then putting them in FreedomBox. I want to see things tested in the real world, gaining credibility, and then take them. The less we invent, the better. As far as timelines go, by the time we have it ready, you'll know why you need it. People right now are figuring out that privacy is important. They're seeing it over and over again. In Egypt, at the start of the Arab Spring, one of the things the government did to try to tamp down the organisation was to convince companies to shut off cell networks, to prevent people from talking to each other. In America they did the same thing in San Francisco, I hear. Turned off the cell towers to prevent people from organising to meet for a protest. With Occupy Wall Street, you're starting to see infiltration, you're starting to see people going and getting information that Occupy Wall Street is talking about and turning it over to the authorities, the police, the FBI. So the need for privacy, as we enter a new age of increased activism, we hope, of increased activity, of social activity, I think the need for a lot of this privacy stuff is going to become clear. As the technology for invading your privacy improves, the need for technology to protect your privacy will become stark and clear. Our two big challenges, as I said, are user experience, and the one I didn't say was paying for developers, paying for designers. Those are the hard parts that we're working on. And if we fail, we think that's where we fail. Software isn't on that list, as I said software is already out there. So you can have a FreedomBox. If you like that box that we've been passing around the audience, you can buy one from Globalscale. If you don't want the box, it's just Debian, it's just Linux, it's just packages. Throw Debian on a box, we will have packages available through the normal Debian mechanisms. You don't even have to use our repository. In fact, I don't think we're going to have a repository. You're just going to download it and install it the same way you normally do it, if you're technologically capable of doing that. I grabbed a bunch of photos from Flickr, my colleague Ian Sullivan took that awesome picture of the FreedomBox. And that's how you reach me. APPLAUSE Thanks James, please sit down. We are open for questions from the audience for James. Please raise your hand if you have any questions about the FreedomBox. Hello, thanks, that was a very interesting presentation. Thank you. Your boss Eben Moglen, he has given a speech at a committee of the US Congress I believe, which has received a lot of attention. And in Iran, during the Green Movement, the US State Department I believe has told Twitter to reschedule maintenance so that the opposition could keep using Twitter during the attempted revolution. And Hillary Clinton has given a very popular speech about how America would support the promotion of internet freedom, and I think things such as the New America Foundation are funding and supporting projects such as the Commotion mesh networking project that we've already heard about before. So in other words there's a link between politics and technology sometimes, and in the past I believe certain influential Americans such as Rupert Murdoch or George W. 
Bush have viewed modern communication technologies as a way to promote U.S. foreign policy and to spread democracy and freedom in the world. So my question is, what is your relationship with your government? That's a really good question. So one of the things that we sort of figured out from the beginning was that if we had close relationships with the U.S. government, people outside of the U.S. might have difficulty trusting us, because nobody wants to tell all their secrets to the American government. So we were thinking about what that really looks like in the context of a box that could be used globally. We are working very hard to engineer a device that does not require you to trust us. I'm not asking for your trust. I'm not asking for your trust, I'm asking for your help. All the code we write, you'll be able to see it, you'll be able to audit it, you'll be able to make your own decisions about what it does, you'll be able to test whether it is trustworthy or not, and if you decide that it is not, you can tell everyone, and they won't use it. So from a trust perspective, it doesn't matter what our relationship is with anybody. So that's the first thing. The second thing is that right now we don't have much of a relationship with the U.S. government. Jacob Appelbaum is somewhat famous for his work with Julian Assange on WikiLeaks, and his work on Tor, and security in general, his efforts to provide you with freedom and privacy. He is a guy who, it was recently revealed in the Wall Street Journal, the U.S. government has been spying on. And he is on our team, he's on our technical advisory committee. He's one of the people we go to for help when we need to understand security on the box. So right now our position with the American government is that we're not really related, except insomuch as we are a bunch of people who really care about these issues, which maybe occasionally makes us targets. Which gives us a reason to use a box like this. Coupled with that, there is a program in America - you were talking about Hillary Clinton saying she was going to encourage technologies that will spread democracy. So the way America encourages things is by spending money on it. That's our typical way to support programs. We fund different things. We don't generally have feel-good campaigns, we just pay people to make good work, or try to. So the U.S. State Department has a program to provide funding for projects like the FreedomBox. We have not applied for that funding. I don't know if we will. However I do know that they have given funding to some very good and genuine projects that are run by people I trust, so I try not to be cynical about that. I imagine at some point that through a direct grant or a sub-grant or something, some State Department money might support some aspect of work that is related to us. I mean, we might take work from a project that is State Department funded, just because it's quick work. Have I answered your question? Yes, thanks. Hi, well, you always have tension if you talk about privacy since 9/11, you know, I heard this in America very often, "we have to be careful", everybody is suspicious and stuff. So how do you react when people like the government say, well, you are creating a way to support terrorism, whatever? That's a good question, and it's a common question. Frankly, every time I do this talk, it's one of the first questions that comes up. The answer is really simple. The fact is, this box doesn't create any new privacy technology. 
It just makes it easier to use and easier to access. People who are committed to terrorism or criminal activity, they have sufficient motivation that they can use the technology that exists. Terrorists are already using PGP. They're already using Tor. They're already using stuff to hide their data. At best we are helping stupid terrorists. LAUGHTER Granted, I'm not excited about that, but I don't think that's a sufficient reason to deny common people access to these technologies. And more importantly than the fact that terrorists and criminals have access to this technology, governments have access to this technology. The largest corporations have access to this technology. Every bank... the same encryption methods that we are using are the stuff that protects trillions of dollars in value that banks trade every day. This is technology that is currently being used by everyone except us. All we're doing is levelling the playing field. The same technology that hides data from us, that causes a complete lack of transparency in a downward direction, we can use to level the playing field a little bit. More questions? Thank you for your presentation. Could we add to the challenges that maybe we could produce it somewhere that isn't a communist dictatorship? Because I saw the label "Made in China", so I think it is just a paradox to produce something like the FreedomBox in this country, and I would also like to be independent from producing in China. So that's just something for a challenge, I think. That's a really good question and important point. So, we're not a hardware project. Hardware is really, really hard to do right and do well. We have some hardware hackers on our project. Our tech lead Bdale Garbee does amazing work with satellites and model rockets and altimeters, and he's brilliant. But this is not a hardware project. All we can do is use hardware that already exists. When the world makes hardware in places other than China, we will use that hardware. Right now, we don't have a lot of options. And we're not going to deny everybody privacy because we don't have a lot of hardware options. When we have those options we'll take them. In the meantime, if you are a person who really cares about this issue, don't buy a FreedomBox. Take the software, go find a computer that isn't made in China, LAUGHTER and go put the software on that box. If you want a solution that is run on computers that don't exist, I can't help you with that. If you want a solution that runs, I might be able to help you with that. But yes, I agree that that is a real issue, and we are thinking about that. We believe that there is an open hardware project story here. And one thing we've been doing is working with the manufacturer of the box, to get the code free, to make sure we know what's in it, so that there are no binary blobs in the box, so we have some assurances that we actually do have freedom. At some point, though, we do believe that somebody will solve the open hardware problem for us. We're not going to be the hardware project, but there are people trying to do this in an open way. Raspberry Pi for example. They're not quite right for our use-case, but those kinds of projects are starting to exist, and they're starting to be really good. In a few years, maybe that will be the thing we move on to. Now, I'm guessing that even an open hardware project like Raspberry Pi does their manufacturing in a place like China. And that's a big problem. 
When the world is ready with a solution to that, we will be ready to accept that solution and adopt it of course. Any more questions for James? or statements? This is more of a statement than a question I guess, but should the FreedomBox start being made in China there will be a lot more of them coming out of the back door and enabling privacy for people that don't get it, but also as soon as it starts getting manufactured I'd imagine you may, because you're not in it for the money as you told me last night, you may be looking forward to how easy it will be to copy, and with things like MakerBot, making a case, making a bot is easy, you can do it in your bedroom now with 3D printers. So there will be a bag of components, a board, made by some online place that is really into this, and you can assemble these at home. So you've just got to get it out there first I think, and lead the way. Yeah, I think that's quite right in that we are not the only place to get a box like this. I mean, we're putting it on a specific box to make it easy, but there will be lots of places that make boxes, and hopefully there will be places where working conditions are acceptable to everybody. And at that point you can make your own boxes, you can put them on any box you can find. The point of Free Software is not to lock you into a service, a technology, a software, a structure or a box. We're not going to lock you into anything, that's one thing we're extremely clear about. If you manage to make a box like this at home, I would really love to hear about it. If you can spin up a MakerBot to make a case, and you have a friend who can etch boards, and you make a box like this at home, that would be big news and a lot of people would want to know about it. More statements or questions? Yes... So, if you lose your box and get a new one, how is it going to reauthenticate to the boxes of your friends? I think I didn't get that one. Yeah, so, the good thing about friends is that they don't actually know you by your PGP key. Sorry, I didn't specify it, if you want a grand security and you want distribution to more than 12 friends, so let's say a hundred, and they're like, all over the world. You are probably going to reach them through the internet to get your key parts back, and you are probably not going to be able to use the FreedomBox to get a new one because it has to be authenticated. So how do you do? Well, you at that point... if you don't have a FreedomBox, the FreedomBox can't provide you with a solution to that problem. What you're going to have to do, is perhaps call your friends. Have a conversation with them, convince them that you are the person you say you are. Reference your shared experiences, maybe they know your voice, maybe they just know who you are by the way that you act and the way that you talk. There's not going to be any one way that we get our keys back. If you lose your key, yeah, we're not saying that's never going to be a problem. And I wouldn't recommend splitting your key up among a hundred people, because that's a lot of people to ask for your key back. The mechanism I have in mind is not that you get a little bit of your key from everyone you know, it's that you spread out the key among a lot of people, and you need a certain number of those people. So maybe it's five of seven of your friends. So you give seven people the key, but any five of them could give you a whole key. So in case you can't reach somebody you can still manage to do it. 
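For the "any five of seven friends" scheme, here is a toy sketch using Shamir secret sharing over a prime field. A real deployment would use a vetted implementation, and the FreedomBox plans may differ; this is only meant to show why five shares recover the key while four reveal nothing useful.

```python
# Sketch: split a key into 7 shares so that any 5 of them reconstruct it.

import random

PRIME = 2**521 - 1  # a Mersenne prime comfortably larger than the key


def split_secret(secret: int, needed: int, total: int):
    """Split `secret` into `total` shares, any `needed` of which recover it."""
    coeffs = [secret] + [random.SystemRandom().randrange(PRIME)
                         for _ in range(needed - 1)]

    def f(x):
        # evaluate the random polynomial with constant term `secret`
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, f(x)) for x in range(1, total + 1)]


def recover_secret(shares):
    """Lagrange interpolation at x = 0, modulo PRIME."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


# Usage: give seven friends one share each; any five can rebuild the key.
key = int.from_bytes(b"an example private key seed", "big")
shares = split_secret(key, needed=5, total=7)
assert recover_secret(random.sample(shares, 5)) == key
```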
And we can make that access control as fine-grained as we want, but a hundred would be overwhelming. We wouldn't do that. Sure, you could do it if you wanted, but I don't think you'll have a hundred friends you could trust that much. Maybe you do; I don't. More questions, statements? Yes?
Erm, it's just a wish... but have you thought about the idea of using the FreedomBox to create a community where you can exchange not only data but also products or services, so that it would maybe change the system?
One of the things we want to do with the FreedomBox is create a thing that looks a lot like your current social networking, minus the advertising and the spying. A way to talk to all your friends at once. Once you have a place, a platform, where you can communicate with your friends, you can build on that platform and you can create structures like that. If we make a thing that has programmable interfaces, so you can make apps for it, you can make an app like that, if that's important to you. What people do with the communication once they have it, we don't have any opinions about. We want them to do everything that's important to them. And I think something like that could be important, and yeah, it would be amazing if that were to emerge.
Some things, I believe, are easier to do in a centralized architecture than a decentralized one, for example search, or services that require a lot of bandwidth. I don't see how you can run something like YouTube on the FreedomBox. So is your utopian vision one where everything is decentralized, or is it OK to have some centralized pieces in a future network?
Look, if you're going to grant me my utopia then of course everything is decentralized. But we don't live in a utopia, and I don't have magic. We actually have in our flowchart a box labeled "magic routing", because routing is hard to do in a decentralized way... You need someone to tell you where the IPs are. And that's hard to do in a decentralized way. We haven't solved it, and we don't think we're going to fully solve it. We hope someone else solves it, first of all. But second of all, we don't know where the compromises are. Some things are not possible to decentralize. We're going to decentralize as much as we can, but we're not committing to doing anything impossible. If you can't run YouTube off this box, which I disagree with by the way, then you won't, because it's impossible. If you want to run YouTube on this box, you turn all your friends into your content delivery network, all your friends parallelize the distribution, and you share the bandwidth. It's ad-hoc, BitTorrent-like functionality. Yes, that technology doesn't exist yet, I just made all that up, but we can do it. The parts that are hard, though, things like the routing, there will be real compromises. There will be real trade-offs. There will be places where we'll say, you know what, we have to rely on the DNS system. Everybody in this room knows that the DNS system has some security problems, some architectural problems that make it a thing we would ideally not have to rely on. But you know what? This project is not going to be able to replace DNS. There are plenty of alternative DNS proposals out there, but we are not going to just chuck the old DNS system, because we want people to be able to get to the box even if they don't have a box. We want you to be able to serve services to the public. We are going to use a lot of structures that are less than ideal. We're assuming that TCP/IP is there...
in the normal use case you're using the internet backbone to do your communication. The mesh routing story we talked about is not how you do your normal use. That's an emergency mode if there's a crisis, political instability, a tsunami, if you can't get to your regular internet because it has failed you in some way, because it has become oppressive or inaccessible. Then you would use something like the mesh network. But in the normal course of business, you are using a thing that is less than ideal, and that's a trade-off. We can't, as a project, protect you from everything. We are going to look for the places where we can provide effective protection. We are going to try to make the limits of that protection clear. And we're going to give you everything we can. And then, as we move forward, when opportunities to solve new problems present themselves, we'll take them.
Well, I have to add that earlier, when we had the other talk, it was unfortunately in German, so you couldn't understand a lot.
I didn't understand it, but I could tell that it was happening at a very high level of technical competence and that there was a lot of good information there. And I'm really hoping that you'll take the video of it and put it up on universalsubtitles.org, or some other service where people can subtitle it. And hopefully there'll be an English version and I'll get to see it. I think there was a lot of really good information in there.
What's universalsubtitles.org?
Universalsubtitles.org is a great website. It's kind of like, you put a video up, and anyone can add subtitles to as much or as little of it as they want. And then other people can change the subtitles, and you can do it in as many languages as you want. So you don't have to ask someone for a favour, "hey, will you subtitle my video?" that's 20 minutes long or an hour long. You tell a community of people "we need help subtitling", and everyone goes and subtitles three minutes in their favourite languages. It's a very effective way to crowdsource subtitling, and it's a very effective way to just share information. We have a lot of videos with good information that are locked into languages that not everyone speaks. So this is a way to get around that. As FreedomBox, we use that project. And I believe, if I'm not mistaken, I haven't looked in a while, that it's all Free software that they are using. So you can download it and start your own if you want.
So back to my previous question: in the talk in the afternoon we heard about mesh networking, we talked about that, and it's actually not just being used in emergency situations; people are really using it. And especially the philosophy that everyone becomes part of the net, not just as a consumer but as someone providing part of the net. It really is like that: they can share data among each other, they don't necessarily need to go out into the internet. So I would imagine that with the FreedomBox, with mesh networking, we could essentially create a large network of many, many people using it. We also talked about mesh networks like FunkFeuer in Graz or Vienna, but it would be interesting to get them onto mobile devices, so that you could walk through the street, where theoretically people have these devices, and it would automatically mesh and connect you. So FreedomBox, if applied to that... you told me this interesting example, that you could screw them onto light posts on the street, so maybe elaborate on that; maybe it could have an effect and give a lot of coverage.
The reason why we currently envision mesh this way, and no decisions have been made, right, this is just the way we think about it when we talk to each other, the reason we think mesh networking is not your daily mode of use is that the performance degradation is not acceptable to most end-users. If mesh networking reaches the point where it is acceptable, if you're in a place where there are enough nodes and you have a density that lets you move around, then sure, that can make a lot of sense. But a lot of people are not near a lot of FreedomBoxes, and they're going to need the regular internet. So yeah, we think mesh will be great where you have that density, when the mesh technology is mature. When that happens, we could have the easiest access to municipal wifi by using the power in all the street lights. Put a FreedomBox up in the top of every street lamp. Unscrew the light bulb, screw in the FreedomBox, and screw the light bulb back on top. So you still get light, we're not going to plunge you into darkness. You still get light, but then you have a mesh node. Right there. And you could do every third or fourth street light downtown, and you could cover an area rather effectively. It is a way to get simple municipal wifi without running any fibre. And every time you have fibre you can link to it. Any time you're near fibre you can link to it, and you'll get your information out of that little mesh and into the regular network. We could have municipal wifi with much lower infrastructure costs than most people currently imagine when they think of municipal wifi. And we can do it through mesh nodes. And if we did it through mesh nodes, we would be providing that service not only to people who have FreedomBoxes; it just looks like wifi, it just looks like a regular connection. You might need to do some fancy hopping, but it's not... the mesh boxes themselves will do the fancy hopping; your phone won't have to do it.
While we are talking about phones, I want to say that I'm not sure how phones fit into the FreedomBox. I'm pretty sure there is a way that phones fit into FreedomBoxes, but you can't trust your phone.
With the so-called smartphones it's not really a phone but a little computer, no?
Yes, your phone, a smartphone, is a little computer, but it's not a computer that you can trust, because even if you replace the software on your phone with Free software, it's almost impossible to actually replace all the binary drivers, it's almost impossible to go all the way down to the metal. It's very hard to get a phone that is completely trustworthy all the way down to the bottom of the stack. So that's a problem we haven't quite figured out how to solve. And pretty soon it's going to be impossible to put Free software on phones. The days of jailbreaking your iPhone and rooting your Android phone might very well come to an end. There is a proposal right now called UEFI. It's a standard. We currently use EFI; this would be UEFI. I don't know what it stands for, it's a new thing. And what this proposal says is that before the BIOS will load a bootloader on your computer, that bootloader has to authenticate to the BIOS. It has to be signed by someone the BIOS trusts, someone the BIOS manufacturer trusts. And the person who puts the BIOS in your phone can decide who it trusts, and they can decide they don't trust anyone except themselves.
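For readers who want to see where their own machine stands, here is a small sketch, not part of the FreedomBox project, of how a Linux system can report whether this kind of signed-bootloader enforcement (UEFI Secure Boot) is switched on. It assumes efivarfs is mounted at its usual path; the GUID is the standard EFI global-variable namespace.

    # A hedged sketch: read the UEFI SecureBoot variable on a Linux system.
    # Assumes efivarfs is mounted at /sys/firmware/efi/efivars.
    from pathlib import Path

    SECUREBOOT_VAR = Path(
        "/sys/firmware/efi/efivars/"
        "SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c"
    )

    def secure_boot_enabled():
        """True if Secure Boot is enforced, False if not, None if unknown."""
        try:
            data = SECUREBOOT_VAR.read_bytes()
        except OSError:
            return None  # no EFI variables at all, e.g. a legacy BIOS boot
        # efivarfs prepends a 4-byte attribute header; the payload's last
        # byte is 1 when the firmware will refuse unsigned bootloaders.
        return bool(data[-1]) if len(data) > 4 else None

    if __name__ == "__main__":
        state = secure_boot_enabled()
        print({True: "Secure Boot is on",
               False: "Secure Boot is off",
               None: "Secure Boot state unknown"}[state])

On many machines of that era this simply reports that Secure Boot is off or unknown; the concern James raises is that firmware vendors are increasingly in a position to flip that answer for you.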
If Apple sells you an iPhone with a BIOS that requires a signed operating system, it might be very hard for you to get another version of the operating system on there. The proposals for this stuff are really in the realm of laptops and computers, that's where it's starting, but believe me, technology spreads. And if you want to be able to put Linux on a computer that you buy, on a laptop you buy, very soon you might have a very difficult time doing that. The standard is there, and the companies paying attention to it are not paying attention to it for our purposes. They want to make sure that they can control what is on your computer. So this is, you know, another political fight that we're going to engage in, not the FreedomBox, but the community. We're going to have to have this fight. UEFI. Look it up. Start thinking about it. This is going to be a big piece of the puzzle for freedom in computing over the next few years. We're going to have some problems and we're going to have to find some solutions.
But wouldn't such an initiative, wouldn't that create a good market for companies who would actually supply Linux on such devices, on the phone and on the laptop market? I'm sure there are companies supplying that.
Absolutely. And if the market in freedom were good enough to support large-scale manufacturing and all that other stuff, then we might get that. And we might get that anyway. I mean, the standard will include as many keys as you want, so we might get the freedom. But the manufacturers will have a really convenient way to turn the freedom off. I think there will be a lot of boxes where you will have freedom. But there will also be a lot where right now we think we can get Free software onto them, and where we won't be able to anymore. It's going to be a narrowing of the market. I don't think our freedom is going to completely disappear from devices. But with a lot of devices, if you buy the device without thinking about freedom, assuming you can have it, you might get it home and discover that you can't.
Ok, we want to give the floor again to the audience for more questions or statements. Ok, there in the back, one more. Yeah, one more time, so...
Nowadays, when you can hardly really protect your PC, laptop, whatever, against malware... isn't it really a red carpet for hackers? If you have social networks and circles of friends, and one person gets some malware on his PC or mobile device, whatever, and he has a FreedomBox and authenticates to his friends, and that state is treated as secure, wouldn't that open doors?
Sure, well, human error is not something we can control for. But someone who has a key that you trust is not necessarily someone who you let run arbitrary code on your FreedomBox. You might trust them to the point of passing messages with them, and trusting who they are and what they say, but you don't necessarily trust the technology that they have and the code that they have to be free of malware. You'll still have to do all the things you currently do. Right now, if somebody sends you a file, it could have malware in it. We're not making that easier, or better, or more likely to happen. I think what we are doing is completely orthogonal to that problem. At the same time, if we were to have email services on the box, and you know, we're not quite sure what the email story of a box like this looks like, we probably would want to include some sort of virus scanning or spam catching, all the usual filtering tools, to give you whatever measure of protection might currently exist.
But the fact that someone has a key and you know who they are, I don't think that will ever be the security hole. Or at least we really hope we can make it so it's not. If we fail in that, then we've missed a trick.
Ok, any more statements or questions? Ok, so, James, my last question would be... You can actually buy the box right now?
Yes.
From a company?
Yes.
Maybe you can supply that information. But the software is still being developed?
Yes.
Can you give an estimate of the timeline of your project, or the next milestones?
Sure. So, the boxes are manufactured by a company called Globalscale; they're about 140 US dollars. There is a slightly older model called the SheevaPlug that is about $90. It does pretty much everything the DreamPlug does. It has some heat-sinking issues, but it's a pretty good box as well, so if the price point matters to you, you can get last year's model and it'll serve you just fine. The software: right now we have a bare Linux distribution. We spent a lot of time getting the binary blobs out of the kernel and making it installable onto this hardware target. We have a Jabber server, Prosody, that we are modifying to suit our needs. And that should be ready, time-frame, weeks. Some short number of weeks. The Privoxy server, the SSH forwarding: some short number of months. So that is our roadmap for the short-term future: Jabber, SSH forwarding, browser proxying. We are also working on the interface, so we're going to have an interface that you can actually control some of these services with. And the first thing we're doing with that interface is probably allowing you to configure this box as a wireless router. So it can become your wireless access point if you want it to be. And your gateway, of course. So: user interface in one vertical; SSH forwarding and browser proxying a little bit further out; and a little bit closer, Jabber, XMPP secure chat. And once we have that stack, we believe that we're going to build upwards from XMPP towards perhaps something like BuddyCloud. We're seriously looking at BuddyCloud and seeing what problems it solves for us in terms of actually letting users group themselves in ways that they can then do access control and channels and things of that nature.
And are you actually in contact with the hardware company producing the servers?
Yeah, we've had a number of conversations with them. They've agreed that when our code is ready, this is something they are very interested in distributing. More importantly, we've had a lot of conversations with them about freedom. About why we do what we do, the way we do it. And how they need to act if they want to distribute code for us and work with our community. And what that means is we're teaching them how to comply with the GPL, and we're teaching them how to remove the binary drivers, and in fact we're doing some of that for them.
But they're Chinese, right?
No. No, Globalscale is not a Chinese company. Their manufacturing is in China, but they're not a Chinese company. And we're also talking to Marvell. Marvell makes the system-on-a-chip that goes onto the boards that Globalscale is integrating into their boxes, and we're talking to them about what they can do to better serve the needs of our community. So a large part of our effort is to try to convince manufacturers to make hardware that suits our needs. This box is a thing that they developed, they invented, before they ever met us, before they ever heard of us.
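As a concrete picture of the "SSH forwarding, browser proxying" item on that roadmap, here is a small sketch of how a user can already tunnel browser traffic through a box they control using SSH dynamic forwarding. The hostname, username and port below are illustrative assumptions, not FreedomBox defaults, and this is a plain SSH trick rather than the project's own interface.

    # A sketch of browser proxying over SSH dynamic forwarding.
    # BOX and SOCKS_PORT are hypothetical values, not project defaults.
    import subprocess

    BOX = "fbx@freedombox.local"   # assumed user@host for a box you control
    SOCKS_PORT = 1080              # local port the browser will use as a SOCKS proxy

    def open_tunnel():
        """Open a local SOCKS proxy that routes traffic through the box."""
        subprocess.run(
            [
                "ssh",
                "-N",                   # no remote command, forwarding only
                "-D", str(SOCKS_PORT),  # dynamic (SOCKS) port forwarding
                BOX,
            ],
            check=True,                 # raise if the tunnel cannot be set up
        )

    if __name__ == "__main__":
        # While this runs, point the browser's SOCKS5 proxy at localhost:1080
        # and web traffic will exit from the box instead of the local network.
        open_tunnel()

This is roughly the kind of plumbing the roadmap item points at, with the FreedomBox interface meant to hide these steps from the user.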
And if we can get them enough business, if making FreedomBoxes and putting our software on the box enables them to sell more boxes, they will be very happy, and when they design the next generation, not the next generation of the DreamPlug but the next generation after whatever they're designing now, so we're talking a couple of years from now, we can say to them: look, you're selling a lot of boxes because you're making a thing that serves the free world very well. Remove the eighth-inch audio jack because our people don't need it. Add a second wifi radio. Put antenna ports on it. This box can go from something that looks really good for our purpose to something that looks amazingly good for our purpose. And that will require scale. And what that means is that the FreedomBox becomes a wedge for making better hardware for everyone. But it's not just the FreedomBox. The Tor router project is also focused on the DreamPlug. They've also decided this is a good box for their purpose. If you are making a box that is kind of like a FreedomBox but isn't the FreedomBox, because it's more specialised to what you want it for, think about the DreamPlug as a hardware target. And let us know, so that when we go to the company, we can say: look at all the business you are getting by being people that serve the Free world. And then, hopefully, we can convince them to make boxes that better serve the Free world. And that's not a fantasy. We are having those conversations with them, and they are very receptive. So I am pretty happy about that aspect of what we do.
And my last question would be... since everything is turning mobile now, it's like we have these computers with a phone attached; the phone is just a small application on these devices. Is there any plan or idea or project to, say, have a FreedomPhone or a free mobile device?
So the way you connect to this box is kind of like how you connect to your router: port 80, browser. But another way you could do it would be an app on your cellphone that bluetooths to the box. I don't actually think the box has bluetooth, but you know, an app on your cellphone that talks to the box over the network, say. That's possible, and we're thinking about that. We're thinking about what that looks like for the large population out there that doesn't have computers. There's an awful lot of people who only have cellphones; they don't have computers. And we want them to have freedom too. So figuring out how we can use a cellphone to talk to the box is a future problem. We're not working on it right now, but we're certainly talking about where it fits into the roadmap. And that's why we are concerned about whether or not you can trust your phone. Because if you can trust your FreedomBox, but not the thing you use to access it, then you don't really have the privacy you think you have. So figuring out whether you can trust your cellphone is a big part of the puzzle. It's a big thing that we don't know how to do yet.
So let me make a little advertisement for another interesting project. There is a Spanish development, I think it is also produced in China, but it's called the Geeksphone. And they have a compatible Android installation by default, and they probably have a similar philosophy of keeping the hardware open. So maybe there is a new cooperation on the horizon.
Oh yeah, we love projects like that. I don't know a lot about their project, but I have heard of it and it is on my list of things to look into.
I would love to see that succeed, that would be excellent.
Well, James, thank you for your presentation. I think it was really interesting. And thank you for coming. James will be back on this stage at 7pm when we have our final discussion on 20 years of the World Wide Web. Thank you, James, for coming. APPLAUSE