0:00:08.220,0:00:11.122
I'm very proud to have as a guest here, coming from the United States
0:00:11.122,0:00:14.861
to Elevate, James Vasile of the Freedom Box Foundation
0:00:14.861,0:00:20.619
James Vasile is working on a multitude of projects
0:00:20.619,0:00:23.568
like Apache, I think, Joomla and many others. He is also a lawyer,
0:00:23.568,0:00:31.347
and he also works with the Freedom Box Foundation and the Free Software Foundation.
0:00:31.347,0:00:37.895
He's going to present one of the, in my opinion, most revolutionary projects I've seen in recent years
0:00:37.895,0:00:43.236
as we can see here, a little small box, the Freedom Box.
0:00:43.236,0:00:48.042
Yeah, erm, James is going to do a presentation and then we're going to
0:00:48.042,0:00:50.294
be open for questions and then sit down for a talk
0:00:50.294,0:00:53.731
so James, I give the floor to you.
0:00:53.731,0:00:56.564
Thank you, Daniel.
0:00:56.564,0:01:03.135
I've been here at the Elevate festival for a few days now
0:01:03.135,0:01:10.101
I've been attending the talks and the films and the music
0:01:10.101,0:01:15.743
and this has been an amazing place to see all these different ideas coming together
0:01:15.743,0:01:21.223
I want to say thank you to Daniel for organizing so much
0:01:21.223,0:01:23.615
of this. To Joseph as well.
0:01:23.615,0:01:30.349
To Daniel especially for making a big effort to get me out here,
0:01:30.349,0:01:33.484
making it possible for me to come out here and being such a gracious host.
0:01:33.484,0:01:36.316
Thank you Dan, I really appreciate it.
0:01:36.316,0:01:42.841
APPLAUSE
0:01:42.841,0:01:52.524
A long time ago, in the beginning of the internet
0:01:52.524,0:01:56.657
When we first started using the internet as a way to talk to each other
0:01:56.657,0:02:00.651
We mostly talked directly to each other, right?
0:02:00.651,0:02:05.086
Think about how email works, on a technical level
0:02:05.086,0:02:10.009
You take a message, you hand it off to your mail transport agent
0:02:10.009,0:02:14.653
It sends it through a network, directly to the recipient.
0:02:14.653,0:02:16.905
It hops through some other computers, but fundamentally
0:02:16.905,0:02:21.084
you use the network to talk directly to the other computer,
0:02:21.084,0:02:26.309
the other computer where the recipient gets his or her mail
0:02:26.309,0:02:30.489
It was a direct communication medium.
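The direct path described above can be sketched with Python's standard library; the addresses and mail host below are illustrative placeholders, not anything from the talk.

```python
from email.message import EmailMessage

# Build a message exactly as a mail client would before handing it
# off to the mail transport agent (addresses are placeholders).
msg = EmailMessage()
msg["From"] = "sender@example.org"
msg["To"] = "recipient@example.net"
msg["Subject"] = "Hello"
msg.set_content("Direct, point-to-point mail.")

# A real MTA would now look up example.net's mail exchanger and
# connect to it directly, something like (not run here):
#   import smtplib
#   with smtplib.SMTP("mx.example.net") as smtp:
#       smtp.send_message(msg)
print(msg["To"])  # recipient@example.net
```

The point of the sketch is that nothing in the delivery path is a mandatory third-party service: the message goes from the sender's machine to the recipient's mail host.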
0:02:30.489,0:02:33.484
If you're old enough to remember a program called 'talk'
0:02:33.484,0:02:37.176
Talk was the first, sort of, interactive you type, they see it, they type, you see it
0:02:37.176,0:02:40.403
instant message application.
0:02:40.403,0:02:43.074
This again, was direct.
0:02:43.074,0:02:48.205
You would put their name and address into your program,
0:02:48.205,0:02:51.363
they would put theirs into yours, and you would just talk directly to each other
0:02:51.363,0:02:57.308
You didn't send this message through servers, through centralised technology.
0:02:57.308,0:03:02.091
From there, from those beginnings of talking directly to each other
0:03:02.091,0:03:07.733
we started to build communities, emailing directly to people.
0:03:07.733,0:03:10.705
But that was relatively inefficient.
0:03:10.705,0:03:17.254
Talking directly to people, one-to-one, works very well for one-to-one conversations.
0:03:17.254,0:03:19.506
But as soon as you want a group conversation
0:03:19.506,0:03:21.735
as soon as you want to find people reliably who you haven't
0:03:21.735,0:03:26.774
already set up contacts for, exchanged email addresses and such
0:03:26.774,0:03:28.724
you run into friction, you run into problems
0:03:28.724,0:03:34.018
So the solution to that, was to create more centralised structures
0:03:34.018,0:03:37.896
and we did this with IRC
0:03:37.896,0:03:41.472
IRC is a place where instead of talking directly to the people we're trying to reach
0:03:41.472,0:03:45.210
we take a message, and we send it to an IRC server
0:03:45.210,0:03:46.696
a third party
0:03:46.696,0:03:48.484
and the IRC server then copies that message
0:03:48.484,0:03:51.201
to all the people who we might want to talk to.
0:03:51.201,0:03:54.336
We developed mailing lists, listservs
0:03:54.336,0:03:58.214
And again, this was a way where we would take our message
0:03:58.214,0:03:59.375
and hand it to a third party
0:03:59.375,0:04:03.392
A mail server, that is not us and not the person we're trying to talk to
0:04:03.392,0:04:05.923
and that mail server would then echo our communication to
0:04:05.923,0:04:07.571
all the people we want to talk to
0:04:07.571,0:04:10.381
and this was great, because you didn't have to know the
0:04:10.381,0:04:12.563
addresses of all the people you wanted to talk to
0:04:12.563,0:04:15.373
You could just all 'meet' in a common place
0:04:15.373,0:04:19.529
We all meet in an IRC chatroom, we all meet on a listserv
0:04:19.529,0:04:23.523
And there were a lot of IRC channels, and a lot of IRC servers
0:04:23.523,0:04:25.311
and a lot of mail servers
0:04:25.311,0:04:27.285
all across the internet
0:04:27.285,0:04:28.887
A lot of places to do this communication.
0:04:28.887,0:04:32.463
And if you didn't like the policies or the structures or the technology
0:04:32.463,0:04:34.274
of any one of these service providers
0:04:34.274,0:04:36.503
these IRC servers, or these list servers
0:04:36.503,0:04:38.454
you could just switch, you could choose to run your own.
0:04:38.454,0:04:40.102
It was very simple.
0:04:40.102,0:04:46.975
This infrastructure is not hard to create, it's not hard to run, it's not hard to install.
0:04:46.975,0:04:49.669
And so a lot of people did run, create and install it.
0:04:49.669,0:04:53.082
There were a bunch of IRC servers, there were a bunch of different listserv packages
0:04:53.082,0:04:57.842
But as we've moved forward in time,
0:04:57.842,0:05:01.395
we've started to centralise even more.
0:05:01.395,0:05:05.366
And, you can fast-forward to today
0:05:05.366,0:05:07.455
where we're channeling our communication
0:05:07.455,0:05:10.567
through fewer and fewer places.
0:05:10.567,0:05:13.702
And we are making structures that are more and more central
0:05:13.702,0:05:15.629
and more and more over-arching
0:05:15.629,0:05:20.830
So, from the IRC way of talking to each other
0:05:20.830,0:05:25.451
we moved to instant messaging applications.
0:05:25.451,0:05:28.144
AOL Instant Messenger, ICQ,
0:05:28.144,0:05:31.372
those were the early ways to do it
0:05:31.372,0:05:33.299
and there were only a few of them
0:05:33.299,0:05:36.852
MSN had its messaging system, Yahoo had its messaging system
0:05:36.852,0:05:39.383
and when people wanted to talk to each other now,
0:05:39.383,0:05:41.333
they were using third-parties again.
0:05:41.333,0:05:43.144
But they were only using a few third parties.
0:05:43.144,0:05:46.883
And if you wanted to switch providers,
0:05:46.883,0:05:49.414
you would leave almost everyone you knew behind,
0:05:49.414,0:05:51.364
your entire community behind.
0:05:51.364,0:05:53.013
And so it becomes harder to switch.
0:05:53.013,0:05:54.662
There are fewer options
0:05:54.662,0:05:58.098
and the cost of switching leaves more and more people behind
0:05:58.098,0:06:00.768
So you started to have lock-in.
0:06:00.768,0:06:05.529
You started to have people who were chained to their methods of communication
0:06:05.529,0:06:07.874
because the cost of losing your community is too high.
0:06:07.874,0:06:10.126
And so if you don't like the technology, or you don't like the policy
0:06:10.126,0:06:12.077
or you don't like the politics
0:06:12.077,0:06:13.261
or if they're trying to filter you
0:06:13.261,0:06:14.863
or censor you
0:06:14.863,0:06:16.070
you don't have a lot of options.
0:06:16.070,0:06:18.601
The cost of leaving is so high that you might stay.
0:06:18.601,0:06:21.411
People do stay. And they accept it.
0:06:21.411,0:06:25.265
And we went from that small basket of providers of this kind
0:06:25.265,0:06:27.053
of communication technology
0:06:27.053,0:06:29.143
to an even more centralised structure
0:06:29.143,0:06:33.625
where there is effectively only one way to reach all our friends,
0:06:33.625,0:06:36.040
in each mode of communication,
0:06:36.040,0:06:37.502
Facebook.
0:06:37.502,0:06:38.687
And Twitter.
0:06:38.687,0:06:41.403
These two services rule everything.
0:06:41.403,0:06:43.493
And I'm not going to stand here and say Facebook is evil
0:06:43.493,0:06:45.142
and that Twitter is evil
0:06:45.142,0:06:49.043
What I want to say is that having one place
0:06:49.043,0:06:50.645
where we do all our communication
0:06:50.645,0:06:53.176
leaves us at the mercy of the policies of the people
0:06:53.176,0:06:55.544
that control the infrastructure that we are chained to,
0:06:55.544,0:06:57.750
that we are stuck using, that we are locked into.
0:06:57.750,0:07:02.232
You can't leave Facebook without leaving everybody you know
0:07:02.232,0:07:05.645
because everybody you know is on Facebook.
0:07:05.645,0:07:09.523
I was not a Facebook user.
0:07:09.523,0:07:11.171
I was against Facebook.
0:07:11.171,0:07:14.469
I thought it was bad to centralise all our communication in one place.
0:07:14.469,0:07:15.653
I didn't like the privacy implications,
0:07:15.653,0:07:18.207
I didn't like Facebook's censorship
0:07:18.207,0:07:21.783
of things like pictures of nursing mothers.
0:07:21.783,0:07:22.967
I don't think that kind of thing is obscene,
0:07:22.967,0:07:25.498
and I don't think Facebook should have the ability to tell us
0:07:25.498,0:07:27.565
what we can share with our friends.
0:07:27.565,0:07:29.074
So I thought those were bad policies,
0:07:29.074,0:07:32.464
and I reacted to that by not joining Facebook. For years.
0:07:32.464,0:07:35.576
All my friends were on Facebook.
0:07:35.576,0:07:41.682
I joined Facebook late last year. November.
0:07:41.682,0:07:48.207
Because in November, a friend of mine passed away.
0:07:48.207,0:07:50.018
His name was Chuck. He was a brilliant man.
0:07:50.018,0:07:55.243
And he lived a lot of his life online.
0:07:55.243,0:07:58.215
He was on Facebook, and he shared things with friends on Facebook.
0:07:58.215,0:08:01.071
When he passed away I realised I hadn't communicated with him in a while,
0:08:01.071,0:08:02.720
I hadn't really talked to him in a while.
0:08:02.720,0:08:05.552
And the reason I hadn't was because I wasn't
0:08:05.552,0:08:08.083
communicating with him in the place he communicates.
0:08:08.083,0:08:10.034
I wasn't meeting him where he was, I wasn't on Facebook.
0:08:10.034,0:08:12.402
I was missing out on something huge.
0:08:12.402,0:08:15.653
That's the cost of not being there.
0:08:15.653,0:08:17.441
And so I joined.
0:08:17.441,0:08:19.368
Because I decided that as strong as my beliefs were,
0:08:19.368,0:08:21.296
it was more important to me to be there with my friends and
0:08:21.296,0:08:23.084
to talk to my friends.
0:08:23.084,0:08:24.570
That's the power of lock-in.
0:08:24.570,0:08:27.240
Me, a person who cares, as much as I do,
0:08:27.240,0:08:31.048
who cares enough about these issues that I do something like this
0:08:31.048,0:08:32.975
I got locked into Facebook. I'm there now.
0:08:32.975,0:08:35.344
That's how I talk to a lot of my friends, whether I like it or not
0:08:35.344,0:08:38.734
I am locked into Facebook.
0:08:38.734,0:08:42.774
You know, I'm also on Diaspora. But my friends aren't on Diaspora.
0:08:42.774,0:08:46.814
This sort of lock-in creates a situation where
0:08:46.814,0:08:51.133
we have one arbiter of what is acceptable speech,
0:08:51.133,0:08:53.223
whether we like it or not.
0:08:53.223,0:08:55.034
We're free to the extent,
0:08:55.034,0:08:56.218
only to the extent,
0:08:56.218,0:08:57.263
that they give us freedom.
0:08:57.263,0:08:59.051
And that to me isn't freedom.
0:08:59.051,0:09:01.443
That to me is accepting what you're given.
0:09:01.443,0:09:04.136
It's the exact opposite of making your own choices.
0:09:04.136,0:09:08.641
The exact opposite of self-determination.
0:09:08.641,0:09:13.564
All of our problems in communication can be traced
0:09:13.564,0:09:16.977
to centralized communications infrastructure.
0:09:16.977,0:09:22.620
Now, I've sort of told this story at the social level,
0:09:22.620,0:09:25.870
in the way that we're talking about how to talk to your peers
0:09:25.870,0:09:28.703
and your friends on the internet.
0:09:28.703,0:09:33.765
But this story also exists when we think about relying on the pipes,
0:09:33.765,0:09:38.247
relying on the hardware, the technical infrastructure behind the software.
0:09:38.247,0:09:43.471
We rely on internet backbones,
0:09:43.471,0:09:45.700
we rely on centralized cellphone networks,
0:09:45.700,0:09:47.952
we rely on centralized telephone networks.
0:09:47.952,0:09:52.434
The people that control these networks have the ability
0:09:52.434,0:09:54.802
to tell us what we're allowed to say,
0:09:54.802,0:09:56.614
when we're allowed to say it.
0:09:56.614,0:09:59.748
They have the ability to filter us, to censor us, to influence us.
0:09:59.748,0:10:02.581
Sometimes they use that ability, and sometimes they don't,
0:10:02.581,0:10:04.671
and sometimes by law they're not allowed to.
0:10:04.671,0:10:06.482
But at the end of the day
0:10:06.482,0:10:09.268
the power doesn't rest in our hands.
0:10:09.268,0:10:11.521
The power, from a technological perspective,
0:10:11.521,0:10:13.587
rests in the hands of the people that operate the
0:10:13.587,0:10:15.654
networks.
0:10:15.654,0:10:20.414
Centralization doesn't just allow this sort of filtering and censorship.
0:10:20.414,0:10:23.525
There's another big problem with centralization.
0:10:23.525,0:10:26.056
The other big problem with centralization is that by
0:10:26.056,0:10:30.050
gathering all of our data in one place
0:10:30.050,0:10:33.510
it becomes easy
0:10:33.510,0:10:36.645
to spy on us.
0:10:36.645,0:10:39.338
So every time you go to a website
0:10:39.338,0:10:41.428
pretty much
0:10:41.428,0:10:45.445
the website includes, at the bottom of the page
0:10:45.445,0:10:49.927
a little graphic or invisible Javascript thing
0:10:49.927,0:10:53.061
that tells Google that you came to visit the page.
0:10:53.061,0:10:56.173
Eva goes to a website, and the website says
0:10:56.173,0:10:59.284
"Hey Google! Eva just came to my website!"
0:10:59.284,0:11:01.490
Every time she goes to a website, that happens.
0:11:01.490,0:11:04.764
And so Google effectively sits next to her and watches,
0:11:04.764,0:11:06.552
while she uses the internet.
0:11:06.552,0:11:07.899
Watches everything she does,
0:11:07.899,0:11:09.083
and everything she enters,
0:11:09.083,0:11:11.637
everything she looks at, everything she knows.
0:11:11.637,0:11:15.236
It's not just her search data, it's not just her Gmail.
0:11:15.236,0:11:19.253
It's the entire picture of her digital life.
0:11:19.253,0:11:22.086
In one place.
0:11:22.086,0:11:23.735
That's a pretty complete profile.
0:11:23.735,0:11:24.780
If you were able...
0:11:24.780,0:11:27.613
...imagine if somebody could sit next to you and watch
0:11:27.613,0:11:29.261
everything you did online,
0:11:29.261,0:11:31.351
imagine how much they would know about you.
0:11:31.351,0:11:33.278
That's how much Google knows about you.
0:11:33.278,0:11:36.250
Google knows more about you than you know about yourself,
0:11:36.250,0:11:39.942
because Google never forgets.
0:11:39.942,0:11:42.914
Google knows more about you than your parents,
0:11:42.914,0:11:43.959
than your partner,
0:11:43.959,0:11:46.885
Google knows your secrets, your worst secrets,
0:11:46.885,0:11:48.673
Google knows if you're cheating on your spouse
0:11:48.673,0:11:49.857
because they saw you do the Google search for the
0:11:49.857,0:11:54.641
sexually-transmitted disease.
0:11:54.641,0:11:56.707
Google knows your hopes and your dreams.
0:11:56.707,0:11:58.170
Because the things we hope and dream about,
0:11:58.170,0:11:59.354
we look for more information about.
0:11:59.354,0:12:00.701
We're natural information seekers.
0:12:00.701,0:12:02.489
We think about something, it fascinates us,
0:12:02.489,0:12:05.182
we go and look it up online. We search around.
0:12:05.182,0:12:06.970
We look around the internet, and we think about it.
0:12:06.970,0:12:11.011
And Google is right there. Following our thought process,
0:12:11.011,0:12:15.028
the thought process in our click trail.
0:12:15.028,0:12:19.347
That is an intimate relationship.
0:12:19.347,0:12:21.297
Right? Do you want an intimate relationship with Google?
0:12:21.297,0:12:21.901
Maybe you do.
0:12:21.901,0:12:25.500
I personally, don't.
0:12:25.500,0:12:28.774
But that's it, Google sits next to us and watches us use
0:12:28.774,0:12:30.121
our computers.
0:12:30.121,0:12:34.741
And if anyone actually did... if you had a friend who wanted
0:12:34.741,0:12:37.272
to sit next to you, or a stranger said I want to sit next to you
0:12:37.272,0:12:39.060
and just watch you use your computer all day,
0:12:39.060,0:12:41.406
you would use that computer very differently to the way you do now.
0:12:41.406,0:12:44.378
But because Google doesn't physically sit next to you,
0:12:44.378,0:12:49.068
Google sits invisibly in the box, you don't know Google is there.
0:12:49.068,0:12:51.158
But you do know, right?
0:12:51.158,0:12:52.644
We're all aware of this. I'm not saying any of you don't know,
0:12:52.644,0:12:55.755
especially in a room like this.
0:12:55.755,0:12:57.102
But we don't think about it.
0:12:57.102,0:12:58.751
We try not to think about it.
0:12:58.751,0:13:01.584
We are locked in, to the internet.
0:13:01.584,0:13:03.650
We can't stop using it.
0:13:03.650,0:13:05.299
And the structures that exist,
0:13:05.299,0:13:06.506
the infrastructure that exists,
0:13:06.506,0:13:09.014
has been slowly turned from
0:13:09.014,0:13:12.729
a means to allow us to communicate with each other
0:13:12.729,0:13:16.119
to a means of allowing us to access web services
0:13:16.119,0:13:19.811
in return for all our personal information so we can be bought and sold
0:13:19.811,0:13:21.599
like products.
0:13:21.599,0:13:24.966
That is the problem. That is the problem of centralization, of having one structure.
0:13:24.966,0:13:27.381
As soon as we put all that information in one place
0:13:27.381,0:13:32.025
they get complete profiles of us, complete pictures of you.
0:13:32.025,0:13:33.488
And that is a lot of information.
0:13:33.488,0:13:34.556
It's valuable information.
0:13:34.556,0:13:39.455
It's information that is used, right now, mostly to sell you things.
0:13:39.455,0:13:42.288
And that, you might find objectionable.
0:13:42.288,0:13:43.171
Maybe you don't.
0:13:43.171,0:13:46.909
Maybe you don't believe the studies that say you can't ignore advertising.
0:13:46.909,0:13:51.669
Maybe you think that you are smart and special, and advertising doesn't affect you.
0:13:51.669,0:13:53.457
You're wrong.
0:13:53.457,0:13:56.267
But maybe you believe that.
0:13:56.267,0:14:02.025
But that information, that same infrastructure, that same technology that allows them
0:14:02.025,0:14:05.973
to know you well enough to sell you soap
0:14:05.973,0:14:12.219
allows them to know you well enough to decide how much of a credit risk you are,
0:14:12.219,0:14:14.146
how much of a health risk you are,
0:14:14.146,0:14:16.956
and what your insurance premiums should look like.
0:14:16.956,0:14:18.906
In America we have a big problem right now.
0:14:18.906,0:14:23.225
Insurance costs are out of control. Health insurance. We're having a lot of difficulty paying for it.
0:14:23.225,0:14:28.728
Insurance companies would like to respond to this problem
0:14:28.728,0:14:31.747
by knowing better who's a good risk and who's a bad risk
0:14:31.747,0:14:35.624
so they can lower prices for the good risk and raise prices for the bad risk.
0:14:35.624,0:14:41.290
Essentially they want to make people who are going to get sick, uninsurable.
0:14:41.290,0:14:45.330
And if you could know enough about a person to know what their risk factors are based on
0:14:45.330,0:14:49.347
what their digital life is, if you can get just a little bit of information about them,
0:14:49.347,0:14:53.365
maybe you can figure out who their parents are and what hereditary diseases they might be subject to,
0:14:53.365,0:14:55.872
you can start to understand these things.
0:14:55.872,0:14:58.844
You can start to figure out who's a good risk and who's a bad risk.
0:14:58.844,0:15:04.487
You can use this information for ends that seem reasonable if you're a health insurance
0:15:04.487,0:15:07.041
company, but probably don't seem reasonable if you're
0:15:07.041,0:15:10.315
the kind of person sitting in this room, the kind of person that I talk to.
0:15:10.315,0:15:17.467
And that's the problem. The innocuous use. The use that seems kind of icky, but not truly evil,
0:15:17.467,0:15:19.696
which is advertising.
0:15:19.696,0:15:25.246
It's the same mechanism, the same data, that then gets used for other purposes.
0:15:25.246,0:15:32.838
It's the same data that then gets turned over to a government who wants to oppress you
0:15:32.838,0:15:36.577
because you are supporting WikiLeaks.
0:15:36.577,0:15:39.828
And that's not a fantasy, that's what happened.
0:15:39.828,0:15:49.325
It's the same information that anybody who wants to know something about you for an evil end would use.
0:15:49.325,0:15:56.616
We have a saying in the world of information, that if the data exists, you can't decide what it gets
0:15:56.616,0:15:58.148
used for.
0:15:58.148,0:16:03.048
Once data exists, especially data in the hands of the government, of officials,
0:16:03.048,0:16:05.811
once that data exists, it's a resource.
0:16:05.811,0:16:10.153
And the use of that resource has its own energy, its own logic.
0:16:10.153,0:16:15.401
Once a resource is there begging to be used, it's very hard to stop it from being used.
0:16:15.401,0:16:22.645
Because it's so attractive, it's so efficient, it would solve so many problems to use the data.
0:16:22.645,0:16:28.590
And so once you collect the data, once the data exists in one centralized place,
0:16:28.590,0:16:35.439
for anybody to come and get it with a warrant, or maybe no warrant, or maybe some money...
0:16:35.439,0:16:41.059
somebody is going to come with a warrant, or no warrant, and they are going to get that data.
0:16:41.059,0:16:42.847
And they will use it for whatever they want to use it.
0:16:42.847,0:16:47.189
Once it's out of the hands of the first person who collected it, who maybe you trust,
0:16:47.189,0:16:52.692
who maybe has good privacy policies, who maybe has no intention to do anything with your data
0:16:52.692,0:16:58.613
other than use it for diagnostic purposes, once it's out of that person's hands it's gone.
0:16:58.613,0:17:00.981
You never know where it goes after that.
0:17:00.981,0:17:02.909
It is completely uncontrolled and unchecked
0:17:02.909,0:17:05.904
and there is no ability to restrain what happens to that data.
0:17:05.904,0:17:14.379
So all of this is my attempt to convince you that privacy is a real value in our society,
0:17:14.379,0:17:18.095
and that the danger of losing privacy is a real problem.
0:17:18.095,0:17:20.788
It's not just the censorship, it's not just the filtering,
0:17:20.788,0:17:26.918
it's not just the propaganda, the influencing of opinion, that's one aspect of it,
0:17:26.918,0:17:35.417
it's not just the free speech. It's also the privacy, because privacy goes to the heart of our autonomy.
0:17:35.417,0:17:43.451
About a year and a half to two years ago, at the Software Freedom Law Center,
0:17:43.451,0:17:47.607
a man named Ian Sullivan who's a co-worker of mine,
0:17:47.607,0:17:49.697
he bought a bunch of plug servers,
0:17:49.697,0:17:54.480
because he was really excited at the thought of using them as print servers, and media servers,
0:17:54.480,0:17:59.240
and he started tinkering with them in our office.
0:17:59.240,0:18:02.932
My boss Eben Moglen who is a long-time activist in the Free Software movement,
0:18:02.932,0:18:15.030
fought very hard for Phil Zimmermann and PGP when that was a big issue,
0:18:15.030,0:18:23.552
he looked at this technology and he immediately realised that several streams had come together in one
0:18:23.552,0:18:24.596
place.
0:18:24.596,0:18:27.987
There's a lot of really good technology to protect your privacy right now.
0:18:27.987,0:18:31.144
In fact that's the stuff we're putting on the Freedom Box.
0:18:31.144,0:18:33.095
We're not writing new software.
0:18:33.095,0:18:36.740
We are gathering stuff, and putting it in one place.
0:18:36.740,0:18:40.920
Stuff that other people did because there are people who are better at writing software, and security,
0:18:40.920,0:18:43.265
than we are. We're software integrators.
0:18:43.265,0:18:46.679
And he realised there was all this software out there, and suddenly there was a box to put it on.
0:18:46.679,0:18:53.111
You could put all that software in one place, make it easy, and give it to people in one neat package.
0:18:53.111,0:18:56.710
Pre-installed, pre-configured, or as close to it as we can get.
0:18:56.710,0:19:02.654
And that, was the vision for the FreedomBox.
0:19:02.654,0:19:08.180
The FreedomBox is a tiny computer. Look at this.
0:19:08.180,0:19:10.874
That's small, it's unobtrusive.
0:19:10.874,0:19:11.779
So it's a small computer.
0:19:11.779,0:19:16.238
And we don't just mean small in size... it doesn't take a lot of energy.
0:19:16.238,0:19:22.670
I could be running this box on a couple of AA batteries for the life of this presentation.
0:19:22.670,0:19:24.620
You could run it on a solar panel.
0:19:24.620,0:19:27.778
It's very lightweight infrastructure.
0:19:27.778,0:19:33.304
You plug it into your home network, and when I say home network,
0:19:33.304,0:19:35.092
(I'm going to pass this around)
0:19:35.092,0:19:38.343
When I say home network, I mean home network.
0:19:38.343,0:19:42.824
This is technology we are designing for individuals to use to talk to their friends.
0:19:42.824,0:19:47.910
Our use-case, the thing we're trying to protect is you guys, as individuals in your communities.
0:19:47.910,0:19:51.927
This isn't a small-business appliance, it's not a large corporate appliance, this is a thing
0:19:51.927,0:19:58.939
that we are truly aiming at the home market, and people who care about privacy on an individual level.
0:19:58.939,0:20:05.975
You plug it into your home network to protect your privacy, your freedom, your anonymity and your security.
0:20:05.975,0:20:09.690
That is our mission statement, I guess. Unofficially.
0:20:09.690,0:20:17.004
That is what we believe we are trying to do with this device.
0:20:17.004,0:20:22.089
So, what privacy means in this context, the way we're going to go about trying to protect your privacy
0:20:22.089,0:20:27.616
is to connect you directly with other people and take everything you do and try to encrypt it
0:20:27.616,0:20:31.331
so that only you and the person you are talking to can see it. This is not a new idea.
0:20:31.331,0:20:35.696
We can do encrypted messaging, and we can do encrypted browsing.
0:20:35.696,0:20:43.986
Now there are problems with encrypted browsing. Right now if you want to have secure browsing you generally
0:20:43.986,0:20:45.890
use something called SSL.
0:20:45.890,0:20:57.523
SSL is a system of certificates that allows a web server to say to you "we can talk privately".
0:20:57.523,0:21:01.981
That's the first guarantee, a secure cryptographic connection (A).
0:21:01.981,0:21:05.673
and (B) I can authenticate to you that I am who I say I am.
0:21:05.673,0:21:11.362
So not only can nobody listen, but you know who you're talking to.
0:21:11.362,0:21:18.328
You're not secretly talking to the government when you think you're talking to me.
0:21:18.328,0:21:23.878
The problem with SSL, the big problem with SSL, is that the system for signing certificates relies
0:21:23.878,0:21:28.266
on a trust hierarchy that goes back to a cartel of companies who have the server certificates,
0:21:28.266,0:21:35.581
who have the ability to do this "guarantee". So when the website says to you "I guarantee I am who I
0:21:35.581,0:21:42.639
am", you say "I don't know you, I don't trust you". And they say "Oh, but this other company, I paid
0:21:42.639,0:21:47.098
them money, and so they'll guarantee that I am me."
0:21:47.098,0:21:52.624
Which is a really interesting idea - because I also don't know this company, why would I trust that company?
0:21:52.624,0:21:57.059
I mean, the company is just old enough and influential enough that they could actually get their
0:21:57.059,0:22:03.630
authority into my browser. So really my browser is willing to accept at face-value that this website
0:22:03.630,0:22:07.345
is who it says it is, but I don't necessarily accept that.
0:22:07.345,0:22:13.150
And then we have the problem of self-signed certificates. Where the server says, none of those authorities
0:22:13.150,0:22:17.771
in your browser trust me, I trust myself and look, I've signed a piece of paper -
0:22:17.771,0:22:20.581
I swear I am who I say I am.
0:22:20.581,0:22:24.017
And that, is not trustworthy at all, right?
0:22:24.017,0:22:27.895
That's just him saying again "No, really! I'm me!".
0:22:27.895,0:22:33.584
So this is a problem, because the FreedomBoxes are not going to trust the SSL cartel,
0:22:33.584,0:22:36.696
and they are not going to trust each other, so they can't just sort of swear to each other that
0:22:36.696,0:22:39.528
they are who they are.
0:22:39.528,0:22:45.124
So we think we've solved this. I'm not going to say we've solved it, because we're just starting to tell
0:22:45.124,0:22:52.137
people about this idea, and I'm sure people will have reasons why the idea can be improved.
0:22:52.137,0:22:58.406
But there is a technology called MonkeySphere, that allows you to take an SSH key and wrap it around a
0:22:58.406,0:23:03.329
PGP key, and use a PGP key to authenticate SSH connections.
0:23:03.329,0:23:10.341
It's really neat technology that allows you to replace SSH trust with PGP trust.
0:23:10.341,0:23:14.498
And we looked at that, and we thought, why can't we do that with SSL?
0:23:14.498,0:23:21.371
So one thing we're going to do with browsing is take an SSL certificate, an X.509 certificate,
0:23:21.371,0:23:25.248
and wrap it around a PGP key and send it through the normal SSL layer mechanisms
0:23:25.248,0:23:32.284
but when it gets to the other end, smart servers and smart browsers will open it up and use PGP mechanisms
0:23:32.284,0:23:39.575
to figure out how to trust people, to verify the connection, to authenticate the identity
0:23:39.575,0:23:42.687
of the browser, of the server.
0:23:42.687,0:23:48.492
This allows us to replace the SSL cartel with the web of trust, the keyservers.
0:23:48.492,0:23:57.292
We're replacing a tiny group of companies that control everything with keyservers, community infrastructure.
0:23:57.292,0:24:01.170
Anyone can set up a keyserver, and you can decide which one you want to trust.
0:24:01.170,0:24:02.772
They share information.
0:24:02.772,0:24:06.232
The web of trust is built on people, telling each other that they trust each other.
0:24:06.232,0:24:09.947
Again, you can decide who to trust and how much you want to trust them.
0:24:09.947,0:24:16.193
This is emblematic of our approach. We've identified structures that are unreliable because
0:24:16.193,0:24:20.373
they are centralized, because they are controlled by interests that are not the same interests
0:24:20.373,0:24:22.625
as our interests.
0:24:22.625,0:24:29.777
And we've decided to replace them wherever we can with structures that rely on people,
0:24:29.777,0:24:37.532
that rely on human relationships, that rely less on the notion that you can buy trust, and more on the
0:24:37.532,0:24:42.292
notion that you earn trust, by being trustworthy, by having people vouch for you over time.
0:24:42.292,0:24:50.303
So that's our approach to encrypted browsing. It's also our approach to encrypted messaging.
0:24:50.303,0:24:58.221
We're doing Jabber for a lot of message passing, XMPP, and we're securing that again with PGP.
0:24:58.221,0:25:02.076
Everywhere we can we're going to try to use the PGP network, because it already exists...
0:25:02.076,0:25:04.351
as I said, we're not trying to invent anything new.
0:25:04.351,0:25:10.621
PGP already exists and it does a really good job. So we're taking the PGP trust system and we're
0:25:10.621,0:25:16.611
going to apply it to things like XMPP and make sure that we can do message passing in a way
0:25:16.611,0:25:18.539
that we can trust.
0:25:18.539,0:25:26.015
Once we have XMPP we have a way to send text, a way to send audio, sure...
0:25:26.015,0:25:28.709
but also you can send structured data.
0:25:28.709,0:25:33.144
Through that same channel. And you can send that data to buddy lists.
0:25:33.144,0:25:39.344
So the system starts to look like a way to pass data in a social way. And we think this is the
0:25:39.344,0:25:42.432
beginning of the social layer of the box.
0:25:42.432,0:25:46.890
At the bottom of the box we have a belief that the technology should be social
0:25:46.890,0:25:48.376
from the ground up.
0:25:48.376,0:25:50.629
And so we're building structures that allow it to be social,
0:25:50.629,0:25:55.505
that assume you want to connect with friends in a network of freedom,
0:25:55.505,0:26:01.310
perhaps FreedomBoxes, perhaps other kinds of software, other kinds of technology.
0:26:01.310,0:26:04.259
And we're designing with that in mind.
0:26:04.259,0:26:08.740
With that in mind, we think we get certain benefits technologically which I'll get into later.
0:26:08.740,0:26:13.384
We think we can simplify things like key management, through methods like this.
0:26:13.384,0:26:19.189
By privacy I also mean that we can install a proxy server, Privoxy,
0:26:19.189,0:26:21.209
we think the answer is Privoxy here,
0:26:21.209,0:26:26.852
Privoxy on the box, so you can point your browser at the box, surf the web on the box,
0:26:26.852,0:26:33.632
and strip ads, strip cookies, stop Google from tracking you from website to website to website,
0:26:33.632,0:26:43.338
to remove the constant person sitting at your side, spying, recording, listening to everything you do.
0:26:43.338,0:26:46.914
In that vein, we don't just want to block ads and reject cookies,
0:26:46.914,0:26:50.327
we want to do something new, relatively new.
0:26:50.327,0:27:02.750
We think we want to munge your browser fingerprint, that unique pattern of data that is captured by your
0:27:02.750,0:27:03.632
user-agent string and what plugins you have, and all that stuff
0:27:03.632,0:27:07.812
that forms a unique profile of you that allows people to track your browser, companies to track your
0:27:07.812,0:27:09.878
browser as you hop along the web, even if they don't know anything about you.
0:27:09.878,0:27:13.338
It can sort of tie you to the browser, make profiles about your browser.
0:27:13.338,0:27:16.473
And that turns out to be a very effective way of figuring out who you are.
0:27:16.473,0:27:23.578
So even without a cookie, even without serving you with an ad, once they're talking to you they can
0:27:23.578,0:27:26.388
uniquely identify you, or relatively uniquely.
0:27:26.388,0:27:32.750
But it's relatively early in the browser fingerprint arms race.
0:27:32.750,0:27:37.649
We think that with a little bit of change, we can foil the recording,
0:27:37.649,0:27:40.505
and win this round at least.
0:27:40.505,0:27:46.937
And instead of having one profile where they gather all of your data, you will present to services
0:27:46.937,0:27:51.279
as a different person every time you use the service. So they cannot build profiles of you over time.
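As a rough illustration of why perturbing even one attribute defeats this kind of tracking, here is a toy model. The attribute names, and the idea of hashing them into a single tracking ID, are assumptions for illustration; real fingerprinting, and what a proxy like Privoxy would actually rewrite, is more involved.

```python
import hashlib
import random

def browser_fingerprint(attrs: dict) -> str:
    """Toy model: a site hashes every attribute it can observe
    (user-agent, plugins, language, ...) into one tracking ID."""
    material = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(material.encode()).hexdigest()[:16]

profile = {"user-agent": "Mozilla/5.0 (X11; Linux x86_64)",
           "plugins": "flash,java",
           "accept-language": "en-US"}

# A proxy on the box could rewrite one low-value attribute per session,
# so the same browser presents a different fingerprint every time.
munged = dict(profile)
munged["user-agent"] = profile["user-agent"] + f" rv:{random.randrange(90, 120)}"

print(browser_fingerprint(profile) != browser_fingerprint(munged))  # True
```

Because the fingerprint is a hash of everything observable, changing any one component gives the trackers a fresh profile with nothing linking it to the last one.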
0:27:51.579,0:27:53.157
That's what privacy looks like in our context. We're looking for cheap ways to foil the tracking.
0:27:55.057,0:28:02.054
We're looking for easy things we can do, because we believe there's a lot of low-hanging fruit.
0:28:02.054,0:28:05.931
And we'll talk about that more in a minute.
0:28:05.931,0:28:09.832
Freedom is our value, freedom is the thing we are aiming for,
0:28:09.832,0:28:13.431
freedom from centralized structures like the pipes.
0:28:13.431,0:28:19.213
Now mesh networking, I have mesh networking in my slides. That is a lie.
0:28:19.213,0:28:21.465
We are not doing mesh networking.
0:28:21.465,0:28:26.992
The reason we are not doing mesh networking is because I do not know anything about mesh networking
0:28:26.992,0:28:31.705
and one of the reasons I came here was to meet people who know a lot about mesh networking
0:28:31.705,0:28:34.492
and I see people in this audience who know a lot about mesh networking.
0:28:34.492,0:28:41.295
If you want to turn that lie into the truth, the way you do that
0:28:41.295,0:28:43.548
is by continuing on your projects, making mesh networking awesome,
0:28:43.548,0:28:46.195
to the point where I can say yes, we're going to put that in this box.
0:28:46.195,0:28:49.190
Then eventually, by the time this box is ready to do real
0:28:49.190,0:28:52.766
things for real people, we're really hoping that the mesh story
0:28:52.766,0:28:56.504
coheres, where we've identified the protocol and the technology and the people who are going to help
0:28:56.504,0:29:00.243
us. If you think you might be one of those people, we want to talk to you.
0:29:00.243,0:29:02.774
So yes, we are going to do mesh networking,
0:29:02.774,0:29:05.746
and that might be a lie
0:29:05.746,0:29:08.277
but I hope not.
0:29:08.277,0:29:10.668
We want you to have the freedom to own your data
0:29:10.668,0:29:16.775
that means data portability, that means that your data sits on your box and never goes to a third party.
0:29:16.775,0:29:18.586
It only goes to the people you want it to go to.
0:29:18.586,0:29:23.625
Fine-grained access control. Your data, your structures, you decide where it goes.
0:29:23.625,0:29:25.390
That's a user-interface problem,
0:29:25.390,0:29:27.155
that's a user permission problem,
0:29:27.155,0:29:29.105
an access control problem.
0:29:29.105,0:29:33.261
Access control is a solved problem.
0:29:33.261,0:29:37.882
Doing it through a convenient user-interface, that's not solved... so that's work to be done.
0:29:37.882,0:29:42.039
That's a big chunk of our todo list.
0:29:42.039,0:29:43.710
We want you to own your social network
0:29:43.710,0:29:50.119
Before Facebook there was a thing called MySpace, which was... I'm not even sure it exists anymore.
0:29:50.119,0:29:54.136
Before MySpace there was Tribe.
0:29:54.136,0:29:56.551
Before Tribe there was Friendster.
0:29:56.551,0:29:59.825
Friendster is now like a... "gaming network".
0:29:59.825,0:30:02.820
I don't know what it is but they still send me email
0:30:02.820,0:30:06.234
Which is the only reason I know they're still alive.
0:30:06.234,0:30:11.017
Before Friendster there was the original social network.
0:30:11.017,0:30:15.522
We called this social network "the internet".
0:30:15.522,0:30:17.008
We talked directly to each other,
0:30:17.008,0:30:21.420
we used email, instant messaging and IRC.
0:30:21.420,0:30:23.951
We talked to people using the structures that were out there.
0:30:23.951,0:30:27.828
It wasn't centralized in one service, we had a lot of ways of meeting each other
0:30:27.828,0:30:29.152
and passing messages.
0:30:29.152,0:30:31.706
What we lacked was a centralized interface.
0:30:31.706,0:30:35.584
So when we say "own your social network" we mean use the services of the internet,
0:30:35.584,0:30:37.650
own the pieces that talk to each other.
0:30:37.650,0:30:41.110
Hopefully we'll provide you with a convenient interface to do that.
0:30:41.110,0:30:44.106
But the actual structures, the places where your data live,
0:30:44.106,0:30:48.401
that is just the same pieces that we know how to use already.
0:30:48.401,0:30:51.234
We are not going to try to reinvent how you talk to people,
0:30:51.234,0:30:56.459
we're just going to make it so that the pipes are secure.
0:30:56.459,0:30:59.454
A big part of freedom, a big part of privacy,
0:30:59.454,0:31:02.426
is anonymity.
0:31:02.426,0:31:06.443
Tor can provide anonymity.
0:31:06.443,0:31:08.812
But we don't have to go all the way to Tor.
0:31:08.812,0:31:12.248
Tor is expensive, in terms of latency.
0:31:12.248,0:31:16.822
Tor is difficult to manage...
0:31:16.822,0:31:21.397
I don't know how many people have tried to use Tor, to run all their traffic through Tor.
0:31:21.397,0:31:23.649
It's hard. For two reasons.
0:31:23.649,0:31:26.575
For one, the latency... it takes a very long time to load a web page.
0:31:26.575,0:31:32.380
And two, you look like a criminal. To every website that you go to.
0:31:32.380,0:31:38.649
My bank shut down my account when I used Tor.
0:31:38.649,0:31:44.942
Because suddenly, I was coming from an IP address in Germany from which they had detected
0:31:44.942,0:31:48.518
attempts to hack them in the past.
0:31:48.518,0:31:52.256
So they closed my account, well I had to talk to them about it,
0:31:52.256,0:31:53.905
it did all get solved in the end.
0:31:53.905,0:31:57.782
PayPal as well closed my account down.
0:31:57.782,0:31:59.408
So that was the end of my ability to use Tor.
0:31:59.408,0:32:01.057
So we can't just run all our traffic through Tor.
0:32:01.057,0:32:07.117
It's too slow, and the network has weird properties in terms of how you present to websites,
0:32:07.117,0:32:08.951
that frankly, are scary.
0:32:08.951,0:32:16.916
Because if I look like a criminal to the bank, I don't want to imagine what I look like to my own government.
0:32:16.916,0:32:19.006
But we can do privacy in other ways.
0:32:19.006,0:32:25.252
If you are a web user, in China, and you want to surf the internet,
0:32:25.252,0:32:30.941
with full access to every website you might go to, and with privacy from your government,
0:32:30.941,0:32:34.981
so that you don't get a knock on your door from visiting those websites,
0:32:34.981,0:32:36.769
we can do that without Tor.
0:32:36.769,0:32:39.021
We don't need Tor to do that. We can do that cheaply.
0:32:39.021,0:32:45.592
Because all you need to do in that situation is get your connection out of China.
0:32:45.592,0:32:54.393
Send your request for a web page through an encrypted connection to a FreedomBox in...
0:32:54.393,0:32:58.410
Austria, America, who knows?
0:32:58.410,0:33:05.933
Just get the request away from the people who physically have the power to control you.
0:33:05.933,0:33:08.905
And we can do that cheaply, that's just SSH port forwarding.
0:33:08.905,0:33:14.130
That's just a little bit of tunneling, that's just a little bit of VPN.
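Concretely, that "little bit of tunneling" can be as plain as OpenSSH's dynamic port forwarding. A sketch follows; the hostname is a placeholder for your own box, and the actual run is left commented out because it needs a reachable machine.

```python
import subprocess  # used only if you uncomment the run() call below

# Open a local SOCKS5 proxy on port 1080 and relay all browser traffic
# through the remote FreedomBox over one encrypted SSH connection, so the
# local network operator sees nothing but that single encrypted stream.
cmd = ["ssh",
       "-N",                      # forward only, no remote shell
       "-D", "1080",              # dynamic (SOCKS) forwarding on localhost:1080
       "user@box.example.org"]    # placeholder: your own box abroad

print(" ".join(cmd))
# subprocess.run(cmd)  # then point the browser's SOCKS proxy at localhost:1080
```

Once the tunnel is up, every request leaves the censored network already encrypted and emerges from the box's side of the world.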
0:33:14.130,0:33:16.057
There's a lot of ways to do that sort of thing,
0:33:16.057,0:33:20.840
to give you anonymity and privacy in your specific context
0:33:20.840,0:33:22.791
without going all the way into something like Tor.
0:33:22.791,0:33:25.902
Now there are people who are going to need Tor.
0:33:25.902,0:33:27.969
They will need it for their use case.
0:33:27.969,0:33:32.891
But not every use case has to defend against that level of attack.
0:33:32.891,0:33:37.930
And so one of the things we're trying to do is figure out how much privacy and anonymity you need,
0:33:37.930,0:33:40.206
and from whom you need it.
0:33:40.206,0:33:43.457
If we can do that effectively we can give people solutions
0:33:43.457,0:33:45.546
that actually work for them. Because if we just tell people
0:33:45.546,0:33:49.540
to use Tor, we're going to have a problem.
0:33:49.540,0:33:52.652
They're not going to use it, and they won't get any privacy at all.
0:33:52.652,0:33:55.183
And that's bad.
0:33:55.183,0:33:57.249
So we want to allow people to do anonymous publishing,
0:33:57.249,0:33:59.710
and file-sharing, and web-browsing and email.
0:33:59.710,0:34:01.615
All the communications you want to do.
0:34:01.615,0:34:03.867
The technology to do that already exists,
0:34:03.867,0:34:05.771
we could do all of that with Tor.
0:34:05.771,0:34:09.045
The next piece of our challenge is to figure out how to do it without Tor.
0:34:09.045,0:34:12.017
To figure out what pieces we need Tor for, and to figure out
0:34:12.017,0:34:17.845
what pieces we can do a little bit more cheaply.
0:34:17.845,0:34:19.633
Security.
0:34:19.633,0:34:23.975
Without security, you don't have freedom and privacy and anonymity.
0:34:23.975,0:34:25.624
If the box isn't secure,
0:34:25.624,0:34:27.853
you lose.
0:34:27.853,0:34:32.033
We're going to encrypt everything.
0:34:32.033,0:34:36.189
We're going to do something that's called social key management, which I'm going to talk about.
0:34:36.189,0:34:39.138
I do want to talk about the Debian-based bit.
0:34:39.138,0:34:42.853
We are based on a distribution of Linux called Debian,
0:34:42.853,0:34:46.290
because it is a community-based distribution.
0:34:46.290,0:34:48.380
It is made by people who care a lot about your
0:34:48.380,0:34:51.654
freedom, your privacy, and your ability to speak anonymously.
0:34:51.654,0:34:55.531
And we really believe that the best way to distribute this
0:34:55.531,0:34:58.341
software is to hand it to the Debian mirror network and let
0:34:58.341,0:35:00.129
them distribute it. Because they have mechanisms
0:35:00.129,0:35:02.219
to make sure that nobody changes it.
0:35:02.219,0:35:05.214
If we were to distribute the software to you directly, we
0:35:05.214,0:35:09.092
would become a target. People would want to change the
0:35:09.092,0:35:11.808
software as we distribute it on our website.
0:35:11.808,0:35:13.271
They would want to crack our website and distribute their
0:35:13.271,0:35:15.965
version of the package.
0:35:15.965,0:35:18.496
We don't want to be a target, so we're not going to give you software.
0:35:18.496,0:35:21.630
We're going to give it to Debian, and let them give you the software.
0:35:21.630,0:35:26.414
And at the same time you get all of the Debian guarantees about freedom.
0:35:26.414,0:35:28.666
The Debian Free Software Guidelines.
0:35:28.666,0:35:32.103
They're not going to give you software unless it comes
0:35:32.103,0:35:37.025
with all of the social guarantees that are required to participate in the Debian community.
0:35:37.025,0:35:39.556
So we're very proud to be using Debian in this manner,
0:35:39.556,0:35:41.948
and working with Debian in this manner.
0:35:41.948,0:35:44.781
And we think that's the most effective way we can guarantee that we're going to live up to
0:35:44.781,0:35:51.747
our promises to you, because it provides a mechanism whereby if we fail to live up to our promises,
0:35:51.747,0:35:56.344
we cannot give you something that is broken. Because Debian won't let us,
0:35:56.344,0:35:59.618
they just won't distribute it.
0:35:59.618,0:36:02.010
There are problems with security.
0:36:02.010,0:36:04.100
There are things we can't solve.
0:36:04.100,0:36:05.377
One...
0:36:05.377,0:36:08.744
Physical security of the box.
0:36:08.744,0:36:13.643
We haven't really talked much internally about whether we can encrypt the filesystem on this box.
0:36:13.643,0:36:16.615
I don't quite see a way to do it.
0:36:16.615,0:36:20.029
It doesn't have an interface for you to enter a password effectively.
0:36:20.029,0:36:23.303
By the time you've brought an interface up you'd be running untrusted code.
0:36:23.303,0:36:25.230
I don't know a way to do it.
0:36:25.230,0:36:29.549
If anyone can think of a way that we can effectively encrypt the filesystem, I'd love to hear it.
0:36:29.549,0:36:35.029
But, on top of that, if we do encrypt the filesystem,
0:36:35.029,0:36:38.605
then the thing cannot be rebooted remotely, which is a downside.
0:36:38.605,0:36:40.694
So there are trade-offs at every step of the way.
0:36:40.694,0:36:45.013
If we can figure out some of these security issues, then we can be ahead of the game.
0:36:45.013,0:36:50.261
But I think encrypting the filesystem is the only way to guarantee the box is secure, even if it's
0:36:50.261,0:36:52.351
not physically secure.
0:36:52.351,0:36:53.698
So I think that's a big one.
0:36:53.698,0:36:58.040
If you have ideas about that, please come and talk to me after the talk.
0:36:58.040,0:37:01.291
I promised I would talk about social key management, and here it is.
0:37:01.291,0:37:06.376
So we're building the idea of knowing who your friends are
0:37:06.376,0:37:08.024
into the box at a somewhat low level.
0:37:08.024,0:37:12.947
To the point where things that are on the box can assume it is there,
0:37:12.947,0:37:17.544
or ask you if it's there, or rely on it as a matter of course in some cases.
0:37:17.544,0:37:21.887
So we can do things with keys that make your keys unlosable.
0:37:21.887,0:37:25.207
Right now a PGP key is a hard thing to manage.
0:37:25.207,0:37:26.670
Key management is terrible.
0:37:26.670,0:37:30.432
Do you guys like PGP? PGP is good.
0:37:30.432,0:37:34.727
Does anyone here like key management?
0:37:34.727,0:37:36.213
We have one guy who likes key management.
0:37:36.213,0:37:39.487
LAUGHTER
0:37:39.487,0:37:41.252
He's going to do it for all of you!
0:37:41.252,0:37:43.504
So, none of us like key management.
0:37:43.504,0:37:46.151
Key management doesn't work, especially if your use-case is home users, naive end-users.
0:37:46.151,0:37:48.102
Nobody wants to do key management.
0:37:48.102,0:37:51.701
Writing their key down and putting it in a safety deposit box is ludicrous.
0:37:51.701,0:37:54.371
It's a very difficult thing to actually convince people to do.
0:37:54.371,0:38:00.316
Sticking it on a USB key, putting it in a zip-lock bag and burying it in your backyard is paranoid.
0:38:00.316,0:38:03.311
I can't believe I just told you what I do with my key.
0:38:03.311,0:38:04.820
LAUGHTER
0:38:04.820,0:38:06.748
No, you can't ask people to do that.
0:38:06.748,0:38:08.071
They won't do it.
0:38:08.071,0:38:09.882
You can't protect keys in this manner.
0:38:09.882,0:38:13.342
You have to have a system that allows them to sort of, not ever know they have a key.
0:38:13.342,0:38:16.012
To not think about their key unless they really want to.
0:38:16.012,0:38:19.008
We think we've come up with something that might work.
0:38:19.008,0:38:20.772
You take the key,
0:38:20.772,0:38:22.282
or a subkey,
0:38:22.282,0:38:24.511
you chop it into little bits
0:38:24.511,0:38:25.416
and you give that key...
0:38:25.416,0:38:31.245
and we're talking about a key of a very long length, so there's a giant attack space
0:38:31.245,0:38:36.307
and you can chop it into bits and hand it to people without reducing the search space for a key.
0:38:36.307,0:38:39.000
You chop it into bits and hand all the bits to your friends.
0:38:39.000,0:38:42.437
Now all your friends have your key, as a group.
0:38:42.437,0:38:44.271
Individually, none of them can attack you.
0:38:44.271,0:38:47.708
Individually, none of them has the power to come root your box,
0:38:47.708,0:38:50.378
to access your services and pretend to be you.
0:38:50.378,0:38:53.791
As a group, they can do this.
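The talk doesn't pin down the splitting scheme, and literally cutting a key into substrings would let each holder shrink the search space; so assume something like n-of-n XOR secret sharing, where each share on its own is pure randomness and only all the shares together restore the key. A minimal sketch under that assumption:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """n-of-n secret sharing: n-1 shares are random pads; the last share is
    the key XORed with all of them. Any subset short of all n shares is
    statistically independent of the key, so no single friend (and no
    partial group) can attack you."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)
    shares.append(last)
    return shares

def recombine(shares: list[bytes]) -> bytes:
    """The recovery step: gather every share back and XOR them together."""
    key = shares[0]
    for s in shares[1:]:
        key = xor_bytes(key, s)
    return key

key = secrets.token_bytes(32)     # e.g. a key protecting your digital life
shares = split_key(key, 5)        # one piece per friend
assert recombine(shares) == key   # all five together restore it exactly
```

A real deployment would likely prefer threshold (k-of-n) sharing such as Shamir's scheme, so that recovery survives one friend losing their piece; the XOR version is just the simplest construction that matches the "individually nothing, together everything" property described here.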
0:38:53.791,0:39:04.217
We trust our friends, as a group, more than we trust them as individuals.
0:39:04.217,0:39:08.698
Any single one of your friends, if you gave them the key to your financial data and your private online
0:39:08.698,0:39:10.811
life that would make you very nervous.
0:39:10.811,0:39:14.387
You would worry that they would succumb to temptation to peek,
0:39:14.387,0:39:17.220
fall on hard times and want to attack you in some way,
0:39:17.220,0:39:19.612
fall out with you, get mad at you.
0:39:19.612,0:39:23.350
As an individual, people are sort of fallible in this sense.
0:39:23.350,0:39:25.579
But as a group of friends who would have to get together
0:39:25.579,0:39:30.038
and affirmatively make a decision to attack you,
0:39:30.038,0:39:32.592
we think that's extremely unlikely.
0:39:32.592,0:39:38.072
It's so unlikely that there are only a few scenarios where we think it might happen.
0:39:38.072,0:39:39.535
One...
0:39:39.535,0:39:42.669
if you are ill, and unable to access your box
0:39:42.669,0:39:44.202
or you're in jail
0:39:44.202,0:39:45.548
or you've passed away
0:39:45.548,0:39:49.008
or you've disappeared.
0:39:49.008,0:39:52.305
Or... you've gone crazy.
0:39:52.305,0:39:57.646
We call this type of event, where all your friends get together and help you,
0:39:57.646,0:39:59.898
even if you don't ask them for help,
0:39:59.898,0:40:02.871
we call that an intervention.
0:40:02.871,0:40:05.564
When your friends sit you down and say,
0:40:05.564,0:40:09.302
"you need our help, you can't ask us for it because you're not in a position to ask us for it",
0:40:09.302,0:40:10.951
that's an intervention.
0:40:10.951,0:40:16.733
If you have a moment in your life, a crisis in your life that is an intervention level event,
0:40:16.733,0:40:18.544
that's when you can go to your friends.
0:40:18.544,0:40:22.120
If your house burns down, you lose your key and all your data
0:40:22.120,0:40:25.533
You go to your friends, and you say "can I have part of my key back?"
0:40:25.533,0:40:29.829
"Oh, and give me that data that you have in a cryptographically-sealed box that you can't read."
0:40:29.829,0:40:31.013
To all your friends...
0:40:31.013,0:40:32.035
"My data please, my key please, ..."
0:40:32.035,0:40:32.778
"My data please, my key please, ..."
0:40:32.778,0:40:34.148
"My data please, my key please, ..."
0:40:34.148,0:40:39.697
You take all those pieces, you get a new box,
0:40:39.697,0:40:42.089
you load it all onto your box.
0:40:42.089,0:40:47.151
You have the key, you have your entire key, and now you can read your data.
0:40:47.151,0:40:49.241
And you haven't lost your digital life.
0:40:49.241,0:40:54.001
You have a key that is now unlosable.
0:40:54.001,0:40:58.761
Even if you never wrote it down, even if you never buried it in the backyard.
0:40:58.761,0:41:00.502
This is a hard problem in key management.
0:41:00.502,0:41:04.241
People lose their keys and their passwords to services all the time.
0:41:04.241,0:41:09.024
The only way we can think of to make that impossible, is this mechanism.
0:41:09.024,0:41:10.371
And of course it's optional.
0:41:10.371,0:41:13.808
If you're a person who doesn't trust your friends, even as a group,
0:41:13.808,0:41:17.244
or if you're a person who just doesn't have a lot of friends
0:41:17.244,0:41:20.518
(let me finish!)
0:41:20.518,0:41:25.116
...who doesn't have a lot of friends with FreedomBoxes who can be the backend for this,
0:41:25.116,0:41:27.229
you don't have to trust this mechanism.
0:41:27.229,0:41:30.015
You can do something else to make your key unforgettable.
0:41:30.015,0:41:32.430
But for a lot of naive end-users,
0:41:32.430,0:41:34.520
this is the mechanism.
0:41:34.520,0:41:36.749
This is the way they are going to never
0:41:36.749,0:41:37.956
lose their keys
0:41:37.956,0:41:41.695
Because the first time a user gets irretrievably locked out of his FreedomBox,
0:41:41.695,0:41:43.784
we lose that user forever.
0:41:43.784,0:41:45.572
And we lose all his friends forever.
0:41:45.572,0:41:52.306
Because it would scare you to lose such an important group of information.
0:41:52.306,0:41:53.932
Social key management.
0:41:53.932,0:41:58.692
This is the benefit of building social, of building knowledge
0:41:58.692,0:42:03.614
of who your friends are, into the box, at a deep level.
0:42:03.614,0:42:05.820
We have never done that before, with a technology
0:42:05.820,0:42:08.026
as a community project.
0:42:08.026,0:42:11.021
And it opens up new possibilities. This is just one.
0:42:11.021,0:42:13.088
There are others.
0:42:13.088,0:42:15.317
But it's a field we haven't really thought a lot about.
0:42:15.317,0:42:19.636
I think once we get out there and we start doing this kind of
0:42:19.636,0:42:25.441
construction, a lot of new uses are going to be found for this architecture.
0:42:25.441,0:42:28.576
I encourage you all to think about what changes,
0:42:28.576,0:42:34.938
when you can assume that the box has people you can trust, just a little bit,
0:42:34.938,0:42:38.212
because right now we live in a world where we are asked
0:42:38.212,0:42:42.694
to trust third party services like Facebook with all our photos,
0:42:42.694,0:42:46.409
or Flickr with all our photos, or Gmail with all our email.
0:42:46.409,0:42:47.755
We are asked to trust them.
0:42:47.755,0:42:50.101
We have no reason to trust them.
0:42:50.101,0:42:54.861
I mean, we expect that they'll act all right, because they have no reason to destroy us.
0:42:54.861,0:42:56.927
But we don't know what's going to happen.
0:42:56.927,0:43:01.664
We're effectively giving all our information to people we don't trust at all right now.
0:43:01.664,0:43:04.613
How does a network of people we trust, just a little bit,
0:43:04.613,0:43:06.982
change the landscape?
0:43:06.982,0:43:09.071
I think that's a really interesting question.
0:43:09.071,0:43:10.418
This box explores that question,
0:43:10.418,0:43:16.061
this box creates new solutions to old problems that previously seemed intractable.
0:43:16.061,0:43:19.660
So, I encourage everybody to think about how that might
0:43:19.660,0:43:27.137
change the solution to a problem they have with a technological architecture as it exists today.
0:43:27.137,0:43:31.595
Here's another problem...
0:43:31.595,0:43:34.567
Boxes that know who you are, and know who your friends are,
0:43:34.567,0:43:37.562
and know how your friends normally act,
0:43:37.562,0:43:41.881
can also know when your friends are acting weird.
0:43:41.881,0:43:49.613
If you have a friend who sends you one email a year, who suddenly sends you ten emails in a day,
0:43:49.613,0:43:51.680
that look like spam,
0:43:51.680,0:43:53.445
you know that box is rooted.
0:43:53.445,0:43:55.372
You know that box is weird.
0:43:55.372,0:43:59.412
Or if you are using the FreedomBox as your gateway to the internet,
0:43:59.412,0:44:05.357
and a box it is serving downstream starts sending a bunch of spam through it, it knows.
0:44:05.357,0:44:08.793
It can say "Oh no! You're acting like a zombie."
0:44:08.793,0:44:10.442
"You should get a check-up."
0:44:10.442,0:44:15.527
It can shut off mail service to that box, and not let the messages out.
0:44:15.527,0:44:21.611
It can make that decision to protect the wider internet to make you a better citizen in the world.
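A toy version of that rate check makes the idea concrete. The class name, the baseline of one message per day, and the 10x spike threshold are all invented for illustration; a real gateway would learn baselines from observed traffic.

```python
from collections import defaultdict

class OutboundMailMonitor:
    """Toy sketch: flag a machine whose sending rate suddenly exceeds a
    multiple of its historical daily average, and hold its mail."""
    def __init__(self, spike_factor: float = 10.0):
        self.daily_average = defaultdict(lambda: 1.0)  # msgs/day baseline
        self.today = defaultdict(int)                  # msgs seen so far today
        self.spike_factor = spike_factor

    def record(self, sender: str) -> bool:
        """Record one outbound message; return True if it should be held."""
        self.today[sender] += 1
        return self.today[sender] > self.spike_factor * self.daily_average[sender]

mon = OutboundMailMonitor()
# A friend who normally sends about one mail a day suddenly sends a burst:
flags = [mon.record("friend@box") for _ in range(15)]
print(flags.count(True))  # prints 5: the tail of the burst gets held
```

Everything past the threshold is queued instead of delivered, which is exactly the "you're acting like a zombie, get a check-up" behaviour described above.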
0:44:21.611,0:44:27.996
If suddenly your computer starts saying "Hey, I'm in Scotland and I need $5000"...
0:44:27.996,0:44:30.179
but we know you're not in Scotland
0:44:30.179,0:44:33.035
Maybe this box, because it has contact information,
0:44:33.035,0:44:35.705
maybe this box sends you an SMS.
0:44:35.705,0:44:40.930
And says "Dude, you've been hacked, go do something about your box."
0:44:40.930,0:44:43.762
So the types of things we can do once we assume we have
0:44:43.762,0:44:49.010
close relations as opposed to arms-length relations,
0:44:49.010,0:44:51.100
the types of things we can do when we trust each other a little bit
0:44:51.100,0:44:54.374
and we trust our boxes a little bit, goes way up.
0:44:54.374,0:44:55.860
Way up.
0:44:55.860,0:44:58.786
And by bringing that infrastructure closer to us,
0:44:58.786,0:45:03.360
I mean Gmail is too far away to play that role from a network perspective.
0:45:03.360,0:45:08.840
But if the box is on our LAN, we can do that.
0:45:08.840,0:45:11.812
These boxes will only work if they are convenient.
0:45:11.812,0:45:14.784
There's an old punk-rock slogan, from the Dead Kennedys,
0:45:14.784,0:45:18.523
"Give me convenience, or give me death."
0:45:18.523,0:45:24.676
We laugh at that, but that's a belief users have,
0:45:24.676,0:45:26.580
and I deduce that based on their behaviour,
0:45:26.580,0:45:29.738
because every time there is a convenient web service,
0:45:29.738,0:45:31.201
people use it.
0:45:31.201,0:45:34.777
Even if it's not very good with privacy, a lot of people are going to use it.
0:45:34.777,0:45:41.325
And conversely, whenever we have web services that are very good at privacy, but aren't very convenient,
0:45:41.325,0:45:44.018
comparatively fewer people use them.
0:45:44.018,0:45:47.733
We don't think this box works without convenience.
0:45:47.733,0:45:51.286
If we don't get the user-interface right then this project
0:45:51.286,0:45:53.376
will probably fall over.
0:45:53.376,0:45:56.023
It will never gain any sort of critical mass.
0:45:56.023,0:45:57.811
So we need a simple interface,
0:45:57.811,0:46:00.945
we need a way for users to interact with this box in a minimal way.
0:46:00.945,0:46:03.476
They should think about it as little as possible.
0:46:03.476,0:46:06.007
That's the hardest problem we face.
0:46:06.007,0:46:07.494
Quite frankly.
0:46:07.494,0:46:10.489
The technology to do private communication, that exists.
0:46:10.489,0:46:14.367
A lot of the people in this room helped to build that infrastructure and technology.
0:46:14.367,0:46:16.619
We can put it on the box.
0:46:16.619,0:46:21.100
Making it easy and accessible for users, that's hard.
0:46:21.100,0:46:23.353
And right now we're trying to figure out what that looks like,
0:46:23.353,0:46:25.141
who the designers are going to be.
0:46:25.141,0:46:30.783
If you have user interface or user experience design that you want to bring to a project like this,
0:46:30.783,0:46:33.918
please, please, come find me.
0:46:33.918,0:46:38.980
In order to have convenience, we need to have the thing provide services that are not just
0:46:38.980,0:46:44.924
freedom-oriented, we need to use its position in your network as a trusted device
0:46:44.924,0:46:48.500
to do things for you that aren't just about privacy.
0:46:48.500,0:46:50.543
It needs to do backups.
0:46:50.543,0:46:52.006
This is important.
0:46:52.006,0:46:56.627
Right now the way people back up their photos is by giving them to Flickr.
0:46:56.627,0:47:00.180
The way they back up their email is by giving it to Gmail.
0:47:00.180,0:47:06.031
If we don't provide backups, we can never be an effective replacement
0:47:06.031,0:47:09.142
for the services that store your data somewhere else.
0:47:09.142,0:47:14.831
Even though they're storing it out there in the cloud for their purposes, you get a benefit from it.
0:47:14.831,0:47:16.619
We have to replicate that benefit.
0:47:16.619,0:47:19.893
So things that we don't think of as privacy features have to
0:47:19.893,0:47:21.658
be in the box.
0:47:21.658,0:47:25.513
The backups, the passwords, and the keys, you can't forget them.
0:47:25.513,0:47:29.112
We would like it to be a music, a video, a photo server,
0:47:29.112,0:47:33.709
all the kinds of things you might expect from a convenient box on your network.
0:47:33.709,0:47:37.703
All the things that you want to share with other people, this box has to do those things.
0:47:37.703,0:47:44.994
And these aren't privacy features, but without them we won't be able to give people privacy.
0:47:44.994,0:47:49.150
Our first feature, the thing we are working towards
0:47:49.150,0:47:50.474
is Jabber.
0:47:50.474,0:47:53.144
It's secure encrypted chat, point-to-point.
0:47:53.144,0:47:57.719
That is the thing we are working on right now.
0:47:57.719,0:48:02.223
But in order to do that we need to solve this Monkeysphere-ish SSL problem that I described.
0:48:02.223,0:48:06.705
We have code, it needs to get packaged and all that.
0:48:06.705,0:48:10.234
Our development strategy, the way we are going to do all the things we said,
0:48:10.234,0:48:15.180
because the list of things I have said we're going to do...
0:48:15.180,0:48:19.360
I can't believe you're not throwing things at me.
0:48:19.360,0:48:21.566
Because it's ludicrous to believe that we can actually do all these things by ourselves.
0:48:21.566,0:48:23.516
And we're not.
0:48:23.516,0:48:25.908
We're going to let other people make the software.
0:48:25.908,0:48:28.160
As much as possible we're going to encourage other people
0:48:28.160,0:48:31.713
to build stuff. We're going to use stuff that already exists.
0:48:31.713,0:48:35.010
We're going to use Privoxy, we're going to use Prosody, we're going to use Apache.
0:48:35.010,0:48:38.563
We're not going to reinvent the web server, we're not going to reinvent protocols.
0:48:38.563,0:48:45.621
I really hope that by the time this project is mature, we haven't invented any new protocols.
0:48:45.621,0:48:48.617
Maybe we'll use new protocols, but I don't want to be
0:48:48.617,0:48:53.238
generating new things that haven't been tested, and then putting them in FreedomBox.
0:48:53.238,0:48:58.462
I want to see things tested in the real world, gaining credibility, and then take them.
0:48:58.462,0:49:01.736
The less we invent, the better.
0:49:01.736,0:49:07.541
As far as timelines go, by the time we have it ready, you'll know why you need it.
0:49:07.541,0:49:10.676
People right now are figuring out that privacy is important.
0:49:10.676,0:49:12.975
They're seeing it over and over again.
0:49:12.975,0:49:18.106
In Egypt, at the start of the Arab Spring, one of the things the government did to try to
0:49:18.106,0:49:22.982
tamp down the organising was to convince companies to shut off cell networks,
0:49:22.982,0:49:25.165
to prevent people from talking to each other.
0:49:25.165,0:49:28.300
In America they did the same thing in San Francisco I hear.
0:49:28.300,0:49:36.334
Turned off the cell towers to prevent people from organising to meet for a protest.
0:49:36.334,0:49:42.255
With Occupy Wall Street, you're starting to see infiltration,
0:49:42.255,0:49:45.970
you're starting to see people going and getting information
0:49:45.970,0:49:48.501
that Occupy Wall Street is talking about and turning it over
0:49:48.501,0:49:51.938
to the authorities, the police, the FBI.
0:49:51.938,0:49:59.089
So as we enter a new age of increased activism, we hope,
0:49:59.089,0:50:01.783
of increased activity, of social activity,
0:50:01.783,0:50:06.241
I think the need for a lot of this privacy stuff is going to become clear.
0:50:06.241,0:50:11.001
As the technology for invading your privacy improves,
0:50:11.001,0:50:18.083
the need for technology to protect your privacy will become stark and clear.
0:50:18.083,0:50:22.541
Our two big challenges as I said are user experience,
0:50:22.541,0:50:27.557
and the one I didn't say was paying for developers, paying for designers.
0:50:27.557,0:50:31.713
Those are the hard parts that we're working on.
0:50:31.713,0:50:35.870
And if we fail, we think that's where we fail.
0:50:35.870,0:50:40.212
Software isn't on that list, as I said software is already out there.
0:50:40.212,0:50:42.441
So you can have a FreedomBox.
0:50:42.441,0:50:46.760
If you like that box that we've been passing around the audience, you can buy one from Globalscale.
0:50:46.760,0:50:51.241
If you don't want the box, it's just Debian, it's just Linux, it's just packages.
0:50:51.241,0:50:56.466
Throw Debian on a box, we will have packages available through the normal Debian mechanisms.
0:50:56.466,0:50:58.277
You don't even have to use our repository.
0:50:58.277,0:51:01.551
In fact, I don't think we're going to have a repository.
0:51:01.551,0:51:06.149
You're just going to download it and install it the same way you normally do it if you're technologically
0:51:06.149,0:51:08.517
capable of doing that.
0:51:08.517,0:51:10.259
I grabbed a bunch of photos from Flickr,
0:51:10.259,0:51:14.415
my colleague Ian Sullivan took that awesome picture of the FreedomBox.
0:51:14.415,0:51:17.238
And that's how you reach me.
0:51:18.992,0:51:31.307
APPLAUSE
0:51:39.030,0:51:44.787
Thanks James, please sit down.
0:51:44.787,0:51:49.105
We are up for questions from the audience for James.
0:51:49.105,0:52:03.525
Please raise your hand if you have any questions about the FreedomBox.
0:52:03.525,0:52:05.754
Hello, thanks that was a very interesting presentation.
0:52:05.754,0:52:06.660
Thank you.
0:52:06.660,0:52:10.491
Your boss Eben Moglen, he has given a speech at a committee of the US congress
0:52:10.491,0:52:13.486
I believe, which has received a lot of attention
0:52:13.486,0:52:18.572
and in Iran during the green movement the US state department
0:52:18.572,0:52:24.075
I believe has told Twitter to reschedule maintenance so that
0:52:24.075,0:52:29.160
the opposition could keep using Twitter during the attempted revolution
0:52:29.160,0:52:33.038
and Hillary Clinton has given a very popular speech about
0:52:33.038,0:52:36.915
how America would support the promotion of internet freedom
0:52:36.915,0:52:40.793
and I think things such as the New America Foundation are
0:52:40.793,0:52:46.412
funding and supporting projects such as the Commotion mesh networking project
0:52:46.412,0:52:49.222
that we've already heard about before.
0:52:49.222,0:52:52.635
So in other words there's a link between politics and technology sometimes,
0:52:52.635,0:52:57.860
and in the past I believe certain influential Americans such as
0:52:57.860,0:53:03.967
Rupert Murdoch or George W. Bush have viewed modern communication technologies as a way to
0:53:03.967,0:53:09.052
promote U.S. foreign policy and to spread democracy and freedom in the world.
0:53:09.052,0:53:14.137
So my question is, what is your relationship with your government?
0:53:14.137,0:53:16.087
That's a really good question.
0:53:16.087,0:53:21.335
So one of the things that we sort of figured out from the beginning was that
0:53:21.335,0:53:25.770
if we had close relationships with the U.S. government,
0:53:25.770,0:53:29.787
people outside of the U.S. might have difficulty trusting us,
0:53:29.787,0:53:34.547
because nobody wants to tell all their secrets to the American government.
0:53:34.547,0:53:42.674
So we were thinking about what that really looks like in the context of a box that could be used globally.
0:53:42.674,0:53:48.642
We are working very hard to engineer a device that does not require you to trust us.
0:53:48.642,0:53:50.569
I'm not asking for your trust.
0:53:50.569,0:53:55.051
I'm not asking for your trust, I'm asking for your help.
0:53:55.051,0:53:59.091
All the code we write you'll be able to see it, you'll be able to
0:53:59.091,0:54:02.086
audit it, you'll be able to make your own decisions about what it does,
0:54:02.086,0:54:05.383
you'll be able to test whether it is trustworthy or not,
0:54:05.383,0:54:10.887
and if you decide that it is not, you can tell everyone,
0:54:10.887,0:54:11.931
and they won't use it.
0:54:11.931,0:54:16.808
So from a trust perspective, it doesn't matter what our relationship is with anybody.
0:54:16.808,0:54:18.433
So that's the first thing.
0:54:18.433,0:54:23.797
The second thing is that right now we don't have much of a relationship with the U.S. government.
0:54:23.797,0:54:33.456
Jacob Appelbaum is somewhat famous for his work with Julian Assange on WikiLeaks,
0:54:33.456,0:54:36.568
and his work on Tor, and security in general,
0:54:36.568,0:54:39.726
his efforts to provide you with freedom and privacy.
0:54:39.726,0:54:45.856
He is a guy who, as was recently revealed in the Wall Street Journal, the U.S. government has been spying
0:54:45.856,0:54:51.545
on. And he is on our team, he's on our technical advisory committee.
0:54:51.545,0:54:56.026
He's one of the people we go to for help when we need to understand security on the box.
0:54:56.026,0:55:02.690
So right now our position with the American government is that we're not really related except in
0:55:02.690,0:55:05.662
so much that we are a bunch of people who really care about these issues,
0:55:05.662,0:55:12.768
which maybe occasionally makes us targets. Which gives us a reason to use a box like this.
0:55:12.768,0:55:21.266
Coupled with that, there is a program in America - you were talking about Hillary Clinton saying
0:55:21.266,0:55:26.026
she was going to encourage technologies that will spread democracy.
0:55:26.026,0:55:30.206
So the way America encourages things is by spending money on it.
0:55:30.206,0:55:34.687
That's our typical way to support programs. We fund different things.
0:55:34.687,0:55:40.678
We don't generally have feel-good campaigns, we just pay people to make good work, or try to.
0:55:40.678,0:55:46.924
So the U.S. state department has a program to provide funding for projects like the FreedomBox.
0:55:46.924,0:55:48.526
We have not applied for that funding.
0:55:48.526,0:55:50.198
I don't know if we will.
0:55:50.198,0:55:56.143
However I do know that they have given funding to some very good and genuine projects that are
0:55:56.143,0:56:00.276
run by people I trust, so I try not to be cynical about that.
0:56:00.276,0:56:06.522
I imagine at some point that through a direct grant or a sub-grant or something,
0:56:06.522,0:56:11.143
some state department money might support some aspect of work that is related to us.
0:56:11.143,0:56:15.020
I mean, we might take work from a project that is state department funded,
0:56:15.020,0:56:17.853
just because it's quick work.
0:56:17.853,0:56:20.849
Have I answered your question?
0:56:20.849,0:56:21.708
Yes, thanks.
0:56:32.200,0:56:37.637
Hi, well you always have tension if you talk about privacy
0:56:37.637,0:56:41.073
since 9/11 you know, I heard this in America very often,
0:56:41.073,0:56:44.185
"we have to be careful", everybody is suspicious and stuff.
0:56:44.185,0:56:48.155
So how do you react when people like the government say well,
0:56:48.155,0:56:55.446
you are creating a way to support terrorism, whatever.
0:56:55.446,0:57:00.230
That's a good question, and it's a common question.
0:57:00.230,0:57:04.711
Frankly every time I do this talk, it's one of the first questions that come up.
0:57:04.711,0:57:06.940
The answer is really simple.
0:57:06.940,0:57:11.747
The fact is, this box doesn't create any new privacy technology.
0:57:11.747,0:57:15.137
It just makes it easier to use and easier to access.
0:57:15.137,0:57:21.429
People who are committed to terrorism or criminal activity, they have sufficient motivation that they
0:57:21.429,0:57:23.612
can use the technology that exists. Terrorists are already using PGP.
0:57:23.612,0:57:27.165
They're already using Tor.
0:57:27.165,0:57:30.253
They're already using stuff to hide their data.
0:57:30.253,0:57:33.341
At best we are helping stupid terrorists.
0:57:33.341,0:57:35.710
LAUGHTER
0:57:35.710,0:57:42.861
Granted, I'm not excited about that, but I don't think that's a sufficient reason to deny common people
0:57:42.861,0:57:44.510
access to these technologies.
0:57:44.510,0:57:49.131
And more importantly than the fact that terrorists and criminals have access to this technology,
0:57:49.131,0:57:52.405
governments have access to this technology.
0:57:52.405,0:57:54.657
The largest corporations have access to this technology.
0:57:54.657,0:58:00.787
Every bank: the same encryption methods that we are using are the stuff that protects trillions of dollars
0:58:00.787,0:58:05.106
in value that banks trade every day.
0:58:05.106,0:58:12.583
This is technology that is currently being used by everyone except us.
0:58:12.583,0:58:15.114
All we're doing is levelling the playing field.
0:58:15.114,0:58:22.243
The same technology that hides data from us, that causes a complete lack of transparency in a downward
0:58:22.243,0:58:27.908
direction, we can use to level the playing field a little bit.
0:58:27.908,0:58:39.727
More questions?
0:58:39.727,0:58:43.884
Thank you for your presentation.
0:58:43.884,0:58:51.337
Could we add to the challenges that maybe we could produce it somewhere other than a communist dictatorship?
0:58:51.337,0:58:54.333
Because I saw the label "Made in China", so I think it is just
0:58:54.333,0:59:00.927
paradoxical to produce something like the FreedomBox in this country, and I would also like to be independent
0:59:00.927,0:59:07.173
from producing in China. So that's just something for a challenge I think.
0:59:07.173,0:59:10.610
That's a really good question and important point.
0:59:10.610,0:59:16.229
So, we're not a hardware project. Hardware is really really hard to do right and do well.
0:59:16.229,0:59:19.340
We have some hardware hackers on our project.
0:59:19.340,0:59:25.261
Our tech lead Bdale Garbee does amazing work with satellites and model rockets and altimeters,
0:59:25.261,0:59:28.837
and he's brilliant. But this is not a hardware project.
0:59:28.837,0:59:31.972
All we can do is use hardware that already exists.
0:59:31.972,0:59:37.638
When the world makes hardware in places other than China, we will use that hardware.
0:59:37.638,0:59:41.098
Right now, we don't have a lot of options.
0:59:41.098,0:59:46.624
And we're not going to deny everybody privacy because we don't have a lot of hardware options.
0:59:46.624,0:59:48.110
When we have those options we'll take them.
0:59:48.110,0:59:51.941
In the meantime, if you are a person who really cares about this issue,
0:59:51.941,0:59:55.656
don't buy a FreedomBox.
0:59:55.656,0:59:58.954
Take the software, go find a computer that isn't made in China,
0:59:58.954,1:00:02.228
LAUGHTER
1:00:02.228,1:00:05.014
and go put the software on that box.
1:00:05.014,1:00:11.748
If you want a solution that is run on computers that don't exist, I can't help you with that.
1:00:11.748,1:00:15.951
If you want a solution that runs, I might be able to help you with that.
1:00:15.951,1:00:20.270
But yes, I agree that that is a real issue, and we are thinking about that.
1:00:20.270,1:00:25.471
We believe that there is an open hardware project story here.
1:00:25.471,1:00:28.884
And one thing we've been doing is working with the manufacturer of the box,
1:00:28.884,1:00:32.948
to get the code free, to make sure we know what's in it,
1:00:32.948,1:00:35.316
so that there are no binary blobs in the box,
1:00:35.316,1:00:38.149
so we have some assurances that we actually do have freedom.
1:00:38.149,1:00:45.672
At some point though, we do believe that somebody will solve the open hardware problem for us.
1:00:45.672,1:00:50.548
We're not going to be the hardware project, but there are people trying to do this in an open way.
1:00:50.548,1:00:54.426
Raspberry Pi for example. They're not quite right for our use-case, but those kinds of projects
1:00:54.426,1:00:58.582
are starting to exist, and they're starting to be really good.
1:00:58.582,1:01:01.415
In a few years, maybe that will be the thing we move onto.
1:01:01.415,1:01:09.937
Now, I'm guessing that even an open hardware project like Raspberry Pi does their manufacturing in
1:01:09.937,1:01:14.860
a place like China. And that's a big problem.
1:01:14.860,1:01:19.480
When the world is ready with a solution to that, we will be ready to accept that solution and adopt it
1:01:19.480,1:01:22.615
of course.
1:01:22.615,1:01:30.533
Any more questions for James? or statements?
1:01:33.056,1:01:37.012
This is more of a statement than a question I guess,
1:01:37.012,1:01:42.979
but should the FreedomBox start being made in China there will be a lot more of them coming out of
1:01:42.979,1:01:46.253
the back door and enabling privacy for people that don't get
1:01:46.253,1:01:51.919
it, but also as soon as it starts getting manufactured I'd imagine you may,
1:01:51.919,1:01:54.914
because you're not in it for the money as you told me last night,
1:01:54.914,1:01:59.558
you may be looking forward to how easy it will be to copy,
1:01:59.558,1:02:05.990
and with things like MakerBot, making a case, making a bot is easy,
1:02:05.990,1:02:08.823
you can do it in your bedroom now with 3D printers.
1:02:08.823,1:02:15.998
So there will be a bag of components, a board, made by some online place that is really into this,
1:02:15.998,1:02:18.227
and you can assemble these at home.
1:02:18.227,1:02:22.987
So you've just got to get it out there first I think, and lead the way.
1:02:22.987,1:02:29.628
Yeah, I think that's quite right in that we are not the only place to get a box like this.
1:02:29.628,1:02:34.551
I mean, we're putting it on a specific box to make it easy, but there will be lots of places that make
1:02:34.551,1:02:40.657
boxes, and hopefully there will be places where working conditions are acceptable to everybody.
1:02:40.657,1:02:43.931
And at that point you can make your own boxes,
1:02:43.931,1:02:44.431
you can put them on any box you can find.
1:02:44.431,1:02:46.137
The point of Free Software is not to lock you into a service,
1:02:46.137,1:02:53.196
a technology, a software, a structure or a box.
1:02:53.196,1:02:53.696
We're not going to lock you into anything, that's one thing we're extremely clear about.
1:02:53.696,1:03:00.928
If you manage to make a box like this at home, I would really love to hear about it.
1:03:00.928,1:03:06.455
If you can spin up a MakerBot to make a case,
1:03:06.455,1:03:08.939
and you have a friend who can etch boards,
1:03:08.939,1:03:10.565
and you make a box like this at home,
1:03:10.565,1:03:14.141
that would be big news and a lot of people would want to know about it.
1:03:14.141,1:03:22.662
More statements or questions? Yes...
1:03:22.662,1:03:31.463
So, if you lose your box and get a new one, how is it going to reauthenticate to the boxes of your friends?
1:03:31.463,1:03:34.296
I think I didn't get that one.
1:03:34.296,1:03:39.381
Yeah, so, the good thing about friends is that they don't actually know you by your PGP key.
1:03:39.381,1:03:48.251
Sorry, I didn't specify it, if you want a grand security and you want distribution to more than 12 friends,
1:03:48.251,1:03:54.009
so let's say a hundred, and they're like, all over the world.
1:03:54.009,1:03:59.536
You are probably going to reach them through the internet to get your key parts back,
1:03:59.536,1:04:05.178
and you are probably not going to be able to use the FreedomBox to get a new one because
1:04:05.178,1:04:06.478
it has to be authenticated.
1:04:06.478,1:04:09.311
So what do you do?
1:04:09.311,1:04:10.960
Well, you at that point...
1:04:10.960,1:04:14.536
if you don't have a FreedomBox, the FreedomBox can't provide you with a solution to that problem.
1:04:14.536,1:04:16.811
What you're going to have to do,
1:04:16.811,1:04:19.017
is perhaps call your friends.
1:04:19.017,1:04:20.991
Have a conversation with them,
1:04:20.991,1:04:23.499
convince them that you are the person you say you are.
1:04:23.499,1:04:27.400
Reference your shared experiences, maybe they know your voice,
1:04:27.400,1:04:33.506
maybe they just know who you are by the way that you act and the way that you talk.
1:04:33.506,1:04:37.059
There's not going to be any one way that we get our keys back.
1:04:37.059,1:04:41.076
If you lose your key, yeah, we're not saying that's never going to be a problem.
1:04:41.076,1:04:43.909
And I wouldn't recommend splitting your key up among a hundred people,
1:04:43.909,1:04:48.530
because that's a lot of people to ask for your key back.
1:04:48.530,1:04:53.568
The mechanism I have in mind is not that you get a little bit of your key from
1:04:53.568,1:04:56.424
everyone you know, it's that you spread out the key among
1:04:56.424,1:05:00.000
a lot of people, and you need a certain number of those people.
1:05:00.000,1:05:02.694
So maybe it's five of seven of your friends.
1:05:02.694,1:05:06.734
So you give seven people shares, but any five of them could give you the whole key.
1:05:06.734,1:05:09.730
So in case you can't reach somebody you can still manage to do it.
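The five-of-seven recovery scheme described here is classically implemented with Shamir secret sharing. As an illustrative sketch only (not FreedomBox's actual code, and the field size and API are assumptions for the example), splitting a key-sized integer into shares and recovering it could look like this:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough to hold a 16-byte secret

def split(secret, k, n):
    """Split an integer secret (< PRIME) into n shares; any k recover it."""
    # A random degree-(k-1) polynomial whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange-interpolate the polynomial at x=0 from any k shares."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Modular inverse of den via Fermat's little theorem (PRIME is prime).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split(123456789, k=5, n=7)     # hand one share to each of 7 friends
assert recover(shares[:5]) == 123456789  # any five friends suffice
assert recover(shares[2:]) == 123456789  # a different five also work
```

Fewer than five shares reveal nothing about the secret, which is the property that makes distributing them to friends safe. Real tools for this, such as `ssss` or libgfshare, work byte-by-byte over GF(256) rather than one big prime field, but the five-of-seven idea is the same.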
1:05:09.730,1:05:12.887
And we can make that access control as fine-grained as we want,
1:05:12.887,1:05:15.860
but a hundred would be overwhelming.
1:05:15.860,1:05:20.504
We wouldn't do that. Sure, you could do it if you wanted,
1:05:20.504,1:05:23.476
but I don't think you'll have a hundred friends you could trust that much.
1:05:23.476,1:05:26.750
Maybe you do, I don't.
1:05:26.750,1:05:33.878
More questions, statements?
1:05:33.878,1:05:39.498
Yes?
1:05:39.498,1:05:47.253
Erm, it's just a wish... but have you thought about the idea of using the FreedomBox to create
1:05:47.253,1:05:51.897
a community where you can exchange not only data but like
1:05:51.897,1:05:58.770
products or services, so that would maybe like, change the system?
1:05:58.770,1:06:04.738
One of the things we want to do with the FreedomBox is
1:06:04.738,1:06:10.380
create a thing that looks a lot like your current social networking,
1:06:10.380,1:06:12.911
minus the advertising and the spying.
1:06:12.911,1:06:16.417
A way to talk to all your friends at once.
1:06:16.417,1:06:20.295
Once you have a place, a platform, where you can communicate
1:06:20.295,1:06:23.128
with your friends, you can build on that platform
1:06:23.128,1:06:25.055
and you can create structures like that.
1:06:25.055,1:06:29.072
If we make a thing that has programmable interfaces, so
1:06:29.072,1:06:32.671
you can make apps for it, you can make an app like that,
1:06:32.671,1:06:34.436
if that's important to you.
1:06:34.436,1:06:38.174
What people do with the communication once they have it,
1:06:38.174,1:06:40.403
we don't have any opinions about.
1:06:40.403,1:06:43.236
We want them to do everything that's important to them.
1:06:43.236,1:06:45.930
And I think something like that could be important,
1:06:45.930,1:07:03.414
and yeah, that would be amazing if that were to emerge.
1:07:03.414,1:07:08.337
Some things I believe are easier to do in a centralized architecture than a decentralized one,
1:07:08.337,1:07:12.819
for example search, or services that require a lot of bandwidth.
1:07:12.819,1:07:16.093
I don't see how you can run something like YouTube on the FreedomBox.
1:07:16.093,1:07:18.461
So is your utopian vision one where everything is decentralized,
1:07:18.461,1:07:23.918
or is it ok to have some centralized pieces in a future network?
1:07:23.918,1:07:28.840
Look, if you're going to grant me my utopia then of course everything is decentralized.
1:07:28.840,1:07:31.812
But we don't live in a utopia, I don't have magic.
1:07:31.812,1:07:38.546
We actually have in our flowchart a box labeled "magic routing",
1:07:38.546,1:07:41.217
because routing is hard to do in a decentralized way...
1:07:41.217,1:07:44.049
You need someone to tell you where the IPs are.
1:07:44.049,1:07:47.347
And that's hard to do in a decentralized way.
1:07:47.347,1:07:52.107
We haven't solved it, and we don't think we're going to fully solve it.
1:07:52.107,1:07:54.731
We hope someone else solves it first of all.
1:07:54.731,1:07:56.844
But second of all, we don't know where the compromises are.
1:07:56.844,1:07:59.212
Some things are not possible to decentralize.
1:07:59.212,1:08:01.859
We're going to decentralize as much as we can,
1:08:01.859,1:08:04.227
but we're not committing to doing anything impossible.
1:08:04.227,1:08:06.155
If you can't run YouTube off this box,
1:08:06.155,1:08:08.407
which I disagree with by the way,
1:08:08.407,1:08:10.009
then you won't, because it's impossible.
1:08:10.009,1:08:12.262
If you want to run YouTube on this box you turn all your
1:08:12.262,1:08:14.491
friends into your content delivery network,
1:08:14.491,1:08:16.743
and all your friends parallelize the distribution of the content,
1:08:16.743,1:08:18.368
you share the bandwidth.
1:08:18.368,1:08:20.621
It's ad-hoc, BitTorrent-like functionality.
1:08:20.621,1:08:24.220
Yes, that technology doesn't exist yet, I just made all that up,
1:08:24.220,1:08:27.192
but we can do it.
1:08:27.192,1:08:32.556
The parts that are hard though, the things like the routing,
1:08:32.556,1:08:35.064
there will be real compromises.
1:08:35.064,1:08:36.410
There will be real trade-offs.
1:08:36.410,1:08:39.986
There will be places where we'll say, you know what, we have
1:08:39.986,1:08:41.612
to rely on the DNS system.
1:08:41.612,1:08:44.955
Everybody in this room knows that the DNS system has some
1:08:44.955,1:08:48.090
security problems, some architectural problems that make it
1:08:48.090,1:08:51.689
a thing we would ideally not have to rely on.
1:08:51.689,1:08:55.869
But you know what? This project is not going to be able to replace DNS.
1:08:55.869,1:08:59.305
There are plenty of alternate DNS proposals out there, but we are not going to
1:08:59.305,1:09:02.579
just chuck the old DNS system, because we want people
1:09:02.579,1:09:05.551
to be able to get to the box, even if they don't have a box.
1:09:05.551,1:09:09.290
We want you to be able to serve services to the public.
1:09:09.290,1:09:13.911
We are going to use a lot of structures that are less than ideal.
1:09:13.911,1:09:16.302
We're assuming that TCP/IP is there...
1:09:16.302,1:09:19.414
in the normal use case you're using the internet backbone
1:09:19.414,1:09:22.664
to do your communication.
1:09:22.664,1:09:25.637
The mesh routing story we talked about is not how you do
1:09:25.637,1:09:30.490
your normal use. That's an emergency mode if there's a crisis, a political instability, a tsunami,
1:09:30.490,1:09:35.110
if you can't get to your regular internet because it has failed you in some way because
1:09:35.110,1:09:38.222
it has become oppressive or inaccessible.
1:09:38.222,1:09:40.614
Then you would use something like the mesh network.
1:09:40.614,1:09:44.050
But in the normal course of business, you are using
1:09:44.050,1:09:47.324
a thing that is less than ideal, and that's a trade-off.
1:09:47.324,1:09:49.530
We can't as a project protect you from everything.
1:09:49.530,1:09:51.318
We are going to look for the places where we can make
1:09:51.318,1:09:54.476
effective protection. We are going to try and make it clear
1:09:54.476,1:09:57.750
the limits of that protection. And we're going to give you
1:09:57.750,1:09:59.097
everything we can.
1:09:59.097,1:10:05.389
And then, as we move forward, when opportunities to solve new problems present themselves,
1:10:05.389,1:10:08.501
we'll take them.
1:10:08.501,1:10:16.303
Well I have to add, before when we had the talk, unfortunately it was in German so you couldn't
1:10:16.303,1:10:19.275
understand a lot.
1:10:19.275,1:10:22.572
I didn't understand it but I could tell that it was occurring at a very high level of technical competence
1:10:22.572,1:10:25.730
and that there was a lot of good information there.
1:10:25.730,1:10:28.702
And I'm really hoping that you'll take the video of it and put it up on universalsubtitles.org, or some
1:10:28.702,1:10:33.183
other service where people can subtitle it. And hopefully there'll be an English version and I'll get
1:10:33.183,1:10:35.877
to see it. I think there was a lot of really good information in there.
1:10:35.877,1:10:38.269
What's universalsubtitles.org?
1:10:38.269,1:10:46.349
Universalsubtitles.org is a great website. It's kind of like, you put a video up, and anyone can
1:10:46.349,1:10:49.020
add subtitles to as much or as little as they want.
1:10:49.020,1:10:53.780
And then other people can change the subtitles, and you can do it in as many languages as you want.
1:10:53.780,1:10:59.213
So you don't have to ask someone for a favour, "hey, will you subtitle my video?"
1:10:59.213,1:11:03.068
that's 20 minutes long or an hour long. You tell a community of people "we need help subtitling",
1:11:03.068,1:11:08.547
and everyone goes and subtitles 3 minutes in their favourite languages.
1:11:08.547,1:11:15.421
It's a very effective way to crowdsource subtitling, and it's a very effective way to just share information.
1:11:15.421,1:11:20.947
We have a lot of videos with good information that are locked into languages that not everyone speaks.
1:11:20.947,1:11:22.712
So this is a way to get around that.
1:11:22.712,1:11:25.428
As FreedomBox, we use that project.
1:11:25.428,1:11:28.099
And I believe, if I'm not mistaken, I haven't looked in a while,
1:11:28.099,1:11:33.021
that it's all Free software that they are using. So you can download it and start your own if you want.
1:11:33.021,1:11:41.752
So back to my previous question - in the talk in the afternoon we heard about mesh networking
1:11:41.752,1:11:44.863
we talked about that, and it's actually not just being used in
1:11:44.863,1:11:46.814
emergency situations but people are really using it.
1:11:46.814,1:11:52.851
And especially, the philosophy that everyone becomes part of the net, not just as a consumer
1:11:52.851,1:11:58.633
but as a provider of part of the net, and it certainly is the case that they
1:11:58.633,1:12:01.187
can share data among each other, they don't necessarily need
1:12:01.187,1:12:03.416
to go out onto the internet.
1:12:03.416,1:12:07.155
So, I would imagine the FreedomBox, with mesh networking,
1:12:07.155,1:12:10.591
we could essentially create a large network of many many
1:12:10.591,1:12:12.379
people using it.
1:12:12.379,1:12:17.464
We also talked about the mesh networking like FunkFeuer in Graz or Vienna
1:12:17.464,1:12:21.156
but it would be interesting to get them on mobile devices,
1:12:21.156,1:12:23.269
so that you could walk through the street,
1:12:23.269,1:12:30.375
theoretically people have these devices, and you could walk
1:12:30.375,1:12:32.023
through and it would automatically mesh and connect you.
1:12:32.023,1:12:37.828
So FreedomBox if applied to that, you told me this interesting example, you could screw them to
1:12:37.828,1:12:41.660
light posts on the street, so maybe elaborate on that,
1:12:41.660,1:12:44.492
maybe it could have an effect and give a lot of coverage.
1:12:44.492,1:12:48.974
The reason why we currently envision mesh,
1:12:48.974,1:12:50.622
and no decisions have been made, right,
1:12:50.622,1:12:54.198
but just in the way we think about it when we talk to each other,
1:12:54.198,1:12:58.215
and the reason why we think mesh networking is not your daily
1:12:58.215,1:13:03.300
mode of use is that the performance degradation is not acceptable to most end-users.
1:13:03.300,1:13:06.296
If mesh networking reaches the point where it is acceptable
1:13:06.296,1:13:09.732
if you're in a place where there's enough nodes, and you
1:13:09.732,1:13:13.030
have a density that you can move around then sure, that
1:13:13.030,1:13:15.839
can make a lot of sense. But for a lot of people who
1:13:15.839,1:13:19.253
exist as a person not near a lot of FreedomBoxes, they're
1:13:19.253,1:13:21.667
going to need the regular internet.
1:13:21.667,1:13:26.102
So yeah, we think mesh will be great where you have that
1:13:26.102,1:13:29.098
density, when the mesh technology is mature.
1:13:29.098,1:13:33.835
When that happens, we could have the most easy access
1:13:33.835,1:13:38.456
to municipal wifi by using the power in all the street
1:13:38.456,1:13:43.378
lights. Put a FreedomBox up in the top of every street lamp.
1:13:43.378,1:13:47.860
Unscrew the light bulb, screw in the FreedomBox, and screw the light bulb back on top.
1:13:47.860,1:13:51.134
So you still get light, we're not going to plunge you into darkness.
1:13:51.134,1:13:56.358
You still get light, but then you have a mesh node. Right there.
1:13:56.358,1:14:00.700
And you could do every 3rd or 4th street light down town, and you could cover
1:14:00.700,1:14:02.790
an area rather effectively.
1:14:02.790,1:14:07.109
It is a way to get simple municipal wifi without running
1:14:07.109,1:14:10.220
any fibre. And every time you have fibre you can link to it.
1:14:10.220,1:14:13.796
Like any time you're near fibre you can link to it and you'll
1:14:13.796,1:14:18.858
get your information out of that little mesh and into the regular network.
1:14:18.858,1:14:23.943
We could have municipal wifi with much lower infrastructure costs than most people currently think of
1:14:23.943,1:14:28.866
when they think of municipal wifi. And we can do it through mesh nodes.
1:14:28.866,1:14:33.951
And if we did it through mesh nodes we would be providing that service not only to people who have
1:14:33.951,1:14:38.572
FreedomBoxes, that just looks like wifi, it just looks like a regular connection.
1:14:38.572,1:14:45.584
You might need to do some fancy hopping, but it's not...
1:14:45.584,1:14:51.111
the mesh boxes themselves will do the fancy hopping, your phone itself won't have to do it.
1:14:51.111,1:14:54.083
While we are talking about phones,
1:14:54.083,1:14:59.006
I want to say that I'm not sure how phones fit into the FreedomBox.
1:14:59.006,1:15:02.419
I'm pretty sure there is a way that phones fit into FreedomBoxes,
1:15:02.419,1:15:05.855
but you can't trust your phone.
1:15:05.855,1:15:09.455
With these so-called smartphones, it's not actually a phone but a little computer, no?
1:15:09.455,1:15:12.450
Yes, your phone, a smartphone is a little computer but
1:15:12.450,1:15:16.467
it's not a computer that you can trust, because
1:15:16.467,1:15:20.623
even if you replace the software on your phone,
1:15:20.623,1:15:26.893
with Free software, it's almost impossible to actually replace all the binary drivers,
1:15:26.893,1:15:29.726
it's almost impossible to go all the way down to the metal.
1:15:29.726,1:15:31.815
It's very hard to get a phone that is completely trustworthy
1:15:31.815,1:15:35.089
all the way down to the bottom of the stack.
1:15:35.089,1:15:37.202
So that's a problem we haven't quite figured out how to solve.
1:15:37.202,1:15:42.380
And pretty soon it's going to be impossible to put Free software on phones.
1:15:42.380,1:15:47.698
The days of jailbreaking your iPhone and rooting your Android phone might
1:15:47.698,1:15:55.012
very well come to an end. There is a proposal right now called UEFI.
1:15:55.012,1:16:01.026
It's a standard; we currently use EFI, and this would be UEFI,
1:16:01.026,1:16:03.534
the Unified Extensible Firmware Interface. It's a new thing.
1:16:03.534,1:16:08.247
And what this proposal is, is that before your computer,
1:16:08.247,1:16:14.308
before the BIOS will load a bootloader on your computer
1:16:14.308,1:16:17.860
that bootloader has
1:16:17.860,1:16:20.113
to authenticate to the BIOS. It has to be signed by someone
1:16:20.113,1:16:23.108
the BIOS trusts, someone the BIOS manufacturer trusts.
1:16:23.108,1:16:25.779
And the person who puts the BIOS in your phone can decide who it trusts,
1:16:25.779,1:16:29.494
and they can decide they don't trust anyone except themselves.
1:16:29.494,1:16:36.622
If Apple sells you an iPhone with a BIOS that requires a
1:16:36.622,1:16:39.734
signed operating system, it might be very hard for you to
1:16:39.734,1:16:43.170
get another version of the operating system on there.
1:16:43.170,1:16:49.997
The proposals for this stuff are really in the realm of laptops and computers, that's where it's starting,
1:16:49.997,1:16:53.155
but believe me, technology spreads.
1:16:53.155,1:16:58.983
And if you want to be able to put Linux on a computer that you buy, on a laptop you buy,
1:16:58.983,1:17:03.464
very soon you might have a very difficult time doing that.
1:17:03.464,1:17:05.252
The standard is there, but the companies paying attention to it
1:17:05.252,1:17:08.387
are not doing so for our purposes.
1:17:08.387,1:17:12.567
They want to make sure that they can control what is on your computer.
1:17:12.567,1:17:17.605
So this is, you know, another political fight that we're going to engage in,
1:17:17.605,1:17:20.136
not the FreedomBox, but the community.
1:17:20.136,1:17:25.523
We're going to have to have this fight. UEFI. Look it up.
1:17:25.523,1:17:32.536
Start thinking about it. This is going to be a big piece of the puzzle for freedom in computing over
1:17:32.536,1:17:34.184
the next few years.
1:17:34.184,1:17:38.945
We're going to have some problems and we're going to have to find some solutions.
1:17:38.945,1:17:44.750
But wouldn't such an initiative, wouldn't that create a good market for companies who actually
1:17:44.750,1:17:49.603
would supply Linux on such devices, in the phone and laptop markets?
1:17:49.603,1:17:53.155
I'm sure there are companies supplying that.
1:17:53.155,1:17:54.664
Absolutely.
1:17:54.664,1:17:58.217
And if the market in freedom were good enough to support
1:17:58.217,1:18:02.699
large-scale manufacturing and all that other stuff then we might get that.
1:18:02.699,1:18:05.322
And we might get that anyway.
1:18:05.322,1:18:07.134
I mean, the standard will include as many keys as you want,
1:18:07.134,1:18:08.643
so we might get the freedom.
1:18:08.643,1:18:12.660
But the manufacturers will have a really convenient way to turn the freedom off.
1:18:12.660,1:18:16.700
I think there will be a lot of boxes where you will have freedom.
1:18:16.700,1:18:21.623
But there will also be a lot where right now we think we can get Free software onto it,
1:18:21.623,1:18:24.015
where we won't be able to anymore.
1:18:24.015,1:18:25.965
It's going to be a narrowing of the market.
1:18:25.965,1:18:28.937
I don't think our freedom is going to completely disappear from devices.
1:18:28.937,1:18:33.117
But a lot of devices, if you buy the device without thinking about freedom, assuming you can have it,
1:18:33.117,1:18:37.575
you might get it home and discover that you can't.
1:18:37.575,1:18:45.261
Ok, we want to give the floor again to the audience for more questions or statements.
1:18:45.261,1:18:52.087
Ok, there in the back, one more.
1:18:52.087,1:18:54.781
Yeah, one more time, so...
1:18:54.781,1:19:01.492
Nowadays, when you can hardly protect your PC, laptop, whatever, against malware...
1:19:01.492,1:19:16.283
Isn't it really a red carpet for hackers? If you have social networks and circles of friends,
1:19:16.283,1:19:21.925
one gets some malware on his PC, mobile device, whatever,
1:19:21.925,1:19:26.685
has a FreedomBox, authenticates to his friends, and the state is assumed to be secure,
1:19:26.685,1:19:32.467
wouldn't that open doors?
1:19:32.467,1:19:37.204
Sure, well, the human error is not one we can control for.
1:19:37.204,1:19:45.122
But someone who has a key that you trust is not necessarily someone who you let run arbitrary code
1:19:45.122,1:19:48.071
on your FreedomBox.
1:19:48.071,1:19:52.715
You might trust them to the point of having message passing with them, and trusting who they are
1:19:52.715,1:19:56.244
and what they say, but you don't necessarily trust the technology that they have and the
1:19:56.244,1:19:58.961
code that they have to be free of malware.
1:19:58.961,1:20:00.865
You'll still have to do all the things you currently do.
1:20:00.865,1:20:04.139
Right now if somebody sends you a file, it could have malware in it.
1:20:04.139,1:20:08.017
We're not making that easier, or better, or more likely to happen.
1:20:08.017,1:20:15.006
I think what we are doing is completely orthogonal to that problem.
1:20:15.006,1:20:19.441
At the same time, if we were to have email services on the box,
1:20:19.441,1:20:23.156
and you know we're not quite sure what the email story of a box like this looks like,
1:20:23.156,1:20:26.732
we probably would want to include some sort of virus scanning or spam catching,
1:20:26.732,1:20:31.747
all the usual filtering tools to give you whatever measure of protection might currently exist.
1:20:31.747,1:20:35.045
But the fact that someone has a key and you know who they are,
1:20:35.045,1:20:39.085
I don't think that will ever be the security hole.
1:20:39.085,1:20:42.220
Or at least we really hope we can make it so it's not.
1:20:42.220,1:20:48.930
If we fail in that then we've missed a trick.
1:20:48.930,1:20:53.690
Ok, any more statements or questions?
1:20:53.690,1:20:56.964
Ok, so, James, my last question would be...
1:20:56.964,1:20:59.240
You can actually buy the box right now?
1:20:59.240,1:21:00.424
Yes.
1:21:00.424,1:21:01.608
From a company?
1:21:01.608,1:21:02.955
Yes.
1:21:02.955,1:21:05.950
Maybe you can supply that information. But the software is being developed?
1:21:05.950,1:21:07.297
Yes.
1:21:07.297,1:21:11.895
Can you give an estimation about the timeline of your project, or the next milestones?
1:21:11.895,1:21:13.102
Sure.
1:21:13.102,1:21:16.957
So, the boxes are manufactured by a company called Globalscale,
1:21:16.957,1:21:18.582
they're about $140.
1:21:18.582,1:21:24.225
There is a slightly older model called the SheevaPlug that is about $90.
1:21:24.225,1:21:28.102
It does pretty much everything the DreamPlug does.
1:21:28.102,1:21:31.818
It has some heat sinking issues, but it's a pretty good box as well,
1:21:31.818,1:21:38.969
so if the price point matters to you, you can get last year's model and it'll serve you just fine.
1:21:38.969,1:21:43.010
The software, right now we have a bare Linux distribution.
1:21:43.010,1:21:45.842
We spent a lot of time getting the binary blobs out of the kernel
1:21:45.842,1:21:50.324
and making it installable onto this hardware target.
1:21:50.324,1:21:54.805
We have a Jabber server, Prosody, that we are modifying to suit our needs.
1:21:54.805,1:22:00.796
And that should be ready, time-frame, weeks.
1:22:00.796,1:22:03.745
Some short number of weeks.
1:22:03.745,1:22:09.643
The Privoxy server, the SSH forwarding, some short number of months.
1:22:09.643,1:22:16.864
But our roadmap for the short-term future is Jabber, SSH forwarding, and browser proxying.
1:22:16.864,1:22:22.785
We are also working on the interface, so we're going to have an interface that you can actually
1:22:22.785,1:22:24.736
control some of these services with.
1:22:24.736,1:22:28.172
And the first thing we're doing with that interface is probably allowing you to
1:22:28.172,1:22:30.843
configure this box as a wireless router.
1:22:30.843,1:22:35.626
So it can become your wireless access point if you want it to be.
1:22:35.626,1:22:38.180
And your gateway of course.
1:22:38.180,1:22:39.945
So user interface in one vertical,
1:22:39.945,1:22:44.148
SSH forwarding, browser proxying a little bit out there,
1:22:44.148,1:22:47.584
a little bit closer: Jabber, XMPP secure chat.
1:22:47.584,1:22:52.646
And once we have that stack, we believe that we're going to build upwards from XMPP towards
1:22:52.646,1:22:55.665
perhaps something like BuddyCloud.
1:22:55.665,1:22:58.776
We're seriously looking at BuddyCloud and seeing what problems it solves for us
1:22:58.776,1:23:05.580
in terms of actually letting users group themselves in ways that they can then do access control
1:23:05.580,1:23:08.691
and channels and things of that nature.
1:23:08.691,1:23:13.892
And are you actually in contact with the hardware company producing the servers?
1:23:13.892,1:23:19.419
Yeah, we've had a number of conversations with them.
1:23:19.419,1:23:22.089
They've agreed that when our code is ready this is something
1:23:22.089,1:23:24.504
they are very interested in distributing.
1:23:24.504,1:23:26.733
More importantly we've had a lot of conversations with
1:23:26.733,1:23:28.823
them about freedom.
1:23:28.823,1:23:31.215
About why we do what we do, the way we do it.
1:23:31.215,1:23:35.417
And how they need to act if they want to distribute code for
1:23:35.417,1:23:37.484
us and work with our community.
1:23:37.484,1:23:39.156
And what that means is we're teaching them how to comply
1:23:39.156,1:23:41.826
with the GPL, and we're teaching them how to remove the binary drivers,
1:23:41.826,1:23:45.704
and in fact we're doing some of that for them.
1:23:45.704,1:23:47.492
But they're Chinese, right?
1:23:47.492,1:23:49.140
No. No, Globalscale is not a Chinese company.
1:23:49.140,1:23:53.622
Their manufacturing is in China, but they're not a Chinese company.
1:23:53.622,1:23:58.219
And we're also talking to Marvell. Marvell makes the system-on-a-chip that goes onto the boards
1:23:58.219,1:24:00.843
that Globalscale is integrating into their boxes.
1:24:00.843,1:24:05.905
But we're also talking to Marvell about what they can do to better serve the needs of our community.
1:24:05.905,1:24:13.010
So a large part of our efforts is to try to convince manufacturers to make
1:24:13.010,1:24:14.961
hardware that suits our needs.
1:24:14.961,1:24:16.888
This box is a thing that they developed, they invented,
1:24:16.888,1:24:18.537
before they ever met us, before they ever heard of us.
1:24:18.537,1:24:23.622
And if we can get them enough business,
1:24:23.622,1:24:27.360
if by making FreedomBoxes and by putting our software on the box,
1:24:27.360,1:24:30.774
that enables them to sell more boxes, they will be very happy
1:24:30.774,1:24:34.489
and when they design the next generation,
1:24:34.489,1:24:39.412
not the next generation of the DreamPlug, but the next generation after whatever they're designing now,
1:24:39.412,1:24:41.617
so we're talking a couple of years from now.
1:24:41.617,1:24:44.706
We can say to them, look, you're selling a lot of boxes
1:24:44.706,1:24:48.723
because you're making a thing that serves the free world very well.
1:24:48.723,1:24:52.275
Remove the 1/8-inch audio jack because our people don't need it.
1:24:52.275,1:24:55.549
Add a second wifi radio. Put antenna ports on it.
1:24:55.549,1:25:00.286
This box can go from something that looks really good for our purpose to
1:25:00.286,1:25:02.376
being something that looks amazingly good for our purpose.
1:25:02.376,1:25:05.209
And that will require scale.
1:25:05.209,1:25:07.438
And what that means is that the FreedomBox becomes a wedge for
1:25:07.438,1:25:13.382
making better hardware for everyone.
1:25:13.382,1:25:16.331
But it's not just the FreedomBox. The Tor router project is
1:25:16.331,1:25:21.370
also focused on the DreamPlug. They've also decided this is a good box for their purpose.
1:25:21.370,1:25:26.246
If you are making a box that is kind of like a FreedomBox but isn't the FreedomBox because
1:25:26.246,1:25:30.704
it's more specialised to what you want it for, think about
1:25:30.704,1:25:35.906
the DreamPlug as a hardware target. And let us know,
1:25:35.906,1:25:38.599
so that when we go to the company, we can say look,
1:25:38.599,1:25:42.454
look at all the business you are getting by being people that serve the Free world.
1:25:42.454,1:25:52.136
And then, hopefully, we can convince them to make boxes that better serve the Free world.
1:25:52.136,1:25:55.434
And that's not a fantasy. We are having those conversations with them,
1:25:55.434,1:25:57.825
and they are very receptive.
1:25:57.825,1:26:00.171
So I am pretty happy about that aspect of what we do.
1:26:00.171,1:26:02.864
And my last question would be...
1:26:02.864,1:26:05.395
since everything is turning mobile now,
1:26:05.395,1:26:07.183
it's like we have these computers with an extra phone...
1:26:07.183,1:26:08.646
the phone is a small application on these devices.
1:26:08.646,1:26:13.243
Is there any plan or any idea or any project to say like, have
1:26:13.243,1:26:18.259
a FreedomPhone or Free mobile device?
1:26:18.259,1:26:23.019
So the way you connect to this box is kind of how you connect to your router,
1:26:23.019,1:26:24.644
port 80, browser.
1:26:24.644,1:26:28.545
But another way you could do it would be an app on your cellphone that bluetooths to the box.
1:26:28.545,1:26:33.607
I don't actually think the box has bluetooth, but you know,
1:26:33.607,1:26:36.324
an app on your cellphone that talks to the box over the network, say.
1:26:36.324,1:26:38.228
That's possible, we're thinking about that.
1:26:38.228,1:26:41.223
We're thinking about what that looks like for the large population
1:26:41.223,1:26:43.569
that exists out there that doesn't have computers.
1:26:43.569,1:26:46.843
There's an awful lot of people that only have cellphones, they don't have computers.
1:26:46.843,1:26:49.095
And we want them to have freedom too.
1:26:49.095,1:26:50.883
So figuring out how we can use a cellphone to talk to the box is a future problem.
1:26:50.883,1:26:51.765
We're not working on it right now, but we're certainly talking
1:26:51.765,1:26:57.292
about where it fits into the roadmap.
1:26:57.292,1:27:01.262
And that's why we are concerned about whether or not you
1:27:01.262,1:27:05.233
can trust your phone.
1:27:05.233,1:27:07.299
Because if you can trust your FreedomBox, but not the
1:27:07.299,1:27:09.668
thing you use to access it then you don't really have the privacy you think you have.
1:27:09.668,1:27:12.663
So figuring out whether you can trust your cellphone is a big part of the puzzle.
1:27:12.663,1:27:17.725
It's a big thing that we don't know how to do yet.
1:27:17.725,1:27:21.464
So let me make a little advertisement for another interesting project,
1:27:21.464,1:27:24.738
there is a Spanish development, I think it is also produced in China,
1:27:24.738,1:27:26.827
but it's called Geeksphone.
1:27:26.827,1:27:30.705
And they have a compatible Android installation by default,
1:27:30.705,1:27:34.142
and they probably have a similar philosophy of keeping the hardware open.
1:27:34.142,1:27:36.673
So maybe there is a new cooperation on the horizon.
1:27:36.673,1:27:40.945
Oh yeah, we love projects like that.
1:27:40.945,1:27:41.445
I don't know a lot about their project, but I have heard of it
1:27:41.445,1:27:44.057
and it is on my list of things to look into.
1:27:44.057,1:27:47.609
I would love to see that succeed, that would be excellent.
1:27:47.609,1:27:50.303
Well James, thank you for your presentation.
1:27:50.303,1:27:54.761
I think it was really interesting. And thank you for coming.
1:27:54.761,1:27:57.849
James will be back on this stage at 7pm when we have our final discussion on the 20 years of
1:27:57.849,1:28:03.492
the World Wide Web.
1:28:03.492,1:28:05.001
Thank you James for coming.
1:28:05.001,1:28:12.838
APPLAUSE