It's conventional to give an introduction at a moment like this, and there are of course two things that make doing that difficult, if not pointless. One is that everyone already knows the speaker; the other is that his family is here, which means of course introductions must be handled with particular care.

I've been trying to talk about Snowden and the future at this law school this fall, and I was hampered by two things: I hadn't read the documents, and I wasn't a cryptographic expert. Both of those problems have been solved, because I'm not going to be doing the talking this evening. Bruce Schneier is, I think it's fair to say, the world's most important cryptographer and public intellectual, most wonderful cryptographers being more introverted and less linguistically capable in high rhetorical form. So that's why he doesn't need any introduction. But I should say, as another alumnus of the Hunter College Elementary School system, that Bruce is a graduate of Hunter High School in New York and of the University of Rochester and American University, and holds honorary doctorates arising from having done good work for the human race, which most people here know all about. Applied Cryptography is still a really good place to begin if you want to understand why you can trust the math.
The array of articles, interviews, and books on security, trust, and modern technology is, for people like me who try to follow along doing the law of this, not just an inspiration but a godsend. It's my great pleasure to have Bruce here at Columbia today, because he knows what the rest of all those documents say, which means he knows a great deal about how Snowden and the future is really going to turn out. I hope in conversation this evening we can hit some of the geeky high spots of all of that. So, Bruce, welcome to Columbia Law School, and thank you for being here.

Maybe a good place to begin would be to say whatever you can say about how you came to be involved with Glenn Greenwald and the project of publication of Mr. Snowden's disclosures.

Well, the one-sentence answer is: I was asked. Greenwald had in his possession all these documents. They are very technical, very jargon-filled, and he needed an expert in the material to help understand them. My name came up again and again until he called me. Stuff happens, and I go down to Rio, and it's kind of a surreal experience to be handed reams of Top Secret classified material and told, "Hey, read this and tell me what you think." But that's what I did. We worked on several stories, and the one story we published before Greenwald severed his relationship with The Guardian was about Tor, the anonymity service.
It's a good story. It talks about how Tor is secure, how the NSA does go after Tor users, what mechanisms they're using, how they are attacking users on the Internet: getting data, breaking anonymity, breaking into computers. The story published in early October, and I think it was about two weeks later that Greenwald broke with The Guardian. Presumably, when a new venture gets started up, I will be back doing stories. There is a lot more to tell. Until then, you are in the very capable hands of Bart Gellman and Ashkan Soltani, who are writing for The Washington Post and doing a great job.

What do you think we should do with the fact that you probably weren't terribly surprised by what you read? There's a movement around the world that says, "Well, nobody is really surprised, because everybody knows it's going on, therefore there's nothing we need to do about it," and I find myself confronting that knowing that the first part of the syllogism is, in fact, correct. We're not terribly surprised. Why are we not terribly surprised?

You know, I think we're both surprised and not surprised. It's really interesting: if you've ever watched any movie with an NSA villain, this is exactly the sort of thing they would do. There's nothing in the revelations... I mean, sometimes they are a bit extreme, spying on gaming worlds, but if you thought about it for half a minute, you would say, "Well, of course you would; that's a place to communicate."
If the goal is to eavesdrop on all communications, you're going to eavesdrop on that channel just like you'd eavesdrop on a little chat window in a Scrabble game, as well as Skype. So, there is no surprise. But the details, the extent: that, I think, really is a surprise. We kind of knew it, but we never actually fully thought about it. We never did the math. We never worked out the budgets. And we were starting to, because we were seeing the Utah facility come up, and people were looking at the square footage, the power, how many servers are there, what could be stored there, but still you're just guessing. Seeing it for real is surprising just because it's there. You might know it's there, but seeing it... The analogy I've been using, a crummy analogy but the best one I've got, is that it's kind of like death. You all know death is coming; it's not a surprise; the story always ends this way; yet every time it happens, it is a surprise, because you basically never really think about it. I think surveillance was like that; we never thought about it.

There are professionals in the world of cybersecurity who have thought about it. They are more surprised, I think, than you and me, because they trusted more in what the listeners told them. The gaps that have opened up, in the documents you've read, between what is actually going on and what people were assured was going on must seem fairly large. We know that they promised the financial community that they weren't going to break financial crypto.
We know they made all sorts of promises about how minimization worked in the United States. In that sense, is it true that those of us at the "cypherpunk" edge of the world are more surprised, because they trusted the listeners more?

I don't know. My guess is, if you're right, you're surprised even more, because, my god, it is actually that bad. You know, around the edges... you sort of have a bell curve of beliefs about how bad this was. Now we're seeing that even the more extreme beliefs about how much surveillance is going on are true, and actually conservative. We're seeing a surprising number of alliances, where it's very common for the NSA to spy on country A, and then partner with country A to spy on country B, and then partner with country B to spy on somebody else. We're seeing so many webs of that. Germany, which is one of the NSA's most trusted partners, is being spied on. The only thing we haven't seen, and I can't wait, because it will be extraordinarily big, is the UK and the US spying on each other. I think the odds of that not being true are very small. Because why in the world, if you as the NSA are spying on your own country, why wouldn't you spy on the country of your closest ally? You're spying on everyone else. There's just no exclusion.

What do you think is the biggest headline for you, as a technical thinker, that's come out of the documents you've seen?

I think the most important headline is that crypto works.
We expected more cryptanalysis; we expected more "the NSA can break this code and that code and that code." We know they're spending an enormous amount of money on this. But again, we learned from the documents that cryptography works. That's the lesson of Tor: the NSA can't break Tor, and that pisses them off. That's the lesson of the NSA eavesdropping on your buddy lists and address books from the connections between your browser, Gmail, and your ISP. You look at the data, and they get about ten times the amount of data from Yahoo users as from Google users, even though Google is so many times larger than Yahoo. It's because Google uses SSL client-side by default. So, SSL works. And there's another slide, from a program called MUSCULAR, which is an NSA program for getting data from Google's data centers and the connections between them, where they specifically point out "this is the place where SSL is removed." So we see, again and again, that cryptography is not much of a barrier in practice, but they aren't breaking it by breaking the math. They're breaking it by cheating: by going after the implementation, by stealing keys, by forging certificates, by doing all the non-cryptography things that we all know are important, but where you don't get that mathematical benefit.

Just to take the paranoid side of this for a moment: how much of this do you think could turn out to be cognitive bias in our collecting? Did Mr. Snowden miss the whole other trove in which the documents about crypto-breaking are?

That's certainly possible.
We don't know a lot about how he collected and what he collected, but it's certainly possible. There may be documents about cryptanalysis that are completely separate, on separate networks, and he didn't have access to them. But what we are seeing is operational stuff. You'll see in the documents on BULLRUN, which is their program to subvert cryptography, it'll say in several places, "Don't speculate on how this works." They're talking to the analysts, the people doing the intelligence: "You want to break into these circuits? We will do that for you. Don't speculate on how we're doing it. Just accept this windfall of data and be happy." But we do see, again and again, crypto stymieing them. The Tor story is really important. They [the NSA] have seminars and workshops on "how do we break Tor?" Again and again you see that they are unable to, that the cryptography is working, and that when they have breaks they're getting in around the edges: attacking the user, trying to go after correlations. There is something in the black budget, the first pages of which we've seen (I think The Washington Post published those): a narrative by James Clapper, talking about what the NSA is doing. There is one sentence where he talks about cryptography. I'm going to read the sentence, because I think the wording is interesting. He says, "We are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit Internet traffic." So, that sentence doesn't sound like "we got a bunch of smart people in a room and hope they get lucky."
That sounds like "we've got a piece of math, but we need a bunch of engineering to make it work. We need a really big computer, a big database, something that requires time and budget. Not new math." So there's speculation that there is at least one piece of cryptographic mathematics that they have and we don't. Which is perfectly reasonable. They tend to hire the top ten percent of mathematicians every year. They go into the agency; they never come out. They get everything we do; we get nothing they do. There is going to be this differential in knowledge, which will expand over the years. Before this, we all would say, "We think they're ten years ahead of the state of the art," and we pulled that number out of the air. So there are speculations about what the NSA has. And these are just speculations; we know nothing. I'll give three. The first is something about elliptic curves. Elliptic curves are a very complex area of mathematics used in public-key cryptography, and it is perfectly reasonable to believe that the NSA has some ability either to break elliptic curves to a greater extent than we can, or to break certain classes of curves that we would not be able to recognize. We know that the agency has tried to affect curve selection, which implies that there are some classes of curves they have an advantage over that we don't know about. The second reasonable assumption is general factoring.
If you look at the academic world, factoring gets better slowly every year: a factor of 2 here, a factor of 10 there, a factor of 100 in a really good year. Every year it progresses. If you assume the NSA is ten years ahead of the state of the art, you can do some math and assume they are at some higher point than we are. And that would be the sort of thing that would make sense of Clapper's quote. The third speculation is the RC4 algorithm, a symmetric cipher that has been teetering on the edge of "we can break it in academia" for quite a while. It was invented by Ron Rivest, who is actually the master of the too-good-to-be-true cipher: you can't imagine it being secure, yet you can't break it. And maybe it's holding up, but maybe they have something. Those are my guesses, but there is going to be at least one piece of math, and it will be extraordinarily hidden. Just like the names of the companies cooperating in BULLRUN, this stuff is ECI, exceptionally controlled information: pretty much never written down. Now, sometimes you get lucky. And Snowden, because of his position, did get lucky. I remember, in some of the early weeks of this, some congressman said something like, "Snowden couldn't have this data. You have to be on a list to get this data. He wasn't on the list!" I'm listening to the person and I'm thinking, "No, you don't understand: here's the man who typed the list into the computer. Don't you understand what root access means?" Who has the most access to the secrets in a company?
It's the janitorial staff, because they get access to everything. The plumbers, the people who are doing the infrastructure, have enormous access, and that seems to be what happened.

Let's talk a little bit about the efforts against math. Standards corruption, for example. We have one clearly documented case of the NSA taking over a standards-making process and choosing in the end, for NIST, a random number generator which was not exactly a random number generator... How should we think about that process, of not attacking the basic mathematics but attacking how that math is applied in the standards process to produce weaknesses? How far should we take that?

That's certainly worrying, something we're looking at over and over again. The standard in question is a random-number-generator standard, and if you are going to put a backdoor into a system, hacking a random number generator is a perfect place to do it. You can do it imperceptibly. It doesn't visibly affect the output at all. It's a really good place to put a secret backdoor. This is a case, again, where we knew. It was '06 or '07 when I wrote an essay saying, "Don't trust this random number generator, because there could conceivably be a backdoor, and here's how you'd put it in." Again, there's no big surprise. It's one of four random number generators in the standard. The other three we think are good.
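To see why a random number generator is the perfect place for a secret backdoor, here is a toy sketch of the trapdoor structure behind the Dual_EC_DRBG concerns. The real generator works on elliptic-curve points and truncates its output; this sketch uses the multiplicative group mod a small prime and skips truncation so the algebra is easy to follow. Every constant below is made up for illustration, and none of this is the real standard's parameters.

```python
# Toy model of a Dual_EC_DRBG-style backdoor. The real design uses two
# elliptic-curve points P and Q; if whoever chose the constants knows d
# with Q = d*P, one raw output reveals the generator's next state.
# Here g stands in for P and h for Q, in the subgroup of squares mod p.

p = 1019                 # safe prime: p = 2*q + 1 (illustrative)
q = 509                  # order of the subgroup generated by g
g = 4                    # public "point P" analogue

d = 123                  # the constant-chooser's secret
h = pow(g, d, p)         # public "point Q" analogue, Q = d*P
e = pow(d, -1, q)        # trapdoor exponent: pow(h, e, p) == g

def prng_step(state):
    """One step of the toy generator: returns (next_state, output)."""
    r = pow(g, state, p)        # analogue of r = x(s*P)
    output = pow(h, r, p)       # analogue of published output x(r*Q)
    next_state = pow(g, r, p)   # analogue of s' = x(r*P)
    return next_state, output

# A victim draws two "random" values from a secret seed.
state = 42
state, out1 = prng_step(state)
state, out2 = prng_step(state)

# The attacker sees only out1 but holds the trapdoor e:
#   out1^e = (h^r)^e = (h^e)^r = g^r, which IS the victim's next state.
recovered_state = pow(out1, e, p)
_, predicted_out2 = prng_step(recovered_state)
assert predicted_out2 == out2   # every future output is now predictable
```

The output passes statistical tests, so the backdoor is invisible to anyone who doesn't know d; that is exactly what makes a rigged generator such an attractive target.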
But this one entered the standard, and then it started being requested by governments, by US government contracts, and it ended up as the default random number generator in some libraries, and no one is really sure how. So this is an example of how a hacked standard can slowly infiltrate the systems we're using. So now we're starting to look at everything else, and NIST, which is the government entity doing these standards, and which pretty much has been trusted, is coming under a lot of scrutiny. And they, I think rightfully, are very angry at the NSA for ruining their credibility. I think a lot of the standards are still good. AES is still strong. We have a new hash function standard that I'm happy with. These were semi-public processes. These were not NSA-produced algorithms that became standards. These were open calls; we in the community would look at them; there was rough consensus; and NIST picked one. It's possible that they're hacked, but I think it's really unlikely. I would look more at implementations. We know that cellphone encryption, and this is not just the NSA: here you have an international cellphone standard, and you've got three dozen countries that want to eavesdrop, so all together there's pressure not to have real security in your cell networks. Those weaknesses are more overt; we know about them. I worry more about the private standards than the public standards. We have one example of a backdoor that someone tried to slip into Linux, which we are almost positive was enemy action.
We don't know who the enemy was in that case, it could be anybody, and we found it because we, I think, got lucky. So certainly it is possible to slip backdoors into commercial software. I worry less about the standards and more about the private stuff we can't see.

Do you think we're going to have to consider the possibility that the whole standardized family of ECC curves is one we should abandon?

I don't. I mean, there are curves that came from academia, curves that came from public processes. Those are ones we can trust more. I would love to up our key lengths, for extra conservativeness, just because I'm now more leery, especially with elliptic curves. I think we just have to look at pedigree, and we have to mistrust things that can be tampered with. We know that the NSA has influenced curve selection, but we don't know how. We don't know if it's the NSA going to, I'm making this up, an engineer at some company and saying, "Here are some curves, why don't you suggest these?" We don't know if there's some vetting... we don't know what has happened. But I would like curves to be generated in open, public ways. I think, for the world, if we're going to trust them as a global community, we have to do that. We end up talking about the NSA, but this is not about the NSA. This is what any large nation-state would do. Snowden has given us some phenomenal insight into the NSA's activities in particular, but we know China uses a lot of these same techniques. We know other countries do.
This is going to be what cybercrime looks like in three to five years, because technology democratizes. We really need to get security for everyone against everything.

So, let's follow that up a little bit. Most of the Snowden documents that we have seen publicly released so far have talked primarily about passive listening activity of one sort or another: hack, tap, steal, with respect to the backbones and the telecom networks. But we haven't had much information about listener activity directed at subverting the security of individuals or businesses. With the cookie-based material that started coming out 36 hours ago, things have begun to change a little bit. Have you seen things you can talk about that relate to how the American military listeners, or others, are directly subverting the security of individuals' computers?

So the first story on that was the Tor story from early October. There, I wrote about two different programs that the NSA uses for active attack. And there's been a great article in Foreign Policy on TAO, Tailored Access Operations; these are basically the NSA's black-bag teams. So we have seen these stories over the last several months. The two things I wrote about back in early October were QUANTUM and FOXACID. QUANTUM is an add-on to their eavesdropping platforms. The NSA has large eavesdropping platforms on Internet trunks, and these have names like TUMULT, TURBULENCE, and TURMOIL. I'm not quite sure how those three relate to each other. They all begin with "TU," so they seem to be a family.
One is a superset of another. TURMOIL seems to be the latest generation. TURMOIL is the device that gets the "firehose" and quickly decides what needs further analysis, because the firehose is coming at you and you need to make very quick decisions about what to eavesdrop on. Sitting on TURMOIL is something called QUANTUM, and what QUANTUM does is give the NSA the ability to inject into the stream, to add bytes. This is what the NSA uses for things like packet injection, which injects packets into your data stream. Again, nothing new; this is how the Great Firewall of China works; this is a hacker tool. But if you're sitting on the backbone, if you're on the AT&T backbone, you can do some phenomenally interesting things with this. You can do DNS hacking; this is actually what China does for censorship. You can do frame injection, where you redirect users surreptitiously to other servers; I'll get back to that later. We knew from the Tor story that there's something called QUANTUMCOOKIE. We didn't really know how it worked until we got the cookie story a few days ago, but the slide said "force users to divulge their cookies." Think of this in terms of a Tor user, a user who is anonymous on the network because of Tor: if you could force that user to divulge his cookies, and you've got a database of whose cookies belong to whom, that de-anonymizes him. So now we're seeing how these things link together. And there are other QUANTUM programs.
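The injection trick described above comes down to TCP sequence-number arithmetic: an on-path injector that has seen the client's request can forge a "server" reply whose numbers line up with what the client expects, and win the race against the real server. Here is a minimal sketch of that arithmetic; the function name, field layout, and redirect target are all illustrative, not anything from the documents.

```python
# Sketch of the bookkeeping behind QUANTUM-style packet injection.
# An injector on the backbone sniffs a client's HTTP request and races
# the real server: the first reply whose TCP sequence/ack numbers match
# the client's expectations wins.

def forge_reply(client_seq, client_ack, request_len, redirect_url):
    """Build the fields of a spoofed server->client segment.

    client_seq/client_ack come from the sniffed request segment;
    request_len is the request's payload length in bytes."""
    body = (
        "HTTP/1.1 302 Found\r\n"
        f"Location: {redirect_url}\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )
    return {
        # Continue the server's byte stream exactly where the client
        # expects it to continue.
        "seq": client_ack,
        # Acknowledge every byte of the client's request.
        "ack": client_seq + request_len,
        "payload": body,
    }

# Example: a sniffed 200-byte GET with seq=1000, ack=55000. The URL is a
# stand-in for an exploit server in the FOXACID role.
spoofed = forge_reply(1000, 55000, 200, "https://exploit.example/")
assert spoofed["seq"] == 55000 and spoofed["ack"] == 1200
```

A 302 redirect like this is one way to "tip" a browser toward another server without the user noticing; the same sequence math underlies DNS and frame injection from a backbone vantage point.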
Nicholas Weaver, who's at UC Berkeley, knows nothing about the documents but has written some great essays on how this works, because this is, of course, how it would work: once you know about it, you start thinking about what you would do with it. So we do know quite a lot about QUANTUM, and I think that's important. We also know about a program called FOXACID. And by the way, the NSA has the coolest codenames in the world. If you're at lunch, you'd want to sit at the FOXACID table. Worst codename: EGOTISTICALGIRAFFE. Never want to sit with them. Ever. So FOXACID is the NSA's multifaceted hacking tool. If you think about their problem, about what they need to do: they need to turn, basically, people off the street into cyberwarriors, and the way they do that is not going to be through years of training; that's expensive. It's going to be through tools, procedural manuals, and automated or semi-automated ways to make hacking work. And FOXACID, if you know hacking tools, you know Metasploit? FOXACID is Metasploit with a budget. So this is the server that, when you visit it, and you can be forced to visit it in many ways, one of them is through QUANTUM; we think it's called a QUANTUM tip, or a QUANTUM insert; I mean, there are some codenames we don't understand. First the user is "tipped," tipped into FOXACID.
So you're visiting, and I'm making this up, Google, and the NSA sees you visit Google, and they do a frame injection, and then some invisible packets go to the FOXACID server, which recognizes who you are through whatever systems they have, and the server says, "OK, it's this person." They're going to know if this person is high-value versus low-value, a sophisticated user versus a naive user. And based on all these criteria, FOXACID will decide what exploit to serve. (I forget the codename of the basic exploit; it's a cool codename too. Damn it...) And then if that works, it's called a "shot": if that works, there's a series of other exploits that are run to figure out, "OK, I'm on this computer. Who is it? Where is it? What network is it on? What's it connected to? What's on it?" And then we know there are a lot of specialized attack tools. There's a document that the French press published, and I don't know if they meant to, but at the bottom of this document is a glossary of codenames and attack codes. There was a special attack code for figuring out the geographical location. There's a special attack code for jumping air gaps; that was interesting. There's a special attack code for various other things you might want to do. So, we have quite a bit. We've seen nothing so far from US Cyber Command. And we don't know, it's not public yet, whether the story is simply not yet out, or whether the US Cyber Command documents were separate enough that they're not in the trove.
So everything we're seeing is NSA and GCHQ, the British counterpart. We're not seeing US Cyber Command, which presumably does a lot more offensive operations; that's kind of their job. But the NSA does quite a lot, too. We know that TAO will go in and steal keys. If there's a circuit they want to eavesdrop on but can't break, they'll go in and steal the key.

Let's just hang out a moment in this question of injection attacks from the backbone. That tip that put somebody's browsing activity into a platform where they could try various exploits and see what's going on depended, in the example you gave, on a frame injection, which a browser could be smart enough to turn down altogether. If that browser were running NoScript, what would happen?

It depends. NoScript is a really good way to deal with some of these, but in our normal browsing there are often quite a lot of redirects that don't all involve scripts. It's not clear to me whether scripts are required for this attack; I think there are going to be some attacks where they are not. You read the NSA documentation, and they talk a lot about PSPs, personal security products, and these just piss them off ginormously. A lot of this revolves around a couple of things. There's also the fact that the Internet is very insecure out of the box, and there is this sort of background radiation of script kiddies attacking things all the time, so when you are attacked, it's the thirtieth time this millisecond; so what? That will give an agency like the NSA, or somebody else, an enormous amount of cover, because attacks happen so often.
220 00:30:39,033 --> 00:30:48,564 There are certainly a lot of things we can do to make this much harder. Encrypting the backbone would do an enormous amount of good. 221 00:30:48,564 --> 00:30:57,325 You can't do frame injection in an SSL connection, because you can't see the frames. You can do a DNS redirect, and there are other things you can do, 222 00:30:57,325 --> 00:31:08,690 but there are things you can't do. Using the privacy tools we have, I think, gives us an enormous benefit. The fact that Tor works-- that might be 223 00:31:08,690 --> 00:31:22,989 the biggest surprise we've seen so far-- that Tor does work. It's annoying to use, but it does work. That shows that a bunch of us can decide 224 00:31:22,989 --> 00:31:28,538 that we're going to build a privacy tool that will defeat major governments. That's kind of awesome. 225 00:31:28,538 --> 00:31:35,238 Kind of too awesome to be true. 226 00:31:35,238 --> 00:31:40,187 You know, everything I've read tells me the NSA cannot break Tor. I believe the NSA cannot break Tor. 227 00:31:40,187 --> 00:31:46,033 When all the dust settles, how much do you think they won't be able to break? Is Tor going to be the exception, or are we going to be sitting there 228 00:31:46,033 --> 00:31:48,866 saying, "the new GPG is also safe"? 229 00:31:48,866 --> 00:32:00,662 I think most of the public domain privacy tools are going to be safe, yes. I think GPG is going to be safe. I think OTR is going to be safe. 230 00:32:00,662 --> 00:32:13,856 I think that Tails is going to be safe. I do think that these systems-- because they were not-- you know, the NSA has a big lever when 231 00:32:13,856 --> 00:32:27,805 a tool is written closed-source by a for-profit corporation. There are levers they have there that they don't have in the open-source, international, altruistic community. 232 00:32:27,805 --> 00:32:40,656 And because these are generally written by crypto-paranoids, they're pretty well designed.
We make mistakes, but we find them and we correct them, 233 00:32:40,656 --> 00:32:52,685 and we're getting good at that. I think that the NSA is going after these tools, they're going after implementations. Everyone got their Microsoft 234 00:32:52,685 --> 00:33:01,406 Update patches two days ago, you installed them, did that put a backdoor into your system? You have no idea. I mean, we hope not, we think not, 235 00:33:01,406 --> 00:33:10,318 but we actually don't know. That's going to be a much more fruitful avenue of attack, and yes, you can actually break all of those tools that way. 236 00:33:10,318 --> 00:33:21,325 Auto-update is great, but auto-update requires trust. But I think that the math and the protocols are fundamentally secure. 237 00:33:21,325 --> 00:33:31,158 So-- I will admit that I say this as a free software advocate-- I think that what you just said is that without Freedom Zero there is no freedom: if you can't read it, you can't trust it. 238 00:33:31,158 --> 00:33:35,397 Is that where we're going to be when the dust settles? --I think it's where we always have been. 239 00:33:35,397 --> 00:33:45,314 --But people didn't believe it? --But we do believe it. We are all sitting here trusting the building codes at Columbia University. 240 00:33:45,314 --> 00:33:56,495 We don't think about "well, the roof could fall on our heads," but we are trusting. We're trusting the people around us, we're trusting all the tools we use, 241 00:33:56,495 --> 00:34:05,269 both tech and non-tech, and yes, we're trusting our hardware. I mean, a few days ago-- was it OpenBSD?-- they announced that they no longer 242 00:34:05,269 --> 00:34:15,204 trust the random number generator on the Intel chip. Not because we know it's broken, not because we have evidence that it's broken, 243 00:34:15,204 --> 00:34:24,243 but because we know that Intel is susceptible, and if they were told "break your random number generator or we're not buying your stuff anymore," what are they going to do?
244 00:34:24,243 --> 00:34:35,631 And a researcher, whose name I forget right now, showed a really clever way to put a backdoor in a random number generator on a silicon chip that we would never in a million years find. 245 00:34:35,631 --> 00:34:45,698 So we have a proof of concept that it's possible, we have a company that could be susceptible, and we have mathematical fixes to this. 246 00:34:45,698 --> 00:34:53,906 We can run the hardware outputs through an algorithm with some other input, and we know how to fix this. 247 00:34:53,906 --> 00:35:07,324 So, we either have to trust them, or we have to do things to ensure we're still secure if they're not trustworthy-- but then we're still trusting 248 00:35:07,324 --> 00:35:17,545 those tools that are now fixing this. So in the end, you have to trust everyone up the chain, from the hardware to the operating system to the software to the user, everything-- 249 00:35:17,545 --> 00:35:29,908 -- to the room you're sitting in, which could have various listening devices-- and that's never going to change. In any technological society, you cannot examine everything. 250 00:35:29,908 --> 00:35:42,071 You fundamentally must trust. This is why transparency of process is so important. We don't trust because we verify; we trust because we know someone else verified, 251 00:35:42,071 --> 00:35:46,961 or a few people who mutually don't like each other have verified, and that sort of mechanism. 252 00:35:46,961 --> 00:35:52,608 Both Republicans and Democrats are counting the votes, therefore... that sort of thing. 253 00:35:52,608 --> 00:36:00,908 But in that, we would then say that the way that reduces out is: use software over hardware where you can, 254 00:36:00,908 --> 00:36:05,189 and use software you can read over software that you can't. --Yes. 255 00:36:05,189 --> 00:36:12,911 And so, we are pushing ourselves towards openness or freedom, depending on which word we happen to be using.
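The mathematical fix Schneier points to -- running the hardware RNG outputs through an algorithm together with some other input -- can be sketched in a few lines. This is an illustrative sketch of the general idea, not any particular system's design: hashing two independent sources together means a backdoor in either one alone cannot control or predict the result.

```python
import hashlib
import os

def mixed_random_bytes(n, hardware_source, other_source):
    """Combine two entropy sources so that a backdoor in either one
    alone cannot control or predict the output."""
    a = hardware_source(n)  # e.g. an on-chip RNG we don't fully trust
    b = other_source(n)     # e.g. the OS entropy pool, timing jitter, etc.
    out = b""
    counter = 0
    while len(out) < n:
        # Hash both inputs plus a counter to stretch to the requested length.
        out += hashlib.sha256(a + b + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

# Even if one source is maximally broken (always returns zeros), the
# mixed output stays unpredictable as long as the other source is sound.
broken = lambda n: b"\x00" * n
sound = os.urandom
key = mixed_random_bytes(32, broken, sound)
```

This is also the "collusion of two" idea that comes up a moment later in the conversation: to predict the key, an attacker has to subvert both sources at once.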
256 00:36:12,911 --> 00:36:18,464 And we're basically saying that hardware's definition in the 20th century is, "hardware is what the NSA is inside." 257 00:36:18,464 --> 00:36:21,408 --Unless we have open source hardware, which we talk about here. 258 00:36:21,408 --> 00:36:27,778 --Well, at that point we're going to have to go very far towards the chips themselves, aren't we? --That's right. 259 00:36:27,778 --> 00:36:39,546 We're not just talking about designs and layouts that are free to copy, modify, and reuse. We're talking about having to go from the masks up, 260 00:36:39,546 --> 00:36:41,743 otherwise we wouldn't trust it. 261 00:36:41,743 --> 00:36:52,081 Right, and the goal here is to reduce your trust footprint. I mean, I could trust 30 companies; if I could trust 5, that's better. 262 00:36:52,081 --> 00:36:59,237 Or, if I could figure out ways where I don't have to trust any one of them, but to break my security it takes a collusion of two of them. 263 00:36:59,237 --> 00:37:03,013 These things make it harder for the attacker. 264 00:37:03,013 --> 00:37:09,878 OK, good. So I've got an apartment full of gear, like many of the people in this room, and there are a lot of boxes in there. 265 00:37:09,909 --> 00:37:19,652 I think what I have learned from the documents I have seen so far, and what I think they tell me about the context of the listeners I've always known-- 266 00:37:19,652 --> 00:37:25,483 maybe you agree with me about this-- is that if I'm going to start distrusting some box in my apartment, I should start with my router. 267 00:37:25,483 --> 00:37:39,684 I would. I believe-- and this story hasn't really been told; I think it will, I'm not sure where the details are-- that the routers, the network devices, 268 00:37:39,684 --> 00:37:51,147 are a much more fruitful avenue of attack than the computers. And I think we're just starting to see that.
There have been a couple of stories in the past 269 00:37:51,147 --> 00:38:01,401 couple of weeks about malware attacks against routers. The criminals are starting to notice this, but routers never get patched, basically. 270 00:38:01,401 --> 00:38:11,119 They're running a four-year-old version of Linux, they've got a bunch of binary blobs in them for various device drivers, and they never ever get patched. 271 00:38:11,119 --> 00:38:23,739 Even if a patch were issued, you would have no idea how to install it. The margins are very slim, and the industry isn't really set up for security updates. 272 00:38:23,739 --> 00:38:35,989 They're always building the next thing. At a very small level, I think we are-- ignoring the NSA-- the next wave of cybercrime is going to come after these routers. 273 00:38:35,989 --> 00:38:45,610 We saw an attack on a point-of-sale system recently. There was a botnet that took over a gazillion routers in Brazil recently. 274 00:38:45,610 --> 00:38:58,132 I think this is very much a danger for all of us. For the NSA, I think they've had better luck with the router companies. 275 00:38:58,132 --> 00:39:07,404 I think this is very generational. You start to think about the history of the NSA and surveillance, and cooperation with US companies: telcos have 276 00:39:07,404 --> 00:39:15,986 been cooperating with the NSA since it came into existence. My guess is this cooperation just carried through the Cold War, 277 00:39:15,986 --> 00:39:28,760 and afterward it's no big deal for Level3, or AT&T, or any telco company, or executive, or person, to say, you know, "Oh yeah, we give the NSA a copy, that's just what we do." 278 00:39:28,760 --> 00:39:37,635 And that is a very different mentality from what you'll get out of Google, or Microsoft, or Apple, or companies coming out of the computer space that 279 00:39:37,635 --> 00:39:46,739 don't have this history of cooperation and collusion.
The reactions you're getting from those companies are much more hostile. 280 00:39:46,739 --> 00:39:55,844 "What do you mean you're doing this to us?" Not, "oh yeah, we kinda assumed that, and we'll give you a room if you just ask. Don't be a stranger." 281 00:39:55,844 --> 00:40:03,823 But isn't part of the outrage a result of the fact that they thought they had made deals as a result of which they weren't going to be troubled more? 282 00:40:03,823 --> 00:40:08,900 They really just feel that the guys they bought didn't stay bought, right? 283 00:40:08,900 --> 00:40:21,371 You know, I'm not sure it's deals. Yes, I think it's a bit rich for CEOs of Google to complain that the NSA is getting a copy of the data Google stole from you fair and square. 284 00:40:21,371 --> 00:40:31,120 And certainly a lot of government surveillance piggy-backs on corporate surveillance. There's a whole story about cookies-- it's simply because these 285 00:40:31,120 --> 00:40:38,985 companies want to identify you on the internet, and the NSA is just getting itself a copy. --Right, the prefs cookie at Google: let's just 286 00:40:38,985 --> 00:40:46,939 fingerprint all the browsers just in case we need all the browser fingerprints on Earth, and then, by god, that makes it easier to steal all the browser fingerprints. 287 00:40:46,939 --> 00:41:01,901 But these companies do have a huge PR problem. They did believe, I think, that the bulk of the NSA collection of their stuff came through the front door, 288 00:41:01,901 --> 00:41:09,814 came through National Security Letters, came through subpoenas, came through warrants. I don't know it, but I assume Google has a room full of 289 00:41:09,814 --> 00:41:19,153 lawyers that deal with the 30 or 50 countries that serve it with subpoenas, or whatever they're called in each country, whether they're legal or not. 290 00:41:19,153 --> 00:41:32,278 I believe these companies did think that that was primarily what the NSA was doing.
I don't think they realized that was just a way to launder stuff they had already gotten surreptitiously. 291 00:41:32,278 --> 00:41:43,955 I think a really important moral is that NSA surveillance is robust. It's robust legally, it's robust technically, it's robust politically. 292 00:41:43,955 --> 00:41:54,071 I can name three different ways the NSA has access to your GMail, under three different legal authorities. 293 00:41:54,071 --> 00:42:03,105 And I worry about pending legislation in the United States that tends to focus on a particular program, or a particular authority, 294 00:42:03,105 --> 00:42:15,657 not realizing that they have backups, and backups to backups. So, I do think that Google was legitimately surprised at the extent to which they were being penetrated, 295 00:42:15,657 --> 00:42:26,739 given that they were cooperating where they thought they had to. "We're giving you what you're asking for, under the extraordinarily draconian laws-- 296 00:42:26,739 --> 00:42:35,773 you mean you're getting it these other ways (plural) also? What, do you guys have money to burn?" "Yeah, we kinda do." 297 00:42:35,773 --> 00:42:46,361 And so, the private dataminers who also have money to burn are gonna have to burn some making themselves more secure, or people aren't going to use them? 298 00:42:46,361 --> 00:42:54,738 We don't know. We're getting back to trust again. You are someone in some country somewhere and you've learned that the NSA is getting a copy of 299 00:42:54,738 --> 00:43:02,340 everything. And Google has a press release saying, "Oh, we fixed that." Do you believe it? I sure don't. I think the companies have 300 00:43:02,340 --> 00:43:13,120 a serious problem right now-- and this is an Internet problem-- the Internet used to be run as a basically benign U.S. dictatorship, 301 00:43:13,120 --> 00:43:23,188 under the assumption that the U.S. was generally behaving in the world's best interest.
And I think that trust lost is a one-way function. 302 00:43:23,188 --> 00:43:35,978 We generally believed that Google, yeah, they were reading your GMail and serving you ads, but that was it. And now that the cat's out of the bag, I'm not 303 00:43:35,978 --> 00:43:47,239 sure there's a way for these companies to convince the world that, "yes, we've contained the problem." That, "yes, we only give the NSA the data when they ask us with secret requests." 304 00:43:47,239 --> 00:43:56,093 Which is the best they'll ever be able to say. I think this is why we're seeing these movements in Brazil and other countries that say, 305 00:43:56,093 --> 00:44:05,069 "Now wait a second. We want this data in our country. There no longer exist these assurances you can give us." 306 00:44:05,069 --> 00:44:15,603 Because maybe we've been deluding ourselves for the past, you know, bunch of years-- and cloud computing is not going away for a whole bunch of other reasons-- 307 00:44:15,603 --> 00:44:24,726 I see some Internet balkanization coming, which I think is going to be very bad, because a bunch of countries are going to be doing 308 00:44:24,726 --> 00:44:36,361 way worse than we are. And a lot of countries are using our actions to justify their own actions. So, if this is fixed, I don't think it's coming from the companies. 309 00:44:36,361 --> 00:44:44,122 I think it's coming from the tech community. It's coming from the IETF, it's coming from the open source movement, 310 00:44:44,214 --> 00:44:57,867 it's coming from all the non-commercial entities that are going to try to build security back in. We'll never be able to trust Google, or Microsoft, or Apple, 311 00:44:57,867 --> 00:45:01,367 or any of these companies ever again. I just don't think that's going to happen.
312 00:45:01,367 --> 00:45:13,255 But most of the free communities that have been building crypto and security software, groups of hackers who, as you say, are knowledgeable 313 00:45:13,255 --> 00:45:22,091 and extremely well motivated, they probably would have said 10 years ago, "Look, we are making software that we think creates security, but 314 00:45:22,091 --> 00:45:32,720 if you're up against national means of intelligence, all bets are off." And now, if you're right about what we're going to be called upon to do, we're going to have to 315 00:45:32,720 --> 00:45:40,089 raise our game substantially, because what you've really said is, "unless you're good against national means of intelligence, you're not good at all." 316 00:45:40,089 --> 00:45:51,180 Yeah, but it's actually better than that. One of the things we've learned about the NSA is they might have more employees doing surveillance 317 00:45:51,180 --> 00:45:56,615 than the rest of the planet combined, and a bigger budget than the rest of the planet combined, but they are not made of magic. 318 00:45:56,615 --> 00:46:07,299 They are subject to the same laws of mathematics, and physics, and economics that everyone else is. And what we've done is not-- 319 00:46:07,299 --> 00:46:19,990 the problem is we've made surveillance too cheap. We've made bulk surveillance too cheap. Fundamentally, if the NSA, or China, 320 00:46:19,990 --> 00:46:27,886 or a dozen other countries I could name, or a bunch of really good hackers, want into your computer, they are in. Period. 321 00:46:27,886 --> 00:46:40,394 We do not have the expertise, anywhere on this planet, to build that level of security. Right now, in the world, on computers, attack is much easier than defense. 322 00:46:40,394 --> 00:46:51,448 But that's not what I'm trying to defend against. I'm trying to defend against bulk collection. And this is what we object to.
323 00:46:51,448 --> 00:47:01,004 If the Snowden documents revealed the NSA spied on the Taliban and North Korea, no one would care. If the NSA spied on Belgium-- 324 00:47:01,004 --> 00:47:13,841 or, I guess, that the UK spied on Belgium, which is like Connecticut spying on Nebraska-- that's the problem. It is easier to get everything than to target. 325 00:47:13,841 --> 00:47:22,860 The economics are all wrong. Fixing the economics is a much more tractable problem, and something we can do. 326 00:47:22,860 --> 00:47:34,206 So, it's not "you need to be secure against the NSA." You need to be secure against NSA bulk collection, and that's an extremely important point. 327 00:47:34,206 --> 00:47:43,758 If you're the financial industry, however, you might actually need to be secure against the NSA. Part of what is happening, it seems to me at the moment-- 328 00:47:43,758 --> 00:47:53,525 and I'd be very interested to hear your view on this-- is that we're also living in a world after the end of money, where trust is all that sustains economic value. 329 00:47:53,525 --> 00:48:01,639 Bars of gold have been replaced by bit streams signed by trusted parties. Signing is a cryptographic activity, 330 00:48:01,639 --> 00:48:09,675 the consequence of which is that if we are to have the economics you are talking about, in a world where values are represented by digital 331 00:48:09,675 --> 00:48:20,671 entities, signed by trusted parties using algorithms we believe in, there is actually, at the end of the day, a requirement to provide a level of 332 00:48:20,671 --> 00:48:29,170 security in order to stave off chaotic risk in the world financial system, which it appears the American government has been deliberately undermining.
333 00:48:29,170 --> 00:48:38,472 Isn't there a really hard choice out there for us now, about whether we're going to have security in the way the military listeners think 334 00:48:38,472 --> 00:48:42,416 about it, or are we going to have trust sufficient to run the world economic system? 335 00:48:42,416 --> 00:48:57,288 I think that this is the fundamental choice that this whole story brings to light. A lot of people talk about this as "should the NSA be allowed to spy or not?" 336 00:48:57,288 --> 00:49:06,543 That's actually the wrong way to think about it. The way to think about it is: should we build an electronic infrastructure-- an Internet for the information age-- 337 00:49:06,543 --> 00:49:14,305 where everyone is allowed to spy, or where nobody is? Do we choose surveillance or security, 338 00:49:14,305 --> 00:49:23,509 where security means not just that the NSA isn't listening, but that nobody is listening? Because the NSA doesn't get the only ear. 339 00:49:23,509 --> 00:49:35,296 It's the global financial industry, but it's everything else as well. And this is in the NSA's mission; the NSA has always had a dual mission: 340 00:49:35,296 --> 00:49:45,303 throughout the Cold War, it was to protect US communications and eavesdrop on Warsaw Pact communications. 341 00:49:45,303 --> 00:49:52,990 That dual mission made a lot of sense during the Cold War. You eavesdrop on the Soviet stuff and you protect the American stuff. 342 00:49:52,990 --> 00:50:06,587 That fails when everyone starts using the same stuff. When the entire world uses TCP/IP, and Cisco routers, and Microsoft Windows, suddenly... 343 00:50:06,587 --> 00:50:11,799 --Well, not the entire world... --Well, enough of the world, to a first approximation. 344 00:50:11,799 --> 00:50:26,288 You now have a very real choice. You learn of a vulnerability against-- I'm making this up-- a Cisco router.
You can use that vulnerability to 345 00:50:26,288 --> 00:50:33,484 eavesdrop on the people you don't like, knowing full well that other people might discover that vulnerability and eavesdrop on you, or you can 346 00:50:33,484 --> 00:50:41,796 close the vulnerability, reduce your ability to eavesdrop, and eliminate everyone else's ability to eavesdrop as well. 347 00:50:41,796 --> 00:50:51,260 And maybe the financial industry is the tipping point for this, but I think we need to collectively recognize that it is in our 348 00:50:51,260 --> 00:50:58,552 collective long-term interest to have actual security and not eavesdropping. 349 00:50:58,552 --> 00:51:02,836 Does actual security imply anonymity? --Yes. 350 00:51:02,836 --> 00:51:11,835 And the destruction of anonymity has been pretty much their goal all along. Attribution is what they look for. 351 00:51:11,835 --> 00:51:17,529 "Make it possible for us to attach an identity to every action." --Yes, and this is the metadata debate. 352 00:51:17,529 --> 00:51:28,171 When the first stories about Verizon and cellphone eavesdropping came out, one of the defenses was-- the President said this-- he said, 353 00:51:28,171 --> 00:51:38,449 "Don't worry, it's all metadata. No one is listening to your conversations." I think this is an extremely... I don't know what word I want to use... 354 00:51:38,449 --> 00:51:45,343 --Disingenuous. --Yeah, and it is. Because metadata equals surveillance. 355 00:51:45,343 --> 00:51:55,034 And it's easy to understand this. Imagine you hired a private detective to eavesdrop on somebody. That detective would put a bug in 356 00:51:55,034 --> 00:52:03,613 their home, and their car, and their office, and you would get a report of their conversations. That's what the data is. If you ask that same 357 00:52:03,613 --> 00:52:14,955 detective to surveil somebody, you'd get a different report: where he went, who he spoke to, what he purchased, what he read. That's all metadata.
358 00:52:14,955 --> 00:52:25,009 When the President says, "Don't worry, it's just metadata," I hear "Don't worry, you're all just under surveillance 24/7." 359 00:52:25,009 --> 00:52:34,834 Breaking anonymity is part of that, because it's one thing to know that this anonymous blog did these things. It's very different to 360 00:52:34,834 --> 00:52:43,342 attach a name to it or, if you can't do that, to establish continuity with other anonymous blogs by attaching a persistent pseudonym. 361 00:52:43,342 --> 00:52:53,897 That's not just the goal of the NSA, that's the goal of Google. That's the goal of Facebook. When Google+ came up with a real-names policy, 362 00:52:53,897 --> 00:53:05,874 it was basically "we need to market to you better. We don't want anonymity on our system." When they're trying to tie your cellphone usage to your internet usage to 363 00:53:05,874 --> 00:53:14,278 your real-world usage, that's all about breaking anonymity. So it's for-profit and for-government. This is what's happening. 364 00:53:14,278 --> 00:53:24,257 So, what is the sum of the economic thinking that lies behind the idea that we change the economics? It is obviously expensive to follow people. 365 00:53:24,257 --> 00:53:33,897 You gotta have a guy out there tailing people, and she's gotta know how to not get seen. But getting 5 billion cellphone location records a day... that's much simpler. 366 00:53:33,897 --> 00:53:44,276 Isn't it a permanent economic fact that, the way we live in the digital universe, following individual people is expensive and following everybody is much cheaper? 367 00:53:44,276 --> 00:53:54,508 --Only if following everybody is cheap. And that's true because we have designed the cellphone system such that this location data 368 00:53:54,508 --> 00:54:01,139 is transmitted in the clear, and is easy to eavesdrop on. We could design a cellphone system that doesn't have that property.
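The "metadata equals surveillance" point is easy to demonstrate concretely. With nothing but hypothetical call records -- no conversation content at all -- a simple count already sketches a life:

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, hour_of_day).
# There is no conversation content here -- only metadata.
records = [
    ("alice", "bob", 9), ("alice", "bob", 21), ("alice", "clinic", 10),
    ("bob", "alice", 22), ("alice", "lawyer", 14), ("alice", "lawyer", 15),
]

# Who does alice talk to, and how often? That alone is a surveillance report.
contacts = Counter(callee for caller, callee, _ in records if caller == "alice")
print(contacts.most_common())
# Repeated calls to a lawyer and a call to a clinic say a great deal
# without a single word of any conversation being overheard.
```

And because records like these arrive in bulk from the carrier, the marginal cost of "tailing" one more person this way is effectively zero, which is exactly the economic asymmetry under discussion.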
369 00:54:01,139 --> 00:54:12,420 We've designed an Internet economic architecture where surveillance is the fundamental business model. We could decide not to 370 00:54:12,420 --> 00:54:17,544 design it that way. --Yes, but if you and I and everybody in this room who totally believes this goes and says 371 00:54:17,544 --> 00:54:25,092 "we need to build an internet with anonymity built in from the beginning," it will be a complete political non-starter. 372 00:54:25,092 --> 00:54:34,127 Because every policeman, every taxman, every other form of legitimate government agency on earth has now decided they can do a much better job 373 00:54:34,127 --> 00:54:37,705 governing us without anonymity, and they're never going back. Isn't that right? 374 00:54:37,705 --> 00:54:47,115 --So, I tend to be long-term optimistic. I think that we as a species tend to solve these problems. 375 00:54:47,115 --> 00:54:57,858 It might take us a generation, or two. We might have some pretty horrible world wars while we're doing it, but you know, the quote that actually 376 00:54:57,858 --> 00:55:07,342 lets me sleep at night is Martin Luther King Jr., who says "the arc of history is long but bends towards justice." We do manage to have more 377 00:55:07,342 --> 00:55:19,450 freedom, and more liberty, and more rights, century by century. Not year by year. So I do think that long term, wherever that is, we will have licked this. 378 00:55:19,450 --> 00:55:27,648 --OK, but Martin Luther King can say that because his view of justice isn't path-dependent. His view of justice is that it's absolute and it's always there. 379 00:55:27,648 --> 00:55:37,670 Technology, on the other hand, is path-dependent. When our friend Dan Geer at In-Q-Tel says, in that talk on tradeoffs 380 00:55:37,670 --> 00:55:43,917 in cybersecurity that you and I both so admire, that this is the last generation in which the human race gets a choice, 381 00:55:43,917 --> 00:55:52,261 he's basically speaking to what you've just said.
You said "if we have long enough we'll get this fixed," and he said "technology is path-dependent, 382 00:55:52,261 --> 00:56:01,170 and once this is fastened on the human race it may never be unfastened again, and we evolve forward from where we are on a dependent path." 383 00:56:01,170 --> 00:56:10,454 So one of those lets me sleep and the other one keeps me awake, and between those two what you and I have to confront is our friends 384 00:56:10,454 --> 00:56:17,966 out in the world who say "it's hopeless, there's nothing we can do," and "I'm not doing anything wrong, so why should I care?" 385 00:56:17,966 --> 00:56:25,081 And those are the two arguments that we need to address. In the couple of minutes left to us before we open it up to all of these people, 386 00:56:25,081 --> 00:56:29,504 what do you say to the people who say, "it's hopeless, there's nothing we can do"? 387 00:56:29,504 --> 00:56:35,914 --I think there's a lot we can do. That, I think, is one of the most important morals from the Snowden documents: the NSA isn't 388 00:56:35,914 --> 00:56:45,354 made of magic, they're not breaking cryptography anywhere near the extent that we kinda thought they were, and there are things we can do to make 389 00:56:45,354 --> 00:56:53,041 ourselves much more secure. I mean, if you are the one person they want, they're going to get in. But again, that leverages the economics. 390 00:56:53,041 --> 00:56:59,207 Now we're back to tailing everybody individually. You've only got so many agents; you can only tail so many people. 391 00:56:59,207 --> 00:57:08,860 If you eliminate the bulk, or make the bulk harder, or make us more able to hide in the noise, we are doing ourselves an enormous favor. 392 00:57:08,860 --> 00:57:14,631 And if we give the tools to the dissidents around the world who are hiding from much worse regimes than we have, 393 00:57:14,631 --> 00:57:21,325 to do this, we are doing an enormous amount of good for the world.
There are things we can do. It is nowhere near hopeless, and I think 394 00:57:21,325 --> 00:57:33,907 we learned this again and again and again. And the other half is "Why? I don't have anything to hide." The people who are speaking best to this 395 00:57:33,907 --> 00:57:44,996 are the psychologists, who look at what it is like to live under constant gaze, or under the threat of it-- that if you believe that you could 396 00:57:44,996 --> 00:57:55,249 be watched at any moment, what does that do to you as a person? And what we learn is, it makes you different. It makes you more conformist. 397 00:57:55,249 --> 00:58:10,575 It makes you less willing to think new thoughts or try new ideas. It stagnates society. It makes us all worse. Society improves because people dare to 398 00:58:10,575 --> 00:58:20,521 think the unthinkable, and then after 20 or 30 years everyone says, "well you know, that was kind of a good idea." It takes a while, but it has to start 399 00:58:20,521 --> 00:58:31,630 with doing something that you don't want anyone else to know. So, it hurts us big and small. It hurts us in the big because society stagnates, 400 00:58:31,630 --> 00:58:42,615 and it hurts us in the small because we are diminished as individuals, because we cannot fully be individuals. We have to be a member of the group. 401 00:58:42,615 --> 00:58:52,597 I mean, there are phenomenal writings, philosophical and psychological, that really look at how this works. It's a hard argument to make. 402 00:58:52,597 --> 00:59:05,934 The arguments on the other side are quite simple: "terrorists will kill your children." That's it. That argument pushes four very core buttons that will 403 00:59:05,934 --> 00:59:16,548 make you scared. So I could spend an hour saying, "well, this doesn't protect you from terrorism." That argument is happening at a higher intellectual 404 00:59:16,548 --> 00:59:35,608 level than your fear. I'm going to lose that argument. So, the forces of surveillance are strong.
This is an extremely difficult fight, and I'm always amazed 405 00:59:35,608 --> 00:59:48,117 at the resilience of our species to overcome intractable problems, to overcome futility. It amazes me again and again, and I'm not willing to 406 00:59:48,117 --> 00:59:57,219 count us out. It is possible that we've reached some theoretical limits here, and I could actually draw out that argument: that there's, you know, some Darwinian-level 407 00:59:57,219 --> 01:00:08,378 limit in our species, that technology just makes bad things happen and we have no choice here. My guess is not, but it's going to require a lot of 408 01:00:08,378 --> 01:00:18,367 changing. I mean, the war has to end-- that's a phrase you used when we were talking earlier. If General Alexander can get 409 01:00:18,367 --> 01:00:28,289 in front of Congress and say, "if I had these powers I could have stopped 9/11," and no one looks at him and says, "you didn't stop Boston"-- and that was 410 01:00:28,289 --> 01:00:39,633 one guy on a terrorist watch list, and the other guy with a sloppy Facebook trail; what are you talking about?-- we need that level of response. 411 01:00:39,633 --> 01:00:43,202 But I'm still bullish on us. 412 01:00:43,202 --> 01:00:52,153 --So, if I don't ask you someone else is going to ask you, so I might as well save the time: what do you trust these days? 413 01:00:52,153 --> 01:01:05,124 --So, I actually wrote an essay about that in The Guardian. What do I trust? I trust OTR, I trust Tails, I trust GPG, 414 01:01:05,124 --> 01:01:19,170 I trust-- oh, what's the file encrypter-- TrueCrypt, which I consider the best of three bad alternatives. I do a few other things-- there's a file-erasure program-- 415 01:01:19,170 --> 01:01:29,338 and I think they're all pretty good. But basically, I have an air-gapped computer I use for things I don't want on the Internet.
And again, all these things we can 416 01:01:29,338 --> 01:01:38,202 pick apart, but I'm just trying to make it harder. --We don't have to pick you apart, because other people will do that for us. 417 01:01:38,202 --> 01:01:47,500 --If the NSA wanted me, I think they're in. If the FBI-- could the FBI get a warrant against my computer? Probably. 418 01:01:47,500 --> 01:01:59,059 They haven't broken down my door yet. --[audience] That you know of... --That I know of, right. But what am I going to do? 419 01:01:59,059 --> 01:02:09,565 I am not a nation-state, I cannot protect my computer. My house is not tamper-shielded, and it will never be tamper-shielded. I will never have my computers 420 01:02:09,565 --> 01:02:20,440 in a secret level [unintelligible] safe. I will never have guards; no one's home right now and I don't come home for a couple of hours. So it is quite easy 421 01:02:20,440 --> 01:02:29,867 to grab an image of my hard drive. It is trivial to put a TEMPEST receiver nearby, or grab my keystrokes when I type in my password. If you are 422 01:02:29,867 --> 01:02:42,961 targeted, there's pretty much nothing you can do at that level. So, at some point you have to just say, "that's the way the world works" and you can't do anything. 423 01:02:42,961 --> 01:02:52,025 But you can protect yourself against bulk surveillance, and that's largely what I'm trying to do. When I go in and out of the country right now, 424 01:02:52,025 --> 01:02:58,681 my secrets don't travel on my laptop. And it's interesting, I spend a lot of time now when I'm flying into the US erasing 425 01:02:58,681 --> 01:03:07,465 all my free space, deleting data, encrypting archives, and all sorts of things. You spend a few hours doing it, you go through the US border and nothing happens, 426 01:03:07,465 --> 01:03:17,131 and you know, you're pissed off. I went through all this trouble, and you can't seize my laptop? What the hell are you guys doing?
And I just know after 427 01:03:17,131 --> 01:03:23,905 four or five times, "I don't have to do this, they're not going to take my stuff..." and then they're going to take it. And security is a lot like that. 428 01:03:23,905 --> 01:03:32,664 There's an essay I should write, on how hard it is to get opsec right. There's a nice story-- well, there are a couple of stories-- 429 01:03:32,664 --> 01:03:44,315 there's a story of General Petraeus, and how his secret conversations were eavesdropped on. And also, the guy who was running Silk Road. And there's 430 01:03:44,315 --> 01:03:58,760 also a third one I'd use, from the Chinese hackers that Mandiant found. You have to be perfect: if you make a mistake sometime in the past 10 years, 431 01:03:58,760 --> 01:04:10,312 your security has been compromised. And because there's never feedback, you never know. I can tell you that one time-- and I shouldn't say this-- 432 01:04:10,312 --> 01:04:19,972 I spent a lot of time encrypting this archive. I had it encrypted, I zipped it, I encrypted it, I decrypted it, and encrypted it again just to make sure I had it right, 433 01:04:19,972 --> 01:04:29,331 because if you get it wrong the key doesn't work and you're screwed. I do this, and I throw the zips in the trash, I delete the trash, I erase the trash, 434 01:04:29,331 --> 01:04:41,767 I go through the border, I come in, I open my computer, and I forgot to erase the originals... You never get any feedback as you do this. 435 01:04:41,767 --> 01:04:52,295 You never know if you did it right. And it's easy to make a mistake, because security is always-- you never want to do security, it's always in the way. 436 01:04:52,295 --> 01:05:03,840 "Oh, I have to remember when I use OTR to do the authentication step." It's not what I want to do, I want to talk to the guy.
I have to remember-- 437 01:05:03,840 --> 01:05:12,275 and I'll do this: I'll close my laptop, boot it down, put in my USB stick, open up Tails, get it all ready and then "Oh, damn, the email address I wanted is on 438 01:05:12,275 --> 01:05:24,472 the memory." I gotta close it all down, open it up again. It's always in the way. It's very easy to make a mistake. And the way the balance goes, if you make 439 01:05:24,472 --> 01:05:33,043 a mistake, you're done. This makes it very hard. I'm not helping, am I? 440 01:05:33,043 --> 01:05:41,173 --I felt very good, I was thinking to myself I need to design an exploit platform that gives positive reinforcement back to you for doing the 441 01:05:41,173 --> 01:05:46,699 wrong thing, and pretty soon we'll have you trained to do the wrong thing-- --And if it gets a cool codename, you're in. 442 01:05:46,699 --> 01:05:56,251 --Absolutely. So, I think it's time we let some other people ask some questions. 443 01:05:56,251 --> 01:06:04,746 --[audience] Bruce, thanks very much. I wanted to ask one question that's about the opposite side of bulk analysis, and that is about natural language 444 01:06:04,746 --> 01:06:12,152 processing. What is your assessment of the state of it, and how much of a threat or problem do you think it really is? Because I've seen some of what's 445 01:06:12,152 --> 01:06:17,887 going into it, and I wasn't particularly impressed, frankly. --Well, we don't know. There are a large number of 446 01:06:17,887 --> 01:06:24,929 patents the NSA has in this area, so there's stuff in the public. My guess is they're extraordinarily good. This is something they've been working on 447 01:06:24,929 --> 01:06:33,598 since computers were invented, because this is not a new problem. Especially when you are dealing with radio, where you had to transcribe. It was 448 01:06:33,598 --> 01:06:43,747 the only thing you could do, that the recording just couldn't keep up. 
So my guess is they are very good at natural language processing, 449 01:06:43,747 --> 01:06:56,179 and natural language translation. I would expect that most everything gets very quickly turned into text, that there are easy ways to annotate little 450 01:06:56,179 --> 01:07:07,684 bits of voice, stuff you don't know that might need a person. And that voice printing is extremely advanced as well. Again, nothing is published on 451 01:07:07,684 --> 01:07:15,671 this, and I don't know if it will be. But I would expect this is an area they have devoted considerable resources to for a decade. 452 01:07:15,671 --> 01:07:19,180 -- [audience] Are they better than Facebook, do you think, or Google? -- They would have to be. They've been doing this 453 01:07:19,180 --> 01:07:26,848 for decades, and with way more budget. And again, it's going to be a one-way function. Anything that Google and Facebook can do is going to come out 454 01:07:26,848 --> 01:07:35,772 of the academic community, and they're going to know about it. It's like cryptography: you have this-- information only flows in one direction, from the 455 01:07:35,772 --> 01:07:44,974 academic community to the NSA. It never flows the other way. So the NSA can get the best of the world, plus what they have. They spend a lot of money 456 01:07:44,974 --> 01:07:50,836 on linguists.
460 01:08:18,326 --> 01:08:34,584 -- Right, I mean one problem-- I guess it is a problem-- a lot of government capabilities have corporate analogs. So we talk a lot 461 01:08:34,584 --> 01:08:41,079 about surveillance: there's government surveillance and then there's corporate surveillance, and all these tools are being built for corporate surveillance, 462 01:08:41,079 --> 01:08:50,972 some of it for legitimate reasons, some of it for reasons we may not like, and then this is also being used by governments. 463 01:08:50,972 --> 01:08:59,710 You know, propaganda tools-- we're seeing companies like Blue Coat and Sophos. These are commercial products being sold to 464 01:08:59,710 --> 01:09:08,821 corporations that are also being sold to Syria to identify and arrest dissidents. A lot of these technologies are dual use, and I don't 465 01:09:08,821 --> 01:09:19,451 think we can address one issue without also addressing the other. We cannot just say "governments can't do this and corporations can." 466 01:09:19,451 --> 01:09:31,043 The tools lend themselves to abuse. And there's talk about putting a lot of these tools back under export control. Over the past month I've been seeing 467 01:09:31,043 --> 01:09:40,195 more discussion about that. I'm not sure that it's possible anymore. It's a very different world than the 90s, when you actually could have export controls 468 01:09:40,195 --> 01:09:47,420 on cryptography, because everything was mailed around. It wasn't just downloaded. The connected international world is much harder. You end 469 01:09:47,420 --> 01:10:00,553 up putting up national barriers like the Great Firewall of China, which works well but is also pretty porous. These are certainly important to talk about, 470 01:10:00,553 --> 01:10:09,008 the corporate analogs to these government tools.
--I can't speak about the Snowden documents which Bruce has seen, and we're not ready at SFLC 471 01:10:09,008 --> 01:10:18,348 to make any publications yet, but I can tell you for sure that there are national governments that have outsourced the process of penetrating 472 01:10:18,348 --> 01:10:28,321 and listening to computer networks to commercial organizations whose contract work mixes government and commercial spying, but whose primary 473 01:10:28,321 --> 01:10:34,727 bread and butter in this and other countries around the world is the conduct of governmental spying. --I think it's dangerous for us because 474 01:10:34,727 --> 01:10:42,911 now you have an industry that is going to lobby. Just like you have a private prison industry lobbying for more draconian laws, you're going to have 475 01:10:42,911 --> 01:10:47,382 a private surveillance industry lobbying for more surveillance, because more surveillance means more sales. 476 01:10:47,382 --> 01:11:03,533 --Well, wealthy database-making companies can be counted on to do that anyway, I should think. 477 01:11:03,533 --> 01:11:12,847 -- [audience] Here's a real paranoid question for you. An important strategy in all espionage is the spread of disinformation. Is it possible, with the spread 478 01:11:12,847 --> 01:11:22,415 of the Snowden documents, or other types of leaks, that there is some tiny bit of disinformation there to make the community represented here trust 479 01:11:22,415 --> 01:11:31,916 some type of technology that is actually vulnerable? --So that is actually the less paranoid version of that argument. The more paranoid one is 480 01:11:31,916 --> 01:11:42,354 that Snowden is a government plant and that is all disinformation. You do hear that. I believe that is not true.
I believe that Snowden is a legitimate 481 01:11:42,354 --> 01:11:53,556 whistleblower, that he has legitimate whistleblowing documents that he-- I guess, legitimately-- that he fair and square stole from the NSA and 482 01:11:53,556 --> 01:12:06,859 went to China with, and that this is real, that this is not government disinformation. It would-- nah, it doesn't even pass the smell test. I do not 483 01:12:06,859 --> 01:12:17,402 believe so. --It looks like we should hand the mic down. --Just throw it at them. 484 01:12:17,402 --> 01:12:21,979 --If we owned it we would do that. --What could possibly go wrong with that? --[audience] Hi. I had a question about 485 01:12:21,979 --> 01:12:32,585 the relationship between corporate and government surveillance. So, you were saying that one thing we could do to defend against the surveillance of 486 01:12:32,585 --> 01:12:39,972 the mobile phone network is that the location information could all be encrypted. But, the mobile phone companies actually need to know which 487 01:12:39,972 --> 01:12:47,982 cell tower to send your signal to. So the mobile phone companies are going to know. They're in a position where they can't help but collude. 488 01:12:47,982 --> 01:12:55,453 Do you have a vision for what a mobile phone-- what people here would consider to be a functional mobile phone network-- would look like that the government couldn't actually spy on? 489 01:12:55,453 --> 01:13:06,760 --So I don't actually know. My guess is that it is possible, that you could have a distributed system that would hide location data from the central 490 01:13:06,760 --> 01:13:17,956 nodes. And some of it is just leveraging small distribution. I think we were all much more secure when there were 100,000 ISPs than when there 491 01:13:17,956 --> 01:13:29,392 were 100. That level of distribution-- again, the economics. We could force the NSA or the FBI to go after all of these companies rather than 492 01:13:29,392 --> 01:13:39,992 just a few.
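The kind of distribution Bruce is describing-- many small operators instead of a few big ones-- is, at the data layer, the trick consistent hashing gives peer-to-peer systems: every record lands on one of many nodes, and no node holds the whole map. A toy sketch under invented node names, not any real cell or file-sharing protocol:

```python
import bisect
import hashlib

def h(s):
    # Stable hash: map any string onto a large integer ring.
    return int(hashlib.sha256(s.encode()).hexdigest(), 16)

class Ring:
    """Toy consistent-hash ring. Each key (say, a subscriber's current
    location record) lands on exactly one node, and no node ever holds
    the global map. A hypothetical sketch, not a real cell protocol."""
    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)

    def node_for(self, key):
        # The owner is the first node clockwise from the key's hash.
        i = bisect.bisect(self.ring, (h(key), "")) % len(self.ring)
        return self.ring[i][1]

ring = Ring([f"node{i}" for i in range(100)])
# Records scatter across many nodes: subpoenaing (or seizing) any
# one node yields only a small slice of the data.
owners = {ring.node_for(f"subscriber{i}") for i in range(1000)}
```

The economics follow directly: an eavesdropper has to compromise most of the ring to reconstruct the map, rather than one central switch.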
So my guess is that there is a cell architecture that doesn't require centralized [unintelligible]. And just like in file sharing, 493 01:13:39,992 --> 01:13:48,594 the original file sharing systems had a centralized network that knew who had what file. Those were gone after by the music industry, and then the 494 01:13:48,594 --> 01:13:56,430 follow-on systems were distributed. They were peer-to-peer. They didn't have that centralized command-and-control. We know how to do 495 01:13:56,430 --> 01:14:03,817 this. The question is making it fast, making it scale. I'm not saying this is easy, but if we want to, yes, I think we can. 496 01:14:03,817 --> 01:14:08,468 --[audience] Do you know anyone who is working on that? --Not a soul. 497 01:14:08,468 --> 01:14:15,733 --[audience] Hey, thanks guys. I've got two quick ones, maybe one for Eben and one for Bruce. One is, is it worth it to encrypt in a big corporate 498 01:14:15,733 --> 01:14:28,353 cloud, like Amazon or Rackspace, using their encryption? And two, what are your thoughts on Sibel Edmonds and Russ Tice and Cryptome, who were 499 01:14:28,353 --> 99:59:59,999 just in a debate with Greenwald on Twitter. --I didn't follow the debate. --Nor I. Tell us about it. 500 99:59:59,999 --> 99:59:59,999 --[audience] Well, basically, Sibel Edmonds was the FBI's [unintelligible] translator pre-9/11. She's wondering, "Where are all the documents? 501 99:59:59,999 --> 99:59:59,999 Why are only 1% out, and what's going on with PayPal and Omidyar, and that" 502 99:59:59,999 --> 99:59:59,999 --I think it turns out that starting a new media empire is harder than you think. That's my guess. It's just things are happening slower than maybe people 503 99:59:59,999 --> 99:59:59,999 would like. You know, releasing documents is hard. There are legitimate secrets in there that you don't want released. There are, there really are.
504 99:59:59,999 --> 99:59:59,999 And it's good that the process is happening slowly and methodically, that a WikiLeaks-style data dump would not be fun for anybody. So, that's good. 505 99:59:59,999 --> 99:59:59,999 And there's a lot there, and it's slow to look at, and all of these stories do end up with a negotiation with the government. 506 99:59:59,999 --> 99:59:59,999 And this is the way journalism works. I didn't know this, but I got to see it: the reporters say, "We're releasing this story. Do you have 507 99:59:59,999 --> 99:59:59,999 anything that-- basically, do you mind?" And if the government says "Yes, don't release anything," of course no one is going to listen. But if they say 508 99:59:59,999 --> 99:59:59,999 "Look, this particular sentence, if you do this it will disrupt--," and there's a level of trust here, between the reporters and the government, and you know 509 99:59:59,999 --> 99:59:59,999 names are redacted, operational details are redacted. If the NSA is spying on North Korea and the Taliban we're not going to hear about it 510 99:59:59,999 --> 99:59:59,999 because that would be a good thing. So, there is this long process, and figuring out what the stories and the legitimate interests are is also a long process. So, 511 99:59:59,999 --> 99:59:59,999 this does take a long time. There's a lot of stuff. --[audience] People are also making a lot of money, though. That's kind of what's being conjectured. 512 99:59:59,999 --> 99:59:59,999 --You know-- the people who are doing it, are they? Is anyone reading this stuff except us? --[audience] Isn't Greenwald-- 513 99:59:59,999 --> 99:59:59,999 --Greenwald does have a book deal, but there are way easier ways to make a living than living in exile. None of these people are making lots of money. 514 99:59:59,999 --> 99:59:59,999 Laura Poitras-- or think of Bart Gellman at the Washington Post, and Ashkan Soltani who's working with him-- this is not a huge profit center.
515 99:59:59,999 --> 99:59:59,999 You would do way, way better calling up the Russian embassy and saying, "how much would you give me for the lot?" 516 99:59:59,999 --> 99:59:59,999 --[audience] I wanted to push back a little bit harder on that. You drew a dichotomy between, on the one hand, surveillance, and on the other hand, 517 99:59:59,999 --> 99:59:59,999 security. And you've said that we have to choose between them, and between them you choose security. When you say there are legitimate secrets and 518 99:59:59,999 --> 99:59:59,999 there are operational details, those operational details are things that we would need in order to defend against this stuff. 519 99:59:59,999 --> 99:59:59,999 And if I have, for instance, a friend who is working on anti-censorship software, or on secrecy software, why would you not give my friend a copy of these 520 99:59:59,999 --> 99:59:59,999 documents so that my friend can actually make his or her software work? 521 99:59:59,999 --> 99:59:59,999 --So, the hope is that the documents your friend gets are enough, that what's eliminated is the name and the phone number of the guy who wrote the 522 99:59:59,999 --> 99:59:59,999 documents. Or, what's eliminated-- you'll see this in some of the documents-- they'll give a list of places we're eavesdropping on, and the IP addresses 523 99:59:59,999 --> 99:59:59,999 will be blacked out. So, my hope is, when I wrote the Tor story I wanted to give enough detail so that the people who design Tor, the people who are working 524 99:59:59,999 --> 99:59:59,999 on internet backbone security, had enough to figure out what the NSA is doing and to defend themselves. I didn't need to tell them-- and again, I'm making 525 99:59:59,999 --> 99:59:59,999 this up-- that FOXACID was implemented successfully against, you know, these guys in Yemen. Because that is not useful to the fixers.
I tried very hard-- 526 99:59:59,999 --> 99:59:59,999 and I think the Washington Post did as well, I'm very happy with their level of detail-- that it is enough for us to know what the vulnerabilities are, what the 527 99:59:59,999 --> 99:59:59,999 capabilities are, how they're being used, the extent they're being used, and to give us the information we need, if we choose to, to fix them. 528 99:59:59,999 --> 99:59:59,999 Yes, I think this is-- in some ways it would be neat to say "here it all is," but it's just not going to happen. It just isn't. If it does, it will be a mistake, 529 99:59:59,999 --> 99:59:59,999 like WikiLeaks. It will be some confluence of bad things that shouldn't have happened. 530 99:59:59,999 --> 99:59:59,999 --It's a little like ordinary vulnerability disclosure, isn't it, Dave? I mean, you might well want to tell people how to fix the problem without explaining which 531 99:59:59,999 --> 99:59:59,999 bank is vulnerable, right? --[audience] Well, at a certain point, if the problems are not getting fixed-- and it seems like on a political 532 99:59:59,999 --> 99:59:59,999 level the problems are not getting fixed, because this is the other piece of it-- our ability to advocate for ourselves as citizens about what policies we 533 99:59:59,999 --> 99:59:59,999 do and do not want. The vulnerability is not being fixed, and at a certain point you do go public and say, "this is the vulnerability." 534 99:59:59,999 --> 99:59:59,999 --Well, I think Mr. Snowden has done that, and the likelihood that politics won't fix this is probably 1.0, no matter how many documents are disclosed. 535 99:59:59,999 --> 99:59:59,999 --Sad, but true. --I don't think politics is going to do it all by itself no matter what happens, precisely because I don't 536 99:59:59,999 --> 99:59:59,999 think it's really going to turn out that the politics hinges on the technical details.
It hinges, it seems to me, where Bruce says it hinges: on whether people 537 99:59:59,999 --> 99:59:59,999 are going to buy arguments that they should be afraid. And we don't know how mad democracy is about that yet. Thanks to Mr. Snowden we're about 538 99:59:59,999 --> 99:59:59,999 to find out. 539 99:59:59,999 --> 99:59:59,999 --[audience] Just want to ask a slightly more technical question. So you mentioned, basically, that open source systems are better in this case 540 99:59:59,999 --> 99:59:59,999 because we can look at them and see that there's no backdoor, and what immediately comes to mind to me is the Underhanded C Contest, or the 541 99:59:59,999 --> 99:59:59,999 Trusting Trust attack. These are ways you can hide backdoors in systems that people are looking at. Give me ideas on how to defend against 542 99:59:59,999 --> 99:59:59,999 that sort of thing, make systems more auditable, etc. --So, you can, but it's harder. The reason I like open source and free software and non-corporate is less 543 99:59:59,999 --> 99:59:59,999 because you can look at it, and more because it is harder for someone to slip something in, because someone is looking at it. And yes, there's an 544 99:59:59,999 --> 99:59:59,999 Obfuscated C Contest, and if you showed up in the Unix kernel with C code that looked like it came from the Obfuscated C Contest, you'd be sent back 545 99:59:59,999 --> 99:59:59,999 and be told to make it clearer. --[audience] Well, have you looked at OpenSSL recently? 546 99:59:59,999 --> 99:59:59,999 --I have not. Fair enough. I'm just trying to leverage the economics here, I want to make it harder, I want to increase the risk. Something that comes up 547 99:59:59,999 --> 99:59:59,999 again and again in the NSA documents is that they are amazingly risk-averse. They don't like risk. They don't want to take risks. They really take very safe 548 99:59:59,999 --> 99:59:59,999 paths.
And if you increase the risk, you're going to tip it to a point where they're not going to try, because the risk is dangerous. And I think that's-- this is 549 99:59:59,999 --> 99:59:59,999 something that, without any legislation or any technical fixes, will change because of this summer, all these stories. And as amazing as it is, 550 99:59:59,999 --> 99:59:59,999 the NSA has all sorts of contingencies for all sorts of things, but never had a contingency for "all of our documents get released to the public." 551 99:59:59,999 --> 99:59:59,999 It took them, what, two, three months to get a PR firm with enough clearance to talk to. Now they have a blog, and a Twitter feed, and they respond quickly. 552 99:59:59,999 --> 99:59:59,999 But it took them a long time. So, when they were making decisions like, "should we eavesdrop on Belgium," they had all these benefits, costs, and 553 99:59:59,999 --> 99:59:59,999 risks, but the world finding out-- they never thought that was a possibility. Well, that is now over. I think that, basically, every NSA operation from now on 554 99:59:59,999 --> 99:59:59,999 is going to have, in big red letters underneath the "should we do it or not?", "This is going to become public in three to five years." 555 99:59:59,999 --> 99:59:59,999 With high probability. "Are we OK with doing it?" And I think some things are just not going to happen, because the blowback has been real. 556 99:59:59,999 --> 99:59:59,999 --[audience] So you don't think that a Trusting Trust attack, a compiler-- --It could, but you know that's not an easy attack. 557 99:59:59,999 --> 99:59:59,999 --[audience] And if it was discovered... --And if it's discovered, "Wow." Suddenly it's really bad. This has rocked the agency. This is not 558 99:59:59,999 --> 99:59:59,999 something they thought of.
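The "underhanded" style raised in that exchange-- a backdoor that survives review because it reads as defensive coding-- fits in a few lines. This is an invented example, not drawn from any real codebase: the length clamp looks like caution, but it makes an empty token match anything.

```python
def token_ok(supplied: str, expected: str) -> bool:
    # Looks like a defensive bounds check -- reads fine in review...
    n = min(len(supplied), len(expected))
    # ...but comparing only the first n characters means an empty
    # `supplied` compares zero characters, which always "matches",
    # and any correct prefix of the token is accepted too.
    return supplied[:n] == expected[:n]
```

A similar truncation pattern is a classic `strncmp` pitfall in C; in a diff, the clamp reads as the fix rather than the hole, which is why many eyes only help if they are looking for malice as well as mistakes.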
--But if I could just ask a technical question back, did I hear you say that it would be a really good idea 559 99:59:59,999 --> 99:59:59,999 if OpenSSL were rewritten to be clearer, and more modular, and easier for people to handle-- --[audience] Absolutely, if you could rewrite the 560 99:59:59,999 --> 99:59:59,999 whole thing in Python, that would be-- --I don't think we should necessarily expect it to get rewritten in Python. And I'm personally 561 99:59:59,999 --> 99:59:59,999 not sorry about that. But I'll hold out for rewriting it all in Perl if that'll make you feel better.