Charlie Rose: So Larry sent me an email, and he basically said, we've got to make sure that we don't seem like we're a couple of middle-aged boring men.

I said, I'm flattered by that --

(Laughter)

because I'm a bit older, and he has a bit more net worth than I do.

Larry Page: Well, thank you.

CR: So we'll have a conversation about the Internet, we'll have a conversation about Google, and we'll have a conversation about search and privacy, and also about your philosophy and a sense of how you've connected the dots and how this journey that began some time ago has such interesting prospects. Mainly we want to talk about the future. So my first question: Where is Google and where is it going?

LP: Well, this is something we think about a lot, and our mission, which we defined a long time ago, is to organize the world's information and make it universally accessible and useful. And people always say, is that really what you guys are still doing? I always kind of think about that myself, and I'm not quite sure. But actually, when I think about search, it's such a deep thing for all of us, to really understand what you want, to understand the world's information, and we're still very much in the early stages of that, which is totally crazy. We've been at it for 15 years already, but it's not at all done.

CR: When it's done, how will it be?

LP: Well, in thinking about where we're going -- you know, why is it not done? -- a lot of it is just that computing's kind of a mess. Your computer doesn't know where you are, it doesn't know what you're doing, it doesn't know what you know, and a lot of what we've been trying to do recently is just make your devices work, make them understand your context. Google Now, you know, knows where you are, knows what you may need. So really having computing work and understand you and understand that information -- we really haven't done that yet. It's still very, very clunky.

CR: Tell me, when you look at what Google is doing, where does DeepMind fit?

LP: Yeah, so DeepMind is a company we just acquired recently. It's in the U.K.
First, let me tell you how we got there, which was looking at search and really trying to understand everything, and also to make the computers not clunky and really understand you -- like, voice was really important. So what's the state of the art on speech recognition? It's not very good. It doesn't really understand you. So we started doing machine learning research to improve that. That helped a lot. And we started just looking at things like YouTube. Can we understand YouTube? We actually ran machine learning on YouTube, and it discovered cats, just by itself. Now, that's an important concept. We realized there's really something here. If we can learn what cats are, that must be really important.

So I think what's really amazing about DeepMind is that it can actually -- they're learning things in this unsupervised way. They started with video games, and really just -- maybe I can show the video -- just playing video games, and learning how to do that automatically.

CR: Take a look at the video games and how machines are coming to be able to do some remarkable things.

LP: The amazing thing about this is that, I mean, obviously these are old games, but the system just sees what you see, the pixels, and it has the controls and it has the score, and it's learned to play all of these games, same program. It's learned to play all of these games with superhuman performance. We've not been able to do things like this with computers before. And maybe I'll just narrate this one quickly. This is boxing, and it figures out it can sort of pin the opponent down. The computer's on the left, and it's just racking up points. So imagine if this kind of intelligence were thrown at your schedule, or your information needs, or things like that. We're really just at the beginning of that, and that's what I'm really excited about.
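What Page describes here is, in essence, the reinforcement-learning loop: the agent's only input is the screen's pixels, its only outputs are the game's controls, and the score is its only feedback. The sketch below is a minimal, hypothetical illustration of that loop -- tabular Q-learning on a made-up five-by-five "catch" game, not DeepMind's actual system (which used deep neural networks); every name in it is invented for illustration.

```python
# A toy sketch of the pixels-in, score-out learning loop described above.
# NOT DeepMind's system; tabular Q-learning on a tiny made-up game.
import random
from collections import defaultdict

class CatchGame:
    """5x5 'screen': a dot falls from the top; move the paddle to catch it."""
    WIDTH = 5

    def reset(self):
        self.ball = random.randrange(self.WIDTH)
        self.row = 0
        self.paddle = self.WIDTH // 2
        return self.pixels()

    def pixels(self):
        # The agent sees only this flat tuple of 0/1 values, like raw pixels.
        grid = [0] * (self.WIDTH * self.WIDTH)
        grid[self.row * self.WIDTH + self.ball] = 1
        grid[(self.WIDTH - 1) * self.WIDTH + self.paddle] = 1
        return tuple(grid)

    def step(self, action):  # controls: 0 = left, 1 = stay, 2 = right
        self.paddle = min(max(self.paddle + action - 1, 0), self.WIDTH - 1)
        self.row += 1
        done = self.row == self.WIDTH - 1
        reward = 1 if done and self.paddle == self.ball else 0  # the "score"
        return self.pixels(), reward, done

# The learning code knows nothing about paddles or balls; it only ever sees
# (pixels, reward), so the same loop would run unchanged on any such game.
Q = defaultdict(lambda: [0.0, 0.0, 0.0])  # state -> value of each action
alpha, gamma, eps = 0.5, 0.9, 0.1
env = CatchGame()
for _ in range(5000):
    state, done = env.reset(), False
    while not done:
        if random.random() < eps:  # explore occasionally
            action = random.randrange(3)
        else:  # otherwise act greedily on current estimates
            action = max(range(3), key=lambda a: Q[state][a])
        nxt, reward, done = env.step(action)
        target = reward + (0.0 if done else gamma * max(Q[nxt]))
        Q[state][action] += alpha * (target - Q[state][action])  # TD update
        state = nxt
```

The point of the demo in the talk is the last stanza: one generic learning rule, driven only by pixels and score, across many different games.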
CR: When you look at all that's taken place with DeepMind and the boxing, a part of where we're going is also artificial intelligence. Where are we, when you look at that?

LP: Well, I think for me, this is kind of one of the most exciting things I've seen in a long time. The guy who started this company, Demis, has a neuroscience and a computer science background. He went back to school to get his Ph.D. to study the brain. So I think we're seeing a lot of exciting work going on that sort of crosses computer science and neuroscience, in terms of really understanding what it takes to make something smart and do really interesting things.

CR: But where's the level of it now? And how fast do you think we are moving?

LP: Well, this is the state of the art right now -- understanding cats on YouTube and things like that, improving voice recognition. We used a lot of machine learning to improve things incrementally, but I think for me, this example's really exciting, because it's one program that can do a lot of different things.

CR: I don't know if we can do this, but we've got the image of the cat. It would be wonderful to see this. This is how machines looked at cats and what they came up with. Can we see that image?

LP: Yeah.

CR: There it is. Can you see the cat? Designed by machines, seen by machines.

LP: That's right. So this is learned from just watching YouTube. There's no training, no notion of a cat, but this concept of a cat is something important that you would understand, and that now the machines can kind of understand too. Maybe just to finish on the search part: it started with search, really understanding people's context and their information. I did have a video I wanted to show quickly on that, one that we actually found.

(Video) ["Soy, Kenya"]

Zack Matere: Not long ago, I planted a crop of potatoes. Then suddenly they started dying one after the other. I checked out the books and they didn't tell me much. So, I went and I did a search.

["Zack Matere, Farmer"]

Potato diseases. One of the websites told me that ants could be the problem. It said, sprinkle wood ash over the plants. Then after a few days the ants disappeared. I got excited about the Internet.
I have this friend who really would like to expand his business. So I went with him to the cyber cafe and we checked out several sites. When I met him next, he was going to put a windmill at the local school. I felt proud, because something that wasn't there before was suddenly there. I realized that not everybody is able to access what I was able to access. I thought that I need to have an Internet that my grandmother can use. So I thought about a notice board. A simple wooden notice board. When I get information on my phone, I'm able to post the information on the notice board. So it's basically like a computer. I use the Internet to help people. I think I am searching for a better life for me and my neighbors.

So many people have access to information, but there's no follow-up to that. I think the follow-up to that is our knowledge. When people have the knowledge, they can find solutions without having to be helped out. Information is powerful, but it is how we use it that will define us.

(Applause)

LP: Now, the amazing thing about that video, actually, was that we just read about it in the news, and we found this gentleman and made that little clip.

CR: When I talk to people about you, they say to me -- people who know you well say -- Larry wants to change the world, and he believes technology can show the way. And that means access to the Internet. It has to do with languages. It also means how people can get access and do things that will affect their community, and this is an example.

LP: Yeah, that's right, and I think for me, I have been focusing on access more, if we're talking about the future. We recently released this Loon Project, which is using balloons to do it. It sounds totally crazy. We can show the video here. Actually, two out of three people in the world don't have good Internet access now. We actually think this can really help people, sort of cost-efficiently.

CR: It's a balloon.

LP: Yeah, get access to the Internet.

CR: And why does this balloon give you access to the Internet?
Because there were some interesting things you had to do to figure out how to make balloons possible, so that they didn't have to be tethered.

LP: Yeah, and this is a good example of innovation. We'd been thinking about this idea for five years or more before we started working on it, but it was just really: how do we get access points up high, cheaply? You normally have to use satellites, and it takes a long time to launch them. But you saw there how easy it is to launch a balloon and get it up. And actually, again, it's the power of the Internet -- I did a search on it, and I found that 30, 40 years ago, someone had put up a balloon and it had gone around the Earth multiple times. And I thought, why can't we do that today? And that's how this project got going.

CR: But are you at the mercy of the wind?

LP: Yeah, but it turns out, we did some weather simulations, which probably hadn't really been done before, and if you control the altitude of the balloons, which you can do by pumping air into them and other ways, you can actually control roughly where they go, and so I think we can build a worldwide mesh of these balloons that can cover the whole planet.
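To make the mechanism concrete: a balloon can't fight the wind, but winds at different stratospheric altitudes blow in different directions, so by rising or sinking the balloon can pick the layer that carries it roughly where it needs to go. Here is a toy sketch of that selection step, with a fabricated wind table; Loon's real planners worked from the kind of weather simulations Page mentions, and every name and number below is hypothetical.

```python
# Hypothetical illustration of steering a balloon by altitude choice alone.
# The wind table is fabricated; a real system would use forecast wind fields.
import math

# (altitude_km, wind_east_m_s, wind_north_m_s) -- made-up sample layers
WIND_LAYERS = [
    (16, 12.0, -3.0),
    (18, -5.0, 8.0),
    (20, 2.0, 15.0),
    (22, -9.0, -1.0),
]

def best_altitude(target_bearing_deg):
    """Pick the altitude whose wind blows closest to the desired bearing
    (degrees clockwise from north)."""
    def bearing_error(layer):
        _, east, north = layer
        wind_bearing = math.degrees(math.atan2(east, north)) % 360
        diff = abs(wind_bearing - target_bearing_deg) % 360
        return min(diff, 360 - diff)  # shortest angular distance
    return min(WIND_LAYERS, key=bearing_error)[0]

# Drift east (bearing 90): the 16 km layer's wind points closest to east.
print(best_altitude(90.0))  # -> 16
```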
CR: Before I talk about the future and transportation, where you've been a nerd for a while, and this fascination you have with transportation and automated cars and bicycles, let me talk a bit about what's been the subject here earlier with Edward Snowden. It is security and privacy. You have to have been thinking about that.

LP: Yeah, absolutely. I saw the picture of Sergey with Edward Snowden yesterday. Some of you may have seen it. But I think, for me, privacy and security are a really important thing. We think about it in terms of both things, and I think you can't have privacy without security, so let me just talk about security first, because you asked about Snowden and all of that, and then I'll say a little bit about privacy. I think for me, it's tremendously disappointing that the government secretly did all this stuff and didn't tell us. I don't think we can have a democracy if we're having to protect you and our users from the government over stuff that we've never had a conversation about. And I don't mean we have to know what the particular terrorist attack is that they're worried about protecting us from, but we do need to know what the parameters of it are -- what kind of surveillance the government's going to do, and how, and why -- and I think we haven't had that conversation. So I think the government's actually done itself a tremendous disservice by doing all that in secret.

CR: Never coming to Google to ask for anything.

LP: Not Google, but the public. I think we need to have a debate about that, or we can't have a functioning democracy. It's just not possible. So I'm sad that Google's in the position of protecting you and our users from the government doing secret things that nobody knows about. It doesn't make any sense.

CR: Yeah. And then there's a privacy side of it.

LP: Yes. The privacy side -- I think the world is changing. You carry a phone. It knows where you are. There's so much more information about you, and that's an important thing, and it makes sense why people are asking difficult questions. We spend a lot of time thinking about this and what the issues are. I think the main thing that we need to do is just provide people choice, show them what data's being collected -- search history, location data. We're excited about incognito mode in Chrome, and doing that in more ways, just giving people more choice and more awareness of what's going on. I also think it's very easy -- what I'm worried about is that we throw out the baby with the bathwater. And I look at -- on your show, actually, I kind of lost my voice, and I haven't gotten it back. I'm hoping that by talking to you I'm going to get it back.

CR: If I could do anything, I would do that.

LP: All right. So get out your voodoo doll and whatever you need to do. But I think, you know what, I look at that, I made that public, and I got all this information.
We got a survey done on medical conditions with people who have similar issues, and I look at medical records, and I say, wouldn't it be amazing if everyone's medical records were available anonymously to research doctors? And when someone accesses your medical record -- a research doctor -- you could see which doctor accessed it and why, and you could maybe learn about what conditions you have. I think if we just did that, we'd save 100,000 lives this year.

CR: Absolutely. Let me go --

(Applause)

LP: So I guess I'm just very worried that with Internet privacy, we're doing the same thing we're doing with medical records: we're throwing out the baby with the bathwater, and we're not really thinking about the tremendous good that can come from people sharing information with the right people in the right ways.

CR: And the necessary condition is that people have confidence that their information will not be abused.

LP: Yeah, and I had this problem with my voice stuff. I was scared to share it. Sergey encouraged me to do that, and it was a great thing to do.

CR: And the response has been overwhelming.

LP: Yeah, and people are super positive. We got thousands and thousands of people with similar conditions, which there's no data on today. So it was a really good thing.

CR: So talking about the future, what is it about you and transportation systems?

LP: Yeah. I guess I was just frustrated with this when I was at college in Michigan. I had to get on the bus and take it and wait for it. And it was cold and snowing. I did some research on how much it cost, and I just became a bit obsessed with transportation systems.

CR: And that began the idea of an automated car.

LP: Yeah. About 18 years ago I learned about people working on automated cars, and I became fascinated by that. It takes a while to get these projects going, but I'm super excited about the possibilities of that improving the world. There are 20 million people or more injured per year. It's the leading cause of death for people under 34 in the U.S.

CR: So you're talking about saving lives.
LP: Yeah, and also saving space and making life better. Los Angeles is half parking lots and roads -- half of the area -- and most cities are not far behind, actually. It's just crazy that that's what we use our space for.

CR: And how soon will we be there?

LP: I think we can be there very, very soon. We've driven well over 100,000 miles now totally automated. I'm super excited about getting that out quickly.

CR: But it's not only automated cars you're talking about. You also have this idea for bicycles.

LP: Well, at Google, we got this idea that we should just provide free bikes to everyone, and that's been amazing for most of the trips. You see bikes going everywhere, and the bikes wear out. They're getting used 24 hours a day.

CR: But you want to put them above the street, too.

LP: Well, I said, how do we get people using bikes more?

CR: We may have a video here.

LP: Yeah, let's show the video. I just got excited about this.

(Music)

So this is actually how you might separate bikes from cars with minimal cost. Anyway, it looks totally crazy, but I was actually thinking about our campus, working with the Zippies and stuff, and just trying to get a lot more bike usage, and I was thinking about how you cost-effectively separate the bikes from traffic. And I went and searched, and this is what I found. And we're not actually working on this, that particular thing, but it gets your imagination going.

CR: Let me close with this. Give me a sense of the philosophy of your own mind. You have this idea of [Google X]. You don't simply want to go into some small, measurable arena of progress.

LP: Yeah, I think many of the things we just talked about are like that, where they're really -- I almost use the economic concept of additionality, which means that you're doing something that wouldn't happen unless you were actually doing it. And I think the more you can do things like that, the bigger impact you have, and that's about doing things that people might not think are possible.
And I've been amazed, the more I learn about technology, the more I realize I don't know, and that's because of this technological horizon, the thing that you can see to do next: the more you learn about technology, the more you learn what's possible. You learn that the balloons are possible because there's some material that will work for them.

CR: What's interesting about you too, though, for me, is that we have lots of people who are thinking about the future, and they are going and looking and they're coming back, but we never see the implementation. I think of somebody you knew and read about, Tesla. The principle of that for you is what?

LP: Well, I think invention is not enough. If you invent something -- Tesla invented the electric power that we use, but he struggled to get it out to people. That had to be done by other people. It took a long time. And I think if we can actually combine both things, where we have an innovation and invention focus, plus the ability to really -- a company that can really commercialize things and get them to people in a way that's positive for the world, and give people hope. You know, I'm amazed with the Loon Project, just how excited people were about that, because it gave them hope for the two thirds of the world that doesn't have Internet right now that's any good.

CR: Which is a second thing about corporations. You are one of those people who believe that corporations are an agent of change if they are run well.

LP: Yeah. I'm really dismayed that most people think companies are basically evil. They get a bad rap. And I think that's somewhat correct. Companies are doing the same incremental thing that they did 50 years ago, or 20 years ago. That's not really what we need. We need, especially in technology, revolutionary change, not incremental change.
CR: You once said, actually -- as I think I've got this about right -- that you might consider, rather than giving your money, if you were leaving it to some cause, just simply giving it to Elon Musk, because you had confidence that he would change the future, and that you would therefore --

LP: Yeah. If you want to go to Mars -- he wants to go to Mars, to back up humanity -- that's a worthy goal, but it's a company, and it's philanthropic. So I think we aim to do kind of similar things. And I think, you ask -- we have a lot of employees at Google who have become pretty wealthy. People make a lot of money in technology. A lot of people in the room are pretty wealthy. You're working because you want to change the world. You want to make it better. Why isn't the company that you work for worthy not just of your time but your money as well? I mean, we don't have a concept of that. That's not how we think about companies, and I think it's sad, because companies are most of our effort. They're where most people's time is, where a lot of the money is, and so I think I'd like for us to help out more than we are.

CR: When I close conversations with lots of people, I always ask this question: What state of mind, what quality of mind is it that has served you best? People like Rupert Murdoch have said curiosity, and other people in the media have said that. Bill Gates and Warren Buffett have said focus. What quality of mind, as I leave this audience, has enabled you to think about the future and at the same time change the present?

LP: You know, I think the most important thing -- I looked at lots of companies and why I thought they don't succeed over time. We've had a more rapid turnover of companies. And I said, what did they fundamentally do wrong? What did those companies all do wrong? And usually it's just that they missed the future. And so I think, for me, I just try to focus on that and say, what is that future really going to be, and how do we create it, and how do we cause our organization to really focus on that and drive that at a really high rate?
And so that's been curiosity, it's been looking at things people might not think about, working on things that no one else is working on, because that's where the additionality really is, and being willing to do that, to take that risk. Look at Android. I felt guilty about working on Android when it was starting. It was a little startup we bought. It wasn't what we were really working on. And I felt guilty about spending time on that. That was stupid. That was the future, right? That was a good thing to be working on.

CR: It is great to see you here. It's great to hear from you, and a pleasure to sit at this table with you. Thanks, Larry.

LP: Thank you.

(Applause)

CR: Larry Page.