0:00:00.000,0:00:03.204 (Toby Walsh) I want to talk [br]about artificial intelligence: it's --
0:00:03.674,0:00:05.089 I'm a professor of artificial intelligence
0:00:05.089,0:00:09.271 and it's a great time, 2015, [br]to be working in AI.
0:00:09.544,0:00:10.544 We're making real, palpable progress
0:00:10.544,0:00:12.820 and there's loads of money [br]being thrown at us.
0:00:12.820,0:00:17.000 Google just spent [br]five hundred million dollars --
0:00:17.000,0:00:21.989 pounds buying an AI startup called DeepMind a couple of weeks ago,
0:00:21.989,0:00:25.172 today they announced that they were [br]going to spend a billion dollars
0:00:25.671,0:00:28.469 setting up an AI lab in Silicon Valley.
0:00:29.001,0:00:31.591 IBM is betting about [br]a third of the company
0:00:31.830,0:00:34.858 on their cognitive AI computing effort.
0:00:35.084,0:00:38.216 So it's a really interesting time to be working in AI.
0:00:38.631,0:00:41.782 But the first thing I wanted to [br]help inform you about
0:00:42.250,0:00:46.792 is what is the state of the art, [br]what progress have we made in AI,
0:00:46.792,0:00:51.187 because Hollywood paints [br]all these pictures,
0:00:51.187,0:00:53.488 these mostly dystopian pictures of AI.
0:00:53.495,0:00:58.672 Whenever the next science fiction movie[br]comes out, I put my head in my hands
0:00:58.672,0:01:01.755 and think, Oh my God, what do people think[br]that we're doing?
0:01:01.755,0:01:08.729 So, I wanted to start by just giving you[br]a feel for what AI actually is really capable of.
0:01:09.081,0:01:13.872 So a couple of years ago, IBM Watson[br]won the game show Jeopardy!,
0:01:13.872,0:01:19.737 won the million-dollar prize in the game show[br]Jeopardy!, beating the reigning human champions.
0:01:19.737,0:01:23.357 Now, you might think, well that's just[br]a party trick, isn't it?
0:01:23.357,0:01:27.625 It's a -- pour enough of Wikipedia [br]and the internet into a computer,
0:01:27.625,0:01:30.543 and it can answer general knowledge[br]questions.
0:01:30.543,0:01:33.204 Well, you guys are being a bit unfair to[br]IBM Watson:
0:01:33.204,0:01:35.881 they are quite cryptic questions[br]it is answering.
0:01:35.881,0:01:40.878 But just to give you a real feel for [br]what is technically possible today,
0:01:40.878,0:01:43.586 something that was announced[br]just two days ago:
0:01:43.834,0:01:51.869 some colleagues of mine at NII in Japan[br]passed the University Entrance Exam
0:01:51.869,0:01:53.433 with an AI program.
0:01:53.433,0:01:56.326 Now, I thought long and hard about[br]putting up a page of maths.
0:01:56.326,0:01:58.007 I thought, well, I'm going to get --
0:01:58.007,0:02:00.007 half of the audience is going to [br]leave immediately
0:02:00.007,0:02:01.354 if I put up a page of maths.
0:02:01.354,0:02:03.133 But I wanted you to see, just to feel,
0:02:03.962,0:02:07.403 the depth of questions [br]that they were answering.
0:02:07.403,0:02:11.507 So this is from the maths paper, you know,[br]a non-trivial, sort of,
0:02:11.507,0:02:15.148 if you come from the UK, [br]A-level-like maths question
0:02:15.148,0:02:16.468 that they were able to answer.
0:02:17.272,0:02:21.889 Here is a physics question[br]about Newtonian dynamics
0:02:21.889,0:02:23.552 that they were able to answer.
0:02:23.552,0:02:28.545 Now, they got 511 points out of[br]a maximum of 950.
0:02:28.939,0:02:32.391 That's more than the average score[br]of Japanese students
0:02:32.391,0:02:34.007 sitting the entrance exam.
0:02:34.007,0:02:38.542 They would have got into most [br]Japanese universities with a score of 511.
0:02:38.908,0:02:40.621 That's what's possible today.
0:02:40.621,0:02:44.134 Their ambition in 10 years' time is to get[br]into Tokyo, the University of Tokyo,
0:02:44.134,0:02:46.434 which is one of the best universities[br]in the world.
0:02:48.488,0:02:50.475 So, this is why I put up [br]a picture of Terminator,
0:02:50.475,0:02:53.468 because whenever I talk to the media[br]about what we do in AI,
0:02:53.476,0:02:55.769 they put up a picture of Terminator,[br]right?
0:02:55.769,0:02:58.098 So I don't want you to worry [br]about Terminator,
0:02:58.098,0:03:02.117 Terminator is at least [br]50 to 100 years away,
0:03:02.582,0:03:05.985 and there are lots of reasons why[br]we don't have to worry about it,
0:03:05.985,0:03:07.248 about Terminator.
0:03:07.248,0:03:08.979 I'm not going to go and spend[br]too much time
0:03:08.979,0:03:11.348 on why you don't have to worry about Terminator.
0:03:11.676,0:03:13.334 But there are actually things [br]that you should worry about,
0:03:13.334,0:03:15.352 much nearer than Terminator.
0:03:17.007,0:03:20.160 Many people have said, [br]Stephen Hawking has said
0:03:20.160,0:03:22.713 that, you know, AI is going to spell[br]the end of the human race.
0:03:23.295,0:03:27.961 Elon Musk chimed in afterwards and said,[br]"It's our biggest existential threat."
0:03:29.793,0:03:32.841 Bill Gates followed on by saying,[br]"Elon Musk was right."
0:03:33.498,0:03:39.519 Lots of people have said that's a --[br]AI is a real existential threat to us.
0:03:39.519,0:03:42.887 I don't want you to worry about[br]the existential threat that AI
0:03:42.887,0:03:44.900 or Terminator is going to bring.
0:03:45.953,0:03:47.711 There's actually a very common confusion,
0:03:48.121,0:03:51.139 which is that it's not the AI [br]that's going to be the existential threat,
0:03:51.139,0:03:52.139 it's autonomy, it's autonomous systems.
0:03:54.386,0:03:56.042 Watson --
0:03:56.042,0:03:57.972 it isn't going to wake up [br]any time in the morning and say:
0:03:57.972,0:04:02.109 "You know what? I'm tired [br]of playing Jeopardy,
0:04:02.109,0:04:04.173 "I want to play Who Wants to Be a Millionaire!
0:04:05.299,0:04:07.468 "Or wait a second, I'm tired of playing [br]game shows,
0:04:07.468,0:04:09.393 "I want to take over the universe."
0:04:09.821,0:04:12.195 It's just not in its code, there is no way,
0:04:12.195,0:04:15.065 it's not given any freedom to think [br]about anything other
0:04:15.065,0:04:17.000 than maximizing its Jeopardy score.
0:04:18.219,0:04:21.209 And it has no desires, no other desires
0:04:21.228,0:04:23.482 than to improve its maximum score.
0:04:23.482,0:04:26.256 So I don't want you to worry about Terminator,
0:04:27.350,0:04:29.102 but I do want you to worry about jobs.
0:04:29.974,0:04:32.166 Because lots of people, [br]lots of very serious people,
0:04:32.166,0:04:35.076 have been saying [br]that AI is going to end jobs,
0:04:35.076,0:04:38.157 and that is of very great consequence [br]for anyone working in education,
0:04:38.157,0:04:42.269 because, certainly, the jobs that are going [br]to exist in the future
0:04:42.269,0:04:44.925 are going to be different [br]than the jobs that exist today.
0:04:46.880,0:04:49.724 Now, who has an odd birthday?
0:04:51.251,0:04:52.938 Well, I haven't told you [br]what an odd birthday is yet,
0:04:52.938,0:04:54.990 so someone has an odd birthday, like me.
0:04:54.990,0:04:57.858 OK.
Who was born on an odd-numbered [br]day of the month?
0:04:57.858,0:04:59.995 I was born on the 11th of April, right?
0:05:00.824,0:05:03.368 Come on, it's half the room, [br]I know it's half the room.
0:05:03.368,0:05:04.923 (Laughter)
0:05:04.923,0:05:08.690 OK. Well, you want to have [br]an odd birthday, by the way,
0:05:09.059,0:05:12.504 because that means, in 20 years' time, [br]you will be a person with a job.
0:05:13.426,0:05:16.277 As opposed to the even people, [br]who won't have jobs.
0:05:16.277,0:05:20.662 That's a certainty -- if you believe [br]lots of serious people.
0:05:22.141,0:05:25.010 You might have missed this news [br]on Friday the 13th --
0:05:25.010,0:05:27.778 I thought this was a rather[br]depressing news story
0:05:27.778,0:05:29.748 (inaudible) --
0:05:29.777,0:05:31.527 but the chief economist of the Bank of England
0:05:31.922,0:05:36.230 went on the record saying 50% of jobs [br]were under threat in the UK.
0:05:36.948,0:05:40.420 And he's not the first serious person[br]who should know what he's talking about
0:05:40.420,0:05:42.736 who has said similar things.
0:05:43.029,0:05:46.255 There was a very influential Merrill Lynch[br]report that came out a month or two ago
0:05:46.270,0:05:49.430 saying very similar things about [br]the impact of AI,
0:05:49.430,0:05:51.375 robotics and automation on jobs.
0:05:53.439,0:05:56.278 And some of this goes back to, I think, [br]one of the first reports
0:05:56.278,0:05:59.288 that really hit the press, [br]that really got people's attention:
0:05:59.288,0:06:02.075 a report that came out of [br]the Oxford Martin School.
0:06:02.075,0:06:05.702 They predicted that 47% of jobs [br]in the United States
0:06:06.110,0:06:08.550 were under threat of automation [br]in the next 20 years.
0:06:10.234,0:06:13.594 We followed that up [br]with a very similar study and analysis
0:06:13.880,0:06:15.961 for jobs in Australia, where I work.
0:06:16.865,0:06:19.323 And because it's a slightly different [br]profile of workers,
0:06:19.323,0:06:24.329 of the workforce in Australia, we came up[br]with a number of around 40%.
0:06:24.329,0:06:27.129 These are non-trivial numbers, right? [br]40-50%.
0:06:27.589,0:06:32.497 Now, just an aside: 47%, I don't know [br]why they didn't say 47.2%, right?
0:06:32.497,0:06:35.551 You can't believe a number [br]that's far too precise
0:06:35.551,0:06:37.842 when you're predicting the future,[br]but nevertheless,
0:06:37.842,0:06:39.532 the fact that it's of this sort of scale
0:06:39.532,0:06:44.601 is what you've got to take away: it wasn't 4%, [br]it was roughly about half the jobs.
0:06:47.784,0:06:50.495 Now, let's put some context to this.
0:06:50.495,0:06:52.413 I mean, is this really a credible claim?
0:06:52.413,0:06:55.549 The Number One job [br]in the United States today: truck driver.
0:06:55.962,0:06:57.526 Now, you might have noticed,
0:06:57.536,0:07:00.079 the autonomous cars[br]are coming to us very soon.
0:07:00.506,0:07:03.579 We're going to be having -- we had[br]the first trial of autonomous cars
0:07:03.579,0:07:06.760 on the roads, the public roads of Australia, [br]three weeks ago.
0:07:07.371,0:07:09.657 The Google Car has driven [br]over a million kilometers
0:07:09.657,0:07:12.542 -- or the Google cars, rather, [br]have driven over a million kilometers,
0:07:12.907,0:07:15.050 autonomously, on the roads of California.
0:07:15.893,0:07:18.553 In 20 years' time, we are going to have [br]autonomous cars.
0:07:18.553,0:07:20.255 We're also going to have [br]autonomous trucks.
0:07:21.033,0:07:24.769 So if you are in the Number One profession[br]in the United States,
0:07:24.769,0:07:28.480 you have to worry that your job [br]is going to be automated away.
0:07:29.712,0:07:32.494 The Number Two job in the United States [br]is salesperson.
0:07:33.343,0:07:36.248 Again, since we use the internet,
0:07:36.248,0:07:39.618 we've actually mostly automated [br]that process ourselves,
0:07:39.618,0:07:43.382 but it's clear that a lot of those jobs [br]are going to be disappearing.
0:07:43.382,0:07:46.736 So I think these claims [br]have a lot of credibility.
0:07:48.896,0:07:53.104 There's actually a nice dinner party game [br]that my colleagues in AI play
0:07:53.104,0:07:55.736 at the end of our conferences,[br]where we sit around
0:07:55.736,0:07:58.250 and the game is, you have to name a job
0:07:59.115,0:08:02.671 and then, someone has to put up [br]some credible evidence
0:08:02.671,0:08:05.748 that we're actually well on the way [br]to automating that.
0:08:05.748,0:08:08.152 And this game is almost impossible to win.
0:08:08.737,0:08:10.423 If I had more time, [br]I'd play the game with you.
0:08:11.293,0:08:15.260 The only -- about the only winning answer[br]is politician.
0:08:15.650,0:08:17.931 (Laughter)
0:08:17.931,0:08:20.747 They will certainly regulate so that[br]they'll be the last to be automated.
0:08:20.747,0:08:22.731 But that's about [br]the only winning answer we have.
0:08:24.049,0:08:28.836 So -- and it's not just technology [br]that is the cause of this.
0:08:28.836,0:08:33.035 There are many other, really, [br]sort of rather unhelpful trends.
0:08:33.035,0:08:35.558 If you were trying to set up [br]the world's economy,
0:08:35.558,0:08:38.842 you would not put these things [br]all down on the table at the same time:
0:08:38.842,0:08:40.825 the ongoing [br]global financial crisis,
0:08:40.825,0:08:43.840 which seems like [br]it will never disappear, I think;
0:08:44.755,0:08:48.013 the fact that we're all living longer: [br]this is great, great news for us
0:08:48.013,0:08:50.333 but bad news for employment;
0:08:50.961,0:08:54.425 the impact of globalization, the fact that [br]we can outsource our work
0:08:54.425,0:08:56.191 to cheaper economies.
0:08:56.191,0:09:00.191 All of these things [br]are compounding the impact
0:09:00.191,0:09:02.957 that technology is having [br]on the nature of work.
0:09:04.749,0:09:09.725 And this transformation is going to be [br]different than the last one,
0:09:09.725,0:09:11.357 the Industrial Revolution.
0:09:12.021,0:09:15.591 There's no hard and fast [br]rule of economics that says:
0:09:16.911,0:09:20.039 "As many jobs need to be created [br]by a new technology as are destroyed."
0:09:20.039,0:09:22.952 Every time we have a new technology, [br]of course, new jobs are created.
0:09:22.952,0:09:25.883 There are lots, there are thousands, [br]hundreds of thousands of new jobs
0:09:26.204,0:09:27.848 enabled by technology today.
0:09:28.355,0:09:32.484 But there's no reason that they have to [br]balance exactly those that are destroyed.
0:09:33.092,0:09:36.199 In the last -- in the last revolution, [br]that did happen to be the case.
0:09:36.853,0:09:40.930 A third of the population was working [br]out in the fields, in agriculture.
0:09:40.930,0:09:44.552 Now, worldwide, [br]it's 3 or 4% of the world's population
0:09:44.552,0:09:45.657 working in agriculture.
0:09:45.657,0:09:48.029 Those people are working [br]in factories and offices now.
0:09:48.029,0:09:51.920 We employ far more people than we did [br]at the turn of the 19th century.
0:09:52.629,0:09:55.993 But this one looks different, this [br]information revolution looks different.
0:09:55.993,0:10:00.233 It looks like it has the potential [br]to take away more jobs, perhaps,
0:10:00.233,0:10:01.529 than it creates.
0:10:01.529,0:10:03.684 And one of the other things is that [br]we used to think
0:10:03.684,0:10:05.293 it was the blue-collar jobs that were under threat.
0:10:06.509,0:10:10.937 And that's true: if you go [br]to a car factory today, sure enough,
0:10:10.937,0:10:12.578 there are robots [br]that are doing the painting,
0:10:12.578,0:10:14.800 there are robots [br]that are doing the welding.
0:10:14.800,0:10:16.584 But nowadays, it's white-collar jobs:
0:10:16.584,0:10:20.006 it's journalists, it's lawyers, [br]it's accountants,
0:10:20.006,0:10:21.667 it's these jobs that are under threat.
0:10:21.667,0:10:25.444 These graphs here show [br]the percentage change in employment
0:10:25.444,0:10:31.244 and the change in employment rates.
0:10:31.660,0:10:34.672 And it's the middle, the middle-class, [br]white-collar professions
0:10:34.672,0:10:37.923 that we thought you would go [br]to university for, to make yourself safe,
0:10:37.926,0:10:40.551 but they seem to be the ones [br]that are most under threat.
0:10:40.551,0:10:42.646 If you are a (inaudible), [br]it's probably --
0:10:43.277,0:10:46.172 you're too cheap to be replaced [br]by something automated.
0:10:46.172,0:10:49.963 But if you're a more expensive person -- [br]and this means
0:10:49.963,0:10:53.427 that the rich are getting richer, and the [br]inequalities that we are seeing in society,
0:10:53.427,0:10:55.281 that are distressing our societies today,
0:10:55.281,0:10:58.284 seem to be magnified [br]by these technological changes.
0:10:59.899,0:11:01.761 And there are so many frightening graphs.
0:11:01.761,0:11:03.723 Go and read [br]Thomas Piketty, I encourage you.
0:11:04.079,0:11:05.980 Go and look at one of his books[br]and you can see here
0:11:05.980,0:11:09.619 that we're seeing [br]a constant improvement in productivity.
0:11:10.079,0:11:12.860 Technology is buying us [br]those improvements in productivity,
0:11:12.860,0:11:17.888 it is increasing our wealth,[br]but there's a leveling off of employment.
0:11:18.462,0:11:20.187 And so, the challenge, then, is how
0:11:20.753,0:11:23.651 -- it's a question for society, [br]not for a technologist like myself --
0:11:23.651,0:11:25.777 how do we all benefit [br]from this rising tide,
0:11:25.777,0:11:30.648 so that it is not just the rich getting richer [br]and the rest of us falling further behind.
0:11:31.613,0:11:36.443 So, many parts of many jobs [br]look likely to be automated.
0:11:36.767,0:11:39.624 One confusion is this: people say [br]these jobs are going to disappear.
0:11:39.624,0:11:43.009 Actually, it seems to be more likely that [br]many parts of your job will be automated.
0:11:43.009,0:11:45.892 But that still means that there is [br]perhaps less employment around.
0:11:46.201,0:11:48.860 So how can you make yourself [br]more future-proof?
0:11:49.382,0:11:51.291 Well, I have two pieces of advice [br]as a technologist,
0:11:51.291,0:11:53.950 in terms of what's going to be [br]technically possible in AI.
0:11:54.215,0:11:57.886 Either you've got to embrace the future, [br]so become like me,
0:11:58.605,0:12:01.725 become someone who's working [br]on trying to invent that future.
0:12:02.218,0:12:03.671 And if you're not technically minded, [br]that's fine:
0:12:03.671,0:12:07.127 I've got the other part of the equation, [br]the other answer to your question,
0:12:07.127,0:12:10.437 which is completely at the other end [br]of the spectrum, which is:
0:12:10.786,0:12:14.011 focus on those things that AI finds[br]the hardest --
0:12:14.011,0:12:15.672 making computers more creative,
0:12:15.672,0:12:18.719 making computers that can understand [br]your emotional state,
0:12:19.269,0:12:23.111 focusing on emotional intelligence [br]and not intellectual intelligence.
0:12:24.765,0:12:26.866 So, how safe is education?
0:12:26.866,0:12:28.728 The room here is full of people [br]working in education.
0:12:28.728,0:12:30.104 How safe are your jobs?
0:12:30.104,0:12:33.452 Well, these are the numbers [br]from that Oxford Martin report I quoted.
0:12:33.789,0:12:35.096 So if you're a telemarketer,
0:12:35.096,0:12:37.908 99% chance that [br]you're going to be automated.
0:12:37.908,0:12:39.162 Not surprising, right?
0:12:39.162,0:12:41.208 Easy to automate, it's down the phone.
0:12:42.589,0:12:45.322 Some of the numbers [br]I just don't want you to take away:
0:12:45.322,0:12:48.195 they used machine learning, they used AI [br]to actually generate the report,
0:12:48.195,0:12:50.240 and I don't believe some of the numbers.
0:12:50.240,0:12:53.141 I don't believe: [br]"Bicycle repairmen: 94%."
0:12:53.141,0:12:54.550 There's no chance in hell
0:12:54.550,0:12:56.804 that the bicycle repair person [br]is going to be automated:
0:12:56.804,0:12:58.916 far too cheap and intricate a job.
0:12:58.916,0:13:03.840 "Parking lot attendant: 87%."[br]I don't know why it's not 100%:
0:13:03.840,0:13:06.271 well, you're not going to have[br]parking lot attendants, for sure.
0:13:06.770,0:13:08.290 But look: luckily,
0:13:08.290,0:13:10.408 you and I are [br]right at the bottom of the list,
0:13:10.743,0:13:12.974 down at the 0s and 1%, right?
0:13:13.451,0:13:15.381 I think those numbers [br]probably underestimate
0:13:15.981,0:13:19.203 how replaceable, or [br]how irreplaceable, we are,
0:13:19.203,0:13:23.315 but nevertheless, [br]you can take some heart away
0:13:23.315,0:13:25.418 from the sort of numbers you see there.
0:13:26.488,0:13:28.217 And the reason being?
0:13:28.217,0:13:30.873 The first reason is because [br]we're dealing with people,
0:13:30.873,0:13:32.436 we're trying [br]to understand people,
0:13:32.436,0:13:36.469 understand their motivations,[br]what their mental blocks are.
0:13:36.469,0:13:38.404 These are things that are [br]really hard to get computers,
0:13:38.404,0:13:41.347 I can tell you, really hard to get [br]computers, to program computers, to do.
0:13:43.683,0:13:46.774 So, one of the things, I think, [br]we have to realize,
0:13:46.774,0:13:49.943 with this impact that automation[br]and AI in particular
0:13:49.943,0:13:51.841 are going to have on jobs
0:13:52.292,0:13:55.153 -- and as was mentioned in several [br]of the earlier talks --
0:13:55.153,0:13:58.920 is that, of course, we try to educate [br]people for a future that does not yet exist,
0:13:58.920,0:14:01.075 for technologies [br]that have yet to be invented.
0:14:01.075,0:14:05.414 Education, therefore, inherently, [br]is going to have to be a lifelong process.
0:14:05.860,0:14:08.445 We can't teach you [br]the programming language of the future,
0:14:08.445,0:14:09.539 because we haven't invented it.
0:14:09.539,0:14:13.241 It's going to be -- [br]we have to teach you fundamental ideas
0:14:13.528,0:14:17.127 so that you can then go off and learn [br]as you go on in life.
0:14:17.531,0:14:19.359 So, to have a job, [br]you're going to have to
0:14:19.359,0:14:21.500 keep yourself abreast [br]of the latest technologies,
0:14:21.500,0:14:25.962 and AI and technology should be there [br]to help us do that.
0:14:26.996,0:14:32.680 AI will help that lifelong journey, and [br]I think the big thing that AI can help with
0:14:32.680,0:14:35.852 is to personalize[br]that learning experience,
0:14:36.174,0:14:39.166 to construct a model [br]of your understanding of the topic.
0:14:39.576,0:14:43.119 So, to generate for you [br]infinite numbers of test problems
0:14:43.440,0:14:45.925 that can be tailored to exactly [br]where your state of knowledge is,
0:14:45.925,0:14:48.565 and then mark those test problems [br]instantly.
0:14:49.004,0:14:50.689 So, the fact that we can answer,
0:14:50.948,0:14:53.782 that we can answer [br]Japanese college entrance exams means
0:14:53.782,0:14:56.338 that we can also mark [br]Japanese college entrance exams.
0:14:56.338,0:14:58.329 We can do all those things, right,
0:14:58.329,0:15:00.555 with technology, and we can do it for you.
0:15:01.511,0:15:04.444 And MOOCs will turn into [br]-- I've invented a little acronym there --
0:15:04.444,0:15:07.838 POOCs, Personal Open Online Courses,[br]right?
0:15:07.838,0:15:10.484 I don't think they should be massive, [br]they should be for you.
0:15:11.459,0:15:13.434 You can choose your own trajectory.
0:15:13.434,0:15:15.218 I always find it very strange
0:15:15.218,0:15:18.739 that they're still [br]just like the classroom experience.
0:15:18.739,0:15:20.008 We should be able to follow
0:15:20.008,0:15:23.583 whatever interesting trajectory we want [br]through the material.
0:15:24.145,0:15:28.385 And then, of course, use AI techniques [br]like data mining and analytics
0:15:28.385,0:15:31.583 across the massive part of the MOOC
0:15:31.583,0:15:34.163 to really improve [br]your learning experience.
0:15:35.075,0:15:37.629 So, I just wanted to conclude by saying
0:15:37.629,0:15:40.278 AI is not quite [br]what you see in the movies,
0:15:40.617,0:15:42.870 but we are making impressive progress,
0:15:42.870,0:15:45.680 and it's certainly going to be [br]an interesting part of the equation
0:15:45.680,0:15:48.127 for this interesting future [br]that we all face.
0:15:48.403,0:15:55.743 Thank you very much. [br](Applause)