My job is to design, build and study robots that communicate with people. But this story doesn't start with robotics at all, it starts with animation. When I first saw Pixar's "Luxo Jr.," I was amazed by how much emotion they could put into something as trivial as a desk lamp. I mean, look at them -- at the end of this movie, you actually feel something for two pieces of furniture.

(Laughter)

And I said, I have to learn how to do this. So I made a really bad career decision.

(Laughter)

And that's what my mom was like when I did it.

(Laughter)

I left a very cozy tech job in Israel at a nice software company and I moved to New York to study animation. And there I lived in a collapsing apartment building in Harlem with roommates. I'm not using this phrase metaphorically -- the ceiling actually collapsed one day in our living room. Whenever they did news stories about building violations in New York, they would put the report in front of our building, as kind of, like, a backdrop to show how bad things are.

Anyway, during the day, I went to school and at night I would sit and draw frame by frame of pencil animation. And I learned two surprising lessons. One of them was that when you want to arouse emotions, it doesn't matter so much how something looks; it's all in the motion, in the timing of how the thing moves. And the second was something one of our teachers told us. He actually did the weasel in "Ice Age." And he said, "As an animator, you're not a director -- you're an actor." So, if you want to find the right motion for a character, don't think about it -- go use your body to find it. Stand in front of a mirror, act it out in front of a camera -- whatever you need -- and then put it back in your character.

A year later I found myself at MIT in the Robotic Life Group. It was one of the first groups researching the relationships between humans and robots. And I still had this dream to make an actual, physical Luxo Jr. lamp. But I found that robots didn't move at all in this engaging way that I was used to from my animation studies. Instead, they were all -- how should I put it -- they were all kind of robotic.

(Laughter)

And I thought, what if I took whatever I learned in animation school, and used that to design my robotic desk lamp?
So I went and designed it frame by frame to try to make this robot as graceful and engaging as possible. And here you can see the robot interacting with me on a desktop -- and I'm actually redesigning the robot, so, unbeknownst to itself, it's kind of digging its own grave by helping me.

(Laughter)

I wanted it to be less of a mechanical structure giving me light, and more of a helpful, kind of quiet apprentice that's always there when you need it and doesn't really interfere. And when, for example, I'm looking for a battery that I can't find, in a subtle way, it'll show me where the battery is.

So you can see my confusion here. I'm not an actor.

And I want you to notice how the same mechanical structure can, at one point, just by the way it moves, seem gentle and caring, and in another case, seem violent and confrontational. And it's the same structure, just the motion is different.

Actor: "You want to know something? Well, you want to know something? He was already dead! Just laying there, eyes glazed over!"

(Laughter)

But moving in a graceful way is just one building block of this whole structure called human-robot interaction. At the time, I was doing my PhD, working on human-robot teamwork: teams of humans and robots working together. I was studying the engineering, the psychology, the philosophy of teamwork, and at the same time, I found myself in my own kind of teamwork situation, with a good friend of mine, who's actually here. And in that situation, we can easily imagine robots in the near future being there with us. It was after a Passover Seder. We were folding up a lot of folding chairs, and I was amazed at how quickly we found our own rhythm. Everybody did their own part, we didn't have to divide our tasks. We didn't have to communicate verbally about this -- it all just happened. And I thought, humans and robots working together look nothing like this. When humans and robots interact, it's much more like a chess game: the human does a thing, the robot analyzes whatever the human did, the robot decides what to do next, plans it and does it. Then the human waits, until it's their turn again. So it's much more like a chess game, and that makes sense, because chess is great for mathematicians and computer scientists.
It's all about information, analysis, decision-making and planning. But I wanted my robot to be less of a chess player, and more like a doer that just clicks and works together. So I made my second horrible career choice: I decided to study acting for a semester. I took time off from the PhD and went to acting classes. I actually participated in a play -- I hope there's no video of that around still.

(Laughter)

And I got every book I could find about acting, including one from the 19th century that I got from the library. And I was really amazed, because my name was the second name on the list -- the previous name was in 1889.

(Laughter)

And this book had kind of been waiting for 100 years to be rediscovered for robotics. And this book shows actors how to move every muscle in the body to match every kind of emotion that they want to express.

But the real revelation was when I learned about method acting. It became very popular in the 20th century. And method acting said you don't have to plan every muscle in your body; instead, you have to use your body to find the right movement. You have to use your sense memory to reconstruct the emotions and kind of think with your body to find the right expression -- improvise, play off your scene partner. And this came at the same time as I was reading about this trend in cognitive psychology, called embodied cognition, which also talks about the same ideas. We use our bodies to think; we don't just think with our brains and use our bodies to move, but our bodies feed back into our brain to generate the way that we behave.

And it was like a lightning bolt. I went back to my office, I wrote this paper, which I never really published, called "Acting Lessons for Artificial Intelligence." And I even took another month to do what was then the first theater play with a human and a robot acting together. That's what you saw before with the actors.

And I thought: How can we make an artificial intelligence model -- a computer, computational model -- that will model some of these ideas of improvisation, of taking risks, of taking chances, even of making mistakes? Maybe it can make for better robotic teammates.
So I worked for quite a long time on these models and I implemented them on a number of robots. Here you can see a very early example with the robots trying to use this embodied artificial intelligence to match my movements as closely as possible. It's sort of like a game. Let's look at it.

You can see that when I psych it out, it gets fooled. And it's a little bit like what you might see actors do when they try to mirror each other to find the right synchrony between them.

And then, I did another experiment, and I got people off the street to use the robotic desk lamp and try out this idea of embodied artificial intelligence. So, I actually used two kinds of brains for the same robot. The robot is the same lamp that you saw, and I put two brains in it. For one half of the people, I put in a brain that's kind of the traditional, calculated robotic brain. It waits for its turn, it analyzes everything, it plans. Let's call it the calculated brain. The other half got more of a stage-actor, risk-taker brain. Let's call it the adventurous brain. It sometimes acts without knowing everything it has to know. It sometimes makes mistakes and corrects them. And I had them do this very tedious task that took almost 20 minutes, and they had to work together, somehow simulating, like, a factory job of repetitively doing the same thing.

What I found is that people actually loved the adventurous robot. They thought it was more intelligent, more committed, a better member of the team, and that it contributed more to the success of the team. They even called it "he" and "she," whereas people with the calculated brain called it "it," and nobody ever called it "he" or "she." When they talked about it after the task, with the adventurous brain, they said, "By the end, we were good friends and high-fived mentally."

Whatever that means.

(Laughter)

Sounds painful.

Whereas the people with the calculated brain said it was just like a lazy apprentice. It only did what it was supposed to do and nothing more, which is almost what people expect robots to do, so I was surprised that people had higher expectations of robots than what anybody in robotics thought robots should be doing.
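To make that contrast concrete, here is a minimal, hypothetical sketch in Python of the two lamp "brains" as control policies. The belief values, the 0.95 and 0.4 thresholds, and the target names are my own illustrative assumptions for this example, not code or parameters from the actual study.

from __future__ import annotations

import random

def calculated_brain(belief: dict[str, float]) -> str | None:
    """Wait its turn: act only once one guess about what the human needs is near-certain."""
    target, confidence = max(belief.items(), key=lambda kv: kv[1])
    if confidence >= 0.95:
        return f"point the lamp at the {target}"
    return None  # not sure yet, so do nothing and keep observing

def adventurous_brain(belief: dict[str, float]) -> str:
    """Commit early to the current best guess, occasionally take a chance, fix it on the next tick."""
    target, confidence = max(belief.items(), key=lambda kv: kv[1])
    if confidence < 0.4 and random.random() < 0.3:
        target = random.choice(list(belief))  # take a risk on a different guess
    return f"point the lamp at the {target}"  # a wrong guess simply gets corrected later

# Illustrative belief state: the human seems to be reaching toward the parts bin,
# but the evidence is nowhere near the calculated brain's certainty threshold.
belief = {"parts bin": 0.6, "battery drawer": 0.3, "screwdriver": 0.1}
print(calculated_brain(belief))   # None: it waits
print(adventurous_brain(belief))  # acts now, e.g. "point the lamp at the parts bin"

The only real difference between the two is when each is willing to commit: the calculated one waits for near-certainty, while the adventurous one acts on its best guess and repairs mistakes as it goes.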
And in a way, I thought, maybe it's time -- just like method acting changed the way people thought about acting, moving from the very calculated, planned way of behaving of the 19th century to a more intuitive, risk-taking, embodied way of behaving -- maybe it's time for robots to have the same kind of revolution.

A few years later, I was at my next research job at Georgia Tech in Atlanta, and I was working in a group dealing with robotic musicians. And I thought, music: that's the perfect place to look at teamwork, coordination, timing, improvisation -- and we had just gotten this robot playing marimba. And the marimba, for everybody like me, is this huge, wooden xylophone. And when I was looking at this, I looked at other works in human-robot improvisation -- yes, there are other works in human-robot improvisation -- and they were also a little bit like a chess game. The human would play, the robot would analyze what was played, and would improvise its own part. So, this is what musicians call a call-and-response interaction, and it also fits robots and artificial intelligence very well. But I thought, if I used the same ideas I used in the theater play and in the teamwork studies, maybe I could make the robots jam together like a band. Everybody's riffing off each other, nobody is stopping for a moment.

And so I tried to do the same things, this time with music, where the robot doesn't really know what it's about to play, it just sort of moves its body and uses opportunities to play, and does what my jazz teacher taught me when I was 17. She said, when you improvise, sometimes you don't know what you're doing, and you still do it. So I tried to make a robot that doesn't actually know what it's doing, but is still doing it.

So let's look at a few seconds from this performance, where the robot listens to the human musician and improvises. And then, look how the human musician also responds to what the robot is doing, picking up on its behavior, and at some point can even be surprised by what the robot came up with.

(Music)

(Music ends)

(Applause)

Being a musician is not just about making notes, otherwise nobody would ever go see a live show.
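As a rough illustration of that difference, here is a small hypothetical sketch in Python. The note numbers, intervals and phrase handling are made up for the example; this is not the marimba robot's real improvisation system.

from __future__ import annotations

import random

def call_and_response(human_phrase: list[int]) -> list[int]:
    """Chess-like: wait for the whole phrase, analyze it, then answer with a variation."""
    return [note + random.choice([-2, 0, 3]) for note in reversed(human_phrase)]

def jamming_step(heard_so_far: list[int]) -> int:
    """Jamming: pick the next note right now, from whatever has been heard so far."""
    anchor = heard_so_far[-1] if heard_so_far else 60  # nothing heard yet? just start (middle C)
    return anchor + random.choice([-5, -2, 0, 2, 4, 7])  # riff off the most recent note

human_phrase = [60, 62, 64, 67]  # an illustrative phrase, as MIDI note numbers

# Call-and-response: the robot stays silent for the whole phrase, then replies.
print("response:", call_and_response(human_phrase))

# Jamming: the robot plays on every beat, even before the human's phrase is over.
heard: list[int] = []
for beat, note in enumerate(human_phrase):
    heard.append(note)
    print(f"beat {beat}: robot plays {jamming_step(heard)}")

The point of the contrast is only the timing of commitment: the first function cannot answer until the phrase is complete, while the second emits something on every beat and folds in new notes as they arrive.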
Musicians also communicate with their bodies, with other band members, with the audience; they use their bodies to express the music. And I thought, we already have a robot musician on stage, why not make it a full-fledged musician? And I started designing a socially expressive head for the robot. The head doesn't actually touch the marimba, it just expresses what the music is like.

These are some napkin sketches from a bar in Atlanta that was dangerously located exactly halfway between my lab and my home. So I spent, I would say, on average, three to four hours a day there. I think.

(Laughter)

And I went back to my animation tools and tried to figure out not just what a robotic musician would look like, but especially what a robotic musician would move like, to sort of show that it doesn't like what the other person is playing -- and maybe show whatever beat it's feeling at the moment. So we ended up actually getting the money to build this robot, which was nice.

I'm going to show you now the same kind of performance, this time with a socially expressive head. And notice one thing -- how the robot is really showing us the beat it's picking up from the human, while also giving the human a sense that the robot knows what it's doing. And also how it changes the way it moves as soon as it starts its own solo.

(Music)

Now it's looking at me, showing that it's listening.

(Music)

Now look at the final chord of the piece again. And this time the robot communicates with its body when it's busy doing its own thing, and when it's ready to coordinate the final chord with me.

(Music)

(Music ending)

(Final chord)

(Applause)

Thanks. I hope you see how much this part of the body that doesn't touch the instrument actually helps with the musical performance.

And at some point -- we are in Atlanta, so obviously some rapper will come into our lab at some point -- we had this rapper come in and do a little jam with the robot. Here you can see the robot basically responding to the beat. Notice two things: one, how irresistible it is to join the robot while it's moving its head. You kind of want to move your own head when it does it.
And second, even though the rapper is really focused on his iPhone, as soon as the robot turns to him, he turns back. So even though it's just in the periphery of his vision, in the corner of his eye, it's very powerful. And the reason is that we can't ignore physical things moving in our environment. We are wired for that. So if you have a problem -- maybe your partner is looking at their iPhone or smartphone too much -- you might want to have a robot there to get their attention.

(Laughter)

(Music)

(Music ends)

(Applause)

Just to introduce the last robot that we've worked on, it came out of something surprising that we found: at some point, people didn't care about the robot being intelligent, able to improvise and listen, and do all these embodied-intelligence things that I spent years developing. They really liked that the robot was enjoying the music.

(Laughter)

And they didn't say the robot was moving to the music, they said it was "enjoying" the music. And we thought, why don't we take this idea, and I designed a new piece of furniture. This time it wasn't a desk lamp, it was a speaker dock, one of those things you plug your smartphone into. And I thought, what would happen if your speaker dock didn't just play the music for you, but would actually enjoy it, too? And so again, here are some animation tests from an early stage.

(Laughter)

And this is what the final product looked like.

(Music)

(Music ends)

So, a lot of bobbing heads.

(Applause)

A lot of bobbing heads in the audience, so we can still see robots influence people. And it's not just fun and games. I think one of the reasons I care so much about robots that use their body to communicate and use their body to move is -- and I'm going to let you in on a little secret we roboticists are hiding -- that every one of you is going to be living with a robot at some point in your life. Somewhere in your future, there will be a robot in your life. If not in yours, then in your children's lives. And I want these robots to be more fluent, more engaging, more graceful than they currently seem to be. And for that I think maybe robots need to be less like chess players and more like stage actors and more like musicians.
Maybe they should be able to take chances and improvise. Maybe they should be able to anticipate what you're about to do. Maybe they even need to be able to make mistakes and correct them, because in the end, we are human. And maybe as humans, robots that are a little less than perfect are just perfect for us.

Thank you.

(Applause)