0:00:00.539,0:00:03.358 I know this is going to sound strange, 0:00:03.382,0:00:08.365 but I think robots can inspire us[br]to be better humans. 0:00:09.245,0:00:12.339 See, I grew up in Bethlehem, Pennsylvania, 0:00:12.363,0:00:14.301 the home of Bethlehem Steel. 0:00:14.991,0:00:17.103 My father was an engineer, 0:00:17.127,0:00:21.446 and when I was growing up,[br]he would teach me how things worked. 0:00:21.470,0:00:23.564 We would build projects together, 0:00:23.588,0:00:26.356 like model rockets and slot cars. 0:00:26.732,0:00:29.640 Here's the go-kart that we built together. 0:00:30.208,0:00:31.952 That's me behind the wheel, 0:00:31.976,0:00:34.200 with my sister and my best[br]friend at the time. 0:00:35.875,0:00:37.801 And one day, 0:00:37.825,0:00:40.908 he came home, when I was[br]about 10 years old, 0:00:40.932,0:00:43.558 and at the dinner table, he announced 0:00:43.582,0:00:47.294 that for our next project, [br]we were going to build ... 0:00:47.318,0:00:48.468 a robot. 0:00:49.604,0:00:50.759 A robot. 0:00:51.120,0:00:53.183 Now, I was thrilled about this, 0:00:53.207,0:00:57.009 because at school,[br]there was a bully named Kevin, 0:00:57.033,0:00:58.922 and he was picking on me, 0:00:58.946,0:01:01.336 because I was the only[br]Jewish kid in class. 0:01:01.910,0:01:04.283 So I couldn't wait to get[br]started to work on this, 0:01:04.307,0:01:06.917 so I could introduce Kevin to my robot. 0:01:06.941,0:01:07.971 (Laughter) 0:01:07.995,0:01:12.238 (Robot noises) 0:01:17.552,0:01:18.944 (Laughter) 0:01:18.968,0:01:22.817 But that wasn't the kind of robot[br]my dad had in mind. 0:01:22.841,0:01:23.909 (Laughter) 0:01:23.933,0:01:27.565 See, he owned a chromium-plating company, 0:01:27.589,0:01:33.462 and they had to move heavy steel parts[br]between tanks of chemicals. 0:01:33.486,0:01:36.859 And so he needed[br]an industrial robot like this, 0:01:36.883,0:01:38.986 that could basically do the heavy lifting. 
0:01:39.684,0:01:43.275 But my dad didn't get[br]the kind of robot he wanted, either. 0:01:43.870,0:01:46.300 He and I worked on it for several years, 0:01:46.324,0:01:51.069 but it was the 1970s, and the technology[br]that was available to amateurs 0:01:51.093,0:01:52.419 just wasn't there yet. 0:01:53.992,0:01:56.777 So Dad continued to do[br]this kind of work by hand. 0:01:57.595,0:01:59.023 And a few years later, 0:01:59.730,0:02:01.306 he was diagnosed with cancer. 0:02:03.822,0:02:05.392 You see, 0:02:05.416,0:02:08.012 what the robot we were trying[br]to build was telling him 0:02:08.036,0:02:10.192 was not about doing the heavy lifting. 0:02:10.216,0:02:11.367 It was a warning 0:02:11.391,0:02:13.895 about his exposure to the toxic chemicals. 0:02:15.184,0:02:17.477 He didn't recognize that at the time, 0:02:17.501,0:02:19.187 and he contracted leukemia. 0:02:19.827,0:02:21.719 And he died at the age of 45. 0:02:23.354,0:02:25.130 I was devastated by this. 0:02:25.703,0:02:28.465 And I never forgot the robot[br]that he and I tried to build. 0:02:30.195,0:02:33.748 When I was at college, I decided[br]to study engineering, like him. 0:02:35.081,0:02:39.629 And I went to Carnegie Mellon,[br]and I earned my PhD in robotics. 0:02:39.653,0:02:41.562 I've been studying robots ever since. 0:02:42.721,0:02:47.316 So what I'd like to tell you about[br]are four robot projects, 0:02:47.340,0:02:50.389 and how they've inspired me[br]to be a better human. 0:02:54.206,0:02:59.820 By 1993, I was a young professor at USC, 0:02:59.844,0:03:02.662 and I was just building up[br]my own robotics lab, 0:03:02.686,0:03:05.907 and this was the year[br]the World Wide Web came out. 0:03:06.487,0:03:09.472 And I remember my students[br]were the ones who told me about it, 0:03:09.496,0:03:11.813 and we would -- we were just amazed. 
0:03:11.837,0:03:15.220 We started playing with this,[br]and that afternoon, 0:03:15.244,0:03:19.522 we realized that we could use[br]this new, universal interface 0:03:19.546,0:03:23.974 to allow anyone in the world[br]to operate the robot in our lab. 0:03:25.339,0:03:29.381 So, rather than have it fight[br]or do industrial work, 0:03:30.413,0:03:32.943 we decided to build a planter, 0:03:32.967,0:03:34.918 put the robot into the center of it, 0:03:34.942,0:03:36.620 and we called it the Telegarden. 0:03:37.736,0:03:42.253 And we put a camera[br]in the gripper of the robot, 0:03:42.277,0:03:45.208 and we wrote some[br]special scripts and software, 0:03:45.232,0:03:47.266 so that anyone in the world could come in, 0:03:47.290,0:03:49.074 and by clicking on the screen, 0:03:49.098,0:03:52.719 they could move the robot around[br]and visit the garden. 0:03:53.416,0:03:57.267 But we also set up some other software 0:03:57.291,0:04:01.207 that let you participate[br]and help us water the garden, remotely. 0:04:01.643,0:04:03.932 And if you watered it a few times, 0:04:03.956,0:04:06.231 we'd give you your own seed to plant. 0:04:07.167,0:04:10.643 Now, this was an engineering project, 0:04:10.667,0:04:14.779 and we published some papers[br]on the system design of it, 0:04:14.803,0:04:17.448 but we also thought of it[br]as an art installation. 0:04:18.638,0:04:20.787 It was invited, after the first year, 0:04:20.811,0:04:23.831 by the Ars Electronica Museum in Austria, 0:04:23.855,0:04:26.677 to be installed in their lobby. 0:04:27.558,0:04:31.545 And I'm happy to say, it remained[br]online there, 24 hours a day, 0:04:31.569,0:04:33.400 for almost nine years. 0:04:34.369,0:04:38.144 That robot was operated by more people 0:04:38.168,0:04:40.065 than any other robot in history.
0:04:41.303,0:04:42.930 Now, one day, 0:04:42.954,0:04:46.302 I got a call out of the blue[br]from a student, 0:04:47.211,0:04:51.369 who asked a very simple[br]but profound question. 0:04:52.361,0:04:55.187 He said, "Is the robot real?" 0:04:56.599,0:04:58.885 Now, everyone else had assumed it was, 0:04:58.909,0:05:01.425 and we knew it was,[br]because we were working with it. 0:05:01.449,0:05:02.847 But I knew what he meant, 0:05:02.871,0:05:04.313 because it would be possible 0:05:04.337,0:05:07.122 to take a bunch of pictures[br]of flowers in a garden 0:05:07.146,0:05:10.785 and then, basically, index them[br]in a computer system, 0:05:10.809,0:05:13.436 such that it would appear[br]that there was a real robot, 0:05:13.460,0:05:14.610 when there wasn't. 0:05:15.088,0:05:16.612 And the more I thought about it, 0:05:16.636,0:05:20.215 I couldn't think of a good answer[br]for how he could tell the difference. 0:05:20.628,0:05:23.425 This was right about the time[br]that I was offered a position 0:05:23.449,0:05:24.750 here at Berkeley. 0:05:24.774,0:05:26.419 And when I got here, 0:05:26.443,0:05:28.805 I looked up Hubert Dreyfus, 0:05:28.829,0:05:31.756 who's a world-renowned[br]professor of philosophy. 0:05:32.582,0:05:35.312 And I talked with him[br]about this and he said, 0:05:35.336,0:05:39.438 "This is one of the oldest[br]and most central problems in philosophy. 0:05:39.462,0:05:43.479 It goes back to the Skeptics[br]and up through Descartes. 0:05:43.503,0:05:46.685 It's the issue of epistemology, 0:05:46.709,0:05:49.577 the study of how we know[br]that something is true." 0:05:50.625,0:05:52.625 So he and I started working together, 0:05:52.649,0:05:55.520 and we coined a new term:[br]"telepistemology," 0:05:56.757,0:05:58.893 the study of knowledge at a distance.
0:05:59.303,0:06:03.794 We invited leading artists,[br]engineers and philosophers 0:06:03.818,0:06:05.137 to write essays about this, 0:06:05.161,0:06:08.865 and the results are collected[br]in this book from MIT Press. 0:06:09.959,0:06:12.055 So thanks to this student, 0:06:12.079,0:06:15.398 who questioned what everyone else[br]had assumed to be true, 0:06:15.422,0:06:19.399 this project taught me[br]an important lesson about life, 0:06:19.423,0:06:22.410 which is to always question assumptions. 0:06:23.807,0:06:25.983 Now, the second project[br]I'll tell you about 0:06:26.007,0:06:27.999 grew out of the Telegarden. 0:06:28.023,0:06:30.844 As it was operating, my students[br]and I were very interested 0:06:30.868,0:06:33.209 in how people were interacting[br]with each other, 0:06:33.233,0:06:35.290 and what they were doing with the garden. 0:06:35.314,0:06:36.465 So we started thinking: 0:06:36.489,0:06:38.457 what if the robot could leave the garden 0:06:38.481,0:06:41.295 and go out into some other[br]interesting environment? 0:06:41.319,0:06:44.060 Like, for example,[br]what if it could go to a dinner party 0:06:44.084,0:06:45.545 at the White House? 0:06:45.569,0:06:47.068 (Laughter) 0:06:48.401,0:06:51.970 So, because we were interested[br]more in the system design 0:06:51.994,0:06:54.937 and the user interface[br]than in the hardware, 0:06:54.961,0:06:56.112 we decided that, 0:06:56.136,0:07:00.623 rather than have a robot replace[br]the human to go to the party, 0:07:00.647,0:07:02.877 we'd have a human replace the robot. 0:07:03.679,0:07:05.139 We called it the Tele-Actor. 0:07:05.978,0:07:07.986 We got a human, 0:07:08.010,0:07:11.118 someone who's very[br]outgoing and gregarious, 0:07:11.142,0:07:15.397 and she was outfitted with a helmet[br]with various equipment, 0:07:15.421,0:07:16.705 cameras and microphones, 0:07:16.729,0:07:19.794 and then a backpack with wireless[br]Internet connection. 
0:07:20.882,0:07:23.026 And the idea was that she could go 0:07:23.050,0:07:25.500 into a remote and interesting environment, 0:07:25.524,0:07:27.541 and then over the Internet, 0:07:27.565,0:07:30.157 people could experience[br]what she was experiencing. 0:07:30.736,0:07:33.707 So they could see what she was seeing, 0:07:33.731,0:07:37.014 but then, more importantly,[br]they could participate, 0:07:37.038,0:07:41.788 by interacting with each other[br]and coming up with ideas 0:07:41.812,0:07:45.781 about what she should do next[br]and where she should go, 0:07:45.805,0:07:48.011 and then conveying those[br]to the Tele-Actor. 0:07:49.069,0:07:51.490 So we got a chance to take the Tele-Actor 0:07:51.514,0:07:54.322 to the Webby Awards in San Francisco. 0:07:55.129,0:07:58.369 And that year, Sam Donaldson was the host. 0:08:00.290,0:08:02.980 Just before the curtain went[br]up, I had about 30 seconds 0:08:03.004,0:08:06.746 to explain to Mr. Donaldson[br]what we were going to do. 0:08:08.119,0:08:11.636 And I said, "The Tele-Actor[br]is going to be joining you onstage. 0:08:11.660,0:08:14.014 This is a new experimental project, 0:08:14.038,0:08:16.633 and people are watching her[br]on their screens, 0:08:16.657,0:08:19.948 there's cameras involved[br]and there's microphones 0:08:19.972,0:08:21.758 and she's got an earbud in her ear, 0:08:21.782,0:08:24.099 and people over the network[br]are giving her advice 0:08:24.123,0:08:25.281 about what to do next." 0:08:25.305,0:08:26.700 And he said, "Wait a second. 0:08:27.806,0:08:28.979 That's what I do." 0:08:29.003,0:08:33.980 (Laughter) 0:08:34.004,0:08:35.876 So he loved the concept, 0:08:35.900,0:08:40.153 and when the Tele-Actor walked onstage,[br]she walked right up to him, 0:08:40.177,0:08:42.636 and she gave him a big kiss[br]right on the lips. 0:08:42.660,0:08:44.612 (Laughter) 0:08:44.636,0:08:47.558 We were totally surprised --[br]we had no idea that would happen. 
0:08:47.582,0:08:50.200 And he was great, he just gave her[br]a big hug in return, 0:08:50.224,0:08:51.881 and it worked out great. 0:08:51.905,0:08:53.945 But that night, as we were packing up, 0:08:53.969,0:08:58.524 I asked the Tele-Actor[br]how the Tele-Directors had decided 0:08:58.548,0:09:01.548 that they would give[br]a kiss to Sam Donaldson. 0:09:03.135,0:09:04.592 And she said they hadn't. 0:09:05.274,0:09:07.892 She said, when she was[br]just about to walk onstage, 0:09:07.916,0:09:10.890 the Tele-Directors were still trying[br]to agree on what to do, 0:09:10.914,0:09:14.014 and so she just walked onstage[br]and did what felt most natural. 0:09:14.038,0:09:18.552 (Laughter) 0:09:18.576,0:09:21.668 So, the success[br]of the Tele-Actor that night 0:09:21.692,0:09:25.945 was due to the fact[br]that she was a wonderful actor. 0:09:25.969,0:09:28.398 She knew when to trust her instincts. 0:09:28.422,0:09:32.351 And so that project taught me[br]another lesson about life, 0:09:32.375,0:09:36.022 which is that, when in doubt, improvise. 0:09:36.046,0:09:37.712 (Laughter) 0:09:38.664,0:09:43.583 Now, the third project[br]grew out of my experience 0:09:43.607,0:09:45.456 when my father was in the hospital. 0:09:47.284,0:09:50.637 He was undergoing[br]chemotherapy treatments, 0:09:50.661,0:09:54.931 and there's a related treatment[br]called brachytherapy, 0:09:54.955,0:09:58.743 where tiny, radioactive seeds[br]are placed into the body 0:09:58.767,0:10:00.501 to treat cancerous tumors. 0:10:01.572,0:10:03.843 And the way it's done,[br]as you can see here, 0:10:03.867,0:10:08.234 is that surgeons[br]insert needles into the body 0:10:08.258,0:10:09.573 to deliver the seeds. 0:10:09.974,0:10:13.672 And all these needles[br]are inserted in parallel. 0:10:14.445,0:10:19.838 So it's very common that some[br]of the needles penetrate sensitive organs.
0:10:20.635,0:10:27.608 And as a result, the needles[br]can damage these organs, 0:10:27.632,0:10:30.041 which leads to trauma and side effects. 0:10:30.581,0:10:32.225 So my students and I wondered: 0:10:32.249,0:10:36.771 what if we could modify the system, 0:10:36.795,0:10:39.420 so that the needles[br]could come in at different angles? 0:10:40.395,0:10:42.238 So we simulated this: 0:10:42.262,0:10:45.707 we developed some optimization[br]algorithms and ran the simulations. 0:10:45.731,0:10:46.882 And we were able to show 0:10:46.906,0:10:49.468 that we could avoid[br]the delicate organs, 0:10:49.492,0:10:54.545 and yet still achieve the coverage[br]of the tumors with the radiation. 0:10:55.313,0:10:58.787 So now, we're working with doctors at UCSF 0:10:58.811,0:11:01.078 and engineers at Johns Hopkins, 0:11:01.102,0:11:05.064 and we're building a robot -- 0:11:05.088,0:11:07.687 a specialized design[br]with different joints 0:11:07.711,0:11:12.034 that can allow the needles to come in[br]at an infinite variety of angles. 0:11:12.483,0:11:16.205 And as you can see here,[br]they can avoid delicate organs 0:11:16.229,0:11:18.893 and still reach the targets[br]they're aiming for. 0:11:20.019,0:11:25.004 So, by questioning this assumption[br]that all the needles have to be parallel, 0:11:25.028,0:11:27.654 this project also taught me[br]an important lesson: 0:11:28.114,0:11:32.869 When in doubt, when your path[br]is blocked, pivot. 0:11:33.797,0:11:37.637 And the last project[br]also has to do with medical robotics. 0:11:38.187,0:11:41.757 And this is something[br]that's grown out of a system 0:11:41.781,0:11:45.129 called the da Vinci surgical robot. 0:11:45.866,0:11:48.310 And this is a commercially[br]available device. 0:11:48.334,0:11:51.307 It's being used in over 2,000[br]hospitals around the world.
0:11:52.013,0:11:56.386 The idea is it allows the surgeon[br]to operate comfortably 0:11:56.410,0:11:58.187 in his own coordinate frame. 0:12:00.762,0:12:06.531 Many of the subtasks in surgery are very[br]routine and tedious, like suturing, 0:12:06.555,0:12:08.822 and currently, all of these are performed 0:12:08.846,0:12:12.713 under the specific and immediate[br]control of the surgeon. 0:12:13.374,0:12:15.642 So the surgeon becomes fatigued over time. 0:12:16.086,0:12:17.522 And we've been wondering, 0:12:17.546,0:12:21.917 what if we could program the robot[br]to perform some of these subtasks, 0:12:21.941,0:12:23.322 and thereby free the surgeon 0:12:23.346,0:12:26.581 to focus on the more complicated[br]parts of the surgery, 0:12:26.605,0:12:29.405 and also cut down on the time[br]that the surgery would take 0:12:29.429,0:12:32.268 if we could get the robot[br]to do them a little bit faster? 0:12:32.958,0:12:36.380 Now, it's hard to program a robot[br]to do delicate things like this. 0:12:36.943,0:12:41.149 But it turns out my colleague[br]Pieter Abbeel, who's here at Berkeley, 0:12:41.173,0:12:46.653 has developed a new set of techniques[br]for teaching robots from example. 0:12:47.170,0:12:49.913 So he's gotten robots to fly helicopters, 0:12:49.937,0:12:53.191 do incredibly interesting,[br]beautiful acrobatics, 0:12:53.215,0:12:55.182 by watching human experts fly them. 0:12:56.152,0:12:58.058 So we got one of these robots. 0:12:58.082,0:13:00.622 We started working with Pieter[br]and his students. 0:13:00.646,0:13:04.826 And we asked a surgeon[br]to perform a task -- 0:13:06.761,0:13:07.912 with the robot. 0:13:07.936,0:13:10.961 So what we're doing is asking[br]the surgeon to perform the task, 0:13:10.985,0:13:13.021 and we record the motions of the robot. 0:13:13.045,0:13:14.471 So here's an example. 0:13:14.495,0:13:17.616 I'll use tracing out[br]a figure eight as an example. 
0:13:18.170,0:13:21.804 So here's what it looks like[br]when the robot does it: 0:13:21.828,0:13:24.896 this is the robot's path,[br]in those three examples. 0:13:24.920,0:13:29.056 Now, those are much better[br]than what a novice like me could do, 0:13:29.080,0:13:31.774 but they're still jerky and imprecise. 0:13:31.798,0:13:34.739 So we record all these examples, the data, 0:13:34.763,0:13:37.624 and then go through a sequence of steps. 0:13:38.278,0:13:41.404 First, we use a technique[br]called dynamic time warping 0:13:41.428,0:13:42.918 from speech recognition. 0:13:42.942,0:13:46.279 And this allows us to temporally[br]align all of the examples. 0:13:46.918,0:13:52.120 And then we apply Kalman filtering,[br]a technique from control theory, 0:13:52.144,0:13:55.113 which allows us to statistically[br]analyze all the noise 0:13:55.137,0:13:59.259 and extract the desired[br]trajectory that underlies them. 0:14:01.283,0:14:03.833 Now we take those human demonstrations -- 0:14:03.857,0:14:05.532 they're all noisy and imperfect -- 0:14:05.556,0:14:08.154 and we extract from them[br]an inferred task trajectory 0:14:08.178,0:14:10.704 and control sequence for the robot. 0:14:11.181,0:14:13.497 We then execute that on the robot, 0:14:13.521,0:14:15.546 we observe what happens, 0:14:15.570,0:14:16.927 then we adjust the controls, 0:14:16.951,0:14:19.747 using a sequence of techniques[br]called iterative learning. 0:14:21.129,0:14:24.538 Then what we do is we increase[br]the velocity a little bit. 0:14:25.244,0:14:28.379 We observe the results,[br]adjust the controls again, 0:14:29.340,0:14:31.112 and observe what happens. 0:14:31.136,0:14:33.303 And we go through several rounds of this. 0:14:33.327,0:14:34.508 And here's the result. 0:14:34.968,0:14:36.706 That's the inferred task trajectory, 0:14:36.730,0:14:39.951 and here's the robot[br]moving at the speed of the human. 0:14:39.975,0:14:42.080 Here's four times the speed of the human.
0:14:42.477,0:14:43.961 Here's seven times. 0:14:45.004,0:14:49.829 And here's the robot operating[br]at 10 times the speed of the human. 0:14:50.762,0:14:53.663 So we're able to get a robot[br]to perform a delicate task 0:14:53.687,0:14:56.573 like a surgical subtask, 0:14:57.081,0:14:58.962 at 10 times the speed of a human. 0:15:00.185,0:15:02.445 So this project, 0:15:02.469,0:15:04.994 because it involved[br]practicing and learning, 0:15:05.018,0:15:06.780 doing something over and over again, 0:15:06.804,0:15:09.381 also has a lesson, which is: 0:15:09.405,0:15:11.809 if you want to do something well, 0:15:13.084,0:15:17.524 there's no substitute[br]for practice, practice, practice. 0:15:21.186,0:15:24.673 So these are four of the lessons[br]that I've learned from robots 0:15:24.697,0:15:25.995 over the years. 0:15:27.312,0:15:32.819 And the field of robotics[br]has gotten much better over time. 0:15:34.319,0:15:36.595 Nowadays, high school students[br]can build robots, 0:15:36.619,0:15:39.503 like the industrial robot[br]my dad and I tried to build. 0:15:40.675,0:15:43.445 But things are very different now. 0:15:43.954,0:15:47.255 And now, I have a daughter 0:15:47.857,0:15:49.014 named Odessa. 0:15:49.673,0:15:50.926 She's eight years old. 0:15:51.682,0:15:53.331 And she likes robots, too. 0:15:53.970,0:15:55.366 Maybe it runs in the family. 0:15:55.390,0:15:56.630 (Laughter) 0:15:56.654,0:15:58.749 I wish she could meet my dad. 0:16:00.303,0:16:03.043 And now I get to teach her[br]how things work, 0:16:03.067,0:16:05.297 and we get to build projects together. 0:16:05.321,0:16:08.527 And I wonder what kind of lessons[br]she'll learn from them. 0:16:10.147,0:16:14.154 Robots are the most human of our machines. 0:16:14.983,0:16:17.937 They can't solve all[br]of the world's problems, 0:16:17.961,0:16:21.484 but I think they have something[br]important to teach us.
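[Editor's note: the learning-from-demonstration pipeline described in the talk, temporally aligning the noisy demonstrations with dynamic time warping and then statistically extracting one underlying trajectory, can be sketched roughly as follows. This is an illustrative simplification, not the actual system: the real work applies Kalman filtering and iterative learning control, while this sketch aligns hypothetical 1-D signals with DTW and simply averages the aligned samples. All function names and data here are invented for illustration.]

```python
import numpy as np

def dtw_path(a, b):
    """Dynamic time warping: return index pairs aligning sequence a to b."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    # Backtrack from (n, m) to recover the optimal alignment.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def extract_trajectory(demos):
    """Align each noisy demonstration to the first one with DTW, then
    average the aligned samples -- a simple stand-in for the Kalman
    smoothing used in the real system."""
    ref = np.asarray(demos[0], dtype=float)
    sums = ref.copy()
    counts = np.ones(len(ref))
    for demo in demos[1:]:
        for i, j in dtw_path(ref, demo):
            sums[i] += demo[j]
            counts[i] += 1
    return sums / counts
```

Given several jittery recordings of the same motion (like the figure-eight traces in the talk), `extract_trajectory` returns a single smoother estimate of the motion they share; a production version would replace the averaging step with a proper Kalman smoother over robot state.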
0:16:22.276,0:16:24.277 I invite all of you 0:16:24.301,0:16:27.879 to think about the innovations[br]that you're interested in, 0:16:28.712,0:16:30.955 the machines that you wish for. 0:16:31.660,0:16:34.267 And think about[br]what they might be telling you. 0:16:35.211,0:16:38.956 Because I have a hunch that many[br]of our technological innovations, 0:16:38.980,0:16:40.651 the devices we dream about, 0:16:42.166,0:16:45.015 can inspire us to be better humans. 0:16:45.984,0:16:47.135 Thank you. 0:16:47.159,0:16:49.071 (Applause)