A monkey that controls a robot with its thoughts. No, really.
The kind of neuroscience that I do and my colleagues do is almost like the weatherman. We are always chasing storms. We want to see and measure storms -- brainstorms, that is. And we all talk about brainstorms in our daily lives, but we rarely see or listen to one. So I always like to start these talks by actually introducing you to one of them.

Actually, the first time we recorded more than one neuron -- a hundred brain cells simultaneously -- we could measure the electrical sparks of a hundred cells in the same animal. This is the first image we got, the first 10 seconds of this recording. So we got a little snippet of a thought, and we could see it in front of us.
I always tell the students that we could also call neuroscientists some sort of astronomers, because we are dealing with a system that is only comparable, in terms of number of cells, to the number of galaxies that we have in the universe. And here we are, out of billions of neurons, just recording, 10 years ago, a hundred. We are doing a thousand now. And we hope to understand something fundamental about our human nature.
Because, if you don't know yet, everything that we use to define what human nature is comes from these storms, comes from these storms that roll over the hills and valleys of our brains and define our memories, our beliefs, our feelings, our plans for the future. Everything that we ever do, everything that every human has ever done, does or will do, requires the toil of populations of neurons producing these kinds of storms. And the sound of a brainstorm, if you've never heard one, is somewhat like this. You can put it louder if you can.

My son calls this "making popcorn while listening to a badly tuned A.M. station."
This is a brain. This is what happens when you route these electrical storms to a loudspeaker and you listen to a hundred brain cells firing: your brain will sound like this -- my brain, any brain. And what we want to do as neuroscientists in this time is to actually listen to these symphonies, these brain symphonies, and try to extract from them the messages they carry.
In particular, about 12 years ago we created a preparation that we named brain-machine interfaces. And you have a scheme here that describes how it works. The idea is: let's have some sensors that listen to these storms, this electrical firing, and see if, in the same time that it takes for this storm to leave the brain and reach the legs or the arms of an animal -- about half a second -- we can read these signals, extract the motor messages that are embedded in them, translate them into digital commands and send them to an artificial device that will reproduce the voluntary motor will of that brain in real time. And see if we can measure how well we can translate that message, compared to the way the body does it. And see if we can actually provide feedback: sensory signals that go back from this robotic, mechanical, computational actuator that is now under the control of the brain, back to the brain, and see how the brain deals with receiving messages from an artificial piece of machinery.

And that's exactly what we did 10 years ago.
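The closed loop described above (sense the storms, decode the motor message, translate it into commands, drive an actuator, feed the result back) can be sketched in a few lines. This is a toy illustration: the names, sizes and the random linear decoder are all invented here, not the lab's actual pipeline.

```python
# Toy sketch of a brain-machine interface decode loop.
# All names and numbers are illustrative assumptions.
import random

N_CELLS = 100          # neurons recorded simultaneously, as in the talk
BIN_MS = 100           # spike-count bin; the whole loop must beat ~500 ms

def read_spike_counts(n_cells):
    """Stand-in for the sensor stage: one spike count per neuron per bin."""
    return [random.randint(0, 10) for _ in range(n_cells)]

def decode_velocity(counts, weights):
    """Linear decoder: map population activity to a 2-D cursor velocity."""
    vx = sum(w * c for w, c in zip(weights[0], counts))
    vy = sum(w * c for w, c in zip(weights[1], counts))
    return vx, vy

def step_actuator(pos, vel, dt=BIN_MS / 1000.0):
    """Actuator stage: integrate velocity into a new cursor/arm position."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

# Toy decoder weights; a real decoder is fit to recorded data.
random.seed(0)
weights = [[random.uniform(-1, 1) for _ in range(N_CELLS)] for _ in range(2)]

pos = (0.0, 0.0)
for _ in range(10):                  # ten 100 ms bins = 1 s of control
    counts = read_spike_counts(N_CELLS)
    vel = decode_velocity(counts, weights)
    pos = step_actuator(pos, vel)    # feedback: the subject sees the new pos
print(pos)
```

In practice the decoder weights are fit while the subject still moves a joystick; once the decoded output tracks the real movements well enough, the joystick can be unplugged, which is what the Aurora experiment below does.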
We started with a superstar monkey called Aurora, who became one of the superstars of this field. And Aurora liked to play video games. As you can see here, she liked to use a joystick, like any one of us, any of our kids, to play this game. And as a good primate, she even tried to cheat before she got the right answer. So even before a target appears that she's supposed to cross with the cursor that she's controlling with this joystick, Aurora is trying to find the target, no matter where it is. And she's doing that because, every time she crosses that target with the little cursor, she gets a drop of Brazilian orange juice. And I can tell you, any monkey will do anything for a little drop of Brazilian orange juice. Actually, any primate will do that. Think about that.
Well, while Aurora was playing this game, as you saw, doing a thousand trials a day, getting 97 percent correct and 350 milliliters of orange juice, we were recording the brainstorms that were produced in her head and sending them to a robotic arm that was learning to reproduce the movements that Aurora was making. Because the idea was to actually turn on this brain-machine interface and have Aurora play the game just by thinking, without interference from her body. Her brainstorms would control an arm that would move the cursor and cross the target. And to our shock, that's exactly what Aurora did. She played the game without moving her body.

So every trajectory of the cursor that you see now is from the exact first moment she got it. That's the exact first moment a brain intention was liberated from the physical domains of the body of a primate and could act outside, in that outside world, just by controlling an artificial device.
And Aurora kept playing the game, kept finding the little target and getting the orange juice that she wanted, that she craved. Well, she did that because she, at that time, had acquired a new arm. The robotic arm that you see moving here, 30 days after the first video that I showed you, is under the control of Aurora's brain and is moving the cursor to get to the target. And Aurora now knows that she can play the game with this robotic arm, but she has not lost the ability to use her biological arms to do what she pleases. She can scratch her back, she can scratch one of us, she can play another game. For all intents and purposes, Aurora's brain has incorporated that artificial device as an extension of her body. The model of the self that Aurora had in her mind has been expanded to include one more arm.
Well, we did that 10 years ago. Just fast forward 10 years. Just last year we realized that you don't even need to have a robotic device. You can just build a computational body, an avatar, a monkey avatar. And you can actually use it either for our monkeys to interact with, or you can train them to assume, in a virtual world, the first-person perspective of that avatar and use their brain activity to control the movements of the avatar's arms or legs. And what we did, basically, was to train the animals to learn how to control these avatars and explore objects that appear in the virtual world. These objects are visually identical, but when the avatar crosses the surface of one of these objects, it sends an electrical message that is proportional to the microtactile texture of the object directly back to the monkey's brain, informing the brain what the avatar is touching. And in just four weeks, the brain learns to process this new sensation and acquires a new sensory pathway -- like a new sense. And you truly liberate the brain now, because you are allowing the brain to send motor commands to move this avatar, and the feedback that comes from the avatar is being processed directly by the brain, without the interference of the skin.
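The feedback just described (an electrical message proportional to the microtactile texture of the virtual object) can be caricatured as a simple encoder. The function name, the frequency range and the linear scaling below are all assumptions made up for illustration, not the stimulation scheme actually used.

```python
# Hypothetical sketch of the sensory side of the loop: map a texture
# value to an intracortical stimulation pulse train. Names, ranges and
# the linear scaling are invented for illustration.
def texture_to_pulse_train(texture, duration_ms=100, min_hz=20, max_hz=200):
    """Encode a texture in [0, 1] as a pulse train; returns pulse times (ms)."""
    hz = min_hz + texture * (max_hz - min_hz)       # rougher -> faster
    n_pulses = int(hz * duration_ms / 1000)
    return [round(i * 1000 / hz, 2) for i in range(n_pulses)]

coarse = texture_to_pulse_train(1.0)   # rough surface -> dense pulse train
smooth = texture_to_pulse_train(0.0)   # smooth surface -> sparse pulse train
assert len(coarse) > len(smooth)
```

The point the talk makes is that only the mapping has to be consistent: after a few weeks, the brain learns to read whatever code the machine sends, as if it were a new sense.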
So what you see here is the design of the task. You're going to see an animal basically touching these three targets, and he has to select one, because only one carries the reward, the orange juice that they want to get. And he has to select it by touch, using a virtual arm, an arm that doesn't exist. And that's exactly what they do. This is a complete liberation of the brain from the physical constraints of the body and of movement in a perceptual task. The animal is controlling the avatar to touch the targets, and he's sensing the texture by receiving an electrical message directly in the brain. And the brain is deciding what texture is associated with the reward. The legends that you see in the movie don't appear for the monkey. And by the way, they don't read English anyway, so the legends are here just for you to know that the correct target is shifting position. And yet, they can find the targets by tactile discrimination, and they can press and select them.
So when we look at the brains of these animals: on the top panel you see the alignment of 125 cells, showing what happens with the brain activity -- the electrical storms -- of this sample of neurons in the brain when the animal is using a joystick. And that's a picture that every neurophysiologist knows. The basic alignment shows that these cells are coding for all possible directions. The bottom picture is what happens when the body stops moving and the animal starts controlling either a robotic device or a computational avatar. As fast as we can reset our computers, the brain activity shifts to start representing this new tool, as if this tool were a part of that primate's body. The brain is assimilating that too, as fast as we can measure.

So that suggests to us that our sense of self does not end at the last layer of the epithelium of our bodies; it ends at the last layer of electrons of the tools that we're commanding with our brains. Our violins, our cars, our bicycles, our soccer balls, our clothing -- they all become assimilated by this voracious, amazing, dynamic system called the brain.
How far can we take it? Well, in an experiment that we ran a few years ago, we took this to the limit. We had an animal running on a treadmill at Duke University, on the East Coast of the United States, producing the brainstorms necessary to move. And we had a robotic device, a humanoid robot, at ATR Laboratories in Kyoto, Japan, that had been dreaming its entire life of being controlled by a brain -- a human brain, or a primate brain. What happens here is that the brain activity that generated the movements in the monkey was transmitted to Japan and made this robot walk, while footage of this walking was sent back to Duke, so that the monkey could see the legs of this robot walking in front of her. So she could be rewarded not for what her body was doing but for every correct step of the robot on the other side of the planet, controlled by her brain activity.

Funny thing: that round trip around the globe took 20 milliseconds less than it takes for that brainstorm to leave the monkey's head and reach her own muscles. The monkey was moving a robot that was six times bigger, across the planet. This is one of the experiments in which that robot was able to walk autonomously. This is CB1, fulfilling its dream in Japan under the control of the brain activity of a primate.
So where are we taking all this? What are we going to do with all this research, besides studying the properties of this dynamic universe that we have between our ears? Well, the idea is to take all this knowledge and technology and try to address one of the most severe neurological problems that we have in the world. Millions of people have lost the ability to translate these brainstorms into action, into movement. Although their brains continue to produce those storms and code for movements, those storms cannot cross a barrier that was created by a lesion on the spinal cord. So our idea is to create a bypass: to use these brain-machine interfaces to read these signals, larger-scale brainstorms that contain the desire to move again, bypass the lesion using computational microengineering and send the signals to a new body -- a whole body called an exoskeleton, a whole robotic suit that will become the new body of these patients.

And you can see an image produced by this consortium. This is a nonprofit consortium called the Walk Again Project, which is bringing together scientists from Europe, from here in the United States and from Brazil to work to actually get this new body built -- a body that, we believe, can be assimilated through the same plastic mechanisms that allow Aurora and other monkeys to use these tools through a brain-machine interface, and that allow us to incorporate the tools that we produce and use in our daily lives. This same mechanism, we hope, will allow these patients not only to imagine again the movements that they want to make and translate them into movements of this new body, but to assimilate this body as the new body that the brain controls.
So I was told about 10 years ago that this would never happen, that this was close to impossible. And I can only tell you that, as a scientist, I grew up in southern Brazil in the mid-'60s watching a few crazy guys telling us that they would go to the Moon. I was five years old, and I never understood why NASA didn't hire Captain Kirk and Spock to do the job; after all, they were very proficient. But just seeing that as a kid made me believe, as my grandmother used to tell me, that "impossible is just the possible that someone has not put in enough effort to make it come true."

So they told me that it's impossible to make someone walk. I think I'm going to follow my grandmother's advice.

Thank you.

(Applause)
- Title: A monkey that controls a robot with its thoughts. No, really.
- Speaker: Miguel Nicolelis
- Description: Can we use our brains to directly control machines -- without requiring a body as the middleman? Miguel Nicolelis talks through an astonishing experiment, in which a clever monkey in the US learns to control a monkey avatar, and then a robot arm in Japan, purely with its thoughts. The research has big implications for quadriplegic people -- and maybe for all of us. (Filmed at TEDMED 2012.)
- Video Language: English
- Team: closed TED
- Project: TEDTalks
- Duration: 14:55