Can we create new senses for humans?
We are built out of very small stuff, and we are embedded in a very large cosmos, and the fact is that we are not very good at understanding reality at either of those scales, and that's because our brains haven't evolved to understand the world at that scale. Instead, we're trapped on this very thin slice of perception right in the middle.

But it gets strange, because even at that slice of reality that we call home, we're not seeing most of the action that's going on. So take the colors of our world. These are light waves, electromagnetic radiation that bounces off objects and hits specialized receptors in the back of our eyes. But we're not seeing all the waves out there. In fact, what we see is less than a ten-trillionth of what's out there.
So you have radio waves and microwaves and X-rays and gamma rays passing through your body right now, and you're completely unaware of it, because you don't come with the proper biological receptors for picking it up. There are thousands of cell phone conversations passing through you right now, and you're utterly blind to it.

Now, it's not that these things are inherently unseeable. Snakes include some infrared in their reality, and honeybees include ultraviolet in their view of the world, and of course we build machines in the dashboards of our cars to pick up on signals in the radio frequency range, and we build machines in hospitals to pick up on the X-ray range. But you can't sense any of those by yourself, at least not yet, because you don't come equipped with the proper sensors.
Now, what this means is that our experience of reality is constrained by our biology, and that goes against the common-sense notion that our eyes and our ears and our fingertips are just picking up the objective reality that's out there. Instead, our brains are sampling just a little bit of the world.

Now, across the animal kingdom, different animals pick up on different parts of reality. So in the blind and deaf world of the tick, the important signals are temperature and butyric acid; in the world of the black ghost knifefish, its sensory world is lavishly colored by electrical fields; and for the echolocating bat, its reality is constructed out of air compression waves. That's the slice of their ecosystem that they can pick up on, and we have a word for this in science. It's called the umwelt, which is the German word for the surrounding world.

Now, presumably, every animal assumes that its umwelt is the entire objective reality out there, because why would you ever stop to imagine that there's something beyond what you can sense? Instead, what we all do is accept reality as it's presented to us.
Let's do a consciousness-raiser on this. Imagine that you are a bloodhound dog. Your whole world is about smelling. You've got a long snout that has 200 million scent receptors in it, and you have wet nostrils that attract and trap scent molecules, and your nostrils even have slits so you can take big nosefuls of air. Everything is about smell for you. So one day, you stop in your tracks with a revelation. You look at your human owner and you think, "What is it like to have the pitiful, impoverished nose of a human? (Laughter) What is it like when you take a feeble little noseful of air? How can you not know that there's a cat 100 yards away, or that your neighbor was on this very spot six hours ago?" (Laughter)

So because we're humans, we've never experienced that world of smell, so we don't miss it, because we are firmly settled into our umwelt. But the question is, do we have to be stuck there?
So as a neuroscientist, I'm interested in the way that technology might expand our umwelt, and how that's going to change the experience of being human.

So we already know that we can marry our technology to our biology, because there are hundreds of thousands of people walking around with artificial hearing and artificial vision. So the way this works is, you take a microphone and you digitize the signal, and you put an electrode strip directly into the inner ear. Or, with the retinal implant, you take a camera and you digitize the signal, and then you plug an electrode grid directly into the optic nerve. And as recently as 15 years ago, there were a lot of scientists who thought these technologies wouldn't work. Why? It's because these technologies speak the language of Silicon Valley, and it's not exactly the same dialect as our natural biological sense organs. But the fact is that it works; the brain figures out how to use the signals just fine.
Now, how do we understand that? Well, here's the big secret: Your brain is not hearing or seeing any of this. Your brain is locked in a vault of silence and darkness inside your skull. All it ever sees are electrochemical signals that come in along different data cables, and this is all it has to work with, and nothing more.

Now, amazingly, the brain is really good at taking in these signals and extracting patterns and assigning meaning, so that it takes this inner cosmos and puts together a story of this, your subjective world.

But here's the key point: Your brain doesn't know, and it doesn't care, where it gets the data from. Whatever information comes in, it just figures out what to do with it. And this is a very efficient kind of machine. It's essentially a general-purpose computing device, and it just takes in everything and figures out what it's going to do with it, and that, I think, frees up Mother Nature to tinker around with different sorts of input channels.
So I call this the P.H. model of evolution, and I don't want to get too technical here, but P.H. stands for Potato Head, and I use this name to emphasize that all these sensors that we know and love, like our eyes and our ears and our fingertips, these are merely peripheral plug-and-play devices: You stick them in, and you're good to go. The brain figures out what to do with the data that comes in.

And when you look across the animal kingdom, you find lots of peripheral devices. So snakes have heat pits with which to detect infrared, and the ghost knifefish has electroreceptors, and the star-nosed mole has this appendage with 22 fingers on it with which it feels around and constructs a 3D model of the world, and many birds have magnetite so they can orient to the magnetic field of the planet. So what this means is that nature doesn't have to continually redesign the brain. Instead, with the principles of brain operation established, all nature has to worry about is designing new peripherals.
Okay. So what this means is this: The lesson that surfaces is that there's nothing really special or fundamental about the biology that we come to the table with. It's just what we have inherited from a complex road of evolution. But it's not what we have to stick with, and our best proof of principle of this comes from what's called sensory substitution. And that refers to feeding information into the brain via unusual sensory channels, and the brain just figures out what to do with it.
Now, that might sound speculative, but the first paper demonstrating this was published in the journal Nature in 1969. So a scientist named Paul Bach-y-Rita put blind people in a modified dental chair, set up a video feed, and put something in front of the camera, and you would then feel that poked into your back by a grid of solenoids. So if you wiggle a coffee cup in front of the camera, you're feeling that in your back, and amazingly, blind people got pretty good at determining what was in front of the camera just by feeling it in the small of their back.
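In spirit, the mapping Bach-y-Rita used is easy to sketch: downsample each video frame to the resolution of the tactile grid, and drive each actuator in proportion to the brightness of its patch of the image. Here is a minimal sketch of that idea; the grid size and frame handling are illustrative assumptions, not details of the original apparatus.

```python
import numpy as np

GRID_ROWS, GRID_COLS = 20, 20  # assumed actuator grid size, for illustration

def frame_to_actuators(frame: np.ndarray) -> np.ndarray:
    """Downsample a grayscale frame to one drive level per tactile actuator."""
    h, w = frame.shape
    # Crop so the frame divides evenly into actuator cells.
    frame = frame[: h - h % GRID_ROWS, : w - w % GRID_COLS]
    cells = frame.reshape(GRID_ROWS, frame.shape[0] // GRID_ROWS,
                          GRID_COLS, frame.shape[1] // GRID_COLS)
    # Each actuator pokes in proportion to the mean brightness of its cell.
    return cells.mean(axis=(1, 3)) / 255.0
```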
Now, there have been many modern incarnations of this. The sonic glasses take a video feed of what's right in front of you and turn that into a sonic landscape, so as things move around and get closer and farther, it sounds like "Bzz, bzz, bzz." It sounds like a cacophony, but after several weeks, blind people start getting pretty good at understanding what's in front of them just based on what they're hearing. And it doesn't have to be through the ears: another system uses an electrotactile grid on the forehead, so whatever's in the video feed, you're feeling it on your forehead. Why the forehead? Because you're not using it for much else.
The most modern incarnation is called the BrainPort, and this is a little electrogrid that sits on your tongue, and the video feed gets turned into these little electrotactile signals, and blind people get so good at using this that they can throw a ball into a basket, or they can navigate complex obstacle courses. They can come to see through their tongue.

Now, that sounds completely insane, right? But remember, all vision ever is is electrochemical signals coursing around in your brain. Your brain doesn't know where the signals come from. It just figures out what to do with them.
So my interest in my lab is sensory substitution for the deaf, and this is a project I've undertaken with a graduate student in my lab, Scott Novich, who is spearheading this for his thesis. And here is what we wanted to do: we wanted to make it so that sound from the world gets converted in some way so that a deaf person can understand what is being said. And, given the power and ubiquity of portable computing, we wanted to make sure that this would run on cell phones and tablets, and also we wanted to make this a wearable, something that you could wear under your clothing.
So here's the concept. As I'm speaking, my sound is getting captured by the tablet, and then it's getting mapped onto a vest that's covered in vibratory motors, just like the motors in your cell phone. So as I'm speaking, the sound is getting translated to a pattern of vibration on the vest.
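One plausible way to realize that sound-to-vibration mapping, sketched below, is to split each incoming audio frame into frequency bands and drive one motor per band. The motor count, frame length, and normalization here are illustrative guesses, not the vest's actual parameters.

```python
import numpy as np

N_MOTORS = 32          # assumed number of vest motors, for illustration
FRAME_SAMPLES = 1024   # audio samples per update (e.g., 64 ms at 16 kHz)

def audio_frame_to_motors(frame: np.ndarray) -> np.ndarray:
    """Map one frame of audio samples to a vibration intensity per motor."""
    # Window the frame and take the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Group frequency bins into one contiguous band per motor.
    bands = np.array_split(spectrum, N_MOTORS)
    energy = np.array([band.mean() for band in bands])
    # Compress the dynamic range and normalize to 0..1 drive strength.
    energy = np.log1p(energy)
    return energy / (energy.max() + 1e-9)
```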
Now, this is not just conceptual: this tablet is transmitting over Bluetooth, and I'm wearing the vest right now. So as I'm speaking -- (Applause) -- the sound is getting translated into dynamic patterns of vibration. I'm feeling the sonic world around me.

So, we've been testing this with deaf people now, and it turns out that after just a little bit of time, people can start feeling, they can start understanding, the language of the vest.

So this is Jonathan. He's 37 years old. He has a master's degree. He was born profoundly deaf, which means that there's a part of his umwelt that's unavailable to him. So we had Jonathan train with the vest for four days, two hours a day, and here he is on the fifth day.
Scott Novich: You.

David Eagleman: So Scott says a word, Jonathan feels it on the vest, and he writes it on the board.

SN: Where. Where.

DE: Jonathan is able to translate this complicated pattern of vibrations into an understanding of what's being said.

SN: Touch. Touch.

DE: Now, he's not doing this -- (Applause) -- Jonathan is not doing this consciously, because the patterns are too complicated, but his brain is starting to unlock the pattern that allows it to figure out what the data mean, and our expectation is that, after wearing this for about three months, he will have a direct perceptual experience of hearing, in the same way that when a blind person passes a finger over braille, the meaning comes directly off the page without any conscious intervention at all.

Now, this technology has the potential to be a game-changer, because the only other solution for deafness is a cochlear implant, and that requires invasive surgery. And this can be built for a fortieth of the cost of a cochlear implant, which opens up this technology globally, even for the poorest countries.
Now, we've been very encouraged by our results with sensory substitution, but what we've been thinking a lot about is sensory addition. How could we use a technology like this to add a completely new kind of sense, to expand the human umwelt? For example, could we feed real-time data from the Internet directly into somebody's brain, and can they develop a direct perceptual experience?

So here's an experiment we're doing in the lab. A subject is feeling a real-time streaming feed of data from the Net for five seconds. Then, two buttons appear, and he has to make a choice. He doesn't know what's going on. He makes a choice, and he gets feedback after one second. Now, here's the thing: The subject has no idea what all the patterns mean, but we're seeing if he gets better at figuring out which button to press. He doesn't know that what we're feeding is real-time data from the stock market, and he's making buy and sell decisions. (Laughter) And the feedback is telling him whether he did the right thing or not. And what we're seeing is, can we expand the human umwelt so that he comes to have, after several weeks, a direct perceptual experience of the economic movements of the planet. So we'll report on that later to see how well this goes. (Laughter)
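The trial structure described here is a standard forced-choice loop. The sketch below is one guess at its shape, with the data feed, the subject interface, and the market outcome all stubbed out as hypothetical stand-ins rather than taken from the lab's actual code.

```python
import random
import time

def stream_pattern():
    """Hypothetical stand-in for real-time market data driving the vest."""
    return [random.random() for _ in range(32)]

def run_session(subject, n_trials=100):
    """Forced-choice loop: feel the data, pick a button, get feedback."""
    n_correct = 0
    for _ in range(n_trials):
        subject.feel(stream_pattern())            # vest plays the pattern for 5 s
        choice = subject.choose(["buy", "sell"])  # two buttons appear
        outcome = random.choice(["buy", "sell"])  # stand-in for the market's move
        time.sleep(1)                             # feedback arrives one second later
        subject.feedback(choice == outcome)
        n_correct += (choice == outcome)
    # The question: does accuracy climb above chance across sessions?
    return n_correct / n_trials
```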
Here's another thing we're doing: During the talks this morning, we've been automatically scraping Twitter for the TED2015 hashtag, and we've been doing an automated sentiment analysis, which means asking: are people using positive words, negative words, or neutral ones?
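A pipeline like that can be quite small: pull recent tweets carrying the hashtag, score each one, and aggregate. The sketch below uses a toy word-list scorer as a stand-in; the talk doesn't say which sentiment method the team actually used, and the word lists here are invented for illustration.

```python
# Toy sentiment lexicons, invented for illustration.
POSITIVE = {"love", "amazing", "great", "inspiring", "brilliant"}
NEGATIVE = {"boring", "bad", "awful", "confusing"}

def score_tweet(text: str) -> int:
    """+1 if wording is net positive, -1 if net negative, 0 if neutral."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    return (pos > neg) - (neg > pos)

def aggregate_sentiment(tweets: list[str]) -> float:
    """Mean sentiment in -1..1, ready to map onto a vibration pattern."""
    return sum(score_tweet(t) for t in tweets) / max(len(tweets), 1)
```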
And while this has been going on, I have been feeling this, and so I am plugged in to the aggregate emotion of thousands of people in real time, and that's a new kind of human experience, because now I can know how everyone's doing and how much you're loving this. (Laughter) (Applause)

It's a bigger experience than a human can normally have.
We're also expanding the umwelt of pilots. So in this case, the vest is streaming nine different measures from this quadcopter: pitch and yaw and roll and orientation and heading, and that improves this pilot's ability to fly it. It's essentially like he's extending his skin up there, far away.
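Mapping telemetry onto the vest can be as direct as giving each measure its own motor group and scaling its value into a vibration intensity. The sketch below assumes a handful of named channels with fixed ranges; both the channel list and the ranges are illustrative, not the actual flight rig's.

```python
# Assumed channels and ranges; the talk names five of the nine measures,
# so this list is illustrative rather than the real configuration.
CHANNELS = {
    "pitch": (-90.0, 90.0),
    "yaw": (-180.0, 180.0),
    "roll": (-180.0, 180.0),
    "heading": (0.0, 360.0),
    "altitude": (0.0, 120.0),
}

def telemetry_to_vest(sample: dict) -> dict:
    """Scale each telemetry value into a 0..1 intensity for its motor group."""
    intensities = {}
    for name, (lo, hi) in CHANNELS.items():
        value = min(max(sample.get(name, lo), lo), hi)  # clamp to the range
        intensities[name] = (value - lo) / (hi - lo)
    return intensities
```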
And that's just the beginning. What we're envisioning is taking a modern cockpit full of gauges and, instead of trying to read the whole thing, you feel it. We live in a world of information now, and there is a difference between accessing big data and experiencing it.

So I think there's really no end to the possibilities on the horizon for human expansion. Just imagine an astronaut being able to feel the overall health of the International Space Station, or, for that matter, having you feel the invisible states of your own health, like your blood sugar and the state of your microbiome, or having 360-degree vision or seeing in infrared or ultraviolet.

So the key is this: As we move into the future, we're going to increasingly be able to choose our own peripheral devices. We no longer have to wait for Mother Nature's sensory gifts on her timescales, but instead, like any good parent, she's given us the tools that we need to go out and define our own trajectory.

So the question now is, how do you want to go out and experience your universe?

Thank you.

(Applause)
Chris Anderson: Can you feel it?

DE: Yeah. Actually, this was the first time I felt applause on the vest. It's nice. It's like a massage. (Laughter)

CA: Twitter's going crazy. Twitter's going mad. So that stock market experiment. This could be the first experiment that secures its funding forevermore, right, if successful?

DE: Well, that's right, I wouldn't have to write to NIH anymore.

CA: Well look, just to be skeptical for a minute, I mean, this is amazing, but isn't most of the evidence so far that sensory substitution works, not necessarily that sensory addition works? I mean, isn't it possible that the blind person can see through their tongue because the visual cortex is still there, ready to process, and that that is needed as part of it?

DE: That's a great question. We actually have no idea what the theoretical limits are of what kind of data the brain can take in. The general story, though, is that it's extraordinarily flexible. So when a person goes blind, what we used to call their visual cortex gets taken over by other things, by touch, by hearing, by vocabulary. So what that tells us is that the cortex is kind of a one-trick pony. It just runs certain kinds of computations on things. And when we look around at things like braille, for example, people are getting information through bumps on their fingers. So I don't think we have any reason to think there's a theoretical limit that we know the edge of.

CA: If this checks out, you're going to be deluged. There are so many possible applications for this. Are you ready for this? What are you most excited about, the direction it might go?

DE: I mean, I think there's a lot of applications here. In terms of beyond sensory substitution, the things I started mentioning about astronauts on the space station, they spend a lot of their time monitoring things, and they could instead just get what's going on, because what this is really good for is multidimensional data. The key is this: Our visual systems are good at detecting blobs and edges, but they're really bad at what our world has become, which is screens with lots and lots of data. We have to crawl that with our attentional systems. So this is a way of just feeling the state of something, just like the way you know the state of your body as you're standing around. So I think heavy machinery, safety, feeling the state of a factory, of your equipment, that's one place it'll go right away.

CA: David Eagleman, that was one mind-blowing talk. Thank you very much.

DE: Thank you, Chris. (Applause)
Title: Can we create new senses for humans?
Speaker: David Eagleman
Description: As humans, we can perceive less than a ten-trillionth of all light waves. “Our experience of reality,” says neuroscientist David Eagleman, “is constrained by our biology.” He wants to change that. His research into our brain processes has led him to create new interfaces — such as a sensory vest — to take in previously unseen information about the world around us.