-
We are built out of
very small stuff,
-
and we are embedded in
a very large cosmos,
-
and the fact is that we are not
very good at understanding reality
-
at either of those scales,
-
and that's because our brains
-
haven't evolved to understand
the world at that scale.
-
Instead, we're trapped on this
very thin slice of perception
-
right in the middle.
-
But it gets strange, because even at
that slice of reality that we call home,
-
we're not seeing most
of the action that's going on.
-
So take the colors of our world.
-
This is light waves, electromagnetic
radiation that bounces off objects
-
and it hits specialized receptors
in the back of our eyes.
-
But we're not seeing
all the waves out there.
-
In fact, what we see
-
is less than a 10 trillionth
of what's out there.
-
So you have radio waves and microwaves
-
and X-rays and gamma rays
passing through your body right now
-
and you're completely unaware of it,
-
because you don't come with
the proper biological receptors
-
for picking it up.
-
There are thousands
of cell phone conversations
-
passing through you right now,
-
and you're utterly blind to it.
-
Now, it's not that these things
are inherently unseeable.
-
Snakes include some infrared
in their reality,
-
and honeybees include ultraviolet
in their view of the world,
-
and of course we build machines
in the dashboards of our cars
-
to pick up on signals
in the radio frequency range,
-
and we built machines in hospitals
to pick up on the X-ray range.
-
But you can't sense
any of those by yourself,
-
at least not yet,
-
because you don't come equipped
with the proper sensors.
-
Now, what this means is that
our experience of reality
-
is constrained by our biology,
-
and that goes against
the common sense notion
-
that our eyes and our ears
and our fingertips
-
are just picking up
the objective reality that's out there.
-
Instead, our brains are sampling
just a little bit of the world.
-
Now, across the animal kingdom,
-
different animals pick up
on different parts of reality.
-
So in the blind
and deaf world of the tick,
-
the important signals
are temperature and butyric acid;
-
in the world of the black ghost knifefish,
-
its sensory world is lavishly colored
by electrical fields;
-
and for the echolocating bat,
-
its reality is constructed
out of air compression waves.
-
That's the slice of their ecosystem
that they can pick up on,
-
and we have a word for this in science.
-
It's called the umwelt,
-
which is the German word
for the surrounding world.
-
Now, presumably, every animal assumes
-
that its umwelt is the entire
objective reality out there,
-
because why would you ever stop to imagine
-
that there's something beyond
what we can sense?
-
Instead, what we all do
is we accept reality
-
as it's presented to us.
-
Let's do a consciousness-raiser on this:
-
imagine that you are a bloodhound dog.
-
Your whole world is about smelling.
-
You've got a long snout that has
200 million scent receptors in it,
-
and you have wet nostrils
that attract and trap scent molecules,
-
and your nostrils even have slits
so you can take big nose-fulls of air.
-
Everything is about smell for you.
-
So one day, you stop in your tracks
with a revelation:
-
you look at your human owner
and you think,
-
"What is it like to have the pitiful,
impoverished nose of a human?
-
What is it like when you take
a feeble little nose-full of air?
-
How can you not know that there's
a cat a hundred yards away,
-
or that your neighbor was on
this very same spot six hours ago?"
-
(Laughter)
-
So because we're humans,
-
we've never experienced
that world of smell,
-
so we don't miss it,
-
because we are firmly settled
into our umwelt.
-
But the question is,
do we have to be stuck there?
-
So as a neuroscientist, I'm interested
in the way that technology
-
might expand our umwelt,
-
and how that's going to change
the experience of being human.
-
So we already know that we can marry
our technology to our biology,
-
because there are hundreds of thousands
of people walking around
-
with artificial hearing
and artificial vision.
-
So the way this works is, you take
a microphone and you digitize the signal,
-
and you put an electrode strip
directly into the inner ear.
-
Or, with the retinal implant,
you take a camera
-
and you digitize the signal,
and then you plug an electrode grid
-
directly into the optic nerve.
-
And as recently as 15 years ago,
-
there were a lot of scientists who thought
these technologies wouldn't work.
-
Why? It's because these technologies
speak the language of Silicon Valley,
-
and it's not exactly the same dialect
as our natural biological sense organs.
-
But the fact is that it works:
-
the brain figures out
how to use the signals just fine.
-
Now, how do we understand that?
-
Well, here's the big secret:
-
your brain is not hearing
or seeing any of this.
-
Your brain is locked in a vault of silence
and darkness inside your skull.
-
All it ever sees are
electrochemical signals
-
that come in along different data cables,
-
and this is all it has to work with,
and nothing more.
-
Now, amazingly,
-
the brain is really good
at taking in these signals
-
and extracting patterns
and assigning meaning,
-
so that it takes this inner cosmos
and puts together a story of this,
-
your subjective world.
-
But here's the key point:
-
your brain doesn't know,
and it doesn't care,
-
where it gets the data from.
-
Whatever information comes in,
it just figures out what to do with it.
-
And this is a very efficient
kind of machine.
-
It's essentially a general purpose
computing device,
-
and it just takes in everything
-
and figures out
what it's going to do with it,
-
and that, I think, frees up Mother Nature
-
to tinker around with different
sorts of input channels.
-
So I call this the pH model of evolution,
-
and I don't want to get
too technical here,
-
but pH stands for "potato head,"
-
and I use this name to emphasize
that all these sensors
-
that we know and love, like our eyes
and our ears and our fingertips,
-
these are merely peripheral
plug-and-play devices:
-
you stick them in, and you're good to go.
-
The brain figures out what to do
with the data that comes in.
-
And when you look across
the animal kingdom,
-
you find lots of peripheral devices.
-
So snakes have heat pits
with which to detect infrared,
-
and the ghost knifefish has
electroreceptors,
-
and the star-nosed mole
has this appendage
-
with 22 fingers on it
-
with which it feels around and constructs
a 3D model of the world,
-
and many birds have magnetite
so they can orient
-
to the magnetic field of the planet.
-
So what this means is that
nature doesn't have to continually
-
redesign the brain.
-
Instead, with the principles
of brain operation established,
-
all nature has to worry about
is designing new peripherals.
-
Okay. So what this means is this:
-
the lesson that surfaces
-
is that there's nothing
really special or fundamental
-
about the biology that we
come to the table with.
-
It's just what we have inherited
-
from a complex road of evolution.
-
But it's not what we have to stick with,
-
and our best proof of principle of this
-
comes from what's called
"sensory substitution."
-
And that refers to feeding
information into the brain
-
via unusual sensory channels,
-
and the brain just figures out
what to do with it.
-
Now, that might sound speculative,
-
but the first paper demonstrating this
was published in the journal "Nature"
-
in 1969.
-
So a scientist named Paul Bach-y-Rita
-
put blind people
in a modified dental chair,
-
and he set up a video feed,
-
and he put something
in front of the camera,
-
and then you would feel that
-
poked into your back
with a grid of solenoids.
-
So if you wiggle a coffee cup
in front of the camera,
-
you're feeling that in your back,
-
and amazingly, blind people
got pretty good
-
at being able to determine
what was in front of the camera
-
just by feeling it
in the small of their back.
-
Now, there have been many
modern incarnations of this.
-
The sonic glasses take a video feed
right in front of you
-
and turn that into a sonic landscape,
-
so as things move around,
and get closer and farther,
-
it sounds like "Bzz, bzz, bzz."
-
It sounds like a cacophony,
-
but after several weeks, blind people
start getting pretty good
-
at understanding what's in front of them
-
just based on what they're hearing.
-
And it doesn't have to be
through the ears:
-
this system uses an electrotactile grid
on the forehead,
-
so whatever's in front of the video feed,
you're feeling it on your forehead.
-
Why the forehead? Because you're not
using it for much else.
-
The most modern incarnation
is called the BrainPort,
-
and this is a little electrogrid
that sits on your tongue,
-
and the video feed gets turned into
these little electrotactile signals,
-
and blind people get so good at using this
that they can throw a ball into a basket,
-
or they can navigate
complex obstacle courses.
-
They can come to see through their tongue.
-
Now, that sounds completely insane, right?
-
But remember, all vision ever is
-
is electrochemical signals
coursing around in your brain.
-
Your brain doesn't know
where the signals come from.
-
It just figures out what to do with them.
-
So my interest in my lab
is sensory substitution for the deaf,
-
and this is a project I've undertaken
-
with a graduate student
in my lab, Scott Novich,
-
who is spearheading this for his thesis.
-
And here is what we wanted to do:
-
we wanted to make it so that
sound from the world gets converted
-
in some way so that a deaf person
can understand what is being said.
-
And we wanted to do this, given the power
and ubiquity of portable computing,
-
we wanted to make sure that this
would run on cell phones and tablets,
-
and also we wanted
to make this a wearable,
-
something that you could wear
under your clothing.
-
So here's the concept.
-
So as I'm speaking, my sound
is getting captured by the tablet,
-
and then it's getting mapped onto a vest
that's covered in vibratory motors,
-
just like the motors in your cell phone.
-
So as I'm speaking,
-
the sound is getting translated
to a pattern of vibration on the vest.
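-
The vest's sound-to-touch mapping can be sketched in a few lines. This is an illustrative guess, not the lab's actual encoding: the motor count, frame size, and band-energy scheme here are all assumptions. The idea is simply that each vibratory motor is driven by the energy in one slice of the audio spectrum.

```python
import numpy as np

def sound_to_vibration(samples, n_motors=16):
    """Map one audio frame to per-motor vibration intensities.

    Hypothetical sketch of a vest-style encoding: split the frame's
    frequency spectrum into n_motors bands and drive each motor with
    the mean energy in its band, normalized to the range [0, 1].
    """
    spectrum = np.abs(np.fft.rfft(samples))       # magnitude spectrum
    bands = np.array_split(spectrum, n_motors)    # one band per motor
    energy = np.array([b.mean() for b in bands])  # energy per band
    peak = energy.max()
    return energy / peak if peak > 0 else energy  # normalize to [0, 1]

# Example: a pure 440 Hz tone sampled at 8 kHz lights up
# mainly the low-frequency motors.
t = np.arange(1024) / 8000.0
frame = np.sin(2 * np.pi * 440 * t)
intensities = sound_to_vibration(frame)
```

In a real device this loop would run continuously on short overlapping frames, so speech becomes a moving pattern across the torso rather than a static one.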
-
Now, this is not just conceptual:
-
this tablet is transmitting Bluetooth,
and I'm wearing the vest right now.
-
So as I'm speaking -- (Applause) --
-
the sound is getting translated
into dynamic patterns of vibration.
-
I'm feeling the sonic world around me.
-
So, we've been testing this
with deaf people now,
-
and it turns out that after
just a little bit of time,
-
people can start feeling,
they can start understanding
-
the language of the vest.
-
So this is Jonathan. He's 37 years old.
He has a master's degree.
-
He was born profoundly deaf,
-
which means that there's a part
of his umwelt that's unavailable to him.
-
So we had Jonathan train with the vest
for four days, two hours a day,
-
and here he is on the fifth day.
-
(Video) Scott Novich: You.
-
David Eagleman: So Scott says a word,
Jonathan feels it on the vest,
-
and he writes it on the board.
-
(Video) SN: Where. Where.
-
DE: Jonathan is able to translate
this complicated pattern of vibrations
-
into an understanding
of what's being said.
-
(Video) SN: Touch. Touch.
-
DE: Now, he's not doing this
-
-- (Applause) --
-
Jonathan is not doing this consciously,
because the patterns are too complicated,
-
but his brain is starting to unlock
the pattern that allows it to figure out
-
what the data mean,
-
and our expectation is that,
after wearing this for about three months,
-
he will have a direct
perceptual experience of hearing
-
in the same way that when a blind person
passes a finger over braille,
-
the meaning comes directly off the page
without any conscious intervention at all.
-
Now, this technology has the potential
to be a game-changer,
-
because the only other solution
for deafness is a cochlear implant,
-
and that requires an invasive surgery.
-
And this can be built for 40 times cheaper
than a cochlear implant,
-
which opens up this technology globally,
even for the poorest countries.
-
Now, we've been very encouraged
by our results with sensory substitution,
-
but what we've been thinking a lot about
is sensory addition.
-
How could we use a technology like this
to add a completely new kind of sense,
-
to expand the human umwelt?
-
For example, could we feed
real-time data from the Internet
-
directly into somebody's brain,
-
and can they develop a direct
perceptual experience?
-
So here's an experiment
we're doing in the lab.
-
A subject is feeling a real-time
streaming feed from the Net of data
-
for five seconds.
-
Then, two buttons appear,
and he has to make a choice.
-
He doesn't know what's going on.
-
He makes a choice,
and he gets feedback after one second.
-
Now, here's the thing:
-
the subject has no idea
what all the patterns mean,
-
but we're seeing if he gets better
at figuring out which button to press.
-
He doesn't know that what we're feeding
-
is real-time data from the stock market,
-
and he's making buy and sell decisions.
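-
The trial structure described here is simple enough to sketch. Everything below is a hypothetical framing, not the lab's actual protocol code: each trial, the subject "feels" a window of market data, presses one of two buttons, and a second later is told whether the choice was right. A `policy` function stands in for the subject.

```python
import random

def run_session(price_moves, policy, n_trials=20):
    """Sketch of the two-button trial loop from the talk.

    For each price move, the stand-in subject (`policy`) chooses
    'buy' or 'sell'; 'buy' counts as correct when the price rose.
    Returns the fraction of correct choices.
    """
    correct = 0
    for move in price_moves[:n_trials]:
        choice = policy(move)                    # subject presses a button
        right = (choice == "buy") == (move > 0)  # feedback: right or wrong
        correct += right
    return correct / n_trials

# A guessing subject: chance performance is the baseline the
# experiment hopes to see the real subject climb above.
random.seed(0)
moves = [random.uniform(-1, 1) for _ in range(20)]
score = run_session(moves, lambda m: random.choice(["buy", "sell"]))
```

The experimental question is whether, with the data delivered through the vest instead of a screen, the subject's score drifts above this chance baseline over weeks of feedback.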
-
(Laughter)
-
And the feedback is telling him
whether he did the right thing or not.
-
And what we're seeing is,
can we expand the human umwelt
-
so that he comes to have,
after several weeks,
-
a direct perceptual experience
of the economic movements of the planet.
-
So we'll report on that later
to see how well this goes.
-
(Laughter)
-
Here's another thing we're doing:
-
during the talks this morning,
we've been automatically scraping Twitter
-
for the TED2015 hashtag,
-
and we've been doing
an automated sentiment analysis,
-
which means, are people using positive
words or negative words or neutral?
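-
A minimal version of this kind of automated sentiment analysis is just a word-list count. This toy lexicon approach is an assumption for illustration, not the actual TED2015 pipeline: score each tweet +1 per positive word and -1 per negative word, then label it.

```python
# Tiny illustrative lexicons (hypothetical; real systems use far larger ones)
POSITIVE = {"amazing", "love", "great", "brilliant"}
NEGATIVE = {"boring", "bad", "awful", "confusing"}

def sentiment(tweet):
    """Label a tweet positive, negative, or neutral by lexicon lookup."""
    words = tweet.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

labels = [sentiment(t) for t in [
    "this talk is amazing",
    "so boring",
    "watching ted",
]]
```

To drive the vest, the stream of labels would then be aggregated over a sliding window into a single running mood value.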
-
And while this has been going on,
-
I have been feeling this,
-
and so I am plugged in
to the aggregate emotion
-
of thousands of people in real time,
-
and that's a new kind of human experience,
because now I can know
-
how everyone's doing
and how much you're loving this.
-
(Laughter) (Applause)
-
It's a bigger experience
than a human can normally have.
-
We're also expanding the umwelt of pilots.
-
So in this case, the vest is streaming
nine different measures
-
from this quadcopter,
-
so pitch and yaw and roll
and orientation and heading,
-
and that improves
this pilot's ability to fly it.
-
It's essentially like he's extending
his skin up there, far away.
-
And that's just the beginning.
-
What we're envisioning is taking
a modern cockpit full of gauges
-
and instead of trying
to read the whole thing, you feel it.
-
We live in a world of information now,
-
and there is a difference
between accessing Big Data
-
and experiencing it.
-
So I think there's really no end
to the possibilities
-
on the horizon for human expansion.
-
Just imagine an astronaut
being able to feel
-
the overall health
of the International Space Station,
-
or, for that matter, having you feel
the invisible states of your own health,
-
like your blood sugar
and the state of your microbiome,
-
or having 360-degree vision
or seeing in infrared or ultraviolet.
-
So the key is this:
as we move into the future,
-
we're going to increasingly be able
to choose our own peripheral devices.
-
We no longer have to wait
for Mother Nature's sensory gifts
-
on her timescales,
-
but instead, like any good parent,
she's given us the tools that we need
-
to go out and define our own trajectory.
-
So the question now is,
-
how do you want to go out
and experience your universe?
-
Thank you.
-
(Applause)
-
Chris Anderson: Can you feel it?
DE: Yeah.
-
Actually, this was the first time
I felt applause on the vest.
-
It's nice. It's like a massage. (Laughter)
-
CA: Twitter's going crazy.
Twitter's going mad.
-
So that stock market experiment.
-
This could be the first experiment
that secures its funding forevermore,
-
right, if successful?
-
DE: Well, that's right, I wouldn't
have to write to NIH anymore.
-
CA: Well look, just to be
skeptical for a minute,
-
I mean, this is amazing,
but isn't most of the evidence so far
-
that sensory substitution works,
-
not necessarily
that sensory addition works?
-
I mean, isn't it possible that the
blind person can see through their tongue
-
because the visual cortex is still there,
ready to process,
-
and that that is needed as part of it?
-
DE: That's a great question.
We actually have no idea
-
what the theoretical limits are of what
kind of data the brain can take in.
-
The general story, though,
is that it's extraordinarily flexible.
-
So when a person goes blind,
what we used to call their visual cortex
-
gets taken over by other things,
by touch, by hearing, by vocabulary.
-
So what that tells us is that
the cortex is kind of a one-trick pony.
-
It just runs certain kinds
of computations on things.
-
And when we look around
at things like braille, for example,
-
people are getting information
through bumps on their fingers.
-
So I don't think we have any reason
to think there's a theoretical limit
-
that we know the edge of.
-
CA: If this checks out,
you're going to be deluged.
-
There are so many
possible applications for this.
-
Are you ready for this? What are you most
excited about, the direction it might go?
-
DE: I mean, I think there's
a lot of applications here.
-
In terms of beyond sensory substitution,
the things I started mentioning
-
about astronauts on the space station,
they spend a lot of their time
-
monitoring things, and they could instead
just get what's going on,
-
because what this is really good for
is multidimensional data.
-
The key is this: our visual systems
are good at detecting blobs and edges,
-
but they're really bad
at what our world has become,
-
which is screens
with lots and lots of data.
-
We have to crawl that
with our attentional systems.
-
So this is a way of just
feeling the state of something,
-
just like the way you know the state
of your body as you're standing around.
-
So I think heavy machinery, safety,
feeling the state of a factory,
-
of your equipment, that's one place
it'll go right away.
-
CA: David Eagleman, that was one
mind-blowing talk. Thank you very much.
-
DE: Thank you, Chris.
(Applause)
A correction was made to this transcript on 1/15/16.
At 19:15, the subtitle now reads: "I don't think"