How to see past your own perspective and find truth

  • 0:01 - 0:04
    So, imagine that you had
    your smartphone miniaturized
  • 0:04 - 0:07
    and hooked up directly to your brain.
  • 0:08 - 0:10
    If you had this sort of brain chip,
  • 0:10 - 0:12
    you'd be able to upload
    and download to the internet
  • 0:12 - 0:13
    at the speed of thought.
  • 0:14 - 0:17
    Accessing social media or Wikipedia
    would be a lot like --
  • 0:17 - 0:19
    well, from the inside at least --
  • 0:19 - 0:21
    like consulting your own memory.
  • 0:21 - 0:24
    It would be as easy
    and as intimate as thinking.
  • 0:26 - 0:29
    But would it make it easier
    for you to know what's true?
  • 0:29 - 0:32
    Just because a way
    of accessing information is faster
  • 0:32 - 0:34
    doesn't mean it's more
    reliable, of course,
  • 0:34 - 0:37
    and it doesn't mean that we would all
    interpret it the same way.
  • 0:37 - 0:41
    And it doesn't mean that you would be
    any better at evaluating it.
  • 0:41 - 0:43
    In fact, you might even be worse,
  • 0:43 - 0:45
    because, you know, more data,
    less time for evaluation.
  • 0:46 - 0:50
    Something like this is already
    happening to us right now.
  • 0:50 - 0:54
    We already carry a world of information
    around in our pockets,
  • 0:54 - 0:58
    but it seems as if the more information
    we share and access online,
  • 0:58 - 1:01
    the more difficult it can be for us
    to tell the difference
  • 1:01 - 1:03
    between what's real and what's fake.
  • 1:04 - 1:07
    It's as if we know more
    but understand less.
  • 1:08 - 1:11
    Now, it's a feature
    of modern life, I suppose,
  • 1:11 - 1:15
    that large swaths of the public
    live in isolated information bubbles.
  • 1:16 - 1:21
    We're polarized: not just over values,
    but over the facts.
  • 1:21 - 1:24
    One reason for that is, the data
    analytics that drive the internet
  • 1:24 - 1:27
    get us not just more information,
  • 1:27 - 1:29
    but more of the information that we want.
  • 1:29 - 1:31
    Our online life is personalized;
  • 1:31 - 1:33
    everything from the ads we read
  • 1:33 - 1:36
    to the news that comes down
    our Facebook feed
  • 1:36 - 1:39
    is tailored to satisfy our preferences.
  • 1:39 - 1:41
    And so while we get more information,
  • 1:41 - 1:44
    a lot of that information ends up
    reflecting ourselves
  • 1:44 - 1:47
    as much as it does reality.
  • 1:47 - 1:49
    It ends up, I suppose,
  • 1:50 - 1:52
    inflating our bubbles
    rather than bursting them.
  • 1:53 - 1:55
    And so maybe it's no surprise
  • 1:55 - 1:58
    that we're in a situation,
    a paradoxical situation,
  • 1:58 - 2:01
    of thinking that we know so much more,
  • 2:01 - 2:04
    and yet not agreeing
    on what it is we know.
  • 2:05 - 2:09
    So how are we going to solve
    this problem of knowledge polarization?
  • 2:09 - 2:13
    One obvious tactic is to try
    to fix our technology,
  • 2:13 - 2:15
    to redesign our digital platforms,
  • 2:15 - 2:18
    so as to make them less
    susceptible to polarization.
  • 2:19 - 2:20
    And I'm happy to report
  • 2:20 - 2:25
    that many smart people at Google
    and Facebook are working on just that.
  • 2:25 - 2:26
    And these projects are vital.
  • 2:28 - 2:31
    I think that fixing technology
    is obviously really important,
  • 2:31 - 2:36
    but I don't think fixing technology alone
    is going to solve the problem
  • 2:36 - 2:37
    of knowledge polarization.
  • 2:37 - 2:40
    I don't think that because I don't think,
    at the end of the day,
  • 2:40 - 2:42
    it is a technological problem.
  • 2:42 - 2:44
    I think it's a human problem,
  • 2:44 - 2:47
    having to do with how we think
    and what we value.
  • 2:48 - 2:51
    In order to solve it, I think
    we're going to need help.
  • 2:51 - 2:54
    We're going to need help
    from psychology and political science.
  • 2:54 - 2:57
    But we're also going to need help,
    I think, from philosophy.
  • 2:59 - 3:02
    Because to solve the problem
    of knowledge polarization,
  • 3:04 - 3:06
    we're going to need to reconnect
  • 3:06 - 3:10
    with one fundamental, philosophical idea:
  • 3:11 - 3:14
    that we live in a common reality.
  • 3:15 - 3:19
    The idea of a common reality
    is like, I suppose,
  • 3:19 - 3:20
    a lot of philosophical concepts:
  • 3:20 - 3:21
    easy to state
  • 3:21 - 3:24
    but mysteriously difficult
    to put into practice.
  • 3:25 - 3:26
    To really accept it,
  • 3:26 - 3:29
    I think we need to do three things,
  • 3:29 - 3:31
    each of which is a challenge right now.
  • 3:33 - 3:35
    First, we need to believe in truth.
  • 3:36 - 3:37
    You might have noticed
  • 3:37 - 3:40
    that our culture is having
    something of a troubled relationship
  • 3:40 - 3:42
    with that concept right now.
  • 3:43 - 3:46
    It seems as if we disagree so much that,
  • 3:46 - 3:49
    as one political commentator
    put it not long ago,
  • 3:49 - 3:51
    it's as if there are no facts anymore.
  • 3:53 - 3:57
    But that thought is actually an expression
  • 3:57 - 4:01
    of a sort of seductive line
    of argument that's in the air.
  • 4:02 - 4:03
    It goes like this:
  • 4:04 - 4:07
    we just can't step outside
    of our own perspectives;
  • 4:07 - 4:10
    we can't step outside of our biases.
  • 4:10 - 4:11
    Every time we try,
  • 4:11 - 4:15
    we just get more information
    from our perspective.
  • 4:16 - 4:18
    So, this line of thought goes,
  • 4:19 - 4:23
    we might as well admit
    that objective truth is an illusion,
  • 4:23 - 4:24
    or it doesn't matter,
  • 4:24 - 4:26
    because either we'll never
    know what it is,
  • 4:27 - 4:29
    or it doesn't exist in the first place.
  • 4:31 - 4:34
    That's not a new philosophical thought --
  • 4:34 - 4:36
    skepticism about truth.
  • 4:37 - 4:40
    During the end of the last century,
    as some of you know,
  • 4:40 - 4:43
    it was very popular in certain
    academic circles.
  • 4:43 - 4:48
    But it really goes back all the way
    to the Greek philosopher Protagoras,
  • 4:48 - 4:50
    if not farther back.
  • 4:50 - 4:53
    Protagoras said that objective
    truth was an illusion
  • 4:53 - 4:56
    because "man is the measure
    of all things."
  • 4:56 - 4:58
    Man is the measure of all things.
  • 4:58 - 5:01
    That can seem like a bracing bit
    of realpolitik to people,
  • 5:01 - 5:02
    or liberating,
  • 5:02 - 5:07
    because it allows each of us
    to discover or make our own truth.
  • 5:09 - 5:13
    But actually, I think it's a bit
    of self-serving rationalization
  • 5:13 - 5:15
    disguised as philosophy.
  • 5:16 - 5:18
    It confuses the difficulty
    of being certain
  • 5:18 - 5:21
    with the impossibility of truth.
  • 5:22 - 5:23
    Look --
  • 5:25 - 5:28
    of course it's difficult
    to be certain about anything;
  • 5:29 - 5:31
    we might all be living in "The Matrix."
  • 5:32 - 5:34
    You might have a brain chip in your head
  • 5:34 - 5:36
    feeding you all the wrong information.
  • 5:38 - 5:42
    But in practice, we do agree
    on all sorts of facts.
  • 5:42 - 5:45
    We agree that bullets can kill people.
  • 5:46 - 5:50
    We agree that you can't flap
    your arms and fly.
  • 5:50 - 5:52
    We agree -- or we should --
  • 5:53 - 5:55
    that there is an external reality
  • 5:55 - 5:57
    and ignoring it can get you hurt.
  • 5:59 - 6:03
    Nonetheless, skepticism
    about truth can be tempting,
  • 6:03 - 6:07
    because it allows us to rationalize
    away our own biases.
  • 6:07 - 6:10
    When we do that, we're sort of like
    the guy in the movie
  • 6:10 - 6:12
    who knew he was living in "The Matrix"
  • 6:13 - 6:16
    but decided he liked it there, anyway.
  • 6:17 - 6:20
    After all, getting what you
    want feels good.
  • 6:20 - 6:23
    Being right all the time feels good.
  • 6:23 - 6:26
    So, often it's easier for us
  • 6:26 - 6:29
    to wrap ourselves in our cozy
    information bubbles,
  • 6:30 - 6:32
    live in bad faith,
  • 6:32 - 6:35
    and take those bubbles
    as the measure of reality.
  • 6:37 - 6:42
    An example, I think, of how
    this bad faith gets into our actions
  • 6:42 - 6:47
    is our reaction
    to the phenomenon of fake news.
  • 6:48 - 6:51
    The fake news that spread on the internet
  • 6:51 - 6:55
    during the American
    presidential election of 2016
  • 6:56 - 6:58
    was designed to feed into our biases,
  • 6:58 - 7:00
    designed to inflate our bubbles.
  • 7:00 - 7:02
    But what was really striking about it
  • 7:02 - 7:05
    was not just that it fooled
    so many people.
  • 7:06 - 7:08
    What was really striking to me
    about fake news,
  • 7:08 - 7:10
    the phenomenon,
  • 7:10 - 7:15
    is how quickly it itself became
    the subject of knowledge polarization;
  • 7:16 - 7:19
    so much so, that the very term --
    the very term -- "fake news"
  • 7:19 - 7:23
    now just means: "news story I don't like."
  • 7:23 - 7:28
    That's an example of the bad faith
    towards the truth that I'm talking about.
  • 7:31 - 7:35
    But the really, I think, dangerous thing
  • 7:36 - 7:39
    about skepticism with regard to truth
  • 7:39 - 7:41
    is that it leads to despotism.
  • 7:42 - 7:45
    "Man is the measure of all things"
  • 7:45 - 7:49
    inevitably becomes "The Man
    is the measure of all things."
  • 7:50 - 7:53
    Just as "every man for himself"
  • 7:53 - 7:56
    always seems to turn out to be
    "only the strong survive."
  • 7:56 - 7:59
    At the end of Orwell's "1984,"
  • 8:00 - 8:04
    the thought policeman O'Brien is torturing
    the protagonist Winston Smith
  • 8:04 - 8:08
    into believing two plus two equals five.
  • 8:09 - 8:11
    What O'Brien says is that the point
  • 8:13 - 8:18
    is to convince Smith
    that whatever the party says is the truth,
  • 8:18 - 8:21
    and the truth is whatever the party says.
  • 8:21 - 8:25
    And what O'Brien knows
    is that once this thought is accepted,
  • 8:26 - 8:29
    critical dissent is impossible.
  • 8:30 - 8:32
    You can't speak truth to power
  • 8:32 - 8:35
    if the power speaks truth by definition.
  • 8:37 - 8:41
    I said that in order to accept
    that we really live in a common reality,
  • 8:41 - 8:42
    we have to do three things.
  • 8:42 - 8:44
    The first thing is to believe in truth.
  • 8:44 - 8:46
    The second thing can be summed up
  • 8:46 - 8:51
    by the Latin phrase that Kant took
    as the motto for the Enlightenment:
  • 8:51 - 8:53
    "Sapere aude,"
  • 8:53 - 8:55
    or "dare to know."
  • 8:55 - 8:57
    Or, as Kant would have it,
    "to dare to know for yourself."
  • 8:58 - 9:00
    I think in the early days of the internet,
  • 9:00 - 9:01
    a lot of us thought
  • 9:01 - 9:05
    that information technology
    was always going to make it easier
  • 9:05 - 9:07
    for us to know for ourselves,
  • 9:07 - 9:10
    and of course in many ways, it has.
  • 9:10 - 9:14
    But as the internet has become
    more and more a part of our lives,
  • 9:14 - 9:16
    our reliance on it, our use of it,
  • 9:16 - 9:18
    has become often more passive.
  • 9:18 - 9:21
    Much of what we know today we Google-know.
  • 9:21 - 9:25
    We download prepackaged sets of facts
  • 9:25 - 9:29
    and sort of shuffle them along
    the assembly line of social media.
  • 9:29 - 9:31
    Now, Google-knowing is useful
  • 9:31 - 9:34
    precisely because it involves
    a sort of intellectual outsourcing.
  • 9:34 - 9:40
    We offload our effort onto a network
    of others and algorithms.
  • 9:40 - 9:43
    And that allows us, of course,
    to not clutter our minds
  • 9:43 - 9:44
    with all sorts of facts.
  • 9:44 - 9:47
    We can just download them
    when we need them.
  • 9:47 - 9:48
    And that's awesome.
  • 9:49 - 9:54
    But there's a difference
    between downloading a set of facts
  • 9:55 - 10:00
    and really understanding how or why
    those facts are as they are.
  • 10:01 - 10:06
    Understanding why
    a particular disease spreads,
  • 10:06 - 10:08
    or how a mathematical proof works,
  • 10:08 - 10:10
    or why your friend is depressed,
  • 10:10 - 10:12
    involves more than just downloading.
  • 10:13 - 10:15
    It's going to require, most likely,
  • 10:16 - 10:18
    doing some work for yourself:
  • 10:19 - 10:20
    having a little creative insight;
  • 10:20 - 10:22
    using your imagination;
  • 10:22 - 10:23
    getting out into the field;
  • 10:23 - 10:24
    doing the experiment;
  • 10:24 - 10:25
    working through the proof;
  • 10:26 - 10:27
    talking to someone.
  • 10:32 - 10:35
    Now, I'm not saying, of course,
    that we should stop Google-knowing.
  • 10:36 - 10:38
    I'm just saying
  • 10:38 - 10:39
    we shouldn't overvalue it, either.
  • 10:39 - 10:44
    We need to find ways of encouraging
    forms of knowing that are more active,
  • 10:45 - 10:50
    and don't always involve passing off
    our effort into our bubble.
  • 10:50 - 10:54
    Because the thing about Google-knowing
    is that too often it ends up
  • 10:54 - 10:55
    being bubble-knowing.
  • 10:56 - 10:58
    And bubble-knowing means
    always being right.
  • 10:59 - 11:01
    But daring to know,
  • 11:01 - 11:03
    daring to understand,
  • 11:04 - 11:07
    means risking the possibility
    that you could be wrong.
  • 11:08 - 11:10
    It means risking the possibility
  • 11:10 - 11:15
    that what you want and what's true
    are different things.
  • 11:16 - 11:19
    Which brings me to the third thing
    that I think we need to do
  • 11:20 - 11:23
    if we want to accept that we live
    in a common reality.
  • 11:23 - 11:26
    That third thing is:
    have a little humility.
  • 11:27 - 11:29
    By humility here, I mean
    epistemic humility,
  • 11:29 - 11:31
    which means, in a sense,
  • 11:32 - 11:34
    knowing that you don't know it all.
  • 11:34 - 11:36
    But it also means something
    more than that.
  • 11:36 - 11:41
    It means seeing your worldview
    as open to improvement
  • 11:41 - 11:43
    by the evidence and experience of others.
  • 11:43 - 11:45
    Seeing your worldview
    as open to improvement
  • 11:45 - 11:47
    by the evidence and experience of others.
  • 11:48 - 11:50
    That's more than just
    being open to change.
  • 11:50 - 11:53
    It's more than just being open
    to self-improvement.
  • 11:53 - 11:57
    It means seeing your knowledge
    as capable of enhancing
  • 11:57 - 11:59
    or being enriched
    by what others contribute.
  • 12:00 - 12:03
    That's part of what is involved
  • 12:03 - 12:05
    in recognizing there's a common reality
  • 12:06 - 12:08
    that you, too, are responsible to.
  • 12:10 - 12:12
    I don't think it's much
    of a stretch to say
  • 12:12 - 12:17
    that our society is not particularly great
    at enhancing or encouraging
  • 12:17 - 12:18
    that sort of humility.
  • 12:18 - 12:20
    That's partly because,
  • 12:21 - 12:24
    well, we tend to confuse
    arrogance and confidence.
  • 12:24 - 12:27
    And it's partly because, well, you know,
  • 12:27 - 12:29
    arrogance is just easier.
  • 12:29 - 12:32
    It's just easier to think of yourself
    as knowing it all.
  • 12:32 - 12:35
    It's just easier to think of yourself
    as having it all figured out.
  • 12:37 - 12:39
    But that's another example
    of the bad faith towards the truth
  • 12:39 - 12:41
    that I've been talking about.
  • 12:43 - 12:46
    So the concept of a common reality,
  • 12:46 - 12:48
    like a lot of philosophical concepts,
  • 12:48 - 12:50
    can seem so obvious,
  • 12:51 - 12:53
    that we can look right past it
  • 12:54 - 12:56
    and forget why it's important.
  • 12:57 - 13:02
    Democracies can't function
    if their citizens don't strive,
  • 13:02 - 13:04
    at least some of the time,
  • 13:04 - 13:05
    to inhabit a common space,
  • 13:05 - 13:09
    a space where they can pass
    ideas back and forth
  • 13:10 - 13:12
    when -- and especially when --
  • 13:12 - 13:13
    they disagree.
  • 13:14 - 13:16
    But you can't strive to inhabit that space
  • 13:18 - 13:21
    if you don't already accept
    that you live in the same reality.
  • 13:23 - 13:25
    To accept that, we've got
    to believe in truth,
  • 13:25 - 13:29
    we've got to encourage
    more active ways of knowing.
  • 13:29 - 13:31
    And we've got to have the humility
  • 13:32 - 13:35
    to realize that we're not
    the measure of all things.
  • 13:37 - 13:41
    We may yet one day realize the vision
  • 13:41 - 13:43
    of having the internet in our brains.
  • 13:45 - 13:48
    But if we want that to be liberating
    and not terrifying,
  • 13:48 - 13:51
    if we want it to expand our understanding
  • 13:51 - 13:54
    and not just our passive knowing,
  • 13:55 - 13:58
    we need to remember that our perspectives,
  • 13:58 - 14:01
    as wondrous, as beautiful as they are,
  • 14:02 - 14:03
    are just that --
  • 14:03 - 14:06
    perspectives on one reality.
  • 14:07 - 14:08
    Thank you.
  • 14:08 - 14:13
    (Applause)
Title:
How to see past your own perspective and find truth
Speaker:
Michael Patrick Lynch
Video Language:
English
Project:
TEDTalks
Duration:
14:26
