Dare to disagree

  • 0:00 - 0:02
    In Oxford in the 1950s,
  • 0:02 - 0:06
    there was a fantastic doctor, who was very unusual,
  • 0:06 - 0:08
    named Alice Stewart.
  • 0:08 - 0:11
    And Alice was unusual partly because, of course,
  • 0:11 - 0:15
    she was a woman, which was pretty rare in the 1950s.
  • 0:15 - 0:17
And she was brilliant -- she was,
  • 0:17 - 0:22
    at the time, the youngest Fellow to be elected to the Royal College of Physicians.
  • 0:22 - 0:25
    She was unusual too because she continued to work after she got married,
  • 0:25 - 0:27
    after she had kids,
  • 0:27 - 0:30
    and even after she got divorced and was a single parent,
  • 0:30 - 0:33
    she continued her medical work.
  • 0:33 - 0:37
    And she was unusual because she was really interested in a new science,
  • 0:37 - 0:40
    the emerging field of epidemiology,
  • 0:40 - 0:43
    the study of patterns in disease.
  • 0:43 - 0:45
    But like every scientist, she appreciated
  • 0:45 - 0:47
    that to make her mark, what she needed to do
  • 0:47 - 0:52
    was find a hard problem and solve it.
  • 0:52 - 0:54
    The hard problem that Alice chose
  • 0:54 - 0:58
    was the rising incidence of childhood cancers.
  • 0:58 - 1:00
    Most disease is correlated with poverty,
  • 1:00 - 1:02
    but in the case of childhood cancers,
  • 1:02 - 1:05
    the children who were dying seemed mostly to come
  • 1:05 - 1:07
    from affluent families.
  • 1:07 - 1:09
    So, what, she wanted to know,
  • 1:09 - 1:12
    could explain this anomaly?
  • 1:12 - 1:15
    Now, Alice had trouble getting funding for her research.
  • 1:15 - 1:17
    In the end, she got just 1,000 pounds
  • 1:17 - 1:19
from the Lady Tata Memorial Prize.
  • 1:19 - 1:22
    And that meant she knew she only had one shot
  • 1:22 - 1:24
    at collecting her data.
  • 1:24 - 1:26
    Now, she had no idea what to look for.
  • 1:26 - 1:29
    This really was a needle in a haystack sort of search,
  • 1:29 - 1:32
    so she asked everything she could think of.
  • 1:32 - 1:34
    Had the children eaten boiled sweets?
  • 1:34 - 1:36
    Had they consumed colored drinks?
  • 1:36 - 1:38
    Did they eat fish and chips?
  • 1:38 - 1:40
    Did they have indoor or outdoor plumbing?
  • 1:40 - 1:43
    What time of life had they started school?
  • 1:43 - 1:46
And when her carbon-copied questionnaire started to come back,
  • 1:46 - 1:49
    one thing and one thing only jumped out
  • 1:49 - 1:52
with the kind of statistical clarity that
  • 1:52 - 1:55
    most scientists can only dream of.
  • 1:55 - 1:57
    By a rate of two to one,
  • 1:57 - 1:59
    the children who had died
  • 1:59 - 2:05
    had had mothers who had been X-rayed when pregnant.
  • 2:05 - 2:09
    Now that finding flew in the face of conventional wisdom.
  • 2:09 - 2:11
    Conventional wisdom held
  • 2:11 - 2:15
    that everything was safe up to a point, a threshold.
  • 2:15 - 2:18
    It flew in the face of conventional wisdom,
  • 2:18 - 2:21
    which was huge enthusiasm for the cool new technology
  • 2:21 - 2:25
    of that age, which was the X-ray machine.
  • 2:25 - 2:29
    And it flew in the face of doctors' idea of themselves,
  • 2:29 - 2:33
    which was as people who helped patients,
  • 2:33 - 2:36
    they didn't harm them.
  • 2:36 - 2:39
    Nevertheless, Alice Stewart rushed to publish
  • 2:39 - 2:43
    her preliminary findings in The Lancet in 1956.
  • 2:43 - 2:47
    People got very excited, there was talk of the Nobel Prize,
  • 2:47 - 2:49
    and Alice really was in a big hurry
  • 2:49 - 2:53
    to try to study all the cases of childhood cancer she could find
  • 2:53 - 2:55
    before they disappeared.
  • 2:55 - 2:59
    In fact, she need not have hurried.
  • 2:59 - 3:03
    It was fully 25 years before the British and medical --
  • 3:03 - 3:06
    British and American medical establishments
  • 3:06 - 3:12
    abandoned the practice of X-raying pregnant women.
  • 3:12 - 3:18
    The data was out there, it was open, it was freely available,
  • 3:18 - 3:22
    but nobody wanted to know.
  • 3:22 - 3:25
    A child a week was dying,
  • 3:25 - 3:28
    but nothing changed.
  • 3:28 - 3:34
    Openness alone can't drive change.
  • 3:34 - 3:39
    So for 25 years Alice Stewart had a very big fight on her hands.
  • 3:39 - 3:43
    So, how did she know that she was right?
  • 3:43 - 3:46
    Well, she had a fantastic model for thinking.
  • 3:46 - 3:49
    She worked with a statistician named George Kneale,
  • 3:49 - 3:51
    and George was pretty much everything that Alice wasn't.
  • 3:51 - 3:54
    So, Alice was very outgoing and sociable,
  • 3:54 - 3:56
    and George was a recluse.
  • 3:56 - 4:00
    Alice was very warm, very empathetic with her patients.
  • 4:00 - 4:05
    George frankly preferred numbers to people.
  • 4:05 - 4:09
    But he said this fantastic thing about their working relationship.
  • 4:09 - 4:15
    He said, "My job is to prove Dr. Stewart wrong."
  • 4:15 - 4:18
    He actively sought disconfirmation.
  • 4:18 - 4:21
    Different ways of looking at her models,
  • 4:21 - 4:24
    at her statistics, different ways of crunching the data
  • 4:24 - 4:27
    in order to disprove her.
  • 4:27 - 4:33
    He saw his job as creating conflict around her theories.
  • 4:33 - 4:36
    Because it was only by not being able to prove
  • 4:36 - 4:38
    that she was wrong,
  • 4:38 - 4:41
    that George could give Alice the confidence she needed
  • 4:41 - 4:44
    to know that she was right.
  • 4:44 - 4:49
    It's a fantastic model of collaboration --
  • 4:49 - 4:54
    thinking partners who aren't echo chambers.
  • 4:54 - 4:56
    I wonder how many of us have,
  • 4:56 - 5:03
    or dare to have, such collaborators.
  • 5:03 - 5:07
    Alice and George were very good at conflict.
  • 5:07 - 5:10
    They saw it as thinking.
  • 5:10 - 5:14
    So what does that kind of constructive conflict require?
  • 5:14 - 5:18
    Well, first of all, it requires that we find people
  • 5:18 - 5:20
    who are very different from ourselves.
  • 5:20 - 5:25
    That means we have to resist the neurobiological drive,
  • 5:25 - 5:29
which makes us prefer people mostly like ourselves,
  • 5:29 - 5:31
    and it means we have to seek out people
  • 5:31 - 5:34
    with different backgrounds, different disciplines,
  • 5:34 - 5:38
    different ways of thinking and different experience,
  • 5:38 - 5:42
    and find ways to engage with them.
  • 5:42 - 5:47
    That requires a lot of patience and a lot of energy.
  • 5:47 - 5:48
    And the more I've thought about this,
  • 5:48 - 5:54
    the more I think, really, that that's a kind of love.
  • 5:54 - 5:57
    Because you simply won't commit that kind of energy
  • 5:57 - 6:01
    and time if you don't really care.
  • 6:01 - 6:06
    And it also means that we have to be prepared to change our minds.
  • 6:06 - 6:08
    Alice's daughter told me
  • 6:08 - 6:11
    that every time Alice went head-to-head with a fellow scientist,
  • 6:11 - 6:15
    they made her think and think and think again.
  • 6:15 - 6:19
    "My mother," she said, "My mother didn't enjoy a fight,
  • 6:19 - 6:25
    but she was really good at them."
  • 6:25 - 6:29
    So it's one thing to do that in a one-to-one relationship.
  • 6:29 - 6:32
    But it strikes me that the biggest problems we face,
  • 6:32 - 6:35
    many of the biggest disasters that we've experienced,
  • 6:35 - 6:37
    mostly haven't come from individuals,
  • 6:37 - 6:39
    they've come from organizations,
  • 6:39 - 6:41
    some of them bigger than countries,
  • 6:41 - 6:43
    many of them capable of affecting hundreds,
  • 6:43 - 6:47
    thousands, even millions of lives.
  • 6:47 - 6:51
    So how do organizations think?
  • 6:51 - 6:56
    Well, for the most part, they don't.
  • 6:56 - 6:59
    And that isn't because they don't want to,
  • 6:59 - 7:01
    it's really because they can't.
  • 7:01 - 7:04
    And they can't because the people inside of them
  • 7:04 - 7:08
    are too afraid of conflict.
  • 7:08 - 7:11
    In surveys of European and American executives,
  • 7:11 - 7:14
    fully 85 percent of them acknowledged
  • 7:14 - 7:18
    that they had issues or concerns at work
  • 7:18 - 7:21
    that they were afraid to raise.
  • 7:21 - 7:25
    Afraid of the conflict that that would provoke,
  • 7:25 - 7:27
    afraid to get embroiled in arguments
  • 7:27 - 7:29
    that they did not know how to manage,
  • 7:29 - 7:34
    and felt that they were bound to lose.
  • 7:34 - 7:40
    Eighty-five percent is a really big number.
  • 7:40 - 7:43
    It means that organizations mostly can't do
  • 7:43 - 7:45
    what George and Alice so triumphantly did.
  • 7:45 - 7:49
    They can't think together.
  • 7:49 - 7:52
    And it means that people like many of us,
  • 7:52 - 7:54
    who have run organizations,
  • 7:54 - 7:57
    and gone out of our way to try to find the very best people we can,
  • 7:57 - 8:04
    mostly fail to get the best out of them.
  • 8:04 - 8:07
    So how do we develop the skills that we need?
  • 8:07 - 8:11
    Because it does take skill and practice, too.
  • 8:11 - 8:14
    If we aren't going to be afraid of conflict,
  • 8:14 - 8:17
    we have to see it as thinking,
  • 8:17 - 8:21
    and then we have to get really good at it.
  • 8:21 - 8:25
    So, recently, I worked with an executive named Joe,
  • 8:25 - 8:29
    and Joe worked for a medical device company.
  • 8:29 - 8:32
    And Joe was very worried about the device that he was working on.
  • 8:32 - 8:35
    He thought that it was too complicated
  • 8:35 - 8:37
    and he thought that its complexity
  • 8:37 - 8:41
    created margins of error that could really hurt people.
  • 8:41 - 8:45
    He was afraid of doing damage to the patients he was trying to help.
  • 8:45 - 8:47
    But when he looked around his organization,
  • 8:47 - 8:52
    nobody else seemed to be at all worried.
  • 8:52 - 8:54
    So, he didn't really want to say anything.
  • 8:54 - 8:56
    After all, maybe they knew something he didn't.
  • 8:56 - 8:59
    Maybe he'd look stupid.
  • 8:59 - 9:01
    But he kept worrying about it,
  • 9:01 - 9:04
    and he worried about it so much that he got to the point
  • 9:04 - 9:06
    where he thought the only thing he could do
  • 9:06 - 9:11
    was leave a job he loved.
  • 9:11 - 9:15
    In the end, Joe and I found a way
  • 9:15 - 9:16
    for him to raise his concerns.
  • 9:16 - 9:19
    And what happened then is what almost always
  • 9:19 - 9:21
    happens in this situation.
  • 9:21 - 9:24
    It turned out everybody had exactly the same
  • 9:24 - 9:26
    questions and doubts.
  • 9:26 - 9:30
    So now Joe had allies. They could think together.
  • 9:30 - 9:33
    And yes, there was a lot of conflict and debate
  • 9:33 - 9:37
    and argument, but that allowed everyone around the table
  • 9:37 - 9:42
    to be creative, to solve the problem,
  • 9:42 - 9:46
    and to change the device.
  • 9:46 - 9:49
    Joe was what a lot of people might think of
  • 9:49 - 9:51
    as a whistle-blower,
  • 9:51 - 9:54
    except that like almost all whistle-blowers,
  • 9:54 - 9:57
    he wasn't a crank at all,
  • 9:57 - 10:00
    he was passionately devoted to the organization
  • 10:00 - 10:03
    and the higher purposes that that organization served.
  • 10:03 - 10:07
    But he had been so afraid of conflict,
  • 10:07 - 10:12
    until finally he became more afraid of the silence.
  • 10:12 - 10:14
    And when he dared to speak,
  • 10:14 - 10:18
    he discovered much more inside himself
  • 10:18 - 10:23
    and much more give in the system than he had ever imagined.
  • 10:23 - 10:26
    And his colleagues don't think of him as a crank.
  • 10:26 - 10:31
    They think of him as a leader.
  • 10:31 - 10:36
    So, how do we have these conversations more easily
  • 10:36 - 10:38
    and more often?
  • 10:38 - 10:40
    Well, the University of Delft
  • 10:40 - 10:42
    requires that its PhD students
  • 10:42 - 10:46
submit five statements that they're prepared to defend.
  • 10:46 - 10:49
    It doesn't really matter what the statements are about,
  • 10:49 - 10:53
    what matters is that the candidates are willing and able
  • 10:53 - 10:56
    to stand up to authority.
  • 10:56 - 10:58
    I think it's a fantastic system,
  • 10:58 - 11:01
    but I think leaving it to PhD candidates
  • 11:01 - 11:05
    is far too few people, and way too late in life.
  • 11:05 - 11:08
    I think we need to be teaching these skills
  • 11:08 - 11:12
    to kids and adults at every stage of their development,
  • 11:12 - 11:15
    if we want to have thinking organizations
  • 11:15 - 11:18
    and a thinking society.
  • 11:18 - 11:24
    The fact is that most of the biggest catastrophes that we've witnessed
  • 11:24 - 11:30
    rarely come from information that is secret or hidden.
  • 11:30 - 11:35
They come from information that is freely available and out there,
  • 11:35 - 11:37
    but that we are willfully blind to,
  • 11:37 - 11:40
    because we can't handle, don't want to handle,
  • 11:40 - 11:44
    the conflict that it provokes.
  • 11:44 - 11:47
    But when we dare to break that silence,
  • 11:47 - 11:50
    or when we dare to see,
  • 11:50 - 11:52
    and we create conflict,
  • 11:52 - 11:55
    we enable ourselves and the people around us
  • 11:55 - 11:59
    to do our very best thinking.
  • 11:59 - 12:03
    Open information is fantastic,
  • 12:03 - 12:06
    open networks are essential.
  • 12:06 - 12:08
    But the truth won't set us free
  • 12:08 - 12:11
    until we develop the skills and the habit and the talent
  • 12:11 - 12:16
    and the moral courage to use it.
  • 12:16 - 12:19
    Openness isn't the end.
  • 12:19 - 12:22
    It's the beginning.
  • 12:22 - 12:33
    (Applause)
Title: Dare to disagree
Speaker: Margaret Heffernan
Description:

Most people instinctively avoid conflict, but as Margaret Heffernan shows us, good disagreement is central to progress. She illustrates (sometimes counterintuitively) how the best partners aren’t echo chambers -- and how great research teams, relationships and businesses allow people to deeply disagree.

Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 12:56