Let's pool our medical data

  • 0:00 - 0:03
    So I have bad news, I have good news,
  • 0:03 - 0:05
    and I have a task.
  • 0:05 - 0:08
    So the bad news is that we all get sick.
  • 0:08 - 0:10
    I get sick. You get sick.
  • 0:10 - 0:13
    And every one of us gets sick, and the question really is,
  • 0:13 - 0:16
    how sick do we get? Is it something that kills us?
  • 0:16 - 0:17
    Is it something that we survive?
  • 0:17 - 0:19
    Is it something that we can treat?
  • 0:19 - 0:22
    And we've gotten sick as long as we've been people.
  • 0:22 - 0:26
    And so we've always looked for reasons to explain why we get sick.
  • 0:26 - 0:28
    And for a long time, it was the gods, right?
  • 0:28 - 0:31
    The gods are angry with me, or the gods are testing me,
  • 0:31 - 0:33
    right? Or God, singular, more recently,
  • 0:33 - 0:36
    is punishing me or judging me.
  • 0:36 - 0:39
    And as long as we've looked for explanations,
  • 0:39 - 0:42
    we've wound up with something that gets closer and closer to science,
  • 0:42 - 0:45
    which is hypotheses as to why we get sick,
  • 0:45 - 0:49
    and as long as we've had hypotheses about why we get sick, we've tried to treat it as well.
  • 0:49 - 0:54
    So this is Avicenna. He wrote a book over a thousand years ago called "The Canon of Medicine,"
  • 0:54 - 0:56
    and the rules he laid out for testing medicines
  • 0:56 - 0:58
    are actually really similar to the rules we have today,
  • 0:58 - 1:01
    that the disease and the medicine must be the same strength,
  • 1:01 - 1:03
    the medicine needs to be pure, and in the end we need
  • 1:03 - 1:06
    to test it in people. And so if you put together these themes
  • 1:06 - 1:11
    of a narrative or a hypothesis and human testing,
  • 1:11 - 1:13
    right, you get some beautiful results,
  • 1:13 - 1:15
    even when we didn't have very good technologies.
  • 1:15 - 1:18
    This is a guy named Carlos Finlay. He had a hypothesis
  • 1:18 - 1:21
    that was way outside the box for his time, in the late 1800s.
  • 1:21 - 1:24
    He thought yellow fever was not transmitted by dirty clothing.
  • 1:24 - 1:26
    He thought it was transmitted by mosquitos.
  • 1:26 - 1:28
    And they laughed at him. For 20 years, they called this guy
  • 1:28 - 1:32
    "the mosquito man." But he ran an experiment in people,
  • 1:32 - 1:35
    right? He had this hypothesis, and he tested it in people.
  • 1:35 - 1:40
    So he got volunteers to go move to Cuba and live in tents
  • 1:40 - 1:43
    and be voluntarily infected with yellow fever.
  • 1:43 - 1:46
    So some of the people in some of the tents had dirty clothes
  • 1:46 - 1:47
    and some of the people were in tents that were full
  • 1:47 - 1:49
    of mosquitos that had been exposed to yellow fever.
  • 1:49 - 1:53
    And it definitively proved that it wasn't this magic dust
  • 1:53 - 1:56
    called fomites in your clothes that caused yellow fever.
  • 1:56 - 1:59
    But it wasn't until we tested it in people that we actually knew.
  • 1:59 - 2:01
    And this is what those people signed up for.
  • 2:01 - 2:04
    This is what it looked like to have yellow fever in Cuba
  • 2:04 - 2:09
    at that time. You suffered in a tent, in the heat, alone,
  • 2:09 - 2:12
    and you probably died.
  • 2:12 - 2:15
    But people volunteered for this.
  • 2:15 - 2:18
    And it's not just a cool example of a scientific design
  • 2:18 - 2:21
    of experiment in theory. They also did this beautiful thing.
  • 2:21 - 2:25
    They signed this document, and it's called an informed consent document.
  • 2:25 - 2:27
    And informed consent is an idea that we should be
  • 2:27 - 2:30
    very proud of as a society, right? It's something that
  • 2:30 - 2:32
    separates us from the Nazis at Nuremberg, who
  • 2:32 - 2:35
    enforced medical experimentation. It's the idea
  • 2:35 - 2:39
    that agreement to join a study without understanding isn't agreement.
  • 2:39 - 2:43
    It's something that protects us from harm, from hucksters,
  • 2:43 - 2:46
    from people that would try to hoodwink us into a clinical
  • 2:46 - 2:50
    study that we don't understand, or that we don't agree to.
  • 2:50 - 2:54
    And so you put together the threads of narrative hypothesis,
  • 2:54 - 2:57
    experimentation in humans, and informed consent,
  • 2:57 - 2:59
    and you get what we call clinical study, and it's how we do
  • 2:59 - 3:02
    the vast majority of medical work. It doesn't really matter
  • 3:02 - 3:05
    if you're in the north, the south, the east, the west.
  • 3:05 - 3:09
    Clinical studies form the basis of how we investigate,
  • 3:09 - 3:11
    so if we're going to look at a new drug, right,
  • 3:11 - 3:14
    we test it in people, we draw blood, we do experiments,
  • 3:14 - 3:16
    and we gain consent for that study, to make sure
  • 3:16 - 3:19
    that we're not screwing people over as part of it.
  • 3:19 - 3:22
    But the world is changing around the clinical study,
  • 3:22 - 3:26
    which has been fairly well established for decades,
  • 3:26 - 3:28
    if not 50 to 100 years.
  • 3:28 - 3:31
    So now we're able to gather data about our genomes,
  • 3:31 - 3:34
    but, as we saw earlier, our genomes aren't dispositive.
  • 3:34 - 3:36
    We're able to gather information about our environment.
  • 3:36 - 3:38
    And more importantly, we're able to gather information
  • 3:38 - 3:41
    about our choices, because it turns out that what we think of
  • 3:41 - 3:44
    as our health is more like the interaction of our bodies,
  • 3:44 - 3:47
    our genomes, our choices and our environment.
  • 3:47 - 3:50
    And the clinical methods that we've got aren't very good
  • 3:50 - 3:53
    at studying that because they are based on the idea
  • 3:53 - 3:55
    of person-to-person interaction. You interact
  • 3:55 - 3:57
    with your doctor and you get enrolled in the study.
  • 3:57 - 3:59
    So this is my grandfather. I actually never met him,
  • 3:59 - 4:03
    but he's holding my mom, and his genes are in me, right?
  • 4:03 - 4:06
    His choices ran through to me. He was a smoker,
  • 4:06 - 4:09
    like most people were. This is my son.
  • 4:09 - 4:12
    So my grandfather's genes go all the way through to him,
  • 4:12 - 4:15
    and my choices are going to affect his health.
  • 4:15 - 4:17
    The technology between these two pictures
  • 4:17 - 4:21
    could not be more different, but the methodology
  • 4:21 - 4:25
    for clinical studies has not radically changed over that time period.
  • 4:25 - 4:28
    We just have better statistics.
  • 4:28 - 4:31
    The way we gain informed consent was formed in large part
  • 4:31 - 4:34
    after World War II, around the time that picture was taken.
  • 4:34 - 4:38
    That was 70 years ago, and the way we gain informed consent,
  • 4:38 - 4:41
    this tool that was created to protect us from harm,
  • 4:41 - 4:44
    now creates silos. So the data that we collect
  • 4:44 - 4:47
    for prostate cancer or for Alzheimer's trials
  • 4:47 - 4:50
    goes into silos where it can only be used
  • 4:50 - 4:53
    for prostate cancer or for Alzheimer's research.
  • 4:53 - 4:56
    Right? It can't be networked. It can't be integrated.
  • 4:56 - 4:59
    It cannot be used by people who aren't credentialed.
  • 4:59 - 5:02
    So a physicist can't get access to it without filing paperwork.
  • 5:02 - 5:05
    A computer scientist can't get access to it without filing paperwork.
  • 5:05 - 5:10
    Computer scientists aren't patient. They don't file paperwork.
  • 5:10 - 5:14
    And this is an accident. These are tools that we created
  • 5:14 - 5:17
    to protect us from harm, but what they're doing
  • 5:17 - 5:19
    is protecting us from innovation now.
  • 5:19 - 5:23
    And that wasn't the goal. It wasn't the point. Right?
  • 5:23 - 5:25
    It's a side effect, if you will, of a power we created
  • 5:25 - 5:28
    to do us good.
  • 5:28 - 5:31
    And so if you think about it, the depressing thing is that
  • 5:31 - 5:33
    Facebook would never make a change to something
  • 5:33 - 5:36
    as important as an advertising algorithm
  • 5:36 - 5:40
    with a sample size as small as a Phase III clinical trial.
  • 5:40 - 5:44
    We cannot take the information from past trials
  • 5:44 - 5:48
    and put it together to form statistically significant samples.
  • 5:48 - 5:51
    And that sucks, right? So 45 percent of men develop
  • 5:51 - 5:54
    cancer. 38 percent of women develop cancer.
  • 5:54 - 5:57
    One in four men dies of cancer.
  • 5:57 - 6:00
    One in five women dies of cancer, at least in the United States.
  • 6:00 - 6:02
    And three out of the four drugs we give you
  • 6:02 - 6:06
    if you get cancer fail. And this is personal to me.
  • 6:06 - 6:08
    My sister is a cancer survivor.
  • 6:08 - 6:12
    My mother-in-law is a cancer survivor. Cancer sucks.
  • 6:12 - 6:14
    And when you have it, you don't have a lot of privacy
  • 6:14 - 6:17
    in the hospital. You're naked the vast majority of the time.
  • 6:17 - 6:21
    People you don't know come in and look at you and poke you and prod you,
  • 6:21 - 6:24
    and when I tell cancer survivors that this tool we created
  • 6:24 - 6:27
    to protect them is actually preventing their data from being used,
  • 6:27 - 6:29
    especially when only three to four percent of people
  • 6:29 - 6:32
    who have cancer ever even sign up for a clinical study,
  • 6:32 - 6:36
    their reaction is not, "Thank you, God, for protecting my privacy."
  • 6:36 - 6:39
    It's outrage
  • 6:39 - 6:41
    that we have this information and we can't use it.
  • 6:41 - 6:43
    And it's an accident.
  • 6:43 - 6:46
    So the cost in blood and treasure of this is enormous.
  • 6:46 - 6:50
    $226 billion a year is spent on cancer in the United States.
  • 6:50 - 6:53
    1,500 people a day die in the United States.
  • 6:53 - 6:56
    And it's getting worse.
  • 6:56 - 6:59
    So the good news is that some things have changed,
  • 6:59 - 7:00
    and the most important thing that's changed
  • 7:00 - 7:03
    is that we can now measure ourselves in ways
  • 7:03 - 7:06
    that used to be the dominion of the health system.
  • 7:06 - 7:08
    So a lot of people talk about it as digital exhaust.
  • 7:08 - 7:11
    I like to think of it as the dust that runs along behind my kid.
  • 7:11 - 7:13
    We can reach back and grab that dust,
  • 7:13 - 7:16
    and we can learn a lot about health from it, so if our choices
  • 7:16 - 7:18
    are part of our health, what we eat is a really important
  • 7:18 - 7:21
    aspect of our health. So you can do something very simple
  • 7:21 - 7:23
    and basic and take a picture of your food,
  • 7:23 - 7:26
    and if enough people do that, we can learn a lot about
  • 7:26 - 7:27
    how our food affects our health.
  • 7:27 - 7:32
    One interesting thing that came out of this — this is an app for iPhones called The Eatery —
  • 7:32 - 7:34
    is that we think our pizza is significantly healthier
  • 7:34 - 7:38
    than other people's pizza is. Okay? (Laughter)
  • 7:38 - 7:41
    And it seems like a trivial result, but this is the sort of research
  • 7:41 - 7:44
    that used to take the health system years
  • 7:44 - 7:46
    and hundreds of thousands of dollars to accomplish.
  • 7:46 - 7:50
    It was done in five months by a startup company of a couple of people.
  • 7:50 - 7:52
    I don't have any financial interest in it.
  • 7:52 - 7:55
    But more nontrivially, we can get our genotypes done,
  • 7:55 - 7:58
    and although our genotypes aren't dispositive, they give us clues.
  • 7:58 - 8:01
    So I could show you mine. It's just A's, T's, C's and G's.
  • 8:01 - 8:03
    This is the interpretation of it. As you can see,
  • 8:03 - 8:05
    I carry a 32 percent risk of prostate cancer,
  • 8:05 - 8:10
    22 percent risk of psoriasis and a 14 percent risk of Alzheimer's disease.
  • 8:10 - 8:12
    So that means, if you're a geneticist, you're freaking out,
  • 8:12 - 8:16
    going, "Oh my God, you told everyone you carry the ApoE E4 allele. What's wrong with you?"
  • 8:16 - 8:20
    Right? When I got these results, I started talking to doctors,
  • 8:20 - 8:22
    and they told me not to tell anyone, and my reaction is,
  • 8:22 - 8:26
    "Is that going to help anyone cure me when I get the disease?"
  • 8:26 - 8:29
    And no one could tell me yes.
  • 8:29 - 8:31
    And I live in a web world where, when you share things,
  • 8:31 - 8:34
    beautiful stuff happens, not bad stuff.
  • 8:34 - 8:36
    So I started putting this in my slide decks,
  • 8:36 - 8:39
    and I got even more obnoxious, and I went to my doctor,
  • 8:39 - 8:41
    and I said, "I'd like to actually get my bloodwork.
  • 8:41 - 8:43
    Please give me back my data." So this is my most recent bloodwork.
  • 8:43 - 8:46
    As you can see, I have high cholesterol.
  • 8:46 - 8:48
    I have particularly high bad cholesterol, and I have some
  • 8:48 - 8:51
    bad liver numbers, but those are because we had a dinner party with a lot of good wine
  • 8:51 - 8:54
    the night before we ran the test. (Laughter)
  • 8:54 - 8:59
    Right. But look at how non-computable this information is.
  • 8:59 - 9:02
    This is like the photograph of my granddad holding my mom
  • 9:02 - 9:05
    from a data perspective, and I had to go into the system
  • 9:05 - 9:07
    and get it out.
  • 9:07 - 9:11
    So the thing that I'm proposing we do here
  • 9:11 - 9:13
    is that we reach behind us and we grab the dust,
  • 9:13 - 9:16
    that we reach into our bodies and we grab the genotype,
  • 9:16 - 9:19
    and we reach into the medical system and we grab our records,
  • 9:19 - 9:22
    and we use it to build something together, which is a commons.
  • 9:22 - 9:25
    And there's been a lot of talk about commonses, right,
  • 9:25 - 9:28
    here, there, everywhere, right. A commons is nothing more
  • 9:28 - 9:31
    than a public good that we build out of private goods.
  • 9:31 - 9:34
    We do it voluntarily, and we do it through standardized
  • 9:34 - 9:37
    legal tools. We do it through standardized technologies.
  • 9:37 - 9:40
    Right. That's all a commons is. It's something that we build
  • 9:40 - 9:42
    together because we think it's important.
  • 9:42 - 9:45
    And a commons of data is something that's really unique,
  • 9:45 - 9:48
    because we make it from our own data. And although
  • 9:48 - 9:50
    a lot of people like privacy as their methodology of control
  • 9:50 - 9:53
    around data, and obsess around privacy, at least
  • 9:53 - 9:56
    some of us really like to share as a form of control,
  • 9:56 - 9:58
    and what's remarkable about digital commonses
  • 9:58 - 10:01
    is you don't need a big percentage if your sample size is big enough
  • 10:01 - 10:04
    to generate something massive and beautiful.
  • 10:04 - 10:07
    So not that many programmers write free software,
  • 10:07 - 10:09
    but we have the Apache web server.
  • 10:09 - 10:12
    Not that many people who read Wikipedia edit,
  • 10:12 - 10:16
    but it works. So as long as some people like to share
  • 10:16 - 10:19
    as their form of control, we can build a commons, as long as we can get the information out.
  • 10:19 - 10:22
    And in biology, the numbers are even better.
  • 10:22 - 10:24
    So Vanderbilt ran a study asking people, "We'd like to take
  • 10:24 - 10:28
    your biosamples, your blood, and share them in a biobank,"
  • 10:28 - 10:30
    and only five percent of the people opted out.
  • 10:30 - 10:33
    I'm from Tennessee. It's not the most science-positive state
  • 10:33 - 10:36
    in the United States of America. (Laughter)
  • 10:36 - 10:38
    But only five percent of the people wanted out.
  • 10:38 - 10:42
    So people like to share, if you give them the opportunity and the choice.
  • 10:42 - 10:47
    And the reason that I got obsessed with this, besides the obvious family aspects,
  • 10:47 - 10:50
    is that I spend a lot of time around mathematicians,
  • 10:50 - 10:53
    and mathematicians are drawn to places where there's a lot of data
  • 10:53 - 10:56
    because they can use it to tease signals out of noise.
  • 10:56 - 10:59
    And those correlations that they can tease out, they're not
  • 10:59 - 11:03
    necessarily causal agents, but math, in this day and age,
  • 11:03 - 11:05
    is like a giant set of power tools
  • 11:05 - 11:09
    that we're leaving on the floor, not plugged in, in health,
  • 11:09 - 11:11
    while we use hand saws.
  • 11:11 - 11:16
    If we have a lot of shared genotypes, and a lot of shared
  • 11:16 - 11:19
    outcomes, and a lot of shared lifestyle choices,
  • 11:19 - 11:21
    and a lot of shared environmental information, we can start
  • 11:21 - 11:24
    to tease out the correlations between subtle variations
  • 11:24 - 11:30
    in people, the choices they make and the health that they create as a result of those choices,
  • 11:30 - 11:32
    and there's open-source infrastructure to do all of this.
  • 11:32 - 11:35
    Sage Bionetworks is a nonprofit that's built a giant math system
  • 11:35 - 11:40
    that's waiting for data, but there isn't any.
  • 11:40 - 11:44
    So that's what I do. I've actually started what we think is
  • 11:44 - 11:48
    the world's first fully digital, fully self-contributed,
  • 11:48 - 11:53
    unlimited in scope, global in participation, ethically approved
  • 11:53 - 11:56
    clinical research study where you contribute the data.
  • 11:56 - 11:59
    So if you reach behind yourself and you grab the dust,
  • 11:59 - 12:01
    if you reach into your body and grab your genome,
  • 12:01 - 12:04
    if you reach into the medical system and somehow extract your medical record,
  • 12:04 - 12:08
    you can actually go through an online informed consent process --
  • 12:08 - 12:10
    because the donation to the commons must be voluntary
  • 12:10 - 12:13
    and it must be informed -- and you can actually upload
  • 12:13 - 12:16
    your information and have it syndicated to the
  • 12:16 - 12:19
    mathematicians who will do this sort of big data research,
  • 12:19 - 12:21
    and the goal is to get 100,000 in the first year
  • 12:21 - 12:24
    and a million in the first five years so that we have
  • 12:24 - 12:28
    a statistically significant cohort that you can use to take
  • 12:28 - 12:30
    smaller sample sizes from traditional research
  • 12:30 - 12:32
    and map them against,
  • 12:32 - 12:35
    so that you can use it to tease out those subtle correlations
  • 12:35 - 12:37
    between the variations that make us unique
  • 12:37 - 12:41
    and the kinds of health that we need to move forward as a society.
  • 12:41 - 12:44
    And I've spent a lot of time around other commons.
  • 12:44 - 12:47
    I've been around the early web. I've been around
  • 12:47 - 12:49
    the early creative commons world, and there's four things
  • 12:49 - 12:53
    that all of these share, which is, they're all really simple.
  • 12:53 - 12:56
    And so if you were to go to the website and enroll in this study,
  • 12:56 - 12:58
    you're not going to see something complicated.
  • 12:58 - 13:03
    But it's not simplistic. These things are weak intentionally,
  • 13:03 - 13:06
    right, because you can always add power and control to a system,
  • 13:06 - 13:10
    but it's very difficult to remove those things if you put them in at the beginning,
  • 13:10 - 13:12
    and so being simple doesn't mean being simplistic,
  • 13:12 - 13:15
    and being weak doesn't mean weakness.
  • 13:15 - 13:17
    Those are strengths in the system.
  • 13:17 - 13:20
    And open doesn't mean that there's no money.
  • 13:20 - 13:23
    Closed systems, corporations, make a lot of money
  • 13:23 - 13:26
    on the open web, and one of the reasons the open web lives
  • 13:26 - 13:29
    is that corporations have a vested interest in the openness
  • 13:29 - 13:31
    of the system.
  • 13:31 - 13:35
    And so all of these things are part of the clinical study that we've created,
  • 13:35 - 13:39
    so you can actually come in, all you have to be is 14 years old,
  • 13:39 - 13:41
    willing to sign a contract that says I'm not going to be a jerk,
  • 13:41 - 13:43
    basically, and you're in.
  • 13:43 - 13:45
    You can start analyzing the data.
  • 13:45 - 13:49
    You do have to solve a CAPTCHA as well. (Laughter)
  • 13:49 - 13:53
    And if you'd like to build corporate structures on top of it,
  • 13:53 - 13:56
    that's okay too. That's all in the consent,
  • 13:56 - 13:58
    so if you don't like those terms, you don't come in.
  • 13:58 - 14:01
    It's very much the design principles of a commons
  • 14:01 - 14:04
    that we're trying to bring to health data.
  • 14:04 - 14:07
    And the other thing about these systems is that it only takes
  • 14:07 - 14:10
    a small number of really unreasonable people working together
  • 14:10 - 14:13
    to create them. It didn't take that many people
  • 14:13 - 14:17
    to make Wikipedia Wikipedia, or to keep it Wikipedia.
  • 14:17 - 14:19
    And we're not supposed to be unreasonable in health,
  • 14:19 - 14:21
    and so I hate this word "patient."
  • 14:21 - 14:24
    I don't like being patient when systems are broken,
  • 14:24 - 14:27
    and health care is broken.
  • 14:27 - 14:31
    I'm not talking about the politics of health care, I'm talking about the way we scientifically approach health care.
  • 14:31 - 14:34
    So I don't want to be patient. And the task I'm giving to you
  • 14:34 - 14:37
    is to not be patient. So I'd like you to actually try,
  • 14:37 - 14:40
    when you go home, to get your data.
  • 14:40 - 14:43
    You'll be shocked and offended and, I would bet, outraged,
  • 14:43 - 14:46
    at how hard it is to get it.
  • 14:46 - 14:48
    But it's a challenge that I hope you'll take,
  • 14:48 - 14:51
    and maybe you'll share it. Maybe you won't.
  • 14:51 - 14:52
    If you don't have anyone in your family who's sick,
  • 14:52 - 14:55
    maybe you wouldn't be unreasonable. But if you do,
  • 14:55 - 14:57
    or if you've been sick, then maybe you would.
  • 14:57 - 15:01
    And we're going to be able to do an experiment in the next several months
  • 15:01 - 15:04
    that lets us know exactly how many unreasonable people are out there.
  • 15:04 - 15:06
    So this is the Athena Breast Health Network. It's a study
  • 15:06 - 15:10
    of 150,000 women in California, and they're going to
  • 15:10 - 15:12
    return all the data to the participants of the study
  • 15:12 - 15:15
    in a computable form, with one-clickability to load it into
  • 15:15 - 15:18
    the study that I've put together. So we'll know exactly
  • 15:18 - 15:20
    how many people are willing to be unreasonable.
  • 15:20 - 15:23
    So what I'd end with is,
  • 15:23 - 15:26
    the most beautiful thing I've learned since I quit my job
  • 15:26 - 15:29
    almost a year ago to do this, is that it really doesn't take
  • 15:29 - 15:33
    very many of us to achieve spectacular results.
  • 15:33 - 15:36
    You just have to be willing to be unreasonable,
  • 15:36 - 15:38
    and the risk we're running is not the risk those 14 men
  • 15:38 - 15:40
    who got yellow fever ran. Right?
  • 15:40 - 15:43
    It's to be naked, digitally, in public. So you know more
  • 15:43 - 15:46
    about me and my health than I know about you. It's asymmetric now.
  • 15:46 - 15:50
    And being naked and alone can be terrifying.
  • 15:50 - 15:55
    But to be naked in a group, voluntarily, can be quite beautiful.
  • 15:55 - 15:56
    And so it doesn't take all of us.
  • 15:56 - 15:59
    It just takes all of some of us. Thank you.
  • 15:59 - 16:05
    (Applause)
Title:
Let's pool our medical data
Speaker:
John Wilbanks
Description:

When you're getting medical treatment, or taking part in medical testing, privacy is important; strict laws limit what researchers can see and know about you. But what if your medical data could be used -- anonymously -- by anyone seeking to test a hypothesis? John Wilbanks wonders if the desire to protect our privacy is slowing research, and if opening up medical data could lead to a wave of health care innovation.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:25