
"Not Cool Bro" Theory of Privacy - Brad Rosen at TEDxYale

  • 0:10 - 0:12
    Hi everybody. My name is Brad
  • 0:12 - 0:14
    and I'm here to talk to you about privacy.
  • 0:14 - 0:16
    So just a quick caveat
  • 0:16 - 0:19
the views expressed are my own,
not those of either of my employers,
  • 0:19 - 0:22
    so impute the crazy only to me.
  • 0:22 - 0:24
    So I wanna talk to you a little about how
  • 0:24 - 0:26
    privacy is changing and how the ways
  • 0:26 - 0:28
    that we think about what is private
  • 0:28 - 0:31
    has started to morph
    and where I think it's going.
  • 0:31 - 0:33
    And so my premise to you is we've gone
  • 0:33 - 0:36
    from the society of privacy
    laws to privacy norms
  • 0:36 - 0:39
    and that we can encapsulate this
  • 0:39 - 0:42
    with a simple phrase of
    "Not cool bro" or
  • 0:42 - 0:45
    "bro act" or whatever
    the proper term is
  • 0:45 - 0:48
    as the case may be.
So now you're gonna ask,
  • 0:48 - 0:50
    What do I mean when I say
  • 0:50 - 0:53
    "Not cool bro"? So, when I say,
  • 0:53 - 0:55
    "Not cool bro",
    I'll give you an example:
  • 0:55 - 0:56
    You are on Facebook and
  • 0:56 - 0:58
    you have a bad break-up or fight with
  • 0:58 - 1:00
    one of your friends.
    And you block
  • 1:00 - 1:02
    that person and so they
    can no longer see your wall.
  • 1:02 - 1:05
    Someone else who still has access to your
  • 1:05 - 1:07
    Facebook wall, goes in, and either
  • 1:07 - 1:09
    copies and pastes the entire thing,
  • 1:09 - 1:12
    or takes a screenshot every day
  • 1:12 - 1:14
    and mails it to that person.
  • 1:14 - 1:16
    OK, so our initial response is:
  • 1:16 - 1:19
    That's not cool. Not cool bro! Not cool!
  • 1:19 - 1:22
    We get the visceral feeling
  • 1:22 - 1:24
that that's somehow a privacy violation.
  • 1:24 - 1:26
We get the feeling that something
  • 1:26 - 1:28
    unseemly is happening but we don't know
  • 1:28 - 1:30
quite how to talk about it.
  • 1:30 - 1:32
And that is what I mean by
  • 1:32 - 1:34
    "Not cool bro" privacy, because the law
  • 1:34 - 1:36
    doesn't recognize that as a real
  • 1:36 - 1:37
    privacy interest.
  • 1:37 - 1:39
So first it makes a little sense
  • 1:39 - 1:41
    to talk about what the law does recognize.
  • 1:41 - 1:43
And most of our privacy law comes from
  • 1:43 - 1:45
    the criminal context.
  • 1:45 - 1:49
    All privacy was literally --
    you had to sneak up to
  • 1:49 - 1:51
someone's house and eavesdrop.
  • 1:51 - 1:53
So the eaves of a house were where the water falls.
  • 1:53 - 1:55
    So you would stand outside
  • 1:55 - 1:56
    and you would listen.
  • 1:56 - 1:57
    You would climb over their fence,
  • 1:57 - 1:59
    shimmy across their lawn,
  • 1:59 - 2:00
nuzzle up to the side of the house,
  • 2:00 - 2:02
    maybe like you know poke a hole
  • 2:02 - 2:04
in a window and then
    you can eavesdrop.
  • 2:04 - 2:05
    That's what you had to do
  • 2:05 - 2:07
    to violate someone's privacy.
  • 2:07 - 2:08
    So privacy was really about access
  • 2:08 - 2:10
    to information.
  • 2:10 - 2:12
    And we didn't need special rules
  • 2:12 - 2:14
    regulating access to information because
  • 2:14 - 2:16
    you couldn't violate someone's privacy
  • 2:16 - 2:18
in the 1700s, unless you showed up
  • 2:18 - 2:20
    at their front gate.
  • 2:20 - 2:22
    Right? There was no Facebook.
  • 2:22 - 2:25
    The only wall that happened in 1700
  • 2:25 - 2:26
    was literally the wall
  • 2:26 - 2:28
    outside the carriage house.
  • 2:28 - 2:33
    So we go from that to this modern era
  • 2:33 - 2:35
    or pre-modern era in which we have
  • 2:35 - 2:37
    ways of getting information
  • 2:37 - 2:39
    that don't necessarily involve
  • 2:39 - 2:42
    the traditional structures of space.
  • 2:42 - 2:43
    So there is a very famous case
  • 2:43 - 2:45
    involving a guy
  • 2:45 - 2:46
    who walks into the phone booth
  • 2:46 - 2:49
    and he closes the door behind him
    and then he does some illegal activity
  • 2:49 - 2:51
    and the police are listening
    and they don't have a warrant.
  • 2:51 - 2:54
    And by the way in the law
    you need to get a warrant
  • 2:54 - 2:55
    before you can listen to any of this stuff.
  • 2:55 - 2:56
    And for those of you who've seen
  • 2:56 - 2:57
    CSI or Law and Order -
  • 2:57 - 3:00
the judges who give out warrants
    are sometimes like Oprah:
  • 3:00 - 3:02
You get a warrant, and you get a warrant,
  • 3:02 - 3:04
    and everybody gets a warrant.
  • 3:04 - 3:06
    (Laughter)
  • 3:06 - 3:08
    But at least we still have that
  • 3:08 - 3:09
nominal process, that privacy
  • 3:09 - 3:11
    is still being protected,
  • 3:11 - 3:13
    right, there is a reasonableness here.
  • 3:13 - 3:15
Well, this guy -- his name is Katz,
  • 3:15 - 3:18
    so you can have all sorts of phonetics --
    it's spelled with a K.
  • 3:18 - 3:20
    He had a reasonable expectation
  • 3:20 - 3:22
    of privacy here. Two types:
  • 3:22 - 3:24
It was subjective: meaning
  • 3:24 - 3:26
    he personally believed that it was private
  • 3:26 - 3:27
because he closed the booth door behind him.
  • 3:27 - 3:30
    And then two: objective.
  • 3:30 - 3:31
    Society as a whole is willing to say,
  • 3:31 - 3:33
    You know what, yeah, if you are going
  • 3:33 - 3:37
    into a telephone booth and close the door
    that should be private.
  • 3:37 - 3:39
    Now, what about if we apply this to Facebook?
  • 3:39 - 3:42
    Right? This is the "not cool bro"
    version of privacy.
  • 3:42 - 3:44
    You just take it one more step.
  • 3:44 - 3:46
    In the old version of privacy,
  • 3:46 - 3:48
    it's all about control over
  • 3:48 - 3:50
    the means of accessing information.
  • 3:50 - 3:53
    In the new version of privacy,
  • 3:53 - 3:54
    it's all about control over
  • 3:54 - 3:58
who can get the information
that you give out.
  • 3:58 - 4:00
    So in the old version of privacy,
    if I tell someone,
  • 4:00 - 4:03
    "Hey I got an F on a test."
  • 4:03 - 4:05
    That person can tell anyone else
  • 4:05 - 4:06
    and it's not really considered
  • 4:06 - 4:08
    a violation of privacy.
  • 4:08 - 4:09
    I didn't keep it private,
  • 4:09 - 4:11
    because I allowed someone else access.
  • 4:11 - 4:13
    In the new version of privacy,
  • 4:13 - 4:16
    when you post something
    to your Facebook wall,
  • 4:16 - 4:19
    if you prevented the rest of the world
    from seeing it,
  • 4:19 - 4:22
is there an implied understanding with
anyone else you've given access to
  • 4:22 - 4:24
    that they shouldn't re-share it?
  • 4:24 - 4:27
    Or another example:
    If you are on Twitter
  • 4:27 - 4:29
    and you've got a protected Twitter stream
  • 4:29 - 4:30
    that no one can see
  • 4:30 - 4:33
    unless they expressly follow you
    and you allow them.
  • 4:33 - 4:34
And someone just sort of re-tweets
  • 4:34 - 4:36
    all of your protected tweets.
  • 4:36 - 4:38
    You would say the exact same thing.
  • 4:38 - 4:39
    But the understanding would've been:
  • 4:39 - 4:41
    Wait a minute, I only let you see
  • 4:41 - 4:43
    my Twitter stream, because
  • 4:43 - 4:45
    I thought you weren't going to re-tweet it.
  • 4:45 - 4:48
    You violated some sort of implied understanding
    we had.
  • 4:48 - 4:50
    So, now we have our modern eavesdropper,
  • 4:50 - 4:52
who is on your Facebook wall
  • 4:52 - 4:54
    posting again. So this is
  • 4:54 - 4:56
    our modern eavesdropper over here
  • 4:56 - 4:58
    in conversation. So how do we get
  • 4:58 - 5:00
    to a place in which the law
  • 5:00 - 5:02
can catch up to where we are?
  • 5:02 - 5:06
Because we think of this
    new norm, this new idea
  • 5:06 - 5:08
that you won't re-share something
  • 5:08 - 5:10
    I only share with you.
  • 5:10 - 5:11
    How do we get there?
  • 5:11 - 5:13
    And there is actually an interesting way
  • 5:13 - 5:16
    and I'll take it back a little bit
    to give you an example.
  • 5:16 - 5:18
    When Facebook started out
  • 5:18 - 5:20
    you had to be a member of a network
  • 5:20 - 5:22
    to just look at someone's profile.
  • 5:22 - 5:25
    And I remember having a job,
  • 5:25 - 5:28
    where someone in HR knew
    that I went to Yale
  • 5:28 - 5:30
    and sent me an e-mail
    and said:
  • 5:30 - 5:31
    Brad, we know you went to Yale,
  • 5:31 - 5:35
    could you log into Facebook
    and print out a copy
  • 5:35 - 5:37
    of this applicant's Facebook page?
  • 5:37 - 5:39
    We'd like to see it.
  • 5:39 - 5:40
    So, by the way, for those of you
  • 5:40 - 5:43
    who thought that didn't happen,
    it was happening in 2004.
  • 5:43 - 5:44
    So you better believe it's happening now.
  • 5:44 - 5:49
    That said, I recoiled in like
    shock and horror.
  • 5:49 - 5:52
    I was like that would be a violation
    of that person's privacy.
  • 5:52 - 5:53
    It would be a betrayal of trust.
  • 5:53 - 5:57
    But the idea was back then,
    what happened in the Yale network,
  • 5:57 - 5:58
    stayed in the Yale network.
  • 5:58 - 6:00
    It's kind of like Vegas.
    (Laughter)
  • 6:00 - 6:03
    And so, there was a knowledge
    that was like,
  • 6:03 - 6:05
    Hey, not cool, bro!
  • 6:05 - 6:06
    You knew that you only had access to this.
  • 6:06 - 6:09
    There was a reciprocal understanding.
  • 6:09 - 6:11
    You won't tell other people
    what goes on in the Yale network,
  • 6:11 - 6:14
    and I won't tell other people
    what goes on in the Yale network.
  • 6:14 - 6:15
    And so, if you think about it
  • 6:15 - 6:17
    there is an implied commitment
  • 6:17 - 6:18
    when you join any of these social networks,
  • 6:18 - 6:20
    that if you are not supposed
  • 6:20 - 6:22
    to re-share information, you won't.
  • 6:22 - 6:24
    And there is another area of the law
  • 6:24 - 6:27
    that actually has really
    had this transformation,
  • 6:27 - 6:29
where they went from very formal --
you had to have explicit understandings --
  • 6:29 - 6:31
    to just anything goes.
  • 6:31 - 6:32
    And that's products liability.
  • 6:32 - 6:34
    As crazy as this may sound
  • 6:34 - 6:35
    there was a time where if you bought
  • 6:35 - 6:39
a can of Coca-Cola and it blew up
in your hand as you were drinking it,
  • 6:39 - 6:40
the only person you could sue
  • 6:40 - 6:42
    was the bodega or bodegua,
  • 6:42 - 6:45
    as the case may be, that you bought it from.
  • 6:45 - 6:48
    And over time courts sort of didn't like this.
  • 6:48 - 6:50
    They said, Well, there is an implied contract
  • 6:50 - 6:52
    between the original manufacturer
    and each step in the chain
  • 6:52 - 6:54
of distribution, so that the ultimate consumer
  • 6:54 - 6:57
    does not need to be
    in contractual privity
  • 6:57 - 6:58
    with the original manufacturer.
  • 6:58 - 7:00
    Which is the fancy way of saying:
  • 7:00 - 7:02
    We are going to say
    that there is an implied contract
  • 7:02 - 7:04
    that runs all the way
    through all these steps in the middle.
  • 7:04 - 7:06
    And when you finally buy
    that can of coke,
  • 7:06 - 7:09
    you get an implied contract from Coke.
  • 7:09 - 7:12
    Finally a court in California said,
Enough is enough.
  • 7:12 - 7:14
    We are done pretending,
    we are done making up
  • 7:14 - 7:16
    these implied contracts.
  • 7:16 - 7:18
    We are just gonna say
    strict liability applies
  • 7:18 - 7:22
    if you make a product
    and you put it out in the universe.
  • 7:22 - 7:23
    You have a reasonable understanding
  • 7:23 - 7:25
    that if somebody gets hurt
    by that product
  • 7:25 - 7:27
    they are going to sue you.
  • 7:27 - 7:29
    We can do the exact same thing
    with privacy.
  • 7:29 - 7:31
    If you join a social network,
  • 7:31 - 7:33
    Facebook, Twitter, Google+
  • 7:33 - 7:37
    and you join that network
    knowing that there are privacy settings
  • 7:37 - 7:40
    and knowing that other people
    are sharing information with you,
  • 7:40 - 7:44
    but at the same time
    prohibiting other people
  • 7:44 - 7:46
from accessing that information,
  • 7:46 - 7:49
it's a violation of -- whatever
you wanna phrase it as --
  • 7:49 - 7:52
that person's
expressed understanding
  • 7:52 - 7:54
    that they would only share
    the information with you
  • 7:54 - 7:55
    if you didn't re-share it.
  • 7:55 - 7:58
    Now, we can get there without
    waiting, for example,
  • 7:58 - 8:02
    Facebook could put a little lock
    or a hash icon on every post
  • 8:02 - 8:05
    on every element of Facebook
    that's been shared with you
  • 8:05 - 8:08
    that would let you know
    whether or not it was public,
  • 8:08 - 8:09
    whether or not it was OK to re-share.
  • 8:09 - 8:10
    Twitter already does this.
  • 8:10 - 8:13
You cannot one-click re-tweet
    a protected tweet.
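
[Editor's note: to make the mechanism described above concrete, here is a minimal sketch in Python. The Post type, field names, and values are assumptions for illustration only, not Facebook's or Twitter's actual API: each post carries a visibility flag, the client renders a badge from it, and one-click re-sharing is only offered for public posts, the way Twitter refuses to retweet a protected tweet.]

```python
# A minimal sketch, assuming a hypothetical Post type: each post carries a
# visibility flag, the client renders a lock/public badge from it, and
# one-click re-sharing is only offered for public posts.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    visibility: str  # "public" or "restricted" -- illustrative values

def reshare_badge(post: Post) -> str:
    """Badge a client might show next to the post."""
    return "public" if post.visibility == "public" else "restricted (lock)"

def can_one_click_reshare(post: Post) -> bool:
    """Only public posts get a one-click re-share button."""
    return post.visibility == "public"

if __name__ == "__main__":
    wall_post = Post("alice", "Had a rough week.", "restricted")
    print(reshare_badge(wall_post))          # restricted (lock)
    print(can_one_click_reshare(wall_post))  # False
```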
  • 8:13 - 8:18
    And Craigslist has code-matching that
    will actually look at posts
  • 8:18 - 8:20
    you've previously made
    and if you're re-posting
  • 8:20 - 8:22
    similar content it will stop you.
  • 8:22 - 8:24
    Facebook and Twitter
    could do the exact same thing
  • 8:24 - 8:27
    if they see you trying
    to use copy and paste
  • 8:27 - 8:29
    to get around these mechanisms.
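
[Editor's note: a rough sketch of that kind of check, assuming a hypothetical allow_post hook and an arbitrary 0.9 similarity threshold; this is not Craigslist's real algorithm. The idea is to compare a new submission against restricted text the user has been shown and block near-copies pasted around the re-share controls.]

```python
# A rough sketch of a similarity check: normalize whitespace and case, then
# use difflib's SequenceMatcher ratio. The 0.9 threshold and function names
# are assumptions for illustration only.

from difflib import SequenceMatcher

def too_similar(new_text: str, restricted_text: str, threshold: float = 0.9) -> bool:
    """True if the new post looks like a near-copy of restricted content."""
    a = " ".join(new_text.lower().split())
    b = " ".join(restricted_text.lower().split())
    return SequenceMatcher(None, a, b).ratio() >= threshold

def allow_post(new_text: str, restricted_texts: list[str]) -> bool:
    """Refuse a submission that closely matches any restricted text."""
    return not any(too_similar(new_text, t) for t in restricted_texts)

if __name__ == "__main__":
    seen = ["I got an F on the test, please don't tell anyone."]
    print(allow_post("i got an F on the test, please dont tell anyone", seen))  # False
    print(allow_post("Lovely weather in New Haven today.", seen))               # True
```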
  • 8:29 - 8:30
    The other thing they could do is,
  • 8:30 - 8:31
    we could amend our terms of service,
  • 8:31 - 8:33
we could make it an express violation of the terms of service
  • 8:33 - 8:37
    to re-share information
    that is not supposed to be re-shared.
  • 8:37 - 8:39
    Like on Google+,
    where you can actually click
  • 8:39 - 8:41
    "Disable re-sharing"
    and then no one else can.
  • 8:41 - 8:45
    So there is a sense
    that we can get there.
  • 8:45 - 8:48
    We also can maybe get there in law.
  • 8:48 - 8:50
    It won't necessarily happen
  • 8:50 - 8:52
    right away, but in a recent case
  • 8:52 - 8:54
    the Supreme Court is starting
    to go there.
  • 8:54 - 8:57
In the Jones case,
which dealt with GPS tracking,
  • 8:57 - 8:59
there was just the sort of, you know, suggestion that maybe it's time
  • 8:59 - 9:03
    that we start to rethink this notion
  • 9:03 - 9:04
    that if you share something
  • 9:04 - 9:07
    with one person it is no longer private.
  • 9:07 - 9:11
    Because our societal expectations
    have changed.
  • 9:11 - 9:14
    We've moved from law to norms.
  • 9:14 - 9:19
The norm that it's "not cool, bro" to re-share
is how we now think about privacy.
  • 9:19 - 9:22
    And as a result our law
    is lagging behind a little bit.
  • 9:22 - 9:25
    And although we are not there yet
    with our laws,
  • 9:25 - 9:27
we have interim measures
we can use from code.
  • 9:27 - 9:31
    So the interesting thing
    to see is where we head
  • 9:31 - 9:34
    now that we are a society of norms
  • 9:34 - 9:36
    and when we think about privacy
    as norm-based.
  • 9:37 - 9:40
    (Applause)
Title:
"Not Cool Bro" Theory of Privacy - Brad Rosen at TEDxYale
Description:

Brad Rosen is a lecturer in the computer science department at Yale University. He talks about how privacy is changing, how the ways we think about what is private are beginning to morph, and where he thinks it's going with his "not cool, bro" theory of privacy.

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
09:46
