
How online abuse of women has spiraled out of control

  • 0:00 - 0:03
    [This talk contains graphic language
    and descriptions of sexual violence
  • 0:03 - 0:05
    Viewer discretion is advised]
  • 0:05 - 0:06
    "Ashley Judd,
  • 0:06 - 0:08
    stupid fucking slut."
  • 0:10 - 0:13
    "You can't sue someone
    for calling them a cunt.
  • 0:13 - 0:15
    If you can't handle the Internet,
  • 0:15 - 0:16
    fuck off, whore."
  • 0:17 - 0:20
    "I wish Ashley Judd would
    die a horrible death.
  • 0:20 - 0:22
    She is the absolute worst."
  • 0:22 - 0:26
    "Ashley Judd, you're the reason
    women shouldn't vote."
  • 0:26 - 0:29
    "'Twisted' is such a bad movie,
  • 0:29 - 0:31
    I don't even want to rape it."
  • 0:33 - 0:34
    "Whatever you do,
  • 0:34 - 0:37
    don't tell Ashley Judd she'll
    die alone with a dried out vagina."
  • 0:39 - 0:41
    "If I had to fuck an older women,
  • 0:41 - 0:42
    oh my God,
  • 0:42 - 0:45
    I would fuck the shit out of Ashley Judd,
  • 0:45 - 0:48
    that bitch is hot AF.
  • 0:48 - 0:51
    The unforgivable shit I would do to her."
  • 0:53 - 0:59
    Online misogyny is a global
    gender rights tragedy,
  • 0:59 - 1:02
    and it is imperative that it ends.
  • 1:03 - 1:05
    (Applause)
  • 1:11 - 1:13
    Girls' and women's voices,
  • 1:13 - 1:17
    and our allies' voices
    are constrained in ways
  • 1:17 - 1:19
    that are personally, economically,
  • 1:19 - 1:21
    professionally and politically [damaging],
  • 1:21 - 1:24
    and when we curb abuse,
  • 1:24 - 1:26
    we will expand freedom.
  • 1:28 - 1:29
    I am a Kentucky basketball fan,
  • 1:29 - 1:31
    so on a fine March day last year,
  • 1:32 - 1:34
    I was doing one of the things I do best:
  • 1:34 - 1:36
    I was cheering for my Wildcats.
  • 1:36 - 1:37
    The daffodils were blooming,
  • 1:37 - 1:40
    but the referees were not blowing
    the whistle when I was telling them to.
  • 1:40 - 1:41
    (Laughter)
  • 1:42 - 1:43
    Funny,
  • 1:43 - 1:44
    they're very friendly to me
    before the opening [tip],
  • 1:45 - 1:47
    but they really ignore me during the game.
  • 1:47 - 1:48
    (Laughter)
  • 1:48 - 1:49
    Three of my players were bleeding,
  • 1:49 - 1:51
    so I did the next-best thing ...
  • 1:52 - 1:53
    I tweeted.
  • 1:54 - 1:56
    [@ArkRazorback dirty play can kiss
    my team's free throw making a --
  • 1:56 - 1:58
    @KySportsRadio @marchmadness
    @espn Bloodied 3 players so far.]
  • 1:58 - 2:01
    It is routine for me to be treated
    in the ways I've already described to you.
  • 2:01 - 2:03
    It happens to me every single day
  • 2:03 - 2:06
    on social media platforms
    such as Twitter and Facebook.
  • 2:07 - 2:09
    Since I joined Twitter in 2011,
  • 2:09 - 2:13
    misogyny and misogynists
    have amply demonstrated
  • 2:13 - 2:15
    they will dog my every step.
  • 2:15 - 2:17
    My spirituality,
  • 2:17 - 2:18
    my faith --
  • 2:18 - 2:19
    being a hillbilly, I can say that,
  • 2:19 - 2:20
    you can't --
  • 2:20 - 2:22
    all of it is fair game.
  • 2:23 - 2:26
    And I have responded to this
    with various strategies.
  • 2:26 - 2:28
    I've tried engaging people.
  • 2:28 - 2:32
    This one guy was sending me
    hypersexual, nasty stuff,
  • 2:32 - 2:35
    and there was a girl in his avatar,
  • 2:35 - 2:36
    and I wrote him back and said,
  • 2:37 - 2:39
    "Is that your daughter?
  • 2:40 - 2:43
    I feel a lot of fear that you
    may think about
  • 2:43 - 2:44
    and talk to women this way."
  • 2:44 - 2:46
    And he surprised me by saying,
  • 2:46 - 2:46
    "You know what?
  • 2:47 - 2:47
    You're right.
  • 2:47 - 2:48
    I apologize."
  • 2:48 - 2:51
    Sometimes people want
    to be held accountable.
  • 2:52 - 2:54
    This one guy was musing
    to I don't know who
  • 2:55 - 2:57
    that maybe I was the definition of a cunt.
  • 2:57 - 2:59
    I was married to a Scot for 14 years,
  • 2:59 - 3:02
    so I said, "Cunt means many
    different things in different countries."
  • 3:03 - 3:03
    (Laughter)
  • 3:04 - 3:07
    "But I'm pretty sure you epitomize
    the global standard of a dick."
  • 3:07 - 3:09
    (Laughter)
  • 3:09 - 3:10
    (Applause)
  • 3:13 - 3:14
    I've tried to rise above it,
  • 3:14 - 3:15
    I've tried to get in the trenches,
  • 3:15 - 3:19
    but mostly I would scroll through
    these social media platforms
  • 3:19 - 3:20
    with one eye partially closed,
  • 3:21 - 3:21
    trying not to see it,
  • 3:22 - 3:24
    but you can't make
    a cucumber out of a pickle.
  • 3:25 - 3:26
    What is seen goes in.
  • 3:26 - 3:27
    It's traumatic.
  • 3:28 - 3:30
    And I was always secretly hoping
    in some part of me
  • 3:31 - 3:36
    that what was being said to me
    and about me wasn't true.
  • 3:37 - 3:39
    Because even I,
  • 3:40 - 3:44
    an avowed, self-declared feminist,
  • 3:44 - 3:46
    who worships at the altar of Gloria --
  • 3:46 - 3:47
    (Laughter)
  • 3:49 - 3:51
    internalize the patriarchy.
  • 3:51 - 3:53
    This is really critical.
  • 3:53 - 3:55
    Patriarchy is not boys and men.
  • 3:56 - 3:58
    It is a system in which
    we all participate,
  • 3:59 - 4:00
    including me.
  • 4:02 - 4:04
    On that particular day,
  • 4:04 - 4:07
    for some reason that particular tweet
    after the basketball game
  • 4:08 - 4:10
    triggered something called a cyber mob.
  • 4:10 - 4:16
    This vitriolic, global outpouring
    of the most heinous hate speech:
  • 4:16 - 4:18
    death threats, rape threats,
  • 4:18 - 4:19
    and don't you know,
  • 4:19 - 4:21
    when I was sitting at home
    alone in my nightgown,
  • 4:21 - 4:22
    I got a phone call,
  • 4:22 - 4:24
    and it was my beloved former husband,
  • 4:24 - 4:25
    and he said on a voicemail,
  • 4:26 - 4:28
    "Loved one,
  • 4:28 - 4:31
    what is happening to you is not OK."
  • 4:31 - 4:39
    And there was something about him
    taking a stand for me that night
  • 4:39 - 4:41
    that allowed me
    to take a stand for myself.
  • 4:42 - 4:43
    And I started to write.
  • 4:43 - 4:47
    I started to write about sharing
    the fact that I'm a survivor
  • 4:47 - 4:48
    of all forms of sexual abuse,
  • 4:48 - 4:49
    including three rapes.
  • 4:50 - 4:52
    And the hate speech
    I get in response to that --
  • 4:52 - 4:55
    these are just some of the comments
    posted to news outlets.
  • 4:57 - 4:59
    Being told I'm a "snitch" is really fun.
  • 5:01 - 5:03
    [Jay: She enjoyed every second of it!!!!!]
  • 5:03 - 5:04
    (Audience) Jesus.
  • 5:04 - 5:05
    AJ: Thank you, Jesus.
  • 5:06 - 5:06
    (Laughter)
  • 5:06 - 5:08
    May your grace and mercy shine.
  • 5:08 - 5:10
    So I wrote this feminist op-ed,
  • 5:10 - 5:15
    it is entitled, "Forget Your Team:
  • 5:15 - 5:17
    It is Your Online Gender Violence
    Toward Girls and Women
  • 5:18 - 5:21
    that Can Kiss My Righteous Ass."
  • 5:21 - 5:22
    (Laughter)
  • 5:22 - 5:24
    (Applause)
  • 5:25 - 5:26
    And I did that alone,
  • 5:26 - 5:27
    and I published it alone,
  • 5:27 - 5:30
    because my chief advisor
    said, "Please don't,
  • 5:30 - 5:32
    the rain of retaliatory garbage
    that is inevitable,
  • 5:33 - 5:34
    I fear for you."
  • 5:34 - 5:35
    But I trust girls,
  • 5:35 - 5:36
    and I trust women,
  • 5:36 - 5:37
    and I trust our allies.
  • 5:38 - 5:38
    It was published,
  • 5:39 - 5:39
    it went viral,
  • 5:39 - 5:41
    it proves that every single day
  • 5:42 - 5:45
    online misogyny is a phenomenon
    endured by us all,
  • 5:45 - 5:46
    all over the world,
  • 5:46 - 5:48
    and when it is intersectional,
  • 5:48 - 5:49
    it is worse.
  • 5:49 - 5:50
    Sexual orientation,
  • 5:50 - 5:51
    gender identity,
  • 5:51 - 5:53
    race, ethnicity, religion --
  • 5:53 - 5:54
    you name it,
  • 5:54 - 5:57
    it amplifies the violence
    endured by girls and women,
  • 5:57 - 5:58
    and for our younger girls,
  • 5:58 - 6:00
    it is worse.
  • 6:01 - 6:03
    It's clearly traumatizing.
  • 6:03 - 6:05
    Our mental health,
  • 6:05 - 6:06
    our emotional well-being,
  • 6:07 - 6:08
    are so gravely affected,
  • 6:08 - 6:09
    because the threat of violence
  • 6:09 - 6:13
    is experienced
    neurobiologically as violence.
  • 6:14 - 6:15
    The cortisol shoots up,
  • 6:15 - 6:16
    the limbic system gets fired,
  • 6:17 - 6:19
    we lose productivity at work.
  • 6:20 - 6:22
    And let's talk about work.
  • 6:22 - 6:25
    Our ability to work is constrained.
  • 6:25 - 6:30
    Online searches of women applying for jobs
    reveal nude pictures of them,
  • 6:30 - 6:31
    false allegations --
  • 6:32 - 6:33
    they have STDs --
  • 6:33 - 6:38
    their addresses indicating
    that they are available for sex
  • 6:38 - 6:42
    with real examples of people
    showing up at their house
  • 6:42 - 6:44
    for said sex.
  • 6:44 - 6:49
    Our ability to go to school is impaired.
  • 6:49 - 6:55
    96 percent of all postings
    of sexual images of our young people:
  • 6:56 - 6:57
    girls.
  • 6:57 - 6:59
    Our girls.
  • 6:59 - 7:03
    Our boys are two-to-three
    times more likely --
  • 7:03 - 7:05
    non-consensually --
  • 7:05 - 7:07
    to share images.
  • 7:08 - 7:10
    And I want to say a word
    about revenge porn.
  • 7:11 - 7:13
    Part of what came out of this tweet
  • 7:13 - 7:16
    was my getting connected with allies
    and other activists
  • 7:16 - 7:19
    who are fighting for a safe
    and free Internet.
  • 7:19 - 7:22
    We started something
    called the Speech Project:
  • 7:22 - 7:23
    curbing abuse,
  • 7:23 - 7:24
    expanding freedom.
  • 7:24 - 7:28
    And that website provides
    a critical forum,
  • 7:28 - 7:33
    because there is no global, legal thing
    to help us figure this out.
  • 7:34 - 7:37
    But we do provide on that website
    a standardized list of definitions,
  • 7:37 - 7:40
    because it's hard to attack
    a behavior in the right way
  • 7:40 - 7:44
    if we're not all sharing a definition
    of what that behavior is.
  • 7:44 - 7:48
    And I learned that revenge porn
    is often dangerously misapplied.
  • 7:49 - 7:52
    It is the non-consensual
    sharing of an image
  • 7:52 - 7:56
    used tactically to shame
    and humiliate a girl or woman
  • 7:56 - 7:59
    that attempts to pornography us.
  • 7:59 - 8:03
    Our natural sexuality is --
  • 8:03 - 8:04
    I don't know about yours --
  • 8:04 - 8:06
    pretty gorgeous and wonderful.
  • 8:07 - 8:10
    And my expressing it does not
    pornography make.
  • 8:11 - 8:13
    (Applause)
  • 8:16 - 8:18
    So I have all these resources,
  • 8:18 - 8:19
    that I'm keenly aware --
  • 8:19 - 8:22
    so many people in the world do not.
  • 8:22 - 8:25
    I was able to start
    the Speech Project with colleagues.
  • 8:26 - 8:28
    I can often get a social media
    company's attention,
  • 8:28 - 8:31
    I have a wonderful visit
    to Facebook HQ coming up.
  • 8:34 - 8:39
    Hasn't helped the idiotic
    reporting standards yet ...
  • 8:39 - 8:45
    I actually pay someone to scrub
    my social media feeds,
  • 8:45 - 8:47
    attempting to spare my brain
  • 8:47 - 8:51
    the daily iterations
    of the trauma of hate speech.
  • 8:51 - 8:53
    And guess what?
  • 8:53 - 8:54
    I get hate speech for that.
  • 8:54 - 8:55
    "Oh, you live in an echo chamber."
  • 8:56 - 8:57
    Well, guess what?
  • 8:57 - 8:59
    Having someone post a photograph
    of me with my mouth open
  • 9:00 - 9:01
    saying they can't wait to cum on my face,
  • 9:02 - 9:04
    I have a right to set that boundary.
  • 9:04 - 9:06
    (Applause)
  • 9:10 - 9:13
    This distinction between virtual
    and real is specious
  • 9:13 - 9:14
    because guess what --
  • 9:14 - 9:17
    that actually happened to me once
    when I was a child,
  • 9:17 - 9:20
    and so that tweet brought up that trauma,
  • 9:20 - 9:21
    and I had to do work on that.
  • 9:21 - 9:22
    But you know what we do?
  • 9:22 - 9:25
    We take all of this hate speech,
  • 9:25 - 9:27
    and we disaggregate it,
  • 9:27 - 9:29
    and we code it,
  • 9:29 - 9:30
    and we give that data,
  • 9:31 - 9:33
    so that we understand
    the intersectionality of it.
  • 9:33 - 9:34
    When I get porn,
  • 9:34 - 9:36
    when it's about political affiliation,
  • 9:36 - 9:37
    when it's about age,
  • 9:37 - 9:39
    when it's about all of it.
  • 9:39 - 9:41
    We're going to win this fight.
  • 9:43 - 9:46
    There are a lot of solutions --
  • 9:46 - 9:47
    thank goodness.
  • 9:48 - 9:50
    I'm going to offer just a few,
  • 9:50 - 9:55
    and of course I challenge you
    to create and contribute your own.
  • 9:55 - 9:56
    Number one:
  • 9:56 - 9:59
    we have to start
    with digital media literacy,
  • 9:59 - 10:02
    and clearly it must have a gendered lens.
  • 10:02 - 10:05
    Kids, schools, caregivers, parents:
  • 10:05 - 10:06
    it's essential.
  • 10:07 - 10:09
    Two:
  • 10:09 - 10:12
    should we talk about our friends in tech?
  • 10:12 - 10:15
    Said with dignity and respect,
  • 10:15 - 10:19
    the sexism in your workplaces must end.
  • 10:20 - 10:22
    (Applause)
  • 10:23 - 10:24
    (Cheers)
  • 10:24 - 10:25
    EDGE,
  • 10:26 - 10:28
    the global standard for gender equality,
  • 10:28 - 10:30
    is the minimum standard.
  • 10:30 - 10:32
    And guess what, Silicon Valley,
  • 10:32 - 10:34
    if L'Oreal in India
  • 10:34 - 10:35
    and the Philippines,
  • 10:35 - 10:36
    and Brazil,
  • 10:36 - 10:38
    and in Russia can do it,
  • 10:38 - 10:40
    you can, too.
  • 10:41 - 10:42
    Enough excuses.
  • 10:42 - 10:47
    Only when women have critical mass
    in every department in your companies,
  • 10:47 - 10:50
    including building platforms
    from the ground up,
  • 10:50 - 10:54
    will the conversations about priorities
    and solutions change.
  • 10:54 - 10:57
    And more love for my friends in tech:
  • 10:57 - 11:00
    profiteering off misogyny
    in video games must end.
  • 11:01 - 11:04
    I'm so tired of hearing you talk
    to me at cocktail parties --
  • 11:05 - 11:07
    like you did a couple
    weeks ago in Aspen --
  • 11:07 - 11:10
    about how deplorable #gamergate was
  • 11:10 - 11:12
    when you're still making billions
    of dollars off games
  • 11:13 - 11:15
    that maim and dump women for sport.
  • 11:16 - 11:17
    Basta,
  • 11:17 - 11:18
    as the Italians would say.
  • 11:19 - 11:19
    Enough.
  • 11:20 - 11:22
    (Applause)
  • 11:24 - 11:27
    Our friends in law enforcement
    have much to do,
  • 11:27 - 11:29
    because we've seen
  • 11:29 - 11:33
    that online violence is an extension
    of in-person violence.
  • 11:34 - 11:36
    In our country,
  • 11:36 - 11:41
    more girls and women have been
    murdered by their intimate partners
  • 11:41 - 11:43
    than died on 9/11
  • 11:43 - 11:48
    and have died since in Afghanistan
    and Iraq combined.
  • 11:48 - 11:50
    And it's not "cool" to say that,
  • 11:50 - 11:51
    but it is true.
  • 11:52 - 11:55
    We care so much geopolitically
    about what men are doing over there
  • 11:56 - 11:58
    to women over there ...
  • 11:58 - 12:00
    In 2015,
  • 12:00 - 12:06
    72,828 women used intimate
    partner violence services in this country.
  • 12:07 - 12:10
    That is not counting the girls
    and women and boys who needed them.
  • 12:11 - 12:15
    Law enforcement must be empowered
  • 12:15 - 12:17
    with up-to-date Internet technology,
  • 12:17 - 12:18
    the devices,
  • 12:18 - 12:20
    and an understanding
    of these platforms --
  • 12:20 - 12:21
    how they work.
  • 12:22 - 12:24
    The police wanted to be helpful
    when Amanda Hess called
  • 12:25 - 12:27
    about the death threat
    she was getting on Twitter,
  • 12:27 - 12:29
    but they couldn't really when they said,
  • 12:29 - 12:31
    "What's Twitter?"
  • 12:33 - 12:37
    Our legislators must write and pass
    astute legislation
  • 12:37 - 12:39
    that reflects today's technology
  • 12:39 - 12:43
    and our notions of free and hate speech.
  • 12:43 - 12:44
    In New York recently,
  • 12:44 - 12:47
    the law could not be applied
    to a perpetrator
  • 12:47 - 12:49
    because the crimes must
    have been committed --
  • 12:49 - 12:50
    even if it was anonymous --
  • 12:50 - 12:55
    they must have been committed
    by telephone, in mail,
  • 12:55 - 12:57
    by telegraph --
  • 12:57 - 12:58
    (Laughter)
  • 13:01 - 13:04
    The language must be
    technologically neutral.
  • 13:06 - 13:08
    So apparently,
  • 13:08 - 13:09
    I've got a pretty bold voice.
  • 13:10 - 13:12
    So let's talk about your friends ...
  • 13:12 - 13:14
    white men.
  • 13:15 - 13:18
    You have a role to play
    and a choice to make.
  • 13:19 - 13:21
    You can do something,
  • 13:21 - 13:23
    or you can do nothing.
  • 13:25 - 13:26
    We're cool in this room,
  • 13:26 - 13:27
    but when this goes out
    everyone will say,
  • 13:27 - 13:28
    "Oh my god,
  • 13:28 - 13:30
    she's a reverse racist."
  • 13:31 - 13:33
    That quote was said by a white man,
  • 13:33 - 13:34
    Robert Moritz,
  • 13:35 - 13:37
    chairperson of PricewaterhouseCoopers;
  • 13:37 - 13:39
    he asked me to include it in my talk.
  • 13:41 - 13:45
    We need to grow support lines
    and help groups,
  • 13:45 - 13:48
    so victims can help each other
  • 13:48 - 13:50
    when their lives and finances
    have been derailed.
  • 13:50 - 13:55
    We must as individuals disrupt
    gender violence as it is happening.
  • 13:55 - 13:57
    92 percent of young people,
  • 13:57 - 13:57
    29 and under,
  • 13:58 - 13:59
    witness it.
  • 13:59 - 14:02
    72 percent of us have witnessed it.
  • 14:02 - 14:04
    We must have the courage and urgency
  • 14:04 - 14:07
    to practice stopping it
    as it is unfolding.
  • 14:08 - 14:11
    And lastly,
  • 14:11 - 14:13
    believe her.
  • 14:14 - 14:15
    Believe her.
  • 14:15 - 14:18
    (Applause)
  • 14:22 - 14:26
    This is fundamentally a problem
    of human interaction.
  • 14:27 - 14:31
    I believe that human interaction
    is at the core of our healing.
  • 14:31 - 14:35
    Trauma not transformed
    will be trauma transferred.
  • 14:36 - 14:37
    Edith Wharton said,
  • 14:37 - 14:39
    "The end is latent in the beginning,"
  • 14:39 - 14:44
    so we are going to end this talk
    replacing hate speech with love speech.
  • 14:44 - 14:46
    Because I get lonely in this,
  • 14:46 - 14:48
    but I know that we are allies.
  • 14:49 - 14:51
    I recently learned
  • 14:51 - 14:55
    about how gratitude and affirmations
    offset negative interactions.
  • 14:55 - 14:59
    It takes five of those to offset
    one negative interaction,
  • 14:59 - 15:00
    and gratitude in particular --
  • 15:00 - 15:03
    free, available globally
    any time, anywhere,
  • 15:03 - 15:05
    to anyone in any dialect.
  • 15:05 - 15:08
    It fires the pregenual anterior cingulate,
  • 15:09 - 15:10
    a watershed part of the brain,
  • 15:10 - 15:13
    that floods it with great, good stuff.
  • 15:13 - 15:15
    So I'm going to say awesome
    stuff about myself.
  • 15:16 - 15:18
    I would like for you
    to reflect it back to me.
  • 15:18 - 15:21
    It might sound something like this --
  • 15:21 - 15:22
    (Laughter)
  • 15:22 - 15:24
    I am a powerful and strong woman,
  • 15:24 - 15:26
    and you would say, "Yes you are."
  • 15:26 - 15:27
    (Audience) Yes you are.
  • 15:27 - 15:29
    My mama loves me.
  • 15:29 - 15:30
    (Audience) Yes she does.
  • 15:31 - 15:33
    I did a great job with my talk.
  • 15:33 - 15:34
    (Audience) Yes you did.
  • 15:34 - 15:37
    I have a right to be here.
  • 15:37 - 15:39
    (Audience) Yes you do.
  • 15:39 - 15:40
    I'm really cute.
  • 15:40 - 15:41
    (Laughter)
  • 15:42 - 15:43
    (Audience) Yes you are.
  • 15:43 - 15:45
    God does good work.
  • 15:45 - 15:46
    (Audience) Yes he does.
  • 15:47 - 15:48
    And I love you.
  • 15:49 - 15:50
    Thank you so much
  • 15:50 - 15:52
    for letting me be of service.
  • 15:52 - 15:55
    (Applause)
Title:
How online abuse of women has spiraled out of control
Speaker:
Ashley Judd
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:10