The conversation we're not having about digital child abuse

  • 0:00 - 0:04
    [This talk contains graphic content.
    Viewer discretion is advised.]
  • 0:04 - 0:07
    This is Nina Rodríguez's Facebook profile.
  • 0:09 - 0:11
    This person had three different profiles
  • 0:11 - 0:16
    and 890 kids between 8 and 13 years old
    on her friends list.
  • 0:19 - 0:23
    These are excerpts of a chat
    with one of those kids.
  • 0:25 - 0:28
    This is an exact copy of the chat.
  • 0:29 - 0:30
    It's part of the case file.
  • 0:33 - 0:36
    This kid started sending private photos
  • 0:36 - 0:39
    until his family realized
    what was going on.
  • 0:40 - 0:44
    The police report and subsequent
    investigation led them to a house.
  • 0:45 - 0:48
    This was the girl's bedroom.
  • 0:50 - 0:55
    Nina Rodríguez was actually
    a 24-year-old man
  • 0:55 - 0:58
    who used to do this with lots of kids.
  • 1:01 - 1:04
    Micaela Ortega was 12 years old
  • 1:05 - 1:07
    when she went to meet
    her new Facebook friend,
  • 1:07 - 1:09
    also 12.
  • 1:10 - 1:12
    "Rochi de River," was her name.
  • 1:14 - 1:18
    She actually met Jonathan Luna,
    who was 26 years old.
  • 1:19 - 1:20
    When they finally caught him,
  • 1:20 - 1:25
    he confessed that he killed the girl
    because she refused to have sex with him.
  • 1:27 - 1:30
    He had four Facebook profiles
  • 1:30 - 1:33
    and 1,700 women on his contact list;
  • 1:35 - 1:37
    90 percent of them
    were under 13 years old.
  • 1:41 - 1:43
    These are two different
    cases of "grooming":
  • 1:44 - 1:47
    an adult contacts a kid
    through the internet,
  • 1:48 - 1:53
    and through manipulation or lying,
    leads that kid into sexual territory --
  • 1:53 - 1:55
    from talking about sex
  • 1:55 - 1:57
    to sharing private photos,
  • 1:57 - 1:59
    recording the kid using a webcam
  • 1:59 - 2:01
    or arranging an in-person meeting.
  • 2:03 - 2:04
    This is grooming.
  • 2:05 - 2:07
    This is happening, and it's on the rise.
  • 2:10 - 2:12
    The question is: What are we going to do?
  • 2:12 - 2:15
    Because, in the meantime, kids are alone.
  • 2:17 - 2:19
    They finish dinner, go to their rooms,
  • 2:19 - 2:20
    close the door,
  • 2:21 - 2:23
    get on their computer, their cell phones,
  • 2:23 - 2:26
    and get into a bar,
  • 2:27 - 2:28
    into a club.
  • 2:30 - 2:33
    Think for one second
    about what I've just said:
  • 2:35 - 2:38
    they're in a place full of strangers
  • 2:38 - 2:40
    in an uninhibited environment.
  • 2:42 - 2:44
    The internet broke physical boundaries.
  • 2:45 - 2:49
    When we're alone in our bedroom
    and we go online,
  • 2:49 - 2:50
    we're not really alone.
  • 2:53 - 2:58
    There are at least two reasons
    why we're not taking care of this,
  • 2:58 - 3:00
    or at least not in the right way.
  • 3:02 - 3:06
    First, we're sure that everything
    that happens online is "virtual."
  • 3:06 - 3:09
    In fact, we call it "the virtual world."
  • 3:11 - 3:13
    If you look it up in the dictionary,
  • 3:14 - 3:16
    something virtual is something
    that seems to exist
  • 3:16 - 3:18
    but is not real.
  • 3:19 - 3:23
    And we use that word
    to talk about the internet:
  • 3:23 - 3:25
    something not real.
  • 3:27 - 3:29
    And that's the problem with grooming.
  • 3:29 - 3:31
    It is real.
  • 3:32 - 3:38
    Degenerate, perverted adults
    use the internet to abuse boys and girls
  • 3:38 - 3:40
    and take advantage of, among other things,
  • 3:40 - 3:43
    the fact that the kids and their parents
    think that what happens online
  • 3:43 - 3:45
    doesn't actually happen.
  • 3:48 - 3:51
    Several years ago,
    some colleagues and I founded an NGO
  • 3:51 - 3:53
    called "Argentina Cibersegura,"
  • 3:53 - 3:57
    dedicated to raising awareness
    about online safety.
  • 3:59 - 4:03
    In 2013, we attended meetings
    at the House of Legislature
  • 4:03 - 4:05
    to discuss a law about grooming.
  • 4:08 - 4:10
    I remember that a lot of people thought
  • 4:10 - 4:12
    that grooming was strictly a precursor
  • 4:12 - 4:16
    to arranging an in-person meeting
    with a kid to have sex with them.
  • 4:18 - 4:22
    But they didn't think about what happened
    to the kids who were exposed
  • 4:22 - 4:25
    by talking about sex
    with an adult without knowing it,
  • 4:26 - 4:30
    or who shared intimate photos thinking
    only another kid would see them,
  • 4:30 - 4:31
    or even worse,
  • 4:32 - 4:34
    who had exposed themselves
    using their webcam.
  • 4:35 - 4:37
    Nobody considered that rape.
  • 4:39 - 4:44
    I'm sure lots of you find it odd to think
    one person can abuse another
  • 4:44 - 4:45
    without physical contact.
  • 4:46 - 4:48
    We're programmed to think that way.
  • 4:49 - 4:51
    I know, because I used to think that way.
  • 4:51 - 4:54
    I was just an IT security guy
  • 4:55 - 4:57
    until this happened to me.
  • 4:59 - 5:01
    At the end of 2011,
  • 5:02 - 5:05
    in a little town in Buenos Aires Province,
  • 5:05 - 5:07
    I heard about a case for the first time.
  • 5:09 - 5:10
    After giving a talk,
  • 5:11 - 5:16
    I met the parents of an 11-year-old girl
    who had been a victim of grooming.
  • 5:18 - 5:22
    A man had manipulated her
    into masturbating in front of her webcam,
  • 5:22 - 5:24
    and recorded it.
  • 5:24 - 5:27
    And the video was on several websites.
  • 5:29 - 5:32
    That day, her parents asked us, in tears,
  • 5:32 - 5:33
    to tell them the magic formula
  • 5:33 - 5:36
    for how to delete those videos
    from the internet.
  • 5:38 - 5:42
    It broke my heart and changed me forever
  • 5:42 - 5:45
    to be their last disappointment,
    telling them it was too late:
  • 5:46 - 5:48
    once content is online,
  • 5:48 - 5:50
    we've already lost control.
  • 5:53 - 5:55
    Since that day, I think about that girl
  • 5:57 - 6:01
    waking up in the morning,
    having breakfast with her family,
  • 6:01 - 6:02
    who had seen the video,
  • 6:03 - 6:09
    and then walking to school, meeting
    people who had seen her naked,
  • 6:09 - 6:13
    arriving at school, playing with
    her friends, who had also seen her.
  • 6:15 - 6:16
    That was her life.
  • 6:18 - 6:19
    Exposed.
  • 6:22 - 6:24
    Of course, nobody raped her body.
  • 6:25 - 6:28
    But hadn't her sexuality been abused?
  • 6:31 - 6:35
    We clearly use different standards
    to measure physical and digital things.
  • 6:37 - 6:39
    And we get angry at social networks
  • 6:39 - 6:43
    because being angry with ourselves
    is more painful and more true.
  • 6:45 - 6:47
    And this brings us
    to the second reason why
  • 6:47 - 6:49
    we aren't paying proper
    attention to this issue.
  • 6:50 - 6:55
    We're convinced that kids
    don't need our help,
  • 6:55 - 6:57
    that they "know everything"
    about technology.
  • 7:00 - 7:01
    When I was a kid,
  • 7:03 - 7:06
    at one point, my parents started
    letting me walk to school alone.
  • 7:07 - 7:11
    After years of taking me by the hand
    and walking me to school,
  • 7:12 - 7:14
    one day they sat me down,
  • 7:14 - 7:16
    gave me the house keys
  • 7:16 - 7:20
    and said, "Be very careful with these;
    don't give them to anyone,
  • 7:20 - 7:24
    take the route we showed you,
    be at home at the time we said,
  • 7:24 - 7:27
    cross at the corner,
    and look both ways before you cross,
  • 7:27 - 7:31
    and no matter what,
    don't talk to strangers."
  • 7:33 - 7:35
    I knew everything about walking,
  • 7:36 - 7:40
    and yet, there was a responsible adult
    there taking care of me.
  • 7:41 - 7:43
    Knowing how to do something is one thing,
  • 7:43 - 7:45
    knowing how to take care
    of yourself is another.
  • 7:47 - 7:49
    Imagine this situation:
  • 7:49 - 7:51
    I'm 10 or 11 years old,
    I wake up in the morning,
  • 7:51 - 7:53
    my parents toss me the keys and say,
  • 7:53 - 7:55
    "Seba, now you can walk to school alone."
  • 7:56 - 7:59
    And when I come back late,
  • 8:00 - 8:03
    they say, "No, you need to be home
    at the time we said."
  • 8:05 - 8:07
    And two weeks later,
  • 8:07 - 8:10
    when it comes up,
    they say, "You know what?
  • 8:10 - 8:13
    You have to cross at the corner,
    and look both ways before crossing."
  • 8:15 - 8:16
    And two years later, they say,
  • 8:17 - 8:21
    "And also, don't talk to strangers."
  • 8:23 - 8:25
    It sounds absurd, right?
  • 8:26 - 8:29
    We have the same absurd behavior
    in relation to technology.
  • 8:30 - 8:32
    We give kids total access
  • 8:32 - 8:35
    and we see if one day, sooner or later,
  • 8:35 - 8:37
    they learn how to take care of themselves.
  • 8:39 - 8:41
    Knowing how to do something is one thing,
  • 8:41 - 8:43
    knowing how to take care
    of yourself is another.
  • 8:45 - 8:47
    Along those same lines,
    when we talk to parents,
  • 8:47 - 8:53
    they often say they don't care
    about technology and social networks.
  • 8:54 - 8:57
    I always counter by asking
    if they care about their kids.
  • 8:58 - 9:00
    As adults, being interested
    or not in technology
  • 9:00 - 9:03
    is the same as being interested
    or not in our kids.
  • 9:03 - 9:05
    The internet is part of their lives.
  • 9:07 - 9:12
    Technology forces us to rethink
    the relationship between adults and kids.
  • 9:13 - 9:16
    Education was always based
    on two main concepts:
  • 9:16 - 9:19
    experience and knowledge.
  • 9:21 - 9:26
    How do we teach our kids to be safe online
    when we don't have either?
  • 9:28 - 9:31
    Nowadays, we adults
    have to guide our children
  • 9:31 - 9:33
    through what is often for us
    unfamiliar territory --
  • 9:33 - 9:35
    territory much more inviting for them.
  • 9:38 - 9:41
    It's impossible to find an answer
  • 9:41 - 9:44
    without doing new things --
    things that make us uncomfortable,
  • 9:44 - 9:45
    things we're not used to.
  • 9:48 - 9:50
    A lot of you may think it's easy for me,
  • 9:50 - 9:52
    because I'm relatively young.
  • 9:53 - 9:54
    And it used to be that way.
  • 9:55 - 9:57
    Used to.
  • 9:58 - 10:00
    Until last year,
  • 10:00 - 10:04
    when I felt the weight
    of my age on my shoulders
  • 10:05 - 10:10
    the first time I opened Snapchat.
  • 10:10 - 10:13
    (Laughter)
  • 10:15 - 10:17
    (Applause)
  • 10:20 - 10:22
    I didn't understand a thing!
  • 10:23 - 10:26
    I found it unnecessary,
  • 10:26 - 10:29
    useless, hard to understand;
  • 10:29 - 10:31
    it looked like a camera!
  • 10:31 - 10:32
    It didn't have menu options!
  • 10:35 - 10:37
    It was the first time I felt the gap
  • 10:37 - 10:40
    that sometimes exists
    between kids and adults.
  • 10:42 - 10:45
    But it was also an opportunity
    to do the right thing,
  • 10:45 - 10:47
    to leave my comfort zone, to force myself.
  • 10:49 - 10:52
    I never thought I'd ever use Snapchat,
  • 10:52 - 10:57
    but then I asked my teenage cousin
    to show me how to use it.
  • 10:58 - 11:00
    I also asked why she used it.
  • 11:01 - 11:02
    What was fun about it?
  • 11:04 - 11:05
    We had a really nice talk.
  • 11:06 - 11:08
    She showed me her Snapchat,
    she told me things,
  • 11:08 - 11:11
    we got closer, we laughed.
  • 11:13 - 11:14
    Today, I use it.
  • 11:15 - 11:16
    (Laughter)
  • 11:17 - 11:18
    I don't know if I do it right,
  • 11:18 - 11:23
    but the most important thing
    is that I know it and I understand it.
  • 11:24 - 11:29
    The key was to overcome the initial shock
  • 11:29 - 11:30
    and do something new.
  • 11:32 - 11:33
    Something new.
  • 11:33 - 11:36
    Today, we have the chance
    to create new conversations.
  • 11:37 - 11:40
    What's the last app you downloaded?
  • 11:40 - 11:43
    Which social network do you use
    to contact your friends?
  • 11:43 - 11:45
    What kind of information do you share?
  • 11:47 - 11:49
    Have you ever been
    approached by strangers?
  • 11:51 - 11:54
    Could we have these conversations
    between kids and adults?
  • 11:56 - 11:58
    We have to force ourselves
    to do it. All of us.
  • 11:58 - 12:03
    Today, lots of kids are listening to us.
  • 12:05 - 12:07
    Sometimes when we go
    to schools to give our talks,
  • 12:07 - 12:09
    or through social networks,
  • 12:09 - 12:11
    kids ask or tell us things
  • 12:11 - 12:16
    they haven't told
    their parents or their teachers.
  • 12:16 - 12:18
    They tell us -- they don't even know us.
  • 12:21 - 12:23
    Those kids need to know
  • 12:24 - 12:26
    what the risks of being online are,
  • 12:28 - 12:30
    how to take care of themselves,
  • 12:30 - 12:33
    but also that, fundamentally,
    as with almost everything else,
  • 12:33 - 12:36
    kids can learn this from any adult.
  • 12:40 - 12:43
    Online safety needs to be
    a conversation topic
  • 12:43 - 12:46
    in every house and every
    classroom in the country.
  • 12:48 - 12:52
    We did a survey this year that showed
    that 15 percent of schools said
  • 12:52 - 12:54
    they knew of cases of grooming
    in their school.
  • 12:55 - 12:57
    And this number is growing.
  • 13:00 - 13:03
    Technology changed
    every aspect of our life,
  • 13:03 - 13:06
    including the risks we face
  • 13:06 - 13:08
    and how we take care of ourselves.
  • 13:09 - 13:12
    Grooming shows us this
    in the most painful way:
  • 13:13 - 13:15
    by involving our kids.
  • 13:17 - 13:19
    Are we going to do something
    to avoid this?
  • 13:20 - 13:24
    The solution starts
    with something as easy as:
  • 13:24 - 13:26
    talking about it.
  • 13:26 - 13:27
    Thank you.
  • 13:27 - 13:33
    (Applause)
Title:
The conversation we're not having about digital child abuse
Speaker:
Sebastián Bortnik
Description:

We need to talk to kids about the risks they face online, says information security expert Sebastián Bortnik. In this talk, Bortnik discusses the issue of "grooming" -- the sexual predation of children by adults on the internet -- and outlines the conversations we need to start having about technology to keep our kids safe.

Video Language:
Spanish
Team:
closed TED
Project:
TEDTalks
Duration:
13:40
