Digital Slavery Redemption | Evgeny Chereshnev | TEDxKazan

  • 0:18 - 0:19
    When George Orwell created his
  • 0:20 - 0:21
    dystopian masterpiece,
  • 0:21 - 0:22
    I don't think he could have
  • 0:22 - 0:23
    imagined that actually
  • 0:24 - 0:25
    one day this might come true.
  • 0:27 - 0:29
    Yet during my research I realised that
  • 0:29 - 0:32
    the Internet is scary and it resembles
  • 0:35 - 0:37
    the plot of 1984,
  • 0:38 - 0:39
    where war is peace,
  • 0:40 - 0:41
    freedom is slavery,
  • 0:41 - 0:42
    and ignorance is strength.
  • 0:44 - 0:45
    Basically, I started my experiment
  • 0:46 - 0:47
    2 years ago by implanting
  • 0:48 - 0:49
    a biochip in myself.
  • 0:49 - 0:52
    I volunteered to become an early version of
  • 0:52 - 0:53
    a Terminator. Basically,
  • 0:54 - 0:56
    to understand true challenges
  • 0:56 - 0:57
    of data management
  • 0:57 - 0:58
    and, basically, what it's like to
  • 0:59 - 1:01
    live in the Internet of things.
  • 1:03 - 1:04
    Today I'm gonna share my discoveries with you.
  • 1:06 - 1:07
    So you know and understand that the biochip
  • 1:07 - 1:08
    is right here, in my left hand.
  • 1:09 - 1:12
    And it's very small: 2 by 12 mm.
  • 1:13 - 1:16
    It has a small memory chip
  • 1:16 - 1:17
    in it and an antenna.
  • 1:18 - 1:19
    And all that is covered by so-called "bioglass".
  • 1:20 - 1:22
    Bioglass is a hypoallergenic material
  • 1:23 - 1:24
    that basically makes the chip friends
  • 1:24 - 1:25
    with human tissue, because
  • 1:26 - 1:28
    that doesn't usually happen in nature.
  • 1:28 - 1:30
    Our body is very good when it
  • 1:30 - 1:32
    comes to rejecting anything and everything that
  • 1:32 - 1:34
    has, you know, different DNA.
  • 1:35 - 1:37
    Luckily, in my case, thanks to bioglass
  • 1:37 - 1:38
    and a moderate amount of tequila,
  • 1:39 - 1:40
    it went smoothly.
  • 1:40 - 1:42
    So, to your right you see my hand
  • 1:42 - 1:43
    roughly two hours after the surgery
  • 1:44 - 1:46
    and to your left you see the same hand
  • 1:47 - 1:48
    three days after.
  • 1:48 - 1:49
    As you see, it's healthy
  • 1:49 - 1:50
    and it's good and it's smooth
  • 1:50 - 1:51
    and already programmed.
  • 1:53 - 1:55
    But once I started playing with the chip, like, two minutes
  • 1:55 - 1:57
    after implantation, I never stopped.
  • 1:58 - 1:59
    Basically, the first thing I did:
  • 1:59 - 2:00
    I connected myself to an Android phone
  • 2:01 - 2:04
    and saved a very small piece of text:
  • 2:04 - 2:06
    "Do androids dream of electric sheep?"
  • 2:07 - 2:08
    I did it for 2 reasons:
  • 2:08 - 2:11
    First: I'm a big fan of Blade Runner
  • 2:11 - 2:12
    and I did it as a homage
  • 2:13 - 2:13
    to Philip K. Dick.
  • 2:14 - 2:15
    And second:
  • 2:15 - 2:17
    I guess I truly wanted to
  • 2:17 - 2:19
    know the answer.
  • 2:19 - 2:20
    And today I know the answer.
  • 2:20 - 2:21
    But before we walk on my dreams,
  • 2:21 - 2:22
    let me walk on yours today.
  • 2:23 - 2:25
    Let me show you what a guy with a small
  • 2:25 - 2:26
    basic biochip can do.
  • 2:27 - 2:29
    So, right now in my hand
  • 2:30 - 2:31
    there is a futuristic prototype
  • 2:31 - 2:34
    of a unified key, because right now
  • 2:34 - 2:36
    in my company
  • 2:36 - 2:37
    they helped me and replaced
  • 2:37 - 2:38
    all the locks in the office,
  • 2:39 - 2:40
    so I can access the campus,
  • 2:41 - 2:42
    the canteen, the garage, my own
  • 2:42 - 2:43
    office, restricted areas
  • 2:44 - 2:45
    just by touching the door.
  • 2:46 - 2:47
    I don't have to carry an ID on me.
  • 2:48 - 2:49
    It's just the tip of the iceberg, because
  • 2:49 - 2:52
    it turns out I can unlock any
  • 2:53 - 2:55
    Android device with no PIN, no fingerprints,
  • 2:56 - 2:58
    I just need to be there, by touching.
  • 2:58 - 3:00
    And if you think about it,
  • 3:01 - 3:02
    everything that is related to
  • 3:03 - 3:05
    identification could be done differently.
  • 3:06 - 3:07
    For example, we took it further
  • 3:08 - 3:09
    and we connected a small NFC reader
  • 3:10 - 3:12
    to a laptop, and did some basic coding
  • 3:12 - 3:14
    in the backend of my blog,
  • 3:14 - 3:15
    and we proved that it's possible
  • 3:16 - 3:18
    to access any website
  • 3:18 - 3:20
    or any app just by being there.
  • 3:21 - 3:23
    I didn't need to enter anything:
  • 3:24 - 3:24
    no password, no login.
  • 3:25 - 3:27
    And actually it made me think of
  • 3:27 - 3:30
    one specific silly thing:
  • 3:30 - 3:32
    modern identification documents
  • 3:32 - 3:34
    are obsolete by technological standards, because
  • 3:34 - 3:36
    what is a passport or driving license?
  • 3:37 - 3:39
    It's a piece of paper or plastic with
  • 3:40 - 3:42
    some basic text and numbers on it.
  • 3:44 - 3:45
    They are easy to fake
  • 3:45 - 3:47
    and they have no value anymore.
  • 3:47 - 3:51
    For example, when you cross the border,
  • 3:52 - 3:53
    and you show your passport,
  • 3:53 - 3:54
    it's not the passport itself that
  • 3:55 - 3:57
    makes you you. It's the fact that
  • 3:57 - 3:58
    it's checked with a computer database
  • 3:59 - 4:00
    by border control, right?
  • 4:01 - 4:02
    So, why carry all these documents on us
  • 4:03 - 4:04
    at all times?
  • 4:04 - 4:05
    It doesn't make any sense anymore.
  • 4:06 - 4:08
    And in fact, why carry a wallet at all?
  • 4:09 - 4:11
    Because we proved that
  • 4:11 - 4:12
    using cryptocurrency you can actually
  • 4:12 - 4:14
    pay or get paid just by touching
  • 4:15 - 4:16
    the terminal.
  • 4:17 - 4:18
    And actually, some things can be done
  • 4:19 - 4:20
    even with no touching.
  • 4:20 - 4:22
    For example, every Star Wars fan
  • 4:22 - 4:24
    knows: if you are not doing this
  • 4:24 - 4:25
    to doors with photoelectric sensors, you
  • 4:26 - 4:28
    are wasting your time on Earth, really.
  • 4:29 - 4:31
    And being a fan of Star Wars, I
  • 4:31 - 4:32
    couldn't resist the temptation
  • 4:32 - 4:33
    of the Light Side, so my friends
  • 4:34 - 4:36
    helped me, and we totally misused
  • 4:36 - 4:38
    Microsoft Kinect and did
  • 4:38 - 4:40
    some basic engineering and we
  • 4:40 - 4:41
    forced this door to obey
  • 4:41 - 4:42
    the power of a Jedi.
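A gesture-controlled door along these lines can be approximated without any Kinect SDK. The sketch below is an assumption-laden toy, not the speaker's actual rig: it takes a short series of tracked hand x-positions (the kind of data a Kinect-style skeleton stream supplies) and treats a wide, monotonic sweep as the "Jedi wave" that opens the door.

```python
# Hypothetical gesture detector: the threshold and the OPEN/HOLD
# commands are invented for illustration.
def is_jedi_wave(xs: list[float], min_span: float = 0.5) -> bool:
    """True if the hand moved monotonically right by at least min_span metres."""
    if len(xs) < 3:
        return False
    monotonic = all(b > a for a, b in zip(xs, xs[1:]))
    return monotonic and (xs[-1] - xs[0]) >= min_span

def door_command(xs: list[float]) -> str:
    """Map a window of tracked positions to a door actuation command."""
    return "OPEN" if is_jedi_wave(xs) else "HOLD"
```

In practice the wave would be detected over a sliding window of skeleton frames and the command sent to the door controller; the logic above is just the decision step.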
  • 4:43 - 4:44
    So, to make it look realistic...
  • 4:49 - 4:52
    the funny thing is: some of you
  • 4:52 - 4:56
    might be wondering, if it's so awesome and great and fantastic,
  • 4:56 - 4:57
    what's the catch here?
  • 4:59 - 5:01
    And those of you who think this way are right.
  • 5:01 - 5:02
    There's a catch.
  • 5:02 - 5:04
    It just took me a while to understand it.
  • 5:05 - 5:06
    I had to live with the biochip for
  • 5:06 - 5:06
    almost 2 years to understand
  • 5:09 - 5:11
    what I'm paying with for the fun I'm having.
  • 5:12 - 5:13
    And I'm gonna share it with you.
  • 5:14 - 5:15
    So, I conducted an experiment,
    where I was using
  • 5:16 - 5:19
    my laptop and my smartphone
  • 5:20 - 5:21
    and basically lots of devices
  • 5:22 - 5:23
    and was leading a normal life.
  • 5:23 - 5:25
    I was buying stuff, I was using
  • 5:26 - 5:26
    social media and stuff.
  • 5:27 - 5:30
    But I used only one form of ID: my biochip.
  • 5:32 - 5:35
    And this is where I realized
    a very weird thing.
  • 5:36 - 5:38
    Computers don't have eyes, right?
  • 5:39 - 5:39
    They cannot see us.
  • 5:41 - 5:45
    The only thing they can see is zeros and ones.
  • 5:46 - 5:46
    This is how they see us. Those zeros and ones
  • 5:47 - 5:50
    flying in cyberspace: this is how they get the idea
  • 5:51 - 5:53
    of what a face looks like, about the behaviour,
  • 5:53 - 5:56
    about who the person interacting with the computer is.
  • 5:57 - 5:59
    But I realised that in my case
  • 6:00 - 6:04
    those weren't just zeros and ones anymore.
  • 6:05 - 6:06
    Those were literally parts of my body,
  • 6:07 - 6:08
    because I produced this, right?
  • 6:09 - 6:11
    And this was actually a revelation,
  • 6:11 - 6:13
    because today I'm gonna say a very critical,
  • 6:13 - 6:16
    disruptive thing: I propose we start
  • 6:16 - 6:18
    treating personal data as a layer
  • 6:19 - 6:20
    of human DNA. Because DNA is
  • 6:22 - 6:23
    the same thing. It has
  • 6:23 - 6:25
    information, and it's unique to everyone.
  • 6:26 - 6:27
    Why not treat it the same way?
  • 6:28 - 6:29
    In fact, I tracked myself
  • 6:30 - 6:32
    and I'm gonna show you what it looks like,
  • 6:32 - 6:33
    so you understand I'm pretty
  • 6:33 - 6:34
    serious about it.
  • 6:34 - 6:35
    So, this is me, just
  • 6:36 - 6:37
    one hour of my life. I was just
  • 6:38 - 6:40
    using my browser, surfing and doing stuff,
  • 6:40 - 6:42
    and you can see my interests, you can see
  • 6:42 - 6:46
    what I did. This is my personal Facebook account
  • 6:47 - 6:49
    and who I interacted with through the day.
  • 6:50 - 6:52
    This is more applicable to you
  • 6:52 - 6:53
    and all of us,
    but basically
  • 6:53 - 6:56
    it's the Wi-Fi in a cafeteria,
  • 6:56 - 6:57
    Starbucks in this case,
  • 6:57 - 6:58
    on a busy day.
  • 6:59 - 6:59
    One minute of it.
  • 7:00 - 7:02
    It's us, surfing and researching, doing
  • 7:02 - 7:04
    our routine, basically.
  • 7:05 - 7:07
    This is a small start-up developing their mobile
  • 7:07 - 7:10
    application; that is why they have one color in common.
  • 7:10 - 7:13
    Because they do one thing at the same time.
  • 7:14 - 7:16
    And this is a split second of your time,
  • 7:16 - 7:18
    if you are a company like Google, Yandex,
  • 7:19 - 7:20
    Microsoft, Apple... Anyway, big companies.
  • 7:24 - 7:25
    A very important thing hit me,
  • 7:26 - 7:28
    that computers in the last 25 years
  • 7:29 - 7:31
    changed; their role changed.
  • 7:32 - 7:35
    In 1992 computers were produced as tools,
  • 7:36 - 7:38
    marketed as tools and sold as tools.
  • 7:39 - 7:39
    And we were all users.
  • 7:41 - 7:43
    By 2020, the forecast is, we're gonna
  • 7:44 - 7:45
    have roughly 50 billion devices.
  • 7:46 - 7:47
    It's not the same anymore,
  • 7:47 - 7:49
    because the role changed.
  • 7:50 - 7:51
    We're not users anymore, we're sensors
  • 7:52 - 7:54
    for a system that is big,
  • 7:54 - 7:56
    and we don't own it or control it.
  • 7:57 - 7:59
    And computer is not a tool anymore.
  • 8:00 - 8:02
    It's an interface between life and
  • 8:02 - 8:03
    artificial life. And this
  • 8:04 - 8:06
    makes a hell of a difference.
  • 8:07 - 8:08
    And, what's more important,
  • 8:08 - 8:09
    I would be totally okay with that,
  • 8:10 - 8:11
    if the system was in balance, but
  • 8:11 - 8:12
    the system is not in balance.
  • 8:13 - 8:15
    I'm gonna illustrate my point by
  • 8:15 - 8:16
    so-called Johari concept.
  • 8:17 - 8:22
    Show of hands: who knows what it is?
  • 8:22 - 8:23
    At least one person, that's okay.
  • 8:24 - 8:25
    That's good. Two, nice.
  • 8:25 - 8:26
    So, we have balcony here.
  • 8:26 - 8:32
    So, as all complicated concepts,
  • 8:32 - 8:34
    it could be explained by a boring slide,
  • 8:35 - 8:36
    but we're gonna skip it, because
  • 8:37 - 8:37
    nobody would read it.
  • 8:38 - 8:39
    I'm gonna use the example of Rosencrantz and
  • 8:39 - 8:43
    Guildenstern. So, basically, the Johari window says
  • 8:43 - 8:45
    that there are only four types of information:
  • 8:45 - 8:49
    known to you and others, known only to you, known only to others, and unknown to both.
  • 8:50 - 8:52
    And we as people migrate from
  • 8:52 - 8:55
    unknown to known, we improve,
  • 8:55 - 8:56
    we become better.
  • 8:57 - 8:58
    And the point is:
  • 8:59 - 9:00
    everyone has something, that
  • 9:01 - 9:02
    we don't know about ourselves.
  • 9:04 - 9:07
    If we apply the Johari concept to the modern Internet,
  • 9:08 - 9:10
    this is what it's gonna look like.
  • 9:10 - 9:12
    There are literally many companies, but
  • 9:13 - 9:14
    at least five of them know more about you
  • 9:17 - 9:18
    than you, your family and friends combined.
  • 9:20 - 9:21
    And they can predict your behaviour.
  • 9:22 - 9:23
    For me it's not okay.
  • 9:25 - 9:26
    And you might be wondering:
  • 9:26 - 9:28
    how did it come to that?
  • 9:29 - 9:29
    How did that happen?
  • 9:31 - 9:32
    Ask yourself just one question:
  • 9:32 - 9:33
    who owns our data?
  • 9:34 - 9:36
    In my case it's my digital DNA. Who owns it?
  • 9:36 - 9:40
    Physically. It turns out, everybody but us, who produce it.
  • 9:42 - 9:43
    That's the problem, right?
  • 9:45 - 9:46
    And some of you, I'm sure, are engaged in
  • 9:47 - 9:51
    IT and technology, and you must be wondering: what's the problem with that?
  • 9:51 - 9:52
    I don't care, how much data they have on me,
  • 9:53 - 9:54
    as long as they provide quality service and it's free.
  • 9:56 - 9:57
    Well, I strongly disagree.
  • 9:59 - 10:00
    And I'll show you why everyone should care.
  • 10:03 - 10:04
    The first problem is:
  • 10:05 - 10:06
    you cannot compete with data.
  • 10:07 - 10:08
    It's almost impossible.
  • 10:09 - 10:10
    Next time you're gonna think of
  • 10:10 - 10:12
    starting your own business, or you're gonna come up with an idea,
  • 10:13 - 10:14
    think twice.
  • 10:15 - 10:17
    Because it's almost impossible to compete with
  • 10:17 - 10:18
    big companies that can make their service
  • 10:19 - 10:22
    better just by applying data to it.
  • 10:23 - 10:23
    They don't have to guess,
  • 10:24 - 10:26
    they know what businesses to start
  • 10:27 - 10:30
    and they know how to make
    existing businesses better.
  • 10:30 - 10:30
    It's a problem.
  • 10:32 - 10:33
    If you wanna understand
    what it feels like,
  • 10:35 - 10:36
    try to play chess with
    the Deep Blue computer.
  • 10:37 - 10:40
    I dare you. And good luck thinking,
    that you might win,
  • 10:40 - 10:41
    because you can't.
    And it's already been proven.
  • 10:42 - 10:46
    A computer can beat a human at chess,
  • 10:47 - 10:48
    because it knows your behaviour,
  • 10:48 - 10:50
    it can predict your moves before
  • 10:50 - 10:51
    you even think of them.
  • 10:51 - 10:55
    The problem is: data is now applied to businesses, not
  • 10:55 - 10:59
    chess. The second problem is cybersecurity,
  • 11:00 - 11:02
    because I have lots of friends in
  • 11:02 - 11:05
    companies like Google and Apple and Facebook etc.
  • 11:06 - 11:07
    and I think they're great.
  • 11:08 - 11:10
    They have the best minds working for them.
  • 11:10 - 11:12
    They truly care about cybersecurity.
  • 11:13 - 11:15
    The problem is: not everybody is Google.
  • 11:15 - 11:18
    There are at least 2,000 networks out there
  • 11:19 - 11:21
    that sell and buy user information
  • 11:21 - 11:23
    and God knows how many small
  • 11:24 - 11:25
    e-commerce sites that cannot protect our data:
  • 11:26 - 11:28
    passwords, logins, credit card information,
  • 11:29 - 11:30
    history. They just can't.
  • 11:30 - 11:31
    They don't have the resources of Google.
  • 11:32 - 11:34
    And that's a problem, because cybercriminals
  • 11:34 - 11:37
    know that. And they can always find a weak spot.
  • 11:39 - 11:41
    They can use your name by
  • 11:41 - 11:44
    duplicating your identity and committing crimes.
  • 11:46 - 11:48
    And the problem here is actually never ending, because
  • 11:49 - 11:51
    if we go to people, who actually solve crimes,
  • 11:51 - 11:54
    or whose job is intelligence or counterintelligence,
  • 11:55 - 11:56
    they are not okay with the situation either.
  • 11:57 - 12:00
    Because governments have needs, we all know that.
  • 12:01 - 12:03
    They would always want to know everything about everybody
  • 12:04 - 12:05
    while keeping their own secrets.
  • 12:06 - 12:08
    It's okay, it's what governments do, right?
  • 12:09 - 12:11
    The problem is twofold.
  • 12:11 - 12:14
    First: the access to information now is binary,
  • 12:15 - 12:16
    meaning that, let's say,
  • 12:16 - 12:18
    you're crossing the border and let's
  • 12:18 - 12:19
    say there's an officer asking you
  • 12:20 - 12:23
    to open your iPhone. If you comply,
  • 12:24 - 12:26
    you are not giving him just a small piece
  • 12:26 - 12:29
    of information that this person needs for his official business,
  • 12:30 - 12:31
    you're revealing your whole life.
  • 12:32 - 12:34
    And it's not okay at all.
  • 12:35 - 12:36
    And the second problem is that
  • 12:38 - 12:40
    with all due respect to governments, they are
  • 12:40 - 12:41
    like enterprises, and they are not
  • 12:42 - 12:43
    safe, they can be hacked.
  • 12:44 - 12:47
    And cases like WikiLeaks,
    Edward Snowden, Stuxnet
  • 12:48 - 12:49
    and many others prove that nobody's safe.
  • 12:51 - 12:54
    And the problem is that if you're a well-funded
  • 12:54 - 12:57
    cybersecurity organization,
    there's nothing
  • 12:58 - 12:59
    sexier to you than a huge
  • 13:00 - 13:01
    governmental database. And governments
  • 13:03 - 13:06
    love creating those. Still, the problem is:
  • 13:08 - 13:10
    it would take a while for them to catch up
  • 13:11 - 13:12
    with the new mentality: technologies like
  • 13:13 - 13:16
    encryption, P2P storage,
    and cryptocurrencies like
  • 13:17 - 13:17
    Bitcoin and stuff, they have to be
  • 13:18 - 13:20
    embraced and developed.
  • 13:22 - 13:22
    Because it would help the governments.
  • 13:24 - 13:25
    They just don't get it yet.
  • 13:26 - 13:27
    But even that's not the main problem.
  • 13:28 - 13:29
    The main problem is human rights.
  • 13:30 - 13:31
    Because, you know what?
  • 13:32 - 13:34
    Nobody listens to our opinion.
  • 13:35 - 13:37
    I'm not the first person on Earth
  • 13:37 - 13:38
    to bring up the privacy issue.
  • 13:39 - 13:40
    I'm just probably the first cyborg to do that,
  • 13:41 - 13:42
    which makes the difference, I hope.
  • 13:43 - 13:46
    So, my point is, many times
  • 13:46 - 13:48
    we have asked to have an adult conversation
  • 13:49 - 13:50
    to discuss the fact that clicking
  • 13:51 - 13:52
    "I agree" is not a fair choice.
  • 13:53 - 13:54
    It's an ultimatum.
  • 13:54 - 13:55
    And it's not okay.
  • 13:56 - 13:57
    But nothing changes.
  • 13:58 - 13:59
    And I have a reference in our history
  • 14:00 - 14:03
    that is very similar: slavery.
  • 14:04 - 14:05
    Because slaves might have an opinion, but nobody
  • 14:05 - 14:07
    would care, right?
  • 14:08 - 14:10
    Because they're slaves.
  • 14:10 - 14:12
    And if I have an opinion that
  • 14:12 - 14:16
    my own digital DNA should be stored differently
  • 14:16 - 14:18
    and dealt with differently, nobody listens.
  • 14:18 - 14:21
    That makes me a slave in a way.
  • 14:22 - 14:23
    Like a digital slave for sure.
  • 14:24 - 14:26
    And you must be wondering
  • 14:26 - 14:27
    if there is any good news.
  • 14:28 - 14:29
    Well, there is good news,
  • 14:30 - 14:32
    because in our history we did have
  • 14:32 - 14:34
    our fair share of slavery and oppression,
  • 14:34 - 14:36
    and I think that the Internet is a form of society.
  • 14:37 - 14:39
    It copies our history. And in our history we
  • 14:40 - 14:42
    did have feudal lords, monarchs, who controlled
  • 14:42 - 14:44
    vast amounts of resources and used them
  • 14:44 - 14:46
    for their own good. With no explanation.
  • 14:47 - 14:48
    One could wake up and go and attack France.
  • 14:49 - 14:50
    Because someone just wanted to attack France.
  • 14:51 - 14:54
    Just gather up, grab your swords and go.
  • 14:55 - 14:56
    "I just like their wine."
  • 14:56 - 14:58
    And this is exactly
    what's happening on the Internet.
  • 15:00 - 15:02
    We have several companies that own
  • 15:03 - 15:06
    the new analog of medieval gold or
  • 15:06 - 15:09
    modern oil and they use it for their own sake.
  • 15:11 - 15:12
    Without explanation.
  • 15:13 - 15:14
    And I think it's a huge problem.
  • 15:15 - 15:17
    To be honest, it's not that obvious to
  • 15:17 - 15:18
    most of you right now.
  • 15:19 - 15:21
    But let me take you there, in my head.
  • 15:23 - 15:25
    Let's say one day we will witness singularity.
  • 15:26 - 15:27
    And nothing changes today.
  • 15:28 - 15:30
    Singularity is a situation where,
  • 15:31 - 15:32
    in several years, theoretically,
  • 15:32 - 15:34
    computers will be sophisticated enough
  • 15:34 - 15:37
    to store our minds online and let us live
  • 15:37 - 15:38
    forever as machines.
  • 15:39 - 15:43
    Well, if that happens, and it's not impossible,
  • 15:44 - 15:45
    what would happen?
  • 15:45 - 15:46
    We're gonna be indexed.
  • 15:47 - 15:49
    All our memories, all our emotional experiences,
  • 15:49 - 15:53
    everything that we are, would be indexed
    by companies like Facebook,
  • 15:53 - 15:54
    Google, Apple, Microsoft.
  • 15:54 - 15:58
    We would lose our freedom forever.
  • 15:59 - 16:00
    We're already there.
  • 16:00 - 16:02
    But there's still time to change something.
  • 16:04 - 16:05
    And coming back to my question before
  • 16:05 - 16:07
    "Do androids dream of electric sheep?"
  • 16:08 - 16:09
    well, as part android, my answer is no.
  • 16:11 - 16:12
    I dream of a totally different thing.
  • 16:13 - 16:14
    I truly dream that one day
  • 16:15 - 16:16
    we're gonna be free in the digital
  • 16:16 - 16:20
    world the same way we are in the real one.
  • 16:21 - 16:24
    Because I strongly believe they're not different anymore.
  • 16:25 - 16:27
    Artificial and real, they're merging.
  • 16:27 - 16:28
    Some of you might say: that's not realistic.
  • 16:33 - 16:35
    That's your opinion, and I respect it.
  • 16:36 - 16:37
    I'm just saying I won't stop.
  • 16:38 - 16:40
    I'm just gonna pursue my freedom to the end.
  • 16:41 - 16:42
    And actually I already did something.
  • 16:42 - 16:43
    I gathered a small team and this spring,
  • 16:43 - 16:46
    we got a patent that basically
  • 16:47 - 16:49
    makes all logins and passwords obsolete in
  • 16:50 - 16:50
    the near future. Your behaviour will be
  • 16:51 - 16:54
    your identity on the Internet or everywhere.
  • 16:55 - 16:57
    And some of you might say: that's tracking.
  • 16:58 - 17:01
    No, I'm doing it responsibly.
  • 17:01 - 17:09
    I have a chip here. And just by using technologies like encryption, containerization, and disinformation,
  • 17:10 - 17:12
    and by faking my digital trail I can win
  • 17:12 - 17:15
    our privacy and freedom back
  • 17:16 - 17:19
    while keeping businesses, governments and us happy.
  • 17:20 - 17:22
    I think I found a win-win solution.
  • 17:23 - 17:25
    We don't need our IDs or credentials online; what we
  • 17:26 - 17:28
    need is a single multi-personality that
  • 17:28 - 17:31
    every person could own and switch
  • 17:31 - 17:34
    at all times and where his
  • 17:34 - 17:36
    rights to make mistakes,
  • 17:36 - 17:38
    his rights to do whatever he wants
  • 17:39 - 17:40
    and his rights to be untrackable
  • 17:41 - 17:43
    are actually guaranteed by constitution and law.
  • 17:45 - 17:46
    I'm not saying you should join me.
  • 17:47 - 17:47
    I'll be happy, if you do.
  • 17:48 - 17:50
    I'm just saying that this is my dream.
  • 17:51 - 17:52
    And my path.
  • 17:52 - 17:53
    And I'm going all the way.
  • 17:54 - 17:56
    But whatever you dream about
  • 17:57 - 17:59
    I just hope that all those moments
  • 18:00 - 18:03
    won't be lost in time like tears in rain.
  • 18:04 - 18:05
    Thank you.
Video Language: English
Duration: 18:08