Image recognition that triggers augmented reality

  • 0:01 - 0:04
    So wouldn't it be amazing if our phones
  • 0:04 - 0:06
    could see the world in the same way that we do,
  • 0:06 - 0:08
    as we're walking around being able
  • 0:08 - 0:10
    to point a phone at anything,
  • 0:10 - 0:12
    and then have it actually recognize images and objects
  • 0:12 - 0:15
    like the human brain, and then be able to pull in information
  • 0:15 - 0:18
    from an almost infinite library of knowledge
  • 0:18 - 0:20
    and experiences and ideas.
  • 0:20 - 0:22
    Well, traditionally that was seen as science fiction,
  • 0:22 - 0:24
    but now we've moved to a world
  • 0:24 - 0:26
    where actually this has become possible.
  • 0:26 - 0:28
    So the best way of explaining it is to just show it.
  • 0:28 - 0:30
    What you can see over here is Tamara,
  • 0:30 - 0:32
    who is holding my phone that's now plugged in.
  • 0:32 - 0:34
    So let me start with this.
  • 0:34 - 0:35
    What we have here is a painting
  • 0:35 - 0:38
    of the great poet Rabbie Burns,
  • 0:38 - 0:39
    and it's just a normal image,
  • 0:39 - 0:42
    but if we now switch inputs over to the phone,
  • 0:42 - 0:44
    running our technology, you can see
  • 0:44 - 0:46
    effectively what Tamara's seeing on the screen,
  • 0:46 - 0:48
    and when she points at this image,
  • 0:48 - 0:50
    something magical happens.
  • 0:50 - 0:55
    (Laughter) (Bagpipes)
  • 0:55 - 0:57
    (Bagpipes) (Applause)
  • 0:57 - 1:04
    (Bagpipes)
  • 1:04 - 1:06
    Voice: Now simmer blinks on flowery braes ...
  • 1:06 - 1:07
    Matt Mills: Now, what's great about this is,
  • 1:07 - 1:09
    there's no trickery here.
  • 1:09 - 1:12
    There's nothing done to this image.
  • 1:12 - 1:15
    And what's great about this is the technology's
  • 1:15 - 1:17
    actually allowing the phone to start to see and understand
  • 1:17 - 1:20
    much like how the human brain does.
  • 1:20 - 1:22
    Not only that, but as I move the object around,
  • 1:22 - 1:28
    it's going to track it and overlay that content seamlessly.
  • 1:28 - 1:30
    Again, the thing that's incredible about this is
  • 1:30 - 1:32
    this is how advanced these devices have become.
  • 1:32 - 1:35
    All the processing to do that was actually done
  • 1:35 - 1:37
    on the device itself.
  • 1:37 - 1:39
    Now, this has applications everywhere,
  • 1:39 - 1:43
    whether in things like art in museums, like you just saw,
  • 1:43 - 1:46
    or in the world of, say, advertising, or print journalism.
  • 1:46 - 1:49
    So a newspaper becomes out of date as soon as it's printed.
  • 1:49 - 1:51
    And here is this morning's newspaper,
  • 1:51 - 1:54
    and we have some Wimbledon news, which is great.
  • 1:54 - 1:57
    Now what we can do is point at the front of the newspaper
  • 1:57 - 1:59
    and immediately get the bulletin.
  • 1:59 - 2:02
    Voice: ... To the grass, and it's very important that you adapt
  • 2:02 - 2:04
    and you, you have to be flexible, you have to be willing
  • 2:04 - 2:07
    to change direction at a split second,
  • 2:07 - 2:10
    and she does all that. She's won this title.
  • 2:10 - 2:13
    MM: And that linking of the digital content
  • 2:13 - 2:15
    to something that's physical is what we call an aura, and
  • 2:15 - 2:18
    I'll be using that term a little bit as we go through the talk.
  • 2:18 - 2:21
    So, what's great about this is it isn't just a faster,
  • 2:21 - 2:24
    more convenient way to get information in the real world,
  • 2:24 - 2:26
    but there are times when actually using this medium
  • 2:26 - 2:29
    allows you to be able to display information in a way
  • 2:29 - 2:31
    that was never before possible.
  • 2:31 - 2:34
    So what I have here is a wireless router.
  • 2:34 - 2:37
    My American colleagues have told me I've got to call it a router,
  • 2:37 - 2:39
    so that everyone here understands — (Laughter) —
  • 2:39 - 2:42
    but nonetheless, here is the device.
  • 2:42 - 2:45
    So now what I can do is, rather than getting the instructions
  • 2:45 - 2:48
    for the device online, I can simply point at it,
  • 2:48 - 2:52
    the device is recognized, and then --
  • 2:52 - 2:56
    Voice: Begin by plugging in the grey ADSL cable.
  • 2:56 - 3:01
    Then connect the power. Finally, the yellow ethernet cable.
  • 3:01 - 3:04
    Congratulations. You have now completed setup.
  • 3:04 - 3:07
    (Laughter) MM: Awesome. Thank you.
  • 3:07 - 3:10
    (Applause)
  • 3:10 - 3:13
    The incredible work that made that possible was done
  • 3:13 - 3:15
    here in the U.K. by scientists at Cambridge,
  • 3:15 - 3:17
    and they work in our offices,
  • 3:17 - 3:19
    and I've got a lovely picture of them here.
  • 3:19 - 3:21
    They couldn't all be on stage, but we're going to
  • 3:21 - 3:24
    bring their aura to the stage, so here they are.
  • 3:28 - 3:30
    They're not very animated. (Laughter)
  • 3:30 - 3:34
    This was the fourth take, I'm told. (Laughter)
  • 3:34 - 3:38
    Okay. So, as we're talking about Cambridge,
  • 3:38 - 3:40
    let's now move on to technical advancements,
  • 3:40 - 3:42
    because since we started putting this technology
  • 3:42 - 3:46
    on mobile phones less than 12 months ago,
  • 3:46 - 3:48
    the speed and the processing in these devices
  • 3:48 - 3:51
    has grown at a really phenomenal rate,
  • 3:51 - 3:53
    and that means that I can now take cinema-quality
  • 3:53 - 3:56
    3D models and place them in the world around me,
  • 3:56 - 3:58
    so I have one over here.
  • 3:58 - 4:02
    Tamara, would you like to jump in?
  • 4:04 - 4:07
    (Music)
  • 4:07 - 4:12
    (Dinosaur roaring) (Laughter)
  • 4:12 - 4:14
    MM: I should leap in.
  • 4:14 - 4:18
    (Music) (Dinosaur roaring)
  • 4:18 - 4:23
    (Applause)
  • 4:23 - 4:25
    So then, after the fun, comes the more emotional side
  • 4:25 - 4:28
    of what we do, because effectively, this technology
  • 4:28 - 4:30
    allows you to see the world through someone's eyes,
  • 4:30 - 4:33
    and for that person to be able to take a moment in time
  • 4:33 - 4:36
    and effectively store it and tag it to something physical
  • 4:36 - 4:38
    that exists in the real world.
  • 4:38 - 4:41
    What's great about this is, the tools to do this are free.
  • 4:41 - 4:44
    They're open, they're available to everyone within our application,
  • 4:44 - 4:47
    and educators have really got on board with this in their classrooms.
  • 4:47 - 4:50
    So we have teachers who've tagged up textbooks,
  • 4:50 - 4:52
    teachers who've tagged up school classrooms,
  • 4:52 - 4:54
    and a great example of this is a school in the U.K.
  • 4:54 - 4:57
    I have a picture here from a video, and we're now going to play it.
  • 4:58 - 5:06
    Teacher: See what happens. (Children talking) Keep going.
  • 5:06 - 5:10
    Child: TV. (Children react)
  • 5:10 - 5:11
    Child: Oh my God.
  • 5:11 - 5:14
    Teacher: Now move it either side. See what happens.
  • 5:14 - 5:17
    Move away from it and come back to it.
  • 5:17 - 5:21
    Child: Oh, that is so cool.
  • 5:21 - 5:23
    Teacher: And then, have you got it again?
  • 5:23 - 5:28
    Child: Oh my God! How did you do that?
  • 5:28 - 5:31
    Second child: It's magic.
  • 5:31 - 5:34
    (Laughter) MM: (Laughs) So, it's not magic.
  • 5:34 - 5:36
    It's available for everyone to do,
  • 5:36 - 5:38
    and actually I'm going to show you how easy it is to do
  • 5:38 - 5:39
    by doing one right now.
  • 5:39 - 5:42
    So, as sort of — I'm told it's called a stadium wave,
  • 5:42 - 5:43
    so we're going to start from this side of the room
  • 5:43 - 5:45
    on the count of three, and go over to here.
  • 5:45 - 5:46
    Tamara, are you recording?
  • 5:46 - 5:48
    Okay, so are you all ready?
  • 5:48 - 5:51
    One, two, three. Go!
  • 5:51 - 5:54
    Audience: Whooooooo!
  • 5:54 - 5:58
    MM: Fellows are really good at that. (Laughs) (Laughter)
  • 5:58 - 5:59
    Okay. Now we're going to switch back
  • 5:59 - 6:01
    into the Aurasma application,
  • 6:01 - 6:05
    and what Tamara's going to do is tag that video that we just
  • 6:05 - 6:11
    took onto my badge, so that I can remember it forever.
  • 6:11 - 6:14
    Now, we have lots of people who are doing this already,
  • 6:14 - 6:16
    and we've talked a little bit about the educational side.
  • 6:16 - 6:18
    On the emotional side, we have people who've
  • 6:18 - 6:21
    done things like send postcards and Christmas cards
  • 6:21 - 6:25
    back to their family with little messages on them.
  • 6:25 - 6:27
    We have people who have, for example,
  • 6:27 - 6:29
    taken the inside of the engine bay of an old car
  • 6:29 - 6:31
    and tagged up different components within an engine,
  • 6:31 - 6:34
    so that if you're stuck and you want to find out more,
  • 6:34 - 6:36
    you can point and discover the information.
  • 6:36 - 6:39
    We're all very, very familiar with the Internet.
  • 6:39 - 6:42
    In the last 20 years, it's really
  • 6:42 - 6:44
    changed the way that we live and work,
  • 6:44 - 6:47
    and the way that we see the world, and what's great is,
  • 6:47 - 6:49
    we sort of think this is the next paradigm shift,
  • 6:49 - 6:53
    because now we can literally take the content
  • 6:53 - 6:56
    that we share, we discover, and that we enjoy
  • 6:56 - 6:58
    and make it a part of the world around us.
  • 6:58 - 7:00
    It's completely free to download this application.
  • 7:00 - 7:03
    If you have a good Wi-Fi connection or 3G,
  • 7:03 - 7:04
    this process is very, very quick.
  • 7:04 - 7:06
    Oh, there we are. We can save it now.
  • 7:06 - 7:08
    It's just going to do a tiny bit of processing
  • 7:08 - 7:10
    to convert that image that we just took
  • 7:10 - 7:12
    into a sort of digital fingerprint,
  • 7:12 - 7:14
    and the great thing is, if you're a professional user,
  • 7:14 - 7:17
    -- so, a newspaper -- the tools are pretty much identical
  • 7:17 - 7:19
    to what we've just used to create this demonstration.
  • 7:19 - 7:21
    The only difference is that you've got the ability
  • 7:21 - 7:24
    to add in links and slightly more content. Are you now ready?
  • 7:24 - 7:25
    Tamara Roukaerts: We're ready to go.
  • 7:25 - 7:27
    MM: Okay. So, I'm told we're ready, which means
  • 7:27 - 7:30
    we can now point at the image, and there you all are.
  • 7:30 - 7:36
    MM on video: One, two, three. Go!
  • 7:36 - 7:38
    MM: Well done. We've been Aurasma. Thank you.
  • 7:38 - 7:44
    (Applause)
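Note on the "digital fingerprint" mentioned at 7:06–7:12: the talk does not describe Aurasma's actual recognition pipeline, so the minimal sketch below is only a generic illustration of the idea of converting a trigger image into a compact fingerprint and matching it against a live camera frame, here using OpenCV's ORB features. The file names and thresholds are hypothetical, not taken from the talk.

    # Generic illustration of image fingerprinting and matching (not Aurasma's method)
    import cv2

    # Build a feature-based "fingerprint" of the trigger image (e.g. the painting)
    orb = cv2.ORB_create(nfeatures=500)                        # binary ORB descriptors
    trigger = cv2.imread("trigger.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical file
    kp_t, desc_t = orb.detectAndCompute(trigger, None)

    # Later, compare a camera frame against the stored fingerprint
    frame = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)      # hypothetical file
    kp_f, desc_f = orb.detectAndCompute(frame, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc_t, desc_f)
    good = [m for m in matches if m.distance < 40]             # illustrative threshold

    if len(good) > 25:                                         # illustrative threshold
        print("Trigger recognized; overlay the aura content here")

In practice a system like this would also estimate the trigger's pose in each frame so the overlaid video stays locked to the object as it moves, which is the tracking behaviour shown in the painting demo.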
Speaker:
Matt Mills
Description:

Matt Mills and Tamara Roukaerts demonstrate Aurasma, a new augmented reality tool that can seamlessly animate the world as seen through a smartphone. Going beyond previous augmented reality, their "auras" can do everything from making a painting talk to overlaying live news onto a printed newspaper.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
08:04
