
W01_L01_P02 - Representation of signals as frequencies

  • 0:00 - 0:05
    Okay, by the way there's something
    fundamental we're doing here that's going
  • 0:05 - 0:11
    to be important for what we want to think
    about. We said we're going to talk
  • 0:11 - 0:16
    about time-frequency analysis, right? And
    here's the big idea that we're going to
  • 0:16 - 0:23
    think about with Fourier. Fourier says we
    can represent things with sines and
  • 0:23 - 0:28
    cosines. Well, big deal? Why does that
    help us, right? When we think about this
  • 0:28 - 0:34
    we're going to think about this F of X as
    being a time signal. And what this Fourier
  • 0:34 - 0:39
    transform allows us to do i.e. Fourier
    expansion allows us to do is say, well
  • 0:39 - 0:45
    actually, what am I going to do? I'm going to
    replace representing this function in the X
  • 0:45 - 0:51
    domain by representing it in terms of
    collections of cosines and sines. And in
  • 0:51 - 0:57
    fact what I care about are the C of N, the
    coefficients, okay, here. So for instance,
  • 0:57 - 1:03
    C1 corresponds to N being one here. Maybe
    I'll illustrate it actually over here,
  • 1:03 - 1:12
    this is a little simpler. Think about what
    these mean right here, if I think about
  • 1:12 - 1:20
    what the A1 represents. A1 represents
    cosine X, right? So cosine X has a certain
  • 1:20 - 1:26
    frequency. So what I'm doing is replacing
    my X dependence with a bunch of frequency
  • 1:26 - 1:33
    content, so I have cosine 1X, cosine 2X,
    cosine 3X, cosine 4X, and so on, and it gets
  • 1:33 - 1:39
    higher, higher and higher frequencies. So
    what I want to do is take a signal in time
  • 1:39 - 1:46
    and simply replace it by a representation
    of the signal as a collection of
  • 1:46 - 1:53
    frequency components, okay? So by the way,
    when you hear a bird chirp, if you have a
  • 1:53 - 2:00
    detector for bird chirps, right? You
    would see its frequency content. You're
  • 2:00 - 2:05
    seeing its Fourier transform. You're not
    seeing the pressure wave and how the
  • 2:05 - 2:10
    signal is changing in time. What you see
    is, what is the spectral content of that
  • 2:10 - 2:15
    bird chirping, okay? That's what you look
    at when you typically do audio
  • 2:15 - 2:21
    engineering. Frequency content, you always
    take the signal and you always represent
  • 2:21 - 2:26
    it in terms of frequency content. And how
    do you do that? Fourier transform it,
  • 2:26 - 2:33
    okay? That's the role this is going to
    play. It's a different representation of
  • 2:33 - 2:39
    the signal, okay? W e're going to work
    with that a lot. Part of what we want to
  • 2:39 - 2:43
    do is start thinking about if I represent
    this in the frequency domain, what does that
  • 2:43 - 2:48
    buy me? And it turns out like for instance
    in radar problems it buys you a ton, so
  • 2:48 - 2:54
    that's what we're going to talk about
    Friday. Okay, questions on that, so far.
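For reference, the expansion being described on the board, with its C of N coefficients, is the standard Fourier series on an interval from negative L to L. This is a sketch; the exact normalization on the board may differ:

```latex
% Fourier series of f(x) on [-L, L]; a_n, b_n are the cosine/sine
% coefficients, or equivalently the complex coefficients c_n.
f(x) = \frac{a_0}{2}
     + \sum_{n=1}^{\infty}\left( a_n \cos\frac{n\pi x}{L}
                               + b_n \sin\frac{n\pi x}{L} \right)
     = \sum_{n=-\infty}^{\infty} c_n\, e^{\,i n\pi x/L},
\qquad
a_n = \frac{1}{L}\int_{-L}^{L} f(x)\cos\frac{n\pi x}{L}\,dx,
\quad
b_n = \frac{1}{L}\int_{-L}^{L} f(x)\sin\frac{n\pi x}{L}\,dx.
```

Reading off a coefficient such as a_1 tells you how much of cosine X, a single frequency, is in the signal, which is the A1 mentioned above.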
  • 2:54 - 3:02
    Yeah? [Student: What if the function's not
    periodic? What's the relation?] Yeah. So typically
  • 3:02 - 3:07
    this is defined, when you. The way we
    think about this is you say, look. Suppose
  • 3:07 - 3:15
    this is defined on negative L to L, and
    let's just say this is some function like
  • 3:15 - 3:21
    this. That's my function, F of X. But what
    the Fourier transform, or the Fourier
  • 3:21 - 3:26
    representation, the Fourier series, does to
    this is, it says, well, actually, this is a
  • 3:26 - 3:31
    periodic function. Is it not? It just then
    takes this thing here, and it. I should
  • 3:31 - 3:36
    have drawn an easier function. [LAUGH] And
    it makes a periodic extension of it. Now
  • 3:36 - 3:41
    typically we're only interested in this
    domain here, but technically what it does
  • 3:41 - 3:46
    is create something that's valid from negative
    infinity to infinity, and what it does is
  • 3:46 - 3:51
    create a periodic extension. And at
    discontinuities, you see one right there,
  • 3:51 - 3:57
    it takes a value halfway between. As you
    sum to infinity. By the way, nobody in
  • 3:57 - 4:04
    this class is going to sum to infinity. I
    know, you'd be here for a long time, okay?
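The truncation being described, summing to something big instead of to infinity, can be sketched in a few lines of Python. This is a hypothetical illustration with a square wave, not an example from the lecture; it also shows the halfway value at a discontinuity mentioned above:

```python
import math

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of the square wave sign(sin x).

    The full series is (4/pi) * sum over odd k of sin(k*x)/k;
    here we keep only the first n_terms odd harmonics instead
    of summing to infinity."""
    total = 0.0
    for i in range(n_terms):
        k = 2 * i + 1  # odd harmonics only
        total += math.sin(k * x) / k
    return 4.0 / math.pi * total

# Away from the jump, a couple hundred terms is already close to
# the true value of 1:
print(square_wave_partial_sum(math.pi / 2, 200))  # close to 1
# At the discontinuity x = 0, the series sits halfway between -1 and 1:
print(square_wave_partial_sum(0.0, 200))          # 0.0
```

Taking "something big" to mean a couple hundred terms is already good to a fraction of a percent away from the jump.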
  • 4:04 - 4:09
    All right, we're not summing to infinity.
    You always say, yeah. Infinity's big,
  • 4:09 - 4:14
    right? So let's just take something big and
    call it good, okay? So that's going to be
  • 4:14 - 4:19
    the idea of the Fourier Transform, okay?
    Now, by the way, what else are Fourier
  • 4:19 - 4:24
    Transforms good for? Let me show you.
    Actually, I'm going to skip that 'cause we
  • 4:24 - 4:30
    want to get to the. There's some nice
    properties it has, that we'll address
  • 4:30 - 4:36
    later. But for right now, I want to make a
    couple comments about this here. This
  • 4:37 - 4:42
    exists in MATLAB. You want to take a
    signal or a function, and turn it into
  • 4:42 - 4:47
    frequencies. Okay? You would define some
    f. Suppose you have a vector F in MATLAB.
  • 4:47 - 4:58
    Actually, let's not call it F. Let's call
    it, suppose I have U. I have some function
  • 4:58 - 5:04
    you define, and if I want it in the transform
    domain, all I've got to do in MATLAB is
  • 5:04 - 5:10
    say the following. Let's call it UT for U
    transform. See how easy that is? Nice.
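The MATLAB line being written is, presumably, just UT = fft(U). As a sketch of what that call actually computes, here is the plain, slow O(N^2) discrete Fourier transform in Python; the variable names are made up for illustration, and the FFT discussed next computes the same numbers faster:

```python
import cmath
import math

def dft(u):
    """Plain discrete Fourier transform, done the slow O(N^2) way:
    one sum of N terms for each of the N output frequencies.
    MATLAB's fft produces the same numbers, just much faster."""
    n = len(u)
    return [sum(u[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
            for k in range(n)]

# A pure cosine sampled at 8 points: its transform is two spikes
# (height N/2 = 4) at the +1 and -1 frequency bins, nothing else.
u = [math.cos(2 * math.pi * j / 8) for j in range(8)]
ut = dft(u)  # "UT for U transform"
```

The single-frequency input showing up as isolated spikes is exactly the "representation in terms of frequency content" described above.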
  • 5:10 - 5:16
    FFT stands for fast Fourier transform. Now
    why not just call it the FT, right? Seems
  • 5:16 - 5:22
    a little bit like bragging to call it the FFT.
    Like, oh, I'm super-fast. Well, it turns
  • 5:22 - 5:27
    out they can brag, 'cause this operation
    here, one of a couple comments I want to
  • 5:27 - 5:33
    make about this. We're not going to go
    into too much detail about it. But when
  • 5:33 - 5:41
    you do the FFT. The FFT was kind of invented
    in the late '60s by Tukey and Cooley. Now
  • 5:42 - 5:48
    picture yourself in the late 60s. Long
    hair, tie dye, ladies show up to class,
  • 5:48 - 5:55
    having a drum circle afterwards. Okay, but
    at that time, computers were, slow. My
  • 5:55 - 6:02
    iPhone over there, would be like, the mega
    computer of the world. 1968. Okay, so
  • 6:02 - 6:10
    given that, these guys are trying to
    actually compute pretty important things.
  • 6:10 - 6:15
    I mean, I mean it depends on what you
    think is important. But for instance one
  • 6:15 - 6:19
    of the big, one of the big pushes in
    computational science came from Los
  • 6:19 - 6:23
    Alamos, when they were actually trying to
    compute, pressure waves to create, bombs,
  • 6:23 - 6:29
    okay? That's a slow computation. It's a
    3D PDE computation. And, extremely
  • 6:29 - 6:35
    slow. Anything you can do to speed up your
    code would be greatly appreciated, 'kay?
  • 6:35 - 6:40
    And generically, when you would compute
    this, this would be something like,
  • 6:40 - 6:46
    whatever the size of your vector you're
    working with here, costs you order N
  • 6:46 - 6:51
    squared to do this. Well, Cooley and
    Tukey came up with an algorithm that said,
  • 6:51 - 6:59
    well, actually, I can do that in N log N. By
    the way, if you can beat n log n, you'll
  • 6:59 - 7:06
    be famous. Okay? You will. Guarantee it.
    Promise you. I'll give you more than 25
  • 7:06 - 7:12
    cents if you can do it. Okay? I'll give
    you $1, and buy you lunch. Okay. So
  • 7:12 - 7:17
    there you go. So what they came up with is
    this algorithm that's pretty
  • 7:17 - 7:22
    much the standard. That's about as good
    as it gets. Unless you have a very specific
  • 7:22 - 7:27
    problem with a very specific setup, maybe you
    can beat this, but for the most part
  • 7:27 - 7:32
    that's kind of almost like a speed limit.
    And if you can beat that, you're
  • 7:32 - 7:38
    rocking it. So that's why they added the
    extra F, okay? Fast Fourier transform, and
  • 7:38 - 7:42
    we're about to use it. Couple things that
    they came up with, a couple of
  • 7:42 - 7:47
    observations. The key to this algorithm is
    that it was actually
  • 7:47 - 7:51
    a very simple observation they made. What
    they found out is because of the
  • 7:51 - 7:55
    relationship among these Fourier
    coefficients, or these
  • 7:55 - 8:00
    frequency components. What they found out
    is, hey you know what? I could take a
  • 8:00 - 8:06
    problem of size N and I could break it up
    into two problems of size N/2. Well, that
  • 8:06 - 8:11
    problem of size N over two, I can break that up
    into two problems of size N over four.
  • 8:11 - 8:15
    Right, so you take this problem of size N.
    Two problems of size N over two. Four
  • 8:15 - 8:20
    problems of size N over four. Eight
    problems of size N over eight, blah, blah,
  • 8:20 - 8:25
    blah, all the way down to N problems of
    size one by one. Do you know how easy it is to solve
  • 8:25 - 8:30
    one by one? Pretty easy. Remember, each
    problem that you solve, whatever size it
  • 8:30 - 8:37
    is, costs computational speed. So if you're
    solving the big problem of order N,
  • 8:37 - 8:42
    of size N, it's very expensive. But if I
    can solve a bunch of one-by-ones, then I
  • 8:42 - 8:47
    have to just group my answers back
    together, and that's what costs the log N:
  • 8:47 - 8:52
    it costs me N to solve the N equations of size
    one, and log N to glue it all back
  • 8:52 - 8:59
    together, okay? So that's the big
    contribution to the world, okay? So that's
  • 8:59 - 9:04
    what they did, late '60s. Fast Fourier
    transform. And at that point, the
  • 9:04 - 9:11
    algorithm really relied on the fact that
    when you take this vector, U, it has 2 to the n
  • 9:11 - 9:16
    number of points in the discretization, right?
    So you discretize it into sixteen points,
  • 9:16 - 9:21
    32 points, 64 points, 128 points, however
    many points, right, that's a power of
  • 9:21 - 9:25
    two, so you can always break it down to
    smaller and smaller problems. Nowadays,
  • 9:25 - 9:31
    that's not so important, you can almost
    get N log N; there's really fancy ways to
  • 9:31 - 9:36
    do this. Like if I use a hundred points,
    right, which is two problems of size 50,
  • 9:36 - 9:41
    and then four problems of size 25. Right,
    then all of a sudden, 25, okay, what do I
  • 9:41 - 9:45
    do with that? So then, right, you
    ran into a problem.
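The splitting just described, N into two N/2's into four N/4's, all the way down to one-by-ones, can be sketched as a short recursive function. This is a toy Python illustration, not the production algorithm; like the original radix-2 scheme it needs a power-of-two length, which is exactly why a 100-point signal was a problem:

```python
import cmath

def fft_radix2(u):
    """Toy Cooley-Tukey FFT: split the size-N problem into two
    size-N/2 problems (even- and odd-indexed samples), recurse
    down to size 1, then glue the two halves back together with
    'twiddle' factors. Requires len(u) to be a power of two."""
    n = len(u)
    if n == 1:
        return list(u)  # a one-point transform is the sample itself
    even = fft_radix2(u[0::2])
    odd = fft_radix2(u[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle            # glue: first half
        out[k + n // 2] = even[k] - twiddle   # glue: second half
    return out

# An impulse has perfectly flat frequency content:
print([abs(x) for x in fft_radix2([1, 0, 0, 0])])  # [1.0, 1.0, 1.0, 1.0]
```

For N = 2^20 points, the gluing costs roughly N log2 N, about 2 x 10^7 operations, instead of the N^2, about 10^12, of the direct sum: that's what earns the extra F.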
  • 9:45 - 9:50
    But now they pad zeros in there and they can
    pretty much get an N log N speedup. Okay,
  • 9:50 - 9:57
    and there was a hand, yes? [Student: Fidelity, when
    gluing the solutions back
  • 9:57 - 10:01
    together?] Yup, exactly. Sweet, right? I
    know.