Okay, by the way, there's something fundamental we're doing here that's going to be important for what we want to think about. We said we're going to talk about time-frequency analysis, right? And here's the big idea we're going to think about with Fourier. Fourier says we can represent things with sines and cosines. Well, why is that a big deal? Why does that help us? When we think about this, we're going to think about this f(x) as being a time signal. And what this Fourier transform, i.e. the Fourier expansion, allows us to do is say: instead of representing this function in the x domain, I'm going to represent it in terms of collections of cosines and sines. And in fact, what I care about are the c_n, the coefficients, okay, here. So for instance,
c_1 corresponds to n being one here. Maybe I'll illustrate it over here instead; this is a little simpler. Think about what these mean. If I think about what a_1 represents: a_1 multiplies cosine x, right? And cosine x has a certain frequency. So what I'm doing is replacing my x dependence with a bunch of frequency content: I have cosine 1x, cosine 2x, cosine 3x, cosine 4x, and so on, higher and higher frequencies. So what I want to do is take a signal in time and simply replace it by a representation of the signal as a collection of frequency components, okay? So by the way,
when you hear a bird chirp, if you have a detector for bird chirps, you would see its frequency content. You're seeing its Fourier transform. You're not seeing the pressure wave and how the signal changes in time; what you see is the spectral content of that bird chirping, okay? That's what you look at when you typically do audio engineering: frequency content. You always take the signal and represent it in terms of its frequency content. And how do you do that? Fourier transform it, okay? That's the role this is going to play. It's a different representation of the signal, okay? We're going to work with that a lot. Part of what we want to do is start thinking about: if I represent this in the frequency domain, what does that buy me? And it turns out that, for instance, in radar problems it buys you a ton, so that's what we're going to talk about Friday. Okay, questions on that so far?
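Before the questions, here's what computing those a_n and b_n coefficients actually looks like. This is a sketch of my own in Python, not something from the lecture: the square-wave test function and the midpoint-rule quadrature are just illustrative choices.

```python
import math

def fourier_coeffs(f, n_max, num=10000):
    # Approximate the Fourier coefficients of f on [-pi, pi] with a
    # midpoint-rule quadrature:
    #   a_n = (1/pi) * integral of f(x) cos(n x) dx
    #   b_n = (1/pi) * integral of f(x) sin(n x) dx
    dx = 2 * math.pi / num
    xs = [-math.pi + (k + 0.5) * dx for k in range(num)]
    a = [dx / math.pi * sum(f(x) * math.cos(n * x) for x in xs)
         for n in range(n_max + 1)]
    b = [dx / math.pi * sum(f(x) * math.sin(n * x) for x in xs)
         for n in range(n_max + 1)]
    return a, b

def square(x):
    # Square wave: +1 on [0, pi), -1 on [-pi, 0).  Its series is pure
    # sines, with b_n = 4/(n*pi) for odd n and zero for even n.
    return 1.0 if x >= 0 else -1.0

a, b = fourier_coeffs(square, 5)
```

Reading off `b`, only the odd entries are appreciably nonzero: that short list of numbers is exactly the frequency content we were just talking about.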
Yeah? [Student asks how this works when the function isn't periodic.] What's the relation? Yeah. So typically, the way we think about this is you say, look: suppose the function is defined on negative L to L, and let's just say it's some function like this. That's my function, f(x). But what the Fourier transform, or the Fourier representation, the Fourier series, does is say: well, actually, this is a periodic function, is it not? It just takes this thing here and, well, I should have drawn an easier function [LAUGH], it makes a periodic extension of it. Now, typically we're only interested in this domain here, but technically the representation is valid from negative infinity to infinity: it creates the periodic extension, and at discontinuities, you see one right there, it takes the value halfway between, as you sum to infinity. By the way, nobody in this class is going to sum to infinity. I know, you'd be here for a long time, okay? All right, we're not summing to infinity. You always say, yeah, infinity's big, right? So let's just take something big and call it good, okay? So that's going to be the idea of the Fourier transform, okay?
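You can actually watch that halfway-value behavior. A quick sketch of my own (Python; it assumes the standard square-wave series, with odd sine terms 4/(n*pi)): away from the jump the truncated sum tracks the function, but at the jump it sits exactly halfway between the two one-sided values.

```python
import math

def square_partial_sum(x, n_terms):
    # Truncated Fourier series of the square wave that is -1 on (-pi, 0)
    # and +1 on (0, pi): only odd sines survive, with coefficients 4/(n*pi).
    return sum(4 / (n * math.pi) * math.sin(n * x)
               for n in range(1, n_terms + 1, 2))

away = square_partial_sum(1.0, 999)     # away from the jump: close to +1
at_jump = square_partial_sum(0.0, 999)  # at the jump: halfway between -1 and +1
```

Every sine term vanishes at x = 0, so the partial sum lands on 0, the value halfway between the left limit of minus one and the right limit of plus one, no matter how many terms you keep.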
Now, by the way, what else are Fourier transforms good for? Let me show you. Actually, I'm going to skip that, 'cause we want to get to... there are some nice properties it has that we'll address later. But for right now, I want to make a couple of comments about this. This exists in MATLAB. You want to take a signal or a function and turn it into frequencies, okay? Suppose you have a vector in MATLAB. Actually, let's not call it F; suppose I have U, some function you've defined. If I want it in the transform domain, all I've got to do in MATLAB is say the following; let's call it UT, for U transform: UT = fft(U). See how easy that is? Nice.
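Just so the one-liner isn't magic, here's what that call computes, written out as the direct sum in Python (my own illustration; MATLAB's fft returns the same numbers but via the fast algorithm, not this O(N^2) loop):

```python
import cmath

def dft(u):
    # Direct discrete Fourier transform: output bin k sums all N samples
    # against the frequency-k complex exponential.
    N = len(u)
    return [sum(u[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

ut = dft([1, 1, 1, 1])  # transform of a constant signal
```

For the constant vector `[1, 1, 1, 1]` this gives, up to roundoff, `[4, 0, 0, 0]`: all of the signal's content sits in the zero-frequency bin, which is exactly what a frequency representation of a constant should say.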
FFT stands for fast Fourier transform. Now, why not just call it the FT, right? It seems a little braggy to call it the FFT. Like, oh, I'm super fast. Well, it turns out they can brag, because of this operation here. That's one of a couple of comments I want to make about it; we're not going to go into too much detail. The FFT was invented in the mid-60s, 1965, by Cooley and Tukey. Now picture yourself in the 60s: long hair, tie-dye, ladies show up to class, drum circle afterwards. Okay, but at that time, computers were slow. My iPhone over there would be, like, the mega-computer of the world in 1968. Okay, so
given that, these guys were trying to compute pretty important things. I mean, it depends on what you think is important, but for instance, one of the big pushes in computational science came from Los Alamos, when they were trying to compute pressure waves to create bombs, okay? That's a slow computation. It's a 3D PDE computation, and extremely slow. Anything you can do to speed up your code would be greatly appreciated, 'kay? And generically, when you compute this, whatever the size N of the vector you're working with, it costs you order N squared. Well, Cooley and Tukey came up with an algorithm that said: actually, I can do that in N log N. By the way, if you can beat N log N, you'll be famous. Okay? You will. Guarantee it.
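To put rough numbers on N squared versus N log N (my own back-of-the-envelope, counting one "operation" per term in each sum):

```python
import math

for N in (1_024, 1_048_576):          # 2**10 and 2**20 points
    direct = N * N                    # order N^2: the naive transform
    fast = N * math.log2(N)           # order N log N: Cooley-Tukey
    print(f"N = {N}: roughly {direct / fast:.0f}x fewer operations")
```

At a million points the fast version is doing on the order of fifty thousand times less work, which on a 1960s machine is the difference between runs and doesn't.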
Promise you. I'll give you more than 25 cents if you can do it. Okay? I'll give you $1 and buy you lunch. Okay, so there you go. What they came up with is the algorithm that's pretty much the standard. Unless you have a very specific problem with a very specific setup, maybe you can beat it, but for the most part it's almost like a speed limit. And if you can beat that, you're rocking it. So that's why they added the extra F, okay? Fast Fourier transform, and we're about to use it. A couple of things they came up with, a couple of
observations. The key to this algorithm is actually a very simple observation they made. What they found out is that, because of the relationship among these frequency components, these coefficients, hey, you know what? I can take a problem of size N and break it up into two problems of size N/2. Well, each problem of size N/2 I can break up into two problems of size N/4. Right? So you take this problem of size N: two problems of size N/2, four problems of size N/4, eight problems of size N/8, blah, blah, blah, all the way down to N problems of size one. Do you know how easy it is to solve a size-one problem? Pretty easy. And remember, every problem you solve, whatever size it is, has a computational cost. Solving the big problem of size N directly is very expensive, but if I can solve a bunch of size-one problems, then all I have to do is group my answers back together. The N size-one problems cost me order N, and the log N comes from gluing it all back together, level by level, 'kay? So that's the big contribution to the world. Okay, so that's
what they did, mid-60s: the fast Fourier transform. And at that point, the algorithm really relied on the fact that when you take this vector U, its discretization has a power-of-two number of points, right? You discretize into 16 points, 32 points, 64 points, 128 points, however many points, a power of two, so you can always break it down into smaller and smaller problems. Nowadays, that's not so important; you can almost always get N log N. There are really fancy ways to do this. Like, if I use a hundred points, right? That's two problems of size 50, then four problems of size 25, and then, all of a sudden, 25. Okay, what do I do with that? It doesn't split in half. So then, right, you ran into a problem.
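That whole split-recurse-glue scheme fits in a few lines. Here's a toy version of my own in Python (it assumes a power-of-two length, which is exactly why 100 points stalls at 25), checked against the direct O(N^2) sum:

```python
import cmath

def dft(u):
    # Direct O(N^2) transform, kept only as the reference answer.
    N = len(u)
    return [sum(u[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(u):
    # Cooley-Tukey radix-2: split into even- and odd-indexed halves,
    # transform each half recursively, then glue with twiddle factors.
    N = len(u)
    if N == 1:
        return list(u)
    even, odd = fft(u[0::2]), fft(u[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / N) * odd[k] for k in range(N // 2)]
    return ([even[k] + twiddled[k] for k in range(N // 2)] +
            [even[k] - twiddled[k] for k in range(N // 2)])

x = [complex(n) for n in range(8)]  # length 8 = 2^3, so it splits cleanly
```

The recursion bottoms out at the size-one problems, and the two list comprehensions at the end are the gluing step; that glue, repeated at every level, is where the log N factor comes from.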
But now they pad zeros in there, and they can pretty much get N log N speed out of it. Okay, and there was a hand, yes? [Student comments on fidelity when gluing the solutions back together.] Yup, exactly. Sweet, right? I know.
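As a footnote on that zero-padding trick: the bookkeeping is tiny. A sketch of my own (the helper name next_pow2 is made up for the example):

```python
def next_pow2(n):
    # Smallest power of two that is >= n
    p = 1
    while p < n:
        p *= 2
    return p

signal = [1.0] * 100   # 100 = 4 * 25, so repeated halving stalls at the odd size 25
padded = signal + [0.0] * (next_pow2(len(signal)) - len(signal))
# padded has length 128 and now splits in half all the way down to size one
```

One thing to keep in mind: the padded transform samples the same underlying spectrum on a finer frequency grid, so you get the speed without losing information, but its output bins are not numerically identical to those of the length-100 transform.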