# Finding Fourier coefficients

Rather than proving Fourier's scandal for any arbitrary periodic function, let's start with a more modest goal.

**Problem statement**
If we know that

$$ f(x) \;=\; A_0 + \sum_{n=1}^{\infty} A_n \sin(2\pi n x + \varphi_n), $$
then how do we find the values of the coefficients $A_n$ and phases $\varphi_n$?

Notice the subtle difference: we're not trying to show that any periodic function *can* be written this way; we're assuming it can, and asking what the coefficients must be.

## Massaging into a better form

There are many ways to express a sum of sinusoids. The one we wrote above – a sum of sines with differing amplitudes and phases – is the easiest to intuitively understand, but it's the hardest to work with algebraically. That is, the expressions for the coefficients are going to be pretty messy and unenlightening. So it's well worth our time to massage the expression into other equivalent forms.

### Sines and cosines

The first thing we can do is use the sum and difference formulas to rewrite a sine with a phase shift as a sum of a sine and a cosine:

$$ A_n \sin(2\pi n x + \varphi_n) \;=\; a_n \cos(2\pi n x) + b_n \sin(2\pi n x), \qquad a_n = A_n \sin\varphi_n, \quad b_n = A_n \cos\varphi_n. $$
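As a quick numerical sanity check of this identity, here's a small Python sketch; the particular values of $A_n$, $\varphi_n$, $n$, and $x$ below are arbitrary test choices, not anything from the lecture:

```python
import math

# Arbitrary test values: amplitude, phase, frequency index, and sample point
A, phi, n, x = 1.7, 0.6, 3, 0.23

# Coefficients from the sum/difference formula:
#   A*sin(2*pi*n*x + phi) = A*sin(phi)*cos(2*pi*n*x) + A*cos(phi)*sin(2*pi*n*x)
a = A * math.sin(phi)  # multiplies cos(2*pi*n*x)
b = A * math.cos(phi)  # multiplies sin(2*pi*n*x)

lhs = A * math.sin(2 * math.pi * n * x + phi)
rhs = a * math.cos(2 * math.pi * n * x) + b * math.sin(2 * math.pi * n * x)
print(abs(lhs - rhs) < 1e-12)  # True: both forms agree
```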
This form is a wee bit nicer, partly because we don't have the phases floating around in the arguments of our sinusoids anymore. The information about amplitude and phase is now contained in the coefficients $a_n$ and $b_n$. Making this change, our sum becomes

$$ f(x) \;=\; c_0 + \sum_{n=1}^{\infty} \big( a_n \cos(2\pi n x) + b_n \sin(2\pi n x) \big). $$
With the benefit of hindsight, we've renamed the constant term $A_0$ to $c_0$; the reason for the name will become clear shortly.

### Imaginary Exponentials

We can do better than this. By far, the best way (mathematically) to work with sinusoids is in terms of imaginary exponentials.

**Remark**
Using complex exponentials is a common ‘cute’ trick in math and physics. I remember when I first encountered them in solving for E&M waves, it felt pretty awkward and uncomfortable; I didn't really know what I was writing. I never really grew out of the awkwardness; like many things in life, I guess once you see it enough, you just get used to it. Now that I think about it, there are a few places where it's actually quite enlightening and helpful to think in terms of complex exponentials. When you're talking about damped harmonic motion (such as a spring-and-mass system in a bowl of syrup), the motion looks like a sinusoid inside a decaying exponential envelope; it's a bit nicer to just think in terms of an exponential with a complex exponent, where the real part gives the decaying envelope and the imaginary part gives the frequency. Or when you're talking about electromagnetic waves propagating in a dielectric (i.e., light travelling through a material), the real and imaginary parts of the dielectric constant correspond to a phase shift (light speeding up or slowing down) inside the material and to absorptive losses that cause the light intensity to decay exponentially. Okay, enough of a tangent; back to the class.

Remember that Euler's formula relates imaginary exponentials to sinusoids:

$$ e^{i\theta} = \cos\theta + i\sin\theta. $$
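Euler's formula is easy to spot-check numerically; here's a minimal Python sketch (the angle is an arbitrary test value):

```python
import cmath
import math

theta = 1.234  # arbitrary test angle
lhs = cmath.exp(1j * theta)                      # e^{i*theta}
rhs = complex(math.cos(theta), math.sin(theta))  # cos(theta) + i*sin(theta)
print(abs(lhs - rhs) < 1e-12)  # True
```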
We can visualize this fact in the complex plane: as $\theta$ increases, the point $e^{i\theta}$ travels counterclockwise around the unit circle, and its real and imaginary components trace out $\cos\theta$ and $\sin\theta$.
**Remark**
I should say a bit more about how to interpret complex numbers like $e^{i\theta}$: such a number is a point on the unit circle in the complex plane, sitting at angle $\theta$ from the positive real axis. Functions such as $e^{2\pi i n x}$ can then be pictured as a point sweeping around the unit circle as $x$ increases, completing $n$ full revolutions for every unit of $x$. One other thing to note about functions of the form $e^{2\pi i n x}$: when $n$ is negative, the point sweeps around the circle in the opposite (clockwise) direction, so negative frequencies have a perfectly sensible geometric meaning.

If we flip the sign of the exponent in Euler's formula, we get an expression for $e^{-i\theta}$:

$$ e^{-i\theta} = \cos\theta - i\sin\theta. $$

Adding and subtracting the two versions of Euler's formula lets us solve for the sinusoids themselves:

$$ \cos\theta = \frac{e^{i\theta} + e^{-i\theta}}{2}, \qquad \sin\theta = \frac{e^{i\theta} - e^{-i\theta}}{2i}. $$
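Again, these two expressions are easy to spot-check numerically (a Python sketch with an arbitrary test angle):

```python
import cmath
import math

theta = 0.81  # arbitrary test angle
cos_from_exp = (cmath.exp(1j * theta) + cmath.exp(-1j * theta)) / 2
sin_from_exp = (cmath.exp(1j * theta) - cmath.exp(-1j * theta)) / 2j
print(abs(cos_from_exp - math.cos(theta)) < 1e-12)  # True
print(abs(sin_from_exp - math.sin(theta)) < 1e-12)  # True
```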
Notice that functions of positive $\theta$ turned into functions of both positive and negative $\theta$. As a result, the range of the frequencies expands: instead of summing over $n = 1, 2, 3, \ldots$, we'll sum over all integers, negative $n$ included. After doing a bit of algebraic tickling, we can now express our periodic function as

$$ f(x) \;=\; \sum_{n=-\infty}^{\infty} c_n e^{2\pi i n x}, $$
where the new (complex!) coefficients are related to the old real ones by

$$ c_n = \frac{a_n - i b_n}{2}, \qquad c_{-n} = \frac{a_n + i b_n}{2} \qquad (n \geq 1), $$

with $c_0$ equal to the constant term we renamed earlier.
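To convince ourselves that the exponential form really does reproduce the sine/cosine form, here's a Python sketch with a short, arbitrarily chosen series (the coefficient values are made-up test data):

```python
import cmath
import math

# Made-up real coefficients for a finite test series
c0 = 0.5                   # constant term
a = {1: 1.0, 2: -0.3}      # cosine coefficients a_n
b = {1: 0.7, 2: 0.2}       # sine coefficients b_n

# Complex coefficients: c_n = (a_n - i*b_n)/2, c_{-n} = (a_n + i*b_n)/2
c = {0: complex(c0)}
for n in a:
    c[n] = (a[n] - 1j * b[n]) / 2
    c[-n] = (a[n] + 1j * b[n]) / 2

x = 0.37  # arbitrary evaluation point
real_form = c0 + sum(a[n] * math.cos(2 * math.pi * n * x)
                     + b[n] * math.sin(2 * math.pi * n * x) for n in a)
exp_form = sum(cn * cmath.exp(2j * math.pi * n * x) for n, cn in c.items())
print(abs(exp_form - real_form) < 1e-12)  # True: the two forms agree
```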
Notice that we are now expressing our real function $f$ as a sum of complex functions. For the sum to come out real, the coefficients can't be arbitrary: they must come in conjugate pairs, $c_{-n} = \overline{c_n}$.

## Solving for the coefficients

After all this massaging, our problem can be phrased in a much more tractable and concise way:

**Problem statement**
Suppose we are given a periodic function $f(x)$ with period 1, which we assume can be written as

$$ f(x) \;=\; \sum_{n=-\infty}^{\infty} c_n e^{2\pi i n x}. $$
How do we find the complex coefficients $c_n$?

It's time to perform the classic derivation of the Fourier coefficients. Our goal is to solve for one particular coefficient: say, the $k$-th coefficient $c_k$.
If we shuffle around the terms so that the $n = k$ term stands alone, we get

$$ f(x) \;=\; c_k e^{2\pi i k x} + \sum_{n \neq k} c_n e^{2\pi i n x}. $$
To get $c_k$ by itself, we multiply both sides by $e^{-2\pi i k x}$ and rearrange:

$$ c_k \;=\; f(x)\, e^{-2\pi i k x} \;-\; \sum_{n \neq k} c_n e^{2\pi i (n-k) x}. $$
However, this expression isn't particularly useful for us yet, because all we've done by this point is express the one unknown coefficient $c_k$ in terms of all the other unknown coefficients.

### Fourier's Trick

To proceed further, we'll pull a little rabbit out of a hat, and do a little bit of calculus to make the second term on the RHS disappear. David Griffiths likes to call this step ‘‘Fourier's Trick’’ because it's pretty clever and kind of magical. Later on in the course, I'm guessing we'll justify it in terms of the orthogonality of the basis functions, but for now, it's just a magic trick.

The trick is to integrate both sides of the equation from 0 to 1. There's no real motivation for why we want to do this quite yet (until we learn about orthogonality!), but at the least, we can justify the limits of integration 0 and 1 by appealing to the periodicity of $f$: everything in sight repeats with period 1, so one full period is the natural interval to integrate over.

Anyways, we have three terms to integrate, and we'll consider them one by one. The term on the LHS, $\int_0^1 c_k \, dx$, is just $c_k$, since $c_k$ is a constant. The first term on the RHS will look like $\int_0^1 f(x)\, e^{-2\pi i k x}\, dx$; we can't simplify it any further without knowing $f$, so we leave it alone. The last term on the RHS is where the magic happens: every single term in the sum becomes zero when you integrate from 0 to 1! To see why, consider what happens when you integrate any one of the terms. We can pull out the constant $c_n$:

$$ \int_0^1 c_n e^{2\pi i (n-k) x} \, dx \;=\; c_n \int_0^1 e^{2\pi i (n-k) x} \, dx, $$
which we can evaluate like any other exponential integral:

$$ \int_0^1 e^{2\pi i (n-k) x} \, dx \;=\; \left[ \frac{e^{2\pi i (n-k) x}}{2\pi i (n-k)} \right]_0^1 \;=\; \frac{e^{2\pi i (n-k)} - 1}{2\pi i (n-k)}. $$
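We can also check this orthogonality fact numerically. Here's a Python sketch using a simple midpoint-rule quadrature (the step count and the sample value of $n - k$ are arbitrary choices):

```python
import cmath

def integral_01(m, steps=20000):
    """Midpoint-rule approximation of the integral of e^{2*pi*i*m*x} over [0, 1]."""
    dx = 1.0 / steps
    return sum(cmath.exp(2j * cmath.pi * m * (j + 0.5) * dx)
               for j in range(steps)) * dx

print(abs(integral_01(3)) < 1e-9)      # True: the integral vanishes when n - k = 3 != 0
print(abs(integral_01(0) - 1) < 1e-9)  # True: the integral equals 1 when n = k
```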
Now the crucial insight: since $n \neq k$, the difference $n - k$ is a nonzero integer, and $e^{2\pi i m} = 1$ for any integer $m$. So the numerator $e^{2\pi i (n-k)} - 1$ vanishes, and every term of the sum integrates to zero.

So to summarize: when we integrate both sides of the expression for $c_k$, the LHS becomes $c_k$, the sum on the RHS drops out entirely, and we're left with the following.

**Result**
If a periodic function $f(x)$ with period 1 can be written as

$$ f(x) \;=\; \sum_{n=-\infty}^{\infty} c_n e^{2\pi i n x}, $$
then the Fourier coefficients are given by

$$ c_k \;=\; \int_0^1 f(x)\, e^{-2\pi i k x} \, dx. $$
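As a sanity check of the whole result, we can approximate this integral numerically for a function whose coefficients we already know. In this Python sketch, the test function $f(x) = 1 + \cos(2\pi x)$ is an arbitrary choice; its exact coefficients are $c_0 = 1$, $c_{\pm 1} = 1/2$, and all others zero:

```python
import cmath
import math

def fourier_coefficient(f, k, steps=20000):
    """Midpoint-rule approximation of c_k = integral of f(x)*e^{-2*pi*i*k*x} over [0, 1]."""
    dx = 1.0 / steps
    return sum(f((j + 0.5) * dx) * cmath.exp(-2j * cmath.pi * k * (j + 0.5) * dx)
               for j in range(steps)) * dx

f = lambda x: 1 + math.cos(2 * math.pi * x)  # test function with known coefficients

for k, expected in [(0, 1.0), (1, 0.5), (-1, 0.5), (2, 0.0)]:
    assert abs(fourier_coefficient(f, k) - expected) < 1e-6
print("all coefficients recovered")
```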
There are a lot of things to be said about this statement, but class time was over, so we'll have to hear them next time around.