Jay Taylor's notes


Why is the Fourier transform so important?

Original source (dsp.stackexchange.com)
Tags: math fft fast-fourier-transform discrete-fourier-transform dft fourier-transform signal-processing spectral-analysis frequency-analysis dsp.stackexchange.com
Clipped on: 2016-08-08

Everyone discusses the Fourier transform when discussing signal processing. Why is it so important to signal processing and what does it tell us about the signal?

Does it only apply to digital signal processing or does it apply to analog signals as well?

asked Aug 19 '11 at 4:40
jcolebrand
Recently a discussion about Fourier transforms was revived on math.SE, and I thought that people on this site might find some of it worthwhile, and might even want to participate. – Dilip Sarwate Oct 14 '11 at 1:35
    
cf. this answer for some excellent historical background. Fourier series date at least as far back as Ptolemy's epicyclic astronomy. Adding more eccentrics and epicycles, akin to adding more terms to a Fourier series, one can account for any continuous motion of an object in the sky. – Geremia Jan 11 at 2:56

This is quite a broad question, and it is indeed hard to pinpoint why exactly Fourier transforms are important in signal processing. The simplest, hand-waving answer one can give is that it is an extremely powerful mathematical tool that allows you to view your signals in a different domain, in which several difficult problems become very simple to analyze.

Its ubiquity in nearly every field of engineering and the physical sciences, all for different reasons, makes it all the harder to narrow down one reason. I hope that looking at some of its properties that led to its widespread adoption, along with some practical examples and a dash of history, might help one understand its importance.

History:

To understand the importance of the Fourier transform, it is important to step back a little and appreciate the power of the Fourier series put forth by Joseph Fourier. In a nutshell, any periodic function $g(x)$ integrable on the domain $D = [-\pi, \pi]$ can be written as an infinite sum of sines and cosines as

$$g(x) = \sum_{k=-\infty}^{\infty} \tau_k e^{\jmath k x}$$
$$\tau_k = \frac{1}{2\pi} \int_{D} g(x)\, e^{-\jmath k x}\, dx$$

where $e^{\jmath\theta} = \cos(\theta) + \jmath\sin(\theta)$. This idea that a function could be broken down into its constituent frequencies (i.e., into sines and cosines of all frequencies) was a powerful one and forms the backbone of the Fourier transform.
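
As a small numerical illustration of the series (my own sketch, not part of the original answer; the use of numpy, the square-wave example, and the number of harmonics are all assumptions), the coefficients $\tau_k$ can be approximated by numerical integration and the partial sum rebuilt from a finite number of harmonics:

```python
import numpy as np

# Approximate the Fourier series coefficients of a square wave on [-pi, pi]:
#   tau_k = (1/(2*pi)) * integral of g(x) * exp(-j*k*x) dx
x = np.linspace(-np.pi, np.pi, 4001)
g = np.sign(np.sin(x))                      # square wave with period 2*pi

K = 25                                      # keep harmonics k = -K .. K
ks = np.arange(-K, K + 1)
tau = np.array([np.trapz(g * np.exp(-1j * k * x), x) / (2 * np.pi) for k in ks])

# Rebuild the signal from the partial sum; plotting g_hat against g shows the
# approximation (up to Gibbs ringing near the jumps).
g_hat = np.real(sum(t_k * np.exp(1j * k * x) for k, t_k in zip(ks, tau)))

# For this square wave, |tau_1| should come out close to 2/pi.
print(abs(tau[ks == 1][0]), 2 / np.pi)
```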

The Fourier transform:

The Fourier transform can be viewed as an extension of the above Fourier series to non-periodic functions. For completeness and for clarity, I'll define the Fourier transform here. If $x(t)$ is a continuous, integrable signal, then its Fourier transform, $X(f)$, is given by

$$X(f) = \int_{\mathbb{R}} x(t)\, e^{-\jmath 2\pi f t}\, dt, \quad f \in \mathbb{R}$$

and the inverse transform is given by

$$x(t) = \int_{\mathbb{R}} X(f)\, e^{\jmath 2\pi f t}\, df, \quad t \in \mathbb{R}$$

Importance in signal processing:

First and foremost, a Fourier transform of a signal tells you what frequencies are present in your signal and in what proportions.

Example: Have you ever noticed that each of your phone's number buttons sounds different when you press it during a call, and that each button sounds the same on every phone model? That's because each one is composed of two different sinusoids which can be used to uniquely identify the button. When you use your phone to punch in combinations to navigate a menu, the way the other party knows which keys you pressed is by doing a Fourier transform of the input and looking at the frequencies present.
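
To make that example concrete, here is a minimal sketch of my own (assuming numpy and a synthetic key press; 770 Hz + 1336 Hz is the standard DTMF pair for the key "5") that recovers the two tones directly from the FFT:

```python
import numpy as np

fs = 8000                                    # sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)                # half a second of a key press
low, high = 770.0, 1336.0                    # DTMF tone pair for the key "5"
tone = np.sin(2 * np.pi * low * t) + np.sin(2 * np.pi * high * t)

spectrum = np.abs(np.fft.rfft(tone))
freqs = np.fft.rfftfreq(tone.size, 1 / fs)

# The two strongest spectral peaks identify the pair of sinusoids, and hence the key.
peaks = np.sort(freqs[np.argsort(spectrum)[-2:]])
print(peaks)                                 # [ 770. 1336.]
```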

Apart from some very useful elementary properties which make the mathematics involved simple, some of the other reasons why it has such widespread importance in signal processing are:

  1. The magnitude squared of the Fourier transform, $|X(f)|^2$, instantly tells us how much power the signal $x(t)$ has at a particular frequency $f$.
  2. From Parseval's theorem (more generally Plancherel's theorem), we have
    $$\int_{\mathbb{R}} |x(t)|^2\, dt = \int_{\mathbb{R}} |X(f)|^2\, df$$
    which means that the total energy in a signal across all time is equal to the total energy in the transform across all frequencies. Thus, the transform is energy preserving.
  3. Convolutions in the time domain are equivalent to multiplications in the frequency domain, i.e., given two signals $x(t)$ and $y(t)$, if

    $$z(t) = x(t) \ast y(t)$$
    where $\ast$ denotes convolution, then the Fourier transform of $z(t)$ is merely

    $$Z(f) = X(f)\,Y(f)$$

    For discrete signals, with the development of efficient FFT algorithms, it is almost always faster to implement a convolution operation in the frequency domain than in the time domain (see the sketch after this list).

  4. Similar to the convolution operation, cross-correlations are also easily implemented in the frequency domain as $Z(f) = X(f)\,Y^*(f)$, where $^*$ denotes the complex conjugate.
  5. By being able to split signals into their constituent frequencies, one can easily block out certain frequencies selectively by nullifying their contributions.

    Example: If you're a football (soccer) fan, you might've been annoyed at the constant drone of the vuvuzelas that pretty much drowned out all the commentary during the 2010 World Cup in South Africa. However, the vuvuzela has a constant pitch of ~235 Hz, which made it easy for broadcasters to implement a notch filter to cut off the offending noise.[1]

  6. A shifted (delayed) signal in the time domain manifests as a phase change in the frequency domain. While this falls under the elementary property category, it is a widely used property in practice, especially in imaging and tomography applications.

    Example: When a wave travels through a heterogeneous medium, it slows down and speeds up according to changes in the speed of wave propagation in the medium. So by comparing the expected phase with the measured phase, one can infer the excess time delay, which in turn tells you how much the wave speed has changed in the medium. This is, of course, a very simplified layman's explanation, but it forms the basis for tomography.

  7. Derivatives of signals ($n$th derivatives too) can be easily calculated using Fourier transforms.
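
Here is the sketch referred to in point 3 (my own illustration, assuming numpy): multiplying FFTs, with enough zero padding, reproduces the direct linear convolution.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
y = rng.standard_normal(4096)

# Direct (time-domain) linear convolution.
z_direct = np.convolve(x, y)

# Same result via the convolution theorem: multiply FFTs, then invert.
# Zero-pad to at least len(x) + len(y) - 1 samples so that the circular
# convolution computed by the DFT equals the linear convolution.
n = len(x) + len(y) - 1
z_fft = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(y, n), n)

print(np.allclose(z_direct, z_fft))          # True
```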

Digital signal processing (DSP) vs. Analog signal processing (ASP)

The theory of Fourier transforms is applicable irrespective of whether the signal is continuous or discrete, as long as it is "nice" and absolutely integrable. So yes, ASP uses Fourier transforms, as long as the signals satisfy this criterion. However, in ASP it is perhaps more common to talk about the Laplace transform, which is a generalization of the Fourier transform. The Laplace transform is defined as

$$X(s) = \int_{0}^{\infty} x(t)\, e^{-st}\, dt, \quad s \in \mathbb{C}$$

The advantage is that one is not necessarily confined to "nice signals" as in the Fourier transform, but the transform is valid only within a certain region of convergence. It is widely used in studying/analyzing/designing LC/RC/LCR circuits, which in turn are used in radios/electric guitars, wah-wah pedals, etc.
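
To make the circuit remark concrete, here is a small sketch of my own (the 1 kΩ / 1 µF component values and the use of scipy are assumptions): it evaluates the transfer function $H(s) = 1/(RCs + 1)$ of a first-order RC low-pass along the $\jmath\omega$ axis.

```python
import numpy as np
from scipy import signal

# First-order RC low-pass: H(s) = 1 / (R*C*s + 1), evaluated on the j*omega axis.
R, C = 1e3, 1e-6                           # 1 kOhm, 1 uF -> cutoff ~159 Hz
b, a = [1.0], [R * C, 1.0]

w = 2 * np.pi * np.logspace(0, 5, 500)     # 1 Hz .. 100 kHz, in rad/s
w, h = signal.freqs(b, a, worN=w)

f_c = 1 / (2 * np.pi * R * C)
print(f"cutoff ~ {f_c:.1f} Hz")

# Gain at the cutoff frequency should be about -3 dB (i.e. 1/sqrt(2)).
idx = np.argmin(np.abs(w - 2 * np.pi * f_c))
print(20 * np.log10(np.abs(h[idx])))
```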


This is pretty much all I could think of right now, but do note that no amount of writing/explanation can fully capture the true importance of Fourier transforms in signal processing and in science/engineering.

answered Aug 19 '11 at 7:50
Lorem Ipsum
Nice answer giving some real-world applications of the FT and its properties. +1. – goldenmean Aug 19 '11 at 10:31
    
"This idea that a function could be broken down into its constituent frequencies was a powerful one" I think the Taylor series was first. en.wikipedia.org/wiki/Taylor_series#History en.wikipedia.org/wiki/Fourier_analysis#History "a Fourier transform of a signal tells you what frequencies are present in your signal and in what proportions." Like a prism shows you what frequencies of light are present in a light source – endolith Aug 19 '11 at 14:10
@endolith I didn't say the Fourier transform was first, just that it is powerful. Note that a Taylor series is not an expansion in terms of the constituent frequencies. E.g., the Taylor series of $\sin(\alpha x)$ about $0$ is $\alpha x - \alpha^3 x^3/3! + \alpha^5 x^5/5! - \cdots$, whereas the Fourier transform of $\sin(\alpha x)$ is $[\delta(\omega - \alpha) - \delta(\omega + \alpha)]/(2\jmath)$ (give or take some normalization factors). The latter is the correct frequency representation, so I'm not sure any comparison with the Taylor series is apt here. – Lorem Ipsum Aug 19 '11 at 14:34
When I started reading this response, somehow I knew @yoda wrote it before I scrolled down to see who it actually was = ) – Phonon Aug 19 '11 at 16:06
Peter K's point is really critical. Signals can be represented with respect to many different bases. Sines and cosines are special because they are the eigenfunctions of LTI systems. – nibot Mar 24 '12 at 19:01

Lorem Ipsum's great answer misses one thing: The Fourier transform decomposes signals into constituent complex exponentials:

$$e^{\jmath\omega t}$$

and complex exponentials are the eigenfunctions of linear, time-invariant systems.

Put simply, if a system $H$ is linear and time-invariant, then its response to a complex exponential will be a complex exponential of the same frequency but (possibly) different phase, $\phi$, and amplitude, $A$ (and the amplitude may be zero):

$$y = H[e^{\jmath\omega t}] = A\, e^{\jmath\phi}\, e^{\jmath\omega t}$$

So the Fourier transform is a useful tool for analyzing linear, time-invariant systems.
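
A quick numerical way to see this eigenfunction property (a sketch of my own, assuming numpy; the FIR impulse response is an arbitrary choice) is to push a complex exponential through a discrete-time LTI system and check that the steady-state output is the same exponential scaled by the frequency response:

```python
import numpy as np

# A discrete-time LTI system: an arbitrary FIR filter with impulse response h.
h = np.array([0.5, 1.0, 0.25, -0.1])
omega = 0.3 * np.pi                          # test frequency (rad/sample)

n = np.arange(200)
x = np.exp(1j * omega * n)                   # complex exponential input

# Steady-state output (skip the first len(h)-1 transient samples).
y = np.convolve(x, h)[len(h) - 1:len(x)]

# The system's frequency response at omega: H(e^{j*omega}) = sum_k h[k] e^{-j*omega*k}
H = np.sum(h * np.exp(-1j * omega * np.arange(len(h))))

# The output is the same exponential, scaled by the (complex) eigenvalue H.
print(np.allclose(y, H * x[len(h) - 1:len(x)]))   # True
```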

answered Aug 25 '11 at 20:38
Peter K.
    
excellent point! – nibot Aug 25 '11 at 20:45
    
@Peter K. I think that, following the philosophy of choosing (academic) correctness over "popularity" of an answer, your answer should be integrated into the above answer provided by Lorem Ipsum, which, despite being selected as the answer with 96 points by the users, lacks this very important point of view. – Fat32 Feb 15 at 20:11

In addition to Peter's answer, there is another reason, which is also related to eigenfunctions: $e^{kx}$ is the eigenfunction of the differential operator $\frac{d^n}{dx^n}$. That's why the Fourier transform (corresponding to purely imaginary $k$) and the Laplace transform (corresponding to complex $k$) can be used to solve differential equations.

Since $e^{kx}$ is an eigenfunction of both the convolution operator and the differential operator, maybe that's one of the reasons why LSIV (linear shift-invariant) systems can be represented by differential equations.

EDIT: As a matter of fact, differential (and integral) operators are LSIV operators, see here.
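
As a small aside of my own (a sketch assuming numpy, not part of the original answer), this eigenfunction property is exactly what makes "differentiate by multiplying the spectrum by $\jmath k$" work for smooth periodic functions:

```python
import numpy as np

# Spectral differentiation: d/dx in the Fourier domain is multiplication by j*k.
N = 256
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.sin(3 * x) + 0.5 * np.cos(5 * x)      # smooth periodic test function

k = np.fft.fftfreq(N, d=1.0 / N)             # integer wavenumbers ..., -1, 0, 1, ...
df = np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

exact = 3 * np.cos(3 * x) - 2.5 * np.sin(5 * x)
print(np.max(np.abs(df - exact)))            # tiny (machine-precision accurate)
```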

answered Nov 15 '12 at 18:01
chaohuang

Another reason:

It's fast (e.g. useful for convolution), due to its linearithmic time complexity (specifically, that of the FFT).
I would argue that, if this were not the case, we would probably be doing a lot more in the time domain, and a lot less in the Fourier domain.

Edit: Since people asked me to write why the FFT is fast...

It's because it cleverly avoids doing extra work.

To give a concrete example of how it works, suppose you were multiplying two polynomials, $a_0 x^0 + a_1 x^1 + \cdots + a_n x^n$ and $b_0 x^0 + b_1 x^1 + \cdots + b_n x^n$.

If you were to do this naively (using the FOIL method), you would need approximately $n^2$ arithmetic operations (give or take a constant factor).

However, we can make a seemingly mundane observation: in order to multiply two polynomials, we don't need to FOIL the coefficients. Instead, we can simply evaluate the polynomials at a (sufficient) number of points, do a pointwise multiplication of the evaluated values, and then interpolate to get back the result.

Why is that useful? After all, each polynomial has $n$ terms, and if we were to evaluate each one at $2n$ points, that would still result in $\sim n^2$ operations, so it doesn't seem to help.

But it does, if we do it correctly! Evaluating a single polynomial at many points at once is faster than evaluating it at those points individually, if we evaluate at the "right" points. What are the "right" points?

It turns out those are the roots of unity (i.e. all complex numbers $z$ such that $z^n = 1$). If we choose to evaluate the polynomial at the roots of unity, then a lot of expressions will turn out the same (because a lot of monomials will turn out to be the same). This means we can do their arithmetic once, and re-use it thereafter for evaluating the polynomial at all the other points.

We can do a very similar process for interpolating through the points to get back the polynomial coefficients of the result, just by using the inverse roots of unity.

There's obviously a lot of math I'm skipping here, but effectively, the FFT is basically the algorithm I just described, to evaluate and interpolate the polynomials.
One of its uses, as I showed, was to multiply polynomials in a lot less time than normal. It turns out that this saves a tremendous amount of work, bringing the running time down to being proportional to $n \log n$ (i.e. linearithmic) instead of $n^2$ (quadratic).

Thus the ability to use the FFT to perform a typical operation (such as polynomial multiplication) much faster is what makes it useful, and that is also why people are now excited by MIT's new discovery of the Sparse FFT algorithm.
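
Here is a minimal sketch of that idea (my own, assuming numpy): the forward DFT evaluates the coefficient vectors at the roots of unity, one pointwise multiplication does the "FOIL" work, and the inverse DFT interpolates the product's coefficients back.

```python
import numpy as np

# Multiply two polynomials (coefficient arrays, lowest degree first) via the FFT.
a = np.array([1.0, 2.0, 3.0])        # 1 + 2x + 3x^2
b = np.array([4.0, 0.0, -1.0])       # 4 - x^2

n = len(a) + len(b) - 1              # number of coefficients in the product
prod = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b, n), n)

print(np.round(prod, 10))            # [ 4.  8. 11. -2. -3.]
print(np.convolve(a, b))             # same thing, computed directly
```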

answered May 8 '12 at 4:37
Mehrdad
    
What is linearithmic time complexity? I won't downvote this answer but I don't think it adds anything of value to this discussion on Fourier transforms. – Dilip Sarwate May 8 '12 at 11:05
@DilipSarwate I suspect he's using it as shorthand for O(n*log(n)). – Jim Clay May 8 '12 at 12:50
    
@DilipSarwate: Jim is right. It has everything to do with (discrete) Fourier transforms. Without the FFT, your Fourier transforms would take time proportional to the square of the input size, which would make them a lot less useful. But with the FFT, they take time proportional the size of the input (times its logarithm), which makes them much more useful, and which speeds up a lot of calculations. Also this might be an interesting read. – Mehrdad May 8 '12 at 13:01
    
You should mention WHY it's fast. Where is it fast, and why do we care that it's fast? – CyberMen May 8 '12 at 15:03
I think that this answer is legitimate. It should be paraphrased: "Besides all the nice characteristics explained in other people's answers, the FFT makes this a feasible approach in real-time applications". – Andrey Rubshtein Nov 15 '12 at 18:30

The other people have given great, useful answers. Just think about some signal: you only care what frequencies are in it (and their phases), not about the time domain. I do not think that this is a final or complete answer, but just another reason why the Fourier transform is useful.

When you have some signal, it could be composed of an (almost) infinite number of frequencies, depending on your sampling rate. But that's usually not the case: we know that most signals contain the fewest frequencies possible, or that we're sampling at a high enough rate.

If we know that, why can't we use it? That's what the field of compressed sensing does. They know that the most likely signal is one that has the least error and has the fewest frequencies. So, they minimize the overall error relative to our measurements as well as the magnitude of the Fourier transform.

A signal of a few frequencies often has a Fourier transform that is mostly zeros (aka "sparse," as they say in compressed sensing). A signal of one frequency just has a delta function as its transform, for example.
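
A quick numerical check of that sparsity (my own sketch, assuming numpy and a synthetic two-tone signal):

```python
import numpy as np

fs = 1000
t = np.arange(0, 1, 1 / fs)
# A signal made of just two frequencies...
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x)
# ...is "sparse" in the Fourier domain: almost every bin is (numerically) zero.
significant = np.sum(np.abs(X) > 1e-6)
print(significant, "of", X.size, "bins are non-negligible")   # 2 of 501
```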

We can use the formal mathematical definition too.

$$\bar{x} = \arg\min_x \; \|y - Ax\| + \lambda \|F(x)\|$$

Here, all we're doing is minimizing the error (the first $\|\cdot\|$ term) and minimizing the Fourier transform (the second $\|\cdot\|$ term). Here, we have

  • $\bar{x}$, our reconstructed signal (most likely close to the original)
  • $y$, our measurements
  • $A$, a selection matrix
  • $x$, our signal
  • $\lambda$, some constant
  • $F(x)$, the Fourier transform.

You may recall that Nyquist said that you have to measure at twice the highest frequency to get a good representation. Well, that was assuming you had infinite frequencies in your signal. We can get past that!

The field of compressed sensing can reconstruct any signal that's mostly zeros (or sparse) in some domain. Well, that's the case for the Fourier transform.

answered Apr 22 '13 at 0:09
Scott

The main importance of the Fourier transform lies with system analysis. The main constituent of our universe is vacuum, and vacuum is a fundamentally linear and time-invariant carrier of fields: different fields superimpose by adding their respective vectors, and regardless of when you repeat the application of certain fields, the outcome will be the same.

As a consequence, a lot of systems also involving physical matter are to a good approximation behaving as linear, time-invariant systems.

Such LTI systems can be described by their "impulse response", and the response to any time-distributed signal is described by convolving the signal with the impulse response.

Convolution is a commutative and associative operation, but it is also quite computationally and conceptually expensive. However, the convolution of functions is mapped by the Fourier transform into pointwise multiplication.

That means that the properties of linear time invariant systems and their combinations are much better described and manipulated after Fourier transformation.

As a result, things like "frequency response" are quite characteristic for describing the behavior of a lot of systems and become useful for characterizing them.

Fast Fourier transforms are in the "almost, but not quite, entirely unlike Fourier transforms" class, as their results are not really sensibly interpretable as Fourier transforms even though they are firmly rooted in that theory. They correspond to Fourier transforms completely only when talking about a sampled signal with the periodicity of the transform interval. In particular, the "periodicity" criterion is almost never met.

There are several techniques for working around that, like the use of overlapping windowing functions.
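
To illustrate why windowing helps (a sketch of my own, assuming numpy; the 123.4 Hz tone is an arbitrary choice), compare the spectral leakage of an off-bin tone with a rectangular window versus a Hann window:

```python
import numpy as np

fs = 1000
t = np.arange(0, 1, 1 / fs)
# A tone that does not fall exactly on a DFT bin: the implicit periodic
# extension has a jump at the frame boundary, so the spectrum "leaks".
x = np.sin(2 * np.pi * 123.4 * t)

rect = np.abs(np.fft.rfft(x))                       # no window (rectangular)
hann = np.abs(np.fft.rfft(x * np.hanning(len(x))))  # Hann window

def leakage(mag, tone_bin=123, guard=10):
    """Spectral magnitude more than `guard` bins away from the tone, relative to the peak."""
    far = np.abs(np.arange(mag.size) - tone_bin) > guard
    return np.sum(mag[far]) / np.max(mag)

print(leakage(rect), leakage(hann))   # the Hann-windowed leakage is far smaller
```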

However, the FFT can be employed for doing discrete-time convolution when things are done right, and since it is an efficient algorithm, that makes it useful for a lot of things.

One can employ the basic FFT algorithm also for number theoretic transforms (which work in discrete number fields rather than complex "reals") in order to do fast convolution, like when multiplying humongous numbers or polynomials. In this case, the "frequency domain" is indistinguishable from white noise for basically any input and has no useful interpretation before you do the inverse transform again.

answered Apr 2 '14 at 10:00
David

Some of the other answers in this thread have excellent mathematical discussions of the definition and properties of the Fourier transform; as an audio programmer, I merely want to provide my own personal intuition as to why it's important to me.

The Fourier transform permits me to answer questions about a sound that are difficult or impossible to answer with other methods. It makes hard problems easy.

A recording contains a set of three musical notes. What are the notes? If you leave the recording as a set of amplitudes over time, this is not an easy problem. If you convert the recording to a set of frequencies over time, it's really easy.
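
For instance (a toy sketch of my own, assuming numpy and pure synthetic tones rather than a real recording), the three strongest FFT peaks of a C-major chord map straight onto note names:

```python
import numpy as np

fs = 44100
t = np.arange(0, 1.0, 1 / fs)
# A toy "recording": a C-major chord of three pure tones (C4, E4, G4).
freqs_true = [261.63, 329.63, 392.00]
x = sum(np.sin(2 * np.pi * f * t) for f in freqs_true)

mag = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1 / fs)

# Take the three strongest spectral peaks and convert them to note names.
top = freqs[np.argsort(mag)[-3:]]
names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
for f in sorted(top):
    midi = int(round(69 + 12 * np.log2(f / 440.0)))
    print(f"{f:7.2f} Hz -> {names[midi % 12]}{midi // 12 - 1}")   # C4, E4, G4
```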

I want to change the pitch of a recording without changing its duration. How do I do this? It's possible, but not easy to do, by just manipulating the amplitude of an input signal. But it's easy if you know the frequencies that comprise the signal.

Does this recording contain speech or does it contain music? Super hard to do using only amplitude-based methods. But there are good solutions that guess the right answer nearly all of the time based on the Fourier transform and its family.

Almost every question you'd like to ask about a digital audio recording is made easier by transforming the recording using a discrete version of the Fourier transform.

In practice, every modern digital audio device relies heavily on functions very similar to the Fourier transform.

Again, forgive the highly informal description; this is merely my personal intuition as to why the Fourier transform is important.

answered Mar 8 '15 at 21:11
johnwbyrd

The physical relevance of the Fourier transform is that it tells you the relative amplitudes of the frequencies present in the signal. It can be defined for both discrete-time and continuous-time signals. Any signal can be represented as a mixture of many harmonic frequencies. The Fourier transform helps in filter applications, where we need only a certain range of frequencies: there, we first need to know which frequencies, and with what amplitudes, the signal contains.
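
A crude sketch of that filtering idea (my own, assuming numpy; real filters are usually designed far more carefully than a brick-wall mask): zero the unwanted bins of the transform and invert it.

```python
import numpy as np

fs = 1000
t = np.arange(0, 1, 1 / fs)
# A wanted 50 Hz component mixed with an unwanted 200 Hz component.
x = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 200 * t)

X = np.fft.rfft(x)
f = np.fft.rfftfreq(x.size, 1 / fs)

# Crude "brick-wall" low-pass: keep only bins below 100 Hz, then invert.
X[f >= 100] = 0
x_filtered = np.fft.irfft(X, len(x))

# Only the 50 Hz component survives.
print(np.allclose(x_filtered, np.sin(2 * np.pi * 50 * t), atol=1e-10))   # True
```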

answered Nov 13 '14 at 16:53

site design / logo © 2016 Stack Exchange Inc; user contributions licensed under cc by-sa 3.0 with attribution required