Why do algorithm books generally devote so much space to FFT?
It seems like it has very narrow applicability: apart from signal processing and multiplying big integers or polynomials, what else is it useful for?
Wouldn't it be better to explain divide-and-conquer with some other algorithm having broader applicability?
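To be concrete about the polynomial case I mentioned: as I understand it, the whole trick is "evaluate both polynomials at roots of unity, multiply pointwise, interpolate back", which fits in a few lines (a rough numpy sketch of my own, not from any book, with coefficients in increasing-degree order):

```python
import numpy as np

def poly_mul(a, b):
    # Multiply two polynomials given as coefficient lists, lowest degree first.
    n = len(a) + len(b) - 1           # number of coefficients in the product
    size = 1 << (n - 1).bit_length()  # next power of two; zero-padding avoids
                                      # circular wrap-around in the convolution
    fa = np.fft.fft(a, size)          # evaluate a at `size` roots of unity
    fb = np.fft.fft(b, size)          # evaluate b at the same points
    return np.fft.ifft(fa * fb).real[:n].round(10)  # interpolate back

# (1 + 2x) * (3 + 4x) = 3 + 10x + 8x^2
print(poly_mul([1, 2], [3, 4]))  # [ 3. 10.  8.]
```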
I've always had the feeling that the FFT is taught in the wrong way, at least in CS courses. They just introduce complex roots of unity and this magic Vandermonde matrix, then state a scary "convolution theorem", and after a lot of handwaving you see that you can use it for polynomial multiplication. This gives basically no insight into why things work.
With just a little bit of linear algebra and abstraction it is easy to gain a lot of insight into the Fourier transform. For example, this is the way I see it:
* First you introduce shift-invariant linear operators (circulant ones, in the finite-dimensional case), i.e. linear operators that don't change if you shift the input vector, apply the operator, and then shift the result back.
* Then you show that the Fourier basis is THE basis that diagonalizes shift-invariant operators. This is just a matter of multiplying the Fourier vectors by the shift operator and noticing that you get back the same vectors, each scaled by a root of unity; this is where the cyclicity of the roots of unity comes into play. It also explains why the Fourier transform works for signal processing: the operations you want to perform on a signal are shift-invariant, since shifting the signal in time shouldn't change how they act.
* Then you show that convolution (or polynomial multiplication) is shift-invariant: this is just a matter of writing down a convolution and doing a change of variables.
* You are done: since convolution is diagonal in the Fourier basis, it can be applied just by multiplying coefficients pointwise in the Fourier domain (a short numerical check is sketched after this list).
* Now you introduce the FFT as an efficient way of computing the Fourier transform, but only after you've understood what the transform is doing.
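To make this concrete, here is a minimal numpy sketch (my own toy check, not from any textbook; the shift matrix `S` and the random test vectors are assumptions for illustration). It verifies numerically that the DFT matrix diagonalizes the cyclic shift operator, and that circular convolution turns into pointwise multiplication in the Fourier domain:

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)

# Cyclic shift operator as a matrix: (S @ x)[k] == x[(k - 1) % n].
S = np.roll(np.eye(n), 1, axis=0)

# DFT matrix: applying np.fft.fft to the identity yields the matrix with
# entries w**(j*k), where w = exp(-2*pi*1j/n).
F = np.fft.fft(np.eye(n))

# 1) The Fourier basis diagonalizes the shift: F^{-1} S F is diagonal
#    (its diagonal entries are roots of unity, the eigenvalues of the shift).
D = np.linalg.inv(F) @ S @ F
assert np.allclose(D, np.diag(np.diag(D)))

# 2) Convolution theorem: circular convolution computed directly agrees with
#    "transform, multiply pointwise, transform back".
x, y = rng.standard_normal(n), rng.standard_normal(n)
direct = np.array([sum(x[j] * y[(k - j) % n] for j in range(n)) for k in range(n)])
via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)).real
assert np.allclose(direct, via_fft)
```

The first assertion is exactly the "Fourier basis diagonalizes shift-invariant operators" bullet, and the second is the convolution theorem falling out of it.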
When I realized this, something "clicked" in my mind and I finally "understood" Fourier analysis.
Maybe I should expand on this a bit more and write a blog post about it without all the complications that I find in books.
I wouldn't call signal processing "narrow applicability." Signal processing matters for a wide range of computer science applications, for example any voice processing related to AI or security (think Siri!), and many programmers will at some point call an FFT without really knowing what it is (it's happened to me twice this year already). Having read the FFT bits really helped me understand what it does. It also happens to be a really cool application of divide and conquer that was absolutely groundbreaking. So from practical, historical, and cleverness standpoints, it seems like a great thing to describe. I'm not sure it's important at all to remember how it works, but having some kind of idea of how it works seems valuable.
And it probably just takes a lot of space because it's complicated. Haha.
I think the FFT is just a good case study for how to design nontrivial algorithms. The point isn't so much to teach you super useful algorithms as to show you how you can come up with them yourself, and what sort of thinking that would take. For what it's worth, the book does cover a bunch of other divide and conquer algorithms as well.
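Agreed, and the divide-and-conquer structure is compact enough to show directly. Here is a sketch of the recursive radix-2 Cooley-Tukey idea (my own toy code, not the book's presentation; it assumes the input length is a power of two):

```python
import cmath

def fft(x):
    # Radix-2 Cooley-Tukey FFT; assumes len(x) is a power of two.
    n = len(x)
    if n == 1:
        return list(x)
    # Divide: two half-size DFTs over the even- and odd-indexed samples.
    even = fft(x[0::2])
    odd = fft(x[1::2])
    # Conquer: combine them in O(n) using twiddle factors, so the running
    # time satisfies T(n) = 2*T(n/2) + O(n) = O(n log n).
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

print(fft([1, 2, 3, 4]))  # ~[10, -2+2j, -2, -2-2j], matching numpy.fft.fft
```

The recurrence is the same as mergesort's, which I think is the pedagogical payoff: two half-size subproblems plus a linear-time merge.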