Could someone elaborate on orthogonal polynomials?

This is the extent of what I was taught in 10 minutes in a graduate-level physics class:

For the differential equation

g_2(x) f_n''(x) + g_1(x) f_n'(x) + lambda_n f_n(x) = 0,

an orthogonal polynomial f_n(x) can be a solution (lambda_n being a constant that depends on n), where

f_n(x) = sum_(k=0)^(n) a_k x^k,

such that:

int_(a)^(b) w(x)f_n(x)f_m(x)dx = delta_(mn)h_n,

where w(x) is a weight function, w(x) >= 0, x in [a,b].

For example, if w(x) = e^(-alpha x^2), the domain of interest could be (-oo, oo).

delta_(mn) = {(0, " "m != n),(1, " "m = n):}

is the Kronecker delta.

Some examples relevant to me are:

  • Associated Legendre polynomials
  • Associated Laguerre polynomials
  • Hermite polynomials
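
For instance, with w(x) = e^(-x^2) (i.e. alpha = 1) the Hermite polynomials satisfy this relation with h_n = sqrt(pi) 2^n n!. Here is a minimal numerical check of that, as a sketch assuming Python with numpy/scipy is available:

```python
import numpy as np
from math import factorial, pi, sqrt
from scipy.integrate import quad
from scipy.special import hermite

# Check int_(-oo)^(oo) e^(-x^2) H_m(x) H_n(x) dx = delta_(mn) h_n
# with h_n = sqrt(pi) * 2^n * n!  (physicists' Hermite polynomials).
for m in range(4):
    for n in range(4):
        Hm, Hn = hermite(m), hermite(n)
        val, _ = quad(lambda x: np.exp(-x**2) * Hm(x) * Hn(x),
                      -np.inf, np.inf)
        expected = sqrt(pi) * 2**n * factorial(n) if m == n else 0.0
        assert abs(val - expected) < 1e-6, (m, n)
print("Hermite orthogonality relation verified for m, n < 4")
```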

1 Answer
Aug 26, 2017

Here's a simple start of an answer...

Explanation:

Suppose we have a family of related functions f_1(x), f_2(x),... all defined on the same interval [-1, 1] (say) with the properties:

  • int_(-1)^1 (f_n(x))^2 dx = 1

  • int_(-1)^1 f_m(x) f_n(x) dx = 0" " if m!=n

For any sequence of coefficients a_1, a_2,... we can attempt to define a function by summation:

f(x) = sum_(n=1)^oo a_n f_n(x)

If this converges suitably, then we find:

int_(-1)^1 f_m(x) f(x) dx = int_(-1)^1 f_m(x) sum_(n=1)^oo a_n f_n(x) dx

                          = sum_(n=1)^oo a_n int_(-1)^1 f_m(x) f_n(x) dx

                          = a_m

Hence we can recover the coefficients used to construct f(x) as a sum in a unique way.

That means that the functions f_n(x) form an orthogonal (in fact orthonormal) basis for the infinite-dimensional vector space of functions like f(x) formed by summing scalar multiples of them.
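
As a quick numerical illustration of this recovery, here is a sketch (assuming Python with numpy/scipy) that builds f(x) from three arbitrarily chosen coefficients and then recovers them by integration, using the orthonormal polynomials f_0, f_1, f_2 derived further down in this answer:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre
from scipy.integrate import quad

# Orthonormal family on [-1, 1]: scaled Legendre polynomials
# f_n(x) = sqrt((2n+1)/2) P_n(x)  (these match f_0, f_1, f_2 below).
def f(n):
    return lambda x: np.sqrt((2 * n + 1) / 2) * Legendre.basis(n)(x)

a = [0.5, -1.0, 2.0]          # coefficients chosen arbitrarily

def func(x):
    # f(x) = sum a_n f_n(x)
    return sum(a_n * f(n)(x) for n, a_n in enumerate(a))

# Recover each a_m as int_(-1)^1 f_m(x) f(x) dx
for m in range(3):
    a_m, _ = quad(lambda x: f(m)(x) * func(x), -1, 1)
    print(m, round(a_m, 6))   # prints 0.5, -1.0, 2.0
```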

What might such a family of functions f_n(x) look like?

One choice would be basic trigonometric functions:

f_1(x) = sin(pix)

f_2(x) = cos(pix)

f_3(x) = sin(2pix)

f_4(x) = cos(2pix)

etc.

That is:

{ (f_(2k-1)(x) = sin(kpix)), (f_(2k)(x) = cos(kpix)) :}

for k = 1, 2, 3,...

graph{(y-sin(pix))(y-cos(pix))(y-sin(2pix))(y-cos(2pix)) = 0 [-2.5, 2.5, -1.25, 1.25]}
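
It is easy to check numerically that this family satisfies both conditions above; a sketch, again assuming numpy/scipy:

```python
import numpy as np
from scipy.integrate import quad

# Trig family on [-1, 1]: f_(2k-1)(x) = sin(k pi x), f_(2k)(x) = cos(k pi x)
def f(n):
    k = (n + 1) // 2
    if n % 2:
        return lambda x: np.sin(k * np.pi * x)
    return lambda x: np.cos(k * np.pi * x)

# Verify unit norm (m = n) and orthogonality (m != n)
for m in range(1, 7):
    for n in range(1, 7):
        fm, fn = f(m), f(n)
        val, _ = quad(lambda x: fm(x) * fn(x), -1, 1)
        assert abs(val - (1.0 if m == n else 0.0)) < 1e-8, (m, n)
print("family is orthonormal for m, n <= 6")
```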

This gives us a rich vector space of functions on [-1, 1]; odd, even, or a combination of both.

Together with the constant function f_0(x) = sqrt(2)/2, I think it's rich enough to approximate (in the mean-square sense) any continuous function defined on [-1, 1].

For this family of functions, the expression of f(x) as a sum of these f_n(x)'s is essentially its Fourier series. This is useful for analysing periodic waveforms, breaking them down into their harmonic constituents.
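
For example, here is a sketch (assuming numpy/scipy) computing the first few sine coefficients of f(x) = x on [-1, 1]; they match the classical Fourier result a_(2k-1) = 2(-1)^(k+1)/(kpi):

```python
import numpy as np
from scipy.integrate import quad

# Coefficients of f(x) = x in the orthonormal trig family above:
# a_(2k-1) = int_(-1)^1 x sin(k pi x) dx  (cosine terms vanish: x is odd)
for k in range(1, 5):
    a, _ = quad(lambda x: x * np.sin(k * np.pi * x), -1, 1)
    expected = 2 * (-1) ** (k + 1) / (k * np.pi)
    print(k, round(a, 6), round(expected, 6))   # the two columns agree
```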

We do not need to use trigonometric functions for our family. We can use polynomials of various kinds.

If we use the rather restrictive conditions given above, and start with f_0(x) being of degree 0, then we get a family of polynomials that starts:

f_0(x) = sqrt(2)/2

f_1(x) = sqrt(6)/2 x

f_2(x) = sqrt(10)/4(3x^2-1)

Here I have derived the first few polynomials by choosing coefficients such that the above conditions hold. Up to normalisation these are the Legendre polynomials: in general, f_n(x) = sqrt((2n+1)/2) P_n(x).
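
The same polynomials can also be produced mechanically by Gram-Schmidt orthonormalisation of the monomials 1, x, x^2 with respect to the integral above; a sketch, assuming numpy/scipy:

```python
import numpy as np
from scipy.integrate import quad

def inner(p, q):
    # <p, q> = int_(-1)^1 p(x) q(x) dx  (coefficients highest degree first)
    return quad(lambda x: np.polyval(p, x) * np.polyval(q, x), -1, 1)[0]

basis = []                               # the orthonormal f_0, f_1, f_2
for deg in range(3):
    p = np.zeros(deg + 1)
    p[0] = 1.0                           # start from the monomial x^deg
    for q in basis:                      # subtract projections onto earlier f_n
        p = np.polysub(p, inner(p, q) * q)
    p = p / np.sqrt(inner(p, p))         # normalise so <p, p> = 1
    basis.append(p)
    print(np.round(p, 6))
# prints [0.707107], [1.224745 0.], [2.371708 0. -0.790569],
# i.e. sqrt(2)/2, sqrt(6)/2 x, and sqrt(10)/4 (3x^2 - 1).
```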

Note that the integral:

int_(-1)^1 f(x) g(x) dx

acts like a dot product (an inner product) on our vector space of functions.

In more general form, we can use integrals of the form:

int_a^b w(x) f(x) g(x) dx

where w(x) is a fixed weighting function, and require our family of functions to satisfy:

int_a^b w(x) f_m(x) f_n(x) dx = delta_(mn) h_n

for some sequence of constants h_n.
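
This is exactly the relation quoted in the question. For example, the Laguerre polynomials satisfy it with w(x) = e^(-x) on [0, oo) and h_n = 1; a final numerical sketch, assuming scipy:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import laguerre

# Weighted orthogonality: int_0^oo e^(-x) L_m(x) L_n(x) dx = delta_(mn)
# (for the Laguerre polynomials, h_n = 1 for every n).
for m in range(4):
    for n in range(4):
        Lm, Ln = laguerre(m), laguerre(n)
        val, _ = quad(lambda x: np.exp(-x) * Lm(x) * Ln(x), 0, np.inf)
        assert abs(val - (1.0 if m == n else 0.0)) < 1e-8, (m, n)
print("Laguerre relation verified for m, n < 4")
```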