Could someone elaborate on orthogonal polynomials?
This is the extent of what I was taught in 10 minutes in a graduate-level physics class:
For the differential equation
#g_2(x) f_n''(x) + g_1(x)f_n'(x) + lambda_n f_n(x) = 0# ,
an orthogonal polynomial #f_n(x)# could be the solution, where
#f_n(x) = sum_(k=0)^(n) a_k x^k# ,
such that:
#int_(a)^(b) w(x)f_n(x)f_m(x)dx = delta_(mn)h_n# ,
where #w(x)# is a weight function, #w(x) >= 0# , #x in [a,b]# .
For example, if #w(x) = e^(-alpha x^2)# , the domain of interest could be #(-oo,oo)# .
#delta_(mn) = {(0, " "m ne n),(1, " "m = n):}#
is the Kronecker delta.
Some examples relevant to me are:
- Associated Legendre polynomials
- Associated Laguerre polynomials
- Hermite polynomials
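As a numerical sanity check of this orthogonality relation, here is a minimal Python sketch (my own illustration, not from the class notes) using the first three physicists' Hermite polynomials, for which #w(x) = e^(-x^2)# (the question's Gaussian weight with #alpha = 1#), #(a,b) = (-oo,oo)# and #h_n = 2^n n! sqrt(pi)#:

```python
import math

# First three physicists' Hermite polynomials (standard closed forms).
hermite = [
    lambda x: 1.0,
    lambda x: 2.0 * x,
    lambda x: 4.0 * x**2 - 2.0,
]

def inner(m, n, a=-10.0, b=10.0, steps=2000):
    """Approximate int_a^b w(x) H_m(x) H_n(x) dx with w(x) = e^(-x^2)
    by the trapezoidal rule (very accurate here, since the integrand
    decays like a Gaussian and is negligible at the endpoints)."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps + 1):
        x = a + i * h
        fx = math.exp(-x * x) * hermite[m](x) * hermite[n](x)
        total += fx / 2 if i in (0, steps) else fx
    return total * h

# Off-diagonal entries vanish; diagonal entries give h_n = 2^n n! sqrt(pi).
for m in range(3):
    for n in range(3):
        h_n = 2**n * math.factorial(n) * math.sqrt(math.pi)
        assert abs(inner(m, n) - (h_n if m == n else 0.0)) < 1e-9
```

Truncating the integral to #[-10, 10]# costs only about #e^(-100)# in accuracy, so the discrete check reproduces the #delta_(mn) h_n# pattern essentially exactly.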
1 Answer
Here's a simple start of an answer...
Explanation:
Suppose we have a family of related functions #f_n(x)# (for #n = 1, 2, 3,...#), each defined on #[-1, 1]#, with the following properties:
- #int_(-1)^1 (f_n(x))^2 dx = 1#
- #int_(-1)^1 f_m(x) f_n(x) dx = 0" "# if #m!=n#
For any sequence of coefficients #a_n#, we can attempt to define a function:
#f(x) = sum_(n=1)^oo a_n f_n(x)#
If this converges suitably, then we find:
#int_(-1)^1 f_m(x) f(x) dx = int_(-1)^1 f_m(x) sum_(n=1)^oo a_n f_n(x) dx#
#color(white)(int_(-1)^1 f_m(x) f(x) dx) = sum_(n=1)^oo a_n int_(-1)^1 f_m(x) f_n(x) dx#
#color(white)(int_(-1)^1 f_m(x) f(x) dx) = a_m#
Hence we can recover the coefficients #a_n# used to construct #f(x)#.
That means that the functions #f_n(x)# behave like an orthonormal basis for a vector space of functions.
What might such a family of functions #f_n(x)# look like?
One choice would be basic trigonometric functions:
#f_1(x) = sin((pix)/2)#
#f_2(x) = cos((pix)/2)#
#f_3(x) = sin(pix)#
#f_4(x) = cos(pix)# etc.
That is:
#{ (f_(2k-1)(x) = sin((kpix)/2)), (f_(2k)(x) = cos((kpix)/2)) :}# for
#k = 1, 2, 3,...#
graph{(y-sin(pix/2))(y-cos(pix/2))(y-sin(pix))(y-cos(pix)) = 0 [-2.5, 2.5, -1.25, 1.25]}
This gives us a rich vector space of functions on #[-1, 1]#.
I think it's rich enough to include any continuous function defined on #[-1, 1]#.
For this family of functions, the expression of #f(x)# as #sum_(n=1)^oo a_n f_n(x)# is essentially a Fourier series.
We do not need to use trigonometric functions for our family. We can use polynomials of various kinds.
If we use the rather restrictive conditions given above, and apply them to polynomials of degree #0#, #1#, #2#, we get:
#f_0(x) = sqrt(2)/2#
#f_1(x) = sqrt(6)/2x#
#f_2(x) = sqrt(10)/4(3x^2-1)#
Here I have derived the first few polynomials by choosing coefficients such that the above conditions hold.
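These are, up to normalization, the first three Legendre polynomials. One systematic way to derive them is Gram-Schmidt orthogonalization of the monomials #1, x, x^2# under the integral inner product; here is a minimal sketch of that derivation (my own illustration) in exact rational arithmetic:

```python
import math
from fractions import Fraction

# Polynomials up to degree 2 as length-3 coefficient lists: p[k] = coeff of x^k.
def inner(p, q):
    """Exact int_(-1)^1 p(x) q(x) dx, using int_(-1)^1 x^k dx = 2/(k+1)
    for even k and 0 for odd k."""
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            if (i + j) % 2 == 0:
                total += a * b * Fraction(2, i + j + 1)
    return total

# Gram-Schmidt on the monomials 1, x, x^2 over [-1, 1].
monomials = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
orthogonal = []
for m in monomials:
    p = [Fraction(c) for c in m]
    for q in orthogonal:
        c = inner(p, q) / inner(q, q)   # projection coefficient onto q
        p = [a - c * b for a, b in zip(p, q)]
    orthogonal.append(p)

# Squared norms: 2, 2/3, 8/45. Dividing by their square roots gives
# exactly f_0 = sqrt(2)/2, f_1 = (sqrt(6)/2) x, f_2 = (sqrt(10)/4)(3x^2 - 1).
norms_sq = [inner(p, p) for p in orthogonal]
normalized = [[float(c) / math.sqrt(n2) for c in p]
              for p, n2 in zip(orthogonal, norms_sq)]
```

The unnormalized results are #1#, #x#, and #x^2 - 1/3#, and the exact norms make the normalization constants #sqrt(2)/2#, #sqrt(6)/2#, #sqrt(10)/4# drop out, matching the polynomials listed above.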
Note that the integral:
#int_(-1)^1 f(x) g(x) dx#
acts like a sort of dot product on our vector space of functions.
In more general form, we can use integrals of the form:
#int_a^b w(x) f(x) g(x) dx#
where #w(x) >= 0# is a weight function. Then, as in the question, we require:
#int_a^b w(x) f_m(x) f_n(x) dx = delta_(mn) h_n#
for some sequence of constants #h_n#. If #h_n = 1# for all #n#, the family is orthonormal with respect to the weighted inner product.
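As a concrete illustration of the weighted form (my own sketch, using the first three Laguerre polynomials, for which #w(x) = e^(-x)#, the interval is #[0, oo)#, and #h_n = 1#):

```python
import math

# First three Laguerre polynomials (standard closed forms).
laguerre = [
    lambda x: 1.0,
    lambda x: 1.0 - x,
    lambda x: 1.0 - 2.0 * x + 0.5 * x**2,
]

def weighted_inner(m, n, b=60.0, steps=12000):
    """Approximate int_0^b w(x) L_m(x) L_n(x) dx with w(x) = e^(-x),
    using composite Simpson's rule (steps must be even). Truncating
    the infinite interval at b = 60 costs roughly e^(-60)."""
    h = b / steps
    def g(x):
        return math.exp(-x) * laguerre[m](x) * laguerre[n](x)
    total = g(0.0) + g(b)
    for i in range(1, steps):
        total += (4 if i % 2 == 1 else 2) * g(i * h)
    return total * h / 3

# The weighted inner products reproduce delta_(mn) with h_n = 1.
for m in range(3):
    for n in range(3):
        assert abs(weighted_inner(m, n) - (1.0 if m == n else 0.0)) < 1e-6
```

The same skeleton checks any of the families in the question: swap in the polynomials, the weight #w(x)#, the interval, and the known constants #h_n#.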