## Subsection 3.1.1 Choosing the right basis

A classic problem in numerical analysis is the approximation of a function, $f: \R \rightarrow \R \text{,}$ with a polynomial of degree $n-1 \text{.}$ (The $n-1$ seems cumbersome. Think of it as a polynomial with $n$ terms.)

\begin{equation*} f( \chi ) \approx \gamma_0 + \gamma_1 \chi + \cdots + \gamma_{n-1} \chi^{n-1}. \end{equation*}

Now, often we know $f$ only "sampled" at points $\chi_0, \ldots, \chi_{m-1} \text{:}$

\begin{equation*} \begin{array}{ccc} f( \chi_0 ) \amp = \amp \phi_0 \\ \vdots \amp \vdots \amp \vdots \\ f( \chi_{m-1} ) \amp = \amp \phi_{m-1}. \end{array} \end{equation*}

In other words, the inputs to the process are the points

\begin{equation*} ( \chi_0, \phi_0 ), \cdots , ( \chi_{m-1}, \phi_{m-1} ) \end{equation*}

and we want to determine the polynomial that approximately fits these points. This means that

\begin{equation*} \begin{array}{c c c c c c c c c} \gamma_0 \amp + \amp \gamma_1 \chi_0 \amp + \amp \cdots \amp + \amp \gamma_{n-1} \chi_0^{n-1} \amp \approx \amp \phi_0 \\ \vdots \amp \vdots \amp \vdots \amp \vdots \amp \amp \vdots \amp \vdots \amp \vdots \amp \vdots \\ \gamma_0 \amp + \amp \gamma_1 \chi_{m-1} \amp + \amp \cdots \amp + \amp \gamma_{n-1} \chi_{m-1}^{n-1} \amp \approx \amp \phi_{m-1} .\\ \end{array} \end{equation*}

This can be reformulated as the approximate linear system

\begin{equation*} \left( \begin{array}{c c c c } 1 \amp \chi_0 \amp \cdots \amp \chi_0^{n-1} \\ 1 \amp \chi_1 \amp \cdots \amp \chi_1^{n-1} \\ \vdots \amp \vdots \amp \amp \vdots \\ 1 \amp \chi_{m-1} \amp \cdots \amp \chi_{m-1}^{n-1} \end{array} \right) \left( \begin{array}{c} \gamma_0 \\ \gamma_1 \\ \vdots \\ \gamma_{n-1} \end{array} \right) \approx \left( \begin{array}{c} \phi_0 \\ \phi_1 \\ \vdots \\ \phi_{m-1} \end{array} \right) . \end{equation*}

This system can be solved using the techniques for linear least-squares in Week 4. The matrix in the above equation is known as a Vandermonde matrix.
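The construction and solve can be sketched as follows. This is an illustrative NumPy example, not the course's Matlab code; the sample points and target polynomial are hypothetical choices for demonstration.

```python
# Sketch: build the Vandermonde matrix and solve the least-squares problem.
import numpy as np

def vandermonde(x, n):
    """Return the m x n matrix with entries chi_i^j for j = 0, ..., n-1."""
    X = np.ones((len(x), n))
    for j in range(1, n):
        X[:, j] = X[:, j - 1] * x   # column recurrence: x^j = x * x^(j-1)
    return X

# Example: recover f(x) = 1 + 2x + 3x^2 from exact samples.
x = np.linspace(0.0, 1.0, 50)
phi = 1 + 2 * x + 3 * x**2
X = vandermonde(x, 3)
gamma, *_ = np.linalg.lstsq(X, phi, rcond=None)
# gamma is close to [1, 2, 3]
```

Since the samples here come from an exact degree-2 polynomial, the least-squares solution recovers the coefficients to machine precision.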

###### Homework 3.1.1.1.

Choose $\chi_0, \chi_1, \cdots , \chi_{m-1}$ to be equally spaced in the interval $[ 0, 1 ] \text{:}$ for $i = 0, \ldots, m-1 \text{,}$ $\chi_i = i h \text{,}$ where $h = {1}/{(m-1)} \text{.}$ Write Matlab code to create the matrix

\begin{equation*} X = \left( \begin{array}{c c c c } 1 \amp \chi_0 \amp \cdots \amp \chi_0^{n-1} \\ 1 \amp \chi_1 \amp \cdots \amp \chi_1^{n-1} \\ \vdots \amp \vdots \amp \amp \vdots \\ 1 \amp \chi_{m-1} \amp \cdots \amp \chi_{m-1}^{n-1} \end{array} \right) \end{equation*}

as a function of $n$ with $m = 5000 \text{.}$ Plot the condition number of $X \text{,}$ $\kappa_2( X ) \text{,}$ as a function of $n$ (Matlab's function for computing $\kappa_2( X )$ is cond( X ).)

Hint

You may want to use the recurrence $x^{j+1} = x x^j$ and the fact that the .* operator in Matlab performs an element-wise multiplication.

Solution
• Here is our implementation: (link to the Matlab file omitted)

• The graph of the condition number, $\kappa( X ) \text{,}$ as a function of $n \text{:}$ (figure omitted)

• The parent functions $1, x, x^2, \ldots$ on the interval $[ 0, 1 ] \text{:}$ (figure omitted) Notice that the curves for $x^j$ and $x^{j+1}$ quickly start to look very similar, which explains why the columns of the Vandermonde matrix quickly become approximately linearly dependent.
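Since the course's Matlab file is not reproduced above, here is a NumPy sketch of the same experiment. The loop bound $n \leq 10$ is an arbitrary choice for illustration.

```python
# Sketch: condition number of the Vandermonde matrix as n grows, m = 5000.
import numpy as np

def vandermonde(x, n):
    X = np.ones((len(x), n))
    for j in range(1, n):
        X[:, j] = X[:, j - 1] * x   # x^j = x * x^(j-1), applied columnwise
    return X

m = 5000
x = np.arange(m) / (m - 1)          # equally spaced points in [0, 1]
conds = [np.linalg.cond(vandermonde(x, n)) for n in range(1, 11)]
# conds grows rapidly with n: the columns become nearly linearly dependent
```

Plotting `conds` against $n$ (e.g. with `semilogy`) shows the roughly exponential growth of $\kappa_2(X)$.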

Think about how this extends to even more columns of $X \text{.}$


An alternative set of polynomials that can be used is the set of Legendre polynomials. A shifted version (appropriate for the interval $[0,1]$) can be defined inductively by

\begin{equation*} \begin{array}{ccl} P_0( \chi ) \amp = \amp 1 \\ P_1( \chi ) \amp = \amp 2\chi - 1 \\ \vdots \amp = \amp \vdots \\ P_{n+1}( \chi ) \amp = \amp \left( (2n+1) (2\chi-1) P_n( \chi ) - n P_{n-1}( \chi ) \right) / ( n+1 ). \end{array} \end{equation*}

The polynomials have the property that

\begin{equation*} \int_{0}^{1} P_s( \chi ) P_t(\chi) d\chi = \left\{ \begin{array}{c l} \frac{1}{2s+1} \amp \mbox{ if } s=t \\ 0 \amp \mbox{ otherwise} \end{array} \right. \end{equation*}

which is an orthogonality condition on the polynomials. (They are orthogonal, but not normalized to one.)
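The recurrence and the orthogonality property can be checked numerically. The following NumPy sketch (an illustration, not part of the course materials) evaluates the shifted Legendre polynomials via the three-term recurrence and approximates the integrals with a midpoint rule; the grid size is an arbitrary choice.

```python
# Sketch: evaluate shifted Legendre polynomials and verify orthogonality on [0, 1].
import numpy as np

def shifted_legendre(n, x):
    """Evaluate P_0, ..., P_{n-1} at the points x; returns a (len(x), n) array."""
    x = np.asarray(x, dtype=float)
    P = np.ones((len(x), n))
    if n > 1:
        P[:, 1] = 2 * x - 1
    for k in range(1, n - 1):
        # P_{k+1} = ((2k+1)(2x-1) P_k - k P_{k-1}) / (k+1)
        P[:, k + 1] = ((2 * k + 1) * (2 * x - 1) * P[:, k]
                       - k * P[:, k - 1]) / (k + 1)
    return P

# Midpoint-rule approximation of the integrals \int_0^1 P_s P_t dx:
m = 100000
x = (np.arange(m) + 0.5) / m        # midpoints of a uniform grid on [0, 1]
P = shifted_legendre(4, x)
G = P.T @ P / m                     # Gram matrix approximates the integrals
# off-diagonal entries are near 0; diagonal entries are near 1/(2s+1)
```

The off-diagonal entries of `G` vanish to quadrature accuracy, confirming the orthogonality of the polynomials.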

The function $f: \R \rightarrow \R$ can now instead be approximated by

\begin{equation*} f( \chi ) \approx \gamma_0 P_0( \chi ) + \gamma_1 P_1( \chi ) + \cdots + \gamma_{n-1} P_{n-1}( \chi ), \end{equation*}

and hence given points

\begin{equation*} ( \chi_0, \phi_0 ), \cdots , ( \chi_{m-1}, \phi_{m-1} ) \end{equation*}

we can determine the polynomial from

\begin{equation*} \begin{array}{c c c c c c c c c} \gamma_0 P_0( \chi_0 ) \amp + \amp \gamma_1 P_1( \chi_0 ) \amp + \amp \cdots \amp + \amp \gamma_{n-1} P_{n-1}( \chi_0) \amp \approx \amp \phi_0 \\ \vdots \amp \vdots \amp \vdots \amp \vdots \amp \amp \vdots \amp \vdots \amp \vdots \amp \vdots \\ \gamma_0 P_0( \chi_{m-1} ) \amp + \amp \gamma_1 P_1( \chi_{m-1} ) \amp + \amp \cdots \amp + \amp \gamma_{n-1} P_{n-1}( \chi_{m-1}) \amp \approx \amp \phi_{m-1}. \end{array} \end{equation*}

This can be reformulated as the approximate linear system

\begin{equation*} \left( \begin{array}{c c c c } 1 \amp P_1(\chi_0) \amp \cdots \amp P_{n-1}( \chi_0 ) \\ 1 \amp P_1(\chi_1) \amp \cdots \amp P_{n-1}( \chi_1 ) \\ \vdots \amp \vdots \amp \amp \vdots \\ 1 \amp P_1(\chi_{m-1}) \amp \cdots \amp P_{n-1}( \chi_{m-1} ) \\ \end{array} \right) \left( \begin{array}{c} \gamma_0 \\ \gamma_1 \\ \vdots \\ \gamma_{n-1} \end{array} \right) \approx \left( \begin{array}{c} \phi_0 \\ \phi_1 \\ \vdots \\ \phi_{m-1} \end{array} \right) . \end{equation*}

This system can also be solved using the techniques for linear least-squares in Week 4. Notice that now the columns of the matrix are (approximately) orthogonal: if we "sample" $x$ at the points $\chi_0, \ldots , \chi_{m-1} \text{,}$ then

\begin{equation*} \int_{0}^{1} P_s( \chi ) P_t(\chi) d\chi \approx h \sum_{i=0}^{m-1} P_s( \chi_i ) P_t( \chi_i ), \end{equation*}

and the sum equals the dot product of the columns indexed with $s$ and $t \text{.}$

###### Homework 3.1.1.2.

Choose $\chi_0, \chi_1, \cdots , \chi_{m-1}$ to be equally spaced in the interval $[ 0, 1 ] \text{:}$ for $i = 0, \ldots, m-1 \text{,}$ $\chi_i = i h \text{,}$ where $h = {1}/{(m-1)} \text{.}$ Write Matlab code to create the matrix

\begin{equation*} X = \left( \begin{array}{c c c c } 1 \amp P_1(\chi_0) \amp \cdots \amp P_{n-1}( \chi_0 ) \\ 1 \amp P_1(\chi_1) \amp \cdots \amp P_{n-1}( \chi_1 ) \\ \vdots \amp \vdots \amp \amp \vdots \\ 1 \amp P_1(\chi_{m-1}) \amp \cdots \amp P_{n-1}( \chi_{m-1} ) \end{array} \right) \end{equation*}

as a function of $n$ with $m = 5000 \text{.}$ Plot $\kappa_2( X )$ as a function of $n \text{.}$ To check whether the columns of $X$ are mutually orthogonal, report $\| X^T X - D \|_2$ where $D$ equals the diagonal of $X^T X \text{.}$

Solution
• Here is our implementation: ShiftedLegendre.m. (Assignments/Week03/answers/ShiftedLegendre.m)

• The graph of the condition number as a function of $n \text{:}$ (figure omitted) We notice that the matrices created from shifted Legendre polynomials have very good condition numbers.

• The shifted Legendre polynomials are visualized: (figure omitted)

• The columns of the matrix $X$ are now reasonably orthogonal:

X^T * X for n=5:

```
ans =

        5000           0           1           0           1
           0        1667           0           1           0
           1           0        1001           0           1
           0           1           0         715           0
           1           0           1           0         556
```
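A NumPy sketch of this orthogonality check (the course's ShiftedLegendre.m is not reproduced here; this is an independent illustration) builds $X$ from the shifted Legendre polynomials and compares $X^T X$ with its diagonal:

```python
# Sketch: X^T X for the shifted Legendre basis is close to diagonal.
import numpy as np

def shifted_legendre_matrix(x, n):
    """Columns are P_0(x), ..., P_{n-1}(x), built by the three-term recurrence."""
    X = np.ones((len(x), n))
    if n > 1:
        X[:, 1] = 2 * x - 1
    for k in range(1, n - 1):
        X[:, k + 1] = ((2 * k + 1) * (2 * x - 1) * X[:, k]
                       - k * X[:, k - 1]) / (k + 1)
    return X

m, n = 5000, 5
x = np.arange(m) / (m - 1)          # equally spaced in [0, 1]
X = shifted_legendre_matrix(x, n)
G = X.T @ X
D = np.diag(np.diag(G))
off = np.linalg.norm(G - D, 2)      # small relative to the diagonal entries
# diagonal of G is roughly m/(2s+1), matching the Matlab output above
```

The diagonal entries approximate $m \int_0^1 P_s(\chi)^2 d\chi = m/(2s+1) \text{,}$ which explains the values 5000, 1667, 1001, 715, 556 reported above.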


###### Remark 3.1.1.1.

The point is that one ideally formulates a problem in a way that already captures orthogonality, so that when the problem is discretized ("sampled"), the matrices that arise will likely inherit that orthogonality, which, as we will see again and again, is a good thing. In this chapter, we discuss how orthogonality can be exposed if it is not already part of the underlying formulation of the problem.