22.11 A Problem in Three Independent Variables

This example addresses a much more complex case. It involves three independent variables and eigenfunctions that turn out to be Bessel functions.


22.11.1 The physical problem

Find the unsteady heat conduction in a disk if the perimeter is insulated. The initial temperature is given.

[Figure: a disk of radius $a$, with an insulated perimeter and a given initial temperature.]


22.11.2 The mathematical problem


\begin{displaymath}
\frac{u_t}{\kappa} = u_{rr} + \frac{1}{r}\, u_r + \frac{1}{r^2}\, u_{\vartheta\vartheta}
\qquad\qquad
u_r(a,\vartheta,t) = 0
\qquad\qquad
u(r,\vartheta,0) = f(r,\vartheta)
\end{displaymath}

We will solve this problem using separation of variables, in the form

\begin{displaymath}
u(r,\vartheta,t) =
\sum_n \left(\sum_m u_{nm}(t) R_{nm}(r)\right) \Theta_n(\vartheta)
\end{displaymath}

The eigenfunctions $\Theta_n$ will get rid of the $\vartheta$ variable in the partial differential equation, and the eigenfunctions $R_{nm}$ will get rid of the $r$ variable, leaving ordinary differential equations for the Fourier coefficients $u_{nm}(t)$.


22.11.3 Step 1: Find the eigenfunctions

Let's start trying to get rid of one variable first. We might try a solution of the form

\begin{displaymath}
u(r,\vartheta,t) = \sum_n u_n(\vartheta,t) R_n(r)
\end{displaymath}

where the $R_n$ would be the eigenfunctions and the $u_n(\vartheta,t)$ the corresponding Fourier coefficients. Unfortunately, if we try to substitute a single term of the form $C(\vartheta,t) R_n(r)$ into the homogeneous partial differential equation, we cannot take all $r$ terms to one side of the equation and all $\vartheta$ and $t$ terms to the other. So we do not get a Sturm-Liouville problem for $R_n$.

Try again, this time

\begin{displaymath}
u(r,\vartheta,t) = \sum_n u_n(r,t) \Theta_n(\vartheta)
\end{displaymath}

If we substitute $C(r,t) \Theta(\vartheta)$ into the homogeneous partial differential equation $u_t/\kappa = u_{rr} + u_r/r + u_{\vartheta\vartheta}/r^2$, we get:

\begin{displaymath}
\frac1\kappa \dot C \Theta =
C''\Theta + \frac{1}{r} C'\Theta + \frac{1}{r^2} C \Theta''
\end{displaymath}

This, fortunately, can be separated:

\begin{displaymath}
r^2 \frac{C''}{C} + r \frac{C'}{C} - r^2 \frac{\dot C}{\kappa C}
= - \frac{\Theta''}{\Theta} = \hbox{ constant } = \lambda
\end{displaymath}

So we have a Sturm-Liouville problem for $\Theta$:

\begin{displaymath}
- \Theta'' = \lambda \Theta
\end{displaymath}

with boundary conditions that are periodic with period $2\pi$. This problem was already fully solved in 7.38: it gave the standard Fourier series for a function of period $2\pi$. In particular, the eigenfunctions were $\cos(n\vartheta)$, $n = 0,1,2,\ldots$, and $\sin(n\vartheta)$, $n = 1,2,\ldots$.

As in 7.38, to cut down on writing, we will denote those eigenfunctions compactly as $\Theta^i_n$, where $\Theta^1_n\equiv \cos(n\vartheta)$ and $\Theta^2_n\equiv \sin(n\vartheta)$.

So we can concisely write

\begin{displaymath}
u = \sum_{n,i} u^i_n(r,t) \Theta^i_n(\vartheta)
\end{displaymath}

Now, if you put this into the partial differential equation, you will see that you get rid of the $\vartheta$ coordinate as usual, but that still leaves you with $r$ and $t$. So instead of ordinary differential equations in $t$, you get partial differential equations involving both $r$ and $t$ derivatives. That is not good enough.

We must go one step further: we also need to expand each Fourier coefficient $u^i_n(r,t)$ in a generalized Fourier series in $r$:

\begin{displaymath}
u(r,\vartheta,t) =
\sum_{n,i} \left(\sum_m u^i_{nm}(t) R^i_{nm}(r)\right) \Theta^i_n(\vartheta)
\end{displaymath}

Now, if you put a single term of the form $T^i_n(t) R^i_n(r)
\Theta^i_n(\vartheta)$ into the homogeneous partial differential equation, you get

\begin{displaymath}
\frac1\kappa \dot T^i_n R^i_n \Theta^i_n =
T^i_n {R^i_n}''\Theta^i_n
+ \frac{1}{r} T^i_n {R^i_n}'\Theta^i_n
+ \frac{1}{r^2} T^i_n R^i_n {\Theta^i_n}''
\end{displaymath}

Since ${\Theta^i_n}'' = -\lambda \Theta^i_n = -n^2 \Theta^i_n$, this is separable:

\begin{displaymath}
\frac{\dot T^i_n}{\kappa T^i_n} =
\frac{{R^i_n}''}{R^i_n}
+ \frac{{R^i_n}'}{r R^i_n}
- \frac{n^2}{r^2} = \hbox{ constant } = - \mu_{n}
\end{displaymath}

So we get a Sturm-Liouville problem for $R^i_n$ with eigenvalue $\mu_n$

\begin{displaymath}
r^2 R^i_n\strut'' + r R^i_n\strut' + (\mu_n r^2 - n^2) R^i_n = 0
\end{displaymath}

with again the same homogeneous boundary conditions as $u$:

\begin{displaymath}
R^i_n \hbox{ regular at } r=0 \qquad R^i_n\strut'(a) = 0
\end{displaymath}

We need to find all solutions to this problem.

Unfortunately, the ordinary differential equation above does not have constant coefficients, so we cannot write a characteristic equation. However, we have seen the special case $\mu_n = 0$ before, in 7.38: it was an Euler equation. We found there that the only solutions regular at $r = 0$ are $A_n r^n$. But here, the only solution of that form that also satisfies the boundary condition ${R^i_n}' = 0$ at $r = a$ is the case $n = 0$. So, for $\mu_n = 0$, we only get a single eigenfunction

\begin{displaymath}
R_{00} = 1
\end{displaymath}

For the case $\mu_n\ne 0$, the trick is to define a stretched $r$ coordinate $\rho$ as

\begin{displaymath}
\rho = \sqrt{\mu_n}\, r
\quad\quad\Rightarrow\quad\quad
\rho^2 \frac{{\rm d}^2 R^i_n}{{\rm d}\rho^2}
+ \rho \frac{{\rm d}R^i_n}{{\rm d}\rho}
+ (\rho^2 - n^2) R^i_n = 0
\end{displaymath}

This equation can be found in any mathematical handbook in the section on Bessel functions. It says there that the solutions are the Bessel functions of the first kind, $J_n$, and of the second kind, $Y_n$:

\begin{displaymath}
R^i_n = A_n J_n(\sqrt{\mu_n} r) + B_n Y_n(\sqrt{\mu_n} r)
\end{displaymath}

Now we need to apply the boundary conditions. If you look up the graphs of the functions $Y_n$, or their power series around the origin, you will see that they are all singular at $r = 0$. So regularity at $r = 0$ requires $B_n = 0$.
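For those who want a quick numerical check rather than a graph, here is a minimal Python sketch (assuming scipy is available) showing that $Y_n$ blows up near the origin while $J_n$ stays finite:

\begin{verbatim}
# Illustrative check: Y_n blows up at the origin, J_n stays finite,
# so regularity at r = 0 forces B_n = 0.
from scipy.special import jv, yv    # Bessel functions J_n and Y_n

for r in (1e-1, 1e-3, 1e-6):
    print(f"r = {r:g}:  J_1(r) = {jv(1, r):.3e}   Y_1(r) = {yv(1, r):.3e}")
\end{verbatim}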

The boundary condition at the perimeter is

\begin{displaymath}
{R^i_n}'(a) = 0 = A_n \sqrt{\mu_n} J_n'(\sqrt{\mu_n} a)
\end{displaymath}

Since $\mu_n$ is nonzero, nontrivial solutions only occur if

\begin{displaymath}
J_n'(\sqrt{\mu_n} a) = 0
\end{displaymath}

Now if you look up the graphs of the various functions $J_0$, $J_1$, $\ldots$, you will see that they are all oscillatory functions, like decaying sines, and have infinitely many maxima and minima where the derivative is zero.

[Figure: the Bessel functions $J_0$, $J_1$, $J_2$, \ldots, each oscillating like a decaying sine with infinitely many stationary points.]

Each of the extremal points gives you a value of $\mu_n$, so you will get an infinite sequence of values $\mu_{n1}$, $\mu_{n2}$, $\mu_{n3}$, $\ldots$, $\mu_{nm}$, $\ldots$. There is no simple formula for these values, but you can read them off from the graph. Better still, you can find them in tables for low values of $n$ and $m$. (Schaum's gives a table containing both the zeros of the Bessel functions and the zeros of their derivatives.)
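If no table is at hand, you can also compute the stationary points numerically. The following minimal Python sketch (assuming scipy is available; the radius $a$ and the number of modes shown are just illustrative choices) uses scipy.special.jnp_zeros, which returns the positive zeros of $J_n'$:

\begin{verbatim}
# Compute sqrt(mu_nm)*a as the positive stationary points of J_n.
# Note: jnp_zeros excludes x = 0, so mu_00 = 0 must be added by hand.
import numpy as np
from scipy.special import jnp_zeros

a = 1.0                      # disk radius (illustrative value)
n_max, m_max = 3, 4          # number of angular and radial modes to list

for n in range(n_max + 1):
    x_nm = jnp_zeros(n, m_max)          # positive zeros of J_n'
    mu_nm = (x_nm / a)**2               # eigenvalues mu_nm
    print(f"n = {n}: mu_nm =", np.round(mu_nm, 4))
\end{verbatim}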

So the $r$-eigenvalues and eigenfunctions are:

\begin{displaymath}
\begin{array}{cccc}
\mu_{n1} & \mu_{n2} & \mu_{n3} & \ldots \\
R^i_{n1} = J_n\left(\sqrt{\mu_{n1}}\, r\right) &
R^i_{n2} = J_n\left(\sqrt{\mu_{n2}}\, r\right) &
R^i_{n3} = J_n\left(\sqrt{\mu_{n3}}\, r\right) & \ldots
\end{array}
\end{displaymath}

where $m$ is the counter over the nonzero stationary points of $J_n$. To include the special case $\mu_n = 0$, we can simply add $\mu_{00} = 0$, $R^i_{00} = J_0(0) = 1$ to the list above.

In the case of negative $\mu_n$, the Bessel function $J_n$ of imaginary argument becomes a modified Bessel function $I_n$ of real argument, and looking at the graphs of those, you see that they have no stationary points for positive argument, so there are no negative eigenvalues.
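A quick numerical confirmation, as a minimal sketch assuming scipy: the derivative $I_n'$ stays positive for positive argument, so $I_n'\bigl(\sqrt{-\mu_n}\,a\bigr) = 0$ has no root:

\begin{verbatim}
# Check that I_n'(x) > 0 for x > 0, so negative eigenvalues are impossible.
import numpy as np
from scipy.special import ivp        # derivative of modified Bessel I_n

x = np.linspace(0.01, 20.0, 2000)
for n in range(4):
    print(n, bool(ivp(n, x, 1).min() > 0.0))   # True for each n
\end{verbatim}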


22.11.4 Step 2: Solve the problem

We again expand all the variables in the problem, the unknown temperature $u$ and the given initial temperature $f$, in generalized Fourier series.


Let's start with the initial condition:

\begin{displaymath}
f(r,\vartheta) =
\sum_{n,i} \sum_m
f^i_{nm} \Theta^i_n(\vartheta) J_n(\sqrt{\mu_{nm}} r)
\end{displaymath}

To find the Fourier coefficients $f^i_{nm}$, we need orthogonality for both the $r$ and $\vartheta$ eigenfunctions. Now the ordinary differential equation for the $\Theta$ eigenfunctions was in standard form,

\begin{displaymath}
- \Theta'' = \lambda \Theta
\end{displaymath}

but the one for $R^i_n$ was not:

\begin{displaymath}
r^2 R^i_n\strut'' + r R^i_n\strut' - n^2 R^i_n = - \mu_n r^2 R^i_n
\end{displaymath}

The derivative of the first coefficient, $2r$, does not equal the coefficient $r$ of the first-derivative term, so the equation is not yet in self-adjoint form. To fix it up, we must divide the equation by $r$. That makes the weight factor that we need to put in the orthogonality relationship equal to $r$.

As a result, our orthogonality relation for the Fourier coefficients of initial condition $f(r,\vartheta)$ becomes

\begin{displaymath}
f^i_{nm} =
\frac{\displaystyle
\int_0^a
J_n(\sqrt{\mu_{nm}}\, r)
\left[\int_0^{2\pi} f(r,\vartheta)\, \Theta^i_n(\vartheta) \,{\rm d}\vartheta\right]
r \,{\rm d}r}
{\displaystyle
\int_0^a J_n^2(\sqrt{\mu_{nm}}\, r)\, r \,{\rm d}r
\int_0^{2\pi} \Theta^{i2}_n(\vartheta) \,{\rm d}\vartheta}
\end{displaymath}

The integral within the square brackets turns $f(r,\vartheta)$ into its $\vartheta$-Fourier coefficient $f^i_n(r)$, and the outer integral turns that coefficient into its generalized $r$-Fourier coefficient $f^i_{nm}$. Note that the total numerator is an integral of $f$ over the area of the disk against a mode shape $J_n(\sqrt{\mu_{nm}}\, r)\, \Theta^i_n(\vartheta)$.

The $r$-integral in the denominator can be worked out using Schaum's Mathematical Handbook 24.88/27.88:

\begin{displaymath}
\int_0^a J_n^2(\sqrt{\mu_{nm}}\, r)\, r \,{\rm d}r =
\left(\frac{a^2}{2} - \frac{n^2}{2 \mu_{nm}}\right)
J_n^2\left(\sqrt{\mu_{nm}}\, a\right)
\end{displaymath}

(with the second term set to zero for the special case $\mu_{00} = 0$.)
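As a sanity check of that handbook formula, here is a small Python sketch (with illustrative values of $n$, $m$, and $a$) comparing numerical quadrature against the closed form:

\begin{verbatim}
# Compare numerical quadrature of J_n^2 r with the closed-form expression.
import numpy as np
from scipy.special import jv, jnp_zeros
from scipy.integrate import quad

a, n, m = 1.0, 2, 3
k = jnp_zeros(n, m)[m - 1] / a                     # sqrt(mu_nm)
lhs, _ = quad(lambda r: jv(n, k * r)**2 * r, 0.0, a)
rhs = (a**2 / 2 - n**2 / (2 * k**2)) * jv(n, k * a)**2
print(lhs, rhs)                                    # should agree closely
\end{verbatim}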

Hence, while awkward, there is no fundamental problem in evaluating as many $f^i_{nm}$ as you want numerically. We will therefore consider them now ``known''.
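As a concrete illustration, here is a minimal Python sketch of such a numerical evaluation, assuming scipy; the initial temperature $f$, the radius $a$, and the indices in the final line are just example choices:

\begin{verbatim}
# Evaluate the generalized Fourier coefficients f^i_nm of the initial
# condition numerically, using the orthogonality relation above.
import numpy as np
from scipy.special import jv, jnp_zeros
from scipy.integrate import dblquad

a = 1.0                                                # disk radius (assumed)
f = lambda r, th: (1.0 - r**2 / a**2) * np.cos(th)     # example initial condition

def sqrt_mu(n, m):
    """sqrt(mu_nm): m-th positive stationary point of J_n, divided by a;
    the special case mu_00 = 0 is handled separately."""
    if n == 0 and m == 0:
        return 0.0
    return jnp_zeros(n, m)[m - 1] / a

def f_coef(i, n, m):
    """f^i_nm, with Theta^1_n = cos(n th) (i = 1), Theta^2_n = sin(n th) (i = 2)."""
    k = sqrt_mu(n, m)
    Theta = (lambda th: np.cos(n * th)) if i == 1 else (lambda th: np.sin(n * th))
    # numerator: integral of f against the mode shape, with weight factor r
    num, _ = dblquad(lambda th, r: f(r, th) * Theta(th) * jv(n, k * r) * r,
                     0.0, a, 0.0, 2.0 * np.pi)
    # denominator: closed-form Bessel integral times the Theta norm
    r_norm = a**2 / 2 if k == 0.0 else \
        (a**2 / 2 - n**2 / (2 * k**2)) * jv(n, k * a)**2
    th_norm = 2.0 * np.pi if (i == 1 and n == 0) else np.pi
    return num / (r_norm * th_norm)

print(f_coef(1, 1, 1))    # f^1_11 for the example f above
\end{verbatim}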

Next we expand the desired temperature in a generalized Fourier series:

\begin{displaymath}
u(r,\vartheta,t) =
\sum_{n,i} \sum_m
u^i_{nm}(t) \Theta^i_n(\vartheta) J_n(\sqrt{\mu_{nm}} r)
\end{displaymath}

Put this into the partial differential equation $u_t/\kappa = u_{rr} + u_r/r + u_{\vartheta\vartheta}/r^2$:

\begin{displaymath}
\begin{array}{l}
\displaystyle
\frac1\kappa \sum_{n,i} \sum_m \dot u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r) = \\
\displaystyle\qquad
\sum_{n,i} \sum_m u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)''
+ \frac{1}{r} \sum_{n,i} \sum_m u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)'
+ \frac{1}{r^2} \sum_{n,i} \sum_m u^i_{nm}(t)\, \Theta^i_n(\vartheta)''\, J_n(\sqrt{\mu_{nm}}\, r)
\end{array}
\end{displaymath}

Because of the SL equation satisfied by the $\Theta^i_n$:

\begin{displaymath}
\begin{array}{l}
\displaystyle
\frac1\kappa \sum_{n,i} \sum_m \dot u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r) = \\
\displaystyle\qquad
\sum_{n,i} \sum_m u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)''
+ \frac{1}{r} \sum_{n,i} \sum_m u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)'
- \sum_{n,i} \sum_m \frac{n^2}{r^2}\, u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)
\end{array}
\end{displaymath}

Because of the SL equation satisfied by the $J_n$:

\begin{displaymath}
\begin{array}{l}
\displaystyle
\frac1\kappa \sum_{n,i} \sum_m \dot u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r) = \\
\displaystyle\qquad
- \sum_{n,i} \sum_m \mu_{nm}\, u^i_{nm}(t)\, \Theta^i_n(\vartheta)\, J_n(\sqrt{\mu_{nm}}\, r)
\end{array}
\end{displaymath}

Hence the ordinary differential equation for the Fourier coefficients is:

\begin{displaymath}
\dot u^i_{nm} + \kappa \mu_{nm} u^i_{nm} = 0
\end{displaymath}

with solution:

\begin{displaymath}
u^i_{nm}(t) = u^i_{nm}(0) e^{-\kappa \mu_{nm} t}
\end{displaymath}

At time zero, the series expansion for $u$ must be the same as the one for the given initial condition $f$:

\begin{displaymath}
u^i_{nm}(0) = f^i_{nm}
\end{displaymath}

Hence we have found the Fourier coefficients of $u$ and solved the problem.


22.11.5 Summary of the solution



Find the set $\sqrt{\mu_{nm}}\, a$ of positive stationary points of the Bessel functions $J_n$, $n = 0,1,2,\ldots$, and add $\mu_{00} = 0$.

Find the generalized Fourier coefficients of the initial condition:

\begin{displaymath}
f^1_{0m} =
\frac
{\displaystyle
\int_0^{2\pi} \int_0^a f(r,\vartheta)\, J_0\left(\sqrt{\mu_{0m}}\, r\right) r \,{\rm d}r \,{\rm d}\vartheta}
{\displaystyle \pi a^2 J_0^2\left(\sqrt{\mu_{0m}}\,a\right)}
\end{displaymath}


\begin{displaymath}
f^1_{nm} =
\frac
{\displaystyle 2\mu_{nm} \int_0^{2\pi} \int_0^a f(r,\vartheta)\, \cos(n\vartheta)\,
J_n\left(\sqrt{\mu_{nm}}\, r\right) r \,{\rm d}r \,{\rm d}\vartheta}
{\displaystyle \pi\left(\mu_{nm} a^2 - n^2\right)J_n^2\left(\sqrt{\mu_{nm}}\,a\right)}
\end{displaymath}


\begin{displaymath}
f^2_{nm} =
\frac
{\displaystyle 2\mu_{nm} \int_0^{2\pi} \int_0^a f(r,\vartheta)\, \sin(n\vartheta)\,
J_n\left(\sqrt{\mu_{nm}}\, r\right) r \,{\rm d}r \,{\rm d}\vartheta}
{\displaystyle \pi\left(\mu_{nm} a^2 - n^2\right)J_n^2\left(\sqrt{\mu_{nm}}\,a\right)}
\end{displaymath}

Then:

\begin{displaymath}
\begin{array}{l}
\displaystyle
u(r,\vartheta,t) =
\sum_{n=0}^\infty \sum_m f^1_{nm}\, e^{-\kappa \mu_{nm} t}
\cos(n\vartheta)\, J_n\left(\sqrt{\mu_{nm}}\, r\right) \\
\displaystyle\qquad\qquad
+ \sum_{n=1}^\infty \sum_m f^2_{nm}\, e^{-\kappa \mu_{nm} t}
\sin(n\vartheta)\, J_n\left(\sqrt{\mu_{nm}}\, r\right)
\end{array}
\end{displaymath}
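As a final illustration, this short Python sketch continues the coefficient sketch given earlier (it assumes the helper functions sqrt_mu and f_coef and the radius a defined there, an assumed diffusivity kappa, and illustrative truncation levels) to evaluate the truncated series:

\begin{verbatim}
# Evaluate the truncated series solution u(r, theta, t).
# Assumes sqrt_mu, f_coef, and a from the earlier coefficient sketch.
import numpy as np
from scipy.special import jv

kappa = 1.0            # assumed heat conduction coefficient
N, M = 3, 3            # truncation of the n and m sums (illustrative)

def u(r, th, t):
    total = 0.0
    for n in range(N + 1):
        # n = 0 includes the special constant mode mu_00 = 0 (m = 0)
        for m in range(0 if n == 0 else 1, M + 1):
            k = sqrt_mu(n, m)                        # sqrt(mu_nm)
            mode = np.exp(-kappa * k**2 * t) * jv(n, k * r)
            total += f_coef(1, n, m) * np.cos(n * th) * mode
            if n >= 1:
                total += f_coef(2, n, m) * np.sin(n * th) * mode
    return total

print(u(0.5 * a, 0.25 * np.pi, 0.1))   # u at r = a/2, theta = pi/4, t = 0.1
\end{verbatim}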

That was not too bad!