We recently proved a couple of results involving Laplace Transforms and homogeneous, constant-coefficient ordinary differential equations. While the second proof in that note was sufficiently rigorous, the first, though correct, was not especially rigorous. Here, we will prove two propositions involving Laplace Transforms and homogeneous, constant-coefficient partial differential equations. The reader will notice that we achieve results similar to those in the aforementioned note, but we arrive at the conclusions in a much more complete, rigorous fashion. We will prove:

(1) For every integer \(n \geq 1\) and every real function \(f(x_1,x_2,...,x_k)\) whose partial derivatives \(\partial_{x_1}^i f\), \(0 \leq i \leq n-1\), are bounded on \([0,\infty)\) with respect to \(x_1\):

\(\mathcal{L} \left[ \partial_{x_1}^n f(x_1,x_2,...,x_k) \right] = s^n F(x_2,x_3,...,x_k,s) - \displaystyle \sum_{i=1}^n s^{n-i} \partial_{x_1}^{i-1} f |_{x_1 = 0}\)

(2) All partial differential equations of the form:

\(\displaystyle \sum_{i=0}^n b_i \partial_{x_1}^i f(x_1,x_2,...,x_k) = 0\)

where the \(b_i\) are fixed complex coefficients, have solutions of the form:

\(\displaystyle \sum_{i=1}^n A_i(x_2,x_3,...,x_k)e^{r_ix_1}\)

where the \(r_i\) are the roots of the polynomial:

\(P(x) = \displaystyle \sum_{i=0}^{n}b_ix^i\)

and the \(A_i\) are functions of \(x_2,x_3,...,x_k\).

Proof:

(1) We will prove (1) by mathematical induction. The base case is \(n=1\):

(For brevity, let \(f(x_1,x_2,...,x_k)\) be written as \(f\) and \(F(x_2,x_3,...,x_k,s)\) as \(F\).)

\(\mathcal{L} \left[ \partial_{x_1} f \right] = s F - f |_{x_1 = 0}\)

Now, if we evaluate the actual transform on the left, we have:

\(\mathcal{L} \left[ \partial_{x_1} f \right] = \displaystyle \int_{0}^{\infty} \partial_{x_1} f e^{-sx_1}\,dx_1\)

Integrating by parts, we arrive at:

\(\mathcal{L} \left[ \partial_{x_1} f \right] = s\displaystyle \int_{0}^{\infty} f e^{-sx_1}\,dx_1 + e^{-sx_1}f|_{0}^{\infty} \)

\(\mathcal{L} \left[ \partial_{x_1} f \right] =sF - f|_{x_1=0}\)

(We arrive at this due to the boundedness condition on \(f\) with respect to \(x_1\) over the interval \([0,\infty)\): since \(f\) is bounded there, \(e^{-sx_1} f \to 0\) as \(x_1 \to \infty\) for \(\text{Re}(s) > 0\), so the boundary term at infinity vanishes.)
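As an aside (not part of the proof), the base case can be illustrated numerically for a concrete bounded function such as \(f = \cos x_1\); the quadrature routine and all parameters below are arbitrary choices for the sketch:

```python
import math

def laplace(g, s, upper=60.0, steps=200_000):
    # Trapezoidal approximation of the transform integral
    #   ∫_0^upper g(x) e^{-s x} dx
    # (upper is chosen large enough that the tail is negligible for s > 0).
    h = upper / steps
    total = 0.5 * (g(0.0) + g(upper) * math.exp(-s * upper))
    for i in range(1, steps):
        x = i * h
        total += g(x) * math.exp(-s * x)
    return h * total

s = 2.0
f = math.cos                      # bounded on [0, ∞)
df = lambda x: -math.sin(x)       # ∂_{x1} f

lhs = laplace(df, s)              # L[∂_{x1} f]
rhs = s * laplace(f, s) - f(0.0)  # s F - f|_{x1 = 0}
```

Both sides agree to quadrature accuracy (here both equal \(-1/(s^2+1)\)).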

So the base case holds. We now assume the inductive hypothesis for \(n=m\):

\(\mathcal{L} \left[ \partial_{x_1}^m f \right] = s^m F - \displaystyle \sum_{i=1}^m s^{m-i} \partial_{x_1}^{i-1} f |_{x_1 = 0}\)

Multiplying both sides of this equation by \(s\) gives:

\(s\mathcal{L} \left[ \partial_{x_1}^m f \right] = s^{m+1} F - \displaystyle \sum_{i=1}^m s^{m+1-i} \partial_{x_1}^{i-1} f |_{x_1 = 0}\)

Now, note that if the sum on the right were extended to \(i = m+1\), the additional term would be:

\(S_{m+1} = \partial_{x_1}^{m}f|_{x_1=0}\)

Extending the sum to \(i = m+1\) and adding this term back to compensate, we may write:

\(s\mathcal{L} \left[ \partial_{x_1}^m f \right] = s^{m+1} F - \displaystyle \sum_{i=1}^{m+1} s^{m+1-i} \partial_{x_1}^{i-1} f |_{x_1 = 0} +\partial_{x_1}^{m}f|_{x_1=0}\)

Keep this equation in mind as we evaluate the left-hand side directly:

\(s\mathcal{L} \left[ \partial_{x_1}^m f \right] = s\displaystyle \int_{0}^{\infty} \partial_{x_1}^m f e^{-sx_1}\,dx_1\)

We integrate by parts, this time with the substitutions:

\(u = \partial_{x_1}^m f\) and \(dv = e^{-sx_1} dx_1\)

Then we have:

\(s \mathcal{L} \left[ \partial_{x_1}^m f \right] = s \left[ -\frac{e^{-sx_1}}{s} \partial_{x_1}^m f|_{0}^{\infty} + \frac{1}{s}\displaystyle \int_{0}^{\infty} \partial_{x_1}^{m+1} f e^{-sx_1}\,dx_1\right]\)

\(\Rightarrow\) \(s\mathcal{L} \left[ \partial_{x_1}^m f \right] = \partial_{x_1}^m f |_{x_1=0} + \mathcal{L} \left[ \partial_{x_1}^{m+1} f \right]\)

\(\Rightarrow\) \(\partial_{x_1}^m f |_{x_1=0} + \mathcal{L} \left[ \partial_{x_1}^{m+1} f \right] =s^{m+1} F - \displaystyle \sum_{i=1}^{m+1} s^{m+1-i} \partial_{x_1}^{i-1} f |_{x_1 = 0} +\partial_{x_1}^{m}f|_{x_1=0}\)

Now, cancelling the common term \(\partial_{x_1}^{m}f|_{x_1=0}\) on both sides leads to:

\( \mathcal{L} \left[ \partial_{x_1}^{m+1} f \right] =s^{m+1} F - \displaystyle \sum_{i=1}^{m+1} s^{m+1-i} \partial_{x_1}^{i-1} f |_{x_1 = 0} \)

which is the intended result. We may conclude that the statement is true for \(n=1\), and that if it is true for some \(n=m\), then it is true for \(n=m+1\). The proof follows by induction.
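For further illustration, the \(n=2\) instance, \(\mathcal{L}[\partial_{x_1}^2 f] = s^2F - sf|_{x_1=0} - \partial_{x_1}f|_{x_1=0}\), can be spot-checked numerically the same way (again an aside, with arbitrarily chosen quadrature parameters and \(f = \cos x_1\)):

```python
import math

def laplace(g, s, upper=60.0, steps=200_000):
    # trapezoidal approximation of ∫_0^upper g(x) e^{-s x} dx
    h = upper / steps
    total = 0.5 * (g(0.0) + g(upper) * math.exp(-s * upper))
    for i in range(1, steps):
        x = i * h
        total += g(x) * math.exp(-s * x)
    return h * total

s = 3.0
f = math.cos                       # bounded, with bounded x1-derivatives
df = lambda x: -math.sin(x)        # ∂ f
d2f = lambda x: -math.cos(x)       # ∂² f

# n = 2 instance:  L[∂² f] = s² F − s f|₀ − ∂f|₀
lhs = laplace(d2f, s)
rhs = s**2 * laplace(f, s) - s * f(0.0) - df(0.0)
```

Both sides evaluate to \(-s/(s^2+1)\) up to quadrature error.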

(2) Now, given a partial differential equation of the form:

\(\displaystyle \sum_{i=0}^n b_i \partial_{x_1}^i f = 0\)

We proceed to take the Laplace Transform of the entire equation:

\(\mathcal{L} \left[\displaystyle \sum_{i=0}^n b_i \partial_{x_1}^i f \right] = 0\)

\(\Rightarrow\) \( \displaystyle \sum_{i=0}^n b_i \mathcal{L} \left[ \partial_{x_1}^i f \right ] = 0\)

Using what we proved in (1) (the \(i=0\) term contributes simply \(b_0 F\)):

\(\Rightarrow\) \( \displaystyle \sum_{i=0}^n b_i \left[ s^i F - \sum_{j=1}^{i} s^{i-j}\, \partial_{x_1}^{j-1} f |_{x_1=0} \right] = 0\)

\(\Rightarrow\) \( F \displaystyle \sum_{i=0}^n b_i s^i = \displaystyle \sum_{i=1}^n b_i \sum_{j=1}^{i} s^{i-j}\, \partial_{x_1}^{j-1} f|_{x_1=0}\)

\(\Rightarrow\) \(F = \frac{\displaystyle \sum_{i=1}^n b_i \sum_{j=1}^{i} s^{i-j}\, \partial_{x_1}^{j-1} f|_{x_1=0}}{\displaystyle \sum_{i=0}^n b_i s^i }\)

And taking the inverse transform:

\(f = \mathcal{L}^{-1} \left[ \frac{\displaystyle \sum_{i=1}^n b_i \sum_{j=1}^{i} s^{i-j}\, \partial_{x_1}^{j-1} f|_{x_1=0}}{\displaystyle \sum_{i=0}^n b_i s^i }\right]\)

And, as we saw in the last note, we may factor the denominator (taking \(b_n = 1\) without loss of generality, since we may divide the equation through by \(b_n\)) to reduce this to:

\(f = \mathcal{L}^{-1} \left[ \frac{\displaystyle \sum_{i=1}^n b_i \sum_{j=1}^{i} s^{i-j}\, \partial_{x_1}^{j-1} f|_{x_1=0}}{\displaystyle \prod_{i=1}^n (s-r_i) }\right]\)

where the \(r_i\) are the roots of \(\displaystyle \sum_{i=0}^n b_i s^i \).

And finally, since the numerator is a polynomial in \(s\) of degree at most \(n-1\), partial fraction decomposition (assuming the roots \(r_i\) are distinct), as in the previous note, yields:

\(f = \mathcal{L}^{-1} \left[ \displaystyle \sum_{i=1}^n \frac{A_i}{(s-r_i)} \right]\)

\(\Rightarrow\) \(f= \displaystyle \sum_{i=1}^n A_i e^{r_ix_1} \)

Now, this is essentially the same solution we arrived at in the previous note, with one difference: the \(A_i\) are functions of \(x_2,...,x_k\) (not of \(x_1\)). That this must be true is apparent if we take a different approach to solving this differential equation:

Return to the original differential equation and apply separation of variables; that is, let:

\(f = \displaystyle \prod_{i=1}^k g_i(x_i)\)

\(\Rightarrow\) \( \displaystyle \sum_{i=0}^n b_i \partial_{x_1}^i f = \displaystyle \sum_{i=0}^n b_i \partial_{x_1}^i \displaystyle \prod_{j=1}^k g_j(x_j) = 0\)

Now, note that (writing \(g_1^{(i)}\) for the \(i\)th derivative of \(g_1\)):

\(\partial_{x_1}^i \displaystyle \prod_{j=1}^k g_j(x_j) =\left(\displaystyle \prod_{j=2}^k g_j(x_j)\right) g_1^{(i)}(x_1)\)

Then:

\(\left(\displaystyle \prod_{j=2}^k g_j(x_j)\right) \displaystyle \sum_{i=0}^n b_i g_1^{(i)}(x_1) = 0\)

\(\Rightarrow\) \( \displaystyle \sum_{i=0}^n b_i g_1^{(i)}(x_1) = 0\) (wherever the product \(\displaystyle \prod_{j=2}^k g_j(x_j)\) is nonzero)

Which is an ordinary differential equation with solution:

\(g_1(x_1) = \displaystyle \sum_{i=1}^n c_i e^{r_ix_1}\), where the \(c_i\) are constants

As we found in the previous note. Now, remember that we have a product solution for \(f\), hence:

\(f=\left( \displaystyle \prod_{j=2}^k g_j(x_j)\right)\displaystyle \sum_{i=1}^n c_i e^{r_ix_1}\)

Comparing this with the solution obtained via the transform, then:

\(A_i = c_i \displaystyle \prod_{j=2}^k g_j(x_j)\) where \(c_i \in \mathbb{C}\).

Thus, for \(f\) to exist as a genuine multivariable function, the \(A_i\) must be functions of the variables \(x_2,...,x_k\), excluding \(x_1\).

This completes the proof for (2).

QED
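As a numerical aside (not part of the proof), the solution form in (2) can be verified for a sample equation; the coefficients, roots, and the functions \(A_1, A_2\) below are all arbitrary choices for the sketch:

```python
import math

# Illustrative PDE (coefficients chosen for the example):
#   ∂²_{x1} f − 3 ∂_{x1} f + 2 f = 0
# Characteristic polynomial r² − 3r + 2 = (r − 1)(r − 2), roots r = 1, 2.
r1, r2 = 1.0, 2.0

def A1(x2): return math.sin(x2)     # arbitrary functions of the
def A2(x2): return x2**2 + 1.0      # remaining variable x2

def f(x1, x2):
    # candidate solution of the form  Σ A_i(x2) e^{r_i x1}
    return A1(x2) * math.exp(r1 * x1) + A2(x2) * math.exp(r2 * x1)

def residual(x1, x2, h=1e-4):
    # central finite differences in x1 for ∂ and ∂²
    d1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2.0 * h)
    d2 = (f(x1 + h, x2) - 2.0 * f(x1, x2) + f(x1 - h, x2)) / h**2
    return d2 - 3.0 * d1 + 2.0 * f(x1, x2)

# the residual should vanish (up to finite-difference error)
# at every sample point, for every choice of x2
max_res = max(abs(residual(x1, x2))
              for x1 in (0.0, 0.5, 1.0)
              for x2 in (0.3, 1.7, 2.5))
```

The residual is zero up to finite-difference error at all sample points, regardless of \(x_2\), which is exactly what (2) asserts.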

I like to give credit to people who help me to improve my proof writing skills. Part (1) of this note is in response to a suggestion made by Cody Johnson. He suggested that I use induction to prove certain integral relations instead of just saying "From here we notice the pattern" or something to that effect.

## Comments


– Cody Johnson · 2 years, 11 months ago

Prove by induction that \(\mathcal{L}\{x^n\}=\frac{n!}{s^{n+1}}\)

We establish the base case \(n=1\):

\(\mathcal{L} \left[x \right] = \frac{1}{s^2}\)

Evaluating the transform:

\(\mathcal{L} \left[x \right] = \displaystyle \int_{0}^{\infty} xe^{-sx}\,dx\)

Let \(u=sx\) \(\Rightarrow\) \(du=s\,dx\)

\(\Rightarrow\) \(\mathcal{L} \left[x \right] = \frac{1}{s^2} \displaystyle \int_{0}^{\infty} ue^{-u} \,du\)

Comparing with \(\Gamma(t) = \displaystyle \int_{0}^{\infty} u^{t-1}e^{-u}\,du\), set \(t-1 =1\) \(\Rightarrow\) \(t=2\):

\(\Rightarrow\) \(\mathcal{L} \left[x \right] =\frac{\Gamma(2)}{s^2} = \frac{1!}{s^2} = \frac{1}{s^2}\)

The base case clearly holds. Now assume the inductive hypothesis for \(n=k\):

\(\mathcal{L} \left[x^k \right] = \frac{k!}{s^{k+1}}\)

\(\Rightarrow\) \(\frac{(k+1)}{s}\mathcal{L} \left[x^k \right] =\frac{(k+1)!}{s^{k+2}}\) (By hypothesis)

\(\Rightarrow\) \(\frac{(k+1)}{s} \displaystyle \int_{0}^{\infty} x^k e^{-sx}\,dx = \frac{(k+1)!}{s^{k+2}}\)

Set \(u=e^{-sx}\) and \(dv = x^k dx\) and integrate the transform by parts, obtaining:

\(\frac{(k+1)}{s} \left[ \frac{x^{k+1}e^{-sx}}{k+1} |_{0}^{\infty} + \frac{s}{k+1} \displaystyle \int_{0}^{\infty} x^{k+1} e^{-sx}\,dx \right]= \frac{(k+1)!}{s^{k+2}}\)

\(\Rightarrow\) \( \displaystyle \lim_{x \to \infty} \frac{x^{k+1}}{s e^{sx}} +\mathcal{L} \left[ x^{k+1} \right] = \frac{(k+1)!}{s^{k+2}}\)

Applying L'Hôpital's rule, differentiate the numerator and denominator of the first term on the left \(k+1\) times with respect to \(x\), obtaining:

\( \displaystyle \lim_{x \to \infty} \frac{(k+1)!}{s^{k+2} e^{sx}} +\mathcal{L} \left[ x^{k+1} \right] = \frac{(k+1)!}{s^{k+2}}\)

The denominator of the first term grows without bound for \(s > 0\), so the limit vanishes:

\(\Rightarrow\) \(\mathcal{L} \left[ x^{k+1} \right] = \frac{(k+1)!}{s^{k+2}}\)

We may conclude that the statement is true for \(n=1\), and if it is true for some \(n=k\), then it is true for \(n=k+1\). The proof follows by induction.

QED

Alternative Proof: (Not using induction)

\(\mathcal{L} \left[x^n \right] = \displaystyle \int_{0}^{\infty} x^n e^{-sx}\, dx \)

Let \(u=sx\) \(\Rightarrow\) \(du=s\,dx\)

\(\Rightarrow\) \( \mathcal{L} \left[x^n \right] = \frac{1}{s^{n+1}} \displaystyle \int_{0}^{\infty} u^n e^{-u} \,du \)

\(\Rightarrow\) \( \mathcal{L} \left[x^n \right] = \frac{\Gamma(n+1)}{s^{n+1}} = \frac{n!}{s^{n+1}}\)

QED

– Ethan Robinett · 2 years, 11 months ago
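As one more aside, the closed form \(\mathcal{L}[x^n] = \frac{n!}{s^{n+1}}\) lends itself to a quick numerical spot-check (the quadrature parameters here are arbitrary choices):

```python
import math

def laplace_moment(n, s, upper=60.0, steps=100_000):
    # trapezoidal approximation of ∫_0^upper x^n e^{-s x} dx
    h = upper / steps
    total = 0.5 * (upper**n) * math.exp(-s * upper)  # the x = 0 endpoint vanishes for n ≥ 1
    for i in range(1, steps):
        x = i * h
        total += x**n * math.exp(-s * x)
    return h * total

s = 2.0
# compare against n!/s^(n+1) for n = 1 .. 5
errors = [abs(laplace_moment(n, s) - math.factorial(n) / s**(n + 1))
          for n in range(1, 6)]
```

Each error is tiny, matching the Gamma-function evaluation \(\Gamma(n+1)/s^{n+1} = n!/s^{n+1}\).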


– Steven Zheng · 2 years, 11 months ago

I like the second one! Gamma functions for the win!

– Ethan Robinett · 2 years, 11 months ago

Gamma functions are so useful

– Steven Zheng · 2 years, 11 months ago

I know. It is also my favourite function!