Systems of Linear Differential Equations
A system of linear differential equations is a set of linear equations relating a group of functions to their derivatives. Because they involve functions and their derivatives, each of these linear equations is itself a differential equation. For example, \(f'(x)=f(x)+g(x)\) is a linear equation relating \(f'\) to \(f\) and \(g\), but \(f'=fg\) is not, because the \(fg\) term is not linear. These equations can be solved by writing them in matrix form, and then working with them almost as if they were standard differential equations.
Systems of differential equations can be used to model a variety of physical systems, such as predator-prey interactions, but linear systems are essentially the only ones whose solutions can be written down explicitly in general.
Writing a System in Matrix Form
Given a system of linear differential equations of the form
\[ \begin{align} f_1'(x)&=a_{11}f_1(x)+\cdots+a_{1n}f_n(x),\\ f_2'(x)&=a_{21}f_1(x)+\cdots+a_{2n}f_n(x),\\ &\vdots\\ f_n'(x)&=a_{n1}f_1(x)+\cdots+a_{nn}f_n(x)\\ \end{align} \]
this system is easiest to work with if it is rewritten as \(v'=Av\), where \[v=\left[\begin{array}{c} f_1\\ \vdots \\ f_n\end{array}\right]\quad\text{and}\quad A=\left[\begin{array}{ccc} a_{11}&\cdots&a_{1n}\\ \vdots&\ddots&\vdots\\ a_{n1}&\cdots&a_{nn}\end{array}\right].\] Now, the \(n\) equations have been reduced to a single equation that is easier to work with.
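The matrix form also lends itself directly to numerical work. Here is a minimal Python sketch (the coefficients, the helper name `euler_solve`, and the step count are illustrative assumptions, not from the text) that encodes a hypothetical \(2\times 2\) system as \(v'=Av\) and integrates it with the forward Euler method:

```python
import numpy as np

# Hypothetical coefficients for a 2x2 system (illustrative only):
#   f1' = 2*f1 + 3*f2
#   f2' =   f1 -   f2
A = np.array([[2.0, 3.0],
              [1.0, -1.0]])

def euler_solve(A, v0, x_end, steps):
    """Integrate v' = A v from x = 0 to x_end with forward Euler."""
    v = np.array(v0, dtype=float)
    h = x_end / steps
    for _ in range(steps):
        v = v + h * (A @ v)  # v(x + h) ~ v(x) + h * v'(x) = v(x) + h * A v(x)
    return v
```

Because the whole system is a single matrix-vector product, the integrator never needs to know how many equations there are; the same few lines handle any \(n\).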
Solving a System in Matrix Form
Because the matrix \(A\) is constant, the equation \(v'=Av\) resembles the scalar equation \(f'=cf\), which has solution \(f=Ce^{cx}\). It is therefore tempting to guess that \(v'=Av\) has an exponential solution as well. This guess turns out to be correct: the solution to this equation is \(v(x)=e^{Ax}v(0)\), where we define \[e^A=\sum_{n=0}^{\infty} \frac{A^n}{n!}=I+A+\frac{A^2}{2}+\frac{A^3}{6}+\cdots,\] using the power series for the exponential function. This fact can be verified by differentiating the power series term by term, which gives \(\frac{d}{dx}e^{Ax}=Ae^{Ax}\).
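The power series definition can be turned into a direct, if numerically naive, computation. A minimal sketch (the function name `expm_series` and the truncation at 30 terms are illustrative choices, not part of the definition):

```python
import numpy as np

def expm_series(A, terms=30):
    """Approximate e^A by truncating the series sum over n of A^n / n!.

    A sketch only: adequate for small matrices with modest entries,
    but a truncated power series is not a numerically robust way to
    compute the matrix exponential in general.
    """
    n_dim = A.shape[0]
    result = np.eye(n_dim)
    term = np.eye(n_dim)       # holds A^n / n!, starting at n = 0
    for n in range(1, terms):
        term = term @ A / n    # A^n / n! = (A^{n-1} / (n-1)!) * A / n
        result = result + term
    return result
```

Building each term from the previous one avoids recomputing matrix powers and factorials from scratch at every step.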
Solve the system of differential equations \(f'(x)=g(x)\) and \(g'(x)=f(x)\), where \(f(0)=0\) and \(g(0)=1\).
In matrix form, this system is \[\left[\begin{array}{c} f\\ g\end{array}\right]'=\underbrace{\left[\begin{array}{cc} 0&1\\ 1&0\end{array}\right]}_{A}\underbrace{\left[\begin{array}{c} f\\ g\end{array}\right]}_{v}.\] It remains to compute \(e^{Ax}\).
If we compute some small powers of \(A\), we find that \(A^{2n}=I\) and \(A^{2n+1}=A\); i.e., the even powers are the identity and the odd powers are just \(A\). Then, \[e^{Ax}=\sum_{n=0}^{\infty} \frac{(Ax)^n}{n!}=\sum_{n=0}^{\infty} \frac{(Ax)^{2n}}{(2n)!}+\sum_{n=0}^{\infty} \frac{(Ax)^{2n+1}}{(2n+1)!}=I\sum_{n=0}^{\infty} \frac{x^{2n}}{(2n)!}+A\sum_{n=0}^{\infty} \frac{x^{2n+1}}{(2n+1)!}.\] Now, we can recognize those two sums as the Taylor series for the hyperbolic cosine and hyperbolic sine, so we have \[e^{Ax}=I\cosh x+A\sinh x=\left[\begin{array}{cc} \cosh x& \sinh x\\ \sinh x&\cosh x\end{array}\right].\] Finally, we find \[v(x)=e^{Ax}v(0)=\left[\begin{array}{cc} \cosh x& \sinh x\\ \sinh x&\cosh x\end{array}\right]\left[\begin{array}{c} 0\\1 \end{array}\right]=\left[\begin{array}{c} \sinh x\\ \cosh x\end{array}\right],\] so the solution is \(f(x)=\sinh x\) and \(g(x)=\cosh x\).
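The closed form is easy to sanity-check numerically. In this sketch (the helper name `exp_Ax` is an illustrative choice), we build \(e^{Ax}=A\sinh x+I\cosh x\) directly and apply it to the initial condition \(v(0)\):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def exp_Ax(x):
    """e^{Ax} = A sinh(x) + I cosh(x), valid here because A^2 = I."""
    return A * np.sinh(x) + np.eye(2) * np.cosh(x)

# Apply e^{Ax} to the initial condition v(0) = [0, 1]:
v = exp_Ax(1.5) @ np.array([0.0, 1.0])
```

The first component of `v` is \(\sinh 1.5\) and the second is \(\cosh 1.5\), matching \(f(x)=\sinh x\) and \(g(x)=\cosh x\) at \(x=1.5\).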