The matrix exponential can be defined as a power series and used to solve systems of linear differential equations.
This post is part of the book Justin Math: Linear Algebra. Suggested citation: Skycak, J. (2019). Matrix Exponential and Systems of Linear Differential Equations. In Justin Math: Linear Algebra. https://justinmath.com/matrix-exponential-and-systems-of-linear-differential-equations/
In this chapter, we will learn how to solve systems of linear differential equations. These systems take the form shown below.
$\begin{align*} \frac{d x_1}{dt} &= a_{11}x_1 + a_{12}x_2 + \cdots + a_{1k}x_k \\ \frac{d x_2}{dt} &= a_{21}x_1 + a_{22}x_2 + \cdots + a_{2k}x_k \\ &\hspace{.2cm}\vdots \\ \frac{d x_k}{dt} &= a_{k1}x_1 + a_{k2}x_2 + \cdots + a_{kk}x_k \end{align*}$
Converting to Matrix Form
We can write the system in matrix form:
$\begin{align*} \begin{pmatrix} \frac{dx_1}{dt} \\ \frac{dx_2}{dt} \\ \vdots \\ \frac{dx_k}{dt} \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix} \end{align*}$
Defining
$\begin{align*} x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix} \hspace{1cm} A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} \end{pmatrix} \end{align*}$
the system can be written more compactly as
$\begin{align*} \frac{dx}{dt} = Ax. \end{align*}$
This bears a resemblance to the familiar differential equation $\frac{dx}{dt} = ax$, where $x$ and $a$ are both scalars. We know that the solution to that equation is given by $x=e^{at}x(0)$.
We infer, then, that the solution to the matrix differential equation is given by
$\begin{align*} x=e^{At}x(0). \end{align*}$
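This formula is easy to sanity-check numerically before we develop it further. The sketch below (a minimal check, assuming NumPy and SciPy are available) integrates the system with a general-purpose ODE solver and compares the result against $e^{At}x(0)$, treating SciPy's matrix exponential as a black box.

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])      # an arbitrary small example matrix
x0 = np.array([1.0, 0.0])         # initial condition x(0)
t_final = 2.0

# Integrate dx/dt = Ax numerically.
sol = solve_ivp(lambda t, x: A @ x, (0.0, t_final), x0, rtol=1e-10, atol=1e-12)

# Evaluate the proposed solution x = e^{At} x(0) at the same time.
x_formula = expm(A * t_final) @ x0

print(np.allclose(sol.y[:, -1], x_formula, atol=1e-6))  # True
```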
But what does it mean to exponentiate a matrix? How should we compute $e^{At}$?
Matrix Exponential
Recall that $e^t$ can be written as the power series
$\begin{align*} e^{t} = \sum_{n=0}^\infty \frac{t^n}{n!} \, . \end{align*}$
Consequently, $e^{at}$ can be written as the power series
$\begin{align*} e^{at} = \sum_{n=0}^\infty \frac{(at)^n}{n!} = \sum_{n=0}^\infty \frac{a^n t^n}{n!} \, . \end{align*}$
Extending this to the matrix exponential, then, we have
$\begin{align*} e^{At} = \sum_{n=0}^\infty \frac{(At)^n}{n!} = \sum_{n=0}^\infty \frac{A^n t^n}{n!} \, . \end{align*}$
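This series converges for every square matrix, so truncating it gives a direct (if naive) way to approximate $e^{At}$. Below is a minimal sketch, assuming NumPy and SciPy; the helper `expm_series` is ours, introduced only for illustration, and is compared against SciPy's built-in `expm`.

```python
import numpy as np
from scipy.linalg import expm

def expm_series(M, terms=30):
    """Approximate e^M by the truncated power series sum over n < terms of M^n / n!."""
    total = np.zeros_like(M)
    power = np.eye(M.shape[0])    # M^0
    factorial = 1.0               # 0!
    for n in range(terms):
        total = total + power / factorial
        power = power @ M         # M^(n+1)
        factorial *= (n + 1)      # (n+1)!
    return total

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
t = 0.5
print(np.allclose(expm_series(A * t), expm(A * t)))  # True
```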
Writing $A$ in the form
$\begin{align*} A=PJP^{-1} \end{align*}$
where $J$ is a Jordan matrix, we have
$\begin{align*} e^{At} &= \sum_{n=0}^\infty \frac{A^n t^n}{n!} \\ &= \sum_{n=0}^\infty \frac{(PJP^{-1})^n t^n}{n!} \\ &= \sum_{n=0}^\infty \frac{PJ^nP^{-1} t^n}{n!} \\ &= P \left[ \sum_{n=0}^\infty \frac{J^n t^n}{n!} \right] P^{-1} \\ &= P e^{Jt} P^{-1} \, . \end{align*}$
Thus, computing the exponential of a matrix reduces to the problem of computing the exponential of the corresponding Jordan matrix. As such, we need only investigate how to compute exponentials of Jordan blocks.
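Numerically, this similarity relation is easiest to check in the diagonalizable case, where the Jordan matrix is simply diagonal and $P$ can be built from eigenvectors (computing an actual Jordan form numerically is notoriously ill-conditioned). A minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

# A diagonalizable example: eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigenvalues)            # here the "Jordan matrix" is diagonal

lhs = expm(A)
rhs = P @ expm(D) @ np.linalg.inv(P)
print(np.allclose(lhs, rhs))  # True
```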
First, we consider the simplest case: a perfectly diagonal matrix, i.e. a collection of $1 \times 1$ Jordan blocks.
$\begin{align*} D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_k \end{pmatrix} \end{align*}$
In this case, we have
$\begin{align*} e^{Dt} &= \sum_{n=0}^\infty \frac{D^n t^n}{n!} \end{align*}$
$\begin{align*} e^{Dt} &= \sum_{n=0}^\infty \frac{t^n}{n!} \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_k \end{pmatrix}^n \end{align*}$
$\begin{align*} e^{Dt} &= \sum_{n=0}^\infty \frac{t^n}{n!} \begin{pmatrix} \lambda_1^n & & & \\ & \lambda_2^n & & \\ & & \ddots & \\ & & & \lambda_k^n \end{pmatrix} \end{align*}$
$\begin{align*} e^{Dt} &= \begin{pmatrix} \sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda_1^n & & & \\ & \sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda_2^n & & \\ & & \ddots & \\ & & &\sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda_k^n \end{pmatrix} \end{align*}$
$\begin{align*} e^{Dt} &= \begin{pmatrix} \sum\limits_{n=0}^\infty \frac{(\lambda_1 t)^n}{n!} & & & \\ & \sum\limits_{n=0}^\infty \frac{(\lambda_2 t)^n}{n!} & & \\ & & \ddots & \\ & & & \sum\limits_{n=0}^\infty \frac{(\lambda_k t)^n}{n!} \end{pmatrix} \end{align*}$
$\begin{align*} e^{Dt} &= \begin{pmatrix} e^{\lambda_1 t} & & & \\ & e^{\lambda_2 t} & & \\ & & \ddots & \\ & & & e^{\lambda_k t} \end{pmatrix}. \end{align*}$
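In code, this means that exponentiating a diagonal matrix amounts to exponentiating its diagonal entries. A minimal check, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

eigenvalues = np.array([2.0, -1.0, 0.5])
t = 1.3
D = np.diag(eigenvalues)

# expm(Dt) should equal the diagonal matrix of e^{lambda_i * t}.
print(np.allclose(expm(D * t), np.diag(np.exp(eigenvalues * t))))  # True
```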
Second, we consider the more involved case: a Jordan block with $1$s on the superdiagonal.
$\begin{align*} J = \begin{pmatrix} \lambda & 1 & 0 & \cdots & \cdots & 0 \\ & \lambda & 1 & \cdots & \cdots & 0 \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & & \lambda & 1 \\ & & & & & \lambda \end{pmatrix} \end{align*}$
In this case, we have
$\begin{align*} e^{Jt} &= \sum_{n=0}^\infty \frac{J^n t^n}{n!} \end{align*}$
$\begin{align*} e^{Jt} &= \sum\limits_{n=0}^\infty \frac{t^n}{n!} \begin{pmatrix} \lambda & 1 & 0 & \cdots & \cdots & 0 \\ & \lambda & 1 & \cdots & \cdots & 0 \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & & \lambda & 1 \\ & & & & & \lambda \end{pmatrix}^n . \end{align*}$
Taking the convention $\binom{n}{c} = 0$ when $c>n$, we can write
$\begin{align*} e^{Jt} &=\sum\limits_{n=0}^\infty \frac{t^n}{n!} \begin{pmatrix} \lambda^n & \binom{n}{1} \lambda^{n-1} & \binom{n}{2} \lambda^{n-2} & \cdots & \cdots & \binom{n}{k-1} \lambda^{n-k+1} \\ & \lambda^{n} & \binom{n}{1} \lambda^{n-1} & \cdots & \cdots & \binom{n}{k-2} \lambda^{n-k+2} \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & & \lambda^{n} & \binom{n}{1} \lambda^{n-1} \\ & & & & & \lambda^{n} \end{pmatrix} \end{align*}$
$\begin{align*} e^{Jt} &= \begin{pmatrix}\sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda^n & \sum\limits_{n=1}^\infty \frac{t^n}{n!} \binom{n}{1} \lambda^{n-1} & \sum\limits_{n=2}^\infty \frac{t^n}{n!} \binom{n}{2} \lambda^{n-2} & \cdots & \cdots & \sum\limits_{n=k-1}^\infty \frac{t^n}{n!} \binom{n}{k-1} \lambda^{n-k+1} \\ &\sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda^{n} &\sum\limits_{n=1}^\infty \frac{t^n}{n!} \binom{n}{1} \lambda^{n-1} & \cdots & \cdots & \sum\limits_{n=k-2}^\infty \frac{t^n}{n!} \binom{n}{k-2} \lambda^{n-k+2} \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & &\sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda^{n} &\sum\limits_{n=1}^\infty \frac{t^n}{n!} \binom{n}{1} \lambda^{n-1} \\ & & & & & \sum\limits_{n=0}^\infty \frac{t^n}{n!} \lambda^{n} \end{pmatrix}. \end{align*}$
Notice that the entries in the matrix take the form
$\begin{align*} \sum\limits_{n=c}^\infty \frac{t^n}{n!} \binom{n}{c} \lambda^{n-c} \end{align*}$
where $c$ is the entry's offset above the main diagonal (so $c=0$ on the diagonal, $c=1$ on the superdiagonal, and so on). We can simplify these expressions as follows:
$\begin{align*} &\sum\limits_{n=c}^\infty \frac{t^n}{n!} \binom{n}{c} \lambda^{n-c} \\ &= \sum\limits_{n=c}^\infty \frac{t^n}{n!} \frac{n!}{c!(n-c)!} \lambda^{n-c} \\ &= \sum\limits_{n=c}^\infty \frac{t^n \lambda^{n-c}}{c!(n-c)!} \\ &= \sum\limits_{n=0}^\infty \frac{t^{n+c} \lambda^{n}}{c!n!} \\ &= \frac{t^c}{c!} \sum\limits_{n=0}^\infty \frac{(\lambda t)^n}{n!} \\ &= \frac{t^c}{c!}e^{\lambda t} \end{align*}$
Thus,
$\begin{align*} e^{Jt} &= \begin{pmatrix} e^{\lambda t} & te^{\lambda t} & \frac{t^2}{2} e^{\lambda t} & \cdots & \cdots & \frac{t^{k-1}}{(k-1)!} e^{\lambda t} \\ &e^{\lambda t}&te^{\lambda t} & \cdots & \cdots & \frac{t^{k-2}}{(k-2)!} e^{\lambda t} \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & & e^{\lambda t} & te^{\lambda t} \\ & & & & & e^{\lambda t} \end{pmatrix} \end{align*}$
$\begin{align*} e^{Jt} &= e^{\lambda t} \begin{pmatrix} 1 & t & \frac{t^2}{2} & \cdots & \cdots & \frac{t^{k-1}}{(k-1)!} \\ & 1 & t & \cdots & \cdots & \frac{t^{k-2}}{(k-2)!} \\ & & \ddots & \ddots & & \vdots \\ & & & \ddots & \ddots & \vdots \\ & & & & 1 & t \\ & & & & & 1 \end{pmatrix}. \end{align*}$
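The formula can be checked against a library matrix exponential. In the sketch below (assuming NumPy and SciPy; the helper `expm_jordan_block` is ours, written only to mirror the formula above), the $c$-th superdiagonal is filled with $\frac{t^c}{c!}$ and the whole matrix is scaled by $e^{\lambda t}$.

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

def expm_jordan_block(lam, k, t):
    """e^{Jt} for a k x k Jordan block with eigenvalue lam, via the closed formula."""
    block = np.zeros((k, k))
    for c in range(k):  # c = offset above the main diagonal
        block += np.diag(np.full(k - c, t**c / factorial(c)), k=c)
    return np.exp(lam * t) * block

lam, k, t = -1.0, 4, 0.7
J = lam * np.eye(k) + np.diag(np.ones(k - 1), k=1)   # the Jordan block itself
print(np.allclose(expm_jordan_block(lam, k, t), expm(J * t)))  # True
```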
Demonstration
Now, we’re ready to run through an example. We shall solve the system below.
$\begin{align*} \begin{matrix} \frac{dx_1}{dt} &=& x_1 + x_2 - x_3 + x_4 - x_5 & | & x_1(0)=1 \\ \frac{dx_2}{dt} &=& -x_2 + x_5 & | & x_2(0)=0 \\ \frac{dx_3}{dt} &=& x_2 - x_3 & | & x_3(0)=1 \\ \frac{dx_4}{dt} &=& 3x_2 - 2x_3 + x_4 -3x_5 & | & x_4(0)=0 \\ \frac{dx_5}{dt} &=& -x_5 & | & x_5(0)=1 \end{matrix} \end{align*}$
First, we convert the system to matrix form.
$\begin{align*} \begin{pmatrix} \frac{dx_1}{dt} \\ \frac{dx_2}{dt} \\ \frac{dx_3}{dt} \\ \frac{dx_4}{dt} \\ \frac{dx_5}{dt} \end{pmatrix} &= \begin{pmatrix} 1 & 1 & -1 & 1 & -1 \\ 0 & -1 & 0 & 0 & 1 \\ 0 & 1 & -1 & 0 & 0 \\ 0 & 3 & -2 & 1 & -3 \\ 0 & 0 & 0 & 0 & -1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix} \\ \frac{dx}{dt} &= \begin{pmatrix} 1 & 1 & -1 & 1 & -1 \\ 0 & -1 & 0 & 0 & 1 \\ 0 & 1 & -1 & 0 & 0 \\ 0 & 3 & -2 & 1 & -3 \\ 0 & 0 & 0 & 0 & -1 \end{pmatrix} x \end{align*}$
Next, we write the matrix in the form $PJP^{-1}$ where $J$ is a Jordan matrix. We did this for the same matrix in the previous chapter, so we will simply reuse that result here.
$\begin{align*} \begin{pmatrix} 1 & 1 & -1 & 1 & -1 \\ 0 & -1 & 0 & 0 & 1 \\ 0 & 1 & -1 & 0 & 0 \\ 0 & 3 & -2 & 1 & -3 \\ 0 & 0 & 0 & 0 & -1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 & & & \\ 0 & 1 & & & \\ & & -1 & 1 & 0 \\ & & 0 &-1 & 1 \\ & & 0 & 0 & -1 \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & -1 & 1 & -1 \\ 0 & -1 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & -1 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \end{align*}$
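Since this factorization is quoted from the previous chapter, it is worth confirming that it really does multiply back to the original matrix. A minimal check, assuming NumPy:

```python
import numpy as np

A = np.array([[1, 1, -1, 1, -1],
              [0, -1, 0, 0, 1],
              [0, 1, -1, 0, 0],
              [0, 3, -2, 1, -3],
              [0, 0, 0, 0, -1]], dtype=float)
P = np.array([[1, 0, 0, 0, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0],
              [0, 1, 1, 0, 0],
              [0, 0, 0, 0, 1]], dtype=float)
J = np.array([[1, 1, 0, 0, 0],
              [0, 1, 0, 0, 0],
              [0, 0, -1, 1, 0],
              [0, 0, 0, -1, 1],
              [0, 0, 0, 0, -1]], dtype=float)
P_inv = np.array([[1, 0, 0, 0, 0],
                  [0, 1, -1, 1, -1],
                  [0, -1, 1, 0, 1],
                  [0, 1, 0, 0, -1],
                  [0, 0, 0, 0, 1]], dtype=float)

print(np.allclose(P @ P_inv, np.eye(5)))   # True: P_inv really is P^{-1}
print(np.allclose(P @ J @ P_inv, A))       # True: A = P J P^{-1}
```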
We know that the solution to the system is given by $x=Pe^{Jt}P^{-1}x(0)$, which we can multiply out once we compute $e^{Jt}$. We break up the computation of $e^{Jt}$ across the two blocks within $J$.
$\begin{align*} J_1 = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \hspace{1cm} J_2 = \begin{pmatrix} -1 & 1 & 0 \\ 0 & -1 & 1 \\ 0 & 0 & -1 \end{pmatrix} \end{align*}$
Applying our formula from earlier, we have
$\begin{align*} e^{J_1 t} = e^t \begin{pmatrix} 1 & t \\ 0 & 1 \end{pmatrix} \hspace{1cm} e^{J_2 t} = e^{-t} \begin{pmatrix} 1 & t & \frac{t^2}{2} \\ 0 & 1 & t \\ 0 & 0 & 1 \end{pmatrix}. \end{align*}$
Putting this together, we have
$\begin{align*} e^{Jt} = \begin{pmatrix} e^t & te^t & & & \\ 0 & e^t & & & \\ & & e^{-t} & te^{-t} & \frac{t^2}{2} e^{-t} \\ & & 0 & e^{-t} & te^{-t} \\ & & 0 & 0 & e^{-t} \end{pmatrix}. \end{align*}$
Finally, we compute the solution.
$\begin{align*} x=Pe^{Jt}P^{-1} x(0) \end{align*}$
$\begin{align*} x=\begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} e^t & te^t & & & \\ 0 & e^t & & & \\ & & e^{-t} & te^{-t} & \frac{t^2}{2} e^{-t} \\ & & 0 & e^{-t} & te^{-t} \\ & & 0 & 0 & e^{-t} \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}^{-1} \begin{pmatrix} 1 \\ 0 \\ 1 \\ 0 \\ 1 \end{pmatrix} \end{align*}$
$\begin{align*} x=\begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} e^t & te^t & & & \\ 0 & e^t & & & \\ & & e^{-t} & te^{-t} & \frac{t^2}{2} e^{-t} \\ & & 0 & e^{-t} & te^{-t} \\ & & 0 & 0 & e^{-t} \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & -1 & 1 & -1 \\ 0 & -1 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & -1 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 1 \\ 0 \\ 1 \end{pmatrix} \end{align*}$
$\begin{align*} x=\begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} e^t & te^t & & & \\ 0 & e^t & & & \\ & & e^{-t} & te^{-t} & \frac{t^2}{2} e^{-t} \\ & & 0 & e^{-t} & te^{-t} \\ & & 0 & 0 & e^{-t} \end{pmatrix} \begin{pmatrix} 1 \\ -2 \\ 2 \\ -1 \\ 1 \end{pmatrix} \end{align*}$
$\begin{align*} x=\begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} (1-2t)e^t \\ -2e^t \\ \left( \frac{t^2}{2} - t + 2 \right) e^{-t} \\ (t-1)e^{-t} \\ e^{-t} \end{pmatrix} \end{align*}$
$\begin{align*} x=\begin{pmatrix} \left( 1-2t \right) e^t \\ te^{-t} \\ \left( \frac{t^2}{2} + 1 \right) e^{-t} \\ \left( \frac{t^2}{2} - t + 2 \right) e^{-t} - 2e^t \\ e^{-t} \end{pmatrix} \end{align*}$
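As a final check, the closed-form answer can be compared against $e^{At}x(0)$ computed directly by a library routine. A minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1, 1, -1, 1, -1],
              [0, -1, 0, 0, 1],
              [0, 1, -1, 0, 0],
              [0, 3, -2, 1, -3],
              [0, 0, 0, 0, -1]], dtype=float)
x0 = np.array([1.0, 0.0, 1.0, 0.0, 1.0])

def closed_form(t):
    """The solution derived above, evaluated at time t."""
    et, emt = np.exp(t), np.exp(-t)
    return np.array([(1 - 2*t) * et,
                     t * emt,
                     (t**2 / 2 + 1) * emt,
                     (t**2 / 2 - t + 2) * emt - 2 * et,
                     emt])

t = 0.8
print(np.allclose(expm(A * t) @ x0, closed_form(t)))  # True
```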
Exercises
Solve each system of linear differential equations by converting it to a matrix equation $\frac{dx}{dt}=Ax$ and computing the solution $x=e^{At}x(0)$.
$\begin{align*} 1) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_1 + x_2 & | & x_1(0)=0 \\ \frac{dx_2}{dt} &=& -x_2 & | & x_2(0)=1 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \frac{1}{2}e^t - \frac{1}{2} e^{-t} \\ e^{-t} \end{pmatrix} \end{align*}$
$\begin{align*} 2) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_1 + x_2 & | & x_1(0)=1 \\ \frac{dx_2}{dt} &=& x_1-x_2 & | & x_2(0)=1 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \left( \frac{1+\sqrt{2}}{2} \right) e^{\sqrt{2}t} + \left( \frac{1-\sqrt{2}}{2} \right) e^{-\sqrt{2}t} \\ \frac{1}{2} e^{\sqrt{2}t} + \frac{1}{2} e^{-\sqrt{2}t} \end{pmatrix} \end{align*}$
$\begin{align*} 3) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_2 & | & x_1(0)=0 \\ \frac{dx_2}{dt} &=& x_1+x_3 & | & x_2(0)=0 \\ \frac{dx_3}{dt} &=& x_3 & | & x_3(0)=1 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \frac{1}{4}(2t-1)e^t + \frac{1}{4}e^{-t} \\ \frac{1}{4}(2t+1)e^t - \frac{1}{4} e^{-t} \\ e^t \end{pmatrix} \end{align*}$
$\begin{align*} 4) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_1+x_3 & | & x_1(0)=0 \\ \frac{dx_2}{dt} &=& x_1+x_2 & | & x_2(0)=1 \\ \frac{dx_3}{dt} &=& x_2+x_3 & | & x_3(0)=1 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \frac{2}{3} e^{2t} - \frac{2}{3} e^{t/2} \cos \left( \frac{\sqrt{3}}{2} t \right) \\ \frac{2}{3} e^{2t} + \frac{1}{3} e^{t/2} \left[ \cos \left( \frac{\sqrt{3}}{2} t \right) - \sqrt{3} \sin \left( \frac{\sqrt{3}}{2} t \right) \right] \\ \frac{2}{3} e^{2t} + \frac{1}{3} e^{t/2} \left[ \cos \left( \frac{\sqrt{3}}{2} t \right) + \sqrt{3} \sin \left( \frac{\sqrt{3}}{2} t \right) \right] \end{pmatrix} \end{align*}$
$\begin{align*} 5) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_2 & | & x_1(0)=1 \\ \frac{dx_2}{dt} &=& x_3 & | & x_2(0)=0 \\ \frac{dx_3}{dt} &=& x_4 & | & x_3(0)=0 \\ \frac{dx_4}{dt} &=& x_1 & | & x_4(0)=0 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \frac{1}{4} \left[ e^t + e^{-t} + 2 \cos t \right] \\ \frac{1}{4} \left[ e^t - e^{-t} - 2 \sin t \right] \\ \frac{1}{4} \left[ e^t + e^{-t} - 2 \cos t \right] \\ \frac{1}{4} \left[ e^t - e^{-t} + 2 \sin t \right] \end{pmatrix} \end{align*}$
$\begin{align*} 6) \hspace{.5cm} \begin{matrix} \frac{dx_1}{dt} &=& x_2-x_3 & | & x_1(0)=1 \\ \frac{dx_2}{dt} &=& x_3-x_4 & | & x_2(0)=0 \\ \frac{dx_3}{dt} &=& x_4-x_1 & | & x_3(0)=0 \\ \frac{dx_4}{dt} &=& x_1-x_2 & | & x_4(0)=0 \end{matrix} \end{align*}$
Solution:
$\begin{align*} x=\begin{pmatrix} \frac{1}{4} \left[ 1+e^{-2t} + 2e^t \cos t \right] \\ \frac{1}{4} \left[ 1-e^{-2t} - 2e^t \sin t \right] \\ \frac{1}{4} \left[ 1+e^{-2t} - 2e^t \cos t \right] \\ \frac{1}{4} \left[ 1-e^{-2t} + 2e^t \sin t \right] \end{pmatrix} \end{align*}$
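Each exercise can be verified the same way, by comparing the claimed solution to $e^{At}x(0)$ evaluated numerically. A minimal sketch for exercise 1, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

# Exercise 1: dx1/dt = x1 + x2, dx2/dt = -x2, with x1(0) = 0, x2(0) = 1.
A = np.array([[1.0, 1.0],
              [0.0, -1.0]])
x0 = np.array([0.0, 1.0])

t = 1.5
claimed = np.array([0.5 * np.exp(t) - 0.5 * np.exp(-t),
                    np.exp(-t)])
print(np.allclose(expm(A * t) @ x0, claimed))  # True
```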