Matrix Multiplication
How to multiply a matrix by another matrix.
This post is part of the book Justin Math: Linear Algebra. Suggested citation: Skycak, J. (2019). Matrix Multiplication. In Justin Math: Linear Algebra. https://justinmath.com/matrix-multiplication/
We have seen how to multiply a vector by a matrix. Now, we will see how to multiply a matrix by another matrix.
Whereas multiplying a vector by a matrix corresponds to a linear transformation of that vector, multiplying a matrix by another matrix corresponds to a composition of linear transformations.
General Procedure
The procedure for matrix multiplication is quite familiar: we simply multiply each column vector in the right matrix by the left matrix.
Really, we’re just trying to figure out where the points $(1,0)$ and $(0,1)$ map to after being transformed once by the right matrix and then again by the left matrix. We already know that the right matrix maps those points to its columns, so all we have to do is map those columns according to the left matrix.
An example is shown below.
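For instance, with two matrices chosen here purely for illustration, each column of the right matrix gets multiplied by the left matrix to produce the corresponding column of the product:
$\begin{align*} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 1(5)+2(7) & 1(6)+2(8) \\ 3(5)+4(7) & 3(6)+4(8) \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix} \end{align*}$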
We can verify that multiplying a vector by this new matrix gives the same result as multiplying the vector first by the original right matrix, and then by the original left matrix.
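For example, using the test vector $(1,1)$ (any vector would work here), transforming by the right matrix and then the left matrix gives the same result as transforming once by the product:
$\begin{align*} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 11 \\ 15 \end{pmatrix}, \hspace{.5cm} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 11 \\ 15 \end{pmatrix} = \begin{pmatrix} 41 \\ 93 \end{pmatrix}, \hspace{.5cm} \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 41 \\ 93 \end{pmatrix} \end{align*}$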
Case of Rectangular Matrices
Matrix multiplication isn’t limited to just square matrices. The matrices can be rectangular, too.
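As a sketch with matrices chosen for illustration, here a $3 \times 2$ matrix multiplies a $2 \times 2$ matrix: each column of the right matrix has two entries, so it can be multiplied by the left matrix, and the result is a $3 \times 2$ matrix.
$\begin{align*} \begin{pmatrix} 1 & 0 \\ 2 & 1 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 5 & 3 \\ 3 & 3 \end{pmatrix} \end{align*}$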
But notice that if we switch the above example around, it no longer makes sense to multiply the matrices, because we are unable to multiply each column vector in the right matrix by the left matrix. There are fewer columns in the left matrix than there are entries in each column of the right matrix.
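Concretely, with the illustrative matrices above, the switched product would pair a left matrix having only two columns against right-matrix columns containing three entries each:
$\begin{align*} \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 2 & 1 \\ 0 & 3 \end{pmatrix} \hspace{.5cm} \text{(not defined)} \end{align*}$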
Criterion for Multiplication
The trick to telling whether matrix multiplication is defined in a particular case is to check whether the width of the left matrix matches the height of the right matrix.
Matrix dimensions are usually written as $\text{height } \times \text{ width}$, so matrix multiplication is defined whenever the inner dimensions match up.
For example, suppose the left matrix in a product has dimensions $4 \times 3$ and the right matrix has dimensions $3 \times 2$.
Writing these dimensions in the order of multiplication, we see that the inner dimensions do indeed match up: they’re $3$ and $3$.
Moreover, the outer dimensions give the dimensions of the resulting product: $4 \times 2$.
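Written schematically, with the entries themselves omitted, the dimension check for this product reads:
$\begin{align*} (4 \times 3)(3 \times 2) \longrightarrow 4 \times 2 \end{align*}$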
On the other hand, suppose the left matrix has dimensions $3 \times 2$ and the right matrix has dimensions $4 \times 3$. The inner dimensions don't match up: they're $2$ and $4$. Therefore, the matrix multiplication is not defined.
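Schematically, the failed check reads:
$\begin{align*} (3 \times 2)(4 \times 3) \longrightarrow \text{not defined, since } 2 \neq 4 \end{align*}$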
Notice the implications for square matrices: multiplication is defined for square matrices only when they both have the same dimensions, say $N \times N$, and multiplication remains defined even if we switch the order of the square matrices, because the inner dimensions still match.
Moreover, the output is itself a square matrix of the same dimension, $N \times N$.
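In terms of dimensions, either order of multiplication passes the check:
$\begin{align*} (N \times N)(N \times N) \longrightarrow N \times N \end{align*}$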
Non-Commutativity
Even for square matrices, though, matrix multiplication is generally not commutative – if we switch the order of two matrices in a product, we tend to get a different result.
For example, switching the two matrices in the earlier worked example yields a different result.
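With the matrices from the General Procedure sketch (chosen there for illustration), both orders can be computed directly:
$\begin{align*} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}, \hspace{.5cm} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix} \end{align*}$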
Even simple matrices generally do not commute.
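For instance, take the following pair, built only from zeros and ones; multiplying them in the two different orders gives two different products:
$\begin{align*} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}, \hspace{.5cm} \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} \end{align*}$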
The reason matrices tend not to commute is that left-multiplication and right-multiplication have different interpretations: left-multiplication sums combinations of row vectors, whereas right-multiplication sums combinations of column vectors.
Applying some operation to the rows of a matrix is generally not the same as applying that operation to the columns of a matrix.
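To see this concretely in the illustrative product from earlier: the first row of the result is a combination of the rows of the right matrix, weighted by the first row of the left matrix (the left-multiplication interpretation), while the first column of the result is a combination of the columns of the left matrix, weighted by the first column of the right matrix (the right-multiplication interpretation).
$\begin{align*} \begin{pmatrix} 19 & 22 \end{pmatrix} &= 1 \begin{pmatrix} 5 & 6 \end{pmatrix} + 2 \begin{pmatrix} 7 & 8 \end{pmatrix} \\ \begin{pmatrix} 19 \\ 43 \end{pmatrix} &= 5 \begin{pmatrix} 1 \\ 3 \end{pmatrix} + 7 \begin{pmatrix} 2 \\ 4 \end{pmatrix} \end{align*}$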
Diagonal Matrices
That being said, there are some instances in which matrices do commute.
For example, diagonal matrices commute with each other. (A diagonal matrix has zero entries everywhere except along the diagonal running from the top-left entry to the bottom-right entry.)
Diagonal matrices commute with each other because the diagonal components end up being multiplied independently as scalars rather than vectors, and scalar multiplication does in fact commute.
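In the $2 \times 2$ case, for instance, each diagonal entry of the product is just the product of the corresponding diagonal entries, regardless of the order of multiplication:
$\begin{align*} \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix} \begin{pmatrix} c & 0 \\ 0 & d \end{pmatrix} = \begin{pmatrix} ac & 0 \\ 0 & bd \end{pmatrix} = \begin{pmatrix} c & 0 \\ 0 & d \end{pmatrix} \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix} \end{align*}$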
Be aware, though, that antidiagonal matrices generally do not commute with each other. (An antidiagonal matrix is like a diagonal matrix, but with the diagonal running from top-right to bottom-left.)
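In the $2 \times 2$ case, for example, the two orders produce diagonal matrices whose entries are swapped, so the products agree only in special cases such as $ad = bc$:
$\begin{align*} \begin{pmatrix} 0 & a \\ b & 0 \end{pmatrix} \begin{pmatrix} 0 & c \\ d & 0 \end{pmatrix} = \begin{pmatrix} ad & 0 \\ 0 & bc \end{pmatrix}, \hspace{.5cm} \begin{pmatrix} 0 & c \\ d & 0 \end{pmatrix} \begin{pmatrix} 0 & a \\ b & 0 \end{pmatrix} = \begin{pmatrix} bc & 0 \\ 0 & ad \end{pmatrix} \end{align*}$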
Exercises
Compute the product of the given matrices, if possible, using A) the left-multiplication interpretation, and B) the right-multiplication interpretation.
Otherwise, if it is not possible to compute the product, then state the dimensions that A) the left matrix or B) the right matrix would need to have for the multiplication to be defined.
$\begin{align*} 1) \hspace{.5cm} \begin{pmatrix} 3 & -1 \\ 1 & -2 \end{pmatrix} \begin{pmatrix} 2 & 3 \\ 0 & 5 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 6 & 4 \\ 2 & -7 \end{pmatrix} \end{align*}$
$\begin{align*} 2) \hspace{.5cm} \begin{pmatrix} 4 & 0 \\ -2 & 3 \\ 3 & 1 \\ 1 & 5 \end{pmatrix} \begin{pmatrix} 3 & 1 & 1 \\ 4 & 5 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 12 & 4 & 4 \\ 6 & 13 & 1 \\ 13 & 8 & 4 \\ 23 & 26 & 6 \end{pmatrix} \end{align*}$
$\begin{align*} 3) \hspace{.5cm} \begin{pmatrix} -4 & 1 \\ 2 & 3 \\ -1 & 3 \end{pmatrix} \begin{pmatrix} 2 & 2 \\ 0 & 4 \\ 7 & -3 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} &\mbox{A) } N \times 3 \\ &\mbox{B) } 2 \times N \end{align*}$
$\begin{align*} 4) \hspace{.5cm} \begin{pmatrix} 4 & 3 & 0 \\ 2 & 1 & -2 \end{pmatrix} \begin{pmatrix} 1 & 2 & 1 \\ 0 & -1 & 2 \\ -1 & 1 & 0 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 4 & 5 & 10 \\ 4 & 1 & 4 \end{pmatrix} \end{align*}$
$\begin{align*} 5) \hspace{.5cm} \begin{pmatrix} 1 & 3 & 2 & -1 \end{pmatrix} \begin{pmatrix} 2 & 0 \\ 4 & 3 \\ 2 & 6 \\ 1 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 17 & 20 \end{pmatrix} \end{align*}$
$\begin{align*} 6) \hspace{.5cm} \begin{pmatrix} 2 & 1 \\ 4 & 0 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 3 & 2 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} &\mbox{A) } N \times 1 \\ &\mbox{B) } 2 \times N \end{align*}$
$\begin{align*} 7) \hspace{.5cm} \begin{pmatrix} 1 & 2 & 3 \end{pmatrix} \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 10 \end{pmatrix} \end{align*}$
$\begin{align*} 8) \hspace{.5cm} \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix}3 & 6 & 9 \\ 2 & 4 & 6 \\ 1 & 2 & 3 \end{pmatrix} \end{align*}$
$\begin{align*} 9) \hspace{.5cm} \begin{pmatrix} 1 & 4 & 1 \\ 0 & 5 & 0 \\ 4 & 4 & 3 \end{pmatrix} \begin{pmatrix} -1 & 2 \\ 3 & -2 \\ 1 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 12 & -5 \\ 15 & -10 \\ 11 & 3 \end{pmatrix} \end{align*}$
$\begin{align*} 10) \hspace{.5cm} \begin{pmatrix} 3 & 5 \\ 3 & -2 \end{pmatrix} \begin{pmatrix} 3 & 1 & 4 & 0 \\ 3 & 1 & 1 & -1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 24 & 8 & 17 & -5 \\ 3 & 1 & 10 & 2 \end{pmatrix} \end{align*}$