Matrix Multiplication

by Justin Skycak

How to multiply a matrix by another matrix.

This post is a chapter in the book Justin Math: Linear Algebra. Suggested citation: Skycak, J. (2019). Matrix Multiplication. Justin Math: Linear Algebra. https://justinmath.com/matrix-multiplication/


We have seen how to multiply a vector by a matrix. Now, we will see how to multiply a matrix by another matrix.

Whereas multiplying a vector by a matrix corresponds to a linear transformation of that vector, multiplying a matrix by another matrix corresponds to a composition of linear transformations.

General Procedure

The procedure for matrix multiplication is quite familiar: we simply multiply each column vector in the right matrix by the left matrix.

Really, we’re just trying to figure out where the points $(1,0)$ and $(0,1)$ map to after being transformed once by the right matrix and then again by the left matrix. We already know that the right matrix maps those points to its columns, so all we have to do is map those columns according to the left matrix.

An example is shown below.

$\begin{align*} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} &= \begin{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 \\ 3 \end{pmatrix} & \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 2 \\ 4 \end{pmatrix} \end{pmatrix} \\ &= \begin{pmatrix} \begin{pmatrix} 23 \\ 31 \end{pmatrix} & \begin{pmatrix} 34 \\ 46 \end{pmatrix} \end{pmatrix} \\ &= \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix} \end{align*}$
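

For readers who want to experiment, here is a minimal Python sketch (assuming the NumPy library, which is not part of the original exposition) that carries out the column-by-column procedure on the example above:

```python
import numpy as np

left = np.array([[5, 6], [7, 8]])
right = np.array([[1, 2], [3, 4]])

# Multiply each column of the right matrix by the left matrix ...
columns = [left @ right[:, j] for j in range(right.shape[1])]

# ... and collect the results as the columns of the product.
product = np.column_stack(columns)

print(product)
# [[23 34]
#  [31 46]]

# Matches NumPy's built-in matrix multiplication.
print(np.array_equal(product, left @ right))  # True
```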


We can verify that multiplying a vector by this new matrix gives the same result as multiplying the vector first by the original right matrix, and then by the original left matrix.

$\begin{align*} &\begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 5 \\ 11 \end{pmatrix} = \begin{pmatrix} 91 \\ 123 \end{pmatrix} \\ &\begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 91 \\ 123 \end{pmatrix} \end{align*}$
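

The same check can be run in code: transforming the vector by the product matrix gives the same result as applying the two transformations in sequence. A quick NumPy sketch:

```python
import numpy as np

left = np.array([[5, 6], [7, 8]])
right = np.array([[1, 2], [3, 4]])
v = np.array([1, 2])

# Transform v by the right matrix, then by the left matrix ...
two_steps = left @ (right @ v)

# ... versus transforming v once by the product matrix.
one_step = (left @ right) @ v

print(two_steps)  # [ 91 123]
print(one_step)   # [ 91 123]
```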


Case of Rectangular Matrices

Matrix multiplication isn’t limited to just square matrices. The matrices can be rectangular, too.

$\begin{align*} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} &= \begin{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} \begin{pmatrix} 1 \\ 3 \\ 5 \end{pmatrix} &\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} \begin{pmatrix} 2 \\ 4 \\ 6 \end{pmatrix} \end{pmatrix} \\ &= \begin{pmatrix} \begin{pmatrix} 22 \\ 49 \\ 76 \\ 103 \end{pmatrix} & \begin{pmatrix} 28 \\ 64 \\ 100 \\ 136 \end{pmatrix} \end{pmatrix} \\ &= \begin{pmatrix} 22 & 28 \\ 49 & 64 \\ 76 & 100 \\ 103 & 136 \end{pmatrix} \end{align*}$


But notice that if we reverse the order of the matrices in the above example, the multiplication no longer makes sense: we are unable to multiply each column vector in the right matrix by the left matrix, because the left matrix has fewer columns than each column of the right matrix has entries.

$\begin{align*} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} &= \begin{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 1 \\ 4 \\ 7 \\ 10 \end{pmatrix} &\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 2 \\ 5 \\ 8 \\ 11 \end{pmatrix} &\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 3 \\ 6 \\ 9 \\ 12 \end{pmatrix} \end{pmatrix} \\ &= \begin{pmatrix} ? & ? & ? \end{pmatrix} \end{align*}$


Criterion for Multiplication

The trick to telling whether matrix multiplication is defined in a particular case is to check whether the width of the left matrix matches the height of the right matrix.

Matrix dimensions are usually written as $\text{height } \times \text{ width}$, so matrix multiplication is defined whenever the inner dimensions match up.

For example, in the multiplication

$\begin{align*} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} = \begin{pmatrix} 22 & 28 \\ 49 & 64 \\ 76 & 100 \\ 103 & 136 \end{pmatrix} \end{align*}$


the left matrix has dimensions $4 \times 3$ and the right matrix has dimensions $3 \times 2$.

Writing these dimensions in the order of multiplication, we see that the inner dimensions do indeed match up: they’re $3$ and $3$.

$\begin{align*} (4 \times 3) \times (3 \times 2) \end{align*}$


Moreover, the outer dimensions give the dimensions of the resulting product: $4 \times 2$.

On the other hand, the matrices in the multiplication

$\begin{align*} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} = \hspace{.25cm} ? \end{align*}$


have dimensions $(3 \times 2) \times (4 \times 3)$. The inner dimensions don’t match up: they’re $2$ and $4$. Therefore, the matrix multiplication is not defined.
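
A minimal NumPy sketch illustrates the criterion: the first product is defined and takes its shape from the outer dimensions, while the reversed product raises an error because the inner dimensions don't match.

```python
import numpy as np

A = np.ones((4, 3))  # height 4, width 3
B = np.ones((3, 2))  # height 3, width 2

# Inner dimensions match (3 and 3), so the product is defined,
# and the outer dimensions give its shape: 4 x 2.
print((A @ B).shape)  # (4, 2)

# Reversing the order gives (3 x 2) times (4 x 3); the inner
# dimensions (2 and 4) don't match, so NumPy raises an error.
try:
    B @ A
except ValueError as error:
    print(error)
```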

Notice the implications for square matrices: multiplication is defined only when both matrices have the same dimensions, say $N \times N$. Multiplication remains defined even if we switch the order of the square matrices, because the dimensions of the product stay the same:

$\begin{align*} (N \times N) \times (N \times N) \end{align*}$


Moreover, the output is itself a square matrix of the same dimension, $N \times N$.

Non-Commutativity

Even for square matrices, though, matrix multiplication is generally not commutative – if we switch the order of two matrices in a product, we tend to get a different result.

For example, switching the order of the two matrices in our very first example yields a different result:

$\begin{align*} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} &= \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix} \\ \text{ } \\ \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} &= \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix} \end{align*}$
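

A quick NumPy check confirms that the two orders give different products:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A @ B)
# [[19 22]
#  [43 50]]

print(B @ A)
# [[23 34]
#  [31 46]]
```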


Even simple matrices generally do not commute:

$\begin{align*} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} &= \begin{pmatrix} 0 & 1 \\ 2 & 0 \end{pmatrix} \\ \text{ } \\ \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} &= \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix} \end{align*}$


The reason matrices tend not to commute is that left-multiplication and right-multiplication have different interpretations: left-multiplication sums combinations of row vectors, whereas right-multiplication sums combinations of column vectors.

Applying some operation to the rows of a matrix is generally not the same as applying that operation to the columns of a matrix.

$\begin{align*} &\textbf{Left-multiplication of } \begin{pmatrix} a & b \\ c & d \end{pmatrix} \textbf{ by } \begin{pmatrix} A & B \\ C & D \end{pmatrix} \\ &\begin{pmatrix} A & B \\ C & D \end{pmatrix} \begin{pmatrix} a & b \\ c & d \end{pmatrix} \\ &\hspace{.25cm}= \begin{pmatrix} Aa+Bc & Ab+Bd \\ Ca+Dc & Cb+Dd \end{pmatrix} \\ &\hspace{.25cm}= \begin{pmatrix} A\left< a,b \right> + B \left< c,d \right> \\ C \left< a,b \right> + D \left< c,d \right> \end{pmatrix} \end{align*}$


$\begin{align*} &\textbf{Right-multiplication of } \begin{pmatrix} a & b \\ c & d \end{pmatrix} \textbf{ by } \begin{pmatrix} A & B \\ C & D \end{pmatrix} \\ &\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} A & B \\ C & D \end{pmatrix} \\&\hspace{.25cm}= \begin{pmatrix} aA+bC & aB+bD \\ cA+dC & cB+dD \end{pmatrix} \\ &\hspace{.25cm}= \left( A\begin{pmatrix} a \\ c \end{pmatrix} + C\begin{pmatrix} b \\ d \end{pmatrix} \hspace{.25cm} B\begin{pmatrix} a \\ c \end{pmatrix} + D\begin{pmatrix} b \\ d \end{pmatrix} \right) \end{align*}$
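

The following minimal NumPy sketch verifies both interpretations on a small example: each row of the left-product is a combination of the original matrix's rows, and each column of the right-product is a combination of its columns.

```python
import numpy as np

M = np.array([[1, 2], [3, 4]])  # the matrix being multiplied
T = np.array([[5, 6], [7, 8]])  # the multiplier

# Left-multiplication: row i of T @ M is a combination of M's rows,
# weighted by the entries in row i of T.
row_0 = T[0, 0] * M[0, :] + T[0, 1] * M[1, :]
print(np.array_equal(row_0, (T @ M)[0, :]))  # True

# Right-multiplication: column j of M @ T is a combination of M's
# columns, weighted by the entries in column j of T.
col_0 = T[0, 0] * M[:, 0] + T[1, 0] * M[:, 1]
print(np.array_equal(col_0, (M @ T)[:, 0]))  # True
```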


Diagonal Matrices

That being said, there are some instances in which matrices do commute.

For example, diagonal matrices commute with each other. (A diagonal matrix is zero everywhere except on the diagonal running from the top-left entry to the bottom-right entry.)

$\begin{align*} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 3 & 0 \\ 0 & 4 \end{pmatrix} &= \begin{pmatrix} 3 & 0 \\ 0 & 8 \end{pmatrix} \\ \begin{pmatrix} 3 & 0 \\ 0 & 4 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} &= \begin{pmatrix} 3 & 0 \\ 0 & 8 \end{pmatrix} \end{align*}$


Diagonal matrices commute with each other because their corresponding diagonal entries end up being multiplied independently as scalars rather than vectors, and scalar multiplication does in fact commute.

$\begin{align*} \begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix} \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix} = \begin{pmatrix} Aa & 0 \\ 0 & Bb \end{pmatrix} = \begin{pmatrix} aA & 0 \\ 0 & bB \end{pmatrix} = \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix} \begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix} \end{align*}$
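

Again, a quick NumPy check (using np.diag to construct the diagonal matrices) confirms the commutativity:

```python
import numpy as np

D1 = np.diag([1, 2])
D2 = np.diag([3, 4])

# Corresponding diagonal entries multiply independently as scalars,
# so the order of the matrices doesn't matter.
print(D1 @ D2)
# [[3 0]
#  [0 8]]

print(np.array_equal(D1 @ D2, D2 @ D1))  # True
```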


Be aware, though, that antidiagonal matrices generally do not commute with each other. (An antidiagonal matrix is like a diagonal matrix, but with the diagonal running from top-right to bottom-left.)

$\begin{align*} \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 0 & 4 \\ 3 & 0 \end{pmatrix} &= \begin{pmatrix} 6 & 0 \\ 0 & 4 \end{pmatrix} \\ \text{ } \\ \begin{pmatrix} 0 & 4 \\ 3 & 0 \end{pmatrix} \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix} &= \begin{pmatrix} 4 & 0 \\ 0 & 6 \end{pmatrix} \end{align*}$
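

And a quick check confirms that reversing the order of the antidiagonal matrices swaps the diagonal entries of the product:

```python
import numpy as np

A1 = np.array([[0, 2], [1, 0]])
A2 = np.array([[0, 4], [3, 0]])

print(A1 @ A2)
# [[6 0]
#  [0 4]]

print(A2 @ A1)
# [[4 0]
#  [0 6]]
```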


Exercises

Compute the product of the given matrices, if possible, using A) the left-multiplication interpretation, and B) the right-multiplication interpretation.

Otherwise, if it is not possible to compute the product, then state the dimensions that A) the left matrix, or B) the right matrix, would need to have for the multiplication to be defined.

$\begin{align*} 1) \hspace{.5cm} \begin{pmatrix} 3 & -1 \\ 1 & -2 \end{pmatrix} \begin{pmatrix} 2 & 3 \\ 0 & 5 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 6 & 4 \\ 2 & -7 \end{pmatrix} \end{align*}$


$\begin{align*} 2) \hspace{.5cm} \begin{pmatrix} 4 & 0 \\ -2 & 3 \\ 3 & 1 \\ 1 & 5 \end{pmatrix} \begin{pmatrix} 3 & 1 & 1 \\ 4 & 5 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 12 & 4 & 4 \\ 6 & 13 & 1 \\ 13 & 8 & 4 \\ 23 & 26 & 6 \end{pmatrix} \end{align*}$


$\begin{align*} 3) \hspace{.5cm} \begin{pmatrix} -4 & 1 \\ 2 & 3 \\ -1 & 3 \end{pmatrix} \begin{pmatrix} 2 & 2 \\ 0 & 4 \\ 7 & -3 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} &\mbox{A) } N \times 3 \\ &\mbox{B) } 2 \times N \end{align*}$


$\begin{align*} 4) \hspace{.5cm} \begin{pmatrix} 4 & 3 & 0 \\ 2 & 1 & -2 \end{pmatrix} \begin{pmatrix} 1 & 2 & 1 \\ 0 & -1 & 2 \\ -1 & 1 & 0 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 4 & 5 & 10 \\ 4 & 1 & 4 \end{pmatrix} \end{align*}$


$\begin{align*} 5) \hspace{.5cm} \begin{pmatrix} 1 & 3 & 2 & -1 \end{pmatrix} \begin{pmatrix} 2 & 0 \\ 4 & 3 \\ 2 & 6 \\ 1 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 17 & 20 \end{pmatrix} \end{align*}$


$\begin{align*} 6) \hspace{.5cm} \begin{pmatrix} 2 & 1 \\ 4 & 0 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 3 & 2 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} &\mbox{A) } N \times 1 \\ &\mbox{B) } 2 \times N \end{align*}$


$\begin{align*} 7) \hspace{.5cm} \begin{pmatrix} 1 & 2 & 3 \end{pmatrix} \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 10 \end{pmatrix} \end{align*}$


$\begin{align*} 8) \hspace{.5cm} \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix}3 & 6 & 9 \\ 2 & 4 & 6 \\ 1 & 2 & 3 \end{pmatrix} \end{align*}$


$\begin{align*} 9) \hspace{.5cm} \begin{pmatrix} 1 & 4 & 1 \\ 0 & 5 & 0 \\ 4 & 4 & 3 \end{pmatrix} \begin{pmatrix} -1 & 2 \\ 3 & -2 \\ 1 & 1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 12 & -5 \\ 15 & -10 \\ 11 & 3 \end{pmatrix} \end{align*}$


$\begin{align*} 10) \hspace{.5cm} \begin{pmatrix} 3 & 5 \\ 3 & -2 \end{pmatrix} \begin{pmatrix} 3 & 1 & 4 & 0 \\ 3 & 1 & 1 & -1 \end{pmatrix} \end{align*}$
Solution:
$\begin{align*} \begin{pmatrix} 24 & 8 & 17 & -5 \\ 3 & 1 & 10 & 2 \end{pmatrix} \end{align*}$


