Author: Colin O'Keefe

- to introduce matrix multiplication

- to determine when two matrices can be multiplied

- to know how to find the shape of a product of matrices

- to be introduced to the algebraic properties of matrix multiplication

Row and column vector multiplication is introduced as a stepping stone before getting to matrix multiplication. An example is worked in a short video before the algebraic properties of matrix multiplication are covered.

Tutorial

Before you begin this lesson, you ought to be comfortable with the definition of a matrix, with adding and subtracting matrices, and with multiplying matrices by scalar values. These topics are covered here; feel free to review them before continuing.

There is no easy way to approach the multiplication of matrices. While the concept itself is not difficult to grasp, the procedure does seem unnecessarily tedious when it is first encountered. So how do you multiply matrices?

To start, we will look at how to multiply vectors. A **vector** is just a matrix that consists entirely of either a single row or a single column. In the first case, the matrix is called a **row vector**, and in the second it is called a **column vector**. The multiplication of vectors is just a simpler version of the same process that is used to multiply larger matrices.

To multiply a **1 x n** matrix (a row vector with *n* entries) by an **n x 1** matrix (a column vector with *n* entries), we multiply each entry of the row by the corresponding entry of the column and add up the results. The result is a single number, called the **dot product** of the two vectors.

In general, we can express the dot product like this:

$$\begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n$$

It is just the sum of the products of the two vectors' corresponding entries.
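As a quick sketch, the dot product can be computed in plain Python; the `dot` helper here is illustrative, not something defined in the lesson:

```python
# Dot product of a row vector and a column vector, stored as plain lists:
# multiply corresponding entries, then sum the products.
def dot(row, col):
    return sum(a * b for a, b in zip(row, col))

print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```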

Now that we have seen an example using row and column vectors, we'll move on to the real thing. Here is an example. Try to get a sense of what is happening just by looking; don't worry, an explanation follows:

**We find the entries in the product matrix by taking rows from the first matrix and columns from the second matrix and calculating their dot products.** If *a_{i,j}* is the entry in the product found in the *i*-th row and *j*-th column, then *a_{i,j}* is the dot product of the *i*-th row of the first matrix with the *j*-th column of the second.
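The rule can be sketched in plain Python; the `matmul` helper below is illustrative, not part of the lesson:

```python
# Multiply an m x n matrix A by an n x p matrix B.  Entry (i, j) of the
# product is the dot product of row i of A with column j of B.
def matmul(A, B):
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

For instance, the top-left entry 19 is the dot product of the first row of A with the first column of B: 1*5 + 2*7 = 19.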

An example matrix multiplication is worked through in full.

Source: Colin O'Keefe, YouTube

Unlike ordinary multiplication of numbers, in which *xy = yx* (i.e. multiplication is commutative), matrix multiplication is sensitive to the order of the factors. Consider the following example:

In the example we see that **AB** is not the same matrix as **BA**. This is usually the case with matrix multiplication, but not always. For instance, if **A = B**, then **AB = AA = BB = BA**, so clearly it is possible for commutativity to hold for certain special matrices, but such special cases are not the norm.
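For a concrete check, two particular 2 x 2 matrices give different products in each order; the `matmul` helper here is an illustrative sketch, not from the lesson:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 1],
     [0, 1]]
B = [[1, 0],
     [1, 1]]
print(matmul(A, B))  # [[2, 1], [1, 1]]
print(matmul(B, A))  # [[1, 1], [1, 2]]  -- AB != BA
```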

Just like matrix addition, and just like the multiplication of regular numbers, matrix multiplication is associative.

So long as each adjacent pair in a chain of factors can legally be multiplied, the whole chain can be, regardless of where you put the parentheses. In the above, the product **ABC** will be an **m x s** matrix. We can see this easily: the inner dimensions match and cancel in pairs, so an (m x n)(n x r)(r x s) chain leaves only the outer dimensions, m x s.
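A small check of associativity, grouping a three-matrix chain both ways (the `matmul` helper is an illustrative sketch, not from the lesson):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]           # 3 x 2
C = [[2],
     [3]]              # 2 x 1

left = matmul(matmul(A, B), C)   # (AB)C
right = matmul(A, matmul(B, C))  # A(BC)
print(left == right)             # True -- same 2 x 1 matrix either way
```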

Finally, just like scalar multiplication and regular multiplication, a matrix factor can be distributed across a sum, e.g. **A(B + C) = AB + AC** and **(B + C)A = BA + CA**, provided the shapes allow each product.
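Distributivity can be checked on a small example as well; both `matmul` and `matadd` below are illustrative helpers, not defined in the lesson:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def matadd(X, Y):
    # Entrywise sum of two same-shaped matrices.
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[1, 0], [0, 1]]

# A(B + C) equals AB + AC.
print(matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C)))  # True
```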