We are going to prove a theorem connecting invertibility, RREF, and solutions of matrix equations. To do it we need a lemma on the RREF of square matrices.
Let A be an n × n matrix in RREF which is not Iₙ. Then A has a column with no leading entry.
We’ll prove the contrapositive: if every column of A has a leading entry, then A = Iₙ. Suppose every column has a leading entry, so there are n leading entries. There’s at most one leading entry per row and there are n rows, so every row must have a leading entry.
The leading entries go from left to right as we move down the rows of the matrix, so the leading entries in rows 1, 2, …, n must be in columns 1, 2, …, n respectively, otherwise there would be no room to fit them in.
Because A is in RREF, columns with leading entries have zeroes in all other positions. So the first column is (1, 0, …, 0)ᵀ, the second column is (0, 1, 0, …, 0)ᵀ, and so on. These are the columns of the identity matrix, so A = Iₙ. ∎
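The lemma can be illustrated computationally. Below is a minimal sketch (the helper name `leading_entry_columns` is our choice, not from the notes), assuming its input really is in RREF: it records which columns contain a leading entry, so a square RREF matrix other than the identity shows up as one with a missing column index.

```python
def leading_entry_columns(R):
    """Return the set of column indices holding a leading entry.

    R is assumed to already be in RREF, so the leading entry of each
    nonzero row is its first nonzero entry.
    """
    cols = set()
    for row in R:
        for j, entry in enumerate(row):
            if entry != 0:
                cols.add(j)   # first nonzero entry of the row = leading entry
                break
    return cols

# A 3 x 3 RREF matrix that is not the identity: column 2 has no leading entry.
R = [[1, 0, 5],
     [0, 1, 2],
     [0, 0, 0]]
print(leading_entry_columns(R))   # column index 2 is missing
```

For the 3 × 3 identity matrix the function returns all three column indices, matching the contrapositive direction of the proof.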
Now we can prove the theorem.
For an n × n matrix A, the following statements are equivalent.
(1) A is invertible.
(2) The only solution to Ax = 0 is x = 0.
(3) There is a sequence of row operations taking A to Iₙ.
We will show that (1) implies (2), (2) implies (3), and (3) implies (1). Once this is done, any one of the three statements implies the other two.
(1) ⇒ (2). If Ax = 0, then multiplying on the left by A⁻¹ gives x = A⁻¹Ax = A⁻¹0 = 0.
(2) ⇒ (3). Let R be a matrix in RREF obtained by doing a sequence of row operations to A; we know such a matrix exists by Theorem 3.10.1. The solutions to Rx = 0 are the same as the solutions to Ax = 0 by Theorem 3.9.1, so Rx = 0 has only the solution x = 0. R must be Iₙ, otherwise by Lemma 3.12.1 it would have a column with no leading entry, so there would be a free variable in the solutions to Rx = 0 and in particular there would be a nonzero solution.
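The step above can be checked on examples. The sketch below (the function name `rref` and the use of Python’s `fractions` module for exact arithmetic are our choices, not from the notes) row reduces a matrix to RREF: an invertible matrix reduces all the way to the identity, while a singular one leaves a column with no leading entry.

```python
from fractions import Fraction

def rref(A):
    """Row reduce a matrix (a list of rows) to RREF, exactly."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0                                         # next pivot row
    for j in range(cols):
        # find a row at or below r with a nonzero entry in column j
        p = next((i for i in range(r, rows) if M[i][j] != 0), None)
        if p is None:
            continue                              # no leading entry in column j
        M[r], M[p] = M[p], M[r]                   # swap it into position
        pivot = M[r][j]
        M[r] = [x / pivot for x in M[r]]          # make the leading entry 1
        for i in range(rows):                     # clear the rest of column j
            if i != r and M[i][j] != 0:
                f = M[i][j]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == rows:
            break
    return M

# An invertible matrix row reduces to the identity:
assert rref([[2, 1], [5, 3]]) == [[1, 0], [0, 1]]
# A singular one does not: its RREF has a column with no leading entry.
assert rref([[1, 2], [2, 4]]) == [[1, 2], [0, 0]]
```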
(3) ⇒ (1). Since there is a sequence of row operations taking A to Iₙ, there is an invertible matrix E such that EA = Iₙ by Theorem 3.8.3. Then A = E⁻¹(EA) = E⁻¹Iₙ = E⁻¹, so A is invertible with inverse E. ∎
We saw in the previous chapter that an arbitrary function could have a left inverse but not a right inverse, or a right inverse but not a left inverse. The same is not true of square matrices.
Let A be an n × n matrix. The following are equivalent:
(i) A is invertible.
(ii) There is a matrix B such that BA = Iₙ.
(iii) There is a matrix C such that AC = Iₙ.
We will show that (i) ⇔ (ii) and then that (i) ⇔ (iii).
Certainly (i) implies (ii). Conversely, suppose that BA = Iₙ. Let x be any solution of Ax = 0. Multiplying on the left by B we get x = BAx = B0 = 0, so A is invertible by Theorem 3.12.2.
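The surprising part of the theorem — that a one-sided inverse of a square matrix is automatically two-sided — can be seen on a concrete pair of matrices. In the sketch below (`matmul` is a helper we define here, not something from the notes), B is a left inverse of A, and it is a right inverse as well:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 1], [5, 3]]
B = [[3, -1], [-5, 2]]
I2 = [[1, 0], [0, 1]]

assert matmul(B, A) == I2   # B is a left inverse of A...
assert matmul(A, B) == I2   # ...and, as the theorem predicts, a right inverse too
```

Contrast this with functions: a non-bijective function can have a left inverse with no right inverse, and no amount of checking one side tells you about the other.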
Again, (i) obviously implies (iii). If AC = Iₙ then transposing both sides gives CᵀAᵀ = Iₙ. By the same argument as above, Aᵀ is invertible, and so A is invertible by Theorem 3.5.5. ∎
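The transposing step in the last proof relies on the identity (XY)ᵀ = YᵀXᵀ, which turns a right inverse of A into a left inverse of Aᵀ. A quick check on the same example (the helpers `transpose` and `matmul` are ours):

```python
def transpose(M):
    """Transpose a matrix given as a list of rows."""
    return [list(col) for col in zip(*M)]

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 1], [5, 3]]
C = [[3, -1], [-5, 2]]
I2 = [[1, 0], [0, 1]]

assert matmul(A, C) == I2                      # C is a right inverse of A
# Transposing AC = I gives C^T A^T = I: C^T is a left inverse of A^T.
assert matmul(transpose(C), transpose(A)) == transpose(I2)
```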