We are going to prove the following theorem:
A square matrix $A$ is invertible if and only if there is a sequence of row operations taking $A$ to the identity matrix.
We need a lemma to make the proof work.
Let $R$ be an $n \times n$ matrix in RREF. Then either $R = I_n$ or $R$ has a column with no leading entry.
Suppose every column has a leading entry, so there are $n$ leading entries. There's at most one leading entry per row and there are $n$ rows, so every row must have a leading entry.
The leading entries go from left to right as we move down the rows of the matrix, so the leading entries in rows $1, 2, \ldots, n$ must be in columns $1, 2, \ldots, n$ respectively, otherwise there would be no room to fit them all in.
Because $R$ is in RREF, columns containing leading entries have zeroes in all other positions. So the first column is $(1, 0, \ldots, 0)^T$, the second column is $(0, 1, 0, \ldots, 0)^T$, and so on. These are the columns of the identity matrix, so $R = I_n$. ∎
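To see the dichotomy concretely, here is a minimal SymPy sketch (the matrices are illustrative, not from the notes). SymPy's `Matrix.rref()` returns the RREF together with the tuple of indices of the columns that contain a leading entry:

```python
from sympy import Matrix, eye

# An invertible 3x3 matrix: its RREF is the identity,
# and every column index appears as a pivot.
A = Matrix([[1, 2, 1], [0, 1, 3], [2, 0, 1]])
R, pivots = A.rref()
assert R == eye(3) and pivots == (0, 1, 2)

# A singular 3x3 matrix (row 3 = row 1 + row 2): its RREF has a
# column with no leading entry, exactly as the lemma predicts.
B = Matrix([[1, 2, 1], [0, 1, 3], [1, 3, 4]])
S, pivots = B.rref()
assert S != eye(3) and len(pivots) < 3  # column 3 has no leading entry
```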
Now we can prove the theorem.
Suppose there is a sequence of row operations taking $A$ to $I_n$, say $r_1, r_2, \ldots, r_k$. Let $E_i$ be the elementary matrix associated to $r_i$. Then $E_k \cdots E_2 E_1 A = I_n$,
since we know from Theorem 3.8.1 that doing $r_i$ is the same as left-multiplication by $E_i$. Every elementary matrix is invertible by Corollary 3.8.2. The matrix $E = E_k \cdots E_2 E_1$ is invertible as it is a product of invertible matrices (Theorem 3.5.3). Now $EA = I_n$, so $A = E^{-1}$, which is invertible (with inverse $E$).
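As a concrete check of this direction (a sketch using an illustrative $2 \times 2$ matrix, not one from the notes), we can write down the elementary matrix for each row operation by hand and verify that their product is $A^{-1}$:

```python
from sympy import Matrix, eye, Rational

A = Matrix([[2, 1], [1, 1]])

# The elementary matrix for each row operation; each one acts on A
# by left-multiplication.
E1 = Matrix([[Rational(1, 2), 0], [0, 1]])   # r1 -> (1/2) r1
E2 = Matrix([[1, 0], [-1, 1]])               # r2 -> r2 - r1
E3 = Matrix([[1, 0], [0, 2]])                # r2 -> 2 r2
E4 = Matrix([[1, -Rational(1, 2)], [0, 1]])  # r1 -> r1 - (1/2) r2

E = E4 * E3 * E2 * E1
assert E * A == eye(2)  # the row operations take A to the identity
assert E == A.inv()     # so the product of the E_i is A^{-1}
```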
Conversely, suppose there is no sequence of row operations taking $A$ to $I_n$. We can do a sequence of row operations to any matrix and end up with a matrix in RREF, so when we do this to $A$, the RREF matrix $R$ we get cannot be $I_n$.
Our lemma tells us that in this case $R$ has a column with no leading entry, so there are $n - 1$ or fewer leading entries, so there's a row with no leading entry, that is, a zero row. So $R$ isn't invertible by Theorem 3.5.2.
As before, there's an invertible matrix $E$ such that $EA = R$. By Corollary 3.5.4, $A$ isn't invertible. ∎
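The theorem also yields a practical invertibility test: row reduce and compare with the identity. Here is a minimal sketch, where `invertible` is a hypothetical helper name, not something defined in these notes:

```python
from sympy import Matrix, eye

def invertible(A):
    """Test invertibility via the theorem: a square matrix A is
    invertible iff its RREF is the identity matrix."""
    R, _ = A.rref()
    return R == eye(A.rows)

assert invertible(Matrix([[2, 1], [1, 1]]))      # det = 1, so RREF is I
assert not invertible(Matrix([[1, 2], [2, 4]]))  # row 2 = 2 * row 1
```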
A square matrix $A$ is invertible if and only if the only solution to $A\mathbf{x} = \mathbf{0}$ is $\mathbf{x} = \mathbf{0}$.
If $A$ is invertible and $A\mathbf{x} = \mathbf{0}$, then $\mathbf{x} = A^{-1}A\mathbf{x} = A^{-1}\mathbf{0} = \mathbf{0}$.
If $A$ is not invertible, we can do a sequence of row operations to $A$ ending with a RREF matrix $R$, which cannot be the identity because of Theorem 3.12.1. By the lemma, $R$ has a column with no leading entry, so the system $R\mathbf{x} = \mathbf{0}$ has a free variable and therefore a nonzero solution. Row operations don't change the set of solutions, so $A\mathbf{x} = \mathbf{0}$ has a nonzero solution too. ∎
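A quick numerical illustration of both directions of the corollary (again with illustrative matrices, not ones from the notes): SymPy's `nullspace()` returns a basis for the solution set of $A\mathbf{x} = \mathbf{0}$, which is empty exactly when the only solution is $\mathbf{x} = \mathbf{0}$:

```python
from sympy import Matrix

# Invertible A: the only solution of A x = 0 is x = 0,
# so the nullspace has no basis vectors.
A = Matrix([[2, 1], [1, 1]])
assert A.nullspace() == []

# Singular B: its RREF has a column with no leading entry, so
# B x = 0 has a nonzero solution (a nullspace basis vector).
B = Matrix([[1, 2], [2, 4]])
ns = B.nullspace()
assert ns and B * ns[0] == Matrix([0, 0])
```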