An $n \times n$ matrix $A$ is called invertible if and only if there exists an $n \times n$ matrix $B$ such that $AB = BA = I_n$.
If there is such a matrix $B$, we can prove that there is only one such matrix:
If $AB = BA = I_n$ and $AC = CA = I_n$ then $B = BI_n = B(AC) = (BA)C = I_nC = C$.
This means that when a matrix $A$ is invertible we can talk about the inverse of $A$. We write $A^{-1}$ for the inverse of $A$ when it exists.
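As a numerical illustration (not part of the text; the matrices here are chosen for the example), we can check both defining equations for a concrete $2 \times 2$ matrix and see that NumPy's built-in inverse agrees, consistent with uniqueness:

```python
import numpy as np

# A concrete 2x2 matrix A and a candidate inverse B (illustrative choice).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, -1.0],
              [-1.0, 2.0]])

I = np.eye(2)
# B satisfies both defining equations AB = I and BA = I.
print(np.allclose(A @ B, I), np.allclose(B @ A, I))  # True True

# Since the inverse is unique, numpy's inverse must be the same matrix B.
print(np.allclose(np.linalg.inv(A), B))  # True
```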
If an $n \times n$ matrix has a row of zeroes, or a column of zeroes, then it is not invertible.
Suppose $A$ has a column of zeroes and that $B$ is any other $n \times n$ matrix. By Theorem 3.2.3, the columns of $BA$ are $B$ times the columns of $A$. In particular, one of these columns is $B$ times the zero vector, which is the zero vector. Since one of the columns of $BA$ is all zeroes, $BA$ is not the identity.
If $A$ has a row of zeroes, we can make a similar argument using Theorem 3.2.6. ∎
If you multiply any number of invertible matrices together, the result is invertible. Recall the shoes-and-socks result about the inverse of a composition of two functions: exactly the same thing is true.
If $A_1, A_2, \ldots, A_k$ are invertible matrices then $A_1 A_2 \cdots A_k$ is invertible with inverse $A_k^{-1} \cdots A_2^{-1} A_1^{-1}$.
The proof is the same as for functions: you can simply check that $A_k^{-1} \cdots A_2^{-1} A_1^{-1}$ is a two-sided inverse to $A_1 A_2 \cdots A_k$ using the associativity property of matrix multiplication.
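The shoes-and-socks reversal of order can be checked numerically (an illustrative sketch, not part of the text; random matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two random 3x3 matrices; these are invertible with probability 1.
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# The inverse of a product is the product of inverses in REVERSED order.
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```

Note that `np.linalg.inv(A) @ np.linalg.inv(B)` (the same order) would in general give a different matrix, which is why the reversal matters.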
This theorem has a useful corollary about when matrix products are invertible.
Let $A$ and $B$ be $n \times n$ matrices with $A$ invertible. Then $AB$ is invertible if and only if $B$ is invertible, and $BA$ is invertible if and only if $B$ is invertible.
If $B$ is invertible then the theorem tells us that so are $AB$ and $BA$.
Suppose $AB$ is invertible. Certainly $A^{-1}$ is invertible (its inverse is $A$), so by the theorem $A^{-1}(AB)$ is invertible, that is, $B$ is invertible. The argument for $BA$ is similar. ∎
Let $A$ be an $n \times n$ matrix. Then $A$ is invertible if and only if $A^T$ is invertible.
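Taking the statement above to concern the transpose $A^T$ (an assumption about the missing symbol), the equivalence rests on the identity $(A^T)^{-1} = (A^{-1})^T$, which can be checked numerically:

```python
import numpy as np

# A concrete invertible 2x2 matrix (det = -1, illustrative choice).
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

# The inverse of the transpose is the transpose of the inverse,
# so A^T is invertible exactly when A is.
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))  # True
```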