We met the idea of a linear combination of column vectors in chapter 3. Here it is for elements of an arbitrary vector space.
Let $V$ be a vector space and $\mathbf{v}_1, \ldots, \mathbf{v}_n \in V$. A linear combination of $\mathbf{v}_1, \ldots, \mathbf{v}_n$ is an element of $V$ of the form
$$ \lambda_1 \mathbf{v}_1 + \lambda_2 \mathbf{v}_2 + \cdots + \lambda_n \mathbf{v}_n $$
where the $\lambda_i$ are scalars.
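For example, in $\mathbb{R}^2$,
$$ 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix} + 3 \begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 2 \\ 3 \end{pmatrix} $$
is a linear combination of $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ with scalars $\lambda_1 = 2$ and $\lambda_2 = 3$. The point of stating the definition for an arbitrary vector space is that the $\mathbf{v}_i$ need not be column vectors: they could be matrices, functions, or anything else living in a vector space.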
Let $V$ be a vector space.
A sequence $\mathbf{v}_1, \ldots, \mathbf{v}_n$ of elements of $V$ is linearly independent if and only if the only scalars $\lambda_1, \ldots, \lambda_n$ such that
$$ \lambda_1 \mathbf{v}_1 + \cdots + \lambda_n \mathbf{v}_n = \mathbf{0} $$
are $\lambda_1 = \lambda_2 = \cdots = \lambda_n = 0$.
A sequence which is not linearly independent is called linearly dependent.
It is important that linear independence is a property of sequences (not sets) of vectors. Sequences have a particular order, and they can contain the same element multiple times.
Checking whether elements of a vector space are linearly independent is simple. You just have to try to find a linear combination equal to the zero vector in which not all the scalars are zero. If you can do it, the sequence is linearly dependent; if you can't, it is linearly independent. When we're talking about vectors in $\mathbb{R}^n$, or matrices, this is just solving linear equations.
$\begin{pmatrix} 1 \\ 2 \end{pmatrix}$, $\begin{pmatrix} 2 \\ 4 \end{pmatrix}$ are not linearly independent in $\mathbb{R}^2$, because $2 \begin{pmatrix} 1 \\ 2 \end{pmatrix} - \begin{pmatrix} 2 \\ 4 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.
$\begin{pmatrix} 1 \\ 2 \end{pmatrix}$, $\begin{pmatrix} 3 \\ 4 \end{pmatrix}$ are linearly independent in $\mathbb{R}^2$. For if $\lambda_1 \begin{pmatrix} 1 \\ 2 \end{pmatrix} + \lambda_2 \begin{pmatrix} 3 \\ 4 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$ then $\begin{pmatrix} \lambda_1 + 3\lambda_2 \\ 2\lambda_1 + 4\lambda_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$. This is a system of linear equations:
$$ \begin{aligned} \lambda_1 + 3\lambda_2 &= 0 \\ 2\lambda_1 + 4\lambda_2 &= 0. \end{aligned} $$
For such a simple system it's easy to see that the only solution is $\lambda_1 = \lambda_2 = 0$. This tells you that the only solution to $\lambda_1 \begin{pmatrix} 1 \\ 2 \end{pmatrix} + \lambda_2 \begin{pmatrix} 3 \\ 4 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$ is $\lambda_1 = \lambda_2 = 0$, which is the definition of linear independence for $\begin{pmatrix} 1 \\ 2 \end{pmatrix}$, $\begin{pmatrix} 3 \\ 4 \end{pmatrix}$.
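If you want to check a calculation like this on a computer, here is one possible sketch in Python using the sympy library (my choice of tool, not something these notes depend on). It solves the same homogeneous system and confirms that the zero solution is the only one.

```python
from sympy import Matrix, symbols, linsolve

l1, l2 = symbols("lambda1 lambda2")

# The vectors from the example form the columns of A, so the system
# l1*(1,2)^T + l2*(3,4)^T = 0 becomes A*(l1,l2)^T = 0.
A = Matrix([[1, 3],
            [2, 4]])
zero = Matrix([0, 0])

print(linsolve((A, zero), [l1, l2]))  # {(0, 0)}: only the trivial solution
```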
$\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ are linearly independent in $\mathbb{R}^2$. You can prove this in a similar (but easier) way to the previous example.
More generally, if $\mathbf{e}_i$ is the height $n$ column vector with 0 everywhere except 1 at position $i$, then the sequence $\mathbf{e}_1, \ldots, \mathbf{e}_n$ is linearly independent in $\mathbb{R}^n$.
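To see why, it helps to write out what a linear combination of these vectors actually is:
$$ \lambda_1 \mathbf{e}_1 + \lambda_2 \mathbf{e}_2 + \cdots + \lambda_n \mathbf{e}_n = \begin{pmatrix} \lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_n \end{pmatrix}. $$
If this equals the zero vector then every entry $\lambda_i$ is 0, which is exactly what linear independence requires.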
In $\mathbb{R}^{\mathbb{R}}$, the vector space of all functions $\mathbb{R} \to \mathbb{R}$, I claim that the functions $\sin$ and $\cos$ are linearly independent. Suppose that $\lambda \sin + \mu \cos = \mathbf{0}$, that is, suppose $\lambda \sin(x) + \mu \cos(x) = 0$ for all $x \in \mathbb{R}$.
Take $x = 0$. Since $\sin(0) = 0$ and $\cos(0) = 1$ we get $\mu = 0$. Now take $x = \pi/2$ to get $\lambda \sin(\pi/2) = 0$, that is $\lambda = 0$. We have shown $\lambda = \mu = 0$ and so these functions are linearly independent.
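As an aside, the same two evaluation points give a concrete linear system you can check by machine. Here is a sketch in Python with sympy (again my own choice of tool): the matrix of values of $\sin$ and $\cos$ at $x = 0$ and $x = \pi/2$ is invertible, so $\lambda = \mu = 0$ is forced.

```python
from sympy import Matrix, sin, cos, pi

# Rows: the equation lambda*sin(x) + mu*cos(x) = 0 evaluated at
# x = 0 and x = pi/2, as in the proof above.
A = Matrix([[sin(0), cos(0)],
            [sin(pi / 2), cos(pi / 2)]])

print(A)        # Matrix([[0, 1], [1, 0]])
print(A.det())  # -1, nonzero, so lambda = mu = 0 is the only solution
```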
Often it turns out that deciding whether a sequence of vectors is linearly independent is equivalent to checking whether a system of linear equations has only the solution where every variable is zero, so you can apply the methods we learned in chapter 3.
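To make that connection concrete, here is one possible general-purpose check, sketched in Python with sympy (the function name is mine, invented for illustration). It stacks the vectors as the columns of a matrix and tests whether the homogeneous system has only the zero solution by comparing the rank with the number of columns.

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Return True if the given column vectors are linearly independent.

    The columns of A are independent exactly when A*x = 0 has only the
    zero solution, i.e. when rank(A) equals the number of columns.
    """
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    return A.rank() == A.cols

print(is_linearly_independent([[1, 2], [3, 4]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))  # False: (2,4) = 2*(1,2)
```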