Question 106547: Sorry, last question!
Prove that if the columns of A are linearly independent, then they must form a basis for Col(A).
The only thing I could think of was to take a matrix and row-reduce it, but you can't use an example to prove something (they don't give A, and they don't say whether it is 2x2 or 4x4 or anything). I have no idea where to go from here.
Thanks in advance!
Answer by jim_thompson5910(35256):
By definition,
Col(A) = Column Space of A = span{columns of A}
Remember that a basis is a linearly independent set that spans the given subspace (in this case, the column space of A).
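In symbols (a sketch, writing a_1, ..., a_n for the columns of A):

\[
\operatorname{Col}(A) \;=\; \operatorname{span}\{a_1,\dots,a_n\} \;=\; \{\,c_1 a_1 + \cdots + c_n a_n : c_1,\dots,c_n \in \mathbb{R}\,\}
\]

and a set of vectors is a basis for Col(A) exactly when it (1) spans Col(A) and (2) is linearly independent.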
Since the span of the columns of A is Col(A), one of the two properties of a basis is already satisfied (the set of vectors spans the column space of A).
And since we're assuming that the columns of A are linearly independent, the other property is satisfied as well (a basis must be linearly independent).
Since both properties hold, the columns of A form a basis for Col(A) (provided, as assumed, that the columns of A are linearly independent).
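Here is the same argument written out compactly (a sketch, again using a_1, ..., a_n for the columns of A):

\[
\text{(spanning)}\qquad \operatorname{span}\{a_1,\dots,a_n\} = \operatorname{Col}(A) \quad\text{by the definition of } \operatorname{Col}(A),
\]
\[
\text{(independence)}\qquad c_1 a_1 + \cdots + c_n a_n = \mathbf{0} \;\Longrightarrow\; c_1 = \cdots = c_n = 0 \quad\text{by hypothesis.}
\]

Both basis conditions hold, so {a_1, ..., a_n} is a basis for Col(A). Notice that no row reduction or specific example is needed; the result follows directly from the definitions.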