Linear Independence/Dependence of Vectors

In summary: you can't conclude dependence just from the number of equations left over; you have to argue from the definitions. The equations in k1, k2, and k3 form a homogeneous system of linear equations, and the question is whether that system has only the trivial solution. If the only solution is k1 = k2 = k3 = 0, the vectors are linearly independent; if there are other solutions, the vectors are linearly dependent.
  • #1
FancyChancey
Hi all. I'm having a really tough time figuring out how to solve this problem:

Suppose that {v1, v2, v3} are linearly independent vectors in R7.
If
a1 = v1 + 2v2

a2 = 3v2 – v3

a3 = v1 – v2 + v3,

determine directly from the definitions whether the vectors {a1, a2, a3} are linearly independent or linearly dependent.

Can anyone help me? I know what linear dependence and linear independence are, and I know how to check for either using Gauss-Jordan elimination. But I'm not sure where to start on this problem.
 
  • #2
In this case, you are given a1, a2 and a3 in terms of v1, v2, v3. That means you can represent each a_i (i = 1, 2 or 3) as a coordinate vector with respect to the ordered set (or basis B) {v1, v2, v3}. So to start yourself off, write out the respective coordinate vectors of a1, a2, a3.

Then invoke the definition of linear independence:

a1,a2,a3 are linearly independent iff

k1a1 + k2a2 + k3a3 = 0 (the zero coordinate vector with respect to basis B)

has only the trivial solution k1 = k2 = k3 = 0.

So from the above, you can write out a square matrix from which you can then apply those techniques you know to determine if only the trivial solution exists.
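The square-matrix approach above can be sketched numerically. This is a minimal illustration (not part of the original posts), assuming NumPy is available: the columns of the matrix are the coordinate vectors of a1, a2, a3 with respect to B, and a rank below 3 means a nontrivial solution exists.

```python
import numpy as np

# Coordinate vectors of a1, a2, a3 with respect to the basis B = {v1, v2, v3}:
# a1 = 1*v1 + 2*v2 + 0*v3, a2 = 0*v1 + 3*v2 - 1*v3, a3 = 1*v1 - 1*v2 + 1*v3.
A = np.array([
    [1,  0,  1],   # v1-coefficients of a1, a2, a3
    [2,  3, -1],   # v2-coefficients
    [0, -1,  1],   # v3-coefficients
])

# k1*a1 + k2*a2 + k3*a3 = 0 has only the trivial solution iff A has rank 3.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2, so nontrivial solutions exist: the a_i are linearly dependent
```

Since the rank is 2 rather than 3, the homogeneous system has a free variable, which is exactly the situation discussed later in the thread.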
 
  • #3
As Defennder said, the definition of "independent" says that these vectors are independent if and only if [itex]k_1a_1+ k_2a_2+ k_3a_3= 0[/itex] implies [itex]k_1= k_2= k_3= 0[/itex]. Since you are given that [itex]a_1= v_1+ 2v_2[/itex], [itex]a_2= 3v_2- v_3[/itex] and [itex]a_3= v_1- v_2+ v_3[/itex], that equation becomes
[itex]k_1(v_1+ 2v_2)+ k_2(3v_2- v_3)+ k_3(v_1- v_2+ v_3)= 0[/itex]
Multiplying that out and combining "like" terms (collecting each vn) gives coefficients, in terms of k1, k2, and k3, multiplying v1, v2, and v3, all set equal to 0. Since you are given that v1, v2, and v3 are independent, those coefficients MUST be 0. That gives you 3 equations for k1, k2, and k3. Solve those equations. If the only solution is k1 = k2 = k3 = 0, the vectors are independent; if there is any other solution, they are dependent. That is basically just what Defennder said, but you did say "directly from the definitions", and I consider this more "fundamental" than reducing matrices as Defennder suggested.
 
  • #4
Let A = [2 0 -1 1 -3
1 1 -3 0 -2
1 0 -1 -1 3]
Find linearly independent vectors {u1, u2, ..., um} such that row(A) = span{u1, u2, ..., um}.
Note: You must justify that your vectors are linearly independent.

I have no idea how to even begin. Could someone really help me out here? Thanks heaps.
 
  • #5
squenshl said:
Let A = [2 0 -1 1 -3
1 1 -3 0 -2
1 0 -1 -1 3]
Find linearly independent vectors {u1, u2, ..., um} such that row(A) = span{u1, u2, ..., um}.
Note: You must justify that your vectors are linearly independent.

I have no idea how to even begin. Could someone really help me out here? Thanks heaps.
Well, the first thing you would have to do is tell us what u1, u2, ..., um are! Here you give a matrix A, but the problem says nothing about A or any matrix. Are we to assume that u1, u2, ..., um are the rows of that matrix? The columns?
 
  • #6
I am trying to find {u1,u2,...,um} so that col(A) = span{u1,u2,...,um}
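For both versions of the question, row reduction does the work. This is a minimal sketch (not from the original thread), assuming SymPy is available: the nonzero rows of the reduced row echelon form are an independent spanning set for row(A), and the pivot columns of A itself are an independent spanning set for col(A).

```python
import sympy as sp

A = sp.Matrix([
    [2, 0, -1,  1, -3],
    [1, 1, -3,  0, -2],
    [1, 0, -1, -1,  3],
])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()

# The nonzero rows of R span row(A) and are linearly independent: each has a
# leading 1 in a column where all the other rows have a 0.
row_basis = [R.row(i) for i in range(A.rows) if any(R.row(i))]

# The pivot columns of the ORIGINAL matrix A form an independent spanning
# set for col(A).
col_basis = [A.col(j) for j in pivots]
```

The leading-1 pattern is also the justification the problem asks for: any combination of the rref rows that equals zero must have every coefficient zero, because each coefficient appears alone in some pivot column.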
 
  • #7
I was learning about the same topic and found this problem.
As per HallsofIvy,

k1(1v1 + 2v2 + 0v3) + k2(0v1 + 3v2 - 1v3) + k3(1v1 - 1v2 + 1v3) = 0

rearranging...

v1(1k1 + 0k2 + 1k3) + v2(2k1 + 3k2 - 1k3) + v3(0k1 - 1k2 + 1k3) = 0

now, coefficients of v must be 0

1k1 + 0k2 + 1k3 = 0
2k1 + 3k2 - 1k3 = 0
0k1 - 1k2 + 1k3 = 0

which leaves me with

1k1 + 1k3 = 0
3k2 - 3k3 = 0

as you see, these are just 2 equations with 3 different variables. Can you say just from this that they are dependent?
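The two remaining equations force k1 = -k3 and k2 = k3, so k3 is a free variable and nontrivial solutions exist. A quick numerical check of that conclusion (a sketch, not part of the original posts; it uses random stand-ins for the abstract v1, v2, v3 in R^7):

```python
import numpy as np

rng = np.random.default_rng(0)
# Any three (generically independent) vectors in R^7 stand in for v1, v2, v3.
v1, v2, v3 = rng.standard_normal((3, 7))

a1 = v1 + 2 * v2
a2 = 3 * v2 - v3
a3 = v1 - v2 + v3

# Taking k3 = 1 gives k1 = -1, k2 = 1, so -a1 + a2 + a3 should be the
# zero vector: the a_i are linearly dependent.
combo = -a1 + a2 + a3
print(np.allclose(combo, 0))  # True
```

Algebraically the same cancellation is visible directly: -(v1 + 2v2) + (3v2 - v3) + (v1 - v2 + v3) = 0, i.e. a3 = a1 - a2.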
 

FAQ: Linear Independence/Dependence of Vectors

What is the definition of linear independence/dependence of vectors?

Linear independence/dependence of vectors refers to the relationship between two or more vectors in a vector space. A set of vectors is linearly dependent if at least one of them can be written as a linear combination of the others; it is linearly independent if none of them can. Equivalently, the vectors are independent exactly when the only linear combination of them equal to the zero vector is the one with all coefficients zero.

How do you determine if a set of vectors are linearly independent?

If you have n vectors in Rⁿ, you can use the determinant method: arrange the vectors as columns of a square matrix and compute its determinant. If the determinant is zero, the vectors are linearly dependent; if it is nonzero, they are linearly independent. When the number of vectors differs from the dimension, the determinant is not defined, so instead row-reduce the matrix and compare its rank to the number of vectors.
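A small illustration of the determinant method for the square case (the example vectors here are mine, chosen so that the third column is the sum of the first two; assumes NumPy):

```python
import numpy as np

# Three vectors in R^3, arranged as the columns of a square matrix.
M = np.column_stack([
    np.array([1.0, 0.0, 2.0]),
    np.array([0.0, 1.0, 1.0]),
    np.array([1.0, 1.0, 3.0]),  # = column 1 + column 2, so dependent
])

det = np.linalg.det(M)
print(abs(det) < 1e-9)  # True: determinant is (numerically) 0, so dependent
```

Because floating-point determinants are rarely exactly zero, comparing against a small tolerance is safer than testing `det == 0`.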

What is the geometric interpretation of linear independence/dependence?

Geometrically, linearly independent vectors point in genuinely different directions: no vector in the set lies in the line, plane, or higher-dimensional subspace spanned by the others. Linearly dependent vectors, by contrast, all lie within some common line, plane, or other subspace of dimension smaller than the number of vectors.

Can a set of three or more vectors be linearly dependent?

Yes, a set of three or more vectors can be linearly dependent. The number of vectors in a set does not determine if they are linearly independent or dependent. It is the relationship between the vectors that determines their linear dependence or independence.

How is linear independence/dependence related to the solution space of a system of linear equations?

Form the homogeneous system k1v1 + k2v2 + ... + knvn = 0. If the vectors are linearly independent, this system has only the trivial (all-zero) solution; if they are linearly dependent, it has infinitely many solutions. A homogeneous system always has at least the trivial solution, so "no solution" cannot occur here.
