Linear dependence of two vectors

In summary, we have discussed the concept of linear independence and dependence of vectors. We considered the situation where a new vector is formed as a linear combination of two other vectors, and asked whether it is linearly independent of the original vectors. The discussion led to the question of whether the rank of a matrix stays the same when a row is replaced by the sum of that row and a multiple of another row. We concluded that the span of the rows is unchanged by such a replacement, and hence the number of linearly independent rows, and with it the rank, is also unchanged.
  • #1
blue_leaf77
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c##, with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
I need to check how I can satisfy the equation
$$C_1 v_a + C_2 v_d = 0$$
If that's only possible when ##C_1 = C_2 = 0##, then ##v_a## and ##v_d## are linearly independent, but I don't think that's necessarily the case.
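To get a feel for it, I tried a quick numeric experiment (just a sketch using NumPy; the particular vectors and the constant below are arbitrary choices of mine):

```python
import numpy as np

# Arbitrary example vectors for experimentation
v_a = np.array([1.0, 0.0])
v_b = np.array([0.0, 1.0])
v_c = 2 * v_a - 3 * v_b      # v_c depends linearly on v_a and v_b
c = 1.5
v_d = v_b + c * v_c

# v_a and v_d are independent iff the matrix with rows v_a, v_d has rank 2
print(np.linalg.matrix_rank(np.vstack([v_a, v_d])))  # 2 here, but not for every choice
```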
 
  • #2
blue_leaf77 said:
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c##, with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
I need to check how I can satisfy the equation
$$C_1 v_a + C_2 v_d = 0$$
If that's only possible when ##C_1 = C_2 = 0##, then ##v_a## and ##v_d## are linearly independent, but I don't think that's necessarily the case.

Do you mean ##v_c## is linearly independent of both ##v_a## and ##v_b##?

If so, it's clear that ##v_a## and ##v_d## are linearly independent.
 
  • #3
In general I want ##v_c## to be an arbitrary vector (in the same vector space as ##v_a## and ##v_b##).
Actually, this question arose when I was reading about the rank of a matrix. The book says that when a row of a matrix ##A##, say ##v_i##, is replaced by ##v_i + cv_j##, where ##c## is a constant and ##v_j## is another row of the same matrix, then the rank of the newly formed matrix (after the replacement) is the same as that of ##A##. That's the same as saying that the number of linearly independent rows in the new matrix is the same as in the old matrix ##A##. I'm trying to prove that this is true.
 
  • #4
blue_leaf77 said:
In general I want ##v_c## to be an arbitrary vector (in the same vector space as ##v_a## and ##v_b##).
Actually, this question arose when I was reading about the rank of a matrix. The book says that when a row of a matrix ##A##, say ##v_i##, is replaced by ##v_i + cv_j##, where ##c## is a constant and ##v_j## is another row of the same matrix, then the rank of the newly formed matrix (after the replacement) is the same as that of ##A##. That's the same as saying that the number of linearly independent rows in the new matrix is the same as in the old matrix ##A##. I'm trying to prove that this is true.

In that case, you can consider the span of ##v_i## and ##v_j## and the span of ##v_i + cv_j## and ##v_j##.

These are equal, so the span of the rows is unchanged.
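To spell out why the spans are equal: for any scalars ##a## and ##b##,
$$a(v_i + cv_j) + bv_j = av_i + (ac + b)v_j, \qquad av_i + bv_j = a(v_i + cv_j) + (b - ac)v_j,$$
so every linear combination of ##v_i + cv_j## and ##v_j## is a linear combination of ##v_i## and ##v_j##, and vice versa.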
 
  • #5
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to a linearly independent set of rows that determines the rank of ##A##. When they are not linearly independent, does it still make sense to say that they span some space?
 
  • #6
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to a linearly independent set of rows that determines the rank of ##A##. When they are not linearly independent, does it still make sense to say that they span some space?

Yes, absolutely. There are lots of ways to prove this for arbitrary vectors. One way is simply to equate an arbitrary linear combination in each case. Another is to think of the two vectors in terms of the basis vectors needed to represent them, for whatever basis you choose.
 
  • #7
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to a linearly independent set of rows that determines the rank of ##A##. When they are not linearly independent, does it still make sense to say that they span some space?

Yes, you can prove this by induction. If ##v_1, v_2## are independent, then show that the set ##\{av_1 + bv_2\}## is a subspace, for ##a, b## in the base field ##K##. You can then generalize to ##n## vectors.
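Explicitly, the set ##\{av_1 + bv_2\}## is closed under addition and scalar multiplication:
$$(a_1v_1 + b_1v_2) + (a_2v_1 + b_2v_2) = (a_1 + a_2)v_1 + (b_1 + b_2)v_2, \qquad \lambda(av_1 + bv_2) = (\lambda a)v_1 + (\lambda b)v_2,$$
and it contains the zero vector (take ##a = b = 0##), so it is a subspace.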
 
  • #8
blue_leaf77 said:
But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to a linearly independent set of rows that determines the rank of ##A##. When they are not linearly independent, does it still make sense to say that they span some space?
Any set of vectors "spans some space". If the vectors happen to be linearly independent, then they form a basis for that space. If they are dependent, then some vectors in the set can be written as linear combinations of the others, and dropping those vectors from the set gives a basis for the space.
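A small numeric illustration (a sketch assuming NumPy; the vectors are arbitrary examples):

```python
import numpy as np

# Three vectors in R^3; the third is a combination of the first two
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = 2 * v1 + 5 * v2          # dependent vector

# The full set spans a 2-dimensional space...
print(np.linalg.matrix_rank(np.vstack([v1, v2, v3])))  # 2
# ...and dropping the dependent vector leaves a basis of that same space
print(np.linalg.matrix_rank(np.vstack([v1, v2])))      # 2
```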
 
  • #9
PeroK said:
In that case, you can consider the span of ##v_i## and ##v_j## and the span of ##v_i + cv_j## and ##v_j##.
These are equal, so the span of the rows is unchanged.
OK, I think this comes from my lack of understanding of the definition of the span of a set of vectors. I just found, both on Wikipedia and in my textbook, that
"The set of all linear combinations of ##(v_1, \dots , v_m)## is called the span of ##(v_1, \dots , v_m)##."
It doesn't say that the vectors in that list must be linearly independent, so now I agree with what you said.

But then what does the invariance of the span of the rows imply about the rank of ##A##? Does it automatically imply that the number of linearly independent rows of ##A## also stays the same?
 
  • #10
blue_leaf77 said:
OK, I think this comes from my lack of understanding of the definition of the span of a set of vectors. I just found, both on Wikipedia and in my textbook, that
"The set of all linear combinations of ##(v_1, \dots , v_m)## is called the span of ##(v_1, \dots , v_m)##."
It doesn't say that the vectors in that list must be linearly independent, so now I agree with what you said.

But then what does the invariance of the span of the rows imply about the rank of ##A##? Does it automatically imply that the number of linearly independent rows of ##A## also stays the same?

The row rank of a matrix is the dimension of the span of its rows = the number of linearly independent rows = the number of basis vectors needed to represent the span of all the rows.

You can prove the result you want using any of these equivalent properties, but perhaps the last is the easiest, since:

The span of ##v_i + cv_j## and ##v_j## requires precisely the same set of basis vectors as the span of ##v_i## and ##v_j##, and the replacement doesn't affect the basis vectors required to represent the span of the other row vectors. Hence, the row rank is unchanged by this row operation.

PS: In the end, however you prove it, it all hinges on the fact that all linear combinations of ##v_i + cv_j## and ##v_j## are linear combinations of ##v_i## and ##v_j##, and vice versa.
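As a numeric sanity check rather than a proof, here is a sketch assuming NumPy (the matrix, the rows involved, and the constant are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(4, 5)).astype(float)

B = A.copy()
B[1] += 2.0 * B[3]   # replace row v_i by v_i + c*v_j with c = 2

# The row rank is unchanged by the row operation
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B))  # True
```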
 
  • #11
Thanks, that really helps.
 
  • #12
PeroK said:
the number of linearly independent rows = the number of basis vectors needed to represent the span of all the rows.
Does this mean that I can represent any row as a linear combination of the linearly independent rows?
 
  • #13
blue_leaf77 said:
Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c##, with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
Consider the simplest case: ##v_a = (1,0)##, ##v_b = (0,1)##.
Define ##v_c = v_a - v_b## and ##v_d = v_b + 1\cdot v_c = v_a##.
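Checking this numerically (a sketch assuming NumPy):

```python
import numpy as np

v_a = np.array([1.0, 0.0])
v_b = np.array([0.0, 1.0])
v_c = v_a - v_b
v_d = v_b + 1 * v_c   # equals v_a

# Rank 1: v_a and v_d are linearly dependent
print(np.linalg.matrix_rank(np.vstack([v_a, v_d])))  # 1
```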
 

FAQ: Linear dependence of two vectors

1. What is linear dependence of two vectors?

Linear dependence of two vectors refers to the relationship in which one vector can be written as a linear combination of the other. For two vectors this means that one is a scalar multiple of the other, so geometrically they are parallel.

2. How can I determine if two vectors are linearly dependent?

To determine if two vectors are linearly dependent, consider the equation ##c_1v_1 + c_2v_2 = 0##, where ##c_1## and ##c_2## are scalars and ##v_1## and ##v_2## are the two vectors. If there exist values of ##c_1## and ##c_2##, not both zero, that satisfy the equation, then the vectors are linearly dependent. For two vectors in the plane, another way to determine linear dependence is to calculate the determinant of the ##2 \times 2## matrix formed by the two vectors: if the determinant equals 0, the vectors are linearly dependent.
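For example, here is a quick check, sketched with NumPy and illustrative vectors:

```python
import numpy as np

v1 = np.array([2.0, 4.0])
v2 = np.array([1.0, 2.0])   # v2 = 0.5 * v1

# For two vectors in R^2, a zero determinant means linear dependence
det = np.linalg.det(np.column_stack([v1, v2]))
print(abs(det) < 1e-12)  # True: the vectors are linearly dependent
```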

3. What is the significance of linear dependence in linear algebra?

Linear dependence is an important concept in linear algebra because it helps us understand the relationship between vectors and their span. If two vectors are linearly dependent, they lie on the same line through the origin, so the second vector adds no new information to the vector space. This affects the number of solutions of a system of linear equations and the dimension of the spanned subspace.

4. Can three or more vectors be linearly dependent?

Yes, three or more vectors can also be linearly dependent. In fact, any number of vectors can be linearly dependent if one of them can be written as a linear combination of the others. For example, three vectors in three-dimensional space are linearly dependent if they lie in a common plane through the origin.

5. How do I know if a set of vectors is linearly independent?

A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others; in other words, if the only solution to the equation ##c_1v_1 + c_2v_2 + \dots + c_nv_n = 0## is ##c_1 = c_2 = \dots = c_n = 0##. When the vectors form a square matrix (##n## vectors with ##n## components each), you can also check linear independence by calculating the determinant of that matrix: if the determinant is non-zero, the vectors are linearly independent.
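For example, sketched with NumPy and illustrative vectors:

```python
import numpy as np

# A set of vectors is independent iff the matrix with the vectors as
# rows has rank equal to the number of vectors
vectors = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [1.0, 1.0, 0.0]])
print(np.linalg.matrix_rank(vectors) == len(vectors))  # True: independent
```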
