Linear Dependence Proof for a Set of Vectors

In summary, to prove that a set of vectors S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S, one can use the definition of linear dependence: there exist constants, not all zero, such that c1v1 + c2v2 + ... + cnvn = 0. Rearranging the equation and dividing by one of the non-zero constants shows that the corresponding vector is a linear combination of all the other vectors in S. Conversely, if one vector in S can be written as a linear combination of all the other vectors, then moving every term to one side gives a non-trivial dependence relation, so S is linearly dependent.
  • #1
newtomath
Given a set of vectors S = (v1, v2, ..., vn), prove that S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S.

Can someone point me in the right direction of how to start this proof? I am completely lost.
 
  • #2
A good idea would be to start with the definition of linear dependence/independence.
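For reference, the definitions written out: a set S = {v_1, ..., v_n} is linearly dependent if there exist scalars c_1, ..., c_n, not all zero, such that

$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0,$$

and linearly independent if that equation forces c_1 = c_2 = ... = c_n = 0.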
 
  • #3
Start with a 2-D case where S = (X, Y, A) with A = c1·X + c2·Y, and then proceed.
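If it helps to make that concrete, here is a small NumPy sketch of the suggested case (the particular numbers are made up):

```python
import numpy as np

# Two independent vectors in R^2, and a third built from them (c1 = 2, c2 = -1).
X = np.array([1.0, 0.0])
Y = np.array([0.0, 1.0])
A = 2 * X - 1 * Y

# Three vectors stacked as rows can have rank at most 2 in R^2,
# so the set (X, Y, A) must be linearly dependent.
M = np.stack([X, Y, A])
print(np.linalg.matrix_rank(M))  # 2, not 3 -> dependent
```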
 
  • #4
newtomath said:
Given a set of vectors S = (v1, v2, ..., vn), prove that S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S.

Can someone point me in the right direction of how to start this proof? I am completely lost.

Do you know how to prove an "if and only if" statement (aka "iff")?
"P if and only if Q", or
"P <=> Q"...

Typically, you'll prove each "direction" separately (i.e. "=>" separate from "<=").
So we start by proving that "S linearly dependent => one of the vectors in S is a linear combination of the others".
The way to prove a conditional statement is to assume the first part ("P"), and prove the second part ("Q").

The other guys were right in that you'll want to use the definition of linear dependence/independence. You might also consider a proof by contradiction.

Most proofs that I have seen rely on the assumption (at some point) that one of the coefficients of the vectors is DIFFERENT from zero. Then you can divide by it and rearrange the terms to get it to "fit" the definition (usually involving a1v1 + ... + anvn = 0 implies that all the a's are equal to zero)
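Concretely, suppose c1v1 + c2v2 + ... + cnvn = 0 with, say, c1 ≠ 0 (relabel the vectors so a non-zero coefficient comes first). Dividing by c1 and rearranging gives

$$v_1 = -\frac{c_2}{c_1}v_2 - \frac{c_3}{c_1}v_3 - \cdots - \frac{c_n}{c_1}v_n,$$

which is exactly the definition of v1 being a linear combination of the other vectors.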
 
  • #5
Thanks guys.

Can you take a peek below and advise if you agree with me?


If S is linearly dependent, then one of the vectors in S is a linear combination of all the other vectors in S.

S is linearly dependent if constants (not all zero) exist where
c1v1 + c2v2 + c3v3 + ... + cnvn = 0

If the constants are all zero, that is the trivial solution; if the trivial solution is the only one, the vectors are linearly independent.
If one vector in S is equal to a sum of scalar multiples of the other vectors, then it is a linear combination of the other vectors in S.

We can rearrange the equation like so: c1v1 = -c2v2 - c3v3 - ... - cnvn. Recall that linear dependence requires a solution where not all the constants (c1, c2, ..., cn) are zero, i.e. not the trivial solution. So c1v1 above must be a combination of all the other vectors in S.

If one of the vectors in S is a linear combination of all the other vectors in S, then S is linearly dependent.
A vector v is a linear combination of the vectors in S if constants exist such that
c1v1 + c2v2 + c3v3 + ... + cnvn = v.
If we assume that is true, then S is linearly dependent because the vector v can be written as a combination of all the other vectors.
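One way to tighten this direction: if some vector, say v1 (after relabelling), satisfies v1 = c2v2 + ... + cnvn, then moving v1 across gives

$$(-1)\,v_1 + c_2 v_2 + \cdots + c_n v_n = 0,$$

and the coefficient on v1 is -1, which is not zero, so this is a non-trivial solution and S is linearly dependent by definition.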
 
  • #6
newtomath said:
...

We can rearrange the equation like so: c1v1 = -c2v2 - c3v3 - ... - cnvn. Recall that linear dependence requires a solution where not all the constants (c1, c2, ..., cn) are zero, i.e. not the trivial solution. So c1v1 above must be a combination of all the other vectors in S.
...

This is close, but you have to use the fact that the constants are not all zero.
Since not all of the ci are zero, there exists at least one that is not zero. We can assume, without loss of generality (might want to brush up on that phrase), that it is the first one (c1). Since this is not zero, you can divide by it.
It might help to look at "where you are going". If you want to prove that something is a linear combination, look at that definition, and manipulate your equation until you have the same form as the definition.
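To see the mechanics numerically, here is a minimal NumPy sketch (the vectors and coefficients below are made-up examples, not from the thread):

```python
import numpy as np

# Build v1 as a combination of v2 and v3, so the set is dependent by construction.
v2 = np.array([1.0, 0.0, 2.0])
v3 = np.array([0.0, 1.0, -1.0])
v1 = 3 * v2 - 2 * v3

# The dependence relation 1*v1 - 3*v2 + 2*v3 = 0 has c1 = 1, which is non-zero.
c = np.array([1.0, -3.0, 2.0])
vectors = np.stack([v1, v2, v3])       # rows are v1, v2, v3
print(np.allclose(c @ vectors, 0.0))   # True: a non-trivial solution exists

# Dividing the relation by c1 recovers v1 = -(c2/c1)*v2 - (c3/c1)*v3.
print(np.allclose(v1, -(-3.0 / 1.0) * v2 - (2.0 / 1.0) * v3))  # True
```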
 

FAQ: Linear Dependence Proof for a Set of Vectors

What is meant by "linear dependence"?

Linear dependence refers to a relationship among two or more vectors in which one vector can be expressed as a linear combination of the others. In other words, one vector can be written as a sum of scalar multiples of the other vectors.

How do you prove linear dependence?

To prove linear dependence, you must show that there exist coefficients (scalars), not all of them zero, that, when multiplied by the vectors and added together, give the zero vector. This is known as a non-trivial linear combination that equals zero. If such coefficients exist, the vectors are linearly dependent.
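As a computational illustration (a minimal NumPy sketch with made-up vectors; in exact arithmetic you would solve the homogeneous system directly), a zero singular value of the matrix whose columns are the vectors signals dependence, and the corresponding singular vector supplies the coefficients:

```python
import numpy as np

# Columns of M are the vectors v1, v2, v3 (example in R^3).
M = np.column_stack([
    [1.0, 2.0, 3.0],   # v1
    [4.0, 5.0, 6.0],   # v2
    [5.0, 7.0, 9.0],   # v3 = v1 + v2, so the set is dependent
])

# A right-singular vector for a (near-)zero singular value solves M @ c = 0.
_, s, vt = np.linalg.svd(M)
if s[-1] < 1e-10:                   # zero singular value -> dependent
    c = vt[-1]                      # coefficients, not all zero, with M @ c ~ 0
    print("dependent, coefficients:", c)
    print(np.allclose(M @ c, 0.0))  # True
else:
    print("independent")
```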

What is the difference between linear dependence and linear independence?

Linear independence refers to a relationship among two or more vectors in which no vector can be expressed as a linear combination of the others; the only combination of them that gives the zero vector is the one with all coefficients equal to zero, so each vector contributes something the others cannot. In contrast, linear dependence means that some vector can be expressed as a linear combination of the others.

What is the importance of proving linear dependence?

Proving linear dependence is important in many areas of science and engineering, such as physics and chemistry, because it reveals redundancy among vectors or equations. It also allows complex systems to be simplified and aids in solving systems of linear equations.

Can linear dependence occur in more than two variables?

Yes, linear dependence can occur in a set with any number of vectors. In fact, it is guaranteed once the set is large enough: any set of more than n vectors drawn from an n-dimensional space must be linearly dependent. For example, any three vectors in the plane R^2 are linearly dependent.
