Linear Independence of a Set of Vectors

  • #1
Mr Davis 97

Homework Statement


Prove that a set S of vectors is linearly independent if and only if each finite subset of S is linearly independent.

Homework Equations

The Attempt at a Solution



I think it would be easier to prove the logically equivalent statement: a set S of vectors is linearly dependent if and only if there exists a finite subset of S that is linearly dependent.

First assume that a set S is linearly dependent. Then there exists a linear dependence relation ##a_1 s_1 + \cdots + a_n s_n = 0## among its vectors, where not all of ##a_1, \dots, a_n## are zero. Since S is a subset of itself, there must exist a finite subset of S that is linearly dependent.

Second, assume that S has a finite subset S' that is linearly dependent. Since S' is a subset of S, this means that there exists a linear dependence relation among the vectors in S. Thus, S is also linearly dependent.

Does this prove the original statement? It seemed a little too easy.
 
  • #2
Mr Davis 97 said:

Prove that a set S of vectors is linearly independent if and only if each finite subset of S is linearly independent.

...

Does this prove the original statement? It seemed a little too easy.

I think the result is wrong: consider ##S = \{ (1,0,0), (0,1,0), (0,0,1), (1,1,1)\}.## Any three of the vectors in ##S## are linearly independent, but ##S## itself is not.
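
For anyone who wants to verify this numerically, here is a quick Python sanity check of the counterexample (a sketch assuming numpy; the rank test stands in for the definition and is not part of the thread):

```python
import numpy as np
from itertools import combinations

# Every 3-element subset of S is linearly independent (rank 3),
# but S itself is linearly dependent (rank 3 with 4 vectors).
S = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]

for subset in combinations(S, 3):
    assert np.linalg.matrix_rank(np.column_stack(subset)) == 3

print(np.linalg.matrix_rank(np.column_stack(S)))  # 3 < 4, so S is dependent
```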
 
  • #3
Ray Vickson said:
I think the result is wrong: consider ##S = \{ (1,0,0), (0,1,0), (0,0,1), (1,1,1)\}.## Any three of the vectors in ##S## are linearly independent, but ##S## itself is not.
But S is a subset of itself
 
  • #4
Mr Davis 97 said:
But S is a subset of itself

Of course I understand that; but if the question meant "including itself" then it is trivial and pointless.
 
  • #5
Ray Vickson said:
Of course I understand that; but if the question meant "including itself" then it is trivial and pointless.
The problem is from Friedberg, a widely used textbook. So I don't think the statement could be wrong. Maybe it is just a trivial exercise...
 
  • #6
I think the idea is to use the trivial case as your base case. That is, S has no linearly dependent subset if S itself has no linearly dependent subset.

Now you want to prove the iff part via induction / recursive cases. The idea is: for a vector with n entries (and a spanning set with n vectors), you can split any set into two sets, one with k entries and one with n - k entries, for k = 0, 1, 2, ..., n-1, n. (You could actually truncate k at n//2 + 1, but that's a low-level optimization that doesn't really do much, so feel free to ignore this comment.)

Each of those subsets must have no linearly dependent vectors. Why? Assume the opposite and take the union of the two sets; this leads to a contradiction with your base case. Recursively apply this logic until you've covered the entire power set. Also, for special handling: if you are evaluating an empty set, it evaluates to true... this is a leaf in your recursive tree. In effect you start on row n of Pascal's triangle and work your way down to the 0th row.

I was about to start talking about memoization... which is a sign it's getting late and that computer code may be seeping into my math.
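
Since that post describes the idea in algorithmic terms anyway, here is a minimal Python sketch of the brute-force "check every subset" test it gestures at (the numpy rank test is an assumption of this sketch, not something from the thread; note that for a finite set the full sweep is redundant, since any superset of a dependent set is dependent, which is part of why the exercise only has real content when S is infinite):

```python
import numpy as np
from itertools import combinations

def all_subsets_independent(vectors):
    """Brute-force: check that every nonempty subset of a finite list of
    vectors is linearly independent, via the rank of each subset's matrix.
    The empty set is vacuously independent (the 'leaf' case above)."""
    vs = [np.asarray(v, dtype=float) for v in vectors]
    return all(
        np.linalg.matrix_rank(np.column_stack(subset)) == k
        for k in range(1, len(vs) + 1)
        for subset in combinations(vs, k)
    )

# For the counterexample S from post #2 this returns False,
# because the 4-element subset (S itself) is dependent.
print(all_subsets_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]))
```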
 
  • #7
Mr Davis 97 said:
Prove that a set S of vectors is linearly independent if and only if each finite subset of S is linearly independent.
How do your text materials define "a set S of vectors is linearly independent"? And how does this definition apply in the case where S is an infinite set of vectors?
 

FAQ: Linear Independence of a Set of Vectors

What is the definition of linear independence?

A set of vectors is linearly independent if no vector in the set can be written as a linear combination of the others. Equivalently, the only linear combination of the vectors that equals the zero vector is the one in which every coefficient is zero.
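
In symbols, and in the form that also covers infinite sets (this is the standard textbook formulation, added here for reference): a set ##S## is linearly independent if for every finite list of distinct vectors ##s_1, \dots, s_n \in S## and scalars ##a_1, \dots, a_n##,
$$a_1 s_1 + \cdots + a_n s_n = 0 \implies a_1 = \cdots = a_n = 0.$$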

How do you prove linear independence?

To prove linear independence, you can use the definition of linear independence and the properties of vector operations. Specifically, set a linear combination of the vectors equal to the zero vector; this gives a homogeneous system of equations, and methods such as substitution or Gaussian elimination can then show that the only solution is the trivial one (all coefficients equal to 0).
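
As a concrete illustration (a sketch assuming numpy; the column/rank framing is one standard encoding of the homogeneous system, not the only one): stack the vectors as the columns of a matrix ##A##; then ##Ax = 0## has only the trivial solution exactly when the rank of ##A## equals the number of columns.

```python
import numpy as np

# Columns of A are the vectors under test. Ax = 0 has only the
# trivial solution iff rank(A) equals the number of columns.
A = np.column_stack([(1, 0, 0), (0, 1, 0), (1, 1, 0)])
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False: (1, 1, 0) = (1, 0, 0) + (0, 1, 0)
```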

What is the difference between linear independence and linear dependence?

Linear independence and linear dependence are opposite concepts. A set of vectors is linearly independent when no vector in it can be expressed as a linear combination of the others, and linearly dependent when at least one vector can be.

Can a set of 3 vectors be linearly independent in a 2-dimensional space?

No, a set of 3 vectors cannot be linearly independent in a 2-dimensional space. In an n-dimensional space, any set of more than n vectors is linearly dependent, so with 3 vectors in a 2-dimensional space at least one of them can be expressed as a linear combination of the other two.
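
For example, take ##(1,0)##, ##(0,1)##, and ##(2,3)## in ##\mathbb{R}^2##: since ##(2,3) = 2(1,0) + 3(0,1)##, the set is linearly dependent.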

Why is proving linear independence important?

Proving linear independence is important because it is a fundamental concept in linear algebra that is used in many applications, including solving systems of linear equations, finding basis vectors, and determining the dimension of a vector space. It also helps to understand the properties of vector spaces and the relationships between different sets of vectors.
