Linear Independence to Determining Vectors' Dependence

  • Thread starter Gregg
  • Tags: Linear
In summary: The problem can be solved by looking at the coefficients [itex]a_1, a_2, \cdots, a_n[/itex] in the equation [itex]a_1v_1+ a_2v_2+ \cdots+ a_nv_n= 0[/itex]. If this equation has only the trivial solution [itex]a_1 = a_2 = \cdots = a_n = 0[/itex], then the vectors are linearly independent. If it has infinitely many solutions, then the vectors are linearly dependent.
  • #1
Gregg
linear independence

Homework Statement



Given that a, b and c are linearly independent vectors, determine whether the following vectors are linearly independent.

a) a,0

b) a+b, b+c, c+a

c) a+2b+c, a-b-c, 5a+b-c

The Attempt at a Solution



I'm not sure how to tackle the question in this form.

Edit:

a) 0 = 0a, so the set is dependent.

(b)
[itex]
\text{Det}\left[\left(
\begin{array}{ccc}
1 & 0 & 1 \\
1 & 1 & 0 \\
0 & 1 & 1
\end{array}
\right)\right]=2[/itex]

Independent

(c)
[itex]
\text{Det}\left[\left(
\begin{array}{ccc}
1 & 1 & 5 \\
2 & -1 & 1 \\
1 & -1 & -1
\end{array}
\right)\right]=0[/itex] Dependent

Is it OK to use those vector coefficients in a matrix like that?
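
A quick numerical sanity check of the two determinants above (a minimal sketch, assuming NumPy is available; the matrices are exactly the coefficient matrices written out in (b) and (c)):

[code]
import numpy as np

# Coefficient matrix for (b): columns are a+b, b+c, c+a in the basis a, b, c.
B = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])

# Coefficient matrix for (c): columns are a+2b+c, a-b-c, 5a+b-c.
C = np.array([[1, 1, 5],
              [2, -1, 1],
              [1, -1, -1]])

print(np.linalg.det(B))  # approximately 2: nonzero, so (b) is independent
print(np.linalg.det(C))  # approximately 0: so (c) is dependent
[/code]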
 
Last edited:
  • #2
Where'd you get those coefficient matrices from?
What is the definition of vectors being linearly dependent?
 
  • #3
That actually works. For example, if you row-reduce the matrix in c), you get
[tex]\left(\begin{array}{ccc} 1 & 0 & 2 \\ 0 & 1 & 3 \\ 0 & 0 & 0\end{array}\right)[/tex]

This says that c1 = -2c3, c2 = -3c3, and c3 = c3. If you take c3 = 1, then c1 = -2 and c2 = -3. That linear combination of the vectors given in part c results in a sum of 0, thus demonstrating that the set is linearly dependent. Note the spelling of "dependent", Gregg; similarly for "independent".
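
That reduction can be reproduced directly (a minimal sketch, assuming SymPy is available; the matrix is the same coefficient matrix as in post #1):

[code]
from sympy import Matrix

# Coefficient columns for a+2b+c, a-b-c, 5a+b-c in the basis a, b, c.
A = Matrix([[1, 1, 5],
            [2, -1, 1],
            [1, -1, -1]])

print(A.rref()[0])    # Matrix([[1, 0, 2], [0, 1, 3], [0, 0, 0]])
print(A.nullspace())  # [Matrix([[-2], [-3], [1]])], i.e. c1 = -2, c2 = -3, c3 = 1
[/code]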
 
  • #4
Mark44 said:
That actually works. [...] Note the spelling of "dependent", Gregg; similarly for "independent".

whoops
 
  • #5
I think it is always better to use the basic definitions than to try to memorize a specific method without understanding it.

The definition of "dependent" for a set of vectors [itex]\{v_1, v_2, \cdots, v_n\}[/itex] is that there are numbers [itex]a_1, a_2, \cdots, a_n[/itex], not all 0, such that [itex]a_1v_1+ a_2v_2+ \cdots+ a_nv_n= 0[/itex].

For the first problem, {a, 0}, take [itex]a_1= 0[/itex], [itex]a_2= 1[/itex]: [itex]a_1 a+ a_2 0= 0(a)+ 1(0)= 0[/itex].

For (b), with a+b, b+c, c+a, if [itex]a_1(a+b)+ a_2(b+ c)+ a_3(c+a)= 0[/itex] then [itex](a_1+ a_3)a+ (a_1+ a_2)b+ (a_2+ a_3)c= 0[/itex]. Since a, b, and c are independent, we must have [itex]a_1+ a_3= 0[/itex], [itex]a_1+ a_2= 0[/itex], and [itex]a_2+ a_3= 0[/itex]. Obviously, [itex]a_1= a_2= a_3= 0[/itex] satisfies that, but is it the only solution?
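
One way to settle that question is to eliminate variables by hand:

[tex]
(a_1+a_3)-(a_1+a_2)=0 \Rightarrow a_3=a_2, \qquad a_2+a_3=0 \Rightarrow 2a_2=0 \Rightarrow a_2=a_3=0, \qquad a_1+a_2=0 \Rightarrow a_1=0.
[/tex]

So the trivial solution is the only one, and the vectors in (b) are independent, in agreement with the determinant computed above.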
 
  • #6
I couldn't agree more with HallsOfIvy about the importance of understanding definitions as opposed to memorizing a technique without understanding why you are doing it. Too often, students get tangled up in the details of calculating a determinant or row reducing a matrix without understanding what it means that the determinant is zero or why the matrix should be row reduced.

The definition of linear independence of a set of vectors is stated very simply, but there is a subtlety to it that escapes many students. The only thing that distinguishes a set of linearly independent vectors from a set that is linearly dependent is whether the equation [itex]c_1 v_1 + c_2 v_2 + c_3 v_3 + \cdots + c_n v_n = 0[/itex] has only the trivial solution [itex]c_1 = c_2 = \cdots = c_n = 0[/itex] (independent vectors) or an infinite number of solutions (dependent vectors).
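
To make that criterion concrete, here is a minimal sketch (assuming NumPy; it uses the fact that the homogeneous equation has only the trivial solution exactly when the matrix of coordinate vectors has full rank). The coordinate vectors below are those of parts (b) and (c), written with respect to the basis a, b, c:

[code]
import numpy as np

def is_independent(vectors):
    """Return True if the given coordinate vectors (as rows) are linearly independent.

    The equation c1*v1 + ... + cn*vn = 0 has only the trivial solution
    exactly when the rank of the matrix equals the number of vectors.
    """
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_independent([[1, 1, 0], [0, 1, 1], [1, 0, 1]]))     # a+b, b+c, c+a: True
print(is_independent([[1, 2, 1], [1, -1, -1], [5, 1, -1]]))  # a+2b+c, a-b-c, 5a+b-c: False
[/code]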
 

FAQ: Linear Independence to Determining Vectors' Dependence

What is linear independence?

Linear independence refers to the property of a set of vectors in which none of the vectors can be expressed as a linear combination of the others. In other words, if one of the vectors can be written as a linear combination of the other vectors, then the set is linearly dependent.

How do you determine if a set of vectors is linearly independent?

There are a few methods for determining linear independence, but one of the most common is the row reduction method. This involves creating a matrix with the given vectors as rows, and then performing row reduction to obtain the row echelon form. If there are any rows of all zeros, then the vectors are linearly dependent. If there are no rows of all zeros, then the vectors are linearly independent.
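
A minimal sketch of that procedure (assuming SymPy is available; the three vectors here are arbitrary illustrative coordinates, with the second chosen as a multiple of the first):

[code]
from sympy import Matrix

# Put the vectors in as rows.
M = Matrix([[1, 2, 3],
            [2, 4, 6],   # 2 * the first row, so the set is dependent
            [0, 1, 1]])

R, _ = M.rref()
has_zero_row = any(all(entry == 0 for entry in R.row(i)) for i in range(R.rows))
print(R)
print("dependent" if has_zero_row else "independent")  # dependent
[/code]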

Can a set of two vectors be linearly dependent?

Yes, a set of two vectors is linearly dependent exactly when one vector is a scalar multiple of the other. For example, {(1, 2), (2, 4)} is linearly dependent because (2, 4) = 2(1, 2).

What is the importance of linear independence in linear algebra?

Linear independence is important in linear algebra because it allows us to determine if a set of vectors can form a basis for a vector space. A basis is a set of linearly independent vectors that can be used to represent all other vectors in the vector space. Additionally, linear independence is a fundamental concept in solving systems of linear equations and understanding the properties of vector spaces.

Can a set of vectors be linearly dependent in one vector space but linearly independent in another?

Yes, because linear dependence is defined relative to the vector space, and in particular the field of scalars, in which the vectors are considered. For example, the complex numbers 1 and i are linearly independent when the complex numbers are viewed as a vector space over the real numbers, but they are linearly dependent when the complex numbers are viewed as a vector space over themselves, since i = i·1.
