Verifying whether my working is correct in showing Linear Independence

In summary, the conversation revolves around determining the linear dependence or independence of given vectors, using both the definition of independence and the determinant of the coefficient matrix. The first set of vectors is found to be independent while the second set is dependent, with some discussion of the uniqueness of solutions and the phrase "independence exists." Confusion arises over the equations a1+a2=0 and a1+a3=0, which are satisfied by setting a1, a2, and a3 to 0.
  • #1
savva

Homework Statement


I have attempted the questions below but am not sure if I am applying the method correctly to show linear dependence/independence.

a)Show that the vectors
e1=[1 1 0]T, e2=[1 0 1]T, e3=[0 1 1]T
are linearly independent

b) Show that the vectors
e1=[1 1 0]T, e2=[1 0 -1]T, e3=[0 1 1]T
are linearly dependent

(T denotes the transpose)

Homework Equations


The determinant

The Attempt at a Solution


I have put the vectors as the columns of a 3x3 matrix and found its determinant: a determinant equal to 0 should indicate linear dependence, and a determinant ≠ 0 linear independence. My working and the questions are attached as a PDF file with this thread.
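As a quick numerical sanity check of the same method (a minimal sketch using NumPy; the matrices are my own transcription of the vectors in the two parts):

```python
import numpy as np

# Each matrix has the given vectors as its columns.
A = np.column_stack(([1, 1, 0], [1, 0, 1], [0, 1, 1]))   # part a)
B = np.column_stack(([1, 1, 0], [1, 0, -1], [0, 1, 1]))  # part b)

det_A = np.linalg.det(A)  # non-zero  => columns linearly independent
det_B = np.linalg.det(B)  # (near) 0  => columns linearly dependent

print(det_A, det_B)
```

The first determinant comes out to -2 and the second to 0, matching the hand calculation.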

I'd greatly appreciate any help
 

Attachments

  • Maths3 - Linear Independence.pdf
  • #2
Do you know the basic definition of "independent", "dependent" vectors?

A set of vectors [itex]\{v_1, v_2, \cdot\cdot\cdot, v_n\}[/itex] is "independent" if and only if, in order to have [itex]a_1v_1+ a_2v_2+ \cdot\cdot\cdot+ a_nv_n= 0[/itex], we must have [itex]a_1= a_2= \cdot\cdot\cdot= a_n= 0[/itex].

Here, such a sum would be of the form
[tex]a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}[/tex]
Of course multiplying the scalars and adding that is the same as
[tex]\begin{bmatrix}a_1+ a_2 \\ a_1+ a_3 \\ a_2+ a_3\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}[/tex]

which, in turn, is equivalent to the three equations
[itex]a_1+ a_2= 0[/itex], [itex]a_1+ a_3= 0[/itex], [itex]a_2+ a_3= 0[/itex]

[itex]a_1= a_2= a_3= 0[/itex] is obviously a solution to that system of equations. Is it the only one? If so, the vectors are independent; if there exists another, non-trivial solution, they are dependent.

Of course, one can determine whether or not a system of equations has a unique solution by looking at the determinant of the coefficient matrix. As you say, these sets of vectors are independent (I would not say "independence exists").
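For the record, elimination confirms this directly. The components give [itex]a_1+ a_2= 0[/itex], [itex]a_1+ a_3= 0[/itex], and [itex]a_2+ a_3= 0[/itex]. Subtracting the second equation from the first gives [itex]a_2- a_3= 0[/itex], so [itex]a_2= a_3[/itex]; combined with [itex]a_2+ a_3= 0[/itex] this forces [itex]a_2= a_3= 0[/itex], and then [itex]a_1= 0[/itex]. So the trivial solution is the only one.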
 
  • #3
HallsofIvy said:
Do you know the basic definition of "independent", "dependent" vectors?

A set of vectors [itex]\{v_1, v_2, \cdot\cdot\cdot, v_n\}[/itex] is "independent" if and only if, in order to have [itex]a_1v_1+ a_2v_2+ \cdot\cdot\cdot+ a_nv_n= 0[/itex], we must have [itex]a_1= a_2= \cdot\cdot\cdot= a_n= 0[/itex].

Here, such a sum would be of the form
[tex]a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}[/tex]
Of course multiplying the scalars and adding that is the same as
[tex]\begin{bmatrix}a_1+ a_2 \\ a_1+ a_3 \\ a_2+ a_3\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}[/tex]

which, in turn, is equivalent to the three equations
[itex]a_1+ a_2= 0[/itex], [itex]a_1+ a_3= 0[/itex], [itex]a_2+ a_3= 0[/itex]

[itex]a_1= a_2= a_3= 0[/itex] is obviously a solution to that system of equations. Is it the only one? If so, the vectors are independent; if there exists another, non-trivial solution, they are dependent.

Of course, one can determine whether or not a system of equations has a unique solution by looking at the determinant of the coefficient matrix. As you say, these sets of vectors are independent (I would not say "independence exists").

I do not understand what you mean by "As you say, these sets of vectors are independent (I would not say "independence exists")".
From my calculations I found the first set to be independent and the second dependent. What do you mean when you say you would not say "independence exists"?

I don't understand how a1+a2=0 or a1+a3=0.
If you add these up:
a1+a2 = [1 1 0] + [1 0 1] = [2 1 1]
a1+a3 = [1 1 0] + [0 1 1] = [1 2 1]

How do you get them to equal 0?
 

Related to Verifying whether my working is correct in showing Linear Independence

1. What is the definition of linear independence?

Linear independence refers to a set of vectors in a vector space in which no vector can be written as a linear combination of the others. In other words, none of the vectors is redundant, and any vector in their span has exactly one representation as a linear combination of them.

2. How can I determine if a set of vectors is linearly independent?

One way to check for linear independence is the determinant method: arrange the vectors as the columns of a square matrix and calculate the determinant. If the determinant is non-zero, then the vectors are linearly independent. Another method uses rank: if the rank of the matrix formed from the vectors equals the number of vectors, then the vectors are linearly independent.
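As an illustration (a hypothetical sketch using NumPy, not from the thread), the rank test applied to the vectors from part a) of the original question:

```python
import numpy as np

# Stack the three vectors as rows of a matrix.
vectors = np.array([[1, 1, 0],
                    [1, 0, 1],
                    [0, 1, 1]])

# Rank equal to the number of vectors => linearly independent.
rank = np.linalg.matrix_rank(vectors)
print(rank == vectors.shape[0])  # True: the three vectors are independent
```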

3. Can a set of two vectors be linearly independent in one vector space but dependent in another?

Linear independence is a property of the vectors themselves rather than of the ambient space: a set that is independent remains independent in any larger space containing it. It can, however, depend on the field of scalars. For example, 1 and i are linearly independent when the complex numbers are viewed as a vector space over the reals, but dependent over the complex numbers. It is therefore important to check the field and context when determining linear independence.

4. Is it possible for a set of vectors to be neither linearly independent nor dependent?

No, a set of vectors must be either linearly independent or dependent; the two cases are complementary. Note that any set containing the zero vector is automatically linearly dependent.

5. How is linear independence important in mathematics and science?

Linear independence is important in mathematics and science as it allows us to uniquely represent vectors and solve equations involving vectors. It is also used in many fields such as physics, engineering, and computer science to model and analyze systems. Linear independence is a fundamental concept in linear algebra and plays a crucial role in many mathematical and scientific applications.
