Proving Linear Independence in a System of Vectors

  • #1
annoymage

Homework Statement



let [tex]u_1, u_2, u_3 \in \mathbb{R}^3[/tex] be such that

[tex]Au_1^T = u_1^T, \quad Au_2^T = \frac{1}{2}u_2^T, \quad Au_3^T = \frac{1}{3}u_3^T[/tex]

Suppose [tex]u_1, u_2, u_3 \neq 0[/tex]

Show that

u1 , u2 , u3 are linearly independent

Homework Equations



[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]

with [tex]a_1 = a_2 = a_3 = 0[/tex] for the vectors to be linearly independent

The Attempt at a Solution



[tex]Au_i^T = u_i^T \iff u_i = u_iA^T[/tex] for all i = 1, 2, 3

implies

[tex]A^T\left(u_1 + \frac{1}{2}u_2 + \frac{1}{3}u_3\right) = 0[/tex]

how do I prove these are linearly independent?
give me a clue please T_T
 
  • #2
annoymage said:

Homework Statement



let [tex]u_1, u_2, u_3 \in \mathbb{R}^3[/tex] be such that

[tex]Au_1^T = u_1^T, \quad Au_2^T = \frac{1}{2}u_2^T, \quad Au_3^T = \frac{1}{3}u_3^T[/tex]

Suppose [tex]u_1, u_2, u_3 \neq 0[/tex]

Show that

u1 , u2 , u3 are linearly independent

Homework Equations



[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]
There's more to the definition of linear independence. For example, there is always a solution for a1, a2, a3 in the equation above no matter what vectors you have. It doesn't matter whether they're independent or dependent.
annoymage said:
[tex]a_1 = a_2 = a_3 = 0[/tex] for it to be linearly independent

The Attempt at a Solution



[tex]Au_i^T = u_i^T \iff u_i = u_iA^T[/tex] for all i = 1, 2, 3

implies

[tex]A^T\left(u_1 + \frac{1}{2}u_2 + \frac{1}{3}u_3\right) = 0[/tex]

how do I prove these are linearly independent?
give me a clue please T_T

Here's a clue - what is the definition of linear independence?
 
  • #3
hmm, I'm still stuck at "I don't know", owho..

Mark44 said:
there is always a solution for a1, a2, a3 in the equation above no matter what vectors you have. It doesn't matter whether they're independent or dependent.

I'm not good at English, so what you mean is that

a1, a2, a3 always have a solution because in Av = 0, v always has a solution (v = 0), for any m×n matrix A. Right?

and the definition of linear independence that I know is what I stated:

[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]

with [tex]a_1 = a_2 = a_3 = 0[/tex] for it to be linearly independent.

hehe, can you give me another clue about the definition? I'm clueless right now, sorry T_T
 
  • #4
There is more to the definition of linear independence, and this is a subtlety that escapes very many students.

Here are two examples that illustrate what I'm talking about.

1. u = <1, 0, 0>, v = <0, 1, 0>, w = <1, 1, 0>
Consider the equation au + bv + cw = 0. Clearly a = b = c = 0 is a solution. Does that make these vectors linearly independent?

2. u = <1, 0, 0>, v = <0, 1, 0>, w = <1, 0, 1>
Consider the equation au + bv + cw = 0. Clearly a = b = c = 0 is a solution. Does that make these vectors linearly independent?
 
  • #5
annoymage said:
hmm, I'm still stuck at "I don't know", owho..

I'm not good at English, so what you mean is that

a1, a2, a3 always have a solution because in Av = 0, v always has a solution (v = 0), for any m×n matrix A. Right?
Your English is OK, and is probably better than my attempts to speak your language. We're not talking about matrices here; we're just talking about what it means for a set of vectors to be linearly dependent or linearly independent.

For vectors u, v, and w, and constants c1, c2, and c3, the equation c1u + c2v + c3w = 0 always has a solution for the constants c1, c2, and c3. No matter what the vectors u, v, and w are. It doesn't matter whether they're linearly dependent or linearly independent. That equation always has a solution. So there must be something else that allows us to make a distinction between linearly independent and linearly dependent vectors. That's what I'm trying to get you to realize.
annoymage said:
and the definition of linear independence that I know is what I stated:

[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]

with [tex]a_1 = a_2 = a_3 = 0[/tex] for it to be linearly independent.

hehe, can you give me another clue about the definition? I'm clueless right now, sorry T_T
 
  • #6
Oh c'mon, the definition is very simple. Since I'm not familiar with typing mathematical formulas yet, let me write it for 3 vectors, but it's easy to generalise:
x,y,z are linearly independent iff:
(ax+by+cz=0) =>( a=b=c=0)
In Mark44's example 1 there is another solution: a=b=1, c=(-1). However, there is no other solution for vectors shown in (2), hence they are independent.

Nothing subtle :P

E: Well, apparently Mark44 was faster in answering his own post :P.
 
  • #7
No, Mark44's is correct. I realized something; wait, let me rephrase my definition. It's hard to use these symbols.
 
  • #8
I didn't write that he was incorrect. I just didn't agree with his calling the definition of linear independence "subtle".
 
  • #9
ooooooooooo, if that's it, hmm, like in example 1:

1. (a, b, c) = (-t, -t, t) for all real t

proving that it is linearly dependent,
but (0, 0, 0) is also a solution

hmm, I get what you are trying to say..

let me rephrase my definition:

[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]

with at least one [tex]a_i \neq 0[/tex], i = 1, 2, 3:

then it is linearly dependent;

otherwise, it is linearly independent.

am I right?
 
  • #10
owho, sorry irycio ;P
 
  • #11
But how do I solve the question then? Should I suppose the vectors are linearly dependent and derive a contradiction?
 
  • #12
irycio said:
Oh c'mon, the definition is very simple. Since I'm not familiar with typing the mathematic formulas yet, let me write it for 3 vectors, but it's easy to generalise:
x,y,z are linearly independent iff:
(ax+by+cz=0) =>( a=b=c=0)
Nope, that's not it. Let's look at my first example.
u = <1, 0, 0>, v = <0, 1, 0>, w = <1, 1, 0>
I can see by inspection that au + bv + cw = 0 has a solution a = b = c = 0. Does that make vectors u, v, and w linearly independent? The answer is no.
irycio said:
In Mark44's example 1 there is another solution: a=b=1, c=(-1). However, there is no other solution for vectors shown in (2), hence they are independent.

Nothing subtle :P
I disagree. It's very subtle. Students always focus on the equation (e.g. for 3 vectors) au + bv + cw = 0 and completely miss the subtlety of the distinction between linear independence and linear dependence.

As I said before the equation always has a solution regardless of whether the vectors are linearly dependent or linearly independent. The subtlety comes with whether the solution for the constants is unique (a = b = c = 0) or not unique; i.e., that there are nonzero solutions for the constants. If the solution is unique, the vectors are linearly independent. If the solution is not unique, the vectors are linearly dependent.
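The uniqueness point can be made concrete with the first example: the zero choice of constants always works, but a nonzero choice works too. A small sketch, assuming numpy is available:

```python
import numpy as np

u = np.array([1, 0, 0])
v = np.array([0, 1, 0])
w = np.array([1, 1, 0])  # w = u + v, so this set is dependent

# The trivial choice a = b = c = 0 always yields the zero vector...
trivial = 0 * u + 0 * v + 0 * w

# ...but so does the nonzero choice a = 1, b = 1, c = -1,
# hence the solution is not unique and the set is dependent.
nontrivial = 1 * u + 1 * v + (-1) * w

print(trivial.tolist(), nontrivial.tolist())  # -> [0, 0, 0] [0, 0, 0]
```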
 
  • #13
annoymage said:
ooooooooooo, if that's it, hmm, like in example 1:

1. (a, b, c) = (-t, -t, t) for all real t

proving that it is linearly dependent,
but (0, 0, 0) is also a solution
This is a different question from the first one you posted.

(a, b, c) is a vector, and a single vector is linearly independent as long as it's not identically the zero vector. If u = &lt;a, b, c&gt; = &lt;-t, -t, t&gt;, then for any nonzero t the equation c1u = 0 has only a single solution, c1 = 0.

I'm not sure about what you're asking, though.
annoymage said:
hmm, I get what you are trying to say..

let me rephrase my definition:

[tex]a_1u_1 + a_2u_2 + a_3u_3 = 0[/tex]

with at least one [tex]a_i \neq 0[/tex], i = 1, 2, 3:

then it is linearly dependent;

otherwise, it is linearly independent.

am I right?
 
  • #14
Oh, of course it always has the zero solution, but in my opinion that alone is not the definition.

What I wrote in the fragment which you first quoted means literally that the vectors are independent ONLY IF zero is the ONLY solution. Your first example has another one, (1, 1, -1), and hence those vectors are NOT independent.

I believe we totally agree on the linear dependencies between vectors. It's just that the definition I was taught concentrated more on the uniqueness of the solution than on the equation SUM = 0 itself (as important as it is, though :)).
 
  • #15
irycio said:
Oh, of course it always has the zero solution, but in my opinion that alone is not the definition.
I don't understand what you're saying here.
irycio said:
What I wrote in the fragment which you first quoted means literally that the vectors are independent ONLY IF zero is the ONLY solution.
That's not what you wrote, which was
(ax+by+cz=0) =>( a=b=c=0)
This is the part after iff. In my counterexample I supplied an equation ax + by + cz = 0 that is obviously true, and came up with a solution a=b=c=0 (which are true statements), so the implication is true. The problem is that according to your definition the vectors x, y, and z are linearly independent when in fact they are not.
irycio said:
Your first example has another one, (1, 1, -1), and hence those vectors are NOT independent.
Right. So your definition is defective. I recognize that you understand the difference between linear dependence/independence. All I'm saying is that your definition doesn't work. It should say that the equation ax + by + cz = 0 has exactly one solution for the constants a, b, and c. That is exactly the subtlety I've been talking about, and the one that has escaped many generations of linear algebra students.
irycio said:
I believe we totally agree on the linear dependencies between vectors. It's just the definition I was told concentrated more on the uniqueness of the solution than on the equation SUM=0 itself (as important as it is, though :)).
That is the correct definition - the uniqueness of the solution. The fact that a linear combination of vectors adds up to the zero vector is completely immaterial.
 
  • #16
You're right, my definition "(ax+by+cz=0)..." didn't mean what I wanted it to mean.
 
  • #17
I'm still twisting my brain over how to do this.

hmm,

[tex]u_1(A^T - I) = 0[/tex]

[tex]u_1 \neq 0 \Rightarrow A^T - I = 0[/tex]

am I on the right path? ;P
 
  • #18
irycio, can you help me? T_T

Mark44, are you offline?

someone please, ngahaha, I am begging you
 
  • #19
Do you know about eigenvalues, eigenvectors, and similar matrices? For this problem you have [tex]Au_1^T = u_1^T[/tex], [tex]Au_2^T = \frac{1}{2}u_2^T[/tex], and [tex]Au_3^T = \frac{1}{3}u_3^T[/tex], where u1, u2, and u3 are not zero vectors.

You need to show that u1, u2, and u3 are linearly independent. I.e., show that the only solution of the equation c1u1 + c2u2 + c3u3 = 0 is c1 = c2 = c3 = 0.

To do this, you need to be able to say something about the matrix A.
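One standard way to carry out this hint, sketched as a worked derivation; it uses only the eigenvalue relations given in the problem statement and the constants c1, c2, c3 from the equation above.

```latex
% Suppose c_1 u_1^T + c_2 u_2^T + c_3 u_3^T = 0, with
% A u_1^T = u_1^T,\quad A u_2^T = \tfrac{1}{2} u_2^T,\quad A u_3^T = \tfrac{1}{3} u_3^T.
% Applying (A - \tfrac{1}{2}I)(A - \tfrac{1}{3}I) annihilates the u_2 and u_3 terms:
\left(A - \tfrac{1}{2}I\right)\left(A - \tfrac{1}{3}I\right)
\left(c_1 u_1^T + c_2 u_2^T + c_3 u_3^T\right)
= c_1\left(1 - \tfrac{1}{2}\right)\left(1 - \tfrac{1}{3}\right) u_1^T = 0.
% Since u_1 \neq 0, this forces c_1 = 0; repeating with the other two factor
% pairs gives c_2 = 0 and c_3 = 0, so the only solution is the trivial one.
```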
 
  • #20
I see, I see. Let me do some reading and then try to solve it, after I take a bath, owhoho..
My body is all sticky. Later, if I get the proof, I'll post it to let you check.
 

FAQ: Proving Linear Independence in a System of Vectors

1. What is a linear system?

A linear system is a set of equations that can be represented by linear functions. This means that the variables in the equations are raised to the first power and are not multiplied together. For example, each equation in a two-variable linear system has the general form Ax + By = C, where A, B, and C are constants and x and y are variables.

2. How do you prove that a system is linear?

To prove that a system is linear, you must show that each equation in the system can be represented by a linear function. This means that the variables are not multiplied together and are raised to the first power. You can also check that the system satisfies the properties of linearity, such as the principle of superposition and the principle of homogeneity.

3. What is the principle of superposition in linear systems?

The principle of superposition states that if a system is linear and homogeneous, then any linear combination of solutions is also a solution. In other words, if x and y are solutions, then ax + by (where a and b are constants) is also a solution. This principle is useful in proving the linearity of a system.

4. Can a system be linear if it has one nonlinear equation?

No, a system cannot be considered linear if it has even one nonlinear equation. This is because a nonlinear equation cannot be represented by a linear function and therefore does not satisfy the definition of a linear system.

5. How can I use matrices to prove the linearity of a system?

Matrices can be used to represent a linear system in a more compact and organized form. To prove the linearity of a system using matrices, you can perform operations such as row reduction to show that the system can be reduced to its simplest form while still satisfying the properties of linearity.
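Row reduction as described above can be sketched like this, assuming sympy is available; the rows are the vectors from the thread's first example.

```python
from sympy import Matrix

# Rows are the vectors u = <1,0,0>, v = <0,1,0>, w = <1,1,0>.
M = Matrix([[1, 0, 0],
            [0, 1, 0],
            [1, 1, 0]])

# rref() returns the reduced row-echelon form and the pivot columns;
# fewer pivots than rows means a zero row appeared, i.e. dependence.
reduced, pivots = M.rref()
print(pivots)  # -> (0, 1): only two pivots for three vectors
```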
