Is This a Correct Approach to Determining Linear Independence?

In summary: there is no way to choose A and B to make A cos(x) + B sin(x) = 0 true for all x other than A = B = 0, so the functions are linearly independent. The conversation covered the concept of linear independence in linear algebra, how to check for it by setting a linear combination of vectors or functions equal to zero, and how choosing specific values of the variable can be used to prove linear independence.
  • #1
Jamin2112

Homework Statement

Critique my understanding.

Homework Equations

From the omniscient Wikipedia:

In linear algebra, a family of vectors is linearly independent if none of them can be written as a linear combination of finitely many other vectors in the collection.

The Attempt at a Solution

So if I had (0, 0, 1)^T, (0, 2, -2)^T, (1, -2, 1)^T, and (4, 2, 3)^T, then I'd check whether at least one of them can be written as a linear combination of the others by looking at

k_1 (0, 0, 1)^T + k_2 (0, 2, -2)^T + k_3 (1, -2, 1)^T + k_4 (4, 2, 3)^T = 0

and seeing whether there is a nontrivial solution.

Why? Because if I look to see whether the first one can be written as a linear combination of the other three, I'd have

C_1 (0, 0, 1)^T = C_2 (0, 2, -2)^T + C_3 (1, -2, 1)^T + C_4 (4, 2, 3)^T

----> 0 = -C_1 (0, 0, 1)^T + C_2 (0, 2, -2)^T + C_3 (1, -2, 1)^T + C_4 (4, 2, 3)^T,

which is essentially the same thing. Ditto when I check to see whether each of the other three can be written as a linear combination of the rest. Sometimes my professor throws functions at us. For example, he asked whether the functions f(x) = cos(x) and g(x) = sin(x) (both R --> R) were linearly independent.

I look at

A cos(x) + B sin(x) = 0.

Since the domain is all real numbers x, I can just choose a convenient real number x to confirm my sneaking suspicion of linear independence. Let x = 0. Then A = 0 and B = B. Let x = π/2. Then B = 0 and A = A. … I'm a little confused. Can someone help me with the logic leading to the conclusion that my functions are linearly independent?
 
  • #2
Jamin2112 said:

Homework Statement

Critique my understanding.

Homework Equations

From the omniscient Wikipedia:

The Attempt at a Solution

So if I had (0, 0, 1)^T, (0, 2, -2)^T, (1, -2, 1)^T, and (4, 2, 3)^T, then I'd check whether at least one of them can be written as a linear combination of the others by looking at

k_1 (0, 0, 1)^T + k_2 (0, 2, -2)^T + k_3 (1, -2, 1)^T + k_4 (4, 2, 3)^T = 0

and seeing whether there is a nontrivial solution.

Why? Because if I look to see whether the first one can be written as a linear combination of the other three, I'd have

C_1 (0, 0, 1)^T = C_2 (0, 2, -2)^T + C_3 (1, -2, 1)^T + C_4 (4, 2, 3)^T

----> 0 = -C_1 (0, 0, 1)^T + C_2 (0, 2, -2)^T + C_3 (1, -2, 1)^T + C_4 (4, 2, 3)^T,

which is essentially the same thing. Ditto when I check to see whether each of the other three can be written as a linear combination of the rest.
This part looks good, but note that if you can solve for the coefficients, you are done. An easier way in this case is to notice that you have a set of 4 vectors while the dimension is 3, which guarantees that they are dependent, since you can't have a set of 4 independent vectors in R^3.
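For instance, here is a quick numerical sanity check of that dimension argument (a sketch using NumPy, not something from the thread's hand computation):

[code]
import numpy as np

# The four vectors from the problem, stacked as the columns of a 3x4 matrix.
V = np.array([[0, 0, 1, 4],
              [0, 2, -2, 2],
              [1, -2, 1, 3]])

# rank(V) is at most 3, but there are 4 columns, so a nontrivial
# solution k = (k1, k2, k3, k4) of V @ k = 0 must exist.
print(np.linalg.matrix_rank(V))  # prints 3

# The last right-singular vector spans the null space, giving an
# explicit nontrivial combination that equals the zero vector.
_, _, vt = np.linalg.svd(V)
k = vt[-1]
print(V @ k)  # approximately [0, 0, 0]
[/code]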

Jamin2112 said:
Sometimes my professor throws functions at us. For example, he asked whether the functions f(x) = cos(x) and g(x) = sin(x) (both R --> R) were linearly independent.
I look at
A cos(x) + B sin(x) = 0.

Since the domain is all real numbers x, I can just choose a convenient real number x to confirm my sneaking suspicion of linear independence. Let x = 0. Then A = 0 and B = B. Let x = π/2. Then B = 0 and A = A. … I'm a little confused. Can someone help me with the logic leading to the conclusion that my functions are linearly independent?

Starting from
A cos(x) + B sin(x) = 0, you can get cos(x) = K sin(x), where K = -B/A. Is there any K except 0 that satisfies the relation?
 
  • #3
Jamin2112 said:
Since the domain is all real numbers x, I can just choose a convenient real number x to confirm my sneaking suspicion of linear independence. Let x = 0. Then A = 0 and B = B. Let x = π/2. Then B = 0 and A = A. … I'm a little confused. Can someone help me with the logic leading to the conclusion that my functions are linearly independent?
This is a good way to start. There are more high-level ways of checking whether functions are linearly independent, for example using the Wronskian, but those are for another class. As bp_psy points out, there are easier ways to do it in the case of two functions. But if there are more, this technique generalizes quite nicely. At the moment, what we're interested in is whether there exist numbers A and B such that, for every possible choice of x, we have
[tex] A \cos(x) + B \sin(x) = 0[/tex]
If it's true for all x, then in particular it's true if x=0. Plugging in x=0, we know cos(0)=1 and sin(0)=0, so we get the equation A=0. So if [tex]A \cos(x) + B \sin(x) = 0[/tex] is true for all x, then A=0 must be true. Similarly, if we plug in [tex]x=\pi/2[/tex], we know that [tex]\cos(\frac{\pi}{2}) = 0[/tex] and [tex]\sin(\frac{\pi}{2}) = 1[/tex], so if [tex]A \cos(x) + B \sin(x) = 0[/tex] is true for all x, it must be true for [tex]x=\pi/2[/tex], and plugging that number in gives us the equation B=0.

So by checking several values of x, we get that A=0 and B=0 are necessary for [tex]A \cos(x) + B \sin(x) = 0[/tex] to be true for all x (in fact, necessary for it to hold at just the two values of x that we guessed). Hence cos(x) and sin(x) must be linearly independent.

If someone asks you to prove a bunch of functions are linearly independent, this is always a valid approach to use. Pick a bunch of points (pick as many points as you have functions), and prove that any linear combination vanishing at all of those points must have all of its coefficients equal to 0. There are two problems with this approach: you might pick a bad set of points (if you had tried x=0 and [tex]x=\pi[/tex] we wouldn't have gotten linear independence), and the functions might not be linearly independent at all. If you're not able to prove the coefficients are all 0 by picking a finite set of points, it might be because the functions are linearly dependent, but it might be that you just picked a bad set of points, and it's hard to tell which is the case.
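As an illustration of that recipe in code (my sketch; independent_at is a made-up helper name, and a rank-deficient sample matrix is inconclusive for exactly the reason above):

[code]
import numpy as np

def independent_at(funcs, points):
    # Build the sample matrix M[i, j] = f_j(x_i): one row per point,
    # one column per function.
    M = np.array([[f(x) for f in funcs] for x in points])
    # Full column rank means M @ c = 0 forces c = 0, so the functions
    # are linearly independent. Rank deficiency is inconclusive:
    # dependence, or just a bad choice of points.
    return np.linalg.matrix_rank(M) == len(funcs)

print(independent_at([np.cos, np.sin], [0.0, np.pi / 2]))  # True
print(independent_at([np.cos, np.sin], [0.0, np.pi]))      # False (bad points)
[/code]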
 
  • #4
Office_Shredder said:
This is a good way to start. There are more high-level ways of checking whether functions are linearly independent, for example using the Wronskian, but those are for another class. As bp_psy points out, there are easier ways to do it in the case of two functions. But if there are more, this technique generalizes quite nicely. At the moment, what we're interested in is whether there exist numbers A and B such that, for every possible choice of x, we have
[tex] A \cos(x) + B \sin(x) = 0[/tex]
If it's true for all x, then in particular it's true if x=0. Plugging in x=0, we know cos(0)=1 and sin(0)=0, so we get the equation A=0. So if [tex]A \cos(x) + B \sin(x) = 0[/tex] is true for all x, then A=0 must be true. Similarly, if we plug in [tex]x=\pi/2[/tex], we know that [tex]\cos(\frac{\pi}{2}) = 0[/tex] and [tex]\sin(\frac{\pi}{2}) = 1[/tex], so if [tex]A \cos(x) + B \sin(x) = 0[/tex] is true for all x, it must be true for [tex]x=\pi/2[/tex], and plugging that number in gives us the equation B=0.

So by checking several values of x, we get that A=0 and B=0 are necessary for [tex]A \cos(x) + B \sin(x) = 0[/tex] to be true for all x (in fact, necessary for it to hold at just the two values of x that we guessed). Hence cos(x) and sin(x) must be linearly independent.

If someone asks you to prove a bunch of functions are linearly independent, this is always a valid approach to use. Pick a bunch of points (pick as many points as you have functions), and prove that any linear combination vanishing at all of those points must have all of its coefficients equal to 0. There are two problems with this approach: you might pick a bad set of points (if you had tried x=0 and [tex]x=\pi[/tex] we wouldn't have gotten linear independence), and the functions might not be linearly independent at all. If you're not able to prove the coefficients are all 0 by picking a finite set of points, it might be because the functions are linearly dependent, but it might be that you just picked a bad set of points, and it's hard to tell which is the case.


I see. So it's sort of like the following?

Proof by contradiction. Assume there exist real numbers A and B, not both 0, such that A sin(x) + B cos(x) = 0 for all x. Call that assumption P.

Let x = 0. Then B = 0. Thus P implies B = 0.

Now let x = π/2. Then A = 0. Thus P implies A = 0.

We have that P implies A = B = 0, a contradiction. Thus the only real numbers A and B with A sin(x) + B cos(x) = 0 for all x are A = B = 0. The set of functions {sin(x), cos(x)} is therefore a linearly independent set.
 
  • #5
bp_psy said:
Is there any K except 0 that satisfies the relation?

No, but how do I show that?
 
  • #6
bp_psy said:
Starting from
A cos(x) + B sin(x) = 0, you can get cos(x) = K sin(x), where K = -B/A. Is there any K except 0 that satisfies the relation?
Well, for x = pi/4, K = 1, but this might not have been what bp_psy had in mind. Since the equation A cos(x) + B sin(x) = 0 has to be identically true for every value of x, my solution of K = 1 does NOT show that cos(x) and sin(x) are linearly dependent functions.

Jamin2112 said:
No, but how do I show that?

Office_Shredder showed one way, and alluded to another way that uses the Wronskian of the functions involved.

Here's a third way:
A cos(x) + B sin(x) = 0

-A sin(x) + B cos(x) = 0 (taking the derivative of both sides)

Solving this system of two equations in two unknowns (A and B) yields A = B = 0 as the only solution. Hence cos(x) and sin(x) are linearly independent functions.
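Spelling that system out in matrix form (my addition for clarity):
[tex]\begin{bmatrix} \cos(x) & \sin(x) \\ -\sin(x) & \cos(x) \end{bmatrix} \begin{bmatrix} A \\ B \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}[/tex]
The determinant of the coefficient matrix, which is exactly the Wronskian of cos(x) and sin(x), is [tex]\cos^2(x) + \sin^2(x) = 1 \neq 0[/tex], so the only solution is A = B = 0.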
 
  • #7
Mark44 said:
Well, for x = pi/4, K = 1, but this might not have been what bp_psy had in mind. Since the equation A cos(x) + B sin(x) = 0 has to be identically true for every value of x, my solution of K = 1 does NOT show that cos(x) and sin(x) are linearly dependent functions.

Office_Shredder showed one way, and alluded to another way that uses the Wronskian of the functions involved.

Here's a third way:
A cos(x) + B sin(x) = 0

-A sin(x) + B cos(x) = 0 (taking the derivative of both sides)

Solving this system of two equations in two unknowns (A and B) yields A = B = 0 as the only solution. Hence cos(x) and sin(x) are linearly independent functions.

Grrr ... The logic of it is still a little confusing for me.

cos(x) and sin(x) are linearly dependent if for all x there exist A and B s.t. ~(A = B = 0) and A sin(x) + B cos(x) = 0.

Is that right?
 
  • #8
Jamin2112 said:
Grrr ... The logic of it is still a little confusing for me.

cos(x) and sin(x) are linearly dependent if for all x there exist A and B s.t. ~(A = B = 0) and A sin(x) + B cos(x) = 0.

Is that right?
No (assuming that ~(A = B = 0) means "it's not true that A = 0 and B = 0"). The quantifiers are in the wrong order: dependence means that there exist A and B, not both zero, such that the equation holds for all x.

Here are some examples.
1) cos(x) and 2cos(x) are obviously linearly dependent.
The equation A cos(x) + B*2cos(x) = 0 has a solution A = B = 0, independent of the value of x. There are also many other solutions for A and B; namely, A = 1, B = -1/2; A = 2, B = -1; and so on.
2) cos(x) and sin(x) are linearly independent (which is what part of this thread establishes). The equation A cos(x) + B sin(x) = 0 has exactly one solution for A and B, namely A = B = 0, and this solution is independent of the value of x.

The difference between linear independence and linear dependence is strictly the difference in how many solutions for the constants there are in the equation c_1 f_1(x) + c_2 f_2(x) + ... + c_n f_n(x) = 0, required to hold for all x. For a set of n linearly independent functions, there will be exactly one solution for the constants. For a set of n linearly dependent functions, there will be an infinite number of solutions, one of which is c_1 = c_2 = ... = c_n = 0.
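To see the two cases side by side numerically (my sketch; the sample points are the ones used earlier in the thread):

[code]
import numpy as np

# Sample both pairs of functions at x = 0 and x = pi/2.
xs = [0.0, np.pi / 2]

dep = np.array([[np.cos(x), 2 * np.cos(x)] for x in xs])  # dependent pair
ind = np.array([[np.cos(x), np.sin(x)] for x in xs])      # independent pair

# Rank 1: infinitely many (A, B) satisfy dep @ (A, B) = 0.
print(np.linalg.matrix_rank(dep))  # 1
# Rank 2: only A = B = 0 works.
print(np.linalg.matrix_rank(ind))  # 2
[/code]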
 
  • #9
I think that I'm understanding this a little better.

A cos(x) + B sin(x) = 0 ----> tan(x) = - A / B

So the question is: does there exist an x such that A cos(x) + B sin(x) = 0 and ~(A = B = 0)? If so, then sin(x) and cos(x) are linearly dependent functions. If not, then sin(x) and cos(x) are linearly independent functions.
 
  • #10
Jamin2112 said:
I think that I'm understanding this a little better.

A cos(x) + B sin(x) = 0 ----> tan(x) = - A / B
But to get to the second equation, you are tacitly assuming that B is not 0.
Jamin2112 said:
So the question is: does there exist an x such that A cos(x) + B sin(x) = 0 and ~(A = B = 0)? If so, then sin(x) and cos(x) are linearly dependent functions. If not, then sin(x) and cos(x) are linearly independent functions.
 
  • #11
Mark44 said:
But to get to the second equation, you are tacitly assuming that B is not 0.

Is the following valid reasoning?

A sin(x) + B cos(x) = 0
A cos(x) - B sin(x) = 0 [1]
A sin(x) + B (B sin(x) / A) = 0 [2]
sin(x) (A + B^2/A) = 0
sin(x) ((A^2 + B^2) / A) = 0
A^2 + B^2 = 0 [3]
A = B = 0

[1] took the derivative of both sides
[2] combined lines 1 and 2
[3] since sin(x) isn't always zero, the numerator of (A^2 + B^2) / A must always be zero
 
  • #12
Jamin2112 said:
Is the following valid reasoning?

A sin(x) + B cos(x) = 0
A cos(x) - B sin(x) = 0 [1]
A sin(x) + B (B sin(x) / A) = 0 [2]
In the line above, you replaced cos(x) by B sin(x)/A. You got this by solving for cos(x) in the second equation and substituting for cos(x) in the first equation:
A cos(x) = B sin(x) ==> cos(x) = B sin(x)/A

To isolate cos(x) you are tacitly assuming that A != 0. You can't then later come along and say that A = 0.
Jamin2112 said:
sin(x) (A + B^2/A) = 0
sin(x) ((A^2 + B^2) / A) = 0
A^2 + B^2 = 0 [3]
A = B = 0

[1] took the derivative of both sides
[2] combined lines 1 and 2
[3] since sin(x) isn't always zero, the numerator of (A^2 + B^2) / A must always be zero

Instead, what you should do to solve the system
A cos(x) + B sin(x) = 0
-A sin(x) + B cos(x) = 0

is add sin(x) times the first equation to cos(x) times the second. That causes the terms in A to drop out and yields an equation in B alone; namely, B(sin^2(x) + cos^2(x)) = 0 ==> B = 0. Then go back to either of the original equations to show that A must also be zero.
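Written out term by term (my expansion of that step):
[tex]\sin(x)\left(A\cos(x) + B\sin(x)\right) + \cos(x)\left(-A\sin(x) + B\cos(x)\right) = B\left(\sin^2(x) + \cos^2(x)\right) = B[/tex]
Since both original equations equal zero, the combination is zero, so B = 0.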
 
Last edited:
  • #13
Mark44 said:
To isolate cos(x) you are tacitly assuming that A != 0. You can't then later come along and say that A = 0.

What's the quick fix to that problem?
 
  • #14
I added more to my post, showing how it should be done.
 
  • #15
Mark44 said:
Instead, what you should do to solve the system
A cos(x) + B sin(x) = 0
-A sin(x) + B cos(x) = 0

is add sin(x) times the first equation to cos(x) times the second. That causes the terms in A to drop out and yields an equation in B alone; namely, B(sin^2(x) + cos^2(x)) = 0 ==> B = 0. Then go back to either of the original equations to show that A must also be zero.

Thanks! I probably wouldn't have figured that out myself. Other sets of functions seem easier.

So, proceeding as before, this time with {e^x, e^{-x}}:

A e^x + B e^{-x} = 0
A e^x - B e^{-x} = 0 (taking the derivative of both sides)
---> 2B e^{-x} = 0 (subtracting the second from the first) ---> B = 0 ---> A e^x = 0 ---> A = 0
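For the record, the Wronskian mentioned earlier gives the same conclusion here (my addition):
[tex]W(e^x, e^{-x}) = \det\begin{bmatrix} e^x & e^{-x} \\ e^x & -e^{-x} \end{bmatrix} = -1 - 1 = -2 \neq 0[/tex]
so the pair is linearly independent.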
 
  • #17
Mark44 said:
Yes, perfect!







Another question:

(My exam begins 8 minutes from now)

When we look for a basis of a matrix A, we're basically (pun intended) simplifying Ax.

For example, if we have A = [2 -1 3; 1 0 1; 0 -2 1; 1 1 4], well, A row reduced is

[1 0 0; 0 1 0; 0 0 1; 0 0 0], so Ax is really just the span of e_1, e_2, e_3.
 
  • #18
Frankly, it doesn't even make sense to talk about a "basis" for a matrix. Only vector spaces have "bases". What you are doing is finding a basis for the range of A. Of course, your matrix row reduces with a pivot in every column, so its three columns are linearly independent and its range is a 3-dimensional subspace of R^4.
If, for instance, your matrix were
[tex]A= \begin{bmatrix}2 & -2 & 3 \\ 1 & 0 & 1 \\ 2 & 0 & 2\end{bmatrix}[/tex]
then "row reduction" gives
[tex]\begin{bmatrix}1 & 0 & 1 \\ 0 & -2 & 1 \\ 0 & 0 & 0\end{bmatrix}[/tex]

Of course, any vector, (x, y, z), in the range of A must satisfy A(a, b, c) = (2a - 2b + 3c, a + c, 2a + 2c) = (x, y, z) for some (a, b, c), so y = a + c and z = 2a + 2c = 2y. A basis for that space is {(1, 0, 0), (0, 1, 2)}. Can you get that from the row reduction of A?
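One way to check both objects by machine (a sketch using SymPy; the point is that the nonzero rows of the reduced form span the row space, while a basis for the range comes from the pivot columns of A itself):

[code]
import sympy as sp

# HallsofIvy's 3x3 example matrix.
A = sp.Matrix([[2, -2, 3],
               [1,  0, 1],
               [2,  0, 2]])

rref, pivots = A.rref()
print(rref)             # reduced form: its nonzero rows span the ROW space
print(A.columnspace())  # the pivot columns of A itself span the range
[/code]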
 

FAQ: Is This a Correct Approach to Determining Linear Independence?

What is Linear Independence?

Linear Independence is a concept in linear algebra where a set of vectors is said to be linearly independent if none of the vectors in the set can be written as a linear combination of the others. In simpler terms, it means that no vector in the set can be expressed as a combination of the other vectors, making each vector unique and necessary.

How is Linear Independence different from Linear Dependence?

Linear Independence is the opposite of Linear Dependence. A set of vectors is said to be linearly dependent if at least one vector in the set can be expressed as a linear combination of the others. This means that there is redundancy in the set, as some vectors can be represented by a combination of the others.

Why is Linear Independence important in mathematics and science?

Linear Independence is important because it helps us understand the relationship between vectors and their linear combinations. It allows us to identify unique and essential vectors in a set and determine the dimensionality of a space. It is also a fundamental concept in many mathematical and scientific fields, such as physics, engineering, and finance.

How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, check whether any of the vectors in the set can be expressed as a linear combination of the others; equivalently, set a linear combination of the vectors equal to the zero vector and see whether the only solution for the coefficients is the trivial one. If it is, the set is linearly independent. Another method, when the number of vectors equals the dimension of the space, is to compute the determinant of the square matrix formed by the vectors. If the determinant is non-zero, then the set is linearly independent.
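For example, here is the determinant method applied to three of the vectors from the thread above (a quick NumPy check of mine):

[code]
import numpy as np

# The first three vectors from the thread, as the columns of a square matrix.
M = np.array([[0, 0, 1],
              [0, 2, -2],
              [1, -2, 1]])

# A non-zero determinant means the columns are linearly independent.
print(np.linalg.det(M))  # about -2.0, so the three vectors are independent
[/code]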

Can a set of three or more vectors be linearly dependent?

Yes, a set of three or more vectors can be linearly dependent. In fact, any set containing more vectors than the dimension of the space must be linearly dependent, because at least one vector can then be expressed as a linear combination of the others. However, a linearly dependent set of three or more vectors can still contain a subset that is linearly independent.
