Determining Linear Independence: Use Coordinate Vectors

In summary, the conversation discussed linear independence and dependence of a set of functions. The question asked for help in using coordinate vectors to determine whether a given set is linearly independent and, if it is not, to express one vector as a linear combination of the others. The conversation also touched on the pitfalls of testing for dependence by substituting specific values of x and on the importance of choosing a basis as a starting point.
  • #1
Benny
Can someone help me out with the following question?

Use coordinate vectors to determine whether or not the given set is linearly independent. If it is linearly dependent, express one of the vectors as a linear combination of the others.

The set S is [tex]\left\{ {2 + x - 3\sin x + \cos x,x + \sin x - 3\cos x,1 - 2x + 3\sin x + \cos x,2 - x - \sin x - \cos x,2 + \sin x - 3\cos x} \right\}[/tex].

So I assume that I start off by letting c_i (i = 1, 2, 3, 4, 5) be scalars, multiplying each element of S by the corresponding c_i, and collecting terms to get an equation which looks something like:

(something) + (something else)x + (another thing)sinx + (something different)cosx = 0.

I'm not really sure how to proceed at this point. One of the examples in my book, with a different set S (with 3 elements), substitutes 3 specific values of x into the equation and gets c_1 = c_2 = c_3 = 0, so that the set is linearly independent.

However, I'm not sure if that is the right method, because if I have S = {1, sin^2(x), cos^2(x)} then S is linearly dependent since 1 = (1)cos^2(x) + (1)sin^2(x). But if I substitute x = 0, x = pi/2, x = pi into the equation 1 + sin^2(x) + cos^2(x) = 0 then I get a homogeneous system which only has the trivial solution, and my book seems to suggest that it is enough to conclude from that that the set S is linearly independent (when it clearly isn't, as I just demonstrated).

More specifically, my book says that the equation (say c_1(1) + c_2(x) + c_3(sin x) = 0) must hold for all values of x, so it holds for specific values of x. I just don't know if that's a valid 'method' to use. If it is, then I could simply substitute 'convenient' values of x for the question that I included at the beginning of this message to get a simple system of equations.

In short, I'm not sure how to proceed with the question I included at the start of this message. Can someone please help me out?
 
  • #2
You should be able to see immediately that it's linearly dependent. The basis vectors here are 1, x, sin(x), cos(x)... there are only 4 of them, while there are 5 elements of your set. However, the question asks you to use co-ordinates to show this, and then express one in terms of the others. So let your basis be {1, x, sin(x), cos(x)}. You can express the first element of your set with respect to this basis with co-ordinates (2, 1, -3, 1). Do something similar with the other vectors, and then show linear dependence as you would if you were just dealing with vectors in R^4.
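Not part of the original exchange, but here is a minimal sympy sketch of the suggestion above: write each function as a coordinate vector with respect to the basis {1, x, sin x, cos x} and look for a dependence relation. The column ordering and the use of rank/nullspace are illustrative choices, not anything prescribed in the thread.

[code]
# A minimal sketch (not from the original thread): coordinate vectors of the
# five functions relative to the basis {1, x, sin x, cos x}, stacked as columns.
from sympy import Matrix

# 2 + x - 3 sin x + cos x   -> (2,  1, -3,  1)
# x + sin x - 3 cos x       -> (0,  1,  1, -3)
# 1 - 2x + 3 sin x + cos x  -> (1, -2,  3,  1)
# 2 - x - sin x - cos x     -> (2, -1, -1, -1)
# 2 + sin x - 3 cos x       -> (2,  0,  1, -3)
A = Matrix([
    [ 2,  0,  1,  2,  2],   # coefficients of 1
    [ 1,  1, -2, -1,  0],   # coefficients of x
    [-3,  1,  3, -1,  1],   # coefficients of sin x
    [ 1, -3,  1, -1, -3],   # coefficients of cos x
])

print(A.rank())       # at most 4, but there are 5 columns, so the set is dependent
print(A.nullspace())  # each null-space vector gives a dependence relation
[/code]

Any nonzero null-space vector (c_1, ..., c_5) means c_1 v_1 + ... + c_5 v_5 = 0, so any vector whose coefficient is nonzero can be solved for in terms of the others.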
 
  • #3
Benny said: However, I'm not sure if that is the right method, because if I have S = {1, sin^2(x), cos^2(x)} then S is linearly dependent since 1 = (1)cos^2(x) + (1)sin^2(x). But if I substitute x = 0, x = pi/2, x = pi into the equation 1 + sin^2(x) + cos^2(x) = 0 then I get a homogeneous system which only has the trivial solution, and my book seems to suggest that it is enough to conclude from that that the set S is linearly independent (when it clearly isn't, as I just demonstrated).

Are you sure the homogeneous system you got had only the trivial solution? Suppose a + b sin^2(x) + c cos^2(x) = 0 for all values of x. Substituting x = 0, x = pi/2, and x = pi (as per your suggestion), I get

{ a + c = 0
{ a + b = 0
{ a + c = 0

(And yes, a + c = 0 is supposed to be repeated). That system clearly has many solutions, for example (a, b, c) = (1, -1, -1).
 
  • #4
Muzza - Yeah, you're right. I forgot to square the negative one with the cos^2(x) term.

AKG - That seems right. After reading your response I see that what I wasn't doing in my previous attempts was choosing a basis as a starting point.

Thanks for the help.
 

FAQ: Determining Linear Independence: Use Coordinate Vectors

What is linear independence?

A set of vectors is linearly independent if the only way to write the zero vector as a linear combination of them is with all coefficients equal to zero. Equivalently, no vector in the set can be expressed as a linear combination of the other vectors in the set.

How do you determine if a set of vectors is linearly independent?

To determine linear independence of a set of vectors, you can use the coordinate vectors method. Write each vector in coordinates with respect to a basis, form a matrix A with these coordinate vectors as columns, and row reduce to solve Ax = 0 (where x is the vector of coefficients). If the only solution is the trivial one x = 0, the vectors are linearly independent; otherwise, any non-trivial solution gives a dependence relation.
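As a rough illustration of this test (the vectors here are made-up examples, not taken from the thread), a small sympy sketch comparing the rank with the number of columns:

[code]
# A small illustration (assumed example, not from the thread) of the
# row-reduction test: the columns of A are the vectors being tested.
from sympy import Matrix

A = Matrix([
    [1, 0, 1],
    [0, 1, 1],
    [0, 0, 0],
])

# Independent exactly when Ax = 0 has only the trivial solution,
# i.e. when the rank equals the number of columns.
print(A.rank() == A.shape[1])  # False: the third column is the sum of the first two
print(A.nullspace())           # a non-trivial solution, e.g. (-1, -1, 1)
[/code]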

What is the difference between linear independence and linear dependence?

Linear independence refers to a set of vectors in which no vector can be written as a linear combination of the others. Linear dependence, on the other hand, refers to a set of vectors in which at least one vector can be written as a linear combination of the others.

Can a set of vectors be both linearly independent and linearly dependent?

No, a set of vectors cannot be both linearly independent and linearly dependent. They are mutually exclusive concepts. A set of vectors can either be linearly independent or linearly dependent, but not both.

Why is determining linear independence important in mathematics and science?

Determining linear independence is important in mathematics and science because it helps us understand the relationships between vectors and their span. It also allows us to solve systems of equations and make predictions in a variety of fields, such as physics, engineering, and computer science.
