Linear algebra: linearly independent vectors

In summary, a set of vectors is linearly independent when no vector in the set can be written as a linear combination of the others. To determine whether a set of vectors is linearly independent, one can work directly from the definition or check the rank of the matrix formed by the vectors (or its determinant, if the matrix is square). Linear independence is important in linear algebra for finding a basis, solving systems of linear equations, and understanding the geometric properties of vector spaces. A linearly dependent set cannot be linearly independent, since at least one of its vectors contributes no information beyond the others. Linear independence also tells us when a system of linear equations has a unique solution.
  • #1
Mdhiggenz

Homework Statement


Determine whether the following vectors are linearly independent in P3. (Not sure what P3 stands for; maybe polynomials of degree 3?)

1, x^2, x^2 - 2

Let p1(x) = 1,
p2(x) = x^2,
p3(x) = x^2 - 2.

c1 p1(x) + c2 p2(x) + c3 p3(x) = z

where z = 0x^2 + 0x + 0.

I then set up a matrix from this relation: collecting terms on the left-hand side gives

(c2 + c3)x^2 + (c1 - 2c3).

The matrix I'm thinking of solving looks like this: [0 1 1; 0 0 0; 1 0 -2]. I know the zero row must be at the bottom, so I swap rows and get [1 0 -2; 0 1 1; 0 0 0].

When I actually solve it I get c2 = -c3,
which gives a nontrivial solution, so I state that the set is linearly dependent.

Is my work correct, and/or is there a faster way to come up with the conclusion that it is linearly dependent?

Thanks




Homework Equations





The Attempt at a Solution

 
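A minimal Python/SymPy sketch of this matrix approach (an illustration, not taken from the thread): write each polynomial in the coordinate basis {1, x, x^2}, stack the coordinate vectors as columns, and check the rank and null space of the resulting matrix.

import sympy as sp

x = sp.symbols('x')
p1, p2, p3 = sp.Integer(1), x**2, x**2 - 2

# Coordinates of each polynomial in the basis {1, x, x^2}; the polynomials
# become the columns of the matrix (same entries as above, rows reordered).
basis = [sp.Integer(1), x, x**2]
A = sp.Matrix([[sp.Poly(p, x).coeff_monomial(b) for p in (p1, p2, p3)]
               for b in basis])

print(A)              # Matrix([[1, 0, -2], [0, 0, 0], [0, 1, 1]])
print(A.rank())       # 2 pivots for 3 vectors -> linearly dependent
print(A.nullspace())  # [Matrix([[2], [-1], [1]])], i.e. 2*p1 - 1*p2 + 1*p3 = 0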
  • #2
Mdhiggenz said:

Homework Statement


Determine whether the following vectors are linearly independent in P3. (Not sure what P3 stands for; maybe polynomials of degree 3?)
It's probably "polynomials of degree less than 3."

The notation isn't consistent from textbook to textbook, in my experience.
Mdhiggenz said:
1, x^2, x^2 - 2

Let p1(x) = 1,
p2(x) = x^2,
p3(x) = x^2 - 2.

c1 p1(x) + c2 p2(x) + c3 p3(x) = z

where z = 0x^2 + 0x + 0.

I then set up a matrix from this relation: collecting terms on the left-hand side gives

(c2 + c3)x^2 + (c1 - 2c3).

The matrix I'm thinking of solving looks like this: [0 1 1; 0 0 0; 1 0 -2]. I know the zero row must be at the bottom, so I swap rows and get [1 0 -2; 0 1 1; 0 0 0].

When I actually solve it I get c2 = -c3,
which gives a nontrivial solution, so I state that the set is linearly dependent.

Is my work correct, and/or is there a faster way to come up with the conclusion that it is linearly dependent?

Thanks
This one is simple enough that you can tell that the set of functions is linearly dependent merely by observation. p3(x) is pretty obviously a linear combination of the other two functions.
 
  • #3
Mdhiggenz said:

Homework Statement


Determine whether the following vectors are linearly independent in P3. (Not sure what P3 stands for; maybe polynomials of degree 3?)
Most likely the set of all polynomials of degree 3 or less. The polynomials of degree 3 don't form a vector space. Nonetheless, shouldn't you find out for sure before attempting the problem?
 
  • #4
Mark44 said:
It's probably "polynomials of degree less than 3."

The notation isn't consistent from textbook to textbook, in my experience.

This one is simple enough that you can tell that the set of functions is linearly dependent merely by observation. p3(x) is pretty obviously a linear combination of the other two functions.

How so, Mark?

Would it simply be because the first two could be combined as x^2 - 2(1), which would give you p3?
 
  • #5
jbunniii said:
Most likely the set of all polynomials of degree 3 or less. The polynomials of degree 3 don't form a vector space. Nonetheless, shouldn't you find out for sure before attempting the problem?

True.
 
  • #6
Mdhiggenz said:
How so, Mark?

Would it simply be because the first two could be combined as x^2 - 2(1), which would give you p3?

I understand what you're trying to say, but you aren't saying it very well. It is because p3(x) = (1) * p2(x) + (-2) * p1(x). This shows that p3 is a linear combination of p1 and p2.
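For what it's worth, that identity can be checked symbolically in a couple of lines; this is just an illustrative sketch, not part of the thread.

import sympy as sp

x = sp.symbols('x')
p1, p2, p3 = sp.Integer(1), x**2, x**2 - 2

# p3(x) = (1)*p2(x) + (-2)*p1(x) exactly when the difference is the zero polynomial
print(sp.simplify(p3 - (1*p2 + (-2)*p1)))  # 0 -> p3 is a linear combination of p1, p2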
 
  • #7
Mark44 said:
I understand what you're trying to say, but you aren't saying it very well. It is because p3(x) = (1) * p2(x) + (-2) * p1(x). This shows that p3 is a linear combination of p1 and p2.

Oh! So just like in differential equations, then? The same logic is being used in this problem?
 
  • #8
Mdhiggenz said:
Oh! So just like in differential equations, then? The same logic is being used in this problem?
Yes. And the term "linear combination" means the same thing in both areas.
 

FAQ: Linearly independent vectors

What does it mean for vectors to be linearly independent?

A set of vectors in a vector space is linearly independent if no vector in the set can be expressed as a linear combination of the other vectors in the set. This means that each vector contributes information the others cannot provide and is needed to represent the subspace spanned by the set.
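For example, in R^2 the set {(1, 0), (0, 1), (1, 1)} is linearly dependent because (1, 1) = 1*(1, 0) + 1*(0, 1), while {(1, 0), (0, 1)} is independent. A quick NumPy check of this illustrative example:

import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([1.0, 1.0])

# v3 is a linear combination of v1 and v2, so {v1, v2, v3} is linearly dependent
print(np.allclose(v3, 1.0 * v1 + 1.0 * v2))  # True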

How can I determine if a set of vectors is linearly independent?

To determine whether a set of vectors is linearly independent, you can work from the definition. Arrange the vectors as columns of a matrix and perform row operations to bring it to row-echelon form. If every column contains a pivot (that is, the rank equals the number of vectors), then the vectors are linearly independent. When the matrix is square, you can also check its determinant: a non-zero determinant means the vectors are linearly independent.
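A short SymPy sketch of this procedure (the vectors here are made-up examples): put the vectors in the columns of a matrix, row reduce, and check that every column has a pivot; for a square matrix, the determinant gives the same answer.

import sympy as sp

# Three vectors in R^3 with v3 = v1 + v2, so the set is linearly dependent
v1, v2, v3 = [1, 0, 2], [0, 1, 1], [1, 1, 3]
A = sp.Matrix([v1, v2, v3]).T        # vectors as columns

rref, pivot_cols = A.rref()
print(pivot_cols)                    # (0, 1): only 2 pivot columns for 3 vectors -> dependent
print(A.det())                       # 0, which also signals dependence (square matrix)

# Two vectors in R^3 that are linearly independent: a pivot in every column
B = sp.Matrix([[1, 0], [0, 1], [2, 1]])
print(B.rref()[1])                   # (0, 1) -> independent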

What is the importance of linear independence in linear algebra?

Linear independence is a fundamental concept in linear algebra and is crucial for many applications. It allows us to determine a basis for a vector space: a linearly independent set of vectors whose linear combinations produce every vector in the space, each in exactly one way. Linear independence also plays a key role in solving systems of linear equations and in understanding the geometric properties of vector spaces.

Can a set of linearly dependent vectors be linearly independent?

No, a set of linearly dependent vectors cannot be linearly independent. This is because linear dependence means that one or more vectors in the set can be written as a linear combination of the other vectors. If this is the case, then the vectors are not independent and do not contribute unique information to the space.

How can I use linear independence to solve a system of linear equations?

If the coefficient matrix of a system of linear equations has linearly independent columns, then the system has at most one solution, and when the matrix is square it has exactly one solution for every right-hand side. This is because linearly independent columns form a basis for the column space of the matrix, so every vector in the column space can be written as a unique linear combination of the columns. That uniqueness is what pins down the values of the variables and gives the unique solution.
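A small NumPy sketch (with made-up numbers): a square coefficient matrix with linearly independent columns is invertible, so Ax = b has exactly one solution.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])           # columns (2, 1) and (1, 3) are linearly independent
b = np.array([5.0, 10.0])

print(np.linalg.matrix_rank(A))      # 2: full column rank
x = np.linalg.solve(A, b)            # unique solution of A @ x = b
print(x)                             # [1. 3.]
print(np.allclose(A @ x, b))         # True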
