Prove Independence of P3 Basis S

In summary: S = {1 + x², x + x³} is linearly independent, since neither polynomial is a scalar multiple of the other, and it can be augmented with {1, x} to form a basis S' = {1 + x², x + x³, 1, x} of P3. With only two vectors, checking for a scalar multiple suffices; with three or more vectors, pairwise checks are not enough.
  • #1
judahs_lion

Homework Statement


Given S = (1+x², x + x³

And augment S to form a basis S' of P3

The Attempt at a Solution



0 + 0x + 0x² + 0x³ = a(1 + x²) + b(x + x³)

= a + ax² + bx + bx³
 
  • #2
Isn't S dependent? x + x³ = (1 + x²)x
 
  • #3
judahs_lion said:

Homework Statement


Given S = (1+x², x + x³
I'm pretty sure you mean, S = {1 + x², x + x³}
judahs_lion said:
And augment S to form a Basis S' of P3


The Attempt at a Solution



0 + 0x + 0x² + 0x³ = a(1 + x²) + b(x + x³)

= a + ax² + bx + bx³
It would be helpful for you to state the complete problem. My guess is that it is two parts:
a) Prove that the functions in S = {1 + x2, x + x3} are linearly independent.
b) Augment S to a set S' that is a basis for P3.

For a, how is linear independence defined? From your work above, I'm not sure that you know. The definitions for linear independence and linear dependence are similar, and there is a subtlety that students often don't grasp.
For b, have you learned about the Gram-Schmidt process?
 
  • #4
I scanned it in. It's problem #15.
 

Attachments

  • takehomePage2.jpg
  • #5
Linear independence means the members of a set of vectors are independent of each other. None is a multiple of the other.
Haven't gotten to the Gram-Schmidt process yet.
 
  • #6
judahs_lion said:
I scanned it in. It's problem #15.
Prob. 15 is almost identical to prob. 13. The polynomials in P3 are essentially the same as vectors in R4. For example, 1 + 2x² <---> <1, 0, 2, 0>.
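The correspondence between polynomials in P3 and coefficient vectors in R4 can be sketched in code (an added illustration using sympy, not part of the original thread; the helper name `to_vector` is mine):

```python
# Map a polynomial in P3 to its coefficient vector in R^4,
# e.g. 1 + 2x^2 <---> [1, 0, 2, 0].
import sympy as sp

x = sp.symbols('x')

def to_vector(poly):
    """Coefficient vector (constant, x, x^2, x^3) of a polynomial in P3."""
    p = sp.Poly(poly, x)
    return [p.coeff_monomial(x**k) for k in range(4)]

print(to_vector(1 + 2*x**2))  # [1, 0, 2, 0]
```

Under this correspondence, questions about independence in P3 become questions about the rank of a matrix of coefficient vectors.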
 
  • #7
judahs_lion said:
Linear independence means the members of a set of vectors are independent of each other.
This isn't the definition, and besides, a definition of a term ought not use the same term in the definition. Look in your book and see how it defines linear independence.
judahs_lion said:
None is a multiple of the other.
This is a necessary condition for linear independence, but it is not sufficient. For example, consider the set {<1, 0, 0>, <0, 1, 0>, <1, 1, 0>}. None of these vectors is a multiple of any other vector in the set, yet these vectors are not linearly independent.
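This example can be checked numerically (an added sketch using numpy, not part of the thread): the three vectors have rank 2, so they are linearly dependent even though no vector is a scalar multiple of another.

```python
# Mark44's example: none of these vectors is a multiple of another,
# yet the set is linearly dependent, since the third is the sum of
# the first two. The matrix of the three vectors has rank 2 < 3.
import numpy as np

vectors = np.array([[1, 0, 0],
                    [0, 1, 0],
                    [1, 1, 0]])  # third row = first row + second row
print(np.linalg.matrix_rank(vectors))  # 2
```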
 
  • #8
Mark44 said:
This isn't the definition, and besides, a definition of a term ought not use the same term in the definition. Look in your book and see how it defines linear independence.
This is a necessary condition for linear independence, but it is not sufficient. For example, consider the set {<1, 0, 0>, <0, 1, 0>, <1, 1, 0>}. None of these vectors is a multiple of any other vector in the set, yet these vectors are not linearly independent.

Thanks for pointing that out. So what I need to do is a transformation as I did in the attached, and then the rest is just as in problem 13?
 

Attachments

  • afde.jpg
  • #10
Mark44 said:
Yes.

But for the second part, augmenting S to form a basis S' for P3, would that still be in the form of polynomials?
 
  • #11
You can use the augmented basis you found in problem 13, and "untransform" the vectors to get the other two polynomials you need for a basis for P3.
 
  • #12
Mark44 said:
You can use the augmented basis you found in problem 13, and "untransform" the vectors to get the other two polynomials you need for a basis for P3.

OK. Can you verify that problem 13 is done properly?
 

Attachments

  • test13.jpg
  • #13
Sure, those vectors are linearly independent, one of many possible sets of four vectors that span R^4.
 
  • #14
S' = {1+x², x+x³, 1, x}
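As a sanity check (an added sketch using numpy, not from the thread), the four polynomials in S' correspond to four coefficient vectors in R4 whose matrix has full rank, so S' is indeed a basis of P3:

```python
# Rows are coefficient vectors (constant, x, x^2, x^3) of the
# polynomials in S' = {1+x^2, x+x^3, 1, x}. Rank 4 means the four
# vectors are independent and span R^4, so S' is a basis of P3.
import numpy as np

S_prime = np.array([[1, 0, 1, 0],   # 1 + x^2
                    [0, 1, 0, 1],   # x + x^3
                    [1, 0, 0, 0],   # 1
                    [0, 1, 0, 0]])  # x
print(np.linalg.matrix_rank(S_prime))  # 4
```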
 
  • #15
Would reducing the matrix (1, 0; 0, 1; 1, 0; 0, 1) to (1, 0; 0, 1; 0, 0; 0, 0) have been another way to prove independence?
 
  • #16
Yes, but with just two vectors, that's overkill. Two vectors are linearly independent as long as neither one is a multiple of the other. If you have three vectors, though, it's not as obvious. I gave you an example of this in another thread.
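The row reduction proposed in #15 can be sketched as follows (an added illustration using sympy, not part of the original thread):

```python
# Row-reduce the 4x2 matrix whose columns are the coefficient vectors
# of 1+x^2 and x+x^3. A pivot in every column confirms independence.
import sympy as sp

M = sp.Matrix([[1, 0],
               [0, 1],
               [1, 0],
               [0, 1]])
rref, pivots = M.rref()
print(rref)    # Matrix([[1, 0], [0, 1], [0, 0], [0, 0]])
print(pivots)  # (0, 1) -- a pivot in each of the two columns
```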
 

FAQ: Prove Independence of P3 Basis S

What is the set S in P3?

Here S = {1 + x², x + x³} is a set of two polynomials in P3, the vector space of polynomials of degree at most 3. Since P3 is four-dimensional, S by itself is not a basis; the problem asks you to prove S is linearly independent and then augment it to a basis S'.

Why is it important to prove the independence of S?

A basis must consist of linearly independent vectors, so S can be extended to a basis S' only if its members are independent to begin with. Linear independence means that no vector in the set can be written as a linear combination of the others.

How do you prove the independence of S?

Show that the only solution to the equation a(1 + x²) + b(x + x³) = 0 is a = b = 0. This can be done by comparing coefficients of like powers of x, or by translating the polynomials into coefficient vectors in R4 and row-reducing the resulting matrix.
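The coefficient-comparison method can be sketched in code (an added illustration using sympy, not part of the original FAQ):

```python
# Expand a(1+x^2) + b(x+x^3) and require every coefficient of the
# resulting polynomial in x to vanish; the only solution is a = b = 0.
import sympy as sp

x, a, b = sp.symbols('x a b')
expr = sp.expand(a*(1 + x**2) + b*(x + x**3))
coeffs = sp.Poly(expr, x).coeffs()  # each coefficient must be zero
sol = sp.solve(coeffs, [a, b])
print(sol)  # {a: 0, b: 0}
```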

What are the consequences if S is not independent?

If S were not independent, at least one of its vectors could be written as a linear combination of the others, so S could not be extended to a basis and would span a subspace of dimension less than two.

Are there any real-world applications of proving linear independence?

Yes. Checking whether a set of vectors is linearly independent is routine in computer graphics (for example, testing whether vectors span three-dimensional space), and in physics, engineering, and economics whenever problems are modeled in finite-dimensional vector spaces.
