Linear Algebra: Linear independence of a set of polynomials

In summary, for a pair of linearly independent polynomials {p, q}, adding the product pq yields a linearly independent set {p, q, pq} if and only if both p and q have degree greater than or equal to 1. One direction is proved by contraposition: if either p or q has degree 0, the set becomes linearly dependent. For the other direction, if both degrees are at least 1, then pq has degree at least 2, strictly greater than the degrees of p and q, so no nontrivial linear combination of p, q, and pq can equal the zero polynomial.
  • #1
Millacol88

Homework Statement



Let {p, q} be linearly independent polynomials. Show that {p, q, pq} is linearly independent if and only if deg p ≥ 1 and deg q ≥ 1.

Homework Equations



λ1p + λ2q = 0 ⇔ λ1 = λ2 = 0

The Attempt at a Solution



λ1p + λ2q + λ3pq = 0

I know that if λ3 = 0, then the coefficients of p and q must be zero, but if λ3 ≠ 0, how do I show that the degrees must be greater than or equal to 1?
 
  • #2
Assuming {p, q} is linearly independent.
{p, q, pq} is linearly independent [itex]\Leftrightarrow [/itex] deg p ≥ 1 and deg q ≥ 1.

{p, q, pq} is linearly independent [itex]\Rightarrow [/itex] deg p ≥ 1 and deg q ≥ 1. (1)
{p, q, pq} is linearly independent [itex]\Leftarrow [/itex] deg p ≥ 1 and deg q ≥ 1. (2)

You can easily prove (1) by transposition (~Q => ~P):
If deg p = 0 or deg q = 0, then {p, q, pq} is not linearly independent.

And for (2): if deg p ≥ 1 and deg q ≥ 1, what happens with deg pq?
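For reference, a sketch of the degree fact this hint points at (assuming the polynomial coefficients come from a field, so the nonzero leading coefficients of p and q multiply to a nonzero leading coefficient of pq):
[tex]\deg(pq) = \deg p + \deg q \ge 1 + 1 = 2 > \max(\deg p, \deg q).[/tex]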
 
  • #3
Thanks for the reply! Today I was able to prove (1) by contraposition but I thought that meant I had to prove (2) by contraposition as well. Can I just prove the positive statement for (2)?

Edit: Wait, taking statement A as "{p, q, pq} is linearly independent" and statement B as "deg p ≥ 1 and deg q ≥ 1", is the contrapositive of (1) ~A => ~B or ~B => ~A? I used the first one, but I'm beginning to think it's the second one.
 
  • #4
(1)
Let P be "{p, q, pq} is linearly independent"
and Q be "deg p ≥ 1 and deg q ≥ 1".
~P is "{p, q, pq} is linearly dependent"
~Q is "deg p < 1 or deg q < 1"
Then ~Q => ~P reads:
If deg p < 1 or deg q < 1, then {p, q, pq} is linearly dependent.
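A minimal worked case of ~Q => ~P (taking deg q = 0; the case deg p = 0 is symmetric): since {p, q} is linearly independent, q is not the zero polynomial, so q = c for some constant c ≠ 0, and then
[tex]pq = c\,p \quad\Longrightarrow\quad c\,p + 0\cdot q + (-1)\,pq = 0,[/tex]
a nontrivial relation, so {p, q, pq} is linearly dependent.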

P.S.: Is "contraposition" the right word? I searched Wikipedia to find out what it's called in English; in Spanish it is "contrarrecíproco", and I thought its translation was "transposition".

(2) I thought something like:
If deg p ≥ 1 and deg q ≥ 1, then deg(pq) ≥ 2, so λ_{1}p + λ_{2}q + λ_{3}pq = 0 only if λ_{3} = 0, and as p and q are linearly independent...
I don't remember exactly how to argue this formally, but if you write both polynomials as sums of powers of X, their product contains an X raised to a power n that is greater than any power appearing in p or q. At that power the equation reads λ_{1}·(0) + λ_{2}·(0) + λ_{3}·aX^{n} = 0 (with a ≠ 0), and you can't satisfy it without making λ_{3} = 0, because for the polynomial sum to equal zero, every coefficient of every power of X must be zero.
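Spelled out as a sketch (writing m = deg p ≥ 1 and n = deg q ≥ 1, with leading coefficients a_{m} ≠ 0 and b_{n} ≠ 0, and assuming the coefficients come from a field):
[tex]\deg(pq) = m + n \ge 2, \qquad m + n > m \quad\text{and}\quad m + n > n,[/tex]
so the coefficient of [itex]x^{m+n}[/itex] in [itex]\lambda_1 p + \lambda_2 q + \lambda_3 pq[/itex] is
[tex]\lambda_1\cdot 0 + \lambda_2\cdot 0 + \lambda_3\, a_m b_n = 0 \;\Longrightarrow\; \lambda_3 = 0,[/tex]
and then the linear independence of {p, q} forces [itex]\lambda_1 = \lambda_2 = 0[/itex].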
 

FAQ: Linear Algebra: Linear independence of a set of polynomials

What is linear independence?

Linear independence is a property of a set of vectors or functions where no vector or function in the set can be written as a linear combination of the others. In other words, there is no redundancy in the set and each vector or function is necessary to span the space it belongs to.
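As a small illustration (not taken from the thread above), the set {1, x, 1 + x} is linearly dependent, since
[tex]1\cdot(1) + 1\cdot(x) + (-1)\cdot(1 + x) = 0,[/tex]
so the third polynomial is redundant: it is a linear combination of the first two.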

How is linear independence determined for a set of polynomials?

To determine linear independence for a set of polynomials, we can compare coefficients: set an arbitrary linear combination of the polynomials equal to the zero polynomial and equate the coefficient of each power of x to zero. This gives a homogeneous system of linear equations in the unknown scalars. If the only solution is the trivial solution (all scalars equal to 0), then the set of polynomials is linearly independent.
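A worked instance of this comparison (an illustration, not taken from the thread above): to check that {1 + x, 1 − x} is linearly independent, set
[tex]\lambda_1(1 + x) + \lambda_2(1 - x) = (\lambda_1 + \lambda_2) + (\lambda_1 - \lambda_2)x = 0,[/tex]
which requires λ_{1} + λ_{2} = 0 and λ_{1} − λ_{2} = 0, whose only solution is λ_{1} = λ_{2} = 0, so the set is linearly independent.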

Can a set of polynomials be both linearly independent and dependent?

No, a set of polynomials can only be either linearly independent or dependent. If the set is linearly independent, then no polynomial can be written as a linear combination of the others. If the set is linearly dependent, then at least one polynomial can be written as a linear combination of the others.

What is the relationship between linear independence and the dimension of a vector space?

The dimension of a vector space is the number of vectors (or functions) in a basis, that is, in a linearly independent set that spans the space. Equivalently, the dimension is the maximum number of linearly independent vectors or functions that can be found in any set within that space.
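For instance, the space of polynomials of degree at most 2 has dimension 3: the set {1, x, x²} is linearly independent and spans it, since every such polynomial can be written as
[tex]a + bx + cx^{2} = a\cdot 1 + b\cdot x + c\cdot x^{2},[/tex]
and no linearly independent set in this space can contain more than three polynomials.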

Why is linear independence important in linear algebra?

Linear independence is important in linear algebra because it allows us to determine the basis of a vector space, which is a set of linearly independent vectors or functions that can be used to represent any other vector or function in that space. This is crucial for solving many problems in mathematics, physics, and engineering that involve linear transformations and systems of linear equations.
