Show that Polynomials p0 to pn Form a Basis of F[t] ≤ n

  • Thread starter e.gedge
  • Tags: Basis
In summary, the polynomials p0 = 1 + x + x^2 + x^3 + ... + x^n, p1 = x + x^2 + x^3 + ... + x^n, p2 = x^2 + x^3 + ... + x^n, ..., pn = x^n form a basis of F[t] ≤ n, the space of polynomials of degree at most n.
  • #1
e.gedge
Quick and easy question I need help with, so thanks to anyone who will try it out.

Show that the polynomials p0 = 1 + x + x^2 + x^3 + ... + x^n, p1 = x + x^2 + x^3 + ... + x^n, p2 = x^2 + x^3 + ... + x^n, ..., pn = x^n form a basis of F[t] ≤ n, the space of polynomials of degree less than or equal to n.

Thanks!
xo
 
  • #2
For a problem like this it is best to use the definition. A "basis" for a vector space is a set of vectors that (1) span the space and (2) are independent.

Now look at the definitions of "span" and "independent".
To span the space means that every vector in the space can be written as a linear combination of the vectors in the set. Any vector in this space is a polynomial of degree less than or equal to n: any such vector can be written [itex]f(x)= \alpha_nx^n+ \alpha_{n-1}x^{n-1}+ \cdots+ \alpha_1 x+ \alpha_0[/itex]. Can you find numbers [itex]a_0, a_1, \ldots, a_n[/itex] so that [itex]a_0(1+ x+ x^2+ x^3+ \cdots+ x^n)+ a_1(x+ x^2+ x^3+ \cdots+ x^n)+ \cdots+ a_n(x^n)= \alpha_nx^n+ \alpha_{n-1}x^{n-1}+ \cdots+ \alpha_1 x+ \alpha_0[/itex]?
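For concreteness, here is a sketch of how the coefficient matching works out: [itex]p_j[/itex] contributes an [itex]x^k[/itex] term exactly when [itex]j\le k[/itex], so comparing coefficients of [itex]x^k[/itex] on both sides gives

[tex]a_0+ a_1+ \cdots+ a_k= \alpha_k, \qquad k= 0, 1, \ldots, n,[/tex]

a triangular system that can be solved from the top down: [itex]a_0= \alpha_0[/itex] and [itex]a_k= \alpha_k- \alpha_{k-1}[/itex] for [itex]k\ge 1[/itex].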

"Independent" means that the only linear combination of the vectors that equals the 0 vector is the "trivial" combination with all coefficients equal to 0. Suppose
[itex]a_0(1+ x+ x^2+ x^3+ \cdots+ x^n)+ a_1(x+ x^2+ x^3+ \cdots+ x^n)+ \cdots+ a_n(x^n)= 0[/itex] for all x. Can you prove that all the "a"s must be 0?
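The same coefficient comparison gives a sketch of the independence argument: in this space two polynomials are equal exactly when their coefficients agree, so equality with the zero polynomial forces

[tex]a_0= 0,\qquad a_0+ a_1= 0,\qquad \ldots,\qquad a_0+ a_1+ \cdots+ a_n= 0,[/tex]

and solving these equations in order gives [itex]a_0= a_1= \cdots= a_n= 0[/itex].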
 
  • #3
Uhh... I am pretty sure that I can do the span part, but I am still unsure how to go about proving that all the a's must equal 0.
 
  • #4
Oh, never mind, I got it now. Thanks a ton for the help!
 

FAQ: Show that Polynomials p0 to pn Form a Basis of F[t] ≤ n

What is the definition of a basis?

A basis is a set of vectors that can be used to represent any other vector in a given vector space. In other words, any vector in the space can be written as a linear combination of the basis vectors.
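For example, the monomials form the standard basis of the space of polynomials of degree at most n:

[tex]\{\,1,\; t,\; t^2,\; \ldots,\; t^n\,\},[/tex]

since every such polynomial is, by definition, a unique linear combination of these powers of t.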

How do you show that polynomials form a basis?

In order to show that polynomials form a basis, we must first prove that they are linearly independent and that they span the entire vector space. This can be done by showing that every polynomial of degree n or less can be written as a linear combination of the polynomials p0 to pn, and that this representation is unique.
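As a quick sanity check (illustrative only; it assumes NumPy, takes F to be the real numbers, and fixes n = 3), one can place the coordinate vectors of p0, ..., pn, written in the monomial basis, into the columns of a matrix and verify that the matrix is invertible. Invertibility gives linear independence and spanning in one step:

[code]
import numpy as np

n = 3
# Column j holds the coefficients of p_j = t^j + t^(j+1) + ... + t^n,
# listed from the constant term up to t^n, so entry (i, j) is 1 when i >= j.
P = np.array([[1.0 if i >= j else 0.0 for j in range(n + 1)] for i in range(n + 1)])

print(P)
print("determinant:", np.linalg.det(P))  # lower triangular with 1s on the diagonal, so det = 1

# Writing an arbitrary f = alpha_0 + alpha_1*t + alpha_2*t^2 + alpha_3*t^3 in the
# p-basis amounts to solving P @ a = alpha for the coefficient vector a.
alpha = np.array([5.0, -2.0, 7.0, 1.0])
a = np.linalg.solve(P, alpha)
print("coefficients in the p-basis:", a)  # [ 5. -7.  9. -6.], i.e. a_k = alpha_k - alpha_{k-1}
[/code]

The numerical check mirrors the hand argument: the matrix is triangular with 1s on the diagonal, so it is invertible over any field, not just the reals.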

What is F[t] ≤ n?

F[t] ≤ n represents the set of all polynomials with coefficients from a field F and degree n or less. In other words, it is the set of all polynomials of the form [itex]a_nt^n+ a_{n-1}t^{n-1}+ \cdots+ a_1t+ a_0[/itex], where the [itex]a_i[/itex] are elements of the field F.
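A small concrete instance: for n = 2,

[tex]F[t]_{\le 2}= \{\, a_2t^2+ a_1t+ a_0 \;:\; a_0, a_1, a_2\in F \,\},[/tex]

a vector space of dimension n + 1 = 3 over F, with monomial basis [itex]\{1, t, t^2\}[/itex].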

Why is it important to show that polynomials form a basis?

It is important to show that polynomials form a basis because it allows us to use them as a tool for solving various mathematical problems. By representing vectors in a vector space as linear combinations of these polynomials, we can apply algebraic techniques to manipulate them and to solve problems involving them.

How does the concept of a basis relate to linear independence?

A basis is a set of linearly independent vectors that span a vector space. This means that the vectors in the basis are not redundant and can be used to uniquely represent any other vector in the space. Thus, the concept of a basis is closely related to linear independence, as it is necessary for a set of vectors to be linearly independent in order to form a basis.
