Bases of functions and matrices with respect to the bases

  • #1
Kronos1
Hi all, I'm struggling with the concepts involved here.

So I have that \(\displaystyle {P}_{2} = \left\{ a{t}^{2}+bt+c \mid a,b,c\in\mathbb{R}\right\}\) is a real vector space with respect to the usual addition of polynomials and multiplication of a polynomial by a constant.

I need to show that both \(\displaystyle \beta=\left\{1,t,{t}^{2}\right\}\) and \(\displaystyle \beta^{\prime}=\left\{t,{t}^{2}+t,{t}^{2}+t+1\right\}\) are bases for \(\displaystyle {P}_{2}\).

Then a real polynomial \(\displaystyle p(t)\) defines the differentiable function
\(\displaystyle p:\mathbb{R}\to\mathbb{R}, \space x\mapsto p(x) \)
As shown in elementary calculus, differentiation is the linear transformation
\(\displaystyle D:{P}_{2} \to {P}_{2}, \space p\mapsto p^{\prime}=\frac{dp}{dx}\)
Find the matrix of $D$ with respect to the bases:

(i) \(\displaystyle \beta\) in both the domain and co-domain
(ii) \(\displaystyle \beta\) in the domain and \(\displaystyle \beta^{\prime}\) in the co-domain
(iii) \(\displaystyle \beta^{\prime}\) in the domain and \(\displaystyle \beta\) in the co-domain
(iv) \(\displaystyle \beta^{\prime}\) in both the domain and co-domain

Any help would be appreciated, as well as a detailed explanation of why. Thanks in advance.
 
  • #2
You are asking questions that are explained in any textbook of linear algebra. Yes, it is possible to duplicate a textbook and write a detailed explanation here, but what would justify this effort? Can you convince us that you have no access to textbooks?

See rule #11 http://mathhelpboards.com/rules/ (click on the "Expand" button on top). With respect to proving that the given sets of polynomials are bases, I suggest you show your effort by demonstrating that you know what a basis is. Can you prove that the given sets satisfy at least a part of that definition?

With respect to the matrix of the differentiation operator, its columns are coordinates of the images of basis vectors. See, for example, Wikipedia. In (i), you need to take each vector from $\beta$, differentiate it and find the coordinates of the result in the same basis $\beta$. For example, $D(t^2)=2t$, so the coordinates of $Dt^2$ with respect to $\beta$ are $(0,2,0)$. This is the last column (since $t^2$ is the last basis vector) of the matrix of $D$ with respect to $\beta$ and $\beta$.
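For concreteness, this column-by-column recipe can be sketched in Python with NumPy. This is only an illustrative sketch, not part of the original explanation: coordinates with respect to $\beta=\{1,t,t^2\}$ are stored as $[c,b,a]$ for $p=c+bt+at^2$, and the helper `diff_coords` is a hypothetical name introduced here.

```python
import numpy as np

# Coordinates w.r.t. beta = {1, t, t^2}: p = c + b*t + a*t^2  ->  [c, b, a]
def diff_coords(p):
    """beta-coordinates of the derivative of a polynomial given in beta-coordinates."""
    c, b, a = p
    # p' = b + 2*a*t  ->  coordinates [b, 2*a, 0]
    return np.array([b, 2 * a, 0])

# The columns of the matrix of D are the beta-coordinates of D(1), D(t), D(t^2).
basis = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]
D = np.column_stack([diff_coords(v) for v in basis])
print(D)
# [[0 1 0]
#  [0 0 2]
#  [0 0 0]]
```

The last column is indeed $(0,2,0)$, matching the coordinates of $D(t^2)=2t$ computed above.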
 
  • #3
You assume that I have not already tried a textbook. Coming to this forum is a last resort, as I am struggling to understand the definitions in the textbooks.

From all the reading I have done, I have only ascertained that a basis is a set of vectors from which you can build every other vector in the space by adding scalar multiples of the basis vectors. I cannot quite understand how to go about proving that a given set is a basis.
 
  • #4
Kronos1 said:
I have only ascertained that the basis is a set of vectors that you can make all other vectors in that space out of via the addition of scalar multiples of the basis vectors. I cannot quite understand how you go about proving that they are basis vectors?
Hmm, I think it should be obvious that every element $at^2+bt+c$ of $P_2$ can be obtained from $1$, $t$ and $t^2$ using addition and multiplication by scalars. As for the second basis, suppose that $a$, $b$ and $c$ are fixed and we want to express $at^2+bt+c$ through elements of $\beta'$. We must have
\[
xt+y(t^2+t)+z(t^2+t+1)=at^2+bt+c
\]
for some coefficients $x$, $y$ and $z$. Equating coefficients of the same powers of $t$, we get the following system of equations.
\[
\left\{
\begin{array}{rcl}
z&=&c\\
x+y+z&=&b\\
y+z&=&a
\end{array}
\right.
\qquad(*)
\]
Moving the first equation down converts it into echelon form, so you can determine if it has solutions.
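As a sketch of that back-substitution in Python (the helper name `solve_star` is hypothetical, introduced here for illustration): after reordering, the echelon system reads $x+y+z=b$, $y+z=a$, $z=c$, and solving from the bottom up gives $z=c$, $y=a-z$, $x=b-y-z$.

```python
# Back-substitution for the echelon form of system (*):
#   x + y + z = b
#       y + z = a
#           z = c
def solve_star(a, b, c):
    """Coordinates (x, y, z) of a*t^2 + b*t + c w.r.t. beta' = {t, t^2+t, t^2+t+1}."""
    z = c
    y = a - z
    x = b - y - z
    return x, y, z

# Example: t^2 + 4t + 2, i.e. a = 1, b = 4, c = 2
x, y, z = solve_star(1, 4, 2)
print(x, y, z)  # 3 -1 2
```

One can check directly that $3t - (t^2+t) + 2(t^2+t+1) = t^2+4t+2$, so the system has a (unique) solution for these values of $a$, $b$, $c$.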

It is important to note that the ability to express every vector as a linear combination of vectors from $\beta'$ does not by itself make $\beta'$ a basis: it is only one of the two conditions on a basis. The condition you named says that $\beta'$ has sufficiently many vectors to express all vectors in the space. The second condition says that it does not have too many vectors, so that expressing other vectors through $\beta'$ does not become ambiguous. Every vector must be expressed in a unique way. It is sufficient to show this only for the zero vector. We know that $0\cdot t^2+0\cdot t+0=0\cdot t+0(t^2+t)+0(t^2+t+1)$. It remains to show that no other linear combination gives the zero vector. You can determine if this is so by looking at the system (*) above where $a=b=c=0$ and figuring out how many solutions it has.
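This uniqueness check can be sketched numerically (an illustration only, assuming NumPy): the matrix `M` below just collects the coefficients of system (*) in the unknowns $(x,y,z)$, and a nonzero determinant means the homogeneous system $a=b=c=0$ has only the trivial solution.

```python
import numpy as np

# Coefficient matrix of system (*) in the unknowns (x, y, z):
#   z = c,   x + y + z = b,   y + z = a
M = np.array([[0, 0, 1],
              [1, 1, 1],
              [0, 1, 1]])

# A nonzero determinant implies the homogeneous system has only the
# trivial solution x = y = z = 0, i.e. beta' is linearly independent.
print(int(round(np.linalg.det(M))))  # 1
```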
 
  • #5
Evgeny.Makarov said:
It is important that the ability to express every vector as a linear combination of vectors from $\beta'$ does not make $\beta'$ a basis: it's only one of the two conditions on a basis.

Actually, since we are in a finite-dimensional vector space and since $|\beta| = |\beta'|$, spanning alone is sufficient, because

$\dim(\text{span}(\beta')) \leq |\beta'|$,

with equality exactly when $\beta'$ is linearly independent.

The condition you state, "the ability to express every vector as a linear combination of vectors from $\beta'$",

is equivalent to saying $\text{span}(\beta') = P_2$, which has dimension 3, so "less than" is not an option.
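This dimension argument can be sketched numerically (again an illustration only, assuming NumPy): the matrix `B` below collects the $\beta$-coordinates of the vectors of $\beta'$, and full rank means $\text{span}(\beta') = P_2$.

```python
import numpy as np

# beta'-vectors written in beta = {1, t, t^2} coordinates [c, b, a]:
B = np.column_stack([
    [0, 1, 0],  # t
    [0, 1, 1],  # t^2 + t
    [1, 1, 1],  # t^2 + t + 1
])

# rank 3 = dim(P_2): beta' spans P_2, and since |beta'| = 3 equals the
# dimension, beta' is automatically a basis.
print(np.linalg.matrix_rank(B))  # 3
```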
 
  • #6
Evgeny.Makarov said:
You are asking questions that are explained in any textbook of linear algebra. Yes, it is possible to duplicate a textbook and write a detailed explanation here, but what would justify this effort? Can you convince us that you have no access to textbooks?

See rule #11 http://mathhelpboards.com/rules/ (click on the "Expand" button on top). With respect to proving that the given sets of polynomials are bases, I suggest you show your effort by demonstrating that you know what a basis is. Can you prove that the given sets satisfy at least a part of that definition?

With respect to the matrix of the differentiation operator, its columns are coordinates of the images of basis vectors. See, for example, Wikipedia. In (i), you need to take each vector from $\beta$, differentiate it and find the coordinates of the result in the same basis $\beta$. For example, $D(t^2)=2t$, so the coordinates of $Dt^2$ with respect to $\beta$ are $(0,2,0)$. This is the last column (since $t^2$ is the last basis vector) of the matrix of $D$ with respect to $\beta$ and $\beta$.
Could you recommend one of these textbooks? I have three at the moment and cannot find direction from them.
 

Related to Bases of functions and matrices with respect to the bases

1. What are bases of functions and matrices?

A basis of a vector space (whether a space of functions or of matrices) is a set of vectors that can be used to represent every other vector in the space. In other words, basis vectors are the building blocks of the vector space.

2. How are bases of functions and matrices related?

Spaces of functions (such as $P_2$) and spaces of matrices are both vector spaces, so the same notion of basis applies to each: a linearly independent set that spans the space. Moreover, once bases are fixed, a linear map between such spaces is represented by a matrix, which is how the two concepts are related here.

3. What is the importance of bases of functions and matrices in linear algebra?

Bases are crucial in linear algebra because they let us represent abstract vector spaces concretely. Once a basis is fixed, every vector gets a coordinate representation and every linear transformation gets a matrix, so operations such as addition, scalar multiplication, and applying transformations reduce to computations with tuples of numbers.

4. How do you find the basis of a given function or matrix?

To find a basis of a vector space, you first need to identify the space itself. Then you can use methods such as Gaussian elimination to extract, from a given spanning set, a set of linearly independent vectors that still spans the space.

5. Can the basis of a function or matrix change?

Yes, in the sense that a vector space has many different bases, and the coordinates of a given function or matrix depend on which basis is chosen. Changing the basis changes the coordinate representation, not the vector itself.
