Regarding Basis and Dual Basis

  • #1
simpleton
Hi, I'm learning about vector spaces and I would like to ask some questions about it.

Suppose I have a vector space [itex]V[/itex] and a basis [itex]\{v_1, ..., v_n\}[/itex] for [itex]V[/itex]. Then there is a dual space [itex]V^*[/itex] consisting of all linear functions whose domain is [itex]V[/itex] and whose codomain is ℝ. The space [itex]V^*[/itex] has a dual basis [itex]\{x_1, ..., x_n\}[/itex], and one way of constructing this dual basis is to let [itex]x_i[/itex] be the function that returns the [itex]i^{th}[/itex] coordinate of a vector when that vector is expressed in the given basis for [itex]V[/itex]. The claim is that the span of these functions is equal to [itex]V^*[/itex], and therefore the space and its dual space have the same dimension.

I have a few questions. My textbook states that any linear function on [itex]V[/itex] can be expressed as a linear combination of the coordinate functions, but it does not explain why. May I know why this is the case? I am assuming it is because a linear function of a vector must be a linear combination of that vector's coordinates, but I'm not sure this must always be the case, since a linear function is merely defined to be something that respects addition and scalar multiplication.

The textbook also says that the coordinate functions are linearly independent. By this, I think they mean that if there is a linear combination of the coordinate functions that returns 0 for every input vector, then the coefficients of all the coordinate functions must be 0. I think this makes sense, but may I know if I interpreted it correctly?

Finally, it is mentioned that given any basis [itex]\{x_1, ..., x_n\}[/itex] of [itex]V^*[/itex], you can construct a basis for [itex]V[/itex] such that the given functionals act as coordinate functions for that basis of [itex]V[/itex]. By this, I think they mean that I can find a set [itex]\{v_1, ..., v_n\}[/itex] such that [itex]x_i(v_j) = 1[/itex] if [itex]i = j[/itex] and [itex]x_i(v_j) = 0[/itex] otherwise. But I'm not sure why this is true. I think that in order for this to be true, the following questions have to be answered.

1) Such a set [itex]\{v_1...v_n\}[/itex] exists.
2) Any vector in [itex]V[/itex] can be expressed as a linear combination of the vectors.
3) The constructed vectors are linearly independent.

I am not sure how 1) can be answered. I think 2) and 3) are actually the same question, because we know that [itex]V[/itex] and [itex]V^*[/itex] have the same dimension, so 2) and 3) have to be true at the same time for the Linear Dependence Lemma not to be violated. I am not sure how to prove 2), but I think I can prove 3). Suppose some linear combination vanishes, say [itex]\Sigma a_iv_i = 0[/itex] for scalars [itex]a_1...a_n[/itex]. If I apply [itex]x_i[/itex] to this linear combination, I get [itex]a_i = 0[/itex], so in the end all the coefficients are 0. May I know if there is a way to prove 2) directly?

Thank you very much!
 
  • #3
Let V be a vector space of dimension n. Then it has a basis [itex]\{x_1, x_2, \cdot\cdot\cdot, x_n\}[/itex]. For i from 1 to n, we define [itex]f_i[/itex] by requiring that [itex]f_i(x_j)= \delta_{ij}[/itex]. That is, [itex]f_i(x_j)[/itex] is equal to 1 if i= j, 0 if not. We define [itex]f_i[/itex] for any vector in V "by linearity": if [itex]x= a_1x_1+ a_2x_2+ \cdot\cdot\cdot+ a_nx_n[/itex] then [itex]f_i(x)= a_1f_i(x_1)+ a_2f_i(x_2)+ \cdot\cdot\cdot+ a_nf_i(x_n)= a_i[/itex].

Now, we show that this set of linear functionals is a basis for V*. Suppose f is any linear function from V to its underlying field. For all i from 1 to n, let [itex]a_i= f(x_i)[/itex]. Then it is easy to show that [itex]f(x)= a_1f_1(x)+ a_2f_2(x)+ \cdot\cdot\cdot+ a_nf_n(x)[/itex] so that [itex]f= a_1f_1+ a_2f_2+ \cdot\cdot\cdot+ a_nf_n[/itex]. That is, this set of n linear functionals spans the entire space of all linear functionals on V.

Finally, suppose [itex]a_1f_1+ a_2f_2+ \cdot\cdot\cdot+ a_nf_n= 0[/itex], the linear functional that takes all vectors in V to 0. Applying both sides to the basis vector [itex]x_j[/itex] gives [itex]a_j= 0[/itex] for every j, showing that these functionals are independent.
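For concreteness, here is a minimal numerical sketch of this construction in Python with NumPy (the basis and the functional f below are my own arbitrary examples, not from any textbook):

[code]
import numpy as np

# Basis of R^2 as the columns of B: x_1 = (1, 1), x_2 = (0, 1).
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# The coordinate functionals f_i are the rows of B^{-1}: applying
# row i to a vector (a dot product) returns the i-th coordinate of
# that vector with respect to {x_1, x_2}.
F = np.linalg.inv(B)

print(F @ B)  # identity matrix, i.e. f_i(x_j) = delta_ij

# Any linear functional f (represented here as a row vector) equals
# sum_i a_i f_i with a_i = f(x_i).
f = np.array([3.0, -2.0])     # f(v) = 3*v[0] - 2*v[1]
a = f @ B                     # a_i = f(x_i)
print(np.allclose(a @ F, f))  # True: f = sum_i f(x_i) f_i
[/code]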


(By the way, all of this requires that V be finite dimensional. An infinite dimensional vector space is never isomorphic to its algebraic dual: the functionals [itex]f_i[/itex] defined this way are still linearly independent, but they no longer span V*.)
 
  • #4
Hi HallsofIvy,

Thank you very much for your reply! I understand now how the linear functions work. Could you help me with the second part as well? How do you prove that a corresponding basis [itex]\{v_1...v_n\}[/itex] exists, given a basis of the dual space?
 
  • #5
This is how I like to do these things: First a few comments about notation.

I use the convention that basis vectors of V are written with indices downstairs, and components of vectors of V are written with indices upstairs. So we can write a basis for V as ##\{e_i\}##, and if x is in V, we have ##x=\sum_i x^i e_i##. But I'll write this as ##x=x^i e_i##. The idea is that since there's always a sum over the indices that occur twice, and never a sum over other indices, it's a waste of time and space to type summation sigmas.

For members of V*, the convention for indices is the opposite. We'll write a basis as ##\{e^i\}##, and if x is in V*, we'll write ##x=x_i e^i##.

Since you're new at this, you may find it difficult to keep track of which symbols denote members of V and which symbols denote members of V*, so for your benefit, I will write members of V as ##\vec x## and members of V* as ##\tilde x##. I found this notation helpful when I was learning this stuff, but I don't use it anymore. By the way, I'm going to omit some "for all" statements, in particular "for all i" and "for all j". I hope it will be obvious when I'm doing so.

Let ##\{\vec e_i\}## be an arbitrary basis for V. I like to define the dual basis ##\{\tilde e^i\}## by saying that for each i, ##\tilde e^i## is the member of V* such that ##\tilde e^i(\vec e_j)=\delta^i_j##. For all ##\vec x\in V##, we have
$$\tilde e^i(\vec x)=\tilde e^i(x^j\vec e_j)=x^j\tilde e^i(\vec e_j)=x^j\delta^i_j=x^i.$$ To prove that ##\{\tilde e^i\}## is a basis for V*, we must prove that it's a linearly independent set that spans V*. To see that it spans V*, let ##\tilde y\in V^*## be arbitrary. For all ##\vec x\in V##,
$$\tilde y(\vec x)=\tilde y(x^i\vec e_i)=x^i\tilde y(\vec e_i)=\tilde e^i(\vec x)\tilde y(\vec e_i) =\tilde y(\vec e_i)\tilde e^i(\vec x) =\big(\tilde y(\vec e_i)\tilde e^i\big)(\vec x).$$ So ##\tilde y=\tilde y(\vec e_i)\tilde e^i##. To see that ##\{\tilde e^i\}## is linearly independent, suppose that ##a_i\tilde e^i=0##. Then ##(a_i\tilde e^i)(\vec x)=0## for all ##\vec x\in V##. In particular ##(a_i\tilde e^i)(\vec e_j)=0## for all j. So for all j,
$$0=(a_i\tilde e^i)(\vec e_j) =a_i \tilde e^i(\vec e_j)=a_i\delta^i_j=a_j.$$ So all the ##a_i## are 0.
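A concrete example of this (my own illustration): in ##\mathbb R^2##, take ##\vec e_1=(1,1)## and ##\vec e_2=(0,1)##. Writing ##\vec x=(x,y)=x^1\vec e_1+x^2\vec e_2## gives ##x^1=x## and ##x^2=y-x##, so the dual basis is ##\tilde e^1(x,y)=x## and ##\tilde e^2(x,y)=y-x##; one checks ##\tilde e^1(\vec e_1)=1##, ##\tilde e^1(\vec e_2)=0##, ##\tilde e^2(\vec e_1)=0##, ##\tilde e^2(\vec e_2)=1##. For an arbitrary ##\tilde y## with ##\tilde y(x,y)=\alpha x+\beta y##, we get ##\tilde y(\vec e_1)=\alpha+\beta## and ##\tilde y(\vec e_2)=\beta##, and indeed ##(\alpha+\beta)\tilde e^1+\beta\tilde e^2## sends ##(x,y)## to ##(\alpha+\beta)x+\beta(y-x)=\alpha x+\beta y##.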

Now let ##\{\tilde e^i\}## be an arbitrary basis for V*. We want to use this to construct a basis ##\{\vec e_i\}## for V such that ##\tilde e^i(\vec e_j)=\delta^i_j##. Maybe there's a more direct approach, but this is the one I see immediately:

Let's write members of V** (the dual of V*) as ##\overleftarrow x##. We will write the dual basis of ##\{\tilde e^i\}## as ##\{\overleftarrow e_i\}##. Define a function ##f:V\to V^{**}## by ##f(\vec x)(\tilde y)=\tilde y(\vec x)##. (Since ##f(\vec x)## is in V**, it takes a member of V* as input.) It's not hard to show that this f is an isomorphism: it's linear, it's injective because ##f(\vec x)=0## means ##\tilde y(\vec x)=0## for all ##\tilde y\in V^*##, which forces ##\vec x=0##, and since ##\dim V^{**}=\dim V^*=\dim V##, injectivity implies surjectivity. Then we can define the basis for V by ##\vec e_i=f^{-1}(\overleftarrow e_i)##. Then we just verify that everything works out as intended.
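In coordinates, the abstract detour through V** collapses to a single matrix inversion. Here is a minimal sketch in Python/NumPy (my own illustration; the functionals in E are an arbitrary example): writing each given dual basis functional as a row of a matrix E, the basis we want consists of the columns of E⁻¹, since ##\tilde e^i(\vec e_j)=\delta^i_j## is exactly the statement that E times the matrix of basis columns is the identity.

[code]
import numpy as np

# An arbitrary example basis of (R^2)*: each functional acts on a
# column vector by matrix multiplication with the corresponding row.
E = np.array([[ 1.0, 0.0],   # e~^1(x, y) = x
              [-1.0, 1.0]])  # e~^2(x, y) = y - x

# The basis {e_1, e_2} of R^2 with e~^i(e_j) = delta^i_j is given
# by the columns of E^{-1}.
basis = np.linalg.inv(E)

print(basis)      # columns: e_1 = (1, 1), e_2 = (0, 1)
print(E @ basis)  # identity matrix, i.e. e~^i(e_j) = delta^i_j
[/code]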
 
  • #6
The vector space is isomorphic to the dual space of its dual space: any vector v defines a linear functional on dual vectors, l → l(v).

The dual basis of the dual basis is then the original basis back again.
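In the same matrix picture as the snippets above (a quick sketch under those conventions): if the basis vectors are the columns of B, the dual basis functionals are the rows of B⁻¹, and the dual basis of those functionals is given by the columns of (B⁻¹)⁻¹ = B again.

[code]
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 1.0]])  # basis vectors as columns

D = np.linalg.inv(B)        # dual basis functionals as rows
print(np.allclose(np.linalg.inv(D), B))  # True: dual of dual = original
[/code]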
 

Related to Regarding Basis and Dual Basis

1. What is a basis and dual basis in mathematics?

A basis is a set of linearly independent vectors that spans a vector space. A dual basis is the corresponding set of linear functionals on that space: applied to a vector, the i-th functional returns the vector's i-th coordinate with respect to the basis.

2. How are a basis and dual basis related?

The dual basis is the set of linear functionals that pair with the basis vectors via the Kronecker delta: the i-th functional sends the i-th basis vector to 1 and every other basis vector to 0. As a consequence, the dual basis can be used to read off the coordinates of any vector with respect to the original basis.

3. What is the significance of a basis and dual basis?

A basis and dual basis are important because they provide a way to easily represent and manipulate vectors in a vector space. They also allow coordinates to be defined and vectors to be paired with linear functionals.

4. How can one determine the basis and dual basis for a given vector space?

To determine a basis for a vector space, one can use Gaussian elimination on a spanning set to extract a linearly independent subset. If the basis vectors are then written as the columns of a matrix B, the dual basis functionals are the rows of B⁻¹ (equivalently, written as column vectors, the columns of the transpose of B⁻¹).
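A minimal sketch of this recipe in Python/NumPy (B below is an arbitrary example basis):

[code]
import numpy as np

# Basis vectors of R^3 as the columns of B (arbitrary example).
B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# Dual basis functionals, one per column of D = (B^{-1})^T.
D = np.linalg.inv(B).T

# Pairing the i-th functional with the j-th basis vector
# gives the Kronecker delta.
print(np.allclose(D.T @ B, np.eye(3)))  # True
[/code]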

5. Can a vector space have more than one basis and dual basis?

Yes, a vector space can have multiple bases and dual bases. This is because there can be more than one set of linearly independent vectors that can be used to span a vector space. However, the dimension of the vector space will remain the same regardless of the chosen basis and dual basis.
