(From Hoffman and Kunze, Linear Algebra: Chapter 6.7, Exercise 11.) Note that ##V_j^0## denotes the annihilator of the subspace ##V_j##, and V* denotes the dual space of V.
1. Homework Statement
Let V be a vector space, let ##W_1 , \cdots , W_k## be subspaces of V, and let
$$V_j = W_1 + \cdots + W_{j-1} + W_{j+1} + \cdots + W_k$$
Suppose that ##V = W_1 \oplus \cdots \oplus W_k##. Prove that the dual space V* has the direct-sum decomposition ##V^{*}= V_1^0 \oplus \cdots \oplus V_k^0##.
Homework Equations
I use a portion of a theorem in the text referred to as Theorem 9, which states: "If ##E_1 , \cdots , E_k## are k linear operators on V which satisfy the following conditions:
(i) each ##E_i## is a projection;
(ii) ##E_i \circ E_j = 0## if i ≠ j;
(iii) ##I = E_1 + \cdots + E_k##, where I is the identity operator;
and we let ##W_i## be the range of ##E_i##, then ##V = W_1 \oplus \cdots \oplus W_k##."
The Attempt at a Solution
For each i = 1, ..., k, let ##E_i## be the linear operator on V defined as follows: if ##\alpha \in V## and ##\alpha = \alpha_1 + \cdots + \alpha_k## (with ##\alpha_j \in W_j##), then ##E_i(\alpha) = \alpha_i##. This is well defined because the decomposition of ##\alpha## relative to the direct sum is unique. The image of each ##E_i## is the corresponding subspace ##W_i##, and its null space is ##W_1 + \cdots + W_{i-1} + W_{i+1} + \cdots + W_k = V_i##.
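As a concrete numerical sanity check of this construction (not part of the proof), here is a small NumPy sketch. The decomposition ##V = \mathbb{R}^3## with ##W_1 = \text{span}\{(1,0,0),(1,1,0)\}## and ##W_2 = \text{span}\{(1,1,1)\}## is an arbitrary illustrative choice:

```python
import numpy as np

# Columns: a basis of W1 = span{(1,0,0),(1,1,0)} followed by
# a basis of W2 = span{(1,1,1)}. (An arbitrary choice for illustration.)
B = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
Binv = np.linalg.inv(B)

# E_i projects onto W_i along the complementary summand.
E1 = B @ np.diag([1., 1., 0.]) @ Binv
E2 = B @ np.diag([0., 0., 1.]) @ Binv

# A sample alpha = alpha_1 + alpha_2 with alpha_j in W_j.
alpha1 = 2 * B[:, 0] + 3 * B[:, 1]   # in W1
alpha2 = 5 * B[:, 2]                 # in W2
alpha = alpha1 + alpha2

# E_i picks out the W_i-component of alpha ...
assert np.allclose(E1 @ alpha, alpha1)
assert np.allclose(E2 @ alpha, alpha2)
# ... and annihilates the complementary summand (here V_1 = W_2, V_2 = W_1).
assert np.allclose(E1 @ alpha2, 0)
assert np.allclose(E2 @ alpha1, 0)
```

Here each ##E_i## is built by conjugating a diagonal 0/1 matrix by the change-of-basis matrix adapted to the direct sum, so ##E_1 + E_2 = I## holds by construction.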
Let ##^tE_i## denote the transpose of ##E_i##, i.e., the linear operator from V* into V* defined by ##^tE_i(f) = f \circ E_i##. Since the null space of ##E_i## is ##V_i##, and the range of a transpose is the annihilator of the null space, the image of ##^tE_i## is the annihilator ##V_i^0##. We seek to show that V* is the direct sum of the images of the ##^tE_i## for i = 1, ..., k.
(i): To show each transpose operator is a projection, let ##f \in V^{*}##; then $$(^tE_j \circ ^tE_j)(f) = ^tE_j(f \circ E_j) = f \circ E_j \circ E_j = f \circ E_j^2 = f \circ E_j = ^tE_j(f),$$ using ##E_j^2 = E_j##, since ##E_j## is a projection. Thus ##^tE_j## is a projection for j = 1, ..., k.
(ii): We now show that ##^tE_i \circ ^tE_j = 0## for i ≠ j. Again let ##f \in V^{*}##; then
##(^tE_i \circ ^tE_j)(f) = ^tE_i(^tE_j(f)) = ^tE_i(f \circ E_j) = f \circ E_j \circ E_i = f \circ 0 = 0##, where ##E_j \circ E_i = 0## because the image of ##E_i## is ##W_i##, which is contained in the null space ##V_j## of ##E_j## whenever i ≠ j.
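Properties (i) and (ii) can also be checked numerically. Relative to the standard basis of ##\mathbb{R}^3## and its dual basis, ##^tE_i## is represented by the transposed matrix ##[E_i]^t##; the concrete matrices below are the projections for the illustrative decomposition ##W_1 = \text{span}\{(1,0,0),(1,1,0)\}##, ##W_2 = \text{span}\{(1,1,1)\}##:

```python
import numpy as np

# Projection onto W1 = span{(1,0,0),(1,1,0)} along W2 = span{(1,1,1)}
# (an arbitrary illustrative decomposition of R^3).
E1 = np.array([[1., 0., -1.],
               [0., 1., -1.],
               [0., 0.,  0.]])
E2 = np.eye(3) - E1

# In the dual basis, ^tE_i is represented by the transposed matrix.
T1, T2 = E1.T, E2.T

# (i) each ^tE_i is a projection; (ii) ^tE_i o ^tE_j = 0 for i != j.
assert np.allclose(T1 @ T1, T1)
assert np.allclose(T2 @ T2, T2)
assert np.allclose(T1 @ T2, np.zeros((3, 3)))
assert np.allclose(T2 @ T1, np.zeros((3, 3)))
```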
(iii): Lastly, we must show that $$I = ^tE_1 + \cdots + ^tE_k$$ First, ##I = E_1 + \cdots + E_k##: if ##\alpha \in V## and ##\alpha = \alpha_1 + \cdots + \alpha_k## (##\alpha_j \in W_j##), then ##E_1(\alpha) + \cdots + E_k(\alpha) = \alpha_1 + \cdots + \alpha_k = \alpha##. Now fix an ordered basis ##\mathcal{B}## of V and let ##\mathcal{B}^{*}## be the dual basis of V*. The matrix of ##^tE_i## relative to ##\mathcal{B}^{*}## is the transpose of the matrix of ##E_i## relative to ##\mathcal{B}##, and the matrix of a sum of operators is the sum of their matrices. Hence the matrix of ##^tE_1 + \cdots + ^tE_k## relative to ##\mathcal{B}^{*}## is ##([E_1] + \cdots + [E_k])^t = [I]^t = [I]##, so ##^tE_1 + \cdots + ^tE_k## is the identity operator on V*. (One can also see this directly, without matrices: for ##f \in V^{*}## and ##\alpha \in V##, ##(f \circ E_1 + \cdots + f \circ E_k)(\alpha) = f(E_1(\alpha)) + \cdots + f(E_k(\alpha)) = f(\alpha)## by linearity of ##f##.)
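Property (iii) admits a quick numerical check as well: since transposition of matrices is linear, the transposed projection matrices must sum to the identity, and pointwise ##f \circ E_1 + f \circ E_2 = f## for any functional ##f## (a row vector). The matrices below are again from the illustrative decomposition ##W_1 = \text{span}\{(1,0,0),(1,1,0)\}##, ##W_2 = \text{span}\{(1,1,1)\}## of ##\mathbb{R}^3##:

```python
import numpy as np

# Projections for the toy decomposition of R^3.
E1 = np.array([[1., 0., -1.],
               [0., 1., -1.],
               [0., 0.,  0.]])
E2 = np.eye(3) - E1          # so E1 + E2 = I by construction

# (iii): since I = E1 + E2 and transposition is linear,
# ^tE1 + ^tE2 is represented by the identity matrix.
assert np.allclose(E1.T + E2.T, np.eye(3))

# Pointwise version: for a sample functional f (row vector),
# f o E1 + f o E2 = f.
f = np.array([2., -1., 7.])
assert np.allclose(f @ E1 + f @ E2, f)
```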
Therefore, by Theorem 9, $$V^{*}= V_1^0 \oplus \cdots \oplus V_k^0,$$ as was to be shown.

A part I'm not entirely clear on is the justification in (iii). Mainly, I'm fairly sure that if a linear transformation is a sum of other linear transformations, then its matrix with respect to a fixed pair of bases is the sum of the matrices of the summands in those same bases, but Chapter 3 was a while ago. :P
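The conclusion itself can be illustrated numerically: a sample functional ##f## splits as ##f = f \circ E_1 + f \circ E_2## with ##f \circ E_i \in V_i^0##, and the annihilator dimensions satisfy ##\dim V_i^0 = \dim V - \dim V_i##. As before, the matrices come from the illustrative decomposition ##W_1 = \text{span}\{(1,0,0),(1,1,0)\}##, ##W_2 = \text{span}\{(1,1,1)\}##, for which ##V_1 = W_2## and ##V_2 = W_1##:

```python
import numpy as np

# Projections for the toy decomposition of R^3.
E1 = np.array([[1., 0., -1.],
               [0., 1., -1.],
               [0., 0.,  0.]])
E2 = np.eye(3) - E1

w2 = np.array([1., 1., 1.])      # basis of V_1 = W_2
W1 = np.array([[1., 0., 0.],
               [1., 1., 0.]])    # basis of V_2 = W_1 (rows)

f = np.array([2., -1., 7.])      # a sample functional, as a row vector
f1, f2 = f @ E1, f @ E2          # components in range(^tE1), range(^tE2)

assert np.allclose(f1 + f2, f)   # f decomposes as f1 + f2
assert abs(f1 @ w2) < 1e-9       # f1 annihilates V_1
assert np.allclose(W1 @ f2, 0)   # f2 annihilates V_2

# dim V_i^0 = dim V - dim V_i: here 3 - 1 = 2 and 3 - 2 = 1.
assert np.linalg.matrix_rank(E1.T) == 2
assert np.linalg.matrix_rank(E2.T) == 1
```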