Proving that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##

In summary, the thread proves the uniqueness of the decomposition by showing that the intersection of any two of the subspaces is only the zero vector, using the linear independence of the basis vectors and the fact that a sum of subspaces of ##V## is again a subspace of ##V##.
  • #1
JD_PM
TL;DR Summary
I want to prove the following: Let ##V## be a vector space and ##\beta## a basis for ##V##. Partition ##\beta## into a disjoint union of subsets ##\beta_1, \ldots, \beta_k## and let ##U_i = \text{span}(\beta_i)## for every ##i = 1, \ldots, k##. Prove that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##.
Attempt:

Take an arbitrary vector ##v \in V##. Then we have to show that there are unique vectors ##u_1 \in U_1, u_2 \in U_2, \ldots, u_k \in U_k## such that \begin{align*} v = u_1 + u_2 + \ldots + u_k. \end{align*}

We prove uniqueness by contradiction: suppose there is a second such decomposition, i.e. that \begin{align*} v= u_1' + u_2' + \ldots + u_k' \end{align*} also holds, with ##u_i' \in U_i##. Then we have \begin{align*} \sum_{i=1}^k u_i = \sum_{i=1}^k u_i', \end{align*} or \begin{align*} (u_1 - u_1') + (u_2 - u_2') + \ldots + (u_k - u_k') = 0. \end{align*} The latter equation requires that ##u_i = u_i'## for every ##i##, a contradiction.

Do you agree? Or is there a neater way to show it? :)

Thanks! :biggrin:
 
  • #2
No, not really. We have to show that ##U_1 +\ldots+ U_k \supseteq V## which is immediately clear by writing
JD_PM said:
##v=u_1+u_2+…+u_k.##
You should have been more detailed here. Set ##\beta_{i}:=\{u_{i1},\ldots,u_{in_i}\}.## Then we have ##v=\sum_{i=1}^k\sum_{j=1}^{n_i}c_{ij}u_{ij}##. Now define ##u_i:=\sum_{j=1}^{n_i}c_{ij}u_{ij}## for all ## i ##. Then
JD_PM said:
##v=u_1+u_2+…+u_k.##
##\in U_1+\ldots+U_k## and thus ##V\subseteq U_1+\ldots+U_k.##

This would have been better than merely writing
JD_PM said:
##v=u_1+u_2+…+u_k.##
which should have been the conclusion, not the first line.

Finally, you have to show now, that ##U_i\cap U_j=\{0\}## for all ##1\leq i \neq j\leq k.##
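
The grouping argument above can be checked numerically. A minimal sketch, assuming ##V = \mathbb{R}^4## with the standard basis partitioned into two subsets (the concrete vectors and variable names are illustrative, not from the thread):

```python
import numpy as np

beta = np.eye(4)                     # the basis vectors of V as columns
parts = [[0, 1], [2, 3]]             # column indices of beta_1 and beta_2

v = np.array([3.0, -1.0, 2.0, 5.0])  # an arbitrary v in V
c = np.linalg.solve(beta, v)         # coordinates c_ij of v in the basis

# u_i := sum_j c_ij u_ij: group the basis expansion by the partition
u = [beta[:, idx] @ c[idx] for idx in parts]

# v = u_1 + u_2 with u_i in U_i = span(beta_i), so V is contained in U_1 + U_2
assert np.allclose(u[0] + u[1], v)
```

Each ##u_i## is simply the part of the basis expansion of ##v## that uses vectors from ##\beta_i##, exactly as in the definition ##u_i:=\sum_{j=1}^{n_i}c_{ij}u_{ij}## above.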
 
  • #3
I appreciate the detailed explanation!

Given that we want to prove an equality, shouldn't we also discuss why the inclusion ##U_1 +\ldots+ U_k \subseteq V## holds?

My reasoning: the sum of subspaces of ##V## yields a subspace of ##V##.

fresh_42 said:
Finally, you have to show now, that ##U_i\cap U_j=\{0\}## for all ##1\leq i \neq j\leq k.##

Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.

Given that the subsets ##\beta_{i}:=\{u_{i1},\ldots,u_{in_i}\}## are disjoint, the only element the subspaces ##U_i## and ##U_j## have in common is the zero element. Hence it follows that ##x=0## and ##U_i\cap U_j=\{0\}##.
 
  • #4
JD_PM said:
I appreciate the detailed explanation!

Given that we want to prove an equality, shouldn't we also discuss why the inclusion ##U_1 +\ldots+ U_k \subseteq V## holds?

My reasoning: the sum of subspaces of ##V## yields a subspace of ##V## (proof).
Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.

Given that ##\beta_{i}:=\{u_{i1},\ldots,u_{in_i}\}## are disjoint subsets, the only element subspaces ##U_i## and ##U_j## have in common is the zero element. Hence, it follows that ##x=0## and ##U_i\cap U_j=\{0\}##
This is not right. You need to use the linear independence of the basis vectors.
 
  • #5
One of your problems is poor technique. For example, whenever you have ##x \in A \cap B## the next line automatically should be ##x \in A## and ##x \in B##.
 
  • #6
PeroK said:
This is not right. You need to use the linear independence of the basis vectors.

OK, could you please provide more details?

PeroK said:
One of your problems is poor technique.

I take this comment as constructive criticism. I do my best to get better.

PeroK said:
For example, whenever you have ##x \in A \cap B## the next line automatically should be ##x \in A## and ##x \in B##.

Alright.
 
  • #7
JD_PM said:
Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.
It's difficult to comment on that because it doesn't look like mathematics. It's like you are trying to formulate your ideas without the proper technique.

To follow on from the above:
$$x \in U_i \ \Rightarrow \ x = \sum_{l} a_l u_{il}$$ for some scalars ##a_l##. The same argument for ##j## gives ##x = \sum_{m} b_m u_{jm}##, so ##\sum_{l} a_l u_{il} - \sum_{m} b_m u_{jm} = 0##, which contradicts the linear independence of ##\beta## unless every ##a_l = b_m = 0##, i.e. ##x = 0##.
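
This rank argument can also be seen numerically. A sketch assuming two disjoint subsets of the standard basis of ##\mathbb{R}^4## (an illustrative setup, not from the thread):

```python
import numpy as np

B_i = np.eye(4)[:, :2]   # columns: the vectors of beta_i
B_j = np.eye(4)[:, 2:]   # columns: the vectors of beta_j (disjoint from beta_i)

# x in the intersection of U_i and U_j means B_i a = B_j b for some
# coefficient vectors a, b, i.e. [B_i | -B_j] (a, b)^T = 0.
M = np.hstack([B_i, -B_j])

# The stacked columns are the union of beta_i and beta_j, a subset of the
# basis beta, hence linearly independent: the null space is trivial,
# so a = b = 0 and therefore x = 0.
null_dim = M.shape[1] - np.linalg.matrix_rank(M)
assert null_dim == 0
```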
 

FAQ: Proving that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##

What does it mean for two subspaces to be direct sums?

Two subspaces ##U## and ##W## of a vector space form a direct sum if their intersection is only the zero vector and their sum is the whole space. Equivalently, every vector in the vector space can be written uniquely as a sum of a vector from ##U## and a vector from ##W##.

How do you prove that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##?

To prove that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##, you need to show that the subspaces together span ##V## and that the decomposition is unique. For ##k > 2##, trivial pairwise intersections are not enough; one needs the stronger condition ##U_i \cap \sum_{j \neq i} U_j = \{0\}## for every ##i##, or equivalently, that every vector in ##V## can be written in exactly one way as a sum ##u_1 + u_2 + \ldots + u_k## with ##u_i \in U_i##.
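
In the setting of this thread, the uniqueness step can be sketched directly from the basis expansion: if ##v = \sum_i u_i = \sum_i u_i'## with ##u_i = \sum_j c_{ij} u_{ij}## and ##u_i' = \sum_j c_{ij}' u_{ij}##, then subtracting gives

$$\sum_{i=1}^{k} \sum_{j=1}^{n_i} (c_{ij} - c_{ij}')\, u_{ij} = 0,$$

and since the ##u_{ij}## together form the basis ##\beta##, linear independence forces ##c_{ij} = c_{ij}'## for all ##i, j##, hence ##u_i = u_i'##.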

What is the significance of proving that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##?

Proving that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k## has several implications. It shows that the subspaces ##U_1, U_2, ..., U_k## are independent and that together they span the entire vector space ##V##. This can be useful in solving linear systems of equations and understanding the structure of a vector space.

Can a vector space have more than one direct sum decomposition?

Yes, a vector space can have many different direct sum decompositions, since different families of subspaces can satisfy the conditions. For example, ##\mathbb{R}^2## is the direct sum of any two distinct lines through the origin. The number of subspaces need not be the same in every decomposition, but in finite dimensions the dimensions of the summands always add up to ##\dim V##.

How is the direct sum related to the concept of linear independence?

The direct sum is closely related to linear independence. If a linearly independent set of vectors is partitioned into disjoint subsets, the subspaces spanned by those subsets form a direct sum. Conversely, if a family of subspaces forms a direct sum, then the union of a basis of each subspace is a linearly independent set.
