Dimension of orthogonal subspaces sum

In summary, the conversation discusses a proof that the dimension of the direct sum of two mutually orthogonal subspaces ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## is exactly ##n_1+n_2##, using orthonormal bases whose existence is ensured by the Gram-Schmidt theorem. The hint about the triangle inequality is likely a typo and should instead refer to theorem 4, which deals with the dimension of a vector space.
  • #1
Virgileo
Homework Statement
Suppose ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are two subspaces such that any element of ##\mathbb{V}_1## is orthogonal to any element of ##\mathbb{V}_2##. Show that the dimensionality of ##\mathbb{V}_1 \oplus \mathbb{V}_2## is ##n_1+n_2##. (Hint: the triangle inequality).
Relevant Equations
The triangle inequality:
$$|V + W| \leq |V| + |W|$$

Inner product and the norm:
$$\langle V | W \rangle = \sum_{i=1}^n v_i w_i$$, where ##v_i## and ##w_i## are components of vectors ##V## and ##W## in some orthonormal basis ##\{| u_i \rangle\}##.
$$|V|^2 = \langle V | V \rangle$$

Overall this question is from Shankar's quantum mechanics book, and he uses Dirac notation from the beginning to present the linear algebra needed for the course, so I will use this notation here too.
##| V_1 \rangle \in \mathbb{V}^{n_1}_1## and there is an orthonormal basis in ##\mathbb{V}^{n_1}_1##: ##|u_1\rangle, |u_2\rangle, \dots, |u_{n_1}\rangle##
##| V_2 \rangle \in \mathbb{V}^{n_2}_2## and there is an orthonormal basis in ##\mathbb{V}^{n_2}_2##: ##|w_1\rangle, |w_2\rangle, \dots, |w_{n_2}\rangle##
The existence of such orthonormal bases is ensured by the Gram-Schmidt theorem.

##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
##|V\rangle \in \mathbb{V} = \mathbb{V}_1 \oplus \mathbb{V}_2##
##|V\rangle = |V_1\rangle + |V_2\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle## (1)

Each sum in equation (1) is formed from linearly independent vectors (since they are multiples of basis vectors). We need to show that the two sets of basis vectors, taken together, are linearly independent of each other; that ensures the dimension of the sum space is at least ##n_1+n_2##. It cannot be greater than ##n_1+n_2##, because every vector in ##\mathbb{V}## can be represented by equation (1) by the definition of the subspace sum, so there cannot be an additional vector, linearly independent of all the others, that is needed in the component representation of some vector ##|V\rangle##.

To show that the two orthonormal bases, which also have the property that each element of one is orthogonal to every element of the other, are linearly independent as a combined set, write the linear combination and show that it must be trivial:
##\sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle = 0##
Taking the inner product of both sides with each ##\langle u_i |## and ##\langle w_j |##, we get correspondingly:
##|v_i\rangle = 0## and ##|v_j\rangle = 0## - for same-subspace elements because all the other vectors are orthogonal as members of an orthonormal basis, and for foreign-subspace elements because all the other vectors are orthogonal by the stated property of ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2##.
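Written out explicitly for one coefficient (##k## being an arbitrary fixed index), the projection step is:
$$\langle u_k | \left( \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle \right) = \sum_{i=1}^{n_1} v_i \langle u_k | u_i \rangle + \sum_{j=1}^{n_2} v_j \langle u_k | w_j \rangle = v_k + 0 = 0,$$
since ##\langle u_k | u_i \rangle = \delta_{ki}## and ##\langle u_k | w_j \rangle = 0##. Projecting with ##\langle w_k |## eliminates the coefficients of the second sum in the same way.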

Thus the dimension of the sum of subspaces is exactly ##n_1+n_2##.

This proof seems sound to me, and I hope it is such. But aside from checking it I am interested in the hint about triangle inequality - how does one use it in this case to prove this fact?
 
  • #2
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.
 
  • #3
Virgileo said:
##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
 
  • #4
Office_Shredder said:
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.

Yeah, they are indeed just numbers. About the triangle inequality: maybe it has something to do with the fact that if we take a vector whose component representation in an orthonormal basis is all ones:
$$\begin{bmatrix}
1 \\
1 \\
\vdots \\
1
\end{bmatrix}$$
then the inner product of this vector with itself will yield the dimension of the space. I was thinking maybe to take the vectors ##|V_1\rangle## and ##|V_2\rangle## as such vectors and then look at the inner product of their sum with itself, but after playing with it, it still led me nowhere...
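(To spell that claim out: for the all-ones vector the norm formula above gives
$$\langle V | V \rangle = \sum_{i=1}^{n} v_i v_i = \sum_{i=1}^{n} 1 \cdot 1 = n,$$
which is the dimension of the space.)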

Anyway, thanks for checking.

vela said:
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
Yeah you are right, I was being lazy with this notation...
 
  • #5
Virgileo said:
This proof seems sound to me, and I hope it is such. But aside from checking it I am interested in the hint about triangle inequality - how does one use it in this case to prove this fact?
I took a look in the second edition of Shankar I found online. I think it was just a typo. The hint should have referred to theorem 4, which has to do with the dimension of a vector space.
 

FAQ: Dimension of orthogonal subspaces sum

What is the dimension of the sum of two orthogonal subspaces?

The dimension of the sum of two orthogonal subspaces is equal to the sum of the dimensions of the individual subspaces. This is because orthogonal subspaces intersect only in the zero vector, so combining a basis of one with a basis of the other produces a linearly independent set whose size is the sum of the two dimensions.

How do you find the dimension of the sum of two orthogonal subspaces?

To find the dimension of the sum of two orthogonal subspaces, you can use the formula dim(U + V) = dim(U) + dim(V), where U and V are the two subspaces. Alternatively, you can find a basis for each subspace, combine them, and then use the Gram-Schmidt process to orthogonalize the combined basis vectors and count the number of resulting vectors.
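As a concrete illustration, here is a minimal Python/NumPy sketch. It counts the independent vectors with a matrix-rank computation rather than an explicit Gram-Schmidt pass, and the particular bases used are only an assumed example:

import numpy as np

# Two mutually orthogonal subspaces of R^4 (example data):
# V1 = span{e1, e2}, V2 = span{e3, e4}. Each row below is one basis vector.
basis_V1 = np.array([[1.0, 0.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0, 0.0]])
basis_V2 = np.array([[0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

# Stack both bases; the rank of the stacked matrix is dim(V1 + V2),
# i.e. the number of linearly independent vectors in the combined set.
combined = np.vstack([basis_V1, basis_V2])
print(np.linalg.matrix_rank(combined))  # 4, which equals dim(V1) + dim(V2) = 2 + 2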

Can the dimension of the sum of two orthogonal subspaces be greater than the dimension of the whole space?

No, the dimension of the sum of two orthogonal subspaces cannot be greater than the dimension of the whole space. This is because the sum of two subspaces can only contain vectors that are linear combinations of the basis vectors of each subspace, and therefore cannot have more linearly independent vectors than the whole space.

What is the relationship between the dimension of the sum of two orthogonal subspaces and their intersection?

If two subspaces are orthogonal, their intersection contains only the zero vector and therefore has dimension 0. By the general formula dim(U + V) = dim(U) + dim(V) - dim(U ∩ V), the dimension of the sum of two orthogonal subspaces is then exactly the sum of their individual dimensions.

How does the dimension of the sum of two orthogonal subspaces change if the subspaces are not orthogonal?

If the two subspaces are not orthogonal, their intersection may contain nonzero vectors, so its dimension may be greater than 0. In that case the dimension of the sum of the two subspaces is less than the sum of their individual dimensions, because some of the combined basis vectors become linearly dependent. In general, the dimension of the sum of two subspaces equals the sum of their individual dimensions minus the dimension of their intersection: dim(U + V) = dim(U) + dim(V) - dim(U ∩ V).
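As a worked instance of that formula (the particular subspaces here are just an illustrative choice): if U and V are two distinct planes through the origin in ##\mathbb{R}^3##, their intersection is a line, and
$$\dim(U + V) = \dim(U) + \dim(V) - \dim(U \cap V) = 2 + 2 - 1 = 3,$$
so the sum fills the whole space.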
