Virgileo
- Homework Statement
- Suppose ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are two subspaces such that any element of ##\mathbb{V}_1## is orthogonal to any element of ##\mathbb{V}_2##. Show that the dimensionality of ##\mathbb{V}_1 \oplus \mathbb{V}_2## is ##n_1+n_2##. (Hint: the triangle inequality).
- Relevant Equations
- The triangle inequality:
$$|V + W| \leq |V| + |W|$$
Inner product and the norm:
$$\langle V | W \rangle = \sum_{i=1}^n v_i^* w_i,$$ where ##v_i## and ##w_i## are the components of the vectors ##V## and ##W## in some orthonormal basis ##\{| u_i \rangle\}##.
$$|V|^2 = \langle V | V \rangle$$
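(As a side note, here is a quick numerical illustration of these two definitions with made-up components, assuming numpy is available; `np.vdot` conjugates its first argument, which matches the sum above.)

```python
import numpy as np

# Made-up components of |V> and |W> in some orthonormal basis,
# only to illustrate the definitions above numerically.
v = np.array([1.0 + 2.0j, 0.5, -1.0j])
w = np.array([2.0, 1.0 - 1.0j, 3.0])

inner = np.vdot(v, w)                          # <V|W> = sum_i v_i^* w_i (vdot conjugates its first argument)
norm = lambda x: np.sqrt(np.vdot(x, x).real)   # |X| = sqrt(<X|X>)

print(inner)
print(norm(v + w) <= norm(v) + norm(w))        # the triangle inequality holds: True
```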
This question is from Shankar's quantum mechanics book; he uses Dirac notation from the start to present the linear algebra needed for the course, so I will use that notation here as well.
##| V_1 \rangle \in \mathbb{V}^{n_1}_1##, and ##\mathbb{V}^{n_1}_1## has an orthonormal basis ##|u_1\rangle, |u_2\rangle, \dots, |u_{n_1}\rangle##.
##| V_2 \rangle \in \mathbb{V}^{n_2}_2##, and ##\mathbb{V}^{n_2}_2## has an orthonormal basis ##|w_1\rangle, |w_2\rangle, \dots, |w_{n_2}\rangle##.
The existence of such orthonormal bases is guaranteed by the Gram-Schmidt theorem.
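As an aside, here is a minimal sketch of the Gram-Schmidt procedure I am invoking (plain numpy, real vectors for simplicity; the function name and example vectors are my own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        # Remove the components along the already-built orthonormal vectors ...
        for u in basis:
            v = v - np.vdot(u, v) * u
        # ... and normalize what is left.
        basis.append(v / np.linalg.norm(v))
    return basis

ortho = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                      np.array([1.0, 0.0, 1.0]),
                      np.array([0.0, 1.0, 1.0])])
# The Gram matrix of the result is the identity, i.e. the basis is orthonormal.
print(np.round([[np.vdot(a, b) for b in ortho] for a in ortho], 10))
```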
##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
##|V\rangle \in \mathbb{V} = \mathbb{V}_1 \oplus \mathbb{V}_2##
##|V\rangle = |V_1\rangle + |V_2\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle## (1)
Each sum in equation (1) is built from linearly independent vectors (they are multiples of the basis vectors of the respective subspace). We need to show that the two sets of basis vectors, taken together, are still linearly independent; that guarantees the dimension of the sum space is at least ##n_1+n_2##. It also cannot be larger than ##n_1+n_2##: by the definition of the sum of subspaces, every vector of ##\mathbb{V}## can be written in the form (1), so no further vector, linearly independent of all the others, is ever needed to represent some ##|V\rangle##.
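Before writing out the proof, I also convinced myself numerically on one concrete example (just an illustration with numpy, of course not a proof):

```python
import numpy as np

# Toy check in R^5: take a random orthonormal basis, let the first 2 columns
# span V1 and the remaining 3 span V2, so every vector of V1 is orthogonal
# to every vector of V2 by construction (n1 = 2, n2 = 3).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))
V1, V2 = Q[:, :2], Q[:, 2:]

# Stacking spanning vectors of both subspaces gives a set of rank n1 + n2.
print(np.linalg.matrix_rank(np.hstack([V1, V2])))  # prints 5
```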
To show that the two orthonormal bases are linearly independent of each other (using the additional property that each element of one is orthogonal to every element of the other), write a vanishing linear combination and show that it must be trivial:
##\sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle = 0##
Taking the inner product of both sides with each ##\langle u_i |## and each ##\langle w_j |##, we get, correspondingly:
##v_i = 0## and ##v_j = 0##: the coefficients vanish because, for the terms from the same subspace, all the other basis vectors are orthogonal (they belong to an orthonormal basis), and for the terms from the other subspace, all the vectors are orthogonal by the assumed property of ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2##.
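Spelling out one of these steps explicitly, for a fixed ##k \leq n_1##:
$$0 = \langle u_k | \Big( \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle \Big) = \sum_{i=1}^{n_1} v_i \langle u_k | u_i \rangle + \sum_{j=1}^{n_2} v_j \langle u_k | w_j \rangle = v_k,$$
since ##\langle u_k | u_i \rangle = \delta_{ki}## and ##\langle u_k | w_j \rangle = 0##; applying ##\langle w_k |## instead kills the first sum and forces the coefficients of the second sum to vanish.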
Thus the dimension of the sum of subspaces is exactly ##n_1+n_2##.
This proof seems sound to me, and I hope it is. But aside from checking it, I am interested in the hint about the triangle inequality: how does one use it in this case to prove this fact?