Proving Orthogonal Bases Homework Statement

In summary: I don't understand how to use the rules to evaluate v1 ⋅ v2. I know what v1 and v2 are, but I don't understand how to break them down into linear combinations of the basis vectors or what rules to use. With v1 = a1b1 + a2b2 + ... + akbk and v2 = c1b1 + c2b2 + ... + ckbk, we have v1 ⋅ v2 = (a1b1 + a2b2 + ... + akbk) ⋅ (c1b1 + c2b2 + ... + ckbk).
  • #71
LosTacos said:
Problem: Let B be an ordered orthonormal basis for a k-dimensional subspace V of ℝn. Prove that for all v1,v2 ∈ V, v1·v2 = [v1]B · [v2]B, where the first dot product takes place in ℝn and the second takes place in ℝk.

Okay so:

$$v_1 \cdot v_2 = (a_1b_1 + a_2b_2 + \cdots + a_kb_k) \cdot (c_1b_1 + c_2b_2 + \cdots + c_kb_k)$$

$$= a_1c_1(b_1 \cdot b_1) + a_2c_2(b_2 \cdot b_2) + \cdots + a_kc_k(b_k \cdot b_k)$$

$$= a_1c_1 + a_2c_2 + \cdots + a_kc_k$$

$$= [v_1]_B \cdot [v_2]_B$$

What you have here is the right side of the equation you're trying to prove. In other words, what you have shown is that ##[v_1]_B \cdot [v_2]_B = a_1c_1 + a_2c_2 + \cdots + a_kc_k##.

What you need to do is work with the other side of the equation, v1 ⋅ v2, taking into consideration that, although v1 and v2 are vectors in the k-dimensional subspace V, they are also vectors in the n-dimensional vector space ℝn.
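For reference, writing out the k = 2 case in full shows where the cross terms in that expansion go. This is only an illustrative special case, assuming b1 and b2 are orthonormal, not the general proof:
$$v_1\cdot v_2=(a_1b_1+a_2b_2)\cdot(c_1b_1+c_2b_2)=a_1c_1(b_1\cdot b_1)+a_1c_2(b_1\cdot b_2)+a_2c_1(b_2\cdot b_1)+a_2c_2(b_2\cdot b_2)=a_1c_1+a_2c_2,$$
since ##b_1\cdot b_1=b_2\cdot b_2=1## and ##b_1\cdot b_2=b_2\cdot b_1=0##.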
 
  • #72
LosTacos said:
So,

[v1]B ⋅ [v2]B = [a1c1, a2c2, ... , akck]

Since each v is expressed as the coordinatization with respect to basis B, these can just be expanded to the linear combination of each and therefore = v1 ⋅ v2.
This part would require more explanation. (Edit: Actually, it's wrong. See below). Of course if you do the calculation one step at a time, then all you have to do to "go in the other direction" is to read the string of equalities from right to left.
 
  • #73
LosTacos said:
So,

[v1]B ⋅ [v2]B = [a1c1, a2c2, ... , akck]

Since each v is expressed as the coordinatization with respect to basis B, these can just be expanded to the linear combination of each and therefore = v1 ⋅ v2.
Fredrik said:
This part would require more explanation. Of course if you do the calculation one step at a time, then all you have to do to "go in the other direction" is to read the string of equalities from right to left.
The equation above doesn't make sense to me. I'm reading the right side as a vector, which doesn't make sense as the output of a dot product.
 
  • #74
Mark44 said:
The equation above doesn't make sense to me. I'm reading the right side as a vector, which doesn't make sense as the output of a dot product.
Ah, yes you're right. I was too fast on the trigger there.
 
  • #75
Fredrik said:
Ah, yes you're right. I was too fast on the trigger there.
I can say from personal experience, it happens.
 
  • #76
I am confused as to what is correct for the reverse direction.

From the definition of coordinatization,
Let B = (b1, b2, ..., bk) be an ordered basis. Suppose v1 = a1b1 + a2b2 + ... + akbk. Then [v1]B, the coordinatization of v1 with respect to B, is the k-vector [a1, a2, ..., ak].

So if this holds for [v2]B as well, the dot product will give me

[v1]B ⋅ [v2]B = [a1, a2, ..., ak] ⋅ [c1, c2, ..., ck].

So, when doing the dot product, why do the non-identical terms cancel out? Or is this not the right approach?
 
  • #77
LosTacos said:
I am confused as to what is correct for the reverse direction.

From the definition of coordinatization,
Let B = (b1, b2, ..., bk) be an ordered basis. Suppose v1 = a1b1 + a2b2 + ... + akbk. Then [v1]B, the coordinatization of v1 with respect to B, is the k-vector [a1, a2, ..., ak].

So if this holds for [v2]B as well, the dot product will give me

[v1]B ⋅ [v2]B = [a1, a2, ..., ak] ⋅ [c1, c2, ..., ck].

So, when doing the dot product, why do the non-identical terms cancel out? Or is this not the right approach?
This is fine, but it's much harder to see what the next step is when you start at this end. If you want to see what the next step is, all you have to do is to write out all the steps of the previous calculation, the one that started with

v1 ⋅ v2 = ...

and ended with

= [v1]B ⋅ [v2]B. Now you can just read this string of equalities from right to left, and you should see it. (It might still be somewhat hard to see it, because of the strange way you have continued to write the calculation, in spite of what I said in #43).
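Concretely, reading the same chain of equalities from right to left in summation notation looks like this; it is offered only as a signpost for the reverse direction, since it restates the calculation above rather than adding anything new:
$$[v_1]_B\cdot[v_2]_B=\sum_{i=1}^k a_ic_i=\sum_{i=1}^k\sum_{j=1}^k a_ic_j\,(b_i\cdot b_j)=\Big(\sum_{i=1}^k a_ib_i\Big)\cdot\Big(\sum_{j=1}^k c_jb_j\Big)=v_1\cdot v_2,$$
where the second equality holds because ##b_i\cdot b_j=0## whenever ##i\neq j## and ##b_i\cdot b_i=1##.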
 
  • #78
Well, to prove that each side is equal to the other, I have to prove that each is a subset of the other. In essence, prove it both ways. So when doing the dot product of [a1, a2, ..., ak] ⋅ [c1, c2, ..., ck], are you saying that I need to create 9 different products for this example? And then, because it is orthonormal, 6 of them cancel out and the other 3 are left, which then shows that this equals v1 ⋅ v2?
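A minimal numerical sketch of this "9 products, 6 vanish" picture, assuming k = 3 and a subspace of ℝ5, using NumPy and arbitrary illustrative numbers (none of these values come from the problem itself):

```python
import numpy as np

# Orthonormal basis b1, b2, b3 for a 3-dimensional subspace of R^5:
# the QR factorization of a random 5x3 matrix has orthonormal columns.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))
b = [Q[:, j] for j in range(3)]

a = np.array([2.0, -1.0, 3.0])   # illustrative coordinates of v1 with respect to B
c = np.array([0.5, 4.0, -2.0])   # illustrative coordinates of v2 with respect to B

v1 = sum(a[i] * b[i] for i in range(3))   # v1 as a vector in R^5
v2 = sum(c[i] * b[i] for i in range(3))   # v2 as a vector in R^5

# The nine products a_i * c_j * (b_i . b_j): the six with i != j are
# (numerically) zero because b_i . b_j = 0, leaving the three diagonal ones.
terms = np.array([[a[i] * c[j] * np.dot(b[i], b[j]) for j in range(3)]
                  for i in range(3)])
print(np.round(terms, 10))

# Both sides agree: v1 . v2 computed in R^5 equals [v1]_B . [v2]_B computed in R^3.
print(np.dot(v1, v2), np.dot(a, c))
```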
 
  • #79
LosTacos said:
Well, to prove that each side is equal to the other, I have to prove that each is a subset of the other. In essence, prove it both ways.
I was wondering why you were talking about "both ways". You don't have to do anything like that here. It's true that every equality in mathematics (at least in the branch of mathematics defined by ZFC set theory) is an equality between sets, and that this means that the equality
v1 ⋅ v2 = [v1]B ⋅ [v2]B
holds if and only if every element of the left-hand side is an element of the right-hand side, and vice versa.

This doesn't mean that you haven't already proved the equality above. You have. You did it by proving a string of equalities
v1 ⋅ v2 = X = Y = Z = [v1]B ⋅ [v2]B.​
This is sufficient since equality is a transitive relation. (If x = y and y = z, then x = z.)

Another way of looking at it is that the first of these equalities means that every element of v1 ⋅ v2 is an element of X and that every element of X is an element of v1 ⋅ v2. That's just what the equality sign means. The other equalities can be interpreted similarly. Because of this, there's no need to do anything more than to prove this string of equalities.

I thought that what you wanted to do was just to start with [v1]B ⋅ [v2]B, and from there use the definitions of [v1]B, [v2]B, and the dot product to discover each step in the string of equalities in the opposite order compared to before. This is why I said that all you need to do to see the steps is to read the string of equalities from right to left.
 
  • #80
Did you understand that explanation? Is everything clear now? Do you understand why we found the way you wrote the solution confusing?

Since you solved the problem, I'm going to show you how I would have done it. I typed this up a week ago. I just didn't want to post it until you had worked out the solution for yourself.

Define ##v_1,\cdots,v_k\in V## by ##B=(v_1,\cdots,v_k)##. Let ##x,y\in V## be arbitrary. Let ##x_1,\dots,x_k## and ##y_1,\dots,y_k## be the unique real numbers such that ##x=\sum_{i=1}^k x_i v_i## and ##y=\sum_{i=1}^k y_i v_i##. We have
$$x\cdot y=\bigg(\sum_{i=1}^k x_i v_i\bigg)\cdot\bigg(\sum_{j=1}^k y_j v_j\bigg)=\sum_{i=1}^k\sum_{j=1}^k x_i y_j \underbrace{v_i\cdot v_j}_{=\delta_{ij}}=\sum_{i=1}^k x_i y_i =[x]_B\cdot[y]_B.$$
In case you're not familiar with the Kronecker delta, it's defined by
$$\delta_{ij}=\begin{cases}1 &\text{if }i=j\\0 &\text{if }i\neq j.\end{cases}$$
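If the step where the double sum collapses looks abrupt: the factor ##\delta_{ij}## kills every term with ##j\neq i##, so for each fixed ##i## only the ##j=i## term survives:
$$\sum_{i=1}^k\sum_{j=1}^k x_iy_j\,\delta_{ij}=\sum_{i=1}^k x_iy_i\,\delta_{ii}=\sum_{i=1}^k x_iy_i.$$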
 
