Multi-variable calculus orthogonality problem

In summary, this thread discusses how to prove that if the curve p:R->R^n and the vector v∈R^n satisfy p'(t)·v = 0 for all t, and additionally p(0)·v = 0, then p(t) and v are orthogonal for all t. This can be shown by applying the product rule for the dot product and using the fact that v is a constant vector.
  • #1
K29

Homework Statement


The curve [tex]p:R\to R^{n}[/tex] and the vector [tex]v \in R^{n}[/tex] satisfy the following: [tex]v[/tex] and [tex]p'(t)[/tex] are orthogonal for all [tex]t[/tex], and [tex]p(0)[/tex] is orthogonal to [tex]v[/tex].
Prove that [tex]p(t)[/tex] and [tex]v[/tex] are orthogonal for all [tex]t[/tex].


Homework Equations


The previous part of the same main question (i.e. 2(a); we are now working on 2(b)) was the following proof, whose result may be relevant:
[tex](u\cdot v)' = u\cdot v' +v\cdot u'[/tex]
[tex]u,v\in R^{n}[/tex]


The Attempt at a Solution


I can see that this should be true; I'm just struggling with how to prove it.
I've applied the above relation to both of the given dot products and set them equal to zero, since the vectors are orthogonal. I've also expanded
[tex](p(t) \cdot v)'[/tex] with the same relation, and I end up with four equations which may be useful:
[tex]v\cdot p'(t)=0[/tex]
[tex]v\cdot p(0)=0[/tex]
[tex](v\cdot p(0))'=v\cdot p(0)'+p(0)\cdot v'=0[/tex]
[tex](v\cdot p(t))'=v\cdot p'(t)+p(t)\cdot v'[/tex]
However, I see no way to relate [tex]p(t)\cdot v[/tex] itself (rather than its derivative) using this method. I've substituted the above four equations into each other in various ways, but there is a fundamental relationship that I think I am missing, something that links p(0) to everything else.
I've also tried writing the last two equations as summations (the dot product definition), and as the full matrix that the derivatives would yield, but the result is much the same dead end, other than the fact that I can pull the derivative out of the summations on the left-hand side of the equations.

Help pls?
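As a sanity check (not a proof), the claim can be tested numerically on a hypothetical concrete example of my choosing: take v = (0, 0, 1) and the curve p(t) = (cos t, sin t, 0), which satisfies both hypotheses.

```python
import math

# Hypothetical example curve and vector (chosen for illustration):
# p(t) = (cos t, sin t, 0), so p'(t) = (-sin t, cos t, 0), and v = (0, 0, 1).
def p(t):
    return (math.cos(t), math.sin(t), 0.0)

def p_prime(t):
    return (-math.sin(t), math.cos(t), 0.0)

v = (0.0, 0.0, 1.0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

for t in [0.0, 0.5, 1.0, 2.0, 3.14]:
    # Hypothesis 1: p'(t) . v = 0 for all t.
    assert abs(dot(p_prime(t), v)) < 1e-12
    # Claim to prove: p(t) . v = 0 for all t.
    assert abs(dot(p(t), v)) < 1e-12

# Hypothesis 2: p(0) . v = 0.
assert abs(dot(p(0.0), v)) < 1e-12
```

Of course this only checks one curve; the actual proof has to work for every p satisfying the hypotheses.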
 
  • #2
Isn't this much easier than it looks?
Applying the identity for u = p(t), v = v gives you
[tex]
(p(t) \cdot v)' = p'(t) \cdot v + p(t) \cdot v'
[/tex]

What does it mean for two vectors to be orthogonal?
Now what can you say about [itex]p'(t) \cdot v[/itex] and about v'?
 
  • #3
If they are orthogonal, the dot product is zero, because the vectors sit perpendicularly at 90 degrees and [tex]|u||v|\cos(90^{\circ})=0[/tex].

We know that [tex]p'(t)\cdot v[/tex] is zero.
This leaves:
[tex](p(t) \cdot v)' = p(t) \cdot v'[/tex]

Unfortunately I don't see anything specific about v', other than that it's perpendicular only to p(0), which was given...
 
  • #4
Isn't v a constant vector?
 
  • #5
Never mind, solved. Thank you for the constant-vector clue, I missed that. :) I have difficulty seeing the subtlety of some proofs.
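For the record, here is a sketch of how the finished argument goes. Since v is a constant vector, v' = 0, so the product rule from part 2(a) gives

[tex](p(t)\cdot v)' = p'(t)\cdot v + p(t)\cdot v' = 0 + 0 = 0[/tex]

Hence [tex]p(t)\cdot v[/tex] is constant in t, and therefore

[tex]p(t)\cdot v = p(0)\cdot v = 0[/tex]

for all t, which is exactly the orthogonality we wanted.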
 

FAQ: Multi-variable calculus orthogonality problem

What is the multi-variable calculus orthogonality problem?

The multi-variable calculus orthogonality problem involves finding the orthogonal basis for a set of vectors in a multi-dimensional space. This is important in many mathematical applications, such as optimization and linear algebra.

How is the multi-variable calculus orthogonality problem different from single-variable calculus?

In single-variable calculus, the focus is on finding the derivative and integral of a function. In multi-variable calculus, there are multiple variables involved, and the focus is on finding the partial derivatives and multiple integrals of a function. The orthogonality problem arises when trying to find the orthogonal basis for these multiple variables.

What is the importance of solving the multi-variable calculus orthogonality problem?

Solving the orthogonality problem allows us to find the best representation of a set of vectors in a multi-dimensional space. This can be useful in applications such as data analysis, image processing, and optimization problems.

What are some common methods used to solve the multi-variable calculus orthogonality problem?

Some common methods include the Gram-Schmidt process, QR factorization, and Singular Value Decomposition (SVD). These methods involve finding the orthogonal basis through various calculations and algorithms.

How is the multi-variable calculus orthogonality problem applied in real-world situations?

The orthogonality problem has various applications in fields such as engineering, physics, and economics. For example, in signal processing, finding the orthogonal basis can help in noise reduction and data compression. In economics, it can be used in regression analysis to find the best fit for a set of data points.
