In mathematics, orthogonality is the generalization of the notion of perpendicularity to the linear algebra of bilinear forms. Two elements u and v of a vector space with bilinear form B are orthogonal when B(u, v) = 0. Depending on the bilinear form, the vector space may contain nonzero self-orthogonal vectors. In the case of function spaces, families of orthogonal functions are used to form a basis.
By extension, orthogonality is also used to refer to the separation of specific features of a system. The term also has specialized meanings in other fields including art and chemistry.
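For example, the bilinear form B(u, v) = u_1 v_1 - u_2 v_2 on R^2 admits nonzero self-orthogonal vectors:
B\big((1,1),(1,1)\big) = 1\cdot 1 - 1\cdot 1 = 0.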
I've been banging my head against this problem for some time now, and I just can't solve it. The problem seems fairly simple, but for some reason I don't get it.
Given the coordinate transformation matrix
A=\left(...
After 10 years of teaching middle school, I am going back to grad school in math. I haven't seen Linear Algebra in more than a decade, but my first class is on Generalized Inverses of Matrices (what am I thinking?). I have a general "remembrance" understanding of most of the concepts we're...
I need some help understanding what I need to do to solve these problems; I can't get them started.
1. Find an orthogonal set of vectors that spans the same subspace as a, b, c.
a=(1,1,-1)
b=(-2,-3,1)
c=(-1,-2,0)
2. Use the Gram-Schmidt process to find an orthogonal basis that...
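A sketch of how problem 1 might start, using the usual Gram-Schmidt projections on the given vectors (the labels v_1, v_2, v_3 are only introduced here for the sketch):
v_1 = a = (1,1,-1)
v_2 = b - \frac{b \cdot v_1}{v_1 \cdot v_1} v_1 = (-2,-3,1) - \frac{-6}{3}(1,1,-1) = (0,-1,-1)
v_3 = c - \frac{c \cdot v_1}{v_1 \cdot v_1} v_1 - \frac{c \cdot v_2}{v_2 \cdot v_2} v_2 = (-1,-2,0) + (1,1,-1) - (0,-1,-1) = (0,0,0)
Since v_3 comes out to the zero vector, c already lies in the span of a and b, so \{(1,1,-1),\ (0,-1,-1)\} is an orthogonal set spanning the same subspace.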
I have the set
S = \text{span}\{ (0, 1, -1, 1)^{T} \}
And I need to find the orthogonal complement of the set.
It seems like it should be straightforward, but I'm a bit confused. I know that S is a subspace of R^4, and that there should be three free variables.
What I did so far is to take the...
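A sketch of one way to set it up: a vector x = (x_1, x_2, x_3, x_4) lies in S^\perp exactly when it is orthogonal to the spanning vector, i.e.
0\cdot x_1 + 1\cdot x_2 - 1\cdot x_3 + 1\cdot x_4 = 0,
so x_2 = x_3 - x_4 with x_1, x_3, x_4 free (the three free variables), giving, for instance, the basis \{(1,0,0,0),\ (0,1,1,0),\ (0,-1,0,1)\} for S^\perp.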
I've got a question regarding orthogonal matrices. I am given an orthogonal matrix M, and a symmetric matrix A. I need to prove that (M^-1)*A*M is also symmetric (all of the matrices are n x n). I know that for an orthogonal matrix, its inverse is equal to its transpose. Can anyone give me...
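A sketch of the standard argument, using only M^{-1} = M^T and A^T = A:
\big(M^{-1} A M\big)^T = M^T A^T (M^{-1})^T = M^{-1} A (M^T)^T = M^{-1} A M,
so M^{-1} A M equals its own transpose, i.e., it is symmetric.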
I have a question on matrix norms and orthogonal transformations. The 2-norm is invariant under orthogonal transformation, since Q^T Q = I. But I have trouble showing that for orthogonal Q and Q^H with appropriate dimensions
\| Q^H A Q \|_2 = \| A \|_2
Please help.
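A sketch, assuming Q is square and unitary (Q^H Q = Q Q^H = I): multiplication by Q or by Q^H preserves the 2-norm of every vector, since \|Qx\|_2^2 = x^H Q^H Q x = \|x\|_2^2, and likewise for Q^H. Then
\| Q^H A Q \|_2 = \max_{x \neq 0} \frac{\| Q^H A Q x \|_2}{\| x \|_2} = \max_{x \neq 0} \frac{\| A (Q x) \|_2}{\| Q x \|_2} = \max_{y \neq 0} \frac{\| A y \|_2}{\| y \|_2} = \| A \|_2,
where the substitution y = Qx is allowed because Q is invertible.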
Hi
I want to know how to solve a first-order partial differential equation (PDE) with constant coefficients using an orthogonal transformation.
Example:
Solve: 2U_x + 2U_y + U_z = 0
Thanks
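A possible sketch, taking "orthogonal transformation" to mean an orthogonal change of coordinates (the names \xi, \eta, \zeta are only introduced for the sketch): the equation says (2,2,1)\cdot\nabla U = 0, so U is constant along the direction (2,2,1). Choosing new coordinates along the mutually orthogonal directions (2,2,1), (1,-1,0), (1,1,-4), say
\xi = 2x + 2y + z, \quad \eta = x - y, \quad \zeta = x + y - 4z,
the equation reduces to 9U_\xi = 0, i.e., U_\xi = 0, so U = F(\eta, \zeta) = F(x - y,\ x + y - 4z) for an arbitrary differentiable function F of two variables.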
Hello, I have a "simple" problem for you guys. I am not an expert in math, so please keep it simple.
I explain the problem by starting with one example. The polar coordinate system has the following main property: with two parameters, rho and theta, each point is described as the intersection of...
Let \{E_1, E_2, \ldots, E_n\} be an orthogonal basis of R^n. Given k, 1 \le k \le n, define P_k : R^n \to R^n by P_{k}(r_{1} E_{1} + \cdots + r_{n} E_{n}) = r_{k} E_{k}. Show that P_{k} = \mbox{proj}_{U} where U = \mbox{span}\{E_k\}.
well \mbox{proj}_{U} \vec{m} = \sum_{i} \frac{ \vec{m} \bullet \vec{u}_{i}}{||\vec{u}_{i}||^2} \vec{u}_{i}
right...
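That formula does the job. A sketch of the remaining step, applied to U = \mbox{span}\{E_k\} (a single spanning vector) and v = r_1 E_1 + \cdots + r_n E_n:
\mbox{proj}_{U}\, v = \frac{v \cdot E_k}{\|E_k\|^2} E_k = \frac{r_k \|E_k\|^2}{\|E_k\|^2} E_k = r_k E_k = P_k v,
using v \cdot E_k = r_k\, E_k \cdot E_k, which follows because E_i \cdot E_k = 0 for i \neq k.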
Let T in L(V) be an idempotent linear operator on a finite dimensional inner product space. What does it mean for T to be "the orthogonal projection onto its image"?
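One common reading of that phrase, stated here as a sketch: T is the orthogonal projection onto its image when, in addition to T^2 = T, the kernel is the orthogonal complement of the image,
\ker T = (\mathrm{im}\, T)^{\perp},
equivalently, when the idempotent T is self-adjoint (T^* = T); then every v decomposes as v = Tv + (v - Tv) with Tv \in \mathrm{im}\, T and v - Tv \perp \mathrm{im}\, T.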
Show that the functions sin x, sin 2x, sin 3x, ... are orthogonal on the interval (0,pi) with respect to p(x) = 1 (where p is supposed to be rho)
I know I have to use this:
\int_{0}^{\pi} \phi(x) \psi(x) \rho(x)\, dx = 0, and I have no trouble doing it for n = 1 and n = 2,
but how would I go...
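A sketch of the general case, for integers m \neq n, using the product-to-sum identity:
\int_{0}^{\pi} \sin mx \,\sin nx \, dx = \frac{1}{2}\int_{0}^{\pi} \big[\cos(m-n)x - \cos(m+n)x\big]\, dx = \frac{1}{2}\left[\frac{\sin(m-n)x}{m-n} - \frac{\sin(m+n)x}{m+n}\right]_{0}^{\pi} = 0,
since \sin(k\pi) = 0 for every integer k.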
Hello again,
This question confuses me for some reason. I read the questions and they sound too simple and too easy to answer. So maybe it's something I am reading wrong and not answering. Help would be greatly appreciated.
first off
Let O(n) = { A | A is an n x n matrix with A^t A = I } be the...
Find an equation of a plane through the point (-1, -2, -3) which is orthogonal to the line x=5+2t,y=-3-5t,z=2-2t
in which the coefficient of x is 2.
______________________________=0
I don't get this problem at all, but here's what I came up with after sitting here at the computer for a...
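A sketch of one standard approach: the direction vector of the line, (2, -5, -2), can serve as the normal vector of the plane (its x-coefficient is already 2), so the plane through (-1, -2, -3) is
2(x+1) - 5(y+2) - 2(z+3) = 0, \quad \text{i.e.} \quad 2x - 5y - 2z - 14 = 0.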
Find a vector orthogonal to both <-3,2,0> and to <0,2,2> of the form
<1,_,_> (supposed to fill in the blanks)
Well, I thought the cross product would do the trick, but I keep getting the wrong answer.
\mathbf{i}\begin{vmatrix} 2 & 0 \\ 2 & 2 \end{vmatrix} - \mathbf{j}\begin{vmatrix} -3 & 0 \\ 0 & 2 \end{vmatrix} + \mathbf{k}\begin{vmatrix} -3 & 2 \\ 0 & 2 \end{vmatrix}
(format is kinda messed up...
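For comparison, carrying that cofactor expansion through (as a sketch, not a guaranteed match to the expected answer format):
\langle -3, 2, 0\rangle \times \langle 0, 2, 2\rangle = \langle 2\cdot 2 - 0\cdot 2,\ -\big((-3)\cdot 2 - 0\cdot 0\big),\ (-3)\cdot 2 - 2\cdot 0 \rangle = \langle 4, 6, -6\rangle,
which is orthogonal to both given vectors; dividing by 4 to make the first component 1 gives \langle 1, 3/2, -3/2\rangle.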
Hi everybody,
I read that sin(2x) and sin(3x) are orthogonal to each other.
In general, if I want to check whether two functions are orthogonal, I must integrate their product.
First: why the integral of their product (and not, for example, their sum)?
Second: Orthogonal...
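On the first question, a brief sketch of the usual motivation: the integral of the product is the continuous analogue of the dot product \sum_i u_i v_i of coordinate vectors, so it defines an inner product on functions,
\langle f, g \rangle = \int_{a}^{b} f(x)\, g(x)\, dx,
and "orthogonal" means this inner product is zero, e.g. \int_{0}^{\pi} \sin 2x \,\sin 3x\, dx = \frac{1}{2}\int_{0}^{\pi}(\cos x - \cos 5x)\, dx = 0.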
find the orthogonal trajectories of the following
(a) x^2y=c_1
(b) x^2+c_{1}y^3=1
for part (a) I've found y=\frac{1}{2}\log{|x|} + C_2
for part (b) if i solve this integral this should be the O.T.
\frac{3}{2}\int{(\frac{1}{x^2}-1)}dx= \frac{y^2}{2}
is this correct?
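For comparison, a sketch of the standard route for part (a): differentiating x^2 y = c_1 implicitly gives the slope of the family, and the orthogonal trajectories have the negative reciprocal slope:
2xy + x^2 y' = 0 \;\Rightarrow\; y' = -\frac{2y}{x}, \qquad \text{orthogonal: } y' = \frac{x}{2y} \;\Rightarrow\; 2y\, dy = x\, dx \;\Rightarrow\; y^2 = \frac{x^2}{2} + C_2.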
A theorem says:
if W is a subspace of V, then V = the direct sum of W and CW (the orthogonal complement of W),
i.e., for all v ∈ V there exist w ∈ W and w' ∈ CW such that v = w + w'.
Does it mean that we can write a function as a sum of two orthogonal functions?
Also, I don't know the proof...
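One concrete instance of that decomposition for functions, as an illustration: in the inner product space L^2(-1,1), the even functions form a subspace W whose orthogonal complement CW is the odd functions (the product of an even and an odd function integrates to zero over a symmetric interval), and every f splits as
f(x) = \frac{f(x) + f(-x)}{2} + \frac{f(x) - f(-x)}{2},
with the first term in W and the second in CW; for example e^x = \cosh x + \sinh x.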
Is it correct to assume that there is no such thing as a non-orthogonal basis? The orthogonal eigenbasis is the "easiest" to work with, but generally, to be a basis, a set of vectors has to be linearly independent and span the space, and being "linearly independent" means orthogonal.
Is it correct?
Thanks.
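A small counterexample worth checking against that assumption: in R^2 the vectors (1,0) and (1,1) are linearly independent and span the plane, so they form a basis, yet
(1,0)\cdot(1,1) = 1 \neq 0,
so a basis need not be orthogonal; linear independence is a strictly weaker condition than orthogonality (though every orthogonal set of nonzero vectors is linearly independent).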
How do I prove that two curves are orthogonal when they intersect each other at a specific point?
Do I just take the derivative of both and compare the slopes?
The slopes should be negative reciprocals of each other, correct?
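Yes, that is the usual test at a point where both slopes exist. A small worked illustration: the line y = x and the circle x^2 + y^2 = 2 meet at (1,1);
y = x \Rightarrow y' = 1, \qquad x^2 + y^2 = 2 \Rightarrow 2x + 2y\,y' = 0 \Rightarrow y' = -\frac{x}{y} = -1 \text{ at } (1,1),
and the product of the slopes is -1, so the curves are orthogonal there (if one tangent is vertical, check instead that the other is horizontal).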
Hello, could someone please give me some help with the following question?
Q. Determine all planes (in R³) orthogonal to the vector (1,1,1).
This is how I started off, but I am not really sure how I need to go about solving this problem. I begin (by somewhat assuming that the vector (1,1,1)...
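A sketch of where that line of thought should end up: a plane is orthogonal to (1,1,1) exactly when (1,1,1) is a normal vector of the plane, so the planes in question are
x + y + z = d, \qquad d \in \mathbb{R}.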
Hi, this might be very easy, but I forgot how to do the following: I have a vector in R^6: (x1, x2, x3, x4, x5, x6). How do I find a vector such that their dot product vanishes? I know how to do it for the two dimensional case: (x1, x2), so the vector that is perpendicular to it is c(-x2, x1)...
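The two-dimensional trick generalizes directly, as a sketch: pick two coordinates, swap them and negate one, and zero out the rest. For instance
(-x_2,\ x_1,\ 0,\ 0,\ 0,\ 0)\cdot(x_1, x_2, x_3, x_4, x_5, x_6) = -x_2 x_1 + x_1 x_2 = 0,
which gives a nonzero orthogonal vector whenever x_1 or x_2 is nonzero; otherwise use another pair of coordinates (e.g. (0,0,-x_4,x_3,0,0)), or simply any standard basis vector e_i with x_i = 0.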
Why can't a timelike vector and a null vector be orthogonal?
Isn't a null vector orthogonal to any vector, by definition? Anyway, each component of a vector is multiplied by zero, so in the end the sum is zero.
Hey there, I'm working on questions from a sample review for finals. I'm stuck on these three; I think I'm starting to confuse all the different theorems. I'm so lost, please help.
1) Find the coordinate vector of the polynomial
p(x)=1+x+x^2
relative to the following basis of P2:
p1=1+x...
Find the matrix A of the orthogonal projection onto the line L in R2 that consists of all scalar multiples of the vector [2 5]^T.
OK...I really don't know how to start off with this problem. If somehow could just help me out there I will try to muddle my way through the rest ! Thanks.
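A sketch of the standard construction: for a line spanned by a single vector v, the orthogonal projection matrix is A = \frac{v v^T}{v^T v}; with v = [2\ \ 5]^T this gives
A = \frac{1}{29}\begin{pmatrix} 4 & 10 \\ 10 & 25 \end{pmatrix},
since v^T v = 29 and v v^T = \begin{pmatrix} 4 & 10 \\ 10 & 25 \end{pmatrix}; as a check, A^2 = A and A\,[2\ \ 5]^T = [2\ \ 5]^T.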
Here is the problem:
Determine the orthogonal trajectories of the given family of curves.
y = \sqrt{2\ln{|x|}+C}
This is what I've done so far:
y = (2\ln{|x|}+C)^{\frac{1}{2}}
y' = \frac{1}{2}(2\ln{|x|}+C)^{-\frac{1}{2}} \cdot \frac{2}{x}
Now I understand to find the orthogonal lines I need to divide -1 by...
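A sketch of how the computation might continue: since y = \sqrt{2\ln{|x|}+C}, it is easier to differentiate y^2 = 2\ln{|x|}+C, which gives
2y\,y' = \frac{2}{x} \;\Rightarrow\; y' = \frac{1}{xy}, \qquad \text{orthogonal trajectories: } y' = -xy \;\Rightarrow\; \frac{dy}{y} = -x\, dx \;\Rightarrow\; y = C_2\, e^{-x^2/2}.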
Let \vec{B} be a vector from the origin to a point D fixed in space. Let \vec{W} be a vector from the origin to a variable point Q(x,y,z). Show that \vec{W} \cdot \vec{B} = B^2 is the equation of a plane perpendicular to \vec{B} and passing through D.
Thank you for any help
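A sketch of the argument, writing B = |\vec{B}|:
\vec{W}\cdot\vec{B} = B^2 = \vec{B}\cdot\vec{B} \;\Longleftrightarrow\; (\vec{W} - \vec{B})\cdot\vec{B} = 0,
so a point Q satisfies the equation exactly when its displacement \vec{W} - \vec{B} from D is perpendicular to \vec{B}; that is the plane through D with normal vector \vec{B}.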
Ok, I'm working with the Pauli Matrices, and I've already gone through showing a few bits of information. I've got a good idea how to keep going, but I'm not exactly sure about this one--
say M = \frac{1}{2}(\alpha I + \vec{a}\cdot\vec{\sigma})
where \alpha \in \mathbb{C}, \vec{a} = (a_x, a_y, a_z) is a complex vector, and \vec{a}\cdot\vec{\sigma} = a_x \sigma_x + a_y...
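For reference (a sketch, using the standard representation \sigma_x = \begin{pmatrix}0&1\\1&0\end{pmatrix}, \sigma_y = \begin{pmatrix}0&-i\\i&0\end{pmatrix}, \sigma_z = \begin{pmatrix}1&0\\0&-1\end{pmatrix}), M can be written out explicitly as
M = \frac{1}{2}\begin{pmatrix} \alpha + a_z & a_x - i a_y \\ a_x + i a_y & \alpha - a_z \end{pmatrix},
which is sometimes easier to manipulate than the abstract \alpha I + \vec{a}\cdot\vec{\sigma} form.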
Can anyone give me a physical interpretation of what orthogonal eigenfunctions are please? I understand the mathematical idea, the overlap integral, but I'm not clear about what it implies for the different states. At the moment the way I'm thinking of it is that the energy eigenfunctions of an...
If each spacetime point p_i can be associated with a constant force f_i then the interaction \sum_{i=1}^\infty f_i between points can be described with the use of orthogonal forces.