Prove l*conj(l)=1 for Orthogonal Matrix A

In summary: to prove that l * conj(l) = r^2 + s^2 = 1 for an eigenvalue l of an orthogonal matrix A, we first need to pin down the definition of an orthogonal matrix. Some texts define it as a matrix that preserves the inner product of any two vectors; others work with unitary matrices, the analogous matrices with complex entries. Using the first definition, we can show that ||Ax|| = ||x|| for any vector x, so every eigenvalue of an orthogonal matrix has length 1; applying this to an eigenvector leads to the conclusion that l * conj(l) = r^2 + s^2 = 1. If your text uses some other definition you may have to do a bit more work.
  • #1
phrygian

Homework Statement

Let l be an eigenvalue of an orthogonal matrix A, where l = r + is. Prove that l * conj(l) = r^2 + s^2 = 1.

Homework Equations

The Attempt at a Solution

I am really confused about where to go with this one.

I have Ax = A I x = A A^T A x = l^3 x

and Ax = l x so l x = l^3 x

l = l^3

l^2 = 1
l = 1 or -1

But I can't really figure out what to do from here, am I even on the right track?

Thanks for the help
 
  • #2
Hi, I am just studying linear algebra (the final is coming next week, so stressed!).

I think the statement l*conjugate(l) = r^2 + s^2 = 1 just means that the length of l is 1.

But I think that if a matrix with complex entries, say M, is orthogonal, that means M is unitary. So the length of every eigenvalue is 1.

Since a unitary matrix won't change a vector's length, its eigenvalues' lengths are always 1.
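
For later readers, spelled out, that length argument is short: if [tex]U[/tex] is unitary (or orthogonal) and [tex]U \vec x = \lambda \vec x[/tex] with [tex]\vec x \neq 0[/tex], then

[tex]\|\vec x\| = \|U \vec x\| = \|\lambda \vec x\| = |\lambda| \, \|\vec x\|[/tex]

so [tex]|\lambda| = 1[/tex], i.e. [tex]\lambda \bar{\lambda} = r^2 + s^2 = 1[/tex].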
 
  • #3
fanxiu said:
But I think that if a matrix with complex entries, say M, is orthogonal, that means M is unitary. So the length of every eigenvalue is 1.

Since a unitary matrix won't change a vector's length, its eigenvalues' lengths are always 1.
phrygian cannot use this fact; he/she has to prove it.

phrygian, the cubing route isn't really going to help here. You went one step too far.

You know that [tex]\mathbf A \vec x = \lambda \vec x[/tex]. The conjugate transpose of the right-hand side is [tex](\lambda \vec x)^* = \vec x^*\lambda^*[/tex]. What is the matrix product of this conjugate transpose with [tex] \lambda \vec x[/tex]?
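
(For reference, here is a sketch of where this hint leads, assuming [tex]\mathbf A[/tex] is real with [tex]\mathbf A^T \mathbf A = \mathbf I[/tex]. On one hand,

[tex](\lambda \vec x)^* (\lambda \vec x) = \vec x^* \lambda^* \lambda \vec x = |\lambda|^2 \, \vec x^* \vec x.[/tex]

On the other hand, replacing [tex]\lambda \vec x[/tex] with [tex]\mathbf A \vec x[/tex] and using [tex]\mathbf A^* = \mathbf A^T[/tex] for a real matrix,

[tex](\mathbf A \vec x)^* (\mathbf A \vec x) = \vec x^* \mathbf A^T \mathbf A \vec x = \vec x^* \vec x.[/tex]

Since [tex]\vec x \neq 0[/tex] implies [tex]\vec x^* \vec x > 0[/tex], comparing the two gives [tex]\lambda^* \lambda = r^2 + s^2 = 1[/tex].)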
 
  • #4
Is there a way to do this without using conjugate transposes? The book has a hint that says to first show that ||Ax|| = ||x|| for any vector x, but I just can't seem to get past that first step.
 
  • #5
What is the precise definition of "orthogonal" matrix? Some texts use that a matrix, Q, is orthogonal if and only if <Qu, Qv> = <u, v> for all vectors u and v (<u, v> is the inner product). From that it is not too hard to show that ||Qv|| = ||v||.
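
With that definition the norm identity is immediate: taking u = v gives

[tex]\|Q\vec v\|^2 = \langle Q\vec v, Q\vec v\rangle = \langle \vec v, \vec v\rangle = \|\vec v\|^2,[/tex]

so [tex]\|Q\vec v\| = \|\vec v\|[/tex].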
 

FAQ: Prove l*conj(l)=1 for Orthogonal Matrix A

What is an orthogonal matrix?

An orthogonal matrix is a square matrix with real entries whose columns (and rows) form an orthonormal set: the dot product of any two different columns or rows is zero, and each column and row has length 1. Equivalently, A is orthogonal exactly when A^T A = A A^T = I, so the transpose of A is its inverse.

Why is proving l*conj(l)=1 important for an orthogonal matrix A?

Proving l*conj(l)=1 is important because it shows that every eigenvalue of an orthogonal matrix lies on the unit circle in the complex plane: writing l = r + is, the squared modulus r^2 + s^2 equals 1. This is the eigenvalue counterpart of the defining property that an orthogonal matrix preserves lengths, and it is needed in many applications in linear algebra and other fields.

How can we prove l*conj(l)=1 for an orthogonal matrix A?

To prove l*conj(l)=1 for an orthogonal matrix A, we can use the fact that the inverse of an orthogonal matrix is equal to its transpose, so that A^T A = I. From this it follows that ||Ax|| = ||x|| for every vector x, i.e., A preserves lengths. Applying this to an eigenvector x with Ax = lx (allowing x to have complex entries, since l may be complex) gives |l| ||x|| = ||x||, and since x is nonzero, |l| = 1, which is exactly l * conj(l) = r^2 + s^2 = 1.
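
In symbols, a minimal version of the argument (writing [tex]\vec x^*[/tex] for the conjugate transpose of [tex]\vec x[/tex], and using [tex]\mathbf A^* = \mathbf A^T[/tex] since A is real):

[tex]\vec x^* \vec x = \vec x^* \mathbf A^T \mathbf A \vec x = (\mathbf A \vec x)^* (\mathbf A \vec x) = (\lambda \vec x)^* (\lambda \vec x) = \lambda^* \lambda \, \vec x^* \vec x.[/tex]

Dividing by the nonzero number [tex]\vec x^* \vec x[/tex] leaves [tex]\lambda^* \lambda = 1[/tex].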

Can you give an example of an orthogonal matrix?

One example of an orthogonal matrix is the 3x3 matrix that rotates three-dimensional space about the z-axis:

cos θ   -sin θ   0
sin θ    cos θ   0
  0        0     1

where θ is the angle of rotation. This matrix is orthogonal because its columns (and rows) form an orthonormal set, and its inverse is equal to its transpose. Its eigenvalues are e^(iθ), e^(-iθ), and 1, each of modulus 1, illustrating the result proved above.
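
As a quick numerical check, here is a minimal sketch using NumPy (the angle 0.7 is an arbitrary choice for the demonstration) confirming both that this matrix is orthogonal and that each of its eigenvalues l satisfies l*conj(l) = 1:

import numpy as np

theta = 0.7  # arbitrary rotation angle chosen for the demo
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Orthogonality: A^T A should be the 3x3 identity matrix
print(np.allclose(A.T @ A, np.eye(3)))  # True

# Each eigenvalue l = r + is should have modulus sqrt(r^2 + s^2) = 1
eigenvalues = np.linalg.eigvals(A)
print(np.abs(eigenvalues))  # approximately [1. 1. 1.]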

What are some practical applications of orthogonal matrices?

Orthogonal matrices have many practical applications, including in computer graphics, robotics, data compression, and signal processing. They are also used in solving systems of linear equations and in finding eigenvalues and eigenvectors of matrices. Additionally, their complex counterparts, unitary matrices, play the corresponding role in quantum mechanics, representing quantum states and transformations.
