Linear algebra, orthogonal matrix proof

  • #1
fluidistic
Gold Member

Homework Statement


Demonstrate that the following propositions hold if A is an n×n real orthogonal matrix:
1) If [itex]\lambda[/itex] is a real eigenvalue of A, then [itex]\lambda = 1[/itex] or [itex]-1[/itex].
2) If [itex]\lambda[/itex] is a complex eigenvalue of A, the conjugate of [itex]\lambda[/itex] is also an eigenvalue of A.


Homework Equations


For part 1) I used the fact that A orthogonal implies [itex]A^{-1}=A^T[/itex]. Also that [itex]\det A = \frac{1}{\det A^{-1}}[/itex], and that [itex]\det A = \det A^T[/itex]. I didn't prove the latter two relations; I just assumed them to be true.


The Attempt at a Solution


I've done part 1); I was just too lazy at first to write down the detailed proof. I made use of the relevant equations I've written.
However, for 2) I'm totally stuck at even setting up the problem. I started by writing [itex]\lambda = a+ib[/itex] and its conjugate [itex]\overline \lambda = a-ib[/itex], but this led me nowhere; I got stuck immediately.
I think I read on Wikipedia yesterday that the eigenvalues of an orthogonal matrix all have modulus 1. If I remember correctly, A is diagonalizable, and put in that form I should "see that all eigenvalues of A have modulus 1".
If this is the simpler approach, please let me know. I could demonstrate that A is diagonalizable first and then try to go further.
In the demonstration in 1), at one point I reach the fact that [itex]\lambda A^T =I=\lambda A[/itex]. So clearly A is... diagonal (this is not necessarily true so this looks like a mistake... damn it.)

My proof for 1):
I must show that [itex]Ax=\lambda x \Rightarrow \lambda = \pm 1 \forall x \in \mathbb{R}^n[/itex].
I multiply both sides by the inverse of A: [itex]A^{-1}Ax= A ^{-1} \lambda x \Rightarrow x= \lambda A^{-1}x[/itex].
[itex]\Rightarrow \lambda A^{-1}=I \Rightarrow \det A^{-1}=\frac{1}{\lambda }[/itex]. But we also have that [itex]\lambda A^T=I[/itex] because A is orthogonal. Since A is a square matrix, [itex]\det A =\det A^T[/itex] too. Thus we have that [itex]\det A =\det A^T \Rightarrow \det (\lambda A ^T )=1 \Rightarrow \det (\lambda A )=1 \Rightarrow \det A = \frac{1}{\lambda}[/itex].
[itex]\Rightarrow \det A = \det A^{-1}[/itex]. But for any invertible matrix A, [itex]\det A = \frac{1}{\det A ^{-1}}[/itex].
So that if [itex]a = \det A[/itex], then [itex]a= \frac{1}{a} \Rightarrow a^2=1 \Rightarrow a = \pm 1[/itex]. And since [itex]\lambda = \frac{1}{\det A}[/itex], I reach [itex]\lambda = \pm 1[/itex].
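(Not part of the proof, but here's a quick numerical sanity check of the relations I assumed, sketched in Python/numpy; the rotation matrix below is just one arbitrary example of an orthogonal matrix.)

[code]
import numpy as np

# An example orthogonal matrix: a 2x2 rotation (any angle works)
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(np.linalg.inv(A), A.T))                # A^{-1} = A^T
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))  # det A = det A^T
print(np.isclose(np.linalg.det(A),
                 1 / np.linalg.det(np.linalg.inv(A))))   # det A = 1/det(A^{-1})
[/code]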
Any tip/help will be appreciated. Thanks.
 
  • #2
Hi fluidistic! :smile:

Let's start with your proof for 1).

You infer [itex]x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I[/itex].
I'm afraid this is not generally true.
The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.

As a hint for 1): what is |Ax|?

As a hint for 2): suppose [itex]Av = \lambda v[/itex], what is the conjugate of (Av)?
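If it helps to see where the hints lead numerically, here's a small Python/numpy sketch (the rotation matrix is just an example of a real orthogonal A; nothing about it is special):

[code]
import numpy as np

# An example real orthogonal matrix: a rotation by theta
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Hint 1: compare |Ax| with |x| for an arbitrary vector x
x = np.array([3.0, -2.0])
print(np.linalg.norm(A @ x), np.linalg.norm(x))  # the two lengths agree

# Hint 2: the complex eigenvalues of a real matrix come in conjugate pairs
lam = np.linalg.eigvals(A)
print(lam)          # exp(i*theta) and exp(-i*theta), a conjugate pair
print(np.abs(lam))  # each of modulus 1
[/code]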
 
  • #3
[itex] x= \lambda A^{-1}x \Rightarrow \lambda A^{-1}=I [/itex].

[itex] \lambda A^{-1} [/itex] mapping all eigenvectors of A to themselves does not imply [itex] \lambda A^{-1} = I. [/itex]
 
  • #4
Hi and thanks to both of you guys.
I like Serena said:
Hi fluidistic! :smile:

Let's start with your proof for 1).

You infer [itex]x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I[/itex].
I'm afraid this is not generally true.
The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.
Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times exactly the same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

As a hint for 1): what is |Ax|?

As a hint for 2): suppose [itex]Av = \lambda v[/itex], what is the conjugate of (Av)?
For 1), is it [itex]|\lambda x|[/itex]?
For 2), here I'm confused. A is a real matrix; I think it means that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems like x can be in C^n? Because otherwise I don't see how to reach [itex]\lambda x[/itex] with [itex]\lambda[/itex] being complex valued.
I think I need to take a nap; I'm extremely tired, so I feel I'm not thinking as clearly as I should right now, unlike when I started the problem.
Thanks for any further push.
 
  • #5
fluidistic said:
Hi and thanks to both of you guys.

Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times exactly the same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

Try [itex]\begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}[/itex].


fluidistic said:
For 1), is it [itex]|\lambda x|[/itex]?

Noooo, I didn't say x was an eigenvector. :rolleyes:
Try to use the properties of an orthogonal matrix.


fluidistic said:
For 2), here I'm confused. A is a real matrix; I think it means that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems like x can be in C^n? Because otherwise I don't see how to reach [itex]\lambda x[/itex] with [itex]\lambda[/itex] being complex valued.
I think I need to take a nap; I'm extremely tired, so I feel I'm not thinking as clearly as I should right now, unlike when I started the problem.
Thanks for any further push.

Your problem doesn't say that x has to be real.
And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
As you can see I avoided using x here.
I'm thinking of x as any real-valued vector, and I'm thinking of v as a specific eigenvector, which may be complex-valued.
 
  • #6
fluidistic said:
Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times exactly the same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).
Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in R^n, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
For 1), is it [itex]|\lambda x|[/itex]?
ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.
 
  • #7
I like Serena said:
Try [itex]\begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}[/itex].
Well I reach [itex]\begin {bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin {bmatrix} x_2 \\ x_1 \end{bmatrix}[/itex]. So unless I'm mistaken this matrix doesn't work.


Noooo, I didn't say x was an eigenvector. :rolleyes:
Try to use the properties of an orthogonal matrix.
Ah I see! If I remember correctly something I read somewhere, orthogonal matrices preserve lengths, so that |Ax| = |x|.




Your problem doesn't say that x has to be real.
And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
I think I can prove it. I was having problems because I assumed x was in R^n.
If lambda is complex and both A and x have real entries, then there's no way that multiplying and summing real numbers gives a complex number. So x has to be complex-valued. (I have in mind the picture Ax = lambda x.)
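(To convince myself, a small numerical check of this argument with a concrete example; I picked the 90° rotation matrix arbitrarily.)

[code]
import numpy as np

# A real orthogonal matrix with no real eigenvalues: the 90-degree rotation
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lam, V = np.linalg.eig(A)
print(lam)         # i and -i (possibly in the other order), a conjugate pair
print(V[:, 0])     # the eigenvector for the complex eigenvalue is itself complex
print(np.allclose(A @ V[:, 0], lam[0] * V[:, 0]))  # check Av = lambda v
[/code]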

As you can see I avoided using x here.
I'm thinking of x as any real-valued vector, and I'm thinking of v as a specific eigenvector, which may be complex-valued.
You mean x can be any complex- or real-valued vector?

vela said:
Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in Rn, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
Oh, thanks for pointing this out. I had a doubt for a moment and made an error with this. I assumed for some reason that having infinitely many eigenvectors wasn't possible, while of course it is. Only the direction matters, not the length.

ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.
[itex]||Ax||=\sqrt {\langle Ax,Ax\rangle}[/itex] where [itex]\langle \cdot , \cdot \rangle[/itex] denotes the inner product. Or do you mean a specific norm?
 
  • #8
fluidistic said:
Well I reach [itex]\begin {bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin {bmatrix} x_2 \\ x_1 \end{bmatrix}[/itex]. So unless I'm mistaken this matrix doesn't work.
But it will work for specific vectors:
[tex]
\begin{align*}
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ 1\end{pmatrix} &= (1)\begin{pmatrix} 1 \\ 1\end{pmatrix} \\
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ -1\end{pmatrix} &= (-1)\begin{pmatrix} 1 \\ -1\end{pmatrix}
\end{align*}
[/tex]
If you demand that every vector x in R^n satisfies Ax=λx, then you're right that A is a multiple of the identity matrix.
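A quick numerical confirmation of the above (a sketch with numpy; eig returns these same eigenvectors up to scaling and sign):

[code]
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

lam, V = np.linalg.eig(A)
print(lam)  # 1 and -1: both real, both of modulus 1
for i in range(2):
    # each column of V is an eigenvector; check Av = lambda v
    print(np.allclose(A @ V[:, i], lam[i] * V[:, i]))
[/code]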
Oh, thanks for pointing this out. I had a doubt for a moment and made an error with this. I assumed for some reason that having infinitely many eigenvectors wasn't possible, while of course it is. Only the direction matters, not the length.
I'm not sure what you're saying here. :)
[itex]||Ax||=\sqrt {\langle Ax,Ax\rangle}[/itex] where [itex]\langle \cdot , \cdot \rangle[/itex] denotes the inner product. Or do you mean a specific norm?
Now, using the fact that A is orthogonal, you can show [itex]||Ax||=||x||[/itex], which is what I think ILS was trying to get you to see. Then suppose x is an eigenvector of A.
 
  • #9
I see vela has already given the answers that you need to move on.

I'll just give a couple of additional comments. :smile:


fluidistic said:
Ah I see! If I remember correctly something I read somewhere, orthogonal matrices preserve lengths, so that |Ax| = |x|.

Yep! That's it. :)

fluidistic said:
I think I can prove it. I was having problems because I assumed x was in R^n.
If lambda is complex and both A and x have real entries, then there's no way that multiplying and summing real numbers gives a complex number. So x has to be complex-valued. (I have in mind the picture Ax = lambda x.)

Yep!

fluidistic said:
You mean x can be any complex- or real-valued vector?

Whatever.

fluidistic said:
[itex]||Ax||=\sqrt {\langle Ax,Ax\rangle}[/itex] where [itex]\langle \cdot , \cdot \rangle[/itex] denotes the inner product. Or do you mean a specific norm?

Yes, I meant this one.
When not specified, this one is always meant.
 
  • #10
Thanks once again guys!
Well, I'm stuck at showing that [itex]||Ax||=||x||[/itex]. I know I have to use the fact that [itex]||Ax||=\sqrt {\langle Ax,Ax\rangle}[/itex] and that A is orthogonal ([itex]A^T=A^{-1}[/itex]).
I'm thinking about using some properties of the inner product, but I can't find any useful ones for this case.
 
  • #11
fluidistic said:
Thanks once again guys!
Well, I'm stuck at showing that [itex]||Ax||=||x||[/itex]. I know I have to use the fact that [itex]||Ax||=\sqrt {\langle Ax,Ax\rangle}[/itex] and that A is orthogonal ([itex]A^T=A^{-1}[/itex]).
I'm thinking about using some properties of the inner product, but I can't find any useful ones for this case.

Ah, well, I didn't really intend for you to prove it.
As far as I'm concerned it's simply a property of an orthogonal matrix.

But if you want to prove it, you can use: [itex]||Ax||^2 = (Ax)^T(Ax)[/itex].
Do you know how to simplify that?
 
  • #12
I like Serena said:
Ah, well, I didn't really intend for you to prove it.
As far as I'm concerned it's simply a property of an orthogonal matrix.

But if you want to prove it, you can use: [itex]||Ax||^2 = (Ax)^T(Ax)[/itex].
Do you know how to simplify that?

Yeah, I'd rather prove it; otherwise I feel like I'm assuming what I want to prove. :)
Oh, bright idea. I think I do know how to simplify it. [itex](Ax)^T(Ax)=x^TA^TAx=x^Tx=||x||^2[/itex]. Since [itex]||v|| \geq 0[/itex] for any vector v, we reach [itex]||Ax||=||x||[/itex]. I'm going to think about how to proceed further. Will post here as soon as I get results or get stuck.
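(And a quick numerical check of this identity, using a random orthogonal matrix built via QR factorization, which is just one way to generate an example:)

[code]
import numpy as np

# One way to get an example orthogonal matrix: QR factorization of a random matrix
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

print(np.allclose(Q.T @ Q, np.eye(4)))           # Q^T Q = I, the step used above
x = rng.standard_normal(4)
print(np.linalg.norm(Q @ x), np.linalg.norm(x))  # ||Qx|| = ||x||
[/code]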
Thanks :)
 
  • #13
What I get: Let x be an eigenvector associated with the eigenvalue lambda. [itex]Ax= \lambda x \Rightarrow |Ax|= |\lambda x | =|\lambda || x|[/itex]. But [itex]|Ax|=|x|[/itex], so [itex]|\lambda ||x|=|x| \Rightarrow |\lambda |=1[/itex], since an eigenvector x is nonzero. Thus if [itex]\lambda \in \mathbb{R}[/itex] as stated, then [itex]\lambda = 1[/itex] or [itex]-1[/itex].
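(A numerical spot check of the conclusion on two example orthogonal matrices, a reflection with real eigenvalues and a rotation with complex ones:)

[code]
import numpy as np

reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])                    # real eigenvalues
theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])  # complex eigenvalues

for A in (reflection, rotation):
    lam = np.linalg.eigvals(A)
    print(lam, np.abs(lam))  # every eigenvalue has modulus 1; the real ones are +1 or -1
[/code]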
 
  • #14
fluidistic said:
What I get: Let x be an eigenvector associated with the eigenvalue lambda. [itex]Ax= \lambda x \Rightarrow |Ax|= |\lambda x | =|\lambda || x|[/itex]. But [itex]|Ax|=|x|[/itex], so [itex]|\lambda ||x|=|x| \Rightarrow |\lambda |=1[/itex], since an eigenvector x is nonzero. Thus if [itex]\lambda \in \mathbb{R}[/itex] as stated, then [itex]\lambda = 1[/itex] or [itex]-1[/itex].

Good! :smile:

(Although in a proper proof, you should mention that you are using the property of an orthogonal matrix that the norm of a vector is invariant.
Of course, I already know it in this case. :wink:)


As an afterthought, you may want to distinguish the vector norm ||*|| from the absolute value |*| (when applied to lambda) here.
 
  • #15
Thanks once again. I've been so busy I can't even believe it's been 3 (oh, 4 as of this minute) days since I last wrote in this thread.
Yeah, on my draft I've redone the exercise and made use of the norm of [itex]\lambda x[/itex] and the modulus of [itex]\lambda[/itex]; I think I did it well.
Thanks for pointing this out though. :)
 

FAQ: Linear algebra, orthogonal matrix proof

What is linear algebra?

Linear algebra is a branch of mathematics that deals with linear equations and their representations in vector spaces. It involves the study of linear transformations, matrices, and systems of linear equations.

What is an orthogonal matrix?

An orthogonal matrix is a square real matrix whose columns (and rows) form an orthonormal set: the dot product of any two distinct columns is 0, and each column has length 1. Equivalently, A^T A = I, so the inverse of the matrix equals its transpose.

What is the proof for an orthogonal matrix?

Proving that a matrix is orthogonal amounts to showing that the dot product of any two distinct columns (or rows) is 0, that each column has unit length, and hence that the inverse of the matrix equals its transpose. This can be done using basic algebraic manipulations and properties of dot products.
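As a minimal numerical illustration of these conditions (a Python/numpy sketch; the rotation matrix is just an example):

[code]
import numpy as np

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # an example orthogonal matrix

print(np.allclose(A.T @ A, np.eye(2)))      # A^T A = I, so A^{-1} = A^T
print(np.isclose(A[:, 0] @ A[:, 1], 0.0))   # distinct columns are orthogonal
print(np.isclose(A[:, 0] @ A[:, 0], 1.0))   # each column has unit length
[/code]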

How is linear algebra used in real life?

Linear algebra has many applications in various fields, including engineering, physics, economics, and computer graphics. It is used to solve systems of linear equations, analyze complex data sets, and model real-world scenarios.

What are some key concepts in linear algebra?

Some key concepts in linear algebra include vector spaces, linear transformations, matrices, determinants, eigenvalues and eigenvectors, and systems of linear equations. These concepts form the foundation for understanding more advanced topics in linear algebra.
