The Orthogonality of the Eigenvectors of a 2x2 Hermitian Matrix

  • #1
rghurst
TL;DR Summary
I am unable to demonstrate that the eigenvectors of this 2x2 Hermitian matrix are orthogonal.
The eigenvectors of a Hermitian matrix corresponding to distinct eigenvalues are orthogonal. This is not too difficult a statement to prove using mathematical induction. However, this case is seriously bothering me. Why isn't the dot product of the two eigenvectors zero? Is there something more fundamental that I am missing here? I appreciate any and all comments.
 

Attachments

  • Hermitian_Matrix_Eigenvectors.pdf
    101.3 KB
  • #2
Is ##\bigl\langle x,\overline{y} \bigr\rangle =0##?

My eigenvectors are
\begin{align*}
\begin{pmatrix}x\\y\end{pmatrix}&\in \left\{
\begin{pmatrix} 2i \\ 1+\sqrt{5} \end{pmatrix}\, , \,
\begin{pmatrix} 2i \\ 1-\sqrt{5} \end{pmatrix}
\right\}
\end{align*}
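A quick numerical check of the question above (a minimal sketch of my own, assuming NumPy; the two vectors are simply the eigenvectors listed here, not taken from the attachment):

```python
import numpy as np

# The two eigenvectors listed above.
v1 = np.array([2j, 1 + np.sqrt(5)])
v2 = np.array([2j, 1 - np.sqrt(5)])

print(np.sum(v1 * v2))   # plain, unconjugated product: -8, not zero
print(np.vdot(v1, v2))   # Hermitian inner product (np.vdot conjugates its first argument): 0
```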



Please use LaTeX:
https://www.physicsforums.com/help/latexhelp/
and post such questions in the homework forums. Rule of thumb: if it has numbers it is homework.
 
  • #3
rghurst said:
TL;DR Summary: I am unable to demonstrate that the eigenvectors of this 2x2 Hermitian matrix are orthogonal.
Your calculations are all correct until the very last step. The proper inner product ##\left\langle \:,\right\rangle ## for complex eigenvectors is ##\left\langle \mathbf{v}_{1},\mathbf{v}_{2}\right\rangle =\mathbf{v}_{1}^{*}\cdot\mathbf{v}_{2}##. Don't forget to complex-conjugate the entries of one of your eigenvectors!
 
  • #4
renormalize said:
Your calculations are all correct until the very last step. The proper inner product ##\left\langle \:,\right\rangle ## for complex eigenvectors is ##\left\langle \mathbf{v}_{1},\mathbf{v}_{2}\right\rangle =\mathbf{v}_{1}^{*}\cdot\mathbf{v}_{2}##. Don't forget to complex-conjugate the entries of one of your eigenvectors!
Thank you for explaining this, but why does the rule change? In my mind, arbitrarily conjugating one of the eigenvectors makes another vector that is neither of the two actual eigenvectors. Do not the two actual eigenvectors as originally calculated have to be orthogonal? Perhaps the inner product is defined flexibly based on something foundational that I am missing?
 
  • #5
rghurst said:
Thank you for explaining this, but why does the rule change? In my mind, arbitrarily conjugating one of the eigenvectors makes another vector that is neither of the two actual eigenvectors. Do not the two actual eigenvectors as originally calculated have to be orthogonal? Perhaps the inner product is defined flexibly based on something foundational that I am missing?
We do not want ##\begin{pmatrix}i \\ 1\end{pmatrix}## and ##\begin{pmatrix} -1 \\ i\end{pmatrix}## to be orthogonal, because they are linearly dependent over ##\mathbb{C}##: the second is just ##i## times the first, yet their unconjugated dot product is ##(i)(-1)+(1)(i)=0##.
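A minimal check of this point (my own sketch, assuming NumPy):

```python
import numpy as np

u = np.array([1j, 1])
w = np.array([-1, 1j])

print(np.allclose(w, 1j * u))  # True: w = i*u, so the two vectors are linearly dependent
print(np.sum(u * w))           # unconjugated product: 0, which would wrongly call them orthogonal
print(np.vdot(u, w))           # Hermitian inner product (conjugates the first argument): 2i, nonzero
```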
 
  • #6
rghurst said:
Thank you for explaining this, but why does the rule change? In my mind, arbitrarily conjugating one of the eigenvectors makes another vector that is neither of the two actual eigenvectors. Do not the two actual eigenvectors as originally calculated have to be orthogonal? Perhaps the inner product is defined flexibly based on something foundational that I am missing?
For Hermitian matrices and the complex vectors they act on, ##\mathbf{v}_{1}^{*}\cdot\mathbf{v}_{2}## is not an arbitrary rule change, it is the natural inner product. This can be seen by establishing three simple properties of a Hermitian matrix and its eigenvectors.
Let the matrix ##A## be Hermitian; i.e., ##A^{\dagger}\equiv\left(A^{\ast}\right)^{T}=A##, and let ##v## be any complex vector. Then:$$\left(v^{\dagger}Av\right)^{\dagger}=v^{\dagger}A^{\dagger}v=v^{\dagger}Av\Rightarrow v^{\dagger}Av\;\text{is a real number}\tag{1}$$Now let ##v## be an eigenvector of ##A## with eigenvalue ##\lambda##; i.e., ##Av=\lambda v##. Then:$$v^{\dagger}Av=v^{\dagger}\left(\lambda v\right)=\lambda v^{\dagger}v=\lambda\left\Vert v\right\Vert ^{2}\Rightarrow\lambda=\frac{v^{\dagger}Av}{\left\Vert v\right\Vert ^{2}}\;\text{is a real number}\tag{2}$$This shows that the eigenvalues of a Hermitian matrix are always real and that ##v^{\dagger}v## is the natural length-squared of the complex vector ##v##. Finally, let ##v_1,v_2## be two eigenvectors of ##A## with distinct eigenvalues: ##Av_{1}=\lambda_{1}v_{1},Av_{2}=\lambda_{2}v_{2},\lambda_{1}\neq\lambda_{2}##. Then, using the fact from (2) that ##\lambda_{1}## is real (so ##\lambda_{1}^{\ast}=\lambda_{1}##):$$\lambda_{1}v_{1}^{\dagger}v_{2}=\left(\lambda_{1}v_{1}\right)^{\dagger}v_{2}=\left(Av_{1}\right)^{\dagger}v_{2}=v_{1}^{\dagger}A^{\dagger}v_{2}=v_{1}^{\dagger}Av_{2}=v_{1}^{\dagger}\left(\lambda_{2}v_{2}\right)=\lambda_{2}v_{1}^{\dagger}v_{2}\Rightarrow\left(\lambda_{1}-\lambda_{2}\right)v_{1}^{\dagger}v_{2}=0$$Thus:$$v_{1}^{\dagger}v_{2}=0\;\text{for two eigenvectors with distinct eigenvalues}\tag{3}$$This proves that the eigenvectors of a Hermitian matrix with distinct eigenvalues are always orthogonal when measured by the natural inner product ##u^{\dagger}w=\mathbf{u}^{*}\cdot\mathbf{w}##.
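A numerical illustration of (2) and (3) (a sketch of my own, assuming NumPy; it deliberately uses the generic eigensolver so that nothing Hermitian-specific is built in):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian matrix A = B + B^dagger.
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = B + B.conj().T

eigvals, eigvecs = np.linalg.eig(A)   # generic solver, no Hermitian assumption built in

# (2): the eigenvalues are real (imaginary parts vanish up to rounding).
print(np.max(np.abs(eigvals.imag)))

# (3): eigenvectors for distinct eigenvalues are orthogonal under v1^dagger v2.
G = eigvecs.conj().T @ eigvecs                    # Gram matrix of the eigenvector columns
print(np.max(np.abs(G - np.diag(np.diag(G)))))    # off-diagonal entries ~ 0
```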
 
  • #7
Awesome! Thanks for the explanation. I have understood this as a general rule for as long as I can remember, but I now have a much better intuitive understanding of the need to define the inner product this way for complex vectors. Now if I could only have a more intuitive grasp of diagonalization and the profundity of eigenspaces themselves… that would be great. Phrased more specifically: why are the vectors lying in the null space of ##A-\lambda I##, for a diagonalizable ##n\times n## matrix ##A## and an eigenvalue ##\lambda##, so profound? As a person not from a graduate-level physics or engineering background, I simply haven't been exposed to the applications that unveil the significance of eigenspaces. Any sage words from learned minds are welcome.
 
  • #8
I started my list and realized Wikipedia had probably done the work for me. And sure enough, it has. Have a look through the applications section: Wiki link. It really crops up everywhere.
 
  • #9
rghurst said:
Thank you for explaining this, but why does the rule change? In my mind, arbitrarily conjugating one of the eigenvectors makes another vector that is neither of the two actual eigenvectors. Do not the two actual eigenvectors as originally calculated have to be orthogonal? Perhaps the inner product is defined flexibly based on something foundational that I am missing?

"Orthogonal" is not some absolute concept; it has meaning only in relation to a specific inner product. Here that inner product is [itex]\langle v,w \rangle = \sum_i v_i w_i^{*}[/itex].

An inner product on a vector space [itex]V[/itex] over [itex]\mathbb{C}[/itex] is any function [itex]f: V \times V \to \mathbb{C}[/itex] which satisfies the following conditions:
  • For every [itex]v \in V[/itex], [itex]f(v,v)[/itex] is real and non-negative with [itex]f(v,v) = 0[/itex] if and only if [itex]v = 0[/itex].
  • For every two vectors [itex]v[/itex] and [itex]w[/itex], [tex]f(v, w) =f(w, v)^{*}.[/tex]
  • For every three vectors [itex]u[/itex], [itex]v[/itex] and [itex]w[/itex] and every scalar [itex]\alpha[/itex], [tex]
    f(\alpha v + w, u) = \alpha f(v, u) + f(w, u).[/tex] (This is the convention in mathematics; in physics an inner product is required to be linear in the second argument rather than the first.)

It follows that [itex]g(v,w) = \sum_i v_i w_i[/itex] is not an inner product: [itex]g((i,0), (i,0)) = -1[/itex] is negative, and [itex]g((1, i), (1, i)) = 0[/itex] even though [itex](1, i) \neq (0,0)[/itex]. [itex]\langle v ,w \rangle = \sum_i v_i w_i^{*}[/itex] is an inner product, since [tex]\langle v, v \rangle = \sum_i |v_i|^2 \geq 0[/tex] is real and non-negative and is zero if and only if each [itex]v_i = 0[/itex], i.e. if and only if [itex]v = 0[/itex].
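A small numerical check of these two claims (my own sketch, assuming NumPy; the helper names g and ip are hypothetical):

```python
import numpy as np

def g(v, w):
    # The unconjugated bilinear form, which fails to be an inner product.
    return np.sum(v * w)

def ip(v, w):
    # sum_i v_i * conj(w_i), conjugating the second argument as in the post above.
    return np.sum(v * np.conj(w))

print(g(np.array([1j, 0]), np.array([1j, 0])))    # -1: the "length squared" comes out negative
print(g(np.array([1, 1j]), np.array([1, 1j])))    # 0 for a nonzero vector
print(ip(np.array([1, 1j]), np.array([1, 1j])))   # 2: real and positive, as required
```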
 
  • #10
pasmith said:
"Orthogonal" is not some absolute concept; it has meaning only in relation to a specific inner product.
Or phrased a bit more basically: Orthogonality depends on how angles are measured.

Angles in a real Euclidean plane are usually measured by tools like

[Image: a set square (Geodreieck)]

which corresponds to an inner product defined by the identity matrix ##\boldsymbol M =\boldsymbol I ## (cp. https://en.wikipedia.org/wiki/Inner_product_space#Euclidean_vector_space). A (squared##^*)##) length in the real plane is defined by ##x^2+y^2,## the inner product of a vector with itself. This is always a positive number, as we would expect. But it is no longer automatically positive if the coordinates ##x,y## are allowed to be complex. The identity matrix is therefore no longer suited to measuring angles and lengths, and the "geodreieck" isn't an appropriate tool any longer. Besides, it would be difficult to use if we all of a sudden had four real coordinates ##a,b,u,v## from ##x=a+i b\, , \,y=u+ i v.## However,
$$xx^\dagger =(a+ib)\cdot \overline{(a+ib)}=(a+ib)\cdot (a-ib)=a^2-i^2b^2=a^2+b^2$$
turns out to be always a positive real number (except for ##x=0##, of course) and is thus suited to define a (squared##^*)##) length again. Hence taking the complex conjugate of the second ##^{**})## argument resolves the dilemma. This defines a complex version of an inner product, and via
$$
\cos \sphericalangle \left(\vec{x},\vec{y}\right)=\dfrac{\bigl\langle \vec{x}\, , \,\vec{y} \bigr\rangle }{\sqrt{\bigl\langle \vec{x}\, , \,\vec{x} \bigr\rangle\, \cdot \,\bigl\langle \vec{y}\, , \,\vec{y} \bigr\rangle}}=\dfrac{\vec{x}\cdot \vec{y}^\dagger}{\sqrt{\left(\vec{x}\cdot \vec{x}^\dagger\right)\cdot\left( \vec{y}\cdot \vec{y}^\dagger\right)}} \, , \,
$$
an angle defined by that inner product, i.e. by the way we measure lengths.

____________
##^*)## Taking the square root of ##x^2+y^2## or ##a^2+b^2## is necessary by Pythagoras, or more simply: to make sure that units like inches still fit. The formula itself yields square inches, i.e. a squared length.

##^{**})## Conjugating the first argument is equally possible, as long as it is done consistently: always the first one, or always the second one. Physicists and mathematicians conventionally conjugate opposite arguments; IIRC, physicists conjugate the first argument and mathematicians the second. Don't ask me why, or whether I'm even sure. It's reasonable to check it case by case, author by author.
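As a small illustration of the angle formula above, here is a sketch (my own, assuming NumPy; the helper name hermitian_cos is hypothetical) applied to the two eigenvectors from post #2:

```python
import numpy as np

def hermitian_cos(x, y):
    # <x, y> / sqrt(<x, x><y, y>) with the conjugating inner product
    # (np.vdot conjugates its first argument); in general this value can be complex.
    return np.vdot(x, y) / np.sqrt(np.vdot(x, x).real * np.vdot(y, y).real)

x = np.array([2j, 1 + np.sqrt(5)])
y = np.array([2j, 1 - np.sqrt(5)])
print(hermitian_cos(x, y))   # ~0: the two eigenvectors sit at a "right angle"
```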
 
  • #11
rghurst said:
Awesome! Thanks for the explanation. I have understood this as a general rule for as long as I can remember, but I now have a much better intuitive understanding of the need to define the inner product this way for complex vectors. Now if I could only have a more intuitive grasp of diagonalization and the profundity of eigenspaces themselves… that would be great. Phrased more specifically: why are the vectors lying in the null space of ##A-\lambda I##, for a diagonalizable ##n\times n## matrix ##A## and an eigenvalue ##\lambda##, so profound? As a person not from a graduate-level physics or engineering background, I simply haven't been exposed to the applications that unveil the significance of eigenspaces. Any sage words from learned minds are welcome.
Hmm, maybe as an application/real-life scenario, consider the case of the Tacoma Narrows Bridge and resonant frequency. It would be nice if someone who knows more engineering than I do (essentially 100% of engineers) could chime in on how to determine the resonant frequency of something they've built or are building.
https://www.britannica.com/topic/Tacoma-Narrows-Bridge
Edit 1: The collapse was mostly due to aeroelastic flutter, though resonance and natural frequency had to do with it.

Edit: Maybe @russwaters or @Drakkith can expand on it? @fresh_42, can you help me tag him?
 
  • #12
WWGD said:
Edit: Maybe @russwaters or @Drakkith can expand on it? @fresh_42, can you help me tag him?
Fun fact: I wanted to take Tacoma-Narrows as an example in the thread about ##\pi .## The Roman aqueducts and the European cathedrals with ##\pi\approx \dfrac{22}{7}## still stand, whereas the first version of the modern Tacoma-Narrows-Bridge ...

The bridge is a good example of resonance and positive feedback coupling because it is on film. It's more impressive than singing to shatter a glass, which, by the way, works much better with an oscilloscope than with vocal cords.

Not sure if this is a good example of complex orthogonality.
 
  • #13
fresh_42 said:
Fun fact: I wanted to take Tacoma-Narrows as an example in the thread about ##\pi .## The Roman aqueducts and the European cathedrals with ##\pi\approx \dfrac{22}{7}## still stand, whereas the first version of the modern Tacoma-Narrows-Bridge ...

The bridge is a good example of resonance and positive feedback coupling because it is on film. It's more impressive than singing to shatter a glass, which, by the way, works much better with an oscilloscope than with vocal cords.

Not sure if this is a good example of complex orthogonality.
You may go further, somewhat philosophically: we have very limited knowledge of facts and understanding of how the world works, yet our lives and society aren't completely chaotic.
 
  • #14
In a different sense of applications, there was an old math olympiad problem of computing the value of ##\left(\frac{1+\sqrt 5}{2}\right)^{12}##. I knew it was an eigenvalue of the matrix representation of the Fibonacci recursion, and used that to compute it without expanding the binomial.
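A quick numerical sanity check of that idea (my own sketch, assuming NumPy; it is not the olympiad solution itself, just a verification of the identity ##\varphi^{n}=F_{n}\varphi+F_{n-1}## that follows from ##\varphi## being an eigenvalue of the Fibonacci step matrix):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2

# phi is an eigenvalue of the Fibonacci step matrix [[1, 1], [1, 0]],
# so phi^2 = phi + 1 and, by induction, phi^n = F_n * phi + F_(n-1).
F = np.array([[1, 1], [1, 0]])
F12 = np.linalg.matrix_power(F, 12)   # entries are Fibonacci numbers: [[F13, F12], [F12, F11]]

f12, f11 = F12[0, 1], F12[1, 1]       # F_12 = 144, F_11 = 89
print(f12 * phi + f11)                # 161 + 72*sqrt(5) ≈ 321.9969...
print(phi ** 12)                      # direct computation agrees
```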
 