Eigenvalues of the square of an operator

In summary: if L is a diagonalizable operator and L^2 f = \lambda^2 f, it does not follow that L f = \pm\lambda f. Either f itself is an eigenfunction of L, or g = (L + \lambda)f and h = (L - \lambda)f are eigenfunctions with eigenvalues \lambda and -\lambda respectively; the implication fails exactly when both \lambda and -\lambda are eigenvalues of L.
  • #1
StatusX
If L^2 |f> = k^2 |f>, where L is a linear operator, |f> is a function, and k is a scalar, does that mean that L|f> = +/- k |f>? How would you prove this?
 
  • #2
No. Consider the operator on some 2-dimensional (sub)space given by

[tex] \left( \begin{array}{cc} 0 &1\\0&1 \end{array} \right) [/tex]

then L^2 v = 0 for all vectors v in the space, but not all of them are eigenvectors of L with eigenvalue 0.
 
  • #3
I don't see how L^2 is 0 there. Isn't L^2 just L? But in any case, what are the necessary conditions for it to be true? For example, I am working with hermitian operators. Is it true for them?
 
  • #4
Sorry, typo, the bottom right 1 should be a zero.
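With the corrected matrix, the counterexample is easy to check numerically; a minimal numpy sketch (mine, not from the thread):

```python
import numpy as np

# Corrected operator from post #2: nilpotent, so L^2 = 0
L = np.array([[0.0, 1.0],
              [0.0, 0.0]])

assert np.allclose(L @ L, np.zeros((2, 2)))   # L^2 v = 0 for every v

# But v = (0, 1) is not an eigenvector of L with eigenvalue 0:
v = np.array([0.0, 1.0])
Lv = L @ v                                    # equals (1, 0), not 0 * v
assert not np.allclose(Lv, np.zeros(2))
```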
 
  • #5
As to the general case, if L is hermitian, then it is diagonalizable, so you're into the case of commuting diagonalizable operators.

However, even then the answer is still no.

Let f and g be two eigenvectors with eigenvalues 1 and -1 respectively. Then L^2(f+g)=f+g, yet L(f+g)=f-g, which isn't equal to f+g or -(f+g).
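A quick numerical check of this counterexample, realizing L as a 2x2 diagonal matrix (my own sketch, not from the thread):

```python
import numpy as np

L = np.diag([1.0, -1.0])          # hermitian, eigenvalues +1 and -1
f = np.array([1.0, 0.0])          # eigenvector with eigenvalue +1
g = np.array([0.0, 1.0])          # eigenvector with eigenvalue -1

v = f + g
assert np.allclose(L @ L @ v, v)  # L^2 (f+g) = f+g

Lv = L @ v                        # equals f - g = (1, -1)
assert not np.allclose(Lv, v) and not np.allclose(Lv, -v)
```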
 
  • #6
OK, thanks for your help so far. It's fine if you don't want to continue; I'll post this anyway in case someone else has any ideas. I'm trying to find the necessary conditions for this to be true, and I've come up with the following:

[tex] \hat L^2 f = \lambda^2 f [/tex]

[tex] (\hat L - \lambda)(\hat L + \lambda) f = 0 [/tex]

and

[tex] (\hat L + \lambda)(\hat L - \lambda) f = 0 [/tex]

Take the first one. Now, I assumed [itex](\hat L + \lambda) f[/itex] vanished, but it could just as easily be that it is merely in the kernel of [itex](\hat L - \lambda)[/itex]. This would mean [itex]\lambda[/itex] is still an eigenvalue, but of a different eigenfunction, [itex](\hat L + \lambda) f[/itex]. Call this g. So either f is an eigenfunction of L with eigenvalue [itex]-\lambda[/itex] or g is with eigenvalue [itex]\lambda[/itex], or both.

Taking the second one gives the opposite results, that either f is an eigenfunction of L with eigenvalue [itex]\lambda[/itex] or h is with eigenvalue [itex]-\lambda[/itex], or both, with [itex]h=(\hat L - \lambda) f[/itex].

Assume f->[itex]\lambda[/itex]. Unless [itex]\lambda[/itex]=0, this means f-/->[itex]-\lambda[/itex], which means g->[itex]\lambda[/itex]. [itex]h=(\hat L - \lambda) f = 0[/itex], which is trivial. The same goes for [itex]-\lambda[/itex]. If f isn't an eigenfunction at all, then g and h must both be.

So the conclusion is that if f is an eigenfunction of [itex]\hat L^2[/itex], then you can always construct two eigenfunctions for [itex]\hat L[/itex] with eigenvalue [itex]\lambda[/itex] (for (f,g)), [itex]-\lambda[/itex] (for (f,h)), or both (for (g,h)).

Do you know if there is any way to get more specific than this, to determine what decides whether the eigenfunctions will be (f,g), (f,h), or (g,h)? (eg, in your example, it is (g,h))

By the way, if [itex]\lambda[/itex] is 0, nothing can be said. Then g=h=Lf, and we have either that L f=0 or [itex]\hat L^2[/itex]f =0, which we already knew.
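The construction above can be sanity-checked numerically. Here is a sketch (the matrix L, the value of λ, and the vector f are my own choices, not from the thread), using a diagonalizable L whose spectrum contains both λ and -λ, so that f is an eigenfunction of L^2 but not of L:

```python
import numpy as np

lam = 2.0
L = np.diag([lam, -lam])      # diagonalizable, eigenvalues +2 and -2
f = np.array([1.0, 1.0])      # mixes both eigenvectors

# f is an eigenvector of L^2 with eigenvalue lam^2, but not of L
assert np.allclose(L @ L @ f, lam**2 * f)

g = L @ f + lam * f           # g = (L + lam) f
h = L @ f - lam * f           # h = (L - lam) f

# g and h are eigenvectors of L with eigenvalues +lam and -lam
assert np.allclose(L @ g, lam * g)
assert np.allclose(L @ h, -lam * h)
```

Here f is not itself an eigenfunction, so both g and h are nontrivial: the (g, h) case of the conclusion.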
 
  • #7
Can't you conclude g always satisfies the equation Lg = λg (and similarly for h)?

Anyways, have you considered reconstructing f from g and h? That gives you something interesting...
 
  • #8
Thanks for the reply. Yes, g and h are always eigenfunctions, but when f is an eigenfunction, one of them will be trivial (zero). As for your hint, I tried it and I didn't get anything new:

[tex] \hat L f = \frac{1}{2}(g+h) [/tex]

[tex] \hat L^2 f = \frac{\lambda}{2}(g-h) = \lambda^2 f [/tex]

so:

[tex] f = \frac{1}{2\lambda} (g - h) [/tex]

Is this what you had in mind? Then:

[tex] \hat L f = \frac{1}{2} (g + h) [/tex]

is a multiple of f iff g or h is 0 (or if they are linearly dependent? I don't know if this is possible). But I already knew this. Is there something else you can get from this result? I'm getting the feeling the only way to know is to check one of f, g, or h to see if they are an eigenfunction or 0, from which the rest should follow. I'm not sure exactly what I was looking for, maybe some condition on the operator itself that would make this determination less bluntly.
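The reconstruction can be confirmed numerically; a sketch with my own choice of L, λ, and f (not from the thread):

```python
import numpy as np

lam = 2.0
L = np.diag([lam, -lam])      # eigenvalues +lam and -lam
f = np.array([1.0, 1.0])      # eigenvector of L^2, but not of L

g = L @ f + lam * f           # g = (L + lam) f
h = L @ f - lam * f           # h = (L - lam) f

# f and Lf are recovered as the two combinations of g and h
assert np.allclose(f, (g - h) / (2 * lam))
assert np.allclose(L @ f, (g + h) / 2)
```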
 
  • #9
Maybe I just made a typo, but I'm not seeing it... this is my work:

g := (L + λI) f
h := (L - λI) f

Lg = λg
Lh = -λh

2λLf = L(2λf) = L(g-h) = λ(g+h) = 2Lf
 
  • #10
Hurkyl said:
λ(g+h) = 2Lf

g+h = 2Lf, so λ(g+h) =2λLf, which is what you started with.
 
  • #11
Ah, now that looks like a mistake! Bleh, I looked at it a zillion times and never saw it. :frown:
 
  • #12
Hrm.

When L is diagonalizable, one can make progress. I make this claim:

If L is a diagonalizable matrix and λ ≠ 0, then [itex]L^2 f = \lambda^2 f \implies Lf = \lambda f \vee Lf = -\lambda f[/itex] can fail if and only if both λ and -λ are eigenvalues of L.

Doesn't seem like a very impressive statement, though.
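The claim can be illustrated numerically (a sketch; the example matrices are mine, not from the thread):

```python
import numpy as np

lam = 3.0

# Both lam and -lam in the spectrum: the implication can fail.
L_both = np.diag([lam, -lam])
f = np.array([1.0, 1.0])
assert np.allclose(L_both @ L_both @ f, lam**2 * f)   # L^2 f = lam^2 f
# yet L_both @ f = (3, -3), which is neither +lam*f nor -lam*f

# Only lam in the spectrum: the eigenspace of L^2 for lam^2 is
# spanned by e1, which IS an eigenvector of L.
L_one = np.diag([lam, 5.0])
f2 = np.array([1.0, 0.0])
assert np.allclose(L_one @ L_one @ f2, lam**2 * f2)
assert np.allclose(L_one @ f2, lam * f2)
```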
 
  • #13
I haven't checked it at all, but I have a hunch it may be true if the operator is positive definite.
 

FAQ: Eigenvalues of the square of an operator

What are eigenvalues of the square of an operator?

If f is an eigenvector of an operator L with eigenvalue λ, then L² f = λ² f, so λ² is an eigenvalue of L². For a diagonalizable operator, the eigenvalues of L² are exactly the squares of the eigenvalues of L.

How are eigenvalues of the square of an operator calculated?

The eigenvalues of the square of an operator can be calculated by first finding the eigenvalues of the operator itself and then squaring each one. The converse direction is subtler: as the discussion above shows, an eigenvector of L² need not be an eigenvector of L.
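This rule is a one-line check in numpy (my own illustration, for a diagonalizable matrix):

```python
import numpy as np

# Diagonalizable (triangular) matrix with eigenvalues 2 and -3
L = np.array([[2.0, 1.0],
              [0.0, -3.0]])

eig_L = np.linalg.eigvals(L)
eig_L2 = np.linalg.eigvals(L @ L)

# Eigenvalues of L^2 are the squares of the eigenvalues of L: {4, 9}
assert np.allclose(np.sort(eig_L ** 2), np.sort(eig_L2))
```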

Can the eigenvalues of the square of an operator be negative?

Yes, but only if the operator has non-real eigenvalues: for example, if λ = i then λ² = -1. If all eigenvalues of the operator are real (as for a hermitian operator), the eigenvalues of its square are squares of real numbers and hence non-negative.

What is the significance of the eigenvalues of the square of an operator?

The eigenvalues of the square of an operator can provide important information about the operator's behavior and its effect on vectors. They can also be used to determine the stability and convergence of a system described by the operator.

Can the eigenvalues of the square of an operator be complex numbers?

Yes. If the operator has a complex eigenvalue λ with eigenvector f, then L² f = λ² f and λ² is in general complex. If the operator is hermitian, its eigenvalues are real, so the eigenvalues of its square are real and non-negative.
