If |a> is an eigenvector of A, is f(B)|a> an eigenvector of A?

In summary: because f(B) is an arbitrary function of an arbitrary operator, f(B)|a> can be essentially any vector, so in general it need not be an eigenvector of A. It is guaranteed to be one (with the same eigenvalue) only in special cases, for example when B commutes with A.
  • #1
happyparticle
Homework Statement
Show that a vector is an eigenvector of an operator
Relevant Equations
##A|a\rangle = a|a\rangle##
Hi,
If ##|a\rangle## is an eigenvector of the operator ##A##, I know that for any scalar ##c \neq 0##, ##c|a\rangle## is also an eigenvector of ##A##.

Now, is the ket ##F(B)|a\rangle## an eigenvector of ##A##, where ##B## is an operator and ##F(B)## is a function of ##B##?

Is there a way to show that ##F(B)|a\rangle## is an eigenvector of ##A## and to find the eigenvalue?

Thank you!
 
  • #2
happyparticle said:
Homework Statement:: Show that a vector is an eigenvector of an operator
Relevant Equations:: ##A|a\rangle = a|a\rangle##

Hi,
If ##|a\rangle## is an eigenvector of the operator ##A##, I know that for any scalar ##c \neq 0##, ##c|a\rangle## is also an eigenvector of ##A##.

Now, is the ket ##F(B)|a\rangle## an eigenvector of ##A##, where ##B## is an operator and ##F(B)## is a function of ##B##?

Is there a way to show that ##F(B)|a\rangle## is an eigenvector of ##A## and to find the eigenvalue?

Thank you!
The usual method to discuss something like this is to start with the Taylor expansion of F(B). Since this is a polynomial in B, we are applying B directly to ##\mid a \rangle##. So what you are really asking is whether ##\mid a \rangle## is an eigenstate of B.

The only way this can happen is if [A, B] = 0. In this case we can calculate the eigenvalues of B in the ##\mid a \rangle## basis: ##B \mid a \rangle = ( \langle a \mid B \mid a \rangle ) \mid a \rangle##.
(The proof is fairly simple, just use closure twice.)

From there you can construct what happens when we apply F(B) to ##\mid a \rangle##.

-Dan
 
  • #3
@topsquark
For instance, if I have ##[X,F(P_x)]## and ##X, P_x## don't commute, so we can't measure a value for ##X## and ##P_x## at the same time, so they don't share an eigenvalue?

I know that [A,F(B)] doesn't commute, but I have to show that ##F(B)|a\rangle## is also an eigenvector of ##A##, where ##|a\rangle## is an eigenvector of ##A##.

So I misunderstood something.

I'm trying to show that ##B|x\rangle = (iP_x/h)|x\rangle## is an eigenvector of ##X##, if ##|x\rangle## is an eigenvector of ##X##.
 
  • #4
happyparticle said:
Homework Statement:: Show that a vector is an eigenvector of an operator
Relevant Equations:: ##A|a\rangle = a|a\rangle##

Hi,
If ##|a\rangle## is an eigenvector of the operator ##A##, I know that for any scalar ##c \neq 0##, ##c|a\rangle## is also an eigenvector of ##A##.

Now, is the ket ##F(B)|a\rangle## an eigenvector of ##A##, where ##B## is an operator and ##F(B)## is a function of ##B##?

Is there a way to show that ##F(B)|a\rangle## is an eigenvector of ##A## and to find the eigenvalue?

Thank you!
I suggest you look for a counterexample using the simplest case of 2x2 matrices.
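For instance, here is a minimal numerical sketch of such a counterexample (using NumPy; the particular matrices below are just illustrative choices):

```python
import numpy as np

# |a> = (1, 0)^T is an eigenvector of A with eigenvalue 1, since A is diagonal.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
a = np.array([1.0, 0.0])

# Take F to be the identity function, so F(B) = B, and pick a B that does not commute with A.
B = np.array([[1.0, 1.0],
              [1.0, 1.0]])

v = B @ a        # F(B)|a> = (1, 1)^T
Av = A @ v       # A F(B)|a> = (1, 2)^T

# A v is parallel to v only if v[0]*Av[1] - v[1]*Av[0] vanishes; here it does not,
# so F(B)|a> is not an eigenvector of A.
print(np.isclose(v[0] * Av[1] - v[1] * Av[0], 0.0))   # False
```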
 
  • #5
happyparticle said:
@topsquark
For instance, if I have ##[X,F(P_x)]## and ##X, P_x## don't commute, so we can't measure a value for ##X## and ##P_x## at the same time, so they don't share an eigenvalue?

I know that [A,F(B)] doesn't commute, but I have to show that ##F(B)|a\rangle## is also an eigenvector of ##A##, where ##|a\rangle## is an eigenvector of ##A##.

So I misunderstood something.

I'm trying to show that ##B|x\rangle = (iP_x/h)|x\rangle## is an eigenvector of ##X##, if ##|x\rangle## is an eigenvector of ##X##.
So the actual problem is
1) Show that if ##\mid x\rangle## is an eigenstate of ##X##, then it is also an eigenstate of ##P_x##?

Or is it
2) Show that ## \mid x \rangle## is an eigenstate of ##F(P_x)##?

In the future, please post the whole question.
1) Clearly not, as
##B \mid x \rangle = \dfrac{d}{dx} \mid x \rangle##
which cannot give us an eigenvalue equation in ##\mid x \rangle## since ##\dfrac{d}{dx} \mid x \rangle \neq \text{(constant)} \mid x \rangle##.

2) Because of 1), the only function that F(B) could be is a constant; but now that I have the full question, F(B) is an arbitrary function.

-Dan
 
  • #6
happyparticle said:
Homework Statement:: Show that a vector is an eigenvector of an operator
Relevant Equations:: ##A|a\rangle = a|a\rangle##

Hi,
If ##|a\rangle## is an eigenvector of the operator ##A##, I know that for any scalar ##c \neq 0##, ##c|a\rangle## is also an eigenvector of ##A##.

Now, is the ket ##F(B)|a\rangle## an eigenvector of ##A##, where ##B## is an operator and ##F(B)## is a function of ##B##?

Is there a way to show that ##F(B)|a\rangle## is an eigenvector of ##A## and to find the eigenvalue?

Thank you!
PS note that you have an arbitrary operator ##B## and an arbitrary function ##F##. That means that ##F(B)## is an arbitrary operator. E.g. take any operator ##B## and the identity function as ##F##.

Now, as ##F(B)## is arbitrary, ##F(B) \ket a## can be any vector, i.e. an arbitrary vector. So your question can be rephrased as:

If ##\ket a## is an eigenvector of ##A##, then are all vectors eigenvectors of ##A##?
 
  • #7
topsquark said:
The usual method to discuss something like this is to start with the Taylor expansion of F(B). Since this is a polynomial in B, we are applying B directly to ##\mid a \rangle##. So what you are really asking is whether ##\mid a \rangle## is an eigenstate of B.

The only way this can happen is if [A, B] = 0. In this case we can calculate the eigenvalues of B in the ##\mid a \rangle## basis: ##B \mid a \rangle = ( \langle a \mid B \mid a \rangle ) \mid a \rangle##.
(The proof is fairly simple, just use closure twice.)

From there you can construct what happens when we apply F(B) to ##\mid a \rangle##.

-Dan
Alternatively, if ##B## commutes with ##A##, then:
$$A(B\ket a) = (AB)\ket a = (BA)\ket a = B(A\ket a) = B( \lambda \ket a) = \lambda(B\ket a)$$And we see that ##B\ket a## is also an eigenvector of ##A## with the same eigenvalue, ##\lambda##.

Moreover, if ##B## commutes with ##A##, then powers of ##B## commute with ##A##, and hence any function of ##B## that has a valid power series expansion commutes with ##A##.
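As a rough numerical illustration (a NumPy sketch; the matrices and the polynomial ##F## below are arbitrary choices, with ##A## and ##B## built so that they commute):

```python
import numpy as np

rng = np.random.default_rng(0)

# Make A and B commute by constructing them with a common orthonormal eigenbasis (columns of Q).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([5.0, -1.0, 4.0]) @ Q.T
print(np.allclose(A @ B, B @ A))        # True: [A, B] = 0

a = Q[:, 0]                             # eigenvector of A with eigenvalue 1
FB = np.eye(3) + B + B @ B / 2          # F(B) for the polynomial F(x) = 1 + x + x^2/2

v = FB @ a
print(np.allclose(A @ v, 1.0 * v))      # True: F(B)|a> is an eigenvector of A with the same eigenvalue
```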
 
  • #8
topsquark said:
The usual method to discuss something like this is to start with the Taylor expansion of F(B). Since this is a polynomial in B, we are applying B directly to ##\mid a \rangle##. So what you are really asking is whether ##\mid a \rangle## is an eigenstate of B.

The only way this can happen is if [A, B] = 0. In this case we can calculate the eigenvalues of B in the ##\mid a \rangle## basis: ##B \mid a \rangle = ( \langle a \mid B \mid a \rangle ) \mid a \rangle##.
(The proof is fairly simple, just use closure twice.)

From there you can construct what happens when we apply F(B) to ##\mid a \rangle##.

-Dan
That's not true. An important counterexample is given by the annihilation and creation operators of the harmonic oscillator, ##\hat{a}## and ##\hat{a}^{\dagger}##, and the "phonon-number operator" ##\hat{N}=\hat{a}^{\dagger} \hat{a}##. Since ##[\hat{a},\hat{a}^{\dagger}]=1##, it's easy to show that if ##|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##n##, then ##\hat{a}|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##(n-1)## (or the null vector) and ##\hat{a}^{\dagger}|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##(n+1)##. Since ##\hat{N}## is a positive semidefinite self-adjoint operator, it's easy to show that there's an eigenvector of ##\hat{N}##, ##|\Omega \rangle##, such that ##\hat{a} |\Omega \rangle=0##, i.e., it's the eigenvector of ##\hat{N}## with eigenvalue ##0##, and thus the eigenvalues of ##\hat{N}## must be ##n \in \{0,1,2,3,\ldots \}=\mathbb{N}_0##.
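A quick way to see this concretely is to truncate the number basis to the lowest few Fock states and check the matrices numerically (a NumPy sketch; the truncation dimension is arbitrary, and the top of the truncated operators is an artifact of cutting off the basis):

```python
import numpy as np

D = 6                                            # keep only |0>, ..., |5>
a_op = np.diag(np.sqrt(np.arange(1, D)), k=1)    # annihilation: a|n> = sqrt(n) |n-1>
adag = a_op.T                                    # creation operator
N = adag @ a_op                                  # number operator, diag(0, 1, ..., D-1)

print(np.allclose(N @ adag, adag @ N))           # False: [N, a^dagger] != 0

ket2 = np.zeros(D)
ket2[2] = 1.0                                    # |n = 2>
v = adag @ ket2                                  # a^dagger |2> = sqrt(3) |3>
print(np.allclose(N @ v, 3.0 * v))               # True: still an eigenvector of N, eigenvalue n + 1 = 3
```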
 
  • #9
My problem is that I can't show that my operators commute.
Since ##X## and ##P_x## don't commute, I think that any function of ##P_x## doesn't commute with ##X##.
Thus, ##[X,e^{ip_x}]## doesn't commute.
 
  • #10
vanhees71 said:
That's not true. An important counterexample is given by the annihilation and creation operators of the harmonic oscillator, ##\hat{a}## and ##\hat{a}^{\dagger}##, and the "phonon-number operator" ##\hat{N}=\hat{a}^{\dagger} \hat{a}##. Since ##[\hat{a},\hat{a}^{\dagger}]=1##, it's easy to show that if ##|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##n##, then ##\hat{a}|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##(n-1)## (or the null vector) and ##\hat{a}^{\dagger}|n \rangle## is an eigenvector of ##\hat{N}## with eigenvalue ##(n+1)##. Since ##\hat{N}## is a positive semidefinite self-adjoint operator, it's easy to show that there's an eigenvector of ##\hat{N}##, ##|\Omega \rangle##, such that ##\hat{a} |\Omega \rangle=0##, i.e., it's the eigenvector of ##\hat{N}## with eigenvalue ##0##, and thus the eigenvalues of ##\hat{N}## must be ##n \in \{0,1,2,3,\ldots \}=\mathbb{N}_0##.
Huh! Okay, I see what I did now. I learned this out of Sakurai, but what I didn't realize is that here I was arguing the converse of what he was saying. I went back and reviewed it and, for some reason, I had it in my head that compatible observables and common eigenstates were a biconditional. Clearly that isn't the case.

Thanks for the correction!

-Dan
 
  • #11
happyparticle said:
My problem is that I can't show that my operators commute.
Since ##X## and ##P_x## don't commute, I think that any function of ##P_x## doesn't commute with ##X##.
Thus, ##[X,e^{ip_x}]## doesn't commute.
Right. As I mentioned above we expand ##e^{i p_x}## as a Taylor series.
Explicitly:
##e^{i p_x} = 1 + i p_x + \dfrac{i^2}{2} p_x^2 + \dfrac{i^3}{3!} p_x^3 + \dots##

So
##[x, e^{i p_x} ] = [x , 1 + i p_x + \dfrac{i^2}{2} p_x^2 + \dfrac{i^3}{3!} p_x^3 + \dots ]##

##= [x,1] + i [x, p_x] + \dfrac{i^2}{2!} [x, p_x^2 ] + \dfrac{i^3}{3!} [ x, p_x^3 ] + \dots ##

and
##[x, p_x^n] = n \, i \hbar \, p_x^{n-1}##

So
##[x, e^{i p_x} ] = i i \hbar + \dfrac{i^2}{2!} 2 i \hbar p_x + \dfrac{i^3}{3!} 3 i \hbar p_x^2 + \dots##

## = - \hbar - i \hbar p_x + \dfrac{\hbar}{2} p_x^2 + \dots ##

Eventually
##[x, e^{i p_x} ] = - \hbar e^{i p_x} \neq 0##

(Which we could have gotten to in one step by ##[x, g(p_x)] = i \hbar g^{ \prime } (p_x)##.)
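That last identity is easy to spot-check symbolically for a simple choice such as ##g(p_x) = p_x^2## (a SymPy sketch in the position representation, where ##p_x = -i\hbar \, d/dx## acts on a test wavefunction ##\psi(x)##):

```python
import sympy as sp

x, hbar = sp.symbols('x hbar', positive=True)
psi = sp.Function('psi')(x)

def p(f):
    """Momentum operator in the position representation: p = -i*hbar*d/dx."""
    return -sp.I * hbar * sp.diff(f, x)

# Check [x, p^2] psi = i*hbar * g'(p) psi for g(p) = p^2, i.e. g'(p) = 2p.
lhs = x * p(p(psi)) - p(p(x * psi))
rhs = sp.I * hbar * 2 * p(psi)
print(sp.simplify(lhs - rhs))   # -> 0
```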

-Dan
 
  • #12
happyparticle said:
My problem is that I can't show that my operators commute.
Since ##X## and ##P_x## don't commute, I think that any function of ##P_x## doesn't commute with ##X##.
That definitely can't be true. A function can transform ##P_x## into an operator that does commute with ##X##. A trivial example would be ##(P_x)^0 = I##.
happyparticle said:
Thus, ##[X,e^{ip_x}]## doesn't commute.
This doesn't make sense. You should say that the commutator is non-zero: thus ##[X,e^{ip_x}] \ne 0##.
 
  • #13
topsquark said:
Huh! Okay, I see what I did now. I learned this out of Sakurai, but what I didn't realize is that here I was arguing the converse of what he was saying. I went back and reviewed it and, for some reason, I had it in my head that compatible observables and common eigenstates were a biconditional. Clearly that isn't the case.

Thanks for the correction!

-Dan
No, now you misunderstood what I said. Of course, if you want to have a complete set of common eigenvectors of two self-adjoint operators, these operators must commute. That's easy to prove: Assume that there is a common complete set of orthonormalized eigenvectors,
$$\hat{A}|a,b \rangle=a|a,b \rangle, \quad \hat{B}|a,b \rangle=b |a,b \rangle.$$
Then any vector can be written as
$$|\psi \rangle=\sum_{a,b} \psi_{ab} |a,b \rangle.$$
Here I assumed that all the eigenvalues are discrete. If you have continuous spectra, or if part of the spectrum of one or the other operator is continuous, you just have to add the usual integrals, but this doesn't change the argument much (at the level of "robust mathematics" used by physicists in these matters ;-)). Now we have
$$\hat{A} \hat{B} |\psi \rangle=\sum_{a,b} \psi_{ab} b \hat{A}|a,b \rangle = \sum_{a,b} \psi_{ab} a b|a,b \rangle$$
and
$$\hat{B} \hat{A} |\psi \rangle=\sum_{a,b} \psi_{ab} a \hat{B}|a,b \rangle=\sum_{a,b} \psi_{ab} a b |a,b \rangle,$$
but this shows that for all vectors
$$\hat{A} \hat{B} |\psi \rangle=\hat{B} \hat{A} |\psi \rangle \; \Rightarrow \; \hat{A} \hat{B}=\hat{B} \hat{A},$$
i.e., that the operators commute.
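In finite dimensions the same argument can be checked directly: building ##\hat{A}## and ##\hat{B}## as spectral sums over a shared orthonormal basis forces them to commute (a NumPy sketch with arbitrary eigenvalue pairs):

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared orthonormal eigenbasis |k> = columns of Q, with eigenvalue pairs (a_k, b_k).
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
a_vals = [1.0, 2.0, 2.0, 5.0]
b_vals = [0.0, -3.0, 7.0, 7.0]

# Spectral sums A = sum_k a_k |k><k| and B = sum_k b_k |k><k|.
A = sum(a_vals[k] * np.outer(Q[:, k], Q[:, k]) for k in range(4))
B = sum(b_vals[k] * np.outer(Q[:, k], Q[:, k]) for k in range(4))

psi = rng.normal(size=4)                            # an arbitrary vector |psi>
print(np.allclose(A @ (B @ psi), B @ (A @ psi)))    # True for every |psi> ...
print(np.allclose(A @ B, B @ A))                    # ... hence A B = B A
```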
 
  • #14
PeroK said:
That definitely can't be true. A function can transform ##P_x## into an operator that does commute with ##X##. A trivial example would be ##(P_x)^0 = I##.

This doesn't make sense. You should say that the commutator is non-zero: thus ##[X,e^{ip_x}] \ne 0##.
I'm not sure I understand. I thought that if two operators ##A## and ##B## don't commute, then ##[A,B] \neq 0##.
 
  • #15
happyparticle said:
I'm not sure I understand. I thought that if two operators ##A## and ##B## don't commute, then ##[A,B] \neq 0##.
You've repeatedly used the terminology "##[A,B]## doesn't commute", which is nonsensical:
happyparticle said:
I know that [A,F(B)] doesn't commute,
happyparticle said:
Thus, ##[X,e^{ip_x}]## doesn't commute.
That's what I was pointing out.
 
  • #16
I'm still confused. Is it the term I used that is wrong, or should I say ##A## and ##B## don't commute instead of ##[A,B]## doesn't commute?
 
  • #17
happyparticle said:
I'm still confused. Is it the term I used that is wrong, or should I say ##A## and ##B## don't commute instead of ##[A,B]## doesn't commute?
Yes.
 
  • #18
Alright, thank you
 

FAQ: If |a> is an eigenvector of A, is f(B)|a> an eigenvector of A?

What is an eigenvector?

An eigenvector of a matrix A is a non-zero vector |a> such that when A acts on |a>, the result is a scalar multiple of |a>. Mathematically, this is expressed as A|a> = λ|a>, where λ is the eigenvalue corresponding to the eigenvector |a>.

What does f(B) represent in this context?

In this context, f(B) typically represents a function of the matrix B, which can be defined in various ways depending on the function f. For instance, if f(x) = x^2, then f(B) would be the matrix B squared (B^2).
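For a diagonalizable matrix B, one common way to form f(B) is to apply f to the eigenvalues in an eigenbasis of B. Below is a small NumPy sketch of this idea for the f(x) = x^2 example above (the particular matrix B is just an illustrative choice):

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eigh(B)          # B is symmetric, so B = V diag(w) V^T

f = lambda lam: lam ** 2          # f(x) = x^2
fB = V @ np.diag(f(w)) @ V.T      # f(B) = V diag(f(w)) V^T

print(np.allclose(fB, B @ B))     # True: f(B) coincides with B^2
```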

If |a> is an eigenvector of A, under what conditions is f(B)|a> also an eigenvector of A?

For f(B)|a> to be an eigenvector of A, it must satisfy the eigenvector equation A(f(B)|a>) = λ(f(B)|a>) for some scalar λ. This generally requires a specific relationship between the matrices A and B, as well as the function f. In many cases, this condition is not automatically satisfied and would need to be verified for the specific matrices and function in question.

Can you provide an example where f(B)|a> is an eigenvector of A?

Consider matrices A and B where A and B commute (i.e., AB = BA), and let f be the identity function (f(x) = x), so that f(B) = B. If |a> is an eigenvector of both A and B, then B|a> = μ|a> for some scalar μ. In this case, f(B)|a> = B|a> = μ|a>, and since A and B commute, A(B|a>) = B(A|a>) = B(λ|a>) = λ(B|a>) = λμ|a>, showing that B|a> is also an eigenvector of A.

Is the relationship between A and B crucial for determining if f(B)|a> is an eigenvector of A?

Yes, the relationship between A and B is crucial. If A and B do not have a specific commutative or other structural relationship, it is generally not guaranteed that f(B)|a> will be an eigenvector of A. The specific properties of A, B, and the function f all play a critical role in determining whether f(B)|a> maintains the eigenvector property with respect to A.
