A few questions about Griffiths book

  • #36
Apologies for my loose notation in previous posts.

Isaac0427 said:
So, the expectation value of ##\hat Q## is the same as the expectation value of its eigenvalues, correct?

That's correct. The eigenvalue of an observable is the expectation value of the corresponding operator when the wavefunction is the corresponding eigenstate. In math, ##q_{n} = \int \psi^{*}_{n} \hat{Q} \psi_{n} dx##, where ##\psi_{n}## is the n-th eigenstate corresponding to ##q_{n}##, the n-th eigenvalue of the operator ##\hat{Q}##. That is, in part, why you can rewrite the expectation value of ##\hat{Q}## in any state in terms of its eigenvalues ##q_{n}##. If an arbitrary state is given by ##\psi = \sum_{n} c_{n} \psi_{n}##, then the connection between the expectation value of ##\hat{Q}## and the eigenvalues ##q_{n}## is ##\langle \hat{Q} \rangle = \sum_{n} |c_{n}|^{2} q_{n}##. (Watch out for non-normalizable wavefunctions, though; these formulas do not apply in that case.)

Nugatory said:
##\hbar k## is also an operator (albeit a rather trivial one), and its expectation value is ##\int\psi^*\hbar k\psi=\hbar k\int\psi^*\psi=\hbar k##.

I may be wrong, but I think this is a bit misleading. Treating ##k## as a constant that can be pulled out of the integral assumes the wavefunction is something like a perfectly single-frequency plane wave, which isn't generally the case. In general, the wavevector operator is $$\hat{k}_{x} = -i \frac{\partial}{\partial x}$$ and its expectation value is $$\langle \hat{k}_{x} \rangle = \int_{-\infty}^{\infty} \psi^{*}(x) (-i) \frac{\partial \psi}{\partial x} dx.$$
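To make this concrete, here's a quick numerical sketch (my own toy illustration, not from the book): a Gaussian wavepacket ##e^{ik_0 x} e^{-x^2/4\sigma^2}## is not a single plane wave, yet evaluating the integral above still gives ##\langle \hat{k}_x \rangle = k_0##.

```python
import numpy as np

# Gaussian wavepacket psi(x) = N exp(-x^2 / (4 sigma^2)) exp(i k0 x):
# not a plane wave, but its expectation value <k_x> should equal k0.
x = np.linspace(-50.0, 50.0, 20001)
dx = x[1] - x[0]
sigma, k0 = 2.0, 1.5
norm = (2.0 * np.pi * sigma**2) ** -0.25
psi = norm * np.exp(-x**2 / (4.0 * sigma**2)) * np.exp(1j * k0 * x)

# <k_x> = integral of psi* (-i d/dx) psi dx, derivative taken numerically
dpsi = np.gradient(psi, dx)
k_exp = np.sum(psi.conj() * (-1j) * dpsi).real * dx

print(k_exp)  # close to k0 = 1.5
```

The imaginary part of the integrand integrates to zero by symmetry, which is why only the real part is kept.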

Isaac0427 said:
So, given that, wouldn't the expectation value of ##\hbar k## (as an observable, not an operator) be the same as the expectation value of ##\hat p##? Evaluating the integral while keeping in mind that the momentum operator is Hermitian appears to give me momentum, or ##\hbar k##.

Can you show how you evaluated it? In general, one needs a wavefunction to take expectation values, so I'm not sure what you mean. I may be missing something.

Isaac0427 said:
Can you please put this in terms of operators and observables? Thanks.

The point I'm trying to get across is that the observable is the operator (they are the same thing, as far as the math is concerned), and the value you'd see in a measurement (on average) is the expectation value of that operator. In short, an observable in quantum mechanics is a Hermitian linear operator ##\hat{Q}## that can be measured in the lab, giving an average measured value of ##\langle \hat{Q} \rangle## (the expectation value) and a standard deviation of the measured values given by ##\sqrt{\langle (\hat{Q} - \langle \hat{Q} \rangle)^{2} \rangle}##. (I know I haven't answered your other question about this definition yet; I'll address that in the next section of this post.)
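One can simulate this picture directly (a toy example of mine, with made-up eigenvalues and coefficients): individual measurement outcomes are the eigenvalues ##Q_i##, drawn with probabilities ##|c_i|^2##, and the sample mean and standard deviation should match the two formulas above.

```python
import numpy as np

# Toy measurement simulation: outcomes are the eigenvalues Q_i, drawn with
# probabilities |c_i|^2. The sample mean and standard deviation should match
# <Q> and sqrt(<(Q - <Q>)^2>).
rng = np.random.default_rng(2)
evals = np.array([-1.0, 0.5, 2.0])   # assumed eigenvalues of Q
c = np.array([0.6, 0.8j, 0.0])       # assumed expansion coefficients
p = np.abs(c)**2
p = p / p.sum()                      # guard against rounding in |c_i|^2

exp_Q = np.sum(p * evals)                        # <Q>
std_Q = np.sqrt(np.sum(p * (evals - exp_Q)**2))  # sqrt(<(Q - <Q>)^2>)

outcomes = rng.choice(evals, size=500_000, p=p)
print(outcomes.mean(), outcomes.std())  # close to exp_Q and std_Q
```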

Isaac0427 said:
Again, this raises the question of why we can say that the difference between the momentum operator and the expectation value of momentum has anything to do with the standard deviation.

The definition of the standard deviation of an observable in terms of the difference between its operator and its expectation value is identical (at least for position and momentum, which is what I can prove) to the usual definition of the standard deviation of a continuous random variable in general statistics. (This is what I attempted to show two posts back.) So, before I can answer your question, I need to know: are you confused at all by the general definition (outside of quantum mechanics) of the standard deviation of a continuous random variable ##x##, namely ##\sigma_{x} = \sqrt{\int (x - \langle x \rangle)^{2} \rho(x) dx}##, where ##\rho(x)## is the probability density function and ##\langle x \rangle## is the expected value of ##x##? In other words, are you comfortable with the general definition given in statistics but confused about the way it translates into quantum mechanics, or are you confused about the general definition used in statistics?
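For reference, here is that general statistical definition evaluated numerically (my own illustration, using an assumed normal density with mean 1 and standard deviation 2; nothing quantum about it yet):

```python
import numpy as np

# Evaluate sigma_x = sqrt( integral (x - <x>)^2 rho(x) dx ) on a grid,
# for an assumed normal density with mean 1 and standard deviation 2.
x = np.linspace(-19.0, 21.0, 100001)   # mean 1 +/- 10 standard deviations
dx = x[1] - x[0]
mu_true, sigma_true = 1.0, 2.0
rho = np.exp(-(x - mu_true)**2 / (2 * sigma_true**2)) \
      / (sigma_true * np.sqrt(2 * np.pi))

mean = np.sum(x * rho) * dx                        # <x> = integral x rho dx
sigma = np.sqrt(np.sum((x - mean)**2 * rho) * dx)  # the definition above

print(mean, sigma)  # close to 1.0 and 2.0
```

The quantum version replaces ##\rho(x)## with ##|\psi(x)|^2##, which is exactly the translation in question.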
 
  • #37
Isaac0427 said:
So, the expectation value of ##\hat Q## is the same as the expectation value of its eigenvalues, correct?
[After some sleep, breakfast and a couple of cups of coffee, I rewrote this a bit to make the notation more consistent]

Suppose an operator ##\hat Q## has a set of eigenstates ##\psi_i## with corresponding eigenvalues ##Q_i##, so that ##\hat Q \psi_i = Q_i \psi_i##. These eigenstates are orthogonal, i.e. ##\int \psi_i^* \psi_j dx = 0## for ##i \ne j##. Assume they're also normalized, i.e. ##\int \psi_i^* \psi_i dx = 1##.

If the state is an eigenstate of ##\hat Q##, i.e. one of the ##\psi_i##, then the expectation value of ##\hat Q## for that state is the corresponding eigenvalue, ##Q_i##: $$\langle \hat Q \rangle = \int \psi_i^* \hat Q \psi_i dx = \int \psi_i^* Q_i \psi_i dx = Q_i \int \psi_i^* \psi_i dx = Q_i$$ If the state is not an eigenstate of ##\hat Q##, then it can be written as a linear combination of the eigenstates: $$\psi = \sum_i c_i \psi_i$$ The expectation value of ##\hat Q## for this state is $$\langle \hat Q \rangle = \int \psi^* \hat Q \psi dx = \int \left( \sum_i c_i^* \psi_i^* \right) \hat Q \left( \sum_j c_j \psi_j \right) dx$$ which, using the orthonormality of the ##\psi_i##, works out to be $$\langle \hat Q \rangle = \sum_i c_i^* c_i Q_i = \sum_i |c_i|^2 Q_i$$ That is, the expectation value of ##\hat Q## is a weighted average of the eigenvalues ##Q_i##, which I suppose one could call the "expectation value of the eigenvalues", although I don't remember ever seeing anyone actually use that phrase. (Corrections and references welcome!)
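A finite-dimensional analogue may make this concrete (my own sketch, with a random 4×4 Hermitian matrix standing in for ##\hat Q## and the integrals becoming inner products):

```python
import numpy as np

# Finite-dimensional check of <Q> = sum_i |c_i|^2 Q_i: a random 4x4
# Hermitian matrix stands in for the operator, and psi is an arbitrary
# normalized state expanded in Q's eigenbasis.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q = (A + A.conj().T) / 2              # Hermitian "observable"

evals, evecs = np.linalg.eigh(Q)      # eigenvalues Q_i, eigenstates (columns)

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)            # normalize the state

c = evecs.conj().T @ psi              # expansion coefficients c_i

direct = (psi.conj() @ Q @ psi).real      # <Q> computed directly
weighted = np.sum(np.abs(c)**2 * evals)   # sum_i |c_i|^2 Q_i

print(direct, weighted)  # the two values agree
```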
 
  • #38
Thanks guys, I think I understand it now.
 
  • #39
Something which no one seems to have pointed out (I may have missed it and it may not have been the issue anyway) is (using stats notation, where ##\mu = E[X]##):

##Var(X) = E[(X-\mu)^2] = E[X^2 - 2\mu X + \mu^2] = E[X^2] - 2\mu E[X] + \mu^2 = E[X^2] - 2\mu^2 + \mu^2 = E[X^2] - \mu^2##

Hence:

##Var(X) = E[(X-\mu)^2]## (by definition)

and

##Var(X) = E[X^2] - E[X]^2##
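A quick sample-based check of the identity (a toy example of mine, sampling an assumed exponential distribution whose true variance is 1):

```python
import numpy as np

# Verify Var(X) = E[(X - mu)^2] = E[X^2] - E[X]^2 on samples drawn
# from an exponential distribution with scale 1 (true variance = 1).
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)

mu = x.mean()
var_def = ((x - mu)**2).mean()        # E[(X - mu)^2], the definition
var_identity = (x**2).mean() - mu**2  # E[X^2] - E[X]^2, the shortcut

print(var_def, var_identity)  # both near 1.0
```

The same algebra is what turns ##\langle (\hat{Q} - \langle \hat{Q} \rangle)^2 \rangle## into ##\langle \hat{Q}^2 \rangle - \langle \hat{Q} \rangle^2## in the quantum case.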
 
