I'm teaching quantum mechanics to freshmen at my college, but I'm stuck on a very basic concept that I always took for granted. I'm a chemistry major, not a physicist, but I thought it was something I should clearly understand. I am embarrassed to ask this question, but I'd rather be embarrassed now...
My question relates to subsection 2.2.1 of [this article][1]. This subsection recalls the work of Lindgren, Rue, and Lindström (2011) on Gaussian Markov Random Fields (GMRFs). The subsection starts with a two-dimensional regular lattice where the 4 first-order neighbours of $u_{i,j}$ are...
$$\begin{align*}
E[(A+B)^2]&=E[A^2+2AB+B^2]\\
&=E[A^2]+2E[AB]+E[B^2]\\
&=2E[AB]+E[B^2].
\end{align*}$$
Can the terms ##2E[AB]## and ##E[B^2]## be simplified any more? Thanks, friends.
Problem: Let ##X_0,X_1,\dots,X_n## be independent random variables, each distributed uniformly on ##[0,1]##. Find ##E\left[ \min_{1\leq i\leq n}\vert X_0 -X_i\vert \right]##.
Would any member of Physics Forums take the trouble to explain in full detail the following author's solution to this...
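While waiting for a full explanation, a quick Monte Carlo estimate of the stated expectation may help check whatever closed form the author derives (the function and variable names below are illustrative, not from the original post):

```python
import random

# Monte Carlo estimate of E[ min_{1<=i<=n} |X0 - Xi| ] for X0,...,Xn ~ U[0,1].
# This is only a numerical sanity check, not the author's derivation.
def estimate_min_gap(n, trials=200_000):
    total = 0.0
    for _ in range(trials):
        x0 = random.random()
        total += min(abs(x0 - random.random()) for _ in range(n))
    return total / trials

print(estimate_min_gap(3))  # compare against the analytic answer for n = 3
```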
Suppose ##X## is a nonnegative random variable and ##p\in (0,\infty)##. Show that ##\mathbb{E}[X^p] < \infty## if and only if ##\sum_{n = 1}^\infty n^{p-1}P(X \ge n) < \infty##.
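One standard route, sketched here (the usual layer-cake identity, which may or may not be the intended proof): write
$$\mathbb{E}[X^p]=\int_0^\infty p\,t^{p-1}\,P(X>t)\,dt,$$
and since ##t\mapsto P(X>t)## is non-increasing, comparing the integral over each interval ##[n,n+1]## with ##n^{p-1}P(X\ge n)## shows that the integral and the series ##\sum_{n\ge 1} n^{p-1}P(X\ge n)## converge or diverge together.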
hi all
how do I prove that
$$
<A^{n}>=<A>^{n}
$$
It seems intuitive, but how do I rigorously prove it? My attempt: the LHS can be written as
$$
\bra{\Psi}\hat{A}.\hat{A}.\hat{A}...\ket{\Psi}=\lambda^{n} \bra{\Psi}\ket{\Psi}=\lambda^{n}\delta_{ii}=\lambda^{n}
$$
and the RHS equal:
$$...
Dear Forum,
I am solving for the expectation value of the kinetic energy for the deuteron (Krane problem 4.3). I must be missing something since this has become far more complicated than I remember.
The problem is as follows:
## <T> = \frac{\hbar^{2}}{2m} \int_{0}^{\infty}...
We have three random variables (or vectors) A, B, C. The condition is that A and B are independent, and B and C are independent RVs. But A and C are the same random variable with the same distribution. So how can we determine E{ABC}? Can I write this as E{ABC} = E{A E{B} C}?
Problem:
In a box there are ##120## balls, with ##X## of them being white and ##120 - X## being red, for a random variable ##X##.
We know that ##E[X] = 30##. We are taking out ##k## balls randomly and with replacement (we return each ball we take out, so there is equal probability for each...
Apart from the usual integral method, are there any other ways to find the expectation value of momentum? I know one way is by using Ehrenfest's theorem, relating it to the time derivative of the expectation value of the position operator.
Even using the uncertainty principle, we might get it if we know the...
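For reference, the Ehrenfest route mentioned above amounts to (for a Hamiltonian of the form ##H=\frac{p^2}{2m}+V(x)##)
$$\langle p\rangle = m\,\frac{d\langle x\rangle}{dt},$$
so knowing ##\langle x\rangle(t)## is enough to read off ##\langle p\rangle## without doing the momentum integral directly.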
so from the Fourier transform we know that
##\Psi(r)=\frac{1}{2\pi\hbar}\int\varphi(p)\,e^{ipr/\hbar}\,dp##
I proved that ##\langle p\rangle=\int\varphi(p)^*\,p\,\varphi(p)\,dp## from ##\langle p\rangle=\int\Psi(r)^*\,\hat{p}\,\Psi(r)\,dr##
so will the same hold for any operator?
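If it helps, the same Plancherel/Parseval argument goes through for any operator that is a function of ##\hat p## alone (a sketch, not a full proof):
$$\langle f(\hat p)\rangle=\int\Psi(r)^*\,f(\hat p)\,\Psi(r)\,dr=\int\varphi(p)^*\,f(p)\,\varphi(p)\,dp,$$
whereas operators involving ##\hat r## act in the momentum representation as differential operators, ##\hat r \to i\hbar\,\partial/\partial p##.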
When the expectation value of spin in the z direction for one particle is zero and I make measurements for an even number of particles in the same state, do I get exactly half to be spin up and half to be spin down along the z direction? More generally, what does spin expectation value for one...
The shape of the sampling distribution of the Pearson product moment correlation coefficient depends on the size of the sample. Is the expectation of the sampling distribution of the Pearson product moment correlation coefficient always equal to the population correlation coefficient, regardless...
Dear Everybody,
I am about to teach my first course, College Algebra, at my university as an instructor of record. Most of the students taking this course are just fulfilling a liberal arts requirement for critical thinking. I feel like I have too high expectations of my students when I should not have too...
Hi
A theorem states that if ##V(x, t) \geq V_0## then ##\langle E\rangle## is real and ##\langle E\rangle \geq V_0## for any normalizable state. The proof contains the following line
$$\langle E\rangle = \frac{\hbar^2}{2m}\int \nabla\psi^*\cdot\nabla\psi\, d^3x + \int V\,\psi^*\psi\, d^3x \ \geq\ \int V_0\,\psi^*\psi\, d^3x$$
Can anybody explain why that inequality is true ?
Thanks
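In case it helps, the inequality presumably comes from two separate observations:
$$\frac{\hbar^2}{2m}\int \nabla\psi^*\cdot\nabla\psi\, d^3x=\frac{\hbar^2}{2m}\int |\nabla\psi|^2\, d^3x\ \geq\ 0, \qquad \int V\,\psi^*\psi\, d^3x\ \geq\ \int V_0\,\psi^*\psi\, d^3x,$$
the second holding because ##V\geq V_0## pointwise and ##\psi^*\psi\geq 0##.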
So, I have a Hamiltonian for the screening effect, written as:
$$ H=\sum_{k}^{}\epsilon_{k}c_{k}^{\dagger}c_{k}+ \frac{1}{\Omega}\sum_{k,q}^{}V(q,t)c_{k+q}^{\dagger}c_{k} $$
And I have to find an equation for the time evolution of the expected value of the operator ##c_{k-Q}^{\dagger}c_{k}##.
I...
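A sketch of the standard starting point (assuming the operator has no explicit time dependence) is the Ehrenfest/Heisenberg equation of motion
$$i\hbar\,\frac{d}{dt}\langle c_{k-Q}^{\dagger}c_{k}\rangle=\langle [\,c_{k-Q}^{\dagger}c_{k},\,H\,]\rangle,$$
after which the commutator is evaluated term by term using the fermionic anticommutation relations.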
I'm reading a website where they're doing a derivation. Within the derivation they write $$E(X_n | X_{n-1}) = X_{n-1} + f \implies E(X_n) = E(X_{n-1} ) + f$$. Evidently the implication stems from the law of iterated expectation, but I can't see how. If it helps, the question asked is "what is...
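A sketch of the step, assuming ##f## is a deterministic constant: take expectations of both sides and use the tower property ##E[E(X_n\mid X_{n-1})]=E(X_n)##, so
$$E(X_n)=E\big[E(X_n\mid X_{n-1})\big]=E[X_{n-1}+f]=E(X_{n-1})+f.$$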
Hello there, for the above problem the wavefunctions can be shown to be:
$$\psi_{n,l}=\left[ \frac {b}{2\pi l_b^2} \frac{n!}{2^l(n+l)!}\right]^{\frac12} \exp{(-il\theta - \frac {r^2\sqrt{b}}{4l_b^2})} \left( \frac {r\sqrt{b}}{l_b}\right)^lL_n^l(\frac {r^2b}{4l_b^2})$$
Here ##b = \sqrt{1 +...
I first normalized the given wavefunction and found the value of ##n## that satisfies the normalization condition. I then used ##E = \langle E\rangle = \frac{\pi^2\hbar^2 n^2}{2m}## to get the expectation value of energy. Assuming that this was the right process, I'm now trying to find ##\langle E^2\rangle## using the same equation...
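For what it's worth, if the normalized wavefunction is a superposition of energy eigenstates ##\psi_n## with energies ##E_n## (an assumption on my part, since the full problem isn't shown), then
$$\langle E\rangle=\sum_n |c_n|^2 E_n,\qquad \langle E^2\rangle=\sum_n |c_n|^2 E_n^2,$$
so ##\langle E^2\rangle## is generally not obtained by simply squaring the ##\langle E\rangle## formula.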
In non-relativistic quantum mechanics, the expectation value of an operator ##\hat{O}## in state ##\psi## is defined as $$\langle\psi |\hat{O}|\psi\rangle=\int\psi^* \hat{O}\, \psi\, dx.$$
Since the scalar product in relativistic quantum mechanics has been altered into $$|\psi|^2=i\int\left(\psi^*\frac{\partial...
This is the problem;
Find my working toward the solution below;
Find the mark-scheme solution below;
Any other approach (a shorter way of doing it) would be appreciated...
In this Wikipedia article it is said:
"If the quantum field theory can be accurately described through perturbation theory, then the properties of the vacuum are analogous to the properties of the ground state of a quantum mechanical harmonic oscillator, or more accurately, the ground state of a...
In the thermal interpretation, the collection of all q-expectations (and q-correlations) is the state of a system. The interpretation of q-expectations is used only to provide an ontology, the apparent randomness is analysed and explained separately. This may be non-intuitive. Callen's criterion...
This is a question from a mathematical statistics textbook, used in the first and most basic mathematical statistics course for undergraduate students. This exercise follows the chapter on nonparametric inference. An attempt at a solution is given. Any help is appreciated.
Exercise:
Suppose...
This is what I did:
Let Y = number of sixes occurred when ##n## dice are thrown
Y ~ B (n, 1/6)
E(Y) = ##\frac{1}{6}n##
Let Z = amount of money received → Z = ##\frac{1}{2}Y##
E(Z) = E(1/2 Y) = 1/2 E(Y) = ##\frac{1}{12}n##
I got the answer but I am not sure about my working because I didn't...
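A quick Monte Carlo check of ##E(Z)=\frac{1}{12}n## (purely a numerical sanity check; the names below are illustrative):

```python
import random

# Estimate E(Z) where Z = 0.5 * (number of sixes in n fair-die throws).
# The claim above is E(Z) = n/12.
def estimate_winnings(n, trials=100_000):
    total = 0.0
    for _ in range(trials):
        sixes = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
        total += 0.5 * sixes
    return total / trials

print(estimate_winnings(12))  # should be close to 12/12 = 1
```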
Hi all,
I found this notation of expectation values in a NMR text.
In class, I learned that expectation values are given by
$$<\hat{X}>=\int_{-\infty}^\infty\psi^*x\psi dx$$
why does this textbook divide by the integral of probability density ##\int \psi^*\psi dx##?
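Presumably the NMR text does not assume ##\psi## is normalized; in that case the general form is
$$\langle\hat{X}\rangle=\frac{\int_{-\infty}^{\infty}\psi^* x\,\psi\, dx}{\int_{-\infty}^{\infty}\psi^*\psi\, dx},$$
which reduces to the expression from class whenever ##\int\psi^*\psi\,dx=1##.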
I know that the eigenstates of the momentum operator are given by ##e^{ikx}##.
To construct a real-valued and normalized wavefunction out of these eigenstates,
I have
##\psi(x) = \frac{e^{ikx} + e^{-ikx}}{\sqrt{2}}##
But my trouble is, how do I find the expectation value of the momentum operator ##\langle p\rangle## using this...
The expectation value of the kinetic energy operator in the ground state ##\psi_0## is given by
$$\langle\psi_0|\frac{\hat{p}^2}{2m}|\psi_0\rangle$$
$$=\langle\psi_0|\frac{1}{2m}\Big(-i\sqrt{\frac{\hbar m\omega}{2}}(\hat{a}-\hat{a}^{\dagger})\Big)^2|\psi_0\rangle$$
$$=\frac{-\hbar...
A recent thread by @coolcantalope was accidentally deleted by a Mentor (I won't say which one...), so to restore it we had to use the cached version from Yahoo.com. Below are the posts and replies from that thread.
The cached 2-page thread can be found by searching on the thread title, and is...
Given an exponentially distributed random variable $X\sim \exp(1)$, I need to find $\mathbb{E}[P_v]$, where $P_v$ is given as:$$ P_v=
\left\{
\begin{array}{ll}
a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right), & \text{if}\ \frac{P_s X}{r^\alpha}\geq P_a,\\
0...
Given that $X$ is an exponentially distributed continuous random variable, $X\sim \exp(1)$, and $g(x)$ is as below, how can I find the expectation of $g(x)$ for the condition that $x\geq Q$, i.e. $\mathbb{E}[g(x)\ | \ x\geq Q]$?
$$g(x) = \frac{A}{\exp(-bQ+c)}\Big(\frac{1 + \exp(-bQ+c)}{1 +...
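A sketch of the general recipe, assuming ##Q\geq 0## (so that ##P(X\geq Q)=e^{-Q}## for ##X\sim\exp(1)##):
$$\mathbb{E}[g(X)\mid X\geq Q]=\frac{\int_Q^\infty g(x)\,e^{-x}\,dx}{P(X\geq Q)}=e^{Q}\int_Q^\infty g(x)\,e^{-x}\,dx,$$
after which the specific ##g## above is substituted and the integral evaluated or bounded.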
I may have misunderstood the expectation value, but if not then with the Copenhagen Interpretation it is easy to understand the expectation value for a wave function. It is just based on the probability of each event. If there were 4 possible events, and the probability of the event having a...
In the 3rd edition of the Introduction to Quantum Mechanics textbook by Griffiths, he normally writes the expectation value as <x>, for example. But, in Chapter 3, when he derives the uncertainty principle, he keeps the operator notation in the expectation value. See the pasted...
How can I prove the Cauchy distribution has no moments?
##E(X^n)=\int_{-\infty}^\infty\frac{x^n}{\pi(1+x^2)}\ dx.##
I can prove for myself, letting ##n=1## or ##n=2##, that it does not have that moment. However, how would I prove for ALL ##n## that the Cauchy distribution has no moments?
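One possible route for general ##n\geq 1## is a comparison on the tail (a sketch, using ##1+x^2\leq 2x^2## for ##x\geq 1##):
$$\int_1^\infty\frac{x^n}{\pi(1+x^2)}\,dx\ \geq\ \frac{1}{2\pi}\int_1^\infty x^{n-2}\,dx=\infty\quad\text{for every }n\geq 1,$$
so the defining integral fails to converge absolutely for every positive integer ##n##.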
I cannot solve this problem:
However, I have a similar problem with a proper solution:
Can you please guide me to solve my question? I am not able to relate Y R (from the first question) and U (from the second question), and solve the question at the top above...
$\newcommand{\doop}{\operatorname{do}}$
Problem: (This is from Study question 4.3.1 from Causal Inference in Statistics: A Primer, by Pearl, Glymour, and Jewell.) Consider the causal model in the following figure and assume that $U_1$ and $U_2$ are two independent Gaussian variables, each with...
I would like to demonstrate equation (1) below in the general form of the log-likelihood:
##E\Big[\frac{\partial \mathcal{L}}{\partial \theta} \frac{\partial \mathcal{L}^{\prime}}{\partial \theta}\Big]=E\Big[\frac{-\partial^{2} \mathcal{L}}{\partial \theta \partial...
Hi,
I have a question which asks me to use the generalised Ehrenfest Theorem to find expressions for
##\frac{d\langle S_x\rangle}{dt}## and ##\frac{d\langle S_y\rangle}{dt}## - I have worked out ##\langle S_x\rangle## and ##\langle S_y\rangle## earlier in the question.
Since the generalised Ehrenfest Theorem takes the form...
Summary:: Linear Quantum harmonic oscillator and expectation value of the potential energy (time dependent)
Hello, I have attached a picture of the full question, but I am stuck on part b). I have found the expectation values <momentum> and <total energy>. However, I am struggling with...
I am looking for the expectation of a fraction of Gauss hypergeometric functions.
$$E_X\left[\frac{{}_2F_1\left(\begin{matrix}x+a+1\\x+a+1\end{matrix},a+1,c\right)}{{}_2F_1\left(\begin{matrix}x+a\\x+a\end{matrix},a,c\right)}\right]=?$$
Are there any identities that could be used to simplify or...
Given that operator ##S_M##, which consists entirely of ##Y## and ##Z## Pauli operators, is a stabilizer of some graph state ##G## i.e. the eigenvalue equation is given as ##S_MG = G## (eigenvalue ##1##).
In the paper 'Graph States as a Resource for Quantum Metrology' (page 3) it states that...
I am struggling to figure out how to calculate the expectation value because I am finding it hard to do something with the exponential. I tried using Euler's formula and some commutator relations, but I am always left with some term like ##\exp(L_z)## that I am not sure how to get rid of.
I can show that ##\frac{d}{dt} \langle \psi (t) \vert X^2 \vert \psi (t) \rangle = \frac{1}{m} \langle \psi (t) \vert PX+XP \vert \psi (t) \rangle##.
Taking another derivative with respect to time of this, I get ##\frac{d^2}{dt^2} \langle \psi (t) \vert X^2 \vert \psi (t) \rangle = \frac{i}{m...