#876 saviourmachine
The math is not difficult at all...
Matrix mechanics
DrDick said:
With regard to the issue of mathematics and simplicity, do you have any knowledge of matrix mechanics or matrix multiplication? I am wondering if I will have to teach you the subject as it comes up pretty quickly from where we are at the moment.

Aah, the basic concepts of matrix multiplication and so on I know; I did my bachelor's in electrical engineering (e.m. waves etc.). I'll say so if something is too difficult for me. I don't know a thing about Heisenberg's matrix mechanics, and I have forgotten a lot about Schrödinger's equation; it was thrown at me in a course on semiconductor physics.
Probability theory
DrDick said:
(Just as an aside, there is an individual out there who has some major difficulties with probability theory and is getting a reception roughly equivalent to the one I manage to generate with authorities. I have a strong suspicion his complaints are very rational.)

Interesting. And that's not Stephen Jay Gould in "Full House", I guess... Who is it? What is his/her message?
Dot product
DrDick said:
Now add to the above the standard definition of a "dot" product of vectors (seen as a definition of a procedure) and the notation [itex]\vec{G}^\dagger \cdot \vec{G}[/itex] results in a sum over a collection of positive real numbers which must be positive definite. Lastly, the sum over all possibilities (or the integral if the number of possibilities is infinite) must be greater than any sum (or integral) over any subset of possibilities. It follows that
[tex] 1 \geq \frac{ \int\int\cdots\int_{\text{subset}} \vec{G}^\dagger \cdot \vec{G} \, d^n x }{ \int\int\cdots\int_{\text{all}} \vec{G}^\dagger \cdot \vec{G} \, d^n x } \geq 0 [/tex]
so long as the denominator is summed (or integrated) over all possibilities.

I am clueless about what you're doing over here. You defined a universal function G, linking a list of input numbers to a list of results. You defined its adjoint. Okay. And now you're defining a dot product of these functions. Does that have any meaning? And subsequently taking a volume integral: does that mean anything? Or are those conventional mathematical tricks that always apply?
Recapitulation: taking into consideration the table C we talked about, G maps the B's in that table to another table with the same number of entries, but with only two columns (the real and the imaginary part). The dot product of [itex]\vec{G}[/itex] and [itex]\vec{G}^\dagger[/itex] leads us to another table with one column. This column is integrated n times, each time over one of its n elements.
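To see the inequality concretely I tried a small toy sketch of my own (the G values and the subset are made up, just to check the claim):

[code]
# My own toy check: a complex G over ten "possibilities".
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=10) + 1j * rng.normal(size=10)

# G-dagger dot G: every term |G_j|^2 is a non-negative real number.
terms = (np.conj(G) * G).real
assert np.all(terms >= 0)

full = terms.sum()        # "integral" over all possibilities
subset = terms[:4].sum()  # any subset of possibilities

print(subset / full)      # always lies between 0 and 1
[/code]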
Psi function
DrDick said:
It follows that, if one defines the function [itex]\vec{\Psi}[/itex] via
[tex] \vec{\Psi}(\vec{x},t) \equiv \frac{ \vec{G}(\vec{x},t) }{ \sqrt{ \oint \vec{G}^\dagger \cdot \vec{G} \, dv } }[/tex]
we can "define" the probability of the [itex]B_j[/itex] to be given by
[tex]
P(\vec{x},t) = \vec{\Psi}^\dagger(\vec{x},t)\cdot\vec{\Psi}(\vec{x},t)dv
[/tex]
where [itex]dv \equiv d^n x[/itex].

Ah, there we have our old familiar P again. I don't know how you achieved that. It's pretty if it's right; it's like Euler's formula connecting pi, e and i in some magical way. Can you explain why you took a square root? Can you explain why the probability P is given by [itex]\vec{\Psi}[/itex] and its adjoint? What kind of value is the denominator?
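Trying the square root for myself (again a toy numerical sketch; G, the grid size and dv are my own made-up numbers): it seems to be exactly what makes P add up to 1.

[code]
# My own toy sketch of the normalization step: dividing G by the square
# root of the integral of G-dagger dot G makes P sum to exactly 1.
import numpy as np

rng = np.random.default_rng(1)
dv = 0.1                                 # made-up volume element d^n x
G = rng.normal(size=50) + 1j * rng.normal(size=50)

norm = np.sqrt((np.conj(G) * G).real.sum() * dv)   # the denominator
Psi = G / norm

P = (np.conj(Psi) * Psi).real * dv       # P = Psi-dagger . Psi dv
print(P.sum())                           # 1.0 -- a proper probability
[/code]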
Rewriting the psi function
DrDick said:
Finally, since we want to work with [itex]\vec{\Psi}[/itex], we need to re-express the relationships developed earlier in terms of the probability. The relationships already written may be rewritten as
[tex] \sum_{i=1}^n \frac{\partial}{\partial x_i}\vec{\Psi} \,=\, i\kappa\vec{\Psi} \quad\text{and}\quad \frac{\partial}{\partial t}\vec{\Psi} \,=\, im\vec{\Psi} [/tex]
This can be proved quite simply. The complex conjugates of the above expressions are,
[tex] \sum_{i=1}^n \frac{\partial}{\partial x_i}\vec{\Psi}^\dagger \,=\, -i\kappa\vec{\Psi}^\dagger \quad\text{and}\quad \frac{\partial}{\partial t}\vec{\Psi}^\dagger \,=\, -im\vec{\Psi}^\dagger. [/tex]

And this is quite difficult for me too. Is this matrix mechanics?
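Working it out for myself, using that [itex]\kappa[/itex] and m have to be real for the stated conjugates to hold: taking the complex conjugate term by term just flips the sign of i, e.g. for the time relation
[tex] \left( \frac{\partial}{\partial t}\vec{\Psi} \right)^\dagger \,=\, \left( im\vec{\Psi} \right)^\dagger \,=\, -im\vec{\Psi}^\dagger. [/tex]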
Result
DrDick said:
This, together with the chain rule of calculus, guarantees that any [itex]\vec{\Psi}[/itex] which satisfies the above relations also satisfies the relation on the probability stated earlier. In the interest of saving space, I will show the result explicitly for the time derivative (the derivatives with respect to the arguments [itex]x_i[/itex] go through exactly the same way).
[tex] \frac{\partial}{\partial t}P(\vec{x},t) \,=\, \left( \frac{\partial}{\partial t}\vec{\Psi}^\dagger \right) \cdot \vec{\Psi} + \vec{\Psi}^\dagger \cdot \left( \frac{\partial}{\partial t}\vec{\Psi} \right) \,=\, -im\,\vec{\Psi}^\dagger \cdot \vec{\Psi} + im\,\vec{\Psi}^\dagger \cdot \vec{\Psi} \,=\, 0. [/tex]

And, yes, the result is that the derivative of P with respect to t is zero. And I trust that the others will be zero as well. It's difficult for me to follow this, but I hope that I lack only a few basic physical or mathematical concepts. If you're not disappointed, I hope you'll continue your lectures. See you soon!
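P.S. I couldn't resist checking the time-derivative result with sympy; just my own sketch with a one-component stand-in [itex]Ae^{imt}[/itex] for [itex]\vec{\Psi}[/itex] (which satisfies [itex]\frac{\partial}{\partial t}\vec{\Psi} = im\vec{\Psi}[/itex]):

[code]
# Toy symbolic check that Psi-dagger . Psi is constant in time.
import sympy as sp

t, m = sp.symbols('t m', real=True)
A = sp.symbols('A')                  # arbitrary complex amplitude

Psi = A * sp.exp(sp.I * m * t)       # satisfies dPsi/dt = i*m*Psi
P = sp.conjugate(Psi) * Psi          # Psi-dagger . Psi

print(sp.simplify(sp.diff(P, t)))    # prints 0: P does not change in time
[/code]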
Andy