Time Evolution and Hamiltonian Problem

In summary: have you verified that the eigenvectors are orthonormal? In other words, have you verified that v·w = 0 for every pair of distinct eigenvectors v and w (and v·v = 1 for each)? The "projection matrix" formula shown above is correct if you are given an orthonormal basis, and it does give the 3x3 identity matrix if you use the (normalized) eigenvectors as a basis. But it doesn't seem like the eigenvectors were actually used as the basis in that formula.
  • #1
phil ess

Homework Statement



Consider a physical system with a three-dimensional state space. In this space the Hamiltonian is represented by the matrix:

[tex]H = \hbar\omega \left( \begin{array}{ccc}
0 & 0 & 2 \\
0 & 1 & 0 \\
2 & 0 & 0 \end{array} \right)[/tex]

The state of the system at t = 0 in coordinate representation is:

[tex]s(t=0) = \left( \begin{array}{c}
\sqrt{2} \\
1 \\
1 \end{array} \right)[/tex]

Find the state s(t) for t ≠ 0 by doing the following steps:

i) Find the eigenvectors and eigenvalues of the Hamiltonian.
ii) Expand the initial state in eigenstates of the Hamiltonian.
iii) Use your knowledge of the time evolution of the eigenstates to find the state of the system s(t).

The Attempt at a Solution



i) I can get the eigenvalues:

[tex]\chi(\lambda) = \left| \begin{array}{ccc}
-\lambda & 0 & 2 \\
0 & 1-\lambda & 0 \\
2 & 0 & -\lambda \end{array} \right| = (1-\lambda)(\lambda^2-4) = 0[/tex]

which gives, in units of ħω:

[tex]\lambda = 2,\,-2,\,1[/tex]

and then the eigenvectors are:

[tex]\left( \begin{array}{c}
1 \\
0 \\
1 \end{array} \right) \qquad \left( \begin{array}{c}
1 \\
0 \\
-1 \end{array} \right) \qquad \left( \begin{array}{c}
0 \\
1 \\
0 \end{array} \right)[/tex]

Is this right?

ii) Now I am not sure exactly how to expand the initial state in eigenstates of the Hamiltonian.

Any hints are appreciated!
 
  • #2
phil ess said:
Is this right?
Basically, yes. But anyway, you really should know how to check for yourself. Just apply H to each of these vectors. If the result is proportional to the original, then it is an eigenvector. And, the proportionality factor is the eigenvalue. (There is one slight issue: eigenvectors are typically defined to be unit-normalized.)
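If it helps to automate that check, here is a minimal NumPy sketch (the array names are purely illustrative, and it works in units where ħω = 1):

[code]
import numpy as np

# Hamiltonian in units where hbar*omega = 1
H = np.array([[0., 0., 2.],
              [0., 1., 0.],
              [2., 0., 0.]])

# Candidate (unnormalized) eigenvectors and eigenvalues from post #1
vecs = [np.array([1., 0.,  1.]),
        np.array([1., 0., -1.]),
        np.array([0., 1.,  0.])]
vals = [2., -2., 1.]

for v, lam in zip(vecs, vals):
    # H v should be proportional to v, with the eigenvalue as the factor
    assert np.allclose(H @ v, lam * v)

# Cross-check against a library diagonalization
print(np.sort(np.linalg.eigvalsh(H)))  # [-2.  1.  2.]
[/code]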

phil ess said:
ii) Now I am not sure exactly how to expand the initial state in eigenstates of the Hamiltonian.
Use the fact that the eigenvectors form a complete 3-D basis. (Well, first you should verify that this is true.) So, you can construct an identity matrix out of projection matrices, which you can in turn construct from the eigenvectors. (This is where the unit-normalization comes in handy.) Then, Iv=v, where I is the identity matrix and v is any vector. Are you familiar with bra-ket notation?
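Written out explicitly (standard linear algebra, using the normalized eigenvectors, here called vₙ, and the initial state s(0)), the completeness relation being described is

[tex]\sum_n v_n v_n^{T} = I \quad\Longrightarrow\quad s(0) = \sum_n v_n \left( v_n \cdot s(0) \right) = \sum_n c_n v_n , \qquad c_n = v_n \cdot s(0)[/tex]

and the numbers cₙ are exactly the expansion coefficients asked for in part (ii).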
 
  • #3
Right ok so first I normalize the eigenvectors:

[tex]\left( \begin{array}{c}
1/\sqrt{2} \\
0 \\
1/\sqrt{2} \end{array} \right) \qquad \left( \begin{array}{c}
1/\sqrt{2} \\
0 \\
-1/\sqrt{2} \end{array} \right) \qquad \left( \begin{array}{c}
0 \\
1 \\
0 \end{array} \right)[/tex]

And now I want to construct projection matrices from these? OK, as far as I know a projection matrix has to be Hermitian and idempotent, right? Or does it only have to be Hermitian for an orthogonal projection matrix?

Well either way this is what I put together:

[tex]A = \left( \begin{array}{ccc}
1/\sqrt{2} & 0 & 1/\sqrt{2} \\
0 & 1 & 0 \\
1/\sqrt{2} & 0 & -1/\sqrt{2} \end{array} \right)[/tex]

Which is Hermitian, but it certainly isn't idempotent, since A² = I.

Obviously I am missing something important! Thanks for the help so far!
 
  • #4
Use the fact that the eigenvectors are orthonormal (and again, you should verify this if you have not done so already). I'll call them v+1, v+2, and v-2. Then, for example:

v+1 ( v+1 ⋅ v+1 ) = v+1 (1) = v+1

v+1 ( v+1 ⋅ v+2 ) = v+1 (0) = 0

v+1 ( v+1 ⋅ v-2 ) = v+1 (0) = 0

So, v+1 ( v+1 ⋅ v ) projects v onto the v+1 "direction". You can write this as a 3x3 matrix acting on v. That matrix is the projection matrix for the eigenvector v+1. You can find two more matrices for the other two eigenvectors in the same way. (The reason that I asked you about bra-ket notation is that it has a very simple notational implementation: e.g. |+1><+1| is the projector for |+1>.) So, you should find 3 projection matrices. The sum of these three projection matrices will actually be the 3x3 identity matrix! This is a very important concept.

BTW, please let me know if the notation is difficult to read; if so, I will switch to LaTeX.
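As a concrete sketch of those 3x3 matrices (illustrative NumPy only, in units where ħω = 1, using the normalized eigenvectors from post #3):

[code]
import numpy as np

r = 1.0 / np.sqrt(2.0)
# Normalized eigenvectors from post #3 (eigenvalues +2, -2, +1)
v_p2 = np.array([r, 0.,  r])
v_m2 = np.array([r, 0., -r])
v_p1 = np.array([0., 1., 0.])

# Projection matrix for each eigenvector: P_n = v_n v_n^T  (|n><n| in bra-ket form)
projectors = [np.outer(v, v) for v in (v_p2, v_m2, v_p1)]

for P in projectors:
    assert np.allclose(P, P.T)    # Hermitian (real symmetric here)
    assert np.allclose(P @ P, P)  # idempotent

# The three projectors sum to the 3x3 identity matrix (completeness)
assert np.allclose(sum(projectors), np.eye(3))

# Acting on any vector picks out its component along each eigenvector
s0 = np.array([np.sqrt(2.), 1., 1.])
for P in projectors:
    print(P @ s0)
[/code]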
 
  • #5
I understand what you're saying when you say "v+1 ( v+1 ⋅ v ) projects v onto the v+1 direction", but I don't know how to write this as a 3x3 matrix.

The way I've been taught to make a projection matrix is using:

Proj_V x = A(AᵀA)⁻¹Aᵀ x

where A is a matrix constructed from basis vectors of V, and the eigenvectors I have are basis vectors for ℝ³.

But when I do this I just get the identity matrix back, which obviously isn't right. Could you explain how you construct the projection matrix your way? Thanks for all the help so far, I am still struggling with this...

EDIT: "The sum of these three projections matrices will actually be the 3x3 identity matrix! This is a very important concept"

Oh is this why I got the identity matrix when I used the formula above?
 
  • #6
phil ess said:
I understand what you're saying when you say "v+1 ( v+1 ⋅ v ) projects v onto the v+1 direction", but I don't know how to write this as a 3x3 matrix.
Write those relations that I gave to you in component form. Hint: the dot product can be written in component form as:

v ⋅ w = ∑ⱼ vⱼwⱼ

and the action of a matrix on a vector can be written in component form as:

(Av)ᵢ = ∑ⱼ Aᵢⱼvⱼ

So, the row vector is like a row of a matrix ...
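Putting those two component formulas together gives the 3x3 matrix explicitly; for example, for v+1 (written here in LaTeX, with the same labels):

[tex]\left( P_{+1} \right)_{ij} = (v_{+1})_i \, (v_{+1})_j \quad\Longrightarrow\quad \left( P_{+1} v \right)_i = \sum_j (v_{+1})_i \, (v_{+1})_j \, v_j = (v_{+1})_i \left( v_{+1} \cdot v \right)[/tex]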

phil ess said:
Where A is a matrix constructed from basis vectors of V, and the eigenvectors I have are basis vectors for ℝ³.
Possibly you only need to recognize that V = ℝ³ in order to resolve your confusion. (Just in case your browser doesn't support that character, that is V = R^3.) I'm not sure what exactly you are projecting onto there (is V some subspace?), or how you are constructing A "from the basis vectors of V". But, anyway, the method that I suggest is a much simpler concept once you figure it out.

phil ess said:
EDIT: "The sum of these three projections matrices will actually be the 3x3 identity matrix! This is a very important concept"

Oh is this why I got the identity matrix when I used the formula above?
Probably. It isn't clear to me what you've done so far.
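For completeness, the reduction noticed in post #5 can be checked in one line: if the columns of A are the three orthonormal eigenvectors (so that V is all of ℝ³), then

[tex]A^{T}A = I \quad\Longrightarrow\quad A\left(A^{T}A\right)^{-1}A^{T} = A A^{T} = \sum_n v_n v_n^{T} = I[/tex]

so projecting onto the span of all three eigenvectors simply returns the original vector; the useful objects here are the individual projectors vₙvₙᵀ.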
 

FAQ: Time Evolution and Hamiltonian Problem

What is time evolution in the context of physics?

Time evolution refers to the change of a physical system's state over time. It is a fundamental concept in physics that describes how a system's state develops from its initial conditions under the laws of physics that govern it.

What is the Hamiltonian problem?

The Hamiltonian problem, also known as Hamiltonian dynamics, is a mathematical framework used to describe the time evolution of a physical system. It involves Hamilton's equations, a set of differential equations that determine the trajectory of a system in phase space.
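For reference, Hamilton's equations for generalized coordinates qᵢ and momenta pᵢ are

[tex]\dot{q}_i = \frac{\partial H}{\partial p_i}, \qquad \dot{p}_i = -\frac{\partial H}{\partial q_i}[/tex]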

How is the Hamiltonian problem related to classical mechanics?

The Hamiltonian problem is a central concept in classical mechanics. It provides a mathematical framework for understanding the behavior of classical systems, such as particles and rigid bodies, under the influence of forces. It is based on Hamilton's principle, which states that the evolution of a system makes its action stationary.

What is the importance of the Hamiltonian problem in quantum mechanics?

In quantum mechanics, the Hamiltonian plays a crucial role in understanding the time evolution of quantum systems. The Hamiltonian operator represents the total energy of a quantum system and generates its evolution in time through the Schrödinger equation. It is used in many applications, such as calculating the energy levels of atoms and molecules.
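Concretely, this is what step (iii) of the problem above relies on: expanding a state in energy eigenstates H|n⟩ = Eₙ|n⟩ turns the Schrödinger equation into simple phase factors,

[tex]i\hbar \frac{\partial}{\partial t} |\psi(t)\rangle = H |\psi(t)\rangle \quad\Longrightarrow\quad |\psi(t)\rangle = \sum_n c_n \, e^{-iE_n t/\hbar} \, |n\rangle , \qquad c_n = \langle n | \psi(0) \rangle[/tex]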

How is the Hamiltonian problem used in other fields of science?

The Hamiltonian problem has applications in various fields of science, such as chemistry, biology, and engineering. In chemistry, it is used to study molecular dynamics and chemical reactions; in biology, to model the dynamics of biological systems; and in engineering, to analyze and design control systems.
