- TL;DR Summary
When trying to factor the Klein-Gordon equation, you need to make the cross terms go away. Either your matrices can be anti-commutative, or they can be "orthogonal" in the sense that their products vanish. Is orthogonality a valid solution?
Take the Klein-Gordon equation:
##\Box^2 \psi = m^2 \psi##
Say we want to linearize this equation: we try to come up with a new operator that squares to ##\Box^2##.
##(A\partial_t - B\partial_x - C\partial_y - D\partial_z)^2 = \Box^2##
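Expanding the square term by term, and using the fact that the partial derivatives commute (so each mixed term picks up the symmetric combination of the two matrices):
##(A\partial_t - B\partial_x - C\partial_y - D\partial_z)^2 = A^2\partial_t^2 + B^2\partial_x^2 + C^2\partial_y^2 + D^2\partial_z^2##
##\quad - (AB+BA)\,\partial_t\partial_x - (AC+CA)\,\partial_t\partial_y - (AD+DA)\,\partial_t\partial_z##
##\quad + (BC+CB)\,\partial_x\partial_y + (BD+DB)\,\partial_x\partial_z + (CD+DC)\,\partial_y\partial_z##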
So we need ##-A^2 = B^2 = C^2 = D^2 = I##, as this gives back the second-partial operators with the appropriate signs (taking the convention ##\Box^2 = \nabla^2 - \partial_t^2##; I just put the minus sign on ##A## for simplicity), and we also need the cross terms to vanish, since they don't appear in the box operator. The way I see it, one of two conditions can hold (the first is actually a special case of the second):
Either ##AB = BA = 0## or ##AB + BA = 0## (and likewise for every other pair of matrices).
The simplest solution, to me, seems to be the first. If you can effectively brute-force your way to finding one of the matrices, then you can just treat them like (pseudo?)tensors and make them orthogonal. On that note, I'm not sure how to address the subtleties of orthogonality/rotations in 4D. These matrices aren't going to live in spacetime, right? Since I'm using standard matrix multiplication here, they should exist in something like ##\mathbb{R}^4##. So the idea would be to construct a 4D Euclidean rotation matrix from the six plane-rotation generators, using ##\pm 90^\circ## in the various permutations of the rotations. However, in 3D, rotations are non-commutative and can also be represented by quaternions, as ##q(\frac{\theta}{2})\, V\, q^*(\frac{\theta}{2})##. I don't see how this then works in 4D. Perhaps the solution is the octonions; however, if I recall correctly, the octonions are not only non-commutative but also non-associative. This seems to imply that there is no good way to construct my matrices ##A, B, C, D## using the planar rotation generators.
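For reference on the 4D point (this is the standard result, as I understand it): identifying ##\mathbb{R}^4## with the quaternions ##\mathbb{H}##, a general 4D rotation takes a *pair* of unit quaternions, one on each side, so the octonions shouldn't be needed:
##v' = p\, v\, q, \qquad v \in \mathbb{H}, \quad |p| = |q| = 1##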
Intuitively, it seems like it should work, and any family of solutions that I can find should satisfy my factorization requirements.
What I've (kind of) tried is:
*I'm not sure how to write this compactly in index notation off the top of my head, so I'll just provide a non-trivial example.*
Take basis vectors w, x, y, z, then
##R_{wy} = \left( \begin{array}{cccc}
\cos(\alpha) & 0 & i\sin(\alpha) & 0 \\
0 & 1 & 0 & 0 \\
i\sin(\alpha) & 0 & \cos(\alpha) & 0 \\
0 & 0 & 0 & 1 \\
\end{array} \right)##
Then the thought is to take a matrix that squares to ##I## (but isn't ##I## itself), and then simply rotate it three times to get four orthogonal matrices. I'm not entirely sure what I need to keep track of in order to properly perform a 4D rotation; with the quaternions, you have to multiply on the left and on the right by half your rotation and its conjugate.
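Here is a minimal NumPy sketch of that plan as I understand it. The seed matrix and the use of similarity conjugation as the 4D "rotation" are my own guesses at making the plan concrete, not a known construction:

```python
import numpy as np

def plane_rotation(a, b, alpha, dim=4):
    # The R matrix from above in the (a, b) coordinate plane:
    # cosines on the diagonal, i*sin off the diagonal.
    R = np.eye(dim, dtype=complex)
    R[a, a] = R[b, b] = np.cos(alpha)
    R[a, b] = R[b, a] = 1j * np.sin(alpha)
    return R

# A seed matrix that squares to I but isn't I (an arbitrary choice).
seed = np.diag([1, 1, -1, -1]).astype(complex)
assert np.allclose(seed @ seed, np.eye(4))

# "Rotate" the seed by conjugating with a 90-degree w-y rotation.
# Whether conjugation is even the right notion of rotating a matrix
# here is exactly the open question in this post.
R = plane_rotation(0, 2, np.pi / 2)
rotated = R @ seed @ np.linalg.inv(R)

print(np.round(seed @ rotated, 3))                   # not the zero matrix
print(np.round(seed @ rotated + rotated @ seed, 3))  # anticommutator: also nonzero
```

(With this particular seed, both the product and the anticommutator come out nonzero, consistent with the hand computation in the edit below.)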
Would this not give me a solution? If not, why not? If so, why is the anti-commutation condition preferred over the orthogonality condition?
Edit:
So I worked through a specific example, as I had suggested, and didn't get 0 out like I thought I would. I'm not sure why, though. Maybe someone can elaborate. This might just be a poor choice of rotation plane, actually.
I used the rotation matrix above with ##\alpha = 90^\circ##:
##\left( \begin{array}{cccc}
0 & 0 & i & 0 \\
0 & 1 & 0 & 0 \\
i & 0 & 0 & 0 \\
0 & 0 & 0 & 1 \\
\end{array} \right)
\left( \begin{array}{cccc}
0 & 0 & i & 0 \\
0 & 0 & 0 & i \\
-i & 0 & 0 & 0 \\
0 & -i & 0 & 0 \\
\end{array} \right) =
\left( \begin{array}{cccc}
1 & 0 & 0 & 0 \\
0 & 0 & 0 & i \\
0 & 0 & -1 & 0 \\
0 & -i & 0 & 0 \\
\end{array} \right)##
When I take the RHS and multiply it on the right by the second matrix from the LHS, I get back the rotation matrix that I started with, not the zero matrix. I will try a different plane; now that I look at it, just visually, the matrix I rotated seems like it might live in the plane that I rotated it in.
Similar results for the w-x plane.
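For completeness, a quick NumPy check of the example above; ##M_1## and ##M_2## are just my labels for the two matrices in the product:

```python
import numpy as np

# M1: the w-y rotation matrix at alpha = 90 degrees; M2: the matrix it multiplies.
M1 = np.array([[0, 0, 1j, 0],
               [0, 1, 0, 0],
               [1j, 0, 0, 0],
               [0, 0, 0, 1]])
M2 = np.array([[0, 0, 1j, 0],
               [0, 0, 0, 1j],
               [-1j, 0, 0, 0],
               [0, -1j, 0, 0]])

P = M1 @ M2
print(np.round(P, 3))                  # reproduces the RHS above; not zero
print(np.round(M1 @ M2 + M2 @ M1, 3))  # the anticommutator is nonzero too
print(np.allclose(M2 @ M2, np.eye(4))) # True: M2 squares to the identity
print(np.allclose(P @ M2, M1))         # True: so (M1 M2) M2 = M1 automatically
```

Note the last two lines: since ##M_2## squares to the identity, multiplying the product on the right by ##M_2## is guaranteed to give back ##M_1##, whichever plane is chosen.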