Vectors and isometries on a manifold

In summary, the conversation discusses the concepts of vectors, coordinate systems, and isometries in the context of General Relativity. A map is used to assign coordinate values to points on a manifold, and a tangent vector is defined using the coordinate basis. The basis vectors depend on the point on the manifold, though the vector itself does not. There is no issue with constructing tangent planes. It is also clarified that 4-velocity should not be confused with basis vectors. The question of what people mean by rotations in GR is left open.
  • #71
Please correct me if I'm wrong in what follows.
Solving Killing's equation, I found that the components of the Killing vector should satisfy ##\xi^{1} = - \xi^{2}##. Then I proceeded as follows:

In Cartesian coordinates, suppose we have a vector with components ## \begin{bmatrix}1\\0\end{bmatrix} ##. If we rotate it by 90°, its new components* should become ## \begin{bmatrix}0\\1\end{bmatrix} ##. It can be done through the action of a matrix operator on the original vector.

By trying combinations of elements of this matrix, knowing what the resultant vector should look like, I found that the matrix should be
## \begin{bmatrix}a_{11}&\xi^1\\-\xi^1&a_{22}\end{bmatrix} ##.

Now, I found what ##a_{11}## and ##a_{22}## are by operating on the vector ## \begin{bmatrix}0\\1\end{bmatrix} ## which becomes ## \begin{bmatrix}1\\0\end{bmatrix} ## after rotation.
The matrix is then ## \begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix} ## for a rotation through an angle ##\theta##. The Killing vector components are ##\sin\theta## and ##-\sin\theta##.

* Here I'm considering the same vector rotated by an angle, so it has new components in its new "position". However, the interpretation you mentioned of seeing the rotation as a mapping of a vector ## \begin{bmatrix}0\\1\end{bmatrix} ## into another vector ## \begin{bmatrix}1\\0\end{bmatrix} ## makes more sense to me.

That is more like a heuristic derivation of the Killing vector components. If there is a more rigorous derivation please let me know.

I'm not sure the above is right :biggrin:.
 
  • #72
davidge said:
I found that its components should satisfy ##\xi^{1} = - \xi^{2}##.

That doesn't look right. How are you deriving that requirement? The only nontrivial component of Killing's equation in 2 dimensions, in Cartesian coordinates (so covariant derivatives are just partial derivatives) is ##\frac{\partial}{\partial x^1} \xi_2 + \frac{\partial}{\partial x^2} \xi_1 = 0##. I don't see how to get from that to what you wrote.
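(For reference, writing out all the components of ##\partial_\mu \xi_\nu + \partial_\nu \xi_\mu = 0## in these coordinates gives

$$
2\,\partial_1 \xi_1 = 0, \qquad 2\,\partial_2 \xi_2 = 0, \qquad \partial_1 \xi_2 + \partial_2 \xi_1 = 0 ;
$$

the two diagonal components only say that ##\xi_1## cannot depend on ##x^1## and ##\xi_2## cannot depend on ##x^2##, which is why the mixed component is the only nontrivial one.)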

davidge said:
suppose we have a vector with components ##\begin{bmatrix}1\\0\end{bmatrix}## . If we rotate it by 90°, its new components* should become ##\begin{bmatrix}0\\1\end{bmatrix}## .

This is not an infinitesimal rotation; you should start with rotating ##\begin{bmatrix} 1 \\ 0 \end{bmatrix}## to something rotated by an infinitesimal angle ##d \theta## from that. That is what will have a direct relationship to the Killing vector.

davidge said:
By trying combinations of elements of this matrix, knowing what the resultant vector will look, I found that the matrix should be
##\begin{bmatrix}a_{11}&\xi^1\\-\xi^1&a_{22}\end{bmatrix}## .

This isn't right either. See above comments.

If you want to guess what the infinitesimal matrix might look like, you could try expanding the components of the finite rotation matrix (which you got correct, see below) as Taylor series in ##\theta## (or ##d \theta## for an infinitesimal rotation) and dropping terms higher than first order. Also remember that, since we are using Cartesian coordinates, you will need to ultimately express everything in terms of ##x^1## and ##x^2## (or what would normally be called ##x## and ##y## if we used those names for the coordinates), so you will need to figure out how to express the rotation angle ##\theta## in terms of them (or ##d\theta## in terms of ##dx## and ##dy##).

davidge said:
The matrix is then ##\begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix}## for a rotation through an angle ##\theta##.

This is correct, but for a finite rotation, not an infinitesimal one. (Actually, reversing the signs on the ##\sin \theta## components is also a valid solution; which one corresponds to rotating by ##\theta## and which by ##- \theta## depends on your convention for which direction, clockwise or counterclockwise, corresponds to increasing ##\theta##.) Also, the way you are getting there is not valid; see above.
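To make the sign-convention point concrete (a quick check, not part of the original derivation): acting with the matrix as written on the unit vector gives

$$
\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \cos\theta \\ -\sin\theta \end{bmatrix} ,
$$

so at ##\theta = 90°## it sends ##\begin{bmatrix} 1 \\ 0 \end{bmatrix}## to ##\begin{bmatrix} 0 \\ -1 \end{bmatrix}##; with the signs on ##\sin\theta## reversed it would instead go to ##\begin{bmatrix} 0 \\ 1 \end{bmatrix}##, which is the 90° rotation described in post #71.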

davidge said:
The Killing vector components are ##\sin\theta## and ##-\sin\theta##.

This is not correct. See above.
 
  • #73
PeterDonis said:
How are you deriving that requirement?
PeterDonis said:
covariant derivatives are just partial derivatives) is ##\frac{\partial}{\partial x^1} \xi_2 + \frac{\partial}{\partial x^2} \xi_1 = 0##
PeterDonis said:
you should start with rotating ##\begin{bmatrix} 1 \\ 0 \end{bmatrix}## to something rotated by an infinitesimal angle ##d\theta## from that

$$
V \doteq \begin{bmatrix}V^{1}\\V^{2}\end{bmatrix} = \begin{bmatrix}1\\0\end{bmatrix}
$$

$$
V'^{1} = V^{1} + d\theta \xi^{1}_{,2}V^{2} = V^{1} , \qquad V'^{2} = V^{2} + d\theta \xi^{2}_{,1}V^1 = d\theta
$$

where a prime denotes the transformed components and ##V^{1} = 1, V^{2} = 0, \xi^{1}_{,2} = - d\theta \xi^{2}_{,1} = 1##.

Would this be a solution?

PeterDonis said:
If you want to guess what the infinitesimal matrix might look like, you could try expanding the components of the finite rotation matrix
for the infinitesimal case, would it be enough to take the limit as ##\theta## tends to zero, in which case ##\cos\theta \approx 1## and ##\sin\theta \approx \theta## (1), where we can replace ##\theta## in (1) with ##d\theta## because it's infinitesimal? (2)

PeterDonis said:
you will need to figure out how to express the rotation angle ##\theta## in terms of them (or ##d\theta## in terms of ##dx## and ##dy##).
Following (2), we would have ##\tan\theta \approx d\theta = dx^{2}/dx^{1}##.
 
  • #74
davidge said:
Would this be a solution?

I can't follow your notation. Are you trying to use derivatives of the Killing vector components again? Once more: isometries are generated by Killing vectors themselves, not their derivatives. If you are trying to use derivatives, you are doing it wrong.

davidge said:
for the infinitesimal case, would it be enough to take the limit as ##\theta## tends to zero

No, because in that limit, ##\cos \theta = 1## and ##\sin \theta = 0##, and that just gives you the identity matrix.

What you were actually doing to get ##\cos \theta \approx 1## and ##\sin \theta \approx \theta## was what I said: you expand the functions as Taylor series, so ##\cos \theta = 1 - \frac{1}{2!} \theta^2 + \frac{1}{4!} \theta^4 - ...## and ##\sin \theta = \theta - \frac{1}{3!} \theta^3 + ...##, and then discard terms beyond first order in ##\theta##.

So now you have the infinitesimal rotation matrix ##\begin{bmatrix} 1 & - d\theta \\ d\theta & 1 \end{bmatrix}## (where I have flipped the signs on the ##d\theta## terms from what you wrote down). What do you get when you operate on the vector ##\begin{bmatrix} 1 \\ 0 \end{bmatrix}## with this matrix? Or, more generally, what do you get when you operate on the vector ##\begin{bmatrix} x^1 \\ x^2 \end{bmatrix}##?
 
  • #75
davidge said:
Would this be a solution?

To re-ground this in what you quoted from Weinberg's book, his equation for an infinitesimal coordinate transformation is

$$
x'^\mu = x^\mu + \epsilon \xi^\mu
$$

which we can rewrite in column vector notation as

$$
\begin{bmatrix} x'^1 \\ x'^2 \end{bmatrix} = \begin{bmatrix} x^1 \\ x^2 \end{bmatrix} + d\theta \begin{bmatrix} \xi^1 \\ \xi^2 \end{bmatrix}
$$

where I have put ##d\theta## instead of ##\epsilon## because we are talking about rotations by an infinitesimal angle. But since we are talking about infinitesimal rotations, we must also have the equation ##x' = R x##, which in matrix form is

$$
\begin{bmatrix} x'^1 \\ x'^2 \end{bmatrix} = \begin{bmatrix} R_{11} & R_{12} \\ R_{21} & R_{22} \end{bmatrix} \begin{bmatrix} x^1 \\ x^2 \end{bmatrix}
$$

The matrix ##R## is just the rotation matrix, whose infinitesimal form we know. This is what connects the two viewpoints (matrix and Killing vector).
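To spell out the matching a bit (a sketch of the step carried out in the next post): the infinitesimal form of ##R## is the identity plus ##d\theta## times a fixed antisymmetric matrix,

$$
R = \begin{bmatrix} 1 & -d\theta \\ d\theta & 1 \end{bmatrix} = I + d\theta \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} ,
$$

so equating ##x' = R x## with ##x' = x + d\theta \, \xi## term by term determines ##\xi## as that antisymmetric matrix acting on ##\begin{bmatrix} x^1 \\ x^2 \end{bmatrix}##.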
 
  • #76
PeterDonis said:
in that limit, ##\cos \theta = 1## and ##\sin \theta = 0##, and that just gives you the identity matrix.
What you were actually doing to get ##\cos \theta \approx 1## and ##\sin \theta \approx \theta## was what I said
Oh yea.

PeterDonis said:
So now you have the infinitesimal rotation matrix ##\begin{bmatrix} 1 & - d\theta \\ d\theta & 1 \end{bmatrix}## (where I have flipped the signs on the ##d\theta## terms from what you wrote down). What do you get when you operate on the vector ##\begin{bmatrix} 1 \\ 0 \end{bmatrix}## with this matrix? Or, more generally, what do you get when you operate on the vector ##\begin{bmatrix} x^1 \\ x^2 \end{bmatrix}##?
PeterDonis said:
$$
\begin{bmatrix} x'^1 \\ x'^2 \end{bmatrix} = \begin{bmatrix} x^1 \\ x^2 \end{bmatrix} + d\theta \begin{bmatrix} \xi^1 \\ \xi^2 \end{bmatrix}
$$

where I have put ##d\theta## instead of ##\epsilon## because we are talking about rotations by an infinitesimal angle. But since we are talking about infinitesimal rotations, we must also have the equation ##x' = R x##, which in matrix form is

$$
\begin{bmatrix} x'^1 \\ x'^2 \end{bmatrix} = \begin{bmatrix} R_{11} & R_{12} \\ R_{21} & R_{22} \end{bmatrix} \begin{bmatrix} x^1 \\ x^2 \end{bmatrix}
$$
Then the Killing vector components are ##\xi^{1} = -x^2## and ##\xi^{2} = x^{1}##?

PeterDonis said:
The matrix ##R## is just the rotation matrix, whose infinitesimal form we know. This is what connects the two viewpoints (matrix and Killing vector).
I see now
 
  • #77
davidge said:
Then the Killing vector components are ##\xi^{1} = -x^2## and ##\xi^{2} = x^{1}##?

Yes, you've got it.
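If anyone wants to check this symbolically, here is a minimal SymPy sketch (the setup and variable names are mine, not from the thread); it verifies that ##\xi = (-x^2, x^1)## satisfies Killing's equation in flat 2-d Cartesian coordinates, and that ##x + d\theta\,\xi## agrees with the infinitesimal rotation matrix acting on ##x##:

Code:
import sympy as sp

# Cartesian coordinates on the flat 2-d plane, so covariant derivatives
# reduce to partial derivatives
x1, x2, dtheta = sp.symbols('x1 x2 dtheta')

# Candidate Killing vector from the thread: xi^1 = -x^2, xi^2 = x^1
xi = sp.Matrix([-x2, x1])

# Components of Killing's equation: d_mu xi_nu + d_nu xi_mu = 0
k11 = 2 * sp.diff(xi[0], x1)
k22 = 2 * sp.diff(xi[1], x2)
k12 = sp.diff(xi[1], x1) + sp.diff(xi[0], x2)
print(k11, k22, k12)  # expect: 0 0 0

# x + dtheta*xi should match the infinitesimal rotation matrix acting on x
J = sp.Matrix([[0, -1], [1, 0]])
x = sp.Matrix([x1, x2])
print(sp.simplify((x + dtheta * xi) - (sp.eye(2) + dtheta * J) * x))  # expect: zero matrix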
 
  • #78
PeterDonis said:
you've got it.
Finally :smile:. Thank you for clarifying so many things through this thread.
 
  • #79
There is only one problem. Weinberg says that if we are to perform a rotation at a point ##x##, the Killing vector components must vanish there, so that the point is left invariant: in his notation the transformed point would be ##x'^{\mu} = x^{\mu} + \epsilon \xi^{\mu}(x)##, and since ##\xi^{\mu}(x) = 0##, we get ##x'^{\mu} = x^{\mu}##.

So would this condition be necessary only if we want the metric to be the same (i.e., an isotropy)? And otherwise the rotation would not keep the point fixed, as you've shown?
 
  • #80
davidge said:
Weinberg says if we are to perform a rotation at a point ##x##, the Killing vector components must vanish there

Yes, and the Killing vector we found satisfies that: the vector ##\begin{bmatrix} - x^2 \\ x^1 \end{bmatrix}## vanishes at the origin, since there ##x^1 = x^2 = 0##.

davidge said:
would this condition be necessary only if we want the metric to be the same?

I'm not sure what you mean. The coordinate transformation generated by any Killing vector (i.e., any isometry) leaves the metric the same everywhere: the metric ##g_{\mu \nu}## takes the same form in the new coordinates as in the original ones. But a rotation only leaves the coordinates of point ##X## invariant; it changes the coordinates of all other points. (Note that translations--the other isometries Weinberg discusses--do not leave the coordinates of any points invariant.)
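For concreteness (the symbols ##\tau_{(1)}## and ##\tau_{(2)}## are just labels introduced here, not Weinberg's notation), the translation Killing vectors in these Cartesian coordinates are the constant vectors

$$
\tau_{(1)} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} , \qquad \tau_{(2)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix} ;
$$

they satisfy Killing's equation trivially (all their derivatives vanish) and are nonzero everywhere, so the corresponding isometry ##x'^\mu = x^\mu + \epsilon \tau^\mu## shifts every point.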
 
  • #81
One loose end that still remains is the statement of Weinberg's that for a rotation, the first derivatives of the Killing vector at ##X## take "all possible values". I think I understand what he's trying to say, although it seems to me to be a rather confusing way to say it.

If I'm right, the 2-d rotation case actually doesn't illustrate this very well, because there is only one Killing vector and only one first derivative of it at ##X##. So we should try considering a case with multiple Killing vectors; the obvious case is 3-d rotations. Here there is a 3-parameter group of rotation Killing vectors, which in Cartesian coordinates have the form you would expect from our analysis of the 2-d case; they are:

$$
\xi = \begin{bmatrix} -x^2 \\ x^1 \\ 0 \end{bmatrix}
$$

$$
\upsilon = \begin{bmatrix} 0 \\ -x^3 \\ x^2 \end{bmatrix}
$$

$$
\zeta = \begin{bmatrix} x^3 \\ 0 \\ -x^1 \end{bmatrix}
$$

Weinberg has a qualifier after the "all possible values" statement, that this is "subject to the antisymmetry condition". What he means, I think, is simply that each Killing vector must satisfy Killing's equation, which constrains the partial derivatives of its components. Because of the antisymmetry, in ##N## dimensions Killing's equation will have ##N(N-1)/2## non-trivial components. For the 2-d case, that means just one component; for the 3-d case, there are three, which I have written below, each in terms of the Killing vector (from the list above) that satisfies it nontrivially:

$$
\nabla_1 \xi_2 + \nabla_2 \xi_1 = 0
$$

$$
\nabla_2 \upsilon_3 + \nabla_3 \upsilon_2 = 0
$$

$$
\nabla_3 \zeta_1 + \nabla_1 \zeta_3 = 0
$$

So basically, I think that Weinberg's "all possible values" just means that, at the point ##X##, where all the rotation Killing vectors vanish, we have ##N(N-1)/2## choices of distinct pairs of coordinate indexes that we can choose in order to define the rotation, since each non-trivial component of Killing's equation involves a pair of coordinate indexes. To put it another way, any rotation picks out a plane containing the point ##X##; but in ##N## dimensions there are ##N(N-1)/2## possible planes (more precisely, that number of mutually orthogonal planes), and each plane has its own Killing vector (more precisely, its own linearly independent Killing vector).
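One concrete way to read the "all possible values, subject to the antisymmetry condition" remark (a sketch; the combination ##\chi## and the constants ##a, b, c## are just labels introduced here): take a general linear combination ##\chi = a\,\xi + b\,\upsilon + c\,\zeta## of the three Killing vectors above and compute its matrix of first derivatives. In Cartesian coordinates this gives (row index ##\mu##, column index ##\nu##)

$$
\partial_\mu \chi_\nu = \begin{bmatrix} 0 & a & -c \\ -a & 0 & b \\ c & -b & 0 \end{bmatrix} ,
$$

which is the most general antisymmetric ##3 \times 3## matrix; as ##a##, ##b##, ##c## range over all values, the first derivatives at ##X## take every value compatible with antisymmetry.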
 
  • #82
PeterDonis said:
Are you trying to use derivatives of the Killing vector components again?
Now that we have found what the Killing vector components are, we see that, for example:
##x'^{1} = x^{1} + d\theta \xi^{1}_{,2}x^{2}##, but ##\xi^{1}_{,2} = -1## and so ##x'^{1} = x^{1} - d\theta x^{2}##, the same result we obtain by considering ##x'^{1} = x^{1} + d\theta \xi^{1}##, because, as we know, ##\xi^{1} = -x^{2}##. That is why I was insisting on using derivatives.
Now I think I understand Weinberg's statement that the components should be zero but the derivatives should not. In our case this happens at the origin (0,0).

PeterDonis said:
the Killing vector we found satisfies that: the vector ##\begin{bmatrix} - x^2 \\ x^1 \end{bmatrix}## vanishes at the origin
PeterDonis said:
a rotation only leaves the coordinates of point ##X## invariant; it changes the coordinates of all other points. (Note that translations--the other isometries Weinberg discusses--do not leave the coordinates of any points invariant.)
Indeed, this is what you have been showing me.

PeterDonis said:
One loose end that still remains is the statement of Weinberg's that for a rotation, the first derivatives of the Killing vector at ##X## take "all possible values". I think I understand what he's trying to say, although it seems to me to be a rather confusing way to say it.
PeterDonis said:
we should try considering a case with multiple Killing vectors; the obvious case is 3-d rotations. Here there is a 3-parameter group of rotation Killing vectors, which in Cartesian coordinates have the form you would expect from our analysis of the 2-d case;
PeterDonis said:
So basically, I think that Weinberg's "all possible values" just means that, at the point ##X##, where all the rotation Killing vectors vanish, we have ##N(N-1)/2## choices of distinct pairs of coordinate indexes that we can choose in order to define the rotation
Thank you for interpreting and explaining the meaning of that statement.
 
  • #83
davidge said:
we see that, for example:
##x'^{1} = x^{1} + d\theta \xi^{1}_{,2}x^{2}##,

No. No. No.

The correct equation is ##x'^1 = x^1 + d\theta \xi^1##. That is the equation Weinberg wrote down. That is the equation that is always correct. And in this case it gives ##x'^1 = x^1 - d\theta x^2##. (And ##x'^2 = x^2 + d\theta x^1##.)

Your equation, even if it happens to look the same numerically by coincidence in this particular case, is not correct. I don't understand why you keep on trying to use it.
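Incidentally, the numerical coincidence can be traced to the fact that this particular Killing vector is linear and homogeneous in the coordinates, so that here (and only in such cases)

$$
\xi^\mu{}_{,\nu}\, x^\nu = \xi^\mu , \qquad \text{e.g.} \quad \xi^1{}_{,2}\, x^2 = (-1)\, x^2 = \xi^1 ;
$$

for the translation Killing vectors, by contrast, the left-hand side vanishes while ##\xi^\mu## does not.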

davidge said:
Now I think I understand Weinberg's statement that the components should be zero but the derivatives should not

No, you don't. The derivatives should not be zero at the origin because if the derivatives were zero, the Killing vector itself would vanish everywhere. It has nothing whatever to do with your wrong equation.

If you look at the translation Killing vectors, btw, you will see that their derivatives vanish everywhere. But the Killing vectors themselves do not. So the "vanish at the origin but nonzero derivatives" requirement only applies to rotations, not translations.

If you ask why those two cases (rotations and translations) are the only two possibilities Weinberg considers, that's because they're the only two non-trivial possibilities that Killing's equation admits.
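For completeness, here is a sketch of why (the standard flat-space argument, not Weinberg's exact steps): in Cartesian coordinates on flat space, the general solution of Killing's equation is

$$
\xi_\mu = a_\mu + \omega_{\mu\nu}\, x^\nu , \qquad \omega_{\mu\nu} = -\omega_{\nu\mu} ,
$$

with constant ##a_\mu## and ##\omega_{\mu\nu}##; the constant piece generates the translations and the antisymmetric piece generates the rotations, so those two families exhaust the possibilities.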
 
  • #84
PeterDonis said:
No. No. No.
Ok :biggrin:

PeterDonis said:
The derivatives should not be zero at the origin because if the derivatives were zero, the Killing vector itself would vanish everywhere
PeterDonis said:
If you look at the translation Killing vectors, btw, you will see that their derivatives vanish everywhere. But the Killing vectors themselves do not. So the "vanish at the origin but nonzero derivatives" requirement only applies to rotations, not translations.
I got it.
PeterDonis said:
If you ask why those two cases (rotations and translations) are the only two possibilities Weinberg considers, that's because they're the only two non-trivial possibilities that Killing's equation admits.
I see
 