Showing that Metric Connections transform as a Connection

MattRob

Homework Statement


Show that the metric connection transforms like a connection

Homework Equations


The metric connection is
[itex]Γ^{a}_{bc} = \frac{1}{2} g^{ad} ( ∂_{b} g_{dc} + ∂_{c} g_{db} - ∂_{d} g_{bc} )[/itex]
And of course, in the context of Einstein's GR, we have a symmetric connection,
[itex]Γ^{a}_{bc} = Γ^{a}_{cb}[/itex] and [itex]g_{ab} = g_{ba}[/itex]
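As a concrete illustration (my own toy example, not part of the original post), the defining formula can be evaluated with sympy for the flat plane in polar coordinates (r, θ), where g = diag(1, r²); the familiar nonzero components come out directly:

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
g = sp.diag(1, r**2)     # flat plane metric in polar coordinates
g_inv = g.inv()          # contravariant metric g^{ab}

# Gamma^a_bc = (1/2) g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})
def Gamma(a, b, c):
    return sum(sp.Rational(1, 2) * g_inv[a, d] *
               (sp.diff(g[d, c], coords[b])
                + sp.diff(g[d, b], coords[c])
                - sp.diff(g[b, c], coords[d]))
               for d in range(2))

assert Gamma(0, 1, 1) == -r               # Gamma^r_{theta theta} = -r
assert Gamma(1, 0, 1) == 1/r              # Gamma^theta_{r theta} = 1/r
assert Gamma(1, 0, 1) == Gamma(1, 1, 0)   # symmetric in the lower indices
```

The symmetry in the last assertion is exactly the [itex]Γ^{a}_{bc} = Γ^{a}_{cb}[/itex] property above, which here follows from [itex]g_{ab} = g_{ba}[/itex].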

The metric tensor is a tensor, thus the contravariant and covariant forms will transform as
[itex]g_{ab} = \frac{∂x^{∝} ∂x^{β}}{∂x^{a} ∂x^{b}} g_{∝β}[/itex]

[itex]g^{ab} = \frac{∂x^{a} ∂x^{b}}{∂x^{∝} ∂x^{β}} g^{∝β}[/itex]

(I like the notation where one coordinate system is represented with Latin indices a, b, c, … and the other with Greek letters α, β, γ, …, so as to avoid reliance on prime notation; it's clearer whether an object belongs to one coordinate system or the other. Also, sorry it's a bit odd, but I'm using ∝ instead of α for alpha, since with this font it's hard to tell a from α in [itex]\frac{∂x^{a} ∂x^{b}}{∂x^{α} ∂x^{β}} g^{αβ}[/itex])

A connection transforms as

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} - \frac{∂x^{∝} ∂x^{β}}{∂x^{b} ∂x^{c}} \frac{∂^{2} x^{a}}{∂x^{∝} ∂x^{β}}[/itex]

or equivalently,

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} + \frac{∂x^{a}}{∂x^{∝}} \frac{∂^{2} x^{∝}}{∂x^{b} ∂x^{c}}[/itex]
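As a sanity check (my own example, not from the original thread), this second form of the transformation law can be verified symbolically for the flat plane, taking the old (Greek-index) coordinates to be polar and the new ones Cartesian; since the plane is flat in Cartesian coordinates, every transformed Christoffel symbol should vanish:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
new = [x, y]                        # new coordinates x^a (Cartesian)
r = sp.sqrt(x**2 + y**2)
old = [r, sp.atan2(y, x)]           # old coordinates x^alpha (polar)

# Christoffel symbols of the flat metric in polar coordinates:
# nonzero components are Gamma^r_{tt} = -r and Gamma^t_{rt} = Gamma^t_{tr} = 1/r
G_old = [[[0, 0], [0, -r]],
         [[0, 1/r], [1/r, 0]]]

J = sp.Matrix(2, 2, lambda al, a: sp.diff(old[al], new[a]))  # dx^alpha/dx^a
Jinv = J.inv()                                               # dx^a/dx^alpha

# Gamma^a_bc = (dx^a/dx^al)(dx^be/dx^b)(dx^ga/dx^c) Gamma^al_{be ga}
#              + (dx^a/dx^al) d^2 x^al / dx^b dx^c
def gamma_new(a, b, c):
    total = 0
    for al in range(2):
        for be in range(2):
            for ga in range(2):
                total += Jinv[a, al] * J[be, b] * J[ga, c] * G_old[al][be][ga]
        total += Jinv[a, al] * sp.diff(old[al], new[b], new[c])
    return sp.simplify(total)

# Cartesian coordinates on the plane are flat: every component should be zero
assert all(gamma_new(a, b, c) == 0
           for a in range(2) for b in range(2) for c in range(2))
```

Note how the inhomogeneous second-derivative term exactly cancels the transformed Christoffel symbols here; the tensor-like first term alone would not give zero.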

Also,

[itex]g_{ab} g^{bc} = δ^{c}_{a}[/itex]
where [itex]δ^{c}_{a}[/itex] is the Kronecker delta, where [itex]δ^{c}_{a} = 0[/itex] if [itex] a ≠ c[/itex] and [itex]δ^{c}_{a} = 1[/itex] if [itex]a = c[/itex]
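As a quick concrete check (my own example, not from the thread), this contraction can be verified for the flat metric in polar coordinates:

```python
import sympy as sp

r = sp.symbols('r', positive=True)
g = sp.diag(1, r**2)   # covariant metric g_ab for the plane in polar coordinates
g_inv = g.inv()        # contravariant metric g^ab

# g_ab g^bc should be the Kronecker delta, i.e. the identity matrix
assert sp.simplify(g * g_inv - sp.eye(2)) == sp.zeros(2)
```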

The Attempt at a Solution



I start with the definition of the metric connection

[itex]Γ^{a}_{bc} = \frac{1}{2} g^{ad} ( ∂_{b} g_{dc} + ∂_{c} g_{db} - ∂_{d} g_{bc} )[/itex]

And apply the chain rule to start the transformation
[itex] \frac{∂}{∂x^{a}} = \frac{∂x^{∝}}{∂x^{a}} \frac{∂}{∂x^{∝}}[/itex]

[itex]Γ^{a}_{bc} = \frac{1}{2} g^{ad} ( \frac{∂x^{β}}{∂x^{b}} ∂_{β} g_{dc} + \frac{∂x^{γ}}{∂x^{c}} ∂_{γ} g_{db} - \frac{∂x^{δ}}{∂x^{d}} ∂_{δ} g_{bc} )[/itex]

Then use the tensor transformation equations above (using parenthesis to indicate what the partial derivatives are acting on)

[itex]Γ^{a}_{bc} = \frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ} ( \frac{∂x^{β}}{∂x^{b}} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} g_{δγ}) + \frac{∂x^{γ}}{∂x^{c}} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}} g_{δβ}) - \frac{∂x^{δ}}{∂x^{d}} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}} g_{βγ}) )[/itex]

And using the product rule to separate out certain terms...

[itex]Γ^{a}_{bc} = \frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ} [( \frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} \frac{∂x^{β}}{∂x^{b}} ∂_{β} ( g_{δγ}) + \frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}} \frac{∂x^{γ}}{∂x^{c}} ∂_{γ} ( g_{δβ}) - \frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}} \frac{∂x^{δ}}{∂x^{d}} ∂_{δ} (g_{βγ}) ) + ( \frac{∂x^{β}}{∂x^{b}} g_{δγ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} g_{δβ} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) )][/itex]

Distributing that first coefficient

[itex]Γ^{a}_{bc} = \frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ} [ \frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} \frac{∂x^{β}}{∂x^{b}} ∂_{β} ( g_{δγ}) + \frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}} \frac{∂x^{γ}}{∂x^{c}} ∂_{γ} ( g_{δβ}) - \frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}} \frac{∂x^{δ}}{∂x^{d}} ∂_{δ} (g_{βγ}) ] +
\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ}[ \frac{∂x^{β}}{∂x^{b}} g_{δγ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} g_{δβ} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

Since all of the metrics have the same differentials out front, in the first big term I pull those out, cancel where applicable, and get

[itex]Γ^{a}_{bc} = \frac{1}{2} \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} g^{∝δ} [ ∂_{β} ( g_{δγ}) + ∂_{γ} ( g_{δβ}) - ∂_{δ} (g_{βγ}) ] +

\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ}[ \frac{∂x^{β}}{∂x^{b}} g_{δγ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} g_{δβ} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

which is clearly

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} +

\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} g^{∝δ}[ \frac{∂x^{β}}{∂x^{b}} g_{δγ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} g_{δβ} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

Now the first term clearly matches the transformation of a connection,

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} + \frac{∂x^{a}}{∂x^{∝}} \frac{∂^{2} x^{∝}}{∂x^{b} ∂x^{c}}[/itex]

But the second term has metrics in it, which the transformation of a connection obviously doesn't have. Fortunately we have metric contraction,
[itex]g_{ab} g^{bc} = δ^{c}_{a}[/itex]
where [itex]δ^{c}_{a}[/itex] is the Kronecker delta.

So distributing the metric gives

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} +

\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} [ \frac{∂x^{β}}{∂x^{b}} g^{∝δ} g_{δγ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} g^{∝δ} g_{δβ} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g^{∝δ} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

Which gives

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} +

\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} [ \frac{∂x^{β}}{∂x^{b}} δ^{∝}_{γ} ∂_{β} (\frac{∂x^{δ} ∂x^{γ}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} δ^{∝}_{β} ∂_{γ} (\frac{∂x^{δ} ∂x^{β}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g^{∝δ} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

Since we have a Kronecker delta contracted into two of the terms, only the components where its two indices are equal survive; everything else is multiplied by zero. Thus in the first term, with [itex]δ^{∝}_{γ}[/itex], I'll simply replace all the γ's with ∝'s, and in the second term, with [itex]δ^{∝}_{β}[/itex], I'll do the same, replacing β's with ∝'s.

[itex]Γ^{a}_{bc} = \frac{∂x^{a} ∂x^{β} ∂x^{γ}}{∂x^{∝} ∂x^{b} ∂x^{c}} Γ^{∝}_{βγ} +

\frac{1}{2} \frac{∂x^{a} ∂x^{d}}{∂x^{∝} ∂x^{δ}} [ \frac{∂x^{β}}{∂x^{b}} ∂_{β} (\frac{∂x^{δ} ∂x^{∝}}{∂x^{d} ∂x^{c}} ) + \frac{∂x^{γ}}{∂x^{c}} ∂_{γ} (\frac{∂x^{δ} ∂x^{∝}}{∂x^{d} ∂x^{b}}) - \frac{∂x^{δ}}{∂x^{d}} g^{∝δ} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}}) ][/itex]

Where I'm stuck is I don't know how to simplify that last term with [itex]g^{∝δ}g_{βγ}[/itex], since those two metric tensors share no indices. I see that there's a [itex]g^{∝δ}∂_{δ}[/itex], and doing some googling I've come across [itex]g^{∝δ}∂_{δ} = ∂^{∝}[/itex], though to be honest, while I get that [itex]∂_{a}[/itex] is shorthand for [itex]\frac{∂}{∂x^{a}}[/itex], I don't get what [itex]∂^{a}[/itex] could possibly mean, except perhaps just the inverse of [itex]\frac{∂}{∂x^{a}}[/itex]?

But even if I applied that, I'm still left with the covariant metric [itex]g_{βγ}[/itex].

I have one guess: Perhaps I could use raising/lowering of indices to get out of this.

[itex]g_{ab} T^{b}_{c} = T_{ac}[/itex]

but the only other objects the metrics could do this to are partial derivatives,

[itex]\frac{∂x^{δ}}{∂x^{d}} g^{∝δ} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}})[/itex]

And [itex]\frac{∂x^{β}}{∂x^{b}}[/itex], for example, is not a tensor (it is [itex]∂_{b}[/itex] acting on [itex]x^{β}[/itex], and as I transform the coordinates of [itex]x^{β}[/itex] I get partial derivatives, that [itex]∂_{b}[/itex] is now acting on, thus will get second partial derivatives which are not in the transformation of a tensor), so there's no tensors to do raising/lowering indices with, and I'm not sure if using the metric to raise/lower indices applies to non-tensors.

Secondly, the metric tensor is defined as
[itex]ds^{2} = g_{ab} dx^{a} dx^{b}[/itex]

And
[itex]\frac{∂x^{δ}}{∂x^{d}} g^{∝δ} g_{βγ} ∂_{δ} (\frac{∂x^{β} ∂x^{γ}}{∂x^{b} ∂x^{c}})[/itex]
if I expand it out, does have terms of the form
[itex]g_{ab} ∂^{2}x^{a} ∂x^{b}[/itex]
but those involve second derivatives and are partial derivatives as opposed to total derivatives.

Perhaps I could try getting some total derivatives, but at that point I'm getting wildly off-track and I'm not even sure how I could work that into those terms.

So, basically, I'm stuck trying to get rid of those metrics in that last term. I feel like there should be some obvious relation with [itex]T^{ab}T_{cd}[/itex], but I don't know what it is.

It looks like the way forward may be in the google result, but I don't know what the notation [itex]∂^{a}[/itex] means, other than perhaps [itex](\frac{∂}{∂x^{a}})^{-1}[/itex] or perhaps [itex]\frac{∂}{∂x_{a}}[/itex], though that doesn't make much sense, since I thought basis vectors were always contravariant.

I thought perhaps [itex]g^{ab}g_{cd}[/itex] may just be the identity matrix since [itex]g^{ab} = (g_{ab})^{-1}[/itex], but a page I found says otherwise (further down that page, in a question asked by TheGeometer): the identity matrix would come from [itex]g_{ab}g^{ab}[/itex], but [itex]g_{ab}g^{cd}[/itex] is just a rank (2, 2) tensor with no immediately obvious properties I could use to "get rid" of it.

Any help would be greatly appreciated. I'm really stuck, and even rather upset that I can't figure this out. I've clearly spent quite some time on it...
 

FAQ: Showing that Metric Connections transform as a Connection

What is a metric connection?

A metric connection is a connection on a space equipped with a metric that is compatible with that metric: it defines how vectors and tensors are differentiated and parallel transported in such a way that the metric is preserved.

How do you show that metric connections transform as a connection?

To show that the metric connection transforms as a connection, one substitutes the transformation laws of the metric and of the partial derivatives into the defining formula for [itex]Γ^{a}_{bc}[/itex] and verifies that the result has the connection transformation law: a tensor-like term plus the characteristic inhomogeneous term built from second derivatives of the coordinates.

Can you explain the Leibniz rule for metric connections?

The Leibniz rule states that the covariant derivative of a product of tensors is the sum of terms in which the derivative acts on one factor at a time, the other factors left unchanged. For a metric connection there is in addition metric compatibility: the covariant derivative of the metric itself vanishes, so the metric can be moved through the derivative freely.
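Written out (a standard statement of the rule, added here for reference):

```latex
\nabla_{a}\left(S^{b}\,T_{c}\right)
  = \left(\nabla_{a} S^{b}\right) T_{c} + S^{b}\left(\nabla_{a} T_{c}\right),
\qquad
\nabla_{a}\, g_{bc} = 0 \quad \text{(metric compatibility)}
```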

What is the significance of the condition that parallel transport preserves the inner product?

This condition is important because it ensures that the geometric structure of the space is preserved under parallel transport. In other words, parallel transported vectors will maintain their relative angles and lengths, allowing for a consistent measurement of distances and angles in the space.
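This preservation is equivalent to metric compatibility, [itex]∇_{a} g_{bc} = 0[/itex], which can be checked directly for a simple example (the flat plane in polar coordinates; the example and names are mine, not from the thread):

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
g = sp.diag(1, r**2)     # flat plane metric in polar coordinates
g_inv = g.inv()

# metric connection: Gamma^a_bc = (1/2) g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})
def Gamma(a, b, c):
    return sum(sp.Rational(1, 2) * g_inv[a, d] *
               (sp.diff(g[d, c], coords[b])
                + sp.diff(g[d, b], coords[c])
                - sp.diff(g[b, c], coords[d]))
               for d in range(2))

# covariant derivative of the metric:
# nabla_c g_ab = d_c g_ab - Gamma^d_ca g_db - Gamma^d_cb g_ad
def nabla_g(c, a, b):
    expr = sp.diff(g[a, b], coords[c])
    for d in range(2):
        expr -= Gamma(d, c, a) * g[d, b] + Gamma(d, c, b) * g[a, d]
    return sp.simplify(expr)

# metric compatibility: every component vanishes
assert all(nabla_g(c, a, b) == 0
           for c in range(2) for a in range(2) for b in range(2))
```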

How are metric connections used in practical applications?

Metric connections have many applications in different fields, such as physics, engineering, and computer science. They are used to study the curvature and geometry of spaces, which is important in understanding general relativity, fluid mechanics, and optimization problems. They are also used in machine learning and image processing algorithms where the structure of the data is important.
