Understanding Matrix Inversion in SL(2,C)

In summary, the thread concerns the group SL(2,C): a 2x2 matrix V is built from a generic 4-vector v_mu and the vector of matrices sigma_mu := (1, sigma-vec), via V := v_mu sigma^mu. To solve for the 4-vector, one traces against a second vector of matrices sigma-tilde_mu := (1, -sigma-vec) and obtains each component as half the trace of the resulting matrix. The trace runs over the 2x2 matrix indices, which can be understood via an orthonormal basis and an isomorphism between the space of 2x2 self-adjoint matrices and R^4.
  • #1
LAHLH
Hi,

I'm just reading about the group SL(2,C). In the book that I'm using (Jones, Groups, Representations and Physics), he defines a 2x2 matrix from a generic 4-vector [tex]v_{\mu}[/tex] and a vector of matrices [tex] \sigma_{\mu}:=(1,\vec{\sigma})[/tex], as [tex] V:=v_{\mu}\sigma^{\mu}[/tex]

He now wants to invert this equation to solve for [tex]v_{\mu}[/tex], and he suggests tracing with another vector of matrices defined as [tex] \tilde{\sigma}_{\mu}:=(1,-\vec{\sigma})[/tex], and he obtains [tex] v_{\mu}=\tfrac{1}{2}Tr(\tilde{\sigma}_{\mu}V)[/tex]

I can't seem to get this: starting with [tex] V:=v_{\mu}\sigma^{\mu}[/tex] and then multiplying by [tex] \tilde{\sigma}_{\nu} [/tex] leads to [tex] \tilde{\sigma}_{\nu}V=v_{\mu}\sigma^{\mu}\tilde{\sigma}_{\nu}[/tex]

Now I'm not sure which indices I'm supposed to trace over.
 
  • #2


Your V is a 2x2 matrix. If you want to recover [tex]v_{\nu}[/tex] (i.e. the nu'th component of the vector v), you should take [tex]\frac{1}{2} Tr(\tilde{\sigma}_{\nu}V)[/tex], where nu is not summed over. In other words, if you want the first component, take the 2x2 matrix [tex]\tilde{\sigma}_1[/tex], left-multiply (though inside a trace order doesn't matter) it with the 2x2 matrix V, and take half the trace of the resultant 2x2 matrix. Try a simple example if this isn't clear; pick (1,0,0,0) and try to recover v_0.
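To make this concrete, here is a quick numerical check, a sketch in Python with NumPy (the 4-vector values below are made up for illustration). It builds V by contracting a 4-vector against (1, sigma-vec) and recovers each component as half a trace. With this contraction the basis is orthonormal under the half-trace, so sigma_mu itself does the recovering; with the book's raised-index [tex]\sigma^{\mu}=(1,-\vec{\sigma})[/tex] one would use [tex]\tilde{\sigma}_{\mu}[/tex] in the trace instead, which differs only by signs on the spatial parts.

```python
import numpy as np

# Identity plus the Pauli matrices: sigma = (1, sigma_x, sigma_y, sigma_z)
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sigma = [I2, sx, sy, sz]

# A generic real 4-vector (hypothetical test values)
v = np.array([1.5, -0.3, 0.7, 2.0])

# Build the 2x2 matrix V = v_mu sigma_mu
V = sum(v[mu] * sigma[mu] for mu in range(4))

# Recover each component as half the trace, one component at a time
# (no summation over mu here -- mu just picks which matrix to trace against)
recovered = np.array([0.5 * np.trace(sigma[mu] @ V) for mu in range(4)]).real
print(recovered)  # prints the original v, up to floating-point rounding
```

Running the suggested simple case v = (1, 0, 0, 0) gives V = I, and half the trace of I recovers v_0 = 1 while the traceless Pauli matrices give zero for the spatial components.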
 
  • #3


I'm going to write all indices as subscripts. [itex]\{I,\sigma_1,\sigma_2,\sigma_3\}[/itex], is a basis for the real vector space of complex 2×2 self-adjoint matrices. I'll call that space V. If we define an inner product by

[tex]\langle A,B\rangle=\frac{1}{2}\operatorname{Tr}(A^\dagger B)[/tex]

for all A,B in V, it's an orthonormal basis. So if we define [itex]\sigma_0=I[/itex], any [itex]x \in V[/itex] can be expressed as [itex]x=x_\mu \sigma_\mu[/itex], with [itex]x_\mu=\langle\sigma_\mu,x\rangle[/itex]. Note that all the [itex]x_\mu[/itex] are real. (This is implied by the facts that V is a real vector space and that [itex]\{\sigma_0,\sigma_1,\sigma_2,\sigma_3\}[/itex] is a basis of V).

[tex]x_\mu=\langle\sigma_\mu,x\rangle=\frac{1}{2}\operatorname{Tr}(\sigma_\mu^\dagger x)=\frac{1}{2}\operatorname{Tr}(\sigma_\mu x)=\frac{1}{2}(\sigma_\mu x)_{\nu\nu}[/tex]

(using [itex]\sigma_\mu^\dagger=\sigma_\mu[/itex], since each basis matrix is self-adjoint, and with the repeated 2×2 index [itex]\nu[/itex] summed over)

The map [itex]\mathbb R^4\ni (x_0,x_1,x_2,x_3)\mapsto x_\mu\sigma_\mu\in V[/itex] is an isomorphism. So V is isomorphic to [itex]\mathbb R^4[/itex].
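The orthonormality claim above is easy to verify numerically. A minimal sketch (Python/NumPy) computing the Gram matrix of the basis under the inner product ⟨A, B⟩ = ½ Tr(A†B):

```python
import numpy as np

# Basis {I, sigma_1, sigma_2, sigma_3} of the real vector space
# of complex 2x2 self-adjoint matrices
I2 = np.eye(2, dtype=complex)
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
basis = [I2, s1, s2, s3]

def inner(A, B):
    """Inner product <A, B> = (1/2) Tr(A^dagger B)."""
    return 0.5 * np.trace(A.conj().T @ B)

# The Gram matrix should be the 4x4 identity: the basis is orthonormal
gram = np.array([[inner(A, B) for B in basis] for A in basis])
assert np.allclose(gram, np.eye(4))
```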
 
  • #4


Thanks for the help.

I convinced myself of this in the end.

[tex]
\tilde{\sigma}_{\mu}V=\tilde{\sigma}_{\mu}(v_{\beta}\sigma^{\beta})
[/tex]
Tracing (over the 2x2 matrix indices, not the 4-vector indices, which I think is what was confusing me):
[tex] Tr( \tilde{\sigma}_{\mu}V)=( \tilde{\sigma}_{\mu}V)_{ii}=(\tilde{\sigma}_{\mu})_{ij}(v_{\beta}\sigma^{\beta})_{ji}=v_{\beta} (\tilde{\sigma}_{\mu})_{ij}(\sigma^{\beta})_{ji}[/tex]

Now we have that [tex] \tilde{\sigma}_{\mu}=(1,-\vec{\sigma}) [/tex] and [tex] \sigma^{\mu}=(1,-\vec{\sigma}) [/tex]. Therefore if [tex]\mu=\beta[/tex]: [tex] (\tilde{\sigma}_{\mu})_{ij}(\sigma^{\beta})_{ji}=Tr(1)=2 [/tex] (since e.g. [tex]1\cdot 1=1[/tex] and [tex] \sigma_{x}\sigma_{x}=\sigma_{y}\sigma_{y}=\sigma_{z}\sigma_{z}=1 [/tex]; no summation over [tex]\mu[/tex])

On the other hand, if [tex]\mu\neq\beta[/tex], then [tex] (\tilde{\sigma}_{\mu})_{ij}(\sigma^{\beta})_{ji} [/tex] is the trace of a single Pauli matrix (when one of the indices is 0) or, by the product identity [tex] \sigma_{i}\sigma_{j}=i\epsilon_{ijk}\sigma_{k} [/tex] (valid for [tex]i\neq j[/tex]), the trace of another Pauli matrix. Since the Pauli matrices are traceless, this trace of the product is zero.

Combining these facts:

[tex] Tr(\tilde{\sigma}_{\mu}V)=2v_{\beta}\delta^{\beta}_{\mu}=2v_{\mu} [/tex] which implies [tex] v_{\mu}=\tfrac{1}{2}Tr(\tilde{\sigma}_{\mu}V) [/tex]
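The key fact the derivation combines, Tr(σ̃_μ σ^β) = 2 δ^β_μ, can be checked directly. A small sketch using the thread's conventions σ^μ = σ̃_μ = (1, -σ-vec):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

# The thread's conventions: sigma^mu = sigma_tilde_mu = (1, -sigma_vec)
sigma_up = [I2, -s1, -s2, -s3]
sigma_tilde = [I2, -s1, -s2, -s3]

# Tr(sigma_tilde_mu sigma^beta) should equal 2 * delta_mu^beta:
# diagonal entries come from sigma_i^2 = 1, off-diagonal entries
# vanish because the Pauli matrices (and their products for i != j)
# are traceless
traces = np.array([[np.trace(sigma_tilde[mu] @ sigma_up[b])
                    for b in range(4)] for mu in range(4)])
assert np.allclose(traces, 2 * np.eye(4))
```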
 

FAQ: Understanding Matrix Inversion in SL(2,C)

1. What is a matrix inversion and why is it important in SL(2,C)?

Matrix inversion is the process of finding, for a given matrix, the matrix that multiplies it to give the identity. SL(2,C) is the special linear group of 2x2 complex matrices with determinant 1, and since it is a group, every element must have an inverse. Inversion is also what lets us undo the transformations these matrices represent and solve the linear systems they define.

2. How is matrix inversion performed in SL(2,C)?

For a 2x2 matrix, inversion is most easily done with the adjugate formula: the inverse of [[a, b], [c, d]] is (1/det) [[d, -b], [-c, a]]. For a matrix in SL(2,C) the determinant is 1, so the inverse is simply [[d, -b], [-c, a]]. Gauss-Jordan elimination also works, but it is unnecessary machinery for the 2x2 case.

3. What are the conditions for a matrix to be invertible in SL(2,C)?

A general 2x2 matrix is invertible exactly when its determinant is non-zero. Every matrix in SL(2,C) has determinant 1 by definition, so this condition is automatically satisfied: no element of SL(2,C) is singular.

4. Can all matrices in SL(2,C) be inverted?

Yes. Every matrix in SL(2,C) has determinant 1, which is non-zero, so every element of the group can be inverted. (A 2x2 complex matrix with determinant 0 is not invertible, but such a matrix is not in SL(2,C) in the first place.)

5. Are there any shortcuts or special techniques for matrix inversion in SL(2,C)?

Yes. The adjugate formula gives the inverse of a 2x2 matrix directly, and for SL(2,C) the determinant factor drops out since det = 1. It is also useful to know that det(A⁻¹) = 1/det(A), so the inverse of an SL(2,C) matrix again has determinant 1 and stays in the group.
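As an illustration of the adjugate shortcut, a minimal sketch (the SL(2,C) element below is a hypothetical example chosen so that det = 1):

```python
import numpy as np

def sl2c_inverse(A):
    """Inverse of a 2x2 matrix with det = 1: swap the diagonal entries
    and negate the off-diagonal ones (the adjugate formula, with the
    1/det factor dropping out because det = 1)."""
    a, b = A[0]
    c, d = A[1]
    return np.array([[d, -b], [-c, a]])

# A hypothetical SL(2,C) element: det = 2*1 - 1*1 = 1
A = np.array([[2, 1], [1, 1]], dtype=complex)
assert np.isclose(np.linalg.det(A), 1)

# The adjugate formula really does invert A
assert np.allclose(sl2c_inverse(A) @ A, np.eye(2))
```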
