Question about decomposition of matrices in SL(2,R)

In summary: a matrix S ∈ SL(2,ℝ) is similar to an orthogonal matrix exactly when it is diagonalizable over ℂ with eigenvalues of modulus 1. The conjugating matrix A can be built from the eigenvectors of S, and it is unique only up to multiplication by an invertible matrix that commutes with S.
  • #1
mnb96
Hello,

we are given a 2×2 matrix [itex]S[/itex] such that [itex]det(S)=1[/itex].
I would like to find a 2x2 invertible matrix [itex]A[/itex] such that: [itex]A S A^{-1} = R[/itex], where [itex]R[/itex] is an orthogonal matrix.

Note that the problem can alternatively be reformulated as: is it possible to decompose a matrix [itex]S \in SL(2,\mathbb{R})[/itex] in the following way: [tex]S=A^{-1}R A [/tex]where R is orthogonal and A is invertible?

Is this a well-known problem? To be honest, I don't have many ideas on how to tackle this problem, so even a suggestion that could get me on the right track would be very welcome.
 
  • #2
Every orthogonal matrix can be diagonalized. Not every matrix in ##SL(2,\mathbb{R})## can be diagonalized. So what you ask is impossible.
 
  • #3
You may count the numbers of degrees of freedom you have in either matrix.
 
  • #4
Why did you mention the diagonalizability of a matrix?
I am not asking if S is similar to a diagonal matrix. I am rather asking if S is similar to an orthogonal matrix.
 
  • #5
##R## orthogonal ⇒ ##R## diagonalizable ⇒ (if ##R## similar to ##S##) ##S## diagonalizable
##S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \in SL(2,\mathbb{R})## not diagonalizable, contradiction
⇒ ##S## not similar to ##R##
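The contradiction can also be checked numerically. A minimal sketch (numpy is my choice, not from the thread) showing that the shear matrix has the repeated eigenvalue 1 but only a one-dimensional eigenspace, hence is not diagonalizable:

```python
import numpy as np

# The shear matrix from the argument above: det = 1, but not diagonalizable.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Both eigenvalues equal 1 ...
eigvals = np.linalg.eigvals(S)
print(np.round(eigvals, 6))          # both 1

# ... yet rank(S - I) = 1, so the eigenspace of 1 has dimension
# 2 - 1 = 1 < 2, and S has no basis of eigenvectors.
rank = np.linalg.matrix_rank(S - np.eye(2))
print(rank)                          # 1
```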
 
  • #6
fresh_42 said:
##R## orthogonal ⇒ ##R## diagonalizable
I thought orthogonal matrices were not diagonalizable over ℝ in general.

Btw, I actually found a counterexample to your (micromass' and fresh's) statements above: consider the matrix [itex]
S = \begin{pmatrix}
1 & 2\\
-1 & -1
\end{pmatrix}
[/itex].
We can easily see that [itex]det(S)=1[/itex], and we can still decompose that matrix into: [tex]
S = \underbrace{\begin{pmatrix}
-1 & -1\\
0 & 1
\end{pmatrix} }_{A^{-1}}\;

\underbrace{\begin{pmatrix}
0 & -1\\
1 & 0
\end{pmatrix}}_R\;

\underbrace{\begin{pmatrix}
-1 & -1\\
0 & 1
\end{pmatrix}}_A
[/tex]

where A is invertible and R is orthogonal, as required. This proves that there exist matrices in SL(2,ℝ) that are similar to orthogonal matrices. Note also that S is not diagonalizable over ℝ.
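The decomposition above can be verified numerically; a quick sketch (numpy, my choice):

```python
import numpy as np

# The matrices from the decomposition above.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
A = np.array([[-1.0, -1.0],
              [ 0.0,  1.0]])
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# R is orthogonal: R^T R = I.
print(np.allclose(R.T @ R, np.eye(2)))           # True

# Check S = A^{-1} R A (here A happens to be its own inverse).
print(np.allclose(np.linalg.inv(A) @ R @ A, S))  # True
```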
 
  • #7
mnb96 said:
I thought orthogonal matrices were not diagonalizable over ℝ in general.

They are over ##\mathbb{C}##.

Btw, I actually found a counterexample to your (micromass' and fresh's) statements above: consider the matrix [itex]
S = \begin{pmatrix}
1 & 2\\
-1 & -1
\end{pmatrix}
[/itex].
We can easily see that [itex]det(S)=1[/itex], and we can still decompose that matrix into: [tex]
S = \underbrace{\begin{pmatrix}
-1 & -1\\
0 & 1
\end{pmatrix} }_{A^{-1}}\;

\underbrace{\begin{pmatrix}
0 & -1\\
1 & 0
\end{pmatrix}}_R\;

\underbrace{\begin{pmatrix}
-1 & -1\\
0 & 1
\end{pmatrix}}_A
[/tex]

where A is invertible and R is orthogonal, as required. This proves that there exist matrices in SL(2,ℝ) that are similar to orthogonal matrices. Note also that S is not diagonalizable over ℝ.

##S## is diagonalizable over ##\mathbb{C}##. And of course there exist matrices in ##SL(2,\mathbb{R})## similar to orthogonal matrices; the identity matrix is an example. The point is that not all matrices in ##SL(2,\mathbb{R})## are similar to orthogonal matrices.
 
  • #8
Furthermore, what you ask is possible exactly for those matrices in ##SL(2,\mathbb{R})## that are diagonalizable (over ##\mathbb{C}##) with eigenvalues satisfying ##|\lambda|=1## (the eigenvalues may be complex).
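For the matrix from the earlier counterexample this criterion can be checked directly; a small numpy sketch (mine, not from the thread):

```python
import numpy as np

# Eigenvalues of the example matrix: they are +-i, so |lambda| = 1,
# consistent with the criterion above.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
eigvals = np.linalg.eigvals(S)
print(np.round(eigvals, 6))               # +-i (order may vary)
print(np.allclose(np.abs(eigvals), 1.0))  # True
```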
 
  • #9
Thanks micromass,

I think I understand your remarks, but I still have a doubt.

Given a matrix [itex]\mathrm{S} \in SL(2,\mathbb{R})[/itex], and assuming there exists a 2×2 real matrix A such that [itex]A^{-1}SA = R[/itex], we can deduce that S must be diagonalizable (over ℂ), since every rotation matrix R is. That means that we can write: [tex](AC)^{-1}\,S\,(AC) = \Lambda[/tex] where we used the diagonalization [itex]R=C\,\Lambda\,C^{-1}[/itex] (note that both C and Λ are in general complex).

Now, starting from the sole knowledge of S we could perform an eigendecomposition of S and obtain [itex]S=Q\,\Lambda\,Q^{-1}[/itex], where [itex]Q=AC[/itex]. But then, how do we extract the real matrix A from the complex matrix Q?

For instance, in the example I gave above where [itex]
S = \begin{pmatrix}
1 & 2\\
-1 & -1
\end{pmatrix}
[/itex] the eigendecomposition of S would give [itex]S=Q \Lambda Q^{-1}[/itex] where: [tex]Q=
\frac{\sqrt{6}}{6}\begin{pmatrix}
2 & 2\\
-1+i & -1-i
\end{pmatrix}
[/tex]
[tex]\Lambda=
\begin{pmatrix}
i & 0\\
0 & -i
\end{pmatrix}
[/tex]

How do we extract [itex]A=
\begin{pmatrix}
-1 & -1\\
0 & 1
\end{pmatrix}
[/itex] from the complex matrix Q ?
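numpy can reproduce such an eigendecomposition, though its eigenvectors are unit-normalized and only determined up to a complex scalar each, so they need not coincide with the Q written above; a sketch:

```python
import numpy as np

S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])

# numpy returns *an* eigendecomposition S = Q Lam Q^{-1}; each column of Q
# is a unit-norm eigenvector, fixed only up to a complex scalar.
eigvals, Q = np.linalg.eig(S)
Lam = np.diag(eigvals)
print(np.round(eigvals, 6))                        # eigenvalues +-i
print(np.allclose(Q @ Lam @ np.linalg.inv(Q), S))  # True
```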
 
  • #10
Ok, I think this question has been almost answered.

Summarizing, given a matrix [itex]S\in SL(2,\mathbb{R})[/itex] we ask whether there exist an invertible matrix [itex]A\in GL(2,\mathbb{R})[/itex] and a rotation matrix [itex]R\in SO(2,\mathbb{R})[/itex] such that [itex]A^{-1}SA=R[/itex].

Since [itex]R[/itex] is diagonalizable over ℂ (i.e. [itex]R=C\Lambda C^{-1}[/itex]), we have that:

[tex]S=(AC)\Lambda (C^{-1}A^{-1}) = Q\Lambda Q^{-1} \qquad\qquad\qquad \mathrm{(1)}[/tex]
From the above equation we can deduce that [itex]S[/itex] must be diagonalizable and must have eigenvalues of modulus 1. These are necessary conditions. They would also be sufficient if we add the requirement that the eigenvectors of [itex]S[/itex] can be chosen to have the same real part.

Since [itex]Q=AC[/itex] we can find a matrix [itex]A=QC^{-1}[/itex], where the columns of [itex]C[/itex] contain the eigenvectors of [itex]R[/itex], for instance [itex]C=\begin{pmatrix}
1 & 1\\
i & -i
\end{pmatrix}
[/itex].
It can be verified that when the eigenvectors of [itex]S[/itex] have the same real part, then the matrix [itex]A[/itex] is real.

However, the matrix [itex]A[/itex] is not unique. For example, if [itex]M[/itex] is an invertible matrix that commutes with [itex]S[/itex] (i.e. [itex]MS=SM[/itex]), then replacing [itex]A[/itex] with [itex]MA[/itex] leaves Equation (1) unchanged.
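The whole recipe can be sketched numerically (numpy; the matching of the columns of C to numpy's eigenvalue ordering is my own bookkeeping, not from the post):

```python
import numpy as np

# Sketch of the recipe above: diagonalize S, then set A = Q C^{-1}, where
# the columns of C are eigenvectors of the rotation R = [[0,-1],[1,0]],
# matched to the same eigenvalues +-i in the order numpy returns them.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
eigvals, Q = np.linalg.eig(S)          # S = Q diag(eigvals) Q^{-1}

# For R = [[0,-1],[1,0]]: eigenvector (1,-i) for +i, and (1,i) for -i.
cols = [np.array([1.0, -1.0j]) if np.isclose(lam, 1j) else np.array([1.0, 1.0j])
        for lam in eigvals]
C = np.column_stack(cols)

A = (Q @ np.linalg.inv(C)).real        # imaginary part is rounding noise
R = np.linalg.inv(A) @ S @ A           # conjugating S back gives a rotation

print(np.allclose(R.T @ R, np.eye(2))) # True: R is orthogonal
```

Note that this A generally differs from the one found by hand earlier, because numpy fixes a different scaling of the eigenvectors; both conjugate S to a rotation.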
 

Related to Question about decomposition of matrices in SL(2,R)

What is the definition of decomposition of matrices in SL(2,R)?

The decomposition of matrices in SL(2,R) refers to breaking down a matrix in SL(2,R) into simpler, more easily understandable components. This is done by finding matrices that, when multiplied together, equal the original matrix.

What are the benefits of decomposing matrices in SL(2,R)?

Decomposing matrices in SL(2,R) can make calculations and manipulations easier, as well as provide insight into the structure and properties of the original matrix. It can also help in solving systems of linear equations and understanding transformations in geometry.

What are the different methods of decomposition in SL(2,R)?

There are several methods of decomposition in SL(2,R), including LU decomposition, QR decomposition, and Singular Value Decomposition (SVD). Each method has its own advantages and is used in different situations depending on the properties of the original matrix and the desired outcome.

What is the difference between decomposition and factorization of matrices in SL(2,R)?

Decomposition and factorization are often used interchangeably, but they have slightly different meanings. Decomposition refers to breaking a matrix down into simpler components, while factorization is the process of finding factors that, when multiplied together, equal the original matrix.

How is decomposition of matrices in SL(2,R) used in real-world applications?

Decomposition of matrices in SL(2,R) has many real-world applications, such as in computer graphics, data compression, and solving systems of linear equations. It is also used in fields such as physics, engineering, and economics to analyze and model complex systems.
