Orthogonal Matrix: Properties & Conditions

In summary: the matrix is orthogonal with respect to the metric [tex]\eta=\begin{pmatrix}2&1\\1&1\end{pmatrix}[/tex], in the sense that [tex]A^T\eta A=\eta[/tex].
  • #1
ali987
Dear all,

I have a matrix, namely A. I compute its eigenvalues in MATLAB, and all of them lie on the unit circle (their magnitudes equal 1). But A is not an orthogonal matrix (transpose(A) is not equal to inverse(A)). What other condition or relationship might hold for it?
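The situation described can be reproduced numerically; here is a minimal sketch (in Python with NumPy rather than MATLAB, and using a hypothetical 2x2 example matrix, not the poster's actual A) of the two checks involved:

```python
import numpy as np

# Hypothetical example matrix: all eigenvalues lie on the unit circle,
# yet the matrix is not orthogonal -- the situation described above.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals = np.linalg.eigvals(A)
on_circle = np.allclose(np.abs(eigvals), 1.0)       # |lambda| = 1 for every eigenvalue
orthogonal = np.allclose(A.T @ A, np.eye(len(A)))   # A^T A = I would mean orthogonal
print(on_circle, orthogonal)  # True False
```

So unit-circle eigenvalues alone do not force orthogonality.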
 
  • #2
Is your matrix real, or does it have complex entries? Is it invertible?
 
  • #3
It is definitely invertible. All entries are real, but the eigenvalues can be complex.
 
  • #4
If your matrix is n×n, do you have n distinct eigenvalues, or fewer than n?
 
  • #5
I can't say definitively, but according to the results of a numerical simulation, I had several identical eigenvalues, so there were fewer than n distinct eigenvalues. As mentioned, though, I can't extend this conclusion to the general case; these are the results of just one numerical example.

However, if I always have fewer than n distinct eigenvalues, what can I do?
 
  • #6
A unitary matrix has eigenvalues on the unit circle. Not sure if "all eigenvalues on the unit circle" implies "unitary" though. I don't have time to think about that right now.
 
  • #7
Unfortunately, that matrix is not unitary (or orthogonal, since all entries are real).

A unitary matrix has all of its eigenvalues on the unit circle, but if all eigenvalues of a matrix lie on the unit circle, that matrix is not necessarily unitary. That's the problem.
 
  • #8
Take a simple example:

[tex]A=\begin{pmatrix}1&1\\0&1\end{pmatrix}[/tex]

It has just one eigenvalue, namely 1, but it is not orthogonal.
 
  • #9
arkajad said:
Take a simple example:

[tex]A=\begin{pmatrix}1&1\\0&1\end{pmatrix}[/tex]

It has just one eigenvalue, namely 1, but it is not orthogonal.

Or even better, take

[tex]A=\begin{pmatrix}1&1\\0&-1\end{pmatrix}[/tex]

It has distinct eigenvalues (1 and -1) and distinct eigenvectors.

It should be intuitively obvious that your hypothesis that "eigenvalues on the unit circle" is sufficient is probably wrong. There are only n eigenvalues, but orthogonality requires n^2 relationships (the entries of A^T A = I) to hold.
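The counterexample can be verified numerically; a short sketch (assuming NumPy):

```python
import numpy as np

# Verifying the counterexample: distinct eigenvalues 1 and -1, both on
# the unit circle, yet A^T A is not the identity, so A is not orthogonal.
A = np.array([[1.0, 1.0],
              [0.0, -1.0]])
eigvals = np.sort(np.linalg.eigvals(A).real)
print(eigvals)    # [-1.  1.]
print(A.T @ A)    # [[1. 1.] [1. 2.]] -- not the identity
```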
 
  • #10
AlephZero said:
Or even better, take

[tex]A=\begin{pmatrix}1&1\\0&-1\end{pmatrix}[/tex]

With distinct eigenvalues (1 and -1) and vectors.

This matrix is nevertheless orthogonal with respect to the metric
[tex]\eta=\begin{pmatrix}2&1\\1&1\end{pmatrix}[/tex]
in the sense that
[tex]A^T\eta A=\eta[/tex]
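This claim is easy to check numerically; a minimal sketch (assuming NumPy):

```python
import numpy as np

# Checking that A satisfies A^T eta A = eta for the stated metric eta,
# even though A^T A != I (so A is not orthogonal in the usual sense).
A = np.array([[1.0, 1.0],
              [0.0, -1.0]])
eta = np.array([[2.0, 1.0],
                [1.0, 1.0]])
print(A.T @ eta @ A)                    # reproduces eta
print(np.allclose(A.T @ eta @ A, eta))  # True
```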
 
  • #11
Thanks, everyone, for the useful comments, but what about the main question?

We have a matrix (with real entries) all of whose eigenvalues lie on the unit circle, and the matrix is not orthogonal. What other conditions might hold for this matrix? In other words, how can we prove that all eigenvalues of such a matrix lie on the unit circle?
 

Related to Orthogonal Matrix: Properties & Conditions

What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose columns (and rows) form an orthonormal set. This means that the dot product of any two distinct columns or rows is equal to zero, and the dot product of a column or row with itself is equal to one.

What are the properties of an orthogonal matrix?

Some of the key properties of an orthogonal matrix include:

  • All columns and rows are orthogonal to each other.
  • The determinant of an orthogonal matrix is either 1 or -1.
  • The inverse of an orthogonal matrix is equal to its transpose.
  • The product of two orthogonal matrices is also an orthogonal matrix.

What are the conditions for a matrix to be orthogonal?

A matrix must satisfy the following conditions to be considered orthogonal:

  • All columns and rows must be unit vectors.
  • All columns and rows must be orthogonal to each other.
  • The determinant of the matrix must be either 1 or -1 (this follows automatically from the first two conditions).
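A minimal sketch checking these conditions on a 2x2 rotation matrix, a standard example of an orthogonal matrix (assuming NumPy; the angle theta is arbitrary):

```python
import numpy as np

# Check the three conditions above on a rotation matrix Q.
theta = 0.7  # arbitrary angle for this illustration
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

unit_columns = np.allclose(np.linalg.norm(Q, axis=0), 1.0)  # each column has length 1
orthonormal = np.allclose(Q.T @ Q, np.eye(2))               # columns pairwise orthogonal
det_ok = np.isclose(abs(np.linalg.det(Q)), 1.0)             # determinant is +/- 1
print(unit_columns, orthonormal, det_ok)  # True True True
```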

What is the significance of orthogonal matrices?

Orthogonal matrices have many applications in mathematics and science, including:

  • They are commonly used in linear algebra and matrix operations.
  • They can be used to rotate or reflect vectors in n-dimensional space.
  • They are crucial in the process of Gram-Schmidt orthogonalization, which is used to create an orthonormal basis for a vector space.
  • They are important in fields such as quantum mechanics, signal processing, and computer graphics.

How are orthogonal matrices related to orthogonal transformations?

An orthogonal matrix represents an orthogonal transformation, which is a linear transformation that preserves the length and angle of vectors. This means that the dot product of any two vectors remains the same before and after the transformation. In other words, the geometry of the vectors is not affected by an orthogonal transformation.
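The dot-product preservation can be illustrated with a short sketch (assuming NumPy, with a rotation matrix Q and two arbitrarily chosen vectors):

```python
import numpy as np

# Orthogonal transformations preserve dot products (and hence lengths
# and angles): u . v equals (Qu) . (Qv) for any vectors u, v.
theta = 1.2  # arbitrary rotation angle for this illustration
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
u = np.array([3.0, -1.0])
v = np.array([0.5, 2.0])
print(np.dot(u, v), np.dot(Q @ u, Q @ v))  # equal up to rounding
```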
