Symmetry of Orthogonally diagonalizable matrix

In summary: The claim that the spanning vectors must be linearly independent is incorrect. However, if A is orthogonal, then you can use that an orthogonal matrix is invertible (the transpose is the inverse), and that the columns of an invertible matrix are linearly independent.
  • #1
DmytriE
Can someone confirm or refute my thinking regarding the diagonalizability of an orthogonal matrix and whether it is symmetric?

A = [b1, b2, ..., bn] | H = Span {b1, b2, ..., bn}. Based on the definition of the span, we can conclude that all of the vectors within A are linearly independent. Furthermore, we can then conclude that Rank(A) = n.

If Rank(A) = n, then none of the vectors in A can be written as a linear combination of the other n-1 vectors. Since A can be row-reduced to the identity matrix, and the transpose of the identity matrix is itself, can it be concluded that the original matrix A is also symmetric (##A^T = A##)?
 
  • #2
DmytriE said:
Based on the definition of the span, we can conclude that all of the vectors within A are linearly independent.
This is incorrect. However, if A is orthogonal, then you can use that an orthogonal matrix is invertible (the transpose is the inverse), and that the columns of an invertible matrix are linearly independent.
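For example, here is a minimal numpy check of both facts, using a rotation matrix as an illustrative orthogonal ##A## (the angle is arbitrary):

```python
import numpy as np

a = 0.3  # an arbitrary angle
A = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])  # an orthogonal matrix

print(np.allclose(A.T @ A, np.eye(2)))   # True: the transpose is the inverse
print(np.linalg.matrix_rank(A))          # 2: the columns are linearly independent
```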
 
  • #3
Fredrik said:
This is incorrect. However, if A is orthogonal, then you can use that an orthogonal matrix is invertible (the transpose is the inverse), and that the columns of an invertible matrix are linearly independent.

When you say that this is incorrect, are you referring to my rationale regarding symmetric matrices, or to the linear independence of the vectors?
 
  • #4
I was referring to the part of your post that I quoted. There's nothing about the definition of "span" that allows you to conclude that ##\{b_1,\dots,b_n\}## is linearly independent.
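For instance (a small numpy sketch with an arbitrary choice of vectors), two vectors can span a subspace without being linearly independent:

```python
import numpy as np

# b1 and b2 span the same line, so Span{b1, b2} has dimension 1,
# even though the matrix [b1, b2] has two columns.
b1 = np.array([1.0, 0.0])
b2 = np.array([2.0, 0.0])
A = np.column_stack([b1, b2])

print(np.linalg.matrix_rank(A))   # 1: the columns are linearly dependent
```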
 
  • #5
Fredrik said:
I was referring to the part of your post that I quoted. There's nothing about the definition of "span" that allows you to conclude that ##\{b_1,\dots,b_n\}## is linearly independent.

Unless of course you know somehow that the dimension of ##\text{span}\{b_1,...,b_n\}## is ##n##.
 
  • #6
DmytriE said:
Can someone confirm or refute my thinking regarding the diagonalizability of an orthogonal matrix and whether it is symmetric?

A = [b1, b2, ..., bn] | H = Span {b1, b2, ..., bn}.
Do you intend here that H is itself n-dimensional? If so, you need to say that. If not, your following statement is not true.

Based on the definition of the span, we can conclude that all of the vectors within A are linearly independent. Furthermore, we can then conclude that Rank(A) = n.

If Rank(A) = n, then none of the vectors in A can be written as a linear combination of the other n-1 vectors. Since A can be row-reduced to the identity matrix, and the transpose of the identity matrix is itself, can it be concluded that the original matrix A is also symmetric (##A^T = A##)?
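It cannot. As a minimal numpy sketch (the shear matrix below is just an illustrative choice), a matrix can have rank n and row-reduce to the identity without being symmetric:

```python
import numpy as np

# A shear matrix: rank 2, row-reduces to the identity, yet not equal to its transpose.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.matrix_rank(A))   # 2: invertible
print(np.allclose(A, A.T))        # False: A is not symmetric
```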
 
  • #7
Orthogonal matrices are a particular case of unitary matrices, which are in turn a particular case of normal matrices. Any normal matrix ##A## is unitarily diagonalizable, i.e. it can be represented as ##A=UDU^*##, where ##U## is a unitary matrix and ##D## is a diagonal matrix (generally with complex entries).

Orthogonal matrices do not need to be symmetric, a rotation matrix $$R_\alpha = \left(\begin{array}{cc} \cos \alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{array}\right)$$ with ##\alpha\ne \pi n## can serve as an example.

On the other hand, there are symmetric orthogonal matrices, for example the identity matrix ##I## or the matrices $$\left(\begin{array}{cc} 1 & 0 \\ 0 &-1 \end{array}\right), \qquad \left(\begin{array}{cc} \cos \alpha & \sin\alpha \\ \sin\alpha &- \cos\alpha \end{array}\right) $$ (the last two matrices are unitarily equivalent).
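A numerical illustration of both points (a sketch in numpy; the angle is arbitrary): ##R_\alpha## is orthogonal but not symmetric, and it is unitarily diagonalizable with complex eigenvalues ##e^{\pm i\alpha}##.

```python
import numpy as np

a = np.pi / 5
R = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])

print(np.allclose(R.T @ R, np.eye(2)))           # True: R is orthogonal
print(np.allclose(R, R.T))                       # False: R is not symmetric

# Unitary diagonalization R = U D U*, with eigenvalues exp(+-i*a)
w, U = np.linalg.eig(R)
D = np.diag(w)
print(np.allclose(np.conj(U).T @ U, np.eye(2)))  # True: U is unitary
print(np.allclose(U @ D @ np.conj(U).T, R))      # True: R = U D U*
```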
 

FAQ: Symmetry of Orthogonally diagonalizable matrix

What is a symmetric matrix and how is it related to orthogonally diagonalizable matrix?

A symmetric matrix is a square matrix that is equal to its transpose. This means that the elements above the main diagonal are the same as the elements below the main diagonal. A symmetric matrix is related to an orthogonally diagonalizable matrix because a symmetric matrix can be diagonalized by an orthogonal matrix, which means that it can be expressed as a product of three matrices: ##A = PDP^T##, where ##P## is an orthogonal matrix and ##D## is a diagonal matrix.
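A brief numpy sketch of this factorization (the specific symmetric matrix is just an example); np.linalg.eigh returns the eigenvalues and an orthogonal matrix of eigenvectors for a symmetric input:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # a symmetric matrix

eigvals, P = np.linalg.eigh(A)         # eigh is intended for symmetric/Hermitian matrices
D = np.diag(eigvals)

print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
print(np.allclose(P @ D @ P.T, A))      # True: A = P D P^T
```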

What is the significance of orthogonally diagonalizable matrices in linear algebra?

Orthogonally diagonalizable matrices have several important properties in linear algebra. They can simplify calculations involving matrix multiplication and inverses, and they can also help in solving systems of linear equations. Additionally, they have applications in areas such as physics, computer science, and cryptography.
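For instance (a minimal numpy sketch with an arbitrary symmetric matrix), matrix powers and the inverse reduce to operations on the diagonal factor:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, P = np.linalg.eigh(A)

# A^5 and A^-1 computed through the diagonalization A = P D P^T
A_pow5 = P @ np.diag(eigvals**5) @ P.T
A_inv  = P @ np.diag(1.0 / eigvals) @ P.T

print(np.allclose(A_pow5, np.linalg.matrix_power(A, 5)))  # True
print(np.allclose(A_inv, np.linalg.inv(A)))               # True
```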

How can you determine if a matrix is orthogonally diagonalizable?

A matrix is orthogonally diagonalizable if it can be diagonalized by an orthogonal matrix. This means that the matrix must have real eigenvalues and an orthonormal set of eigenvectors. To determine whether a matrix is orthogonally diagonalizable, you can use the spectral theorem, which states that a real matrix is orthogonally diagonalizable if and only if it is symmetric.
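In practice, for a real matrix this check reduces to comparing the matrix with its transpose; a minimal sketch (the helper name is just illustrative):

```python
import numpy as np

def is_orthogonally_diagonalizable(A, tol=1e-10):
    """For a real square matrix, orthogonal diagonalizability is equivalent to symmetry."""
    A = np.asarray(A, dtype=float)
    return A.ndim == 2 and A.shape[0] == A.shape[1] and np.allclose(A, A.T, atol=tol)

print(is_orthogonally_diagonalizable([[2, 1], [1, 3]]))  # True
print(is_orthogonally_diagonalizable([[1, 1], [0, 1]]))  # False
```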

What are the benefits of using orthogonally diagonalizable matrices over non-diagonalizable matrices?

One benefit of using orthogonally diagonalizable matrices is that they are easier to work with in calculations and can simplify solutions to systems of equations. They also have applications in areas such as optimization and data compression. Additionally, the orthogonal factor ##P## in the diagonalization preserves the lengths of vectors, which can be beneficial in certain applications.

Can a non-symmetric matrix be orthogonally diagonalizable?

No, a non-symmetric real matrix cannot be orthogonally diagonalizable. As mentioned earlier, the spectral theorem states that a real matrix is orthogonally diagonalizable if and only if it is symmetric. A non-symmetric matrix may still have real eigenvalues, but it cannot have a full orthonormal set of eigenvectors, which is what orthogonal diagonalization requires.
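The "only if" direction can also be read off from the factorization itself: if ##A = PDP^T## with ##D## diagonal, then
$$A^T = (PDP^T)^T = (P^T)^T D^T P^T = PDP^T = A,$$
since a diagonal matrix is its own transpose.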
