Theorem about symmetric matrices?

In summary, the spectral theorem states that symmetric matrices are always diagonalizable, and that there is a basis of mutually perpendicular eigenvectors for them.
  • #1
PBRMEASAP
If you have a symmetric, nonsingular matrix [tex]A[/tex], is it always possible to find a matrix [tex]B[/tex] such that

[tex]B^T A B = 1[/tex],

where [tex]1[/tex] is the identity?
 
  • #2
I think there's a theorem that says symmetric matrices are diagonalizable. That should help.
 
  • #3
Okay, I think I see what you mean.

Since [tex]A[/tex] can be written as [tex]P^T D P[/tex], where [tex]P[/tex] is orthogonal and [tex]D[/tex] is diagonal, my problem reduces to finding a matrix [tex]C = PB[/tex] such that [tex]C^T D C = 1[/tex], which should not be a problem. Do I have the right idea?
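
For concreteness, here is a quick numerical sketch of this construction (a NumPy illustration of my own; it assumes the eigenvalues of [tex]A[/tex] are positive so that the square roots below are real; for a symmetric nonsingular [tex]A[/tex] with negative eigenvalues, a real [tex]B[/tex] can only reach a diagonal matrix of ±1 entries, by Sylvester's law of inertia):

[code]
import numpy as np

# example symmetric positive definite matrix (positivity assumed
# so that the square roots below are real)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# eigh gives A = V @ diag(w) @ V.T with V orthogonal, i.e. P = V.T above
w, V = np.linalg.eigh(A)

# with C = diag(1/sqrt(w)) we get C.T @ diag(w) @ C = identity,
# so B = P.T @ C = V @ diag(1/sqrt(w)) satisfies B.T @ A @ B = identity
B = V @ np.diag(1.0 / np.sqrt(w))

print(np.allclose(B.T @ A @ B, np.eye(2)))  # True
[/code]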

thanks,
PBR
 
  • #4
Sounds good.
 
  • #5
Thanks for your help :smile:
 
  • #6
extract from my book on linear algebra posted here:

The “spectral theorem” (symmetric matrices are diagonalizable)
Theorem: If A is symmetric, then R^n has a basis of eigenvectors for A.
proof: The real valued function f(x) = Ax.x has a maximum on the unit sphere in R^n, at which point the directional derivatives of f along the sphere all vanish, i.e. the gradient vector of f is perpendicular to the tangent space of the sphere at that point. But the tangent space at x is the subspace of vectors perpendicular to x, and the gradient of f at x is the vector 2Ax. Hence Ax is also perpendicular to the tangent space at x, i.e. either Ax is parallel to x or Ax = 0; in either case x is an eigenvector for A. That gives one eigenvector for A.

Now restrict A to the tangent space (through the origin) to the sphere at x. I.e. let v be a tangent vector, so that v.x = 0. Then Av.x = v.Ax = v.(cx) for some scalar c, so this is also zero, and hence A preserves this tangent space. Now A still has the property Av.w = v.Aw on this subspace, so the restriction of A has an eigenvector. Since we are in finite dimensions, repeating at most n times gives a basis of eigenvectors for A. (Note that although the subspace has no natural identification with R^(n-1), the argument above for producing an eigenvector was coordinate-free, and depended only on the property that Av.w = v.Aw, which still holds on the subspace.) QED.

Corollary (of proof): There is actually a basis of mutually perpendicular eigenvectors for a symmetric n by n matrix.


since a matrix whose columns are orthonormal vectors has inverse equal to its own transpose, this also diagonalizes the quadratic form, i.e. gives a diagonal matrix under the operation M goes to (P^T)MP.
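
as a quick numerical illustration of that last remark (a NumPy sketch; eigh returns the orthonormal eigenvector matrix P directly):

[code]
import numpy as np

# a symmetric matrix, not assumed positive definite
M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# columns of P are orthonormal eigenvectors, so the inverse of P is P.T
w, P = np.linalg.eigh(M)

print(np.allclose(P.T @ P, np.eye(3)))       # True: P is orthogonal
print(np.allclose(P.T @ M @ P, np.diag(w)))  # True: P^T M P is diagonal
[/code]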
 
  • #7
That is a very slick proof, Mathwonk. If I recall, it is similar to the one given in Apostol's Calculus. Who's the author of your book, by the way?

I guess I had forgotten that zero is a legitimate eigenvalue for a matrix. So am I correct in understanding that the spectral theorem applies even to singular symmetric matrices? Even so, I believe the requirement of being non-singular was necessary to make the final jump in answering my original question, since I don't think a (not necessarily orthogonal) transformation [tex]B^T D B = 1[/tex] can be found for a diagonal matrix [tex]D[/tex] with some zero diagonal elements (the rank of [tex]B^T D B[/tex] can never exceed the rank of [tex]D[/tex]).

Thanks for posting that proof!
 
  • #8
roy smith is the author of the book. it is only 15 pages long and includes proofs of existence of rational canonical form, jordan normal form, and the spectral theorem. it can be downloaded from his webpage at the math dept of the university of georgia. he probably learned that proof from some standard source, as it is well known. it occurs for example in lang's analysis I book.

http://www.math.uga.edu/~roy/
 

FAQ: Theorem about symmetric matrices?

What is the theorem about symmetric matrices?

The theorem about symmetric matrices (the spectral theorem) states that every real symmetric matrix can be diagonalized by an orthogonal matrix. This means it can be written as A = QDQ^T, where Q is an orthogonal matrix whose columns are eigenvectors of A and D is a diagonal matrix with the eigenvalues of A on the main diagonal.
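
For example, here is a small worked case (the matrix is chosen purely for illustration):

[tex]\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} = Q \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix} Q^T, \qquad Q = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix},[/tex]

where the columns of Q are unit eigenvectors for the eigenvalues 1 and 3.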

How do you know if a matrix is symmetric?

A matrix is symmetric if it is equal to its own transpose, i.e. A^T = A. Equivalently, its entries are mirrored across the main diagonal: the entry in row i, column j equals the entry in row j, column i.
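
A direct numerical check (a small sketch, assuming NumPy; the matrix is arbitrary):

[code]
import numpy as np

A = np.array([[1.0, 7.0],
              [7.0, 5.0]])

print(np.allclose(A, A.T))  # True exactly when A equals its own transpose
[/code]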

What are the properties of symmetric matrices?

Some properties of symmetric matrices include (a quick numerical check follows this list):

  • Their eigenvalues are real numbers.
  • Eigenvectors belonging to distinct eigenvalues are orthogonal, and an orthonormal basis of eigenvectors always exists.
  • Their rank is equal to the number of non-zero eigenvalues.
  • They can be diagonalized by an orthogonal matrix.
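
A NumPy sketch verifying these properties (the matrix is deliberately chosen to be singular):

[code]
import numpy as np

# singular symmetric matrix: eigenvalues 0 and 2, so rank 1
M = np.array([[1.0, 1.0],
              [1.0, 1.0]])

w, Q = np.linalg.eigh(M)

print(w)                                # real eigenvalues: [0. 2.]
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: orthonormal eigenvectors
print(np.linalg.matrix_rank(M))         # 1, the number of non-zero eigenvalues
[/code]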

How is the theorem about symmetric matrices used in real-world applications?

The theorem about symmetric matrices has many practical applications in fields such as physics, engineering, and computer science. Because covariance matrices are symmetric, it underlies finding the principal components in data analysis (a short sketch follows); it is also used to simplify calculations with quadratic forms and systems of linear equations.
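
As a minimal sketch of the principal-components use case (assuming NumPy; the random data below is purely illustrative), PCA amounts to an eigendecomposition of the symmetric covariance matrix:

[code]
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))    # 100 samples, 3 features (illustrative data)

Xc = X - X.mean(axis=0)          # center the data
C = np.cov(Xc, rowvar=False)     # symmetric 3x3 covariance matrix
w, Q = np.linalg.eigh(C)         # real eigenvalues, orthonormal eigenvectors

# principal components: eigenvectors sorted by decreasing variance (eigenvalue)
components = Q[:, np.argsort(w)[::-1]]
scores = Xc @ components         # data expressed in the principal axes
[/code]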

Can a non-square matrix be symmetric?

No, a non-square matrix cannot be symmetric because it cannot be equal to its own transpose. Symmetry requires the matrix to have the same number of rows and columns.
