Proving Hv = 0 for a Symmetric Matrix with Orthogonal Diagonalization

In summary: if H is an n by n real symmetric matrix and H^(k+1)v = 0 for a real column n-vector v, then Hv = 0. This can be proven by orthogonally diagonalizing H and using the fact that the null space of a real diagonal matrix equals the null space of any of its powers: the transformed vector can have nonzero entries only in positions where the diagonal entries are 0, so multiplying by the diagonal matrix once more still gives the zero vector.
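A quick numerical sanity check of the claim, using an assumed example matrix and vector (both hypothetical, chosen so that H^(k+1)v = 0):

```python
import numpy as np

# Hypothetical example: a singular real symmetric matrix H and a vector v
# with H^(k+1) v = 0; we check that Hv = 0 as well, as the thread proves.
H = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
v = np.array([1.0, -1.0, 5.0])

k = 2
print(np.allclose(np.linalg.matrix_power(H, k + 1) @ v, 0))  # True
print(np.allclose(H @ v, 0))                                 # True
```
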
  • #1
moo5003

Homework Statement


Suppose H is an n by n real symmetric matrix, v is a real column n-vector, and H^(k+1)v = 0. Prove that Hv = 0.

The Attempt at a Solution



Since H is a real symmetric matrix we can find an orthogonal matrix Q to diagonalize it:

Let M = Q transpose, so that H = MAQ with A diagonal.

MA^(k+1)Qv = H^(k+1)v = 0

Implying (since M is invertible)

A^(k+1)Qv = 0

This is where I'm stuck; I'm not sure how to proceed.

I'm pretty sure it's not possible to somehow get Q removed from the equation, because that would imply v or A would have to be 0, but this does not follow, since I can easily cook up an example where a symmetric matrix satisfies H^(k+1)v = 0 with v != 0. Thus any hints would be appreciated.
 
  • #2
If you work in the basis where H is diagonal then replace the phrase "n by n real symmetric matrix" in your premise with "n by n DIAGONAL matrix". Can you prove that one?
 
  • #3
Assume v is not zero. Then, since Q is invertible, Qv is not zero. So Qv must be in the nullspace of A^(k+1). What is the null space of a diagonal matrix, and how is this related to the nullspace of its powers?
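StatusX's hint can be illustrated numerically (assumed example matrix): for a real diagonal matrix A, the null space of A^(k+1) equals the null space of A, because lambda^(k+1) = 0 forces lambda = 0 for real lambda.

```python
import numpy as np

# A real diagonal matrix and a vector that is nonzero only in the
# positions where the diagonal entries are zero.
A = np.diag([2.0, 0.0, -3.0, 0.0])
w = np.array([0.0, 4.0, 0.0, -1.0])

k = 5
print(np.allclose(np.linalg.matrix_power(A, k + 1) @ w, 0))  # True
print(np.allclose(A @ w, 0))                                 # True
```
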
 
  • #4
Dick said:
If you work in the basis where H is diagonal then replace the phrase "n by n real symmetric matrix" in your premise with "n by n DIAGONAL matrix". Can you prove that one?

That was the proof shown in class, where you choose to represent the matrix using a basis consisting of eigenvectors, though I didn't start my proof that way, and I believe the diagonalization method should work if I could finish the last step.

StatusX: It would imply that Qv is in the nullspace of A itself, since A^(k+1) simply raises each diagonal entry to the power k+1 (diagonal matrix), though I'm unsure how that shows Hv = 0.

You would get AQv = 0

The transpose of Qv (easier to type as a row vector) would be something along the lines of:

[0, 0, 0, ..., a_i, ..., 0], where a_i can be any real number corresponding to a 0 in the diagonal entry a_ii of the diagonal matrix A.
 
  • #5
moo5003 said:
You would get AQv = 0

Right, and so MAQv = Hv = 0.
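The whole chain of the argument can be sketched end to end (example matrix and vector assumed, diagonalization done with numpy's `eigh` for symmetric matrices):

```python
import numpy as np

# Diagonalize a real symmetric H as H = M A Q with M = Q^T orthogonal
# and A diagonal, then follow the thread's argument:
# A^(k+1) Q v = 0  =>  A Q v = 0  =>  Hv = M A Q v = 0.
H = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])        # symmetric and singular
eigvals, M = np.linalg.eigh(H)         # columns of M are orthonormal eigenvectors
A = np.diag(eigvals)
Q = M.T                                # so H = M @ A @ Q

v = np.array([0.0, 0.0, 7.0])          # satisfies H^(k+1) v = 0
k = 3
print(np.allclose(np.linalg.matrix_power(A, k + 1) @ Q @ v, 0))  # True
print(np.allclose(A @ Q @ v, 0))       # null(A^(k+1)) = null(A): True
print(np.allclose(M @ A @ Q @ v, 0))   # Hv = 0: True
```
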
 

FAQ: Proving Hv = 0 for a Symmetric Matrix with Orthogonal Diagonalization

What is a symmetric matrix?

A symmetric matrix is a square matrix that equals its own transpose: if the matrix is denoted by A, then A[i,j] = A[j,i] for all indices i and j.

How can you prove that a matrix is symmetric?

To prove that a matrix is symmetric, you can check whether the entries mirror each other across the main diagonal. Equivalently, use the transpose property: A is symmetric if and only if A^T = A.
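The transpose check above is one line in code (example matrix assumed):

```python
import numpy as np

# A matrix is symmetric exactly when it equals its transpose.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 6.0],
              [3.0, 6.0, 9.0]])
print(np.allclose(A, A.T))  # True
```
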

What is the importance of symmetric matrices?

Symmetric matrices have many important applications in mathematics, physics, and engineering. They can be used to represent real-world data such as distances between cities, and also have properties that make them useful for solving equations and performing calculations.

Are all symmetric matrices invertible?

No, not all symmetric matrices are invertible. A symmetric matrix is invertible if and only if all its eigenvalues are non-zero. If any eigenvalue is zero, then the matrix is not invertible.

How can you prove that a symmetric matrix is positive definite?

To prove that a symmetric matrix is positive definite, you can use its eigenvalues: a symmetric matrix is positive definite if and only if all its eigenvalues are positive. (Eigenvectors corresponding to distinct eigenvalues of a symmetric matrix are automatically orthogonal; that is a general property of symmetric matrices, not an extra condition for positive definiteness.)
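The eigenvalue test for positive definiteness, sketched on an assumed example matrix:

```python
import numpy as np

# A symmetric matrix is positive definite iff all eigenvalues are > 0.
# np.linalg.eigh is the symmetric-matrix eigensolver.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
eigvals = np.linalg.eigh(A)[0]
print(bool(np.all(eigvals > 0)))  # True
```
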
