Real Eigenvalues and 3 Orthogonal Eigenvectors for Matrix (c,d)

In summary, the conversation discusses finding real eigenvalues and three orthogonal eigenvectors for a given matrix. It concludes that the matrix must be symmetric (so c = 5) and that any real choice of d then works. The spectral theorem and properties of self-adjoint operators are discussed, and Gram-Schmidt is suggested as a way to produce orthonormal vectors if needed.
  • #1
robierob12

Homework Statement



For which real numbers c and d does the matrix have real eigenvalues and three orthogonal eigenvectors?

[ 1  2  0 ]
[ 2  d  c ]
[ 0  5  3 ]

Homework Equations



I'm having trouble getting started on this one.

I've tried solving for the eigenvalues while treating c and d as constants, but that doesn't seem to help. Can anyone nudge me in the right direction?



The Attempt at a Solution

 
  • #2
If the matrix has real eigenvalues and three orthogonal eigenvectors, then the corresponding linear transformation is self-adjoint. What does that mean in terms of its matrix representation?
 
  • #3
So... would I find the cofactor matrix of

[ 1  2  0 ]
[ 2  d  c ]
[ 0  5  3 ]

take the transpose to find the adj.

which I did actually.

New values appear in the d and c spots... it is its own adjoint, so those would be the fill-in values?

I think I am off track
 
  • #4
Not the adjugate, the adjoint. The conjugate transpose. You are making this much harder than it should be.
 
  • #5
I don't know why this one is so hard for me...
does this have to do with the real spectral theorem?
 
  • #6
It's the converse of the spectral theorem. The spectral theorem says that if an operator has a certain property, then its eigenspaces have a certain property. This problem says that if the eigenspaces have a certain property, then the operator has a certain property. If you write the linear transformation corresponding to the matrix as M, then self-adjoint means (Mx)·y = x·(My). Can you show that's true in this case? What might that have to do with a certain matrix being Hermitian?
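To spell out the hinted step, as a sketch using the standard real dot product: for any real matrix M,

(Mx)·y = (Mx)^T y = x^T M^T y = x·(M^T y),

so the condition (Mx)·y = x·(My) for all x and y holds exactly when M^T = M, i.e. when the matrix is symmetric (the real case of Hermitian).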
 
  • #7
I don't think I've seen the operations written the way you're showing them...
I do see something now, though.

Won't the matrix

[ 1  2  0 ]
[ 2  d  c ]
[ 0  5  3 ]


have to be

[ 1  2  0 ]
[ 2  d  5 ]
[ 0  5  3 ]

It will have to be symmetric?

I don't see how to find d other than by trying different values until the resulting eigenvectors come out orthogonal. Any real choice of d should give real eigenvalues by the spectral theorem?
 
  • #8
Correct. Any real choice for d will work.
 
  • #9
That easy?

[ 1  2  0 ]
[ 2  0  5 ]
[ 0  5  3 ]

[ 1  2  0 ]
[ 2  1  5 ]
[ 0  5  3 ]

[ 1  2  0 ]
[ 2  2  5 ]
[ 0  5  3 ]

[ 1  2  0 ]
[ 2  3  5 ]
[ 0  5  3 ]

will all have orthogonal eigenvectors?

I'll try a few to see if it works.
 
  • #10
You don't trust the spectral theorem? That's healthy skepticism! Check away.
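A quick numerical check of this, as a minimal sketch assuming NumPy (the values of d below are just the examples listed above, with c = 5 so the matrix is symmetric):

```python
import numpy as np

# With c = 5 the matrix is symmetric, so for any real d the eigenvalues
# should be real and the three eigenvectors mutually orthogonal.
for d in [0, 1, 2, 3]:
    A = np.array([[1.0, 2.0, 0.0],
                  [2.0, float(d), 5.0],
                  [0.0, 5.0, 3.0]])
    eigvals, eigvecs = np.linalg.eig(A)
    assert np.allclose(eigvals.imag, 0)          # eigenvalues are real
    gram = eigvecs.T @ eigvecs                   # pairwise dot products of the unit eigenvectors
    assert np.allclose(gram, np.eye(3))          # eigenvectors are orthogonal (in fact orthonormal)
    print(f"d = {d}: eigenvalues {np.round(eigvals.real, 4)}")
```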
 
  • #11
One more question... is it possible to choose d so that one of the eigenvalues is repeated, making it so there are not actually three orthogonal eigenvectors? Because I want to say right now that any choice of d will work.
 
  • #12
I appreciate the help on this one... thanks a bunch.
 
  • #13
A repeated eigenvalue does not mean you don't have three orthogonal eigenvectors. The zero matrix has them. Take (1,0,0), (0,1,0) and (0,0,1). They are orthogonal and all have eigenvalue 0. You're welcome!
 
  • #14
Nice...

Even if that weren't so, Gram-Schmidt could turn them into an orthonormal set, I suppose.
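For reference, a minimal Gram-Schmidt sketch in Python (the function name and example vectors are just illustrative); within an eigenspace of a repeated eigenvalue, this is what would turn independent eigenvectors into orthonormal ones:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w = w - np.dot(q, w) * q        # remove the component along each earlier direction
        basis.append(w / np.linalg.norm(w)) # normalize what remains
    return basis

# Example: three independent (but not orthogonal) vectors
q1, q2, q3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(np.dot(q1, q2), np.dot(q1, q3), np.dot(q2, q3))   # all approximately 0
```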
 
  • #15
Yes. The problem comes with matrices like [[1,1],[0,1]]. 1 is a double eigenvalue - but there is only one linearly independent eigenvector.
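That defect can also be seen numerically; a short sketch, again assuming NumPy:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                                   # [1. 1.] -- 1 is a double eigenvalue
# The eigenspace for eigenvalue 1 is the null space of A - I, which has rank 1 here,
# so there is only one linearly independent eigenvector (the matrix is defective).
print(np.linalg.matrix_rank(A - np.eye(2)))      # 1  -> eigenspace dimension is 2 - 1 = 1
```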
 

FAQ: Real Eigenvalues and 3 Orthogonal Eigenvectors for Matrix (c,d)

1. What are real eigenvalues and orthogonal eigenvectors?

Real eigenvalues and orthogonal eigenvectors are mathematical concepts used to describe the behavior of a square matrix. A real eigenvalue of a matrix A is a real number λ for which there is a nonzero vector v with Av = λv; that vector v is an eigenvector. Orthogonal eigenvectors are eigenvectors that are at right angles to one another.

2. Why are real eigenvalues and orthogonal eigenvectors important?

Real eigenvalues and orthogonal eigenvectors provide valuable information about the properties of a matrix. They can be used to calculate the stability of a system, identify special matrices such as symmetric or diagonal matrices, and simplify complex calculations involving matrices.

3. How are real eigenvalues and orthogonal eigenvectors calculated?

The eigenvalues are found by solving the characteristic equation det(A - λI) = 0, where I is the identity matrix and λ is the unknown. Each eigenvalue is then substituted back into (A - λI)v = 0, and that system is solved, for example by row reduction, to obtain the corresponding eigenvectors.
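As a brief illustration of this process (a sketch using SymPy; the small 2x2 matrix is just an example, not the one from the thread):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])                       # a small symmetric example

# Characteristic equation: det(A - lambda*I) = 0
char_eq = sp.Eq((A - lam * sp.eye(2)).det(), 0)
eigenvalues = sp.solve(char_eq, lam)          # [1, 3]

# Each eigenvector solves (A - lambda*I) v = 0
for val, multiplicity, vectors in A.eigenvects():
    print(val, multiplicity, vectors)         # eigenvectors proportional to (1, -1) and (1, 1)
```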

4. Can a matrix have more than 3 orthogonal eigenvectors?

An n×n matrix can have at most n mutually orthogonal eigenvectors, and it has a full set of n exactly when it is symmetric (or, more generally, normal). So a 3x3 matrix can have at most 3 orthogonal eigenvectors; only a larger square matrix can have more.

5. What is the significance of orthogonal eigenvectors being at right angles to each other?

The fact that orthogonal eigenvectors are at right angles to each other is important because it means they are linearly independent. This makes them useful for simplifying calculations involving matrices, since they can serve as a basis for the space on which the matrix acts. Additionally, orthogonal eigenvectors can represent independent directions or modes in a physical system, making them useful for analyzing and understanding complex systems.
