Symmetric matrix with eigenvalues

In summary, we are given an orthonormal basis {u_1, u_2, ..., u_n} for R^n and a linear combination A of the rank 1 matrices u_1u_1^T, u_2u_2^T, ..., u_nu_n^T. We are asked to show that A is a symmetric matrix with eigenvalues c_1, c_2, ..., c_n and that u_i is an eigenvector belonging to c_i for each i. To do this, we first prove that A is symmetric by showing that A = A^T. Then, using the orthonormality of the basis, we can rewrite Au_i as c_iu_i. This shows that Au_i is a multiple of u_i, making u_i an eigenvector of A with eigenvalue c_i.
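As a quick numerical sanity check of this claim, here is a short sketch assuming NumPy is available (the size n = 4, the random basis, and the coefficients are purely illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    # The columns of Q form an orthonormal basis of R^n (QR factorization of a random matrix).
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    c = rng.standard_normal(n)   # the coefficients c_1, ..., c_n

    # A = c_1 u_1 u_1^T + ... + c_n u_n u_n^T
    A = sum(c[i] * np.outer(Q[:, i], Q[:, i]) for i in range(n))

    print(np.allclose(A, A.T))                                              # True: A is symmetric
    print(all(np.allclose(A @ Q[:, i], c[i] * Q[:, i]) for i in range(n)))  # True: A u_i = c_i u_i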
  • #1
Ylle

Homework Statement


Let {u_1, u_2, ..., u_n} be an orthonormal basis for R^n and let A be a linear combination of the rank 1 matrices u_1u_1^T, u_2u_2^T, ..., u_nu_n^T. If

A = c_1u_1u_1^T + c_2u_2u_2^T + ... + c_nu_nu_n^T

show that A is a symmetric matrix with eigenvalues c_1, c_2, ..., c_n and that u_i is an eigenvector belonging to c_i for each i.



Homework Equations



No clue...

The Attempt at a Solution



I'm really stuck with this problem. So I'm really just hoping for a little hint or something.
It SOUNDS easy, but as I said, I have no clue where to start.

Hope you can help.


Regards
 
  • #2
Hi Ylle! :smile:
Ylle said:
… show that A is a symmetric matrix with eigenvalues c_1, c_2, ..., c_n and that u_i is an eigenvector belonging to c_i for each i.

(i assume you can prove that A is symmetric)

You just need to prove that Au_i = c_iu_i

(and remember the basis is orthonormal :wink:)
 
  • #3


Hi :)

I'm not totally sure about the first part, but I have this (see the attached picture). I don't know if it is proved there?
For it to be symmetric, it has to satisfy A = A^T, right?

And the next part I can't figure out what you mean :?
If you add u_i to A, I guess you can remove the u_1, ..., u_n and u_1^T, ..., u_n^T because they become the identity matrix for all of them. And then you have c_1, ..., c_n times u_i left. So how do I turn the c_1, ..., c_n into just c_i? :)
 

Attachments

  • math.JPG
  • #4
Ylle said:
I'm not totally sure about the first part, but I have this (see the attached picture). I don't know if it is proved there?
For it to be symmetric, it has to satisfy A = A^T, right?

Hi Ylle! :smile:

Yes, A = A^T.

But your proof doesn't work, because transpose is like inverse … it alters the order of things … so (AB)^T = B^TA^T :wink:
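Applied term by term, the symmetry check should come out something like:

A^T = (c_1u_1u_1^T + ... + c_nu_nu_n^T)^T = c_1(u_1u_1^T)^T + ... + c_n(u_nu_n^T)^T = c_1u_1u_1^T + ... + c_nu_nu_n^T = A,

since (u_ju_j^T)^T = (u_j^T)^T u_j^T = u_ju_j^T for each j.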
And the next part I can't figure out what you mean :?

Just write Au_i in full (using the orthonormality of the u's) …

what do you get? :smile:
 
  • #5


Well, my guess is you have:

A = c_iu_iu_i^T

If you then add u_i on both sides you get:

Au_i = c_iu_iu_i^Tu_i

And because u_i^Tu_i = <u_i, u_i> = ||u_i||^2 = 1 since it's a basis, you have:

Au_i = c_iu_i


Don't know if that is correct?
 
  • #6
Hi Ylle! :smile:

(btw, please don't say "add" :eek: … say "multiply", or if you prefer a more neutral word, "apply" :wink:)
Ylle said:
… And because u_i^Tu_i = <u_i, u_i> = ||u_i||^2 = 1 since it's a basis, you have:

Au_i = c_iu_i

Don't know if that is correct?

Yup, that's fine …

that's what "orthonormal" is all about! :biggrin:

but … you still have to deal with all the u's in the ∑ that aren't that particular u_i

use the orthonormal property again. :smile:
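(Spelled out, the full computation should look something like:

Au_i = (c_1u_1u_1^T + ... + c_nu_nu_n^T)u_i = c_1u_1(u_1^Tu_i) + ... + c_nu_n(u_n^Tu_i) = c_iu_i,

since orthonormality gives u_j^Tu_i = 0 for every j ≠ i and u_i^Tu_i = 1, so only the i-th term survives.)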
 

FAQ: Symmetric matrix with eigenvalues

What is a symmetric matrix?

A symmetric matrix is a square matrix where the elements are symmetric with respect to the main diagonal. This means that the element at position (i,j) is equal to the element at position (j,i). In other words, the matrix is unchanged when reflected along the main diagonal.
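For example, the 2×2 matrix with rows (1, 2) and (2, 5) is symmetric, because the entries at positions (1,2) and (2,1) are both 2.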

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important concepts in linear algebra. An eigenvector of a matrix is a nonzero vector that does not change direction when it is multiplied by the matrix; it is only scaled. The corresponding eigenvalue is that scaling factor: if Av = λv for a nonzero vector v, then v is an eigenvector of A with eigenvalue λ.
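For example, the symmetric matrix with rows (2, 1) and (1, 2) has eigenvalues 3 and 1: multiplying it by (1, 1)^T gives (3, 3)^T = 3(1, 1)^T, and multiplying it by (1, -1)^T gives (1, -1)^T, so these vectors are eigenvectors for the eigenvalues 3 and 1 respectively.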

Why is it important to have a symmetric matrix with eigenvalues?

Symmetric matrices have several important properties that make them useful in many applications. A real symmetric matrix can always be diagonalized, which simplifies many calculations. In addition, its eigenvalues are real and its eigenvectors can be chosen to be orthogonal, properties that are exploited in fields such as physics, engineering, and machine learning.

How do you find the eigenvalues of a symmetric matrix?

The eigenvalues of a symmetric matrix can be found by solving the characteristic equation det(A - λI) = 0, where A is the symmetric matrix and λ is the eigenvalue. For small matrices this can be done by hand (for a 2×2 matrix the characteristic equation is a quadratic in λ). For larger matrices the eigenvalues are usually computed with specialized numerical algorithms or software.
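As a minimal sketch of the software route, assuming NumPy is available, numpy.linalg.eigh is intended for symmetric (or Hermitian) matrices and returns the real eigenvalues in ascending order together with orthonormal eigenvectors:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                 # a small symmetric matrix

    eigenvalues, eigenvectors = np.linalg.eigh(A)  # eigh assumes the input is symmetric/Hermitian
    print(eigenvalues)     # [1. 3.]  (ascending order)
    print(eigenvectors)    # columns are the corresponding orthonormal eigenvectors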

Can a symmetric matrix have complex eigenvalues?

No, a real symmetric matrix always has real eigenvalues. By the spectral theorem, a real symmetric matrix can be diagonalized by an orthogonal matrix, so all roots of its characteristic equation are real, and its eigenvectors can be chosen to be mutually orthogonal. A real symmetric matrix therefore cannot have complex (non-real) eigenvalues.
