Finding a Matrix P for Non-symmetric Diagonalization: To Normalize or Not?

In summary: If I have a matrix P and I want to find its inverse, is there a way to do this without having to calculate it?
  • #1
seanc12

Homework Statement



Find a matrix P such that [itex]P^{-1}AP[/itex] is diagonal and evaluate [itex]P^{-1}AP[/itex].

[tex]A=\begin{pmatrix}2 & 5\\ 2 & 3\end{pmatrix}[/tex]

The Attempt at a Solution


First off, I found the eigenvalues, which turned out to be:

[itex]\lambda = \frac{5 \pm \sqrt{41}}{2}[/itex]

This gave me the two eigenvectors:

[tex]\begin{pmatrix}\frac{10}{1+\sqrt{41}}\\ 1\end{pmatrix}~~\text{and}~~\begin{pmatrix}\frac{10}{1-\sqrt{41}}\\ 1\end{pmatrix}[/tex]

Now this is where I get a little bit stuck. Am I supposed to normalise the two eigenvectors before I use them as P, or can I just use those two vectors as my P, then go ahead and find the inverse and start grinding my way through it?

I have tried both ways but can't quite get the answer out. It is most likely an error in my working (there are so many surds!), but I wanted to check the correct approach before I re-work the problem. Also, if you could give me any tips for evaluating it without so much work (and so many chances for error), that would be great.

Also, I know that for a symmetric matrix the eigenvalues should appear on the diagonal, but is this still the case for a non-symmetric matrix?
 
  • #2
Hi seanc12!:smile:

seanc12 said:

Homework Statement



Find a matrix P such that [itex]P^{-1}AP[/itex] is diagonal and evaluate [itex]P^{-1}AP[/itex].

[tex]A=\begin{pmatrix}2 & 5\\ 2 & 3\end{pmatrix}[/tex]

The Attempt at a Solution


First off, I found the eigenvalues, which turned out to be:

[itex]\lambda = \frac{5 \pm \sqrt{41}}{2}[/itex]

This gave me the two eigenvectors:

[tex]\begin{pmatrix}\frac{10}{1+\sqrt{41}}\\ 1\end{pmatrix}~~\text{and}~~\begin{pmatrix}\frac{10}{1-\sqrt{41}}\\ 1\end{pmatrix}[/tex]

I'll assume these calculations are correct. You want me to check them?

Now this is where I get a little bit stuck. Am I supposed to normalise the two eigenvectors before I use them as P, or can I just use those two vectors as my P, then go ahead and find the inverse and start grinding my way through it?

No, you don't need to normalize them. But you might want to multiply each eigenvector by its denominator, [itex]1\pm\sqrt{41}[/itex], to clear the fractions. That is, work with the eigenvectors

[tex](10,1+\sqrt{41})~~\text{and}~~(10,1-\sqrt{41})[/tex]

I have tried both ways but can't quite get the answer out. It is most likely an error in my working (there are so many surds!), but I wanted to check the correct approach before I re-work the problem. Also, if you could give me any tips for evaluating it without so much work (and so many chances for error), that would be great.

Also, I know that for a symmetric matrix the eigenvalues should appear on the diagonal, but is this still the case for a non-symmetric matrix?

Yes, the eigenvalues will still appear on the diagonal. So you already know what your diagonal matrix looks like!
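For anyone who wants a quick numerical sanity check of this, here is a short NumPy sketch (added for illustration, not part of the original working); P is built from the scaled eigenvectors suggested above:

[code=python]
import numpy as np

A = np.array([[2.0, 5.0],
              [2.0, 3.0]])

r = np.sqrt(41)

# Columns of P are the scaled eigenvectors (10, 1 + sqrt(41)) and (10, 1 - sqrt(41)).
P = np.array([[10.0,  10.0],
              [1 + r, 1 - r]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))            # diagonal, with the eigenvalues on the diagonal
print((5 + r) / 2, (5 - r) / 2)   # (5 +- sqrt(41))/2, for comparison
[/code]

The off-diagonal entries come out as zero (up to rounding), and the diagonal entries match the eigenvalues in the same order as the columns of P.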
 
  • #3
Hi micromass.
Thank you for the help! I got the answer out :D. Getting rid of that fraction made my life 500 times easier. Thanks again.

I have a couple of general questions; if anyone could answer them, that would be great.

What is the difference between diagonalizing non-symmetric and symmetric matrices? It seems like we only do symmetric ones for most of the examples (this was a bonus question).

My other question is: sometimes my lecturer found the two eigenvectors and then normalised them. What's the purpose of this? It seems like the only difference is that the transpose equals the inverse, so the transpose can be used. What is the point of all this extra work?

Thanks again.
 
  • #4
seanc12 said:
Hi micromass.
Thank you for the help! I got the answer out :D. Getting rid of that fraction made my life 500 times easier. Thanks again.

I have a couple of general questions; if anyone could answer them, that would be great.

What is the difference between diagonalizing non-symmetric and symmetric matrices? It seems like we only do symmetric ones for most of the examples (this was a bonus question).

Nothing much. Diagonalizing a symmetric matrix works the same way as diagonalizing a nonsymmetric one. However, the symmetric case is simpler because the eigenvectors can be chosen orthonormal, so the matrix of eigenvectors is orthogonal and we can use its transpose instead of its inverse. No nasty inverse to calculate.

My other question is: sometimes my lecturer found the two eigenvectors and then normalised them. What's the purpose of this? It seems like the only difference is that the transpose equals the inverse, so the transpose can be used. What is the point of all this extra work?

The normalization is exactly what makes that work. For a symmetric matrix the eigenvectors can be chosen orthogonal, and once they are also normalized, the matrix P of eigenvectors satisfies [itex]P^{-1}=P^T[/itex]. If you skip the normalization, [itex]P^TAP[/itex] is still diagonal, but its entries are the eigenvalues scaled by the squared lengths of the eigenvectors, and [itex]P^T[/itex] is no longer the inverse. So that extra work is what buys you the free inverse. :smile:
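Here is a small NumPy sketch of that point, using an arbitrarily chosen symmetric matrix (an illustration, not from the thread):

[code=python]
import numpy as np

# An arbitrary symmetric example, just for illustration.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns orthonormal eigenvectors (as columns) for a symmetric matrix.
evals, Q = np.linalg.eigh(S)

print(np.round(Q.T @ Q, 10))        # identity, so Q^T = Q^{-1}
print(np.round(Q.T @ S @ Q, 10))    # diag(1, 3): the eigenvalues

# Orthogonal but *unnormalized* eigenvectors: the product is still diagonal,
# but the entries pick up the squared lengths and the transpose is no longer the inverse.
P = Q * np.array([2.0, 5.0])        # rescale the columns
print(np.round(P.T @ P, 10))        # diag(4, 25), not the identity
print(np.round(P.T @ S @ P, 10))    # diag(4*1, 25*3)
[/code]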
 
  • #5
Ah ok that makes sense.

One last thing (I want to get these diagonal matrices down; my lecturer seems to be in love with them): when I get the eigenvectors out, do they need to be orthogonal? Because the ones I used in that question don't seem to be.
 
  • #6
That's another fun thing with symmetric matrices: you can always choose the eigenvectors to be orthogonal, and I would recommend that you always choose them orthogonal.

With arbitrary matrices, this is no longer possible: in general the eigenvectors cannot be chosen orthogonal. So what you saw in your problem is quite normal.
 
  • #7
OK, I see now. So why is it better to choose orthogonal eigenvectors with symmetric matrices (with respect to diagonalisation)?
 
  • #8
It is not really a matter of "symmetric" matrices. An n by n matrix is diagonalizable if and only if it has n independent eigenvectors (it can be shown that this is true of all symmetric matrices but is also true of some other matrices) and then they can be chosen to be orthonormal.
 
  • #9
HallsofIvy said:
It is not really a matter of "symmetric" matrices. An n by n matrix is diagonalizable if and only if it has n independent eigenvectors (it can be shown that this is true of all symmetric matrices but is also true of some other matrices) and then they can be chosen to be orthonormal.

Was what I said wrong? Surely not every diagonalizable matrix has an orthonormal basis of eigenvectors? Or am I missing something?
 
  • #10
Yes, it is true that every diagonalizable matrix has an orthonormal basis. More correctly: if a matrix is diagonalizable, there exists a basis for the vector space consisting of orthonormal eigenvectors of that matrix.

Eigenvectors corresponding to different eigenvalues are necessarily orthogonal, while eigenvectors corresponding to the same eigenvalue form a subspace. We can construct an orthonormal basis for that subspace.
 
  • #11
HallsofIvy said:
Eigenvectors corresponding to different eigenvalues are necessarily orthogonal
The diagonalization formula tells us exactly how to construct a 2x2 matrix with any given pair of distinct real numbers as eigenvalues and any given pair of linearly independent vectors as eigenvectors.

You are thinking of symmetric matrices.
 
  • #12
HallsofIvy said:
Yes, it is true that every diagonalizable matrix has an orthonormal basis. More correctly: if a matrix is diagonalizable, there exists a basis for the vector space consisting of orthonormal eigenvectors of that matrix.

Eigenvectors corresponding to different eigenvalues are necessarily orthogonal, while eigenvectors corresponding to the same eigenvalue form a subspace. We can construct an orthonormal basis for that subspace.

Consider the matrix

[tex]\left(\begin{array}{ccc}1 & 2 & 0\\ 0 & 3 & 0\\ 2 & -4 & 2\\ \end{array}\right)[/tex]

It has eigenvalues 1, 2 and 3 and the eigenspaces are generated by (-1,0,2), (0,0,1) and (-1,-1,2). But as you can see, they are not orthogonal!
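That can be checked numerically; here is a short NumPy verification (added for illustration, not part of the original post):

[code=python]
import numpy as np

# The 3x3 counterexample from the post above.
B = np.array([[1.0,  2.0, 0.0],
              [0.0,  3.0, 0.0],
              [2.0, -4.0, 2.0]])

v1 = np.array([-1.0,  0.0, 2.0])   # eigenvector for eigenvalue 1
v2 = np.array([ 0.0,  0.0, 1.0])   # eigenvector for eigenvalue 2
v3 = np.array([-1.0, -1.0, 2.0])   # eigenvector for eigenvalue 3

print(B @ v1, 1 * v1)   # equal, so v1 is an eigenvector for eigenvalue 1
print(B @ v3, 3 * v3)   # equal, so v3 is an eigenvector for eigenvalue 3
print(np.dot(v1, v3))   # 5.0, not zero: these eigenvectors are not orthogonal
[/code]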
 
  • #13
Hi guys, thanks for all the feedback. It still seems to be a little bit of a grey area, so if anyone can clarify all this, that would be great!
 
  • #14
Clarify what? I see no "gray area". An n by n matrix is diagonalizable if and only if it has n independent eigenvectors. You can always choose those vectors to be orthonormal so that the matrices P and [itex]P^{-1}[/itex] are orthogonal, but this is not necessary. It is "better" sometimes to choose P orthogonal because then [itex]P^{-1}[/itex] is easier to calculate.
 
  • #15
So, when is it that [itex]P^TAP[/itex] gives a diagonal matrix? Is it only when the eigenvectors are orthonormal?
 
  • #16
HallsofIvy said:
You can always choose those vectors to be orthonormal so that the matrices P and [itex]P^{-1}[/itex] are orthogonal, but this is not necessary.

But I gave a counterexample of that above! :frown:
 
  • #17
HallsofIvy said:
You can always choose those vectors to be orthonormal so that the matrices P and [itex]P^{-1}[/itex] are orthogonal, but this is not necessary.

micromass said:
But I gave a counterexample of that above! :frown:

Couldn't you use the Gram-Schmidt process to orthonormalise those eigenvectors?
 
  • #18
seanc12 said:
Couldn't you use the Gram-Schmidt process to orthonormalise those eigenvectors?
Sure, but the resulting vectors (typically) won't be eigenvectors.
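To see this concretely, here is a NumPy sketch using the 3x3 counterexample above (an illustration, not from the thread); np.linalg.qr plays the role of Gram-Schmidt here, orthonormalizing the columns:

[code=python]
import numpy as np

B = np.array([[1.0,  2.0, 0.0],
              [0.0,  3.0, 0.0],
              [2.0, -4.0, 2.0]])

# Columns are the eigenvectors for eigenvalues 1, 2 and 3.
V = np.column_stack([[-1.0,  0.0, 2.0],
                     [ 0.0,  0.0, 1.0],
                     [-1.0, -1.0, 2.0]])

print(np.round(np.linalg.inv(V) @ B @ V, 10))  # diag(1, 2, 3)

Q, _ = np.linalg.qr(V)            # orthonormalize the columns of V
print(np.round(Q.T @ Q, 10))      # identity: the columns are orthonormal
print(np.round(Q.T @ B @ Q, 10))  # NOT diagonal: the new columns are not eigenvectors
[/code]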
 

FAQ: Finding a Matrix P for Non-symmetric Diagonalization: To Normalize or Not?

What is non-symmetric diagonalization?

Non-symmetric diagonalization is the process of transforming a non-symmetric matrix A into a diagonal matrix, one whose off-diagonal entries are all zero. This is achieved by finding an invertible matrix P such that [itex]P^{-1}AP[/itex] is diagonal.

Why is non-symmetric diagonalization important?

Non-symmetric diagonalization is important in various fields such as physics, engineering, and computer science. It allows for the simplification of complex systems and makes it easier to solve equations and analyze data.

How is non-symmetric diagonalization different from symmetric diagonalization?

The main difference is that symmetric diagonalization applies only to symmetric matrices, whose entries are mirrored across the main diagonal; there the transforming matrix can be chosen orthogonal, so its transpose serves as its inverse. Non-symmetric diagonalization applies to any diagonalizable matrix, but the inverse of P generally has to be computed directly.

What are the applications of non-symmetric diagonalization?

Non-symmetric diagonalization has various applications, including solving systems of differential equations, analyzing quantum systems, and predicting the behavior of complex systems. The closely related symmetric case is also used in data analysis, for example in principal component analysis.

What are the limitations of non-symmetric diagonalization?

One of the main limitations is that diagonalization can be computationally expensive for large matrices. Additionally, not every matrix is diagonalizable: a matrix with fewer independent eigenvectors than its size (a defective matrix) cannot be transformed into a diagonal matrix, and something like the Jordan normal form or a numerical approximation is used instead.
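As a concrete illustration of a matrix that cannot be fully diagonalized (added here as an example, not part of the original FAQ):

[code=python]
import numpy as np

# A classic defective matrix: the eigenvalue 1 is repeated, but its eigenspace
# is only one-dimensional, so no invertible P makes P^{-1} J P diagonal.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.eigvals(J))                   # [1., 1.]
print(np.linalg.matrix_rank(J - np.eye(2)))   # 1, so the eigenspace of 1 has dimension 2 - 1 = 1
[/code]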
