Optimizing Eigenvalues of Matrices: A Creative Approach

In summary, the homework thread derives the characteristic equation for the eigenvalues of a matrix via det(A - bI) = 0, then asks whether there is a simpler or more creative method. The replies point to similar matrices and to the fact that every square matrix is unitarily equivalent to an upper triangular one.
  • #1
elphin

Homework Statement

(See the attached image 2a.jpg.)

Homework Equations


The Attempt at a Solution

The usual method, i.e. det(A - bI) = 0:

I get the characteristic equation as [itex]b^3 - 75b^2 + 1850b - 15576 = 0[/itex]

from which I get [itex]b_1^2 + b_2^2 + b_3^2 = 1925 < 1949[/itex]

Is there an easier/more creative method?
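(As a numerical sketch of the "usual method" above: the actual matrix is only in the attached image, so the matrix below is a hypothetical stand-in chosen purely to illustrate how det(A - bI) = 0 is solved with NumPy.)

```python
import numpy as np

# Hypothetical 3x3 matrix -- NOT the one from the attachment 2a.jpg;
# it only illustrates the det(A - bI) = 0 workflow.
A = np.array([[24.0, 1.0, 2.0],
              [1.0, 25.0, 1.0],
              [2.0, 1.0, 26.0]])

# np.poly(A) gives the coefficients of the characteristic polynomial
# det(bI - A) = b^3 + c2*b^2 + c1*b + c0 (leading coefficient 1).
coeffs = np.poly(A)

# Its roots are exactly the eigenvalues of A.
eigs = np.sort(np.linalg.eigvals(A).real)
roots = np.sort(np.roots(coeffs).real)
assert np.allclose(eigs, roots)
```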

 

Attachments

  • 2a.jpg (28.7 KB)
  • #3
Another way to look at this: Are you familiar with the fact that every complex square matrix is unitarily equivalent to an upper triangular matrix and the properties that are invariant under such an equivalence?
 
  • #4
stringy said:
Another way to look at this: Are you familiar with the fact that every complex square matrix is unitarily equivalent to an upper triangular matrix and the properties that are invariant under such an equivalence?

no idea... never heard of it before at all! :smile:
 
  • #5
Hmmm, perhaps HallsofIvy has something else in mind that is simpler than my plan of attack. I'm not sure how to solve this problem by just looking at the spectral radius though, so I can't help with that.

I'll give you a quick rundown on what I was getting at, and I'll provide a few references...

Are you familiar with the notion of "similar matrices?" If not, the general idea is that two square matrices A,B are similar when there exists an invertible P such that

[tex] A = P^{-1}BP.[/tex]

It can be shown that similar matrices share a lot of properties and we say that these properties are invariant under similarity. This definition may look strange, but if you study the matrix representations of abstract linear transformations with respect to different bases, it turns out that those matrices will be related via this "similarity" equation [itex] A=P^{-1}BP.[/itex] The matrix P is called the "change-of-basis" matrix.
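(A quick numerical sketch of this invariance, using random matrices rather than the problem's matrix: conjugating by an invertible P preserves trace, determinant, and eigenvalues.)

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))  # a random P is almost surely invertible

# A = P^{-1} B P is similar to B
A = np.linalg.inv(P) @ B @ P

# Similar matrices share trace, determinant, and eigenvalues.
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))
```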

Now, two matrices A,B are unitarily equivalent when there exists a unitary matrix U such that

[tex] A = U^* BU.[/tex]

A unitary matrix [itex]U[/itex] is a matrix whose conjugate transpose [itex]U^*[/itex] is equal to its inverse. So, then, by definition unitarily equivalent matrices ARE similar, but the converse is in general false. So since unitary similarity is a "stronger" condition on matrices in a certain sense, we expect that there will be additional properties that are invariant under unitary similarity in addition to the ones that are invariant under "standard" similarity.

Here's an example. The trace of a matrix is invariant under similarity. But under unitary similarity, not only is the trace invariant, but (1) the sum of the squares of the absolute values of ALL the matrix entries is invariant (in other words, the Frobenius norm is invariant: http://en.wikipedia.org/wiki/Frobenius_norm#Frobenius_norm). Therefore, we use the fact that ANY matrix is unitarily equivalent to an upper triangular matrix (and where are the eigenvalues on an upper triangular matrix? :smile:) and property (1) to find the bound [itex]\sqrt{1949}.[/itex]

If you want to learn why every matrix is unitarily equivalent to an upper triangular matrix (Schur's theorem), you can learn it from where I learned it (http://books.google.com/books?id=Pl...CDYQ6AEwBA#v=onepage&q=theorem schur&f=false), on page 79. A proof of property (1) is back on page 73. :biggrin:
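(Both facts can be sketched numerically with SciPy's Schur decomposition, assuming scipy is available; the random matrix below is illustrative, not the problem's matrix.)

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Schur: A = U T U*, with U unitary and T upper triangular (complex form).
T, U = schur(A, output='complex')
assert np.allclose(A, U @ T @ U.conj().T)

# The eigenvalues sit on the diagonal of T...
assert np.allclose(np.sort_complex(np.diag(T)),
                   np.sort_complex(np.linalg.eigvals(A)))

# ...and the Frobenius norm is invariant under unitary equivalence,
# so the sum of |eigenvalue|^2 is bounded by ||A||_F^2.
fro_A = np.linalg.norm(A, 'fro')
fro_T = np.linalg.norm(T, 'fro')
assert np.isclose(fro_A, fro_T)
assert np.sum(np.abs(np.diag(T)) ** 2) <= fro_A ** 2 + 1e-9
```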
 
  • #6
Nice problem! :smile:

I've been puzzling on my own what you can say about this matrix.

Note that the entries on the main diagonal are large, while the others are small.
This means the eigenvalues will be close to the values on the main diagonal (by the Gershgorin circle theorem, each eigenvalue lies in a disc centered at a diagonal entry whose radius is the sum of the absolute off-diagonal entries in that row).

Furthermore, the matrix is *almost* symmetric.
A symmetric matrix is unitarily equivalent to a diagonal matrix that has exactly the eigenvalues on its main diagonal and zeroes elsewhere (whereas a general matrix is only guaranteed an upper triangular form, with its eigenvalues on the main diagonal).

If the matrix were symmetric, its Frobenius norm would be exactly the square root of the sum of the squared eigenvalues.
As it is, the Frobenius norm is just a little higher.
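(A sketch of that identity for a symmetric matrix; the matrix is random, for illustration only.)

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
S = (M + M.T) / 2  # symmetrize

# For a symmetric matrix, the Frobenius norm equals
# the square root of the sum of the squared eigenvalues.
vals = np.linalg.eigvalsh(S)
assert np.isclose(np.linalg.norm(S, 'fro'), np.sqrt(np.sum(vals ** 2)))
```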

The determinant of the matrix is 15000, which is also the product of the eigenvalues.
Any eigenvalue that is a rational number has to be a whole number dividing 15000 (by the rational root theorem, since the characteristic polynomial is monic with integer coefficients).
Since we already know the eigenvalues have to be around 21, 26, and 28, the closest divisors to check are 20, 25, and 30.

Furthermore the sum of the eigenvalues has to be the trace of the matrix, which is 75.
Lo and behold: 20 + 25 + 30 = 75. We have a match! :cool:

So we can quickly find the eigenvalues and calculate the exact value for the root requested.
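(The candidate eigenvalues can be checked against the trace, determinant, and characteristic polynomial in a few lines:)

```python
import math

eigs = [20, 25, 30]

assert sum(eigs) == 75            # matches the trace
assert math.prod(eigs) == 15000   # matches the determinant
# (b-20)(b-25)(b-30) = b^3 - 75 b^2 + 1850 b - 15000
assert 20 * 25 + 20 * 30 + 25 * 30 == 1850

sum_sq = sum(e * e for e in eigs)
assert sum_sq == 1925             # the exact value under the requested root
root = math.sqrt(sum_sq)
```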
 
  • #7
You have the equation for the eigenvalues: [itex]b^3 - 75b^2 + 1850b - 15576 = 0[/itex]. If there are three real roots [itex]\lambda_1, \lambda_2, \lambda_3[/itex], the equation can be written in the form [itex](b-\lambda_1)(b-\lambda_2)(b-\lambda_3)=0[/itex]. Do the multiplications and compare the coefficients of each power of b with the ones in the original equation.
You get [itex]\lambda_1^2+\lambda_2^2+\lambda_3^2[/itex] without calculating the λ-s.

ehild
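(A quick check of the identity ehild describes, not part of the original post: comparing coefficients amounts to [itex]\lambda_1^2+\lambda_2^2+\lambda_3^2 = e_1^2 - 2e_2[/itex], where [itex]e_1, e_2[/itex] are the elementary symmetric sums read off the polynomial. Note the constant term never enters.)

```python
# From b^3 - 75 b^2 + 1850 b + c = 0 (any constant c):
#   sum(lambda_i)          = 75    (elementary symmetric sum e1)
#   sum(lambda_i*lambda_j) = 1850  (elementary symmetric sum e2)
e1, e2 = 75, 1850

# Newton's identity: sum(lambda_i^2) = e1^2 - 2*e2 -- c plays no role.
sum_sq = e1 ** 2 - 2 * e2
assert sum_sq == 1925
```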
 
  • #8
One small correction, the equation should be: [itex]b^3 - 75b^2 + 1850b - 15000 = 0[/itex]. :wink:
 
  • #9
ehild said:
You get [itex]\lambda_1^2+\lambda_2^2+\lambda_3^2[/itex] without calculating the λ-s.

ehild

Nice! :smile:
I just got what you were hinting at.
It doesn't even matter that the last coefficient is wrong!
 
  • #10
I like Serena said:
Nice! :smile:
I just got what you were hinting at.
It doesn't even matter that the last coefficient is wrong!

You were very fast!
I did not check the equation :shy: Luckily, the last term does not matter.

ehild
 

FAQ: Optimizing Eigenvalues of Matrices: A Creative Approach

What are matrices and eigenvalues?

Matrices are rectangular arrays of numbers or symbols used to represent and manipulate data in various mathematical operations. Eigenvalues are a special set of numbers associated with a square matrix; each one is the factor by which the transformation scales its corresponding eigenvector.

What is the importance of eigenvalues in matrices?

Eigenvalues are crucial in understanding the behavior of a linear transformation. They help in determining the stability of a system, finding the principal components of a dataset, and solving differential equations, among other applications.

How do you find eigenvalues of a matrix?

To find the eigenvalues of a matrix, you need to solve the characteristic equation of the matrix, which is obtained by setting the determinant of the matrix minus a scalar multiple of the identity matrix equal to zero. The resulting values are the eigenvalues of the matrix.
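For instance, with NumPy (a small illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic equation: det(A - bI) = (2-b)^2 - 1 = 0  =>  b = 1 or 3
vals = np.sort(np.linalg.eigvals(A).real)
assert np.allclose(vals, [1.0, 3.0])
```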

Can a matrix have more than one eigenvalue?

Yes, a matrix can have multiple eigenvalues. Over the complex numbers, an n×n matrix has exactly n eigenvalues when counted with algebraic multiplicity, though some of them may coincide. A repeated eigenvalue can have anywhere from one up to its multiplicity of linearly independent eigenvectors.

What is the significance of the eigenvectors associated with eigenvalues?

Eigenvectors are the corresponding vectors to eigenvalues, and they represent the direction of the linear transformation associated with the eigenvalue. They are essential in understanding the behavior of a linear transformation and can be used to decompose a matrix into its principal components.
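Continuing with a small illustrative matrix, each eigenvector v returned by an eigensolver satisfies Av = λv:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)
# Column i of vecs is an eigenvector for vals[i]: A v = lambda v
for i in range(len(vals)):
    assert np.allclose(A @ vecs[:, i], vals[i] * vecs[:, i])
```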
