Eigenvalues of a matrix B = f(A) given eigenvalues of A

In summary: for an nxn matrix A with n distinct eigenvalues r1, ..., rn and any analytic function f, we can write f(A) = sum f(ri)*Ei, where the Ei are fixed matrices determined by A alone. Taking f(x) = exp(3x) + 5, each eigenvector vi of A satisfies f(A)*vi = f(ri)*vi; in particular, f(A)|1> = (exp(3) + 5)|1>, and similarly for the other eigenvectors.
  • #1
L-x

Homework Statement


Find the eigenvalues/vectors of A. (I can do this bit :P, A is a 3x3 matrix)
What are the eigenvalues and eigenvectors of the matrix B = exp(3A) + 5I, where I is
the identity matrix?

Homework Equations





The Attempt at a Solution



I have (correctly) found that A has eigenvectors and corresponding eigenvalues such that
A|1>=|1>, A|2>=2|2>, A|4>=4|4>.
|1>=(1,1,1), |2>=(1,-1,0), |4>=(1,1,-2), though I don't actually think you need this.

As B can be expanded in a power series in A, and A commutes with the identity (obviously),

[B,A] = 0

Since A has distinct eigenvalues, each eigenspace is one-dimensional and B maps it to itself

=> |1>, |2>, |4> are eigenvectors of B

I am not confident in the reasoning that follows, does it seem correct?

exp(3A)|j> = e^(3j)|j>
5I|j> = 5|j>
for j = 1, 2, 4

=> the eigenvectors of B are |1>, |2>, |4>, with respective eigenvalues (e^3 + 5), (e^6 + 5), (e^12 + 5).

Does this seem correct? Thanks in advance for your help.
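As a quick sanity check (a numerical sketch, assuming numpy/scipy are available; the A below is just one matrix consistent with the eigenpairs above):

```python
import numpy as np
from scipy.linalg import expm

# Reconstruct an A consistent with the eigenpairs above: columns of P are
# the eigenvectors |1>, |2>, |4>; D holds the eigenvalues 1, 2, 4.
P = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 1, -2)]).astype(float)
D = np.diag([1.0, 2.0, 4.0])
A = P @ D @ np.linalg.inv(P)

B = expm(3 * A) + 5 * np.eye(3)

# The eigenvalues of B should be e^3 + 5, e^6 + 5, e^12 + 5.
print(np.sort(np.linalg.eigvals(B).real))
print(np.sort([np.exp(3) + 5, np.exp(6) + 5, np.exp(12) + 5]))
```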
 
  • #3
There is a theorem in matrix analysis that for an nxn matrix A having n distinct eigenvalues (lambda_1, ..., lambda_n), and for any analytic function f(x) = sum_{k=0}^infinity c_k*x^k, the function f(A) =def= c_0*Identity + sum_{k=1}^infinity c_k*A^k has the form
f(A) = sum_{j=1}^{n} E_j*f(lambda_j), where E_1, ..., E_n are nxn matrices that do not depend on the form of f(.). Furthermore, if v1, v2, ..., vn are eigenvectors for the n eigenvalues, we have E_i*v_j = 0 for i =/= j. So, in your case (eigenvalues 1, 2, 4), taking f(x) = x^0 gives Identity = 1^0*E1 + 2^0*E2 + 4^0*E3 = E1+E2+E3, taking f(x) = x gives A = E1 + 2*E2 + 4*E3, and taking f(x) = x^2 gives A^2 = E1 + 4*E2 + 16*E3. So, you can determine E1, E2 and E3. Then, of course, for f(x) = exp(3x) + 5 we have f(A) = E1*(exp(3)+5) + E2*(exp(6)+5) + E3*(exp(12)+5). Thus, f(A)*v1 = (exp(3)+5)*v1, etc. (Note that you do not need to actually find E1, E2 and E3 to answer the question.)
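A sketch of that construction in code (not from the original post; the A below is illustrative, rebuilt from the eigenpairs in post #1, and the linear system is exactly the three equations above):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative A with eigenvalues 1, 2, 4, rebuilt from the eigenpairs in post #1.
P = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 1, -2)]).astype(float)
A = P @ np.diag([1.0, 2.0, 4.0]) @ np.linalg.inv(P)
I = np.eye(3)

# I = E1 + E2 + E3, A = E1 + 2*E2 + 4*E3, A^2 = E1 + 4*E2 + 16*E3:
# a Vandermonde system in the unknown matrices E1, E2, E3.
V = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 4.0],
              [1.0, 4.0, 16.0]])
W = np.linalg.inv(V)
powers = [I, A, A @ A]
E = [sum(W[j, k] * powers[k] for k in range(3)) for j in range(3)]

# f(A) = (e^3+5)*E1 + (e^6+5)*E2 + (e^12+5)*E3 should equal exp(3A) + 5I.
f = lambda x: np.exp(3 * x) + 5
fA = sum(f(r) * Ej for r, Ej in zip((1, 2, 4), E))
print(np.allclose(fA, expm(3 * A) + 5 * I))   # True
```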

RGV
 
  • #4
Ray Vickson said:
There is a theorem in matrix analysis that for an nxn matrix A having n distinct eigenvalues (lambda_1, ..., lambda_n), and for any analytic function f(x) = sum_{k=0}^infinity c_k*x^k, the function f(A) =def= c_0*Identity + sum_{k=1}^infinity c_k*A^k has the form
f(A) = sum_{j=1}^{n} E_j*f(lambda_j), where E_1, ..., E_n are nxn matrices that do not depend on the form of f(.). Furthermore, if v1, v2, ..., vn are eigenvectors for the n eigenvalues, we have E_i*v_j = 0 for i =/= j. So, in your case (eigenvalues 1, 2, 4), taking f(x) = x^0 gives Identity = 1^0*E1 + 2^0*E2 + 4^0*E3 = E1+E2+E3, taking f(x) = x gives A = E1 + 2*E2 + 4*E3, and taking f(x) = x^2 gives A^2 = E1 + 4*E2 + 16*E3. So, you can determine E1, E2 and E3. Then, of course, for f(x) = exp(3x) + 5 we have f(A) = E1*(exp(3)+5) + E2*(exp(6)+5) + E3*(exp(12)+5). Thus, f(A)*v1 = (exp(3)+5)*v1, etc. (Note that you do not need to actually find E1, E2 and E3 to answer the question.)

RGV

This looks interesting, but I'm having quite a lot of trouble following your notation. Do you know the name of the theorem so I could look it up, perhaps?

Thanks for the help.
 
  • #5
L-x said:
This looks interesting, but I'm having quite a lot of trouble following your notation. Do you know the name of the theorem so I could look it up, perhaps?

Thanks for the help.

The old books “The Theory of Matrices” by Gantmacher and “Theory of Matrices” by P. Lancaster (Academic Press, 1969) have these results. I am sure there are numerous more modern treatments, but those two are the ones I have on my bookshelf.

Anyway, below I will give a short proof for the case of a 3x3 matrix A having 3 distinct, non-zero real eigenvalues r1, r2 and r3. The same form of proof goes through for a general nxn matrix with n distinct, real, nonzero eigenvalues. The basic result is the same, but with a somewhat different proof, if some of the eigenvalues are complex or some are zero. Distinctness remains important; without it we need to invoke the Jordan canonical form, and f(A) can involve not only f(r) for eigenvalue r, but also f'(r), f''(r), etc., depending on the multiplicity of r and the form of its Jordan block(s).

So, let ui and vi be the left and right eigenvectors of A for eigenvalue ri, chosen so that ui*vi = 1. Note that for rj different from ri we have ui*vj = 0; see, e.g.,
http://answers.yahoo.com/question/index?qid=20100729070957AAlQuHF . Note that the vi form a basis for the whole space. Now let Ei = vi*ui for all i; Ei is a 3x3 matrix because it is of the form column times row, in that order. The orthogonality results imply that Ei*Ei = Ei and Ei*Ej = 0 if i is different from j. Letting B = sum ri*Ei, we have B*vi = ri*vi for all i, hence (B-A)*vi = 0 for all i, hence (B-A)*w = 0 for all vectors w, because every such w has the form sum ci*vi. Thus B = A, so we have shown that A = sum ri*Ei. Now look at C = E1+E2+E3. We have C*vi = vi for all i, so C*w = w for all vectors w. That is, C = the identity matrix I. Now we are almost done.

Look at any polynomial p(x) = c0 + c1*x + c2*x^2 + … + cm*x^m, and _define_ p(A) = c0*I + c1*A + c2*A^2 + … + cm*A^m. We have A^2 = sum ri^2 * Ei*Ei + sum_{i < j}ri*rj Ei*Ej + sum_{j < i} rj*ri*Ej*Ei = sum ri^2 * Ei + 0 + 0. Similarly, A^3 = sum ri^3*Ei, etc. Finally, we have I = sum Ei = sum ri^0 * Ei (this is where having all ri nonzero comes in!), so we have that p(A) = c0*A^0 + c1*A^1 + … + cm*A^m is of the form p(A) = sum p(ri)*Ei. If f(x) is an analytic function whose radius of convergence includes the largest |eigenvalue|, then the same type of argument goes through, by taking some limits. The result is f(A) = sum f(ri)*Ei.
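Here is a short numerical check of the projector construction above (a sketch, assuming numpy/scipy; the rows of V^{-1} serve as left eigenvectors already normalized so that ui*vi = 1, and A is the same illustrative matrix rebuilt from the thread's eigenpairs):

```python
import numpy as np
from scipy.linalg import expm

P = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 1, -2)]).astype(float)
A = P @ np.diag([1.0, 2.0, 4.0]) @ np.linalg.inv(P)

r, V = np.linalg.eig(A)        # right eigenvectors vi are the columns of V
U = np.linalg.inv(V)           # rows of V^{-1} are left eigenvectors with ui*vi = 1

# Ei = vi*ui (column times row): rank-1 matrices with Ei*Ei = Ei, Ei*Ej = 0.
E = [np.outer(V[:, i], U[i, :]) for i in range(3)]
for i in range(3):
    for j in range(3):
        expected = E[i] if i == j else np.zeros((3, 3))
        assert np.allclose(E[i] @ E[j], expected)

# f(A) = sum f(ri)*Ei for f(x) = exp(3x) + 5.
f = lambda x: np.exp(3 * x) + 5
print(np.allclose(sum(f(ri) * Ei for ri, Ei in zip(r, E)),
                  expm(3 * A) + 5 * np.eye(3)))   # True
```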

This stuff is typically used in solving constant-coefficient coupled ODEs, because if x is a vector and A is a constant matrix, the DE system dx/dt = Ax has solution x(t) = exp(A*t)*x(0), so we need to know how to compute matrix exponentials.
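For instance (a minimal sketch with the same illustrative A; solve_ivp is just an independent cross-check):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

P = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 1, -2)]).astype(float)
A = P @ np.diag([1.0, 2.0, 4.0]) @ np.linalg.inv(P)
x0 = np.array([1.0, 0.0, 0.0])

# Closed form: x(t) = exp(A*t) x(0).
t_end = 1.0
x_exact = expm(A * t_end) @ x0

# Numerical integration of dx/dt = Ax should agree.
sol = solve_ivp(lambda t, x: A @ x, (0.0, t_end), x0, rtol=1e-10, atol=1e-12)
print(np.allclose(sol.y[:, -1], x_exact, rtol=1e-6))   # True
```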

RGV
 
  • #6
How about just changing to the basis given by the eigenvectors (they are linearly independent, since there are 3 distinct eigenvalues)? Then in this basis A is diagonal, so exp(3A) is just the diagonal matrix whose entries are the exponentials of the corresponding diagonal entries of 3A.
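In code, that change of basis might look like this (a sketch, rebuilding A from the eigenpairs in post #1):

```python
import numpy as np
from scipy.linalg import expm

# Columns of P are the eigenvectors; in this basis A is the diagonal matrix D.
P = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 1, -2)]).astype(float)
D = np.diag([1.0, 2.0, 4.0])
A = P @ D @ np.linalg.inv(P)

# exp(3A) = P exp(3D) P^{-1}, where exp(3D) just exponentiates the diagonal.
exp3A = P @ np.diag(np.exp(3 * np.diag(D))) @ np.linalg.inv(P)
B = exp3A + 5 * np.eye(3)

print(np.allclose(B, expm(3 * A) + 5 * np.eye(3)))   # True
print(np.sort(np.linalg.eigvals(B).real))            # ~ e^3+5, e^6+5, e^12+5
```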
 

FAQ: Eigenvalues of a matrix B = f(A) given eigenvalues of A

What is the relationship between the eigenvalues of a matrix B and a function of the eigenvalues of matrix A?

If B = f(A) for a function f defined on the spectrum of A, then the eigenvalues of B are f(lambda_1), ..., f(lambda_n), where lambda_1, ..., lambda_n are the eigenvalues of A. In other words, the function acts directly on the eigenvalues: each eigenvalue of A is mapped to a corresponding eigenvalue of B.

How are the eigenvalues of a matrix B related to the eigenvalues of matrix A?

They are related through the function that defines B in terms of A: each eigenvalue lambda of A becomes the eigenvalue f(lambda) of B. The function can be a simple algebraic expression, such as a polynomial, or something more involved, such as a matrix exponential.

Can the eigenvalues of a matrix B be calculated if the eigenvalues of matrix A are known?

Yes. If the function f that relates B to A is known, the eigenvalues of B can be calculated by simply applying f to each eigenvalue of A, with no further diagonalization required. If the function is unknown, however, the eigenvalues of B cannot be deduced from those of A alone.
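As a concrete illustration (a sketch with an arbitrary 2x2 triangular example, not taken from the thread):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])       # triangular, so its eigenvalues are 2 and 3

# Take f(x) = exp(x) + 1, so B = f(A) = exp(A) + I.
B = expm(A) + np.eye(2)

# Eigenvalues of B match f applied to the eigenvalues of A.
print(np.sort(np.linalg.eigvals(B).real))              # e^2+1, e^3+1
print(np.sort(np.exp(np.linalg.eigvals(A).real) + 1))  # same values
```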

Are there any limitations or restrictions on the function that relates matrix B to A for calculating eigenvalues?

The function must be well-defined on the spectrum of A: for a power series, every eigenvalue must lie within its radius of convergence, and repeated eigenvalues require extra care (via the Jordan canonical form, as noted above). Beyond that, certain functions may produce complex eigenvalues, which may not be practical for certain applications.

How are the eigenvectors of matrix B related to the eigenvectors of matrix A?

When A is diagonalizable, the eigenvectors of B = f(A) are the same as the eigenvectors of A: if A*v = lambda*v, then B*v = f(lambda)*v. The function changes the eigenvalues but leaves the eigenvectors unchanged.
