How to Find the Generalized Eigenvector in a Matrix ODE?

In summary, the conversation discusses finding the generalized eigenvector matrix for a set of ODEs represented in matrix form. The matrix has an eigenvalue with algebraic multiplicity 3 and geometric multiplicity 2. The conversation also addresses a possible error in the journal manuscript, as well as the issue of choosing eigenvectors when eigenvalues are degenerate. The final conclusion is that the sixth column of the eigenvector matrix is not a random vector, but the one that allows a Jordan chain to be calculated.
  • #1
Alwar
TL;DR Summary
To find the generalized eigenvectors of a 6x6 matrix
Hi,

I have a set of ODEs represented in matrix form, as shown in the attached file. The matrix A has an eigenvalue with algebraic multiplicity 3 and geometric multiplicity 2. I am trying to find the generalized eigenvector from ##(A-\lambda I)w = v##, where ##w## is the generalized eigenvector and ##v## is an eigenvector found from ##(A-\lambda I)v = 0##. But I am not able to reproduce the eigenvector matrix ##M## shown in the attached file.

Any help would be appreciated. Thanks,
Alwar
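For reference, the whole computation (ordinary eigenvectors, generalized eigenvectors, and the matrix ##M##) can be sketched with SymPy's exact arithmetic. This is a minimal illustration on a small 3×3 matrix chosen to have the same multiplicity pattern (algebraic 3, geometric 2); it is not the actual 6×6 matrix from the attachment:

```python
import sympy as sp

# Illustrative 3x3 matrix with a single eigenvalue (2) of algebraic
# multiplicity 3 and geometric multiplicity 2 -- NOT the matrix from
# the attached PDF, just the same multiplicity pattern.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 2]])
lam = 2
B = A - lam * sp.eye(3)

eigvecs = B.nullspace()      # ordinary eigenvectors: two of them here
M, J = A.jordan_form()       # M collects eigenvectors + generalized ones
assert A == M * J * M.inv()  # A = M J M^{-1}
```

`jordan_form` does internally what the ##(A-\lambda I)w=v## algorithm does by hand, and its column ordering shows which eigenvector each generalized eigenvector is chained to.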
 

Attachments

  • Problem file.pdf
  • #2
I'm a little confused, it says the eigenvalues are ##\pm k^2## but then the Jordan normal form only has ##\pm k## on the diagonal. Aren't the diagonal elements supposed to be the eigenvalues?
 
  • #3
Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?
 
  • #4
Office_Shredder said:
I'm a little confused, it says the eigenvalues are ##\pm k^2## but then the Jordan normal form only has ##\pm k## on the diagonal. Aren't the diagonal elements supposed to be the eigenvalues?
This is part of a manuscript published in a journal. I think they have written the eigenvalues incorrectly as ##k^2##; when I calculate them, they are just ##\pm k##. I believe the Jordan matrix is correct.

Thanks.
 
  • #5
Haborix said:
Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?
Thanks Haborix. To be specific, I cannot find the first and sixth columns of the eigenvector matrix ##M##. When I substitute both of these vectors back into ##(A-\lambda I)x = 0## for the corresponding eigenvalue, they do satisfy it, but I cannot derive them directly. Or have the authors just taken arbitrary vectors?
 
  • #6
Haborix said:
Would you be a little more specific about the problem you're having? You say you can't find the matrix of eigenvectors. Does that mean you are unable to calculate any particular eigenvector? If I understand your question, we can set aside the generalized eigenvectors for now, correct?
For ##\lambda = -\sqrt{\alpha^2+\beta^2}##, solving the eigenvector equation in MATLAB returns the fourth column of ##M## as the solution. However, if we do it manually, we get four equations, as shown in the attached figure, and the first equation repeats three times, meaning there are infinitely many solutions. The sixth column vector satisfies the system, but is it just an arbitrary vector, or can it be obtained as a solution?
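The "repeated equation" behaviour can be checked directly: when rows of ##(A-\lambda I)## repeat, the system is rank-deficient, the eigenvectors for that ##\lambda## form a two-dimensional space, and any basis of it (MATLAB's choice, or the paper's columns) is equally valid. A toy 3×3 sketch in SymPy (illustrative matrix, not the one from the attachment):

```python
import sympy as sp

# Toy matrix whose (A - lam*I) has rank 1, so two of its three
# equations are redundant (illustrative only).
lam = -1
A = sp.Matrix([[-1,  0,  0],
               [ 0, -1,  0],
               [ 1,  0, -1]])
B = A - lam * sp.eye(3)

print(B.rank())          # rank 1 -> two free variables
basis = B.nullspace()    # any basis of this 2D eigenspace is valid
print(len(basis))        # 2 independent eigenvectors for lam
```

Because the eigenspace is two-dimensional, MATLAB's returned vectors and the paper's columns can look different while spanning exactly the same space.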
 

Attachments

  • IMG_20210730_160242.jpg
  • #7
When you have degenerate eigenvalues you usually pick the eigenvectors so that they are mutually orthogonal. I think there are some typos in your original attachment in post #1. In a few instances, if a ##k## were a ##k^2##, then two of the eigenvectors would be orthogonal. The big-picture message is that there is freedom in choosing the eigenvectors for a given eigenvalue when its multiplicity is greater than 1.
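That freedom can be made concrete: any basis of a degenerate eigenspace can be replaced by an orthogonal one via Gram–Schmidt. A small sketch on an illustrative symmetric matrix (not the matrix from the thread):

```python
import sympy as sp

# Symmetric toy matrix: eigenvalue 2 has multiplicity 2 (illustrative only).
A = sp.Matrix([[3, 1, 1],
               [1, 3, 1],
               [1, 1, 3]])
vecs = (A - 2 * sp.eye(3)).nullspace()   # 2D eigenspace; basis is not unique
print(vecs[0].dot(vecs[1]))              # raw basis is generally not orthogonal
ortho = sp.GramSchmidt(vecs)             # pick a mutually orthogonal basis
assert ortho[0].dot(ortho[1]) == 0       # still eigenvectors, now orthogonal
```

Any linear combination of eigenvectors for the same eigenvalue is again an eigenvector, which is exactly why the orthogonalized vectors remain valid columns of ##M##.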
 
  • #8
Alwar said:
But is it just a random vector? Or can it be attained as a solution.
I solved for the two eigenvectors ##\vec v_1, \vec v_2## for ##\lambda = -k## the usual way, and I found that the system ##(A-\lambda I)\vec x = \vec v_i## was inconsistent for both eigenvectors. But that was okay because any linear combination of the two is still an eigenvector, and there is one that results in a system with a solution, namely the sixth column of ##M##. So it's not a random vector, but the one that allows you to calculate a Jordan chain.

I found the vector by setting up an augmented matrix where the seventh column was a linear combination of the two eigenvectors and row-reduced until I ended up with a row of zeros in the left six columns. Then I solved for coefficients that caused the corresponding value in the seventh column to vanish.

By the way, I noticed a typo: the bottom diagonal element of ##J## should be ##-k##, not ##k##.
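This procedure can be illustrated in SymPy on a toy defective matrix: check which linear combination of the eigenvectors makes ##(A-\lambda I)\vec x = \vec v## consistent, then solve for the generalized eigenvector. A sketch of the idea on a 3×3 example (not the actual matrix from the attachment):

```python
import sympy as sp

# Toy defective matrix: eigenvalue 0 with algebraic multiplicity 3 and
# geometric multiplicity 2 (illustrative only, not the matrix in the PDF).
A = sp.Matrix([[1, -1, 0],
               [1, -1, 0],
               [1, -1, 0]])
lam = 0
B = A - lam * sp.eye(3)
v1, v2 = B.nullspace()        # two ordinary eigenvectors

def consistent(v):
    # B*w = v is solvable iff v lies in the column space of B,
    # i.e. appending v as an extra column does not raise the rank.
    return B.rank() == B.row_join(v).rank()

# Neither eigenvector alone starts a Jordan chain here...
print(consistent(v1), consistent(v2))    # False False
# ...but the right combination does. (Here v1 + v2 happens to work;
# in general, solve for the coefficients c1, c2 as described above.)
v = v1 + v2
w, params = B.gauss_jordan_solve(v)      # general solution for B*w = v
w0 = w.subs({p: 0 for p in params})      # pick the free parameters = 0
assert B * w0 == v                       # w0 is the generalized eigenvector
```

This mirrors the augmented-matrix approach: the consistency check via ranks is equivalent to requiring the entries paired with the zero rows of the row-reduced ##[B \mid v]## to vanish.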
 
  • #9
vela said:
I solved for the two eigenvectors ##\vec v_1, \vec v_2## for ##\lambda = -k## the usual way, and I found that the system ##(A-\lambda I)\vec x = \vec v_i## was inconsistent for both eigenvectors. But that was okay because any linear combination of the two is still an eigenvector, and there is one that results in a system with a solution, namely the sixth column of ##M##. So it's not a random vector, but the one that allows you to calculate a Jordan chain.

I found the vector by setting up an augmented matrix where the seventh column was a linear combination of the two eigenvectors and row-reduced until I ended up with a row of zeros in the left six columns. Then I solved for coefficients that caused the corresponding value in the seventh column to vanish.
Hi Vela,

Thanks for your effort. Could you upload your solution showing how you got the sixth eigenvector column? I hope I will be able to grasp what you have done.
 
  • #10
Sorry, no. I did the calculations using Mathematica and didn't save the notebook.
 
  • #11
Alwar said:
Hi Vela,

Thanks for your effort. Can you upload the solution of how you got the sixth column of eigenvector. I hope I will be able to grasp what you have done.
You can probably find a worked method on MathWorld.
 

FAQ: How to Find the Generalized Eigenvector in a Matrix ODE?

What is a generalized eigenvector?

A generalized eigenvector of a square matrix A associated with an eigenvalue λ is a nonzero vector v satisfying (A - λI)^k v = 0 for some positive integer k. It generalizes the notion of an ordinary eigenvector, which corresponds to the case k = 1.

How is a generalized eigenvector different from a regular eigenvector?

A regular eigenvector satisfies the equation Av = λv, where A is the matrix, v is the eigenvector, and λ is the corresponding eigenvalue. A generalized eigenvector, on the other hand, satisfies (A - λI)^k v = 0 for some positive integer k (at most the size of the matrix), where I is the identity matrix; regular eigenvectors are the special case k = 1.
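As a concrete sketch (a 2×2 Jordan block, chosen purely for illustration), one can verify the defining equation directly:

```python
import sympy as sp

# 2x2 Jordan block with eigenvalue 3: one ordinary eigenvector (1, 0)
# and one generalized eigenvector (0, 1) of rank k = 2.
A = sp.Matrix([[3, 1],
               [0, 3]])
lam = 3
B = A - lam * sp.eye(2)
v = sp.Matrix([0, 1])

print(B * v)       # nonzero, so v is not an ordinary eigenvector
print(B**2 * v)    # zero: (A - lam*I)^2 v = 0, so v is a generalized one
```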

What is the significance of generalized eigenvectors in linear algebra?

Generalized eigenvectors play a crucial role in the theory of linear algebra, particularly in the study of diagonalizable and non-diagonalizable matrices. They allow us to find a basis for the generalized eigenspace, which is the set of all vectors that satisfy the generalized eigenvector equation for a given eigenvalue.

How are generalized eigenvectors used in practical applications?

In practical applications, generalized eigenvectors are used to solve systems of differential equations, compute matrix exponentials, and diagonalize non-diagonalizable matrices. They also have applications in fields such as physics, engineering, and economics.
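For the ODE application specifically, the Jordan chain is what produces the polynomial-times-exponential terms in the solution x(t) = exp(At) x(0). A small sketch, again with an illustrative 2×2 defective matrix:

```python
import sympy as sp

t = sp.symbols('t')
# Defective 2x2 matrix (a Jordan block with eigenvalue 2), illustrative only.
A = sp.Matrix([[2, 1],
               [0, 2]])

# exp(A t) solves x'(t) = A x(t); the off-diagonal entry picks up a
# factor of t because of the Jordan chain.
expAt = (A * t).exp()
# Sanity check: d/dt exp(A t) = A exp(A t)
assert sp.simplify(expAt.diff(t) - A * expAt) == sp.zeros(2, 2)
```

The t·e^{2t} entry of exp(At) is precisely the contribution of the generalized eigenvector; for a diagonalizable matrix every entry would be a pure exponential.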

Can a matrix have both regular and generalized eigenvectors?

Yes, a matrix can have both regular and generalized eigenvectors. In fact, every regular eigenvector is also a generalized eigenvector (the case k = 1), and a non-diagonalizable (defective) matrix has generalized eigenvectors that are not regular eigenvectors; these extra vectors are what complete a basis when the regular eigenvectors alone cannot.
