Eigenvectors 4x4 Matrix in Mathematica

  • #1
DeathbyGreen
Hi,

I'm trying to calculate the eigenvectors of a 4x4 matrix, but I don't want the actual eigenvalues included in the solution; I simply want them listed as a variable. For example, I have the matrix:

[itex]
H_F =
\left[
\begin{array}{cccc}
\hbar\Omega&\hbar v_fk_- &0&0\\
\hbar v_fk_+&\hbar\Omega&\frac{v_fe}{c}A_0 &0\\
0&\frac{v_fe}{c}A_0 &0&\hbar v_fk_- \\
0&0&\hbar v_fk_+ &0\\
\end{array}
\right]
[/itex]

My attempt at a solution was just plugging in [itex]H_F-\epsilon\bf{I}[/itex], with [itex]\bf{I}[/itex] the identity matrix, and then using RowReduce. However, this only gives me the identity matrix, which is not the answer I'm looking for. If I just use [itex]H_F[/itex] and call Eigenvectors[[itex]H_F[/itex]], then I get a huge, essentially useless mess of variables. I would like the output to list the eigenvector as a function of a variable which represents the eigenvalue. Is there any way to do this? The code I was using is (with some variable substitutions for easier entry):

Code:
RowReduce[{{h*w - l, h*v*x, 0, 0}, {h*v*y, h*w - l, m, 0}, {0, m, -l, h*v*x}, {0, 0, h*v*y, -l}}]
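For what it's worth, the reason RowReduce returns the identity here is that for a generic symbolic ##l## the matrix ##H_F - l\,I## is invertible, so its reduced row echelon form is the identity; the null space only becomes nontrivial when ##l## equals an actual eigenvalue. A minimal sketch of this in Python rather than Mathematica (the numeric values hw=0, hvx=hvy=2, m=3 are hypothetical stand-ins for the symbols, chosen so that ##l=4## is an exact eigenvalue):

```python
from fractions import Fraction

def det(mat):
    """Determinant by cofactor expansion along the first row."""
    if len(mat) == 1:
        return mat[0][0]
    return sum((-1) ** j * mat[0][j]
               * det([row[:j] + row[j + 1:] for row in mat[1:]])
               for j in range(len(mat)))

def shifted(hw, hvx, hvy, m, l):
    """H_F - l*I in the shorthand of the RowReduce call above."""
    return [[hw - l, hvx, 0, 0],
            [hvy, hw - l, m, 0],
            [0, m, -l, hvx],
            [0, 0, hvy, -l]]

# Hypothetical stand-ins for the symbols: hw=0, hvx=hvy=2, m=3.
# For these numbers, l = 4 happens to be an exact eigenvalue.
print(det(shifted(0, 2, 2, 3, Fraction(1, 7))) != 0)  # True: generic l, matrix invertible
print(det(shifted(0, 2, 2, 3, 4)) == 0)               # True: l is an eigenvalue
```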
 
  • #2
I guess we can assume that ##H_F## is an isomorphism, at least it looks like one. So one possibility is to simply calculate ##H_F^{-1}##.

As ##0## isn't an eigenvalue, ##H_F\,x = \lambda x## results in ##x_1 \sim x_2\, , \,x_3 \sim x_4## and with this ##x_2 \sim x_4## and ##x_1 \sim x_3##. In total this means ##x_1 \sim x_2 \sim x_3 \sim x_4##. The proportions might be a bit complicated, but not impossible to calculate. The matrix has only ##4## different entries.
 
  • #3
Thank you for the response. Could you explain how calculating the inverse would help?
 
  • #4
DeathbyGreen said:
Thank you for the response. Could you explain how calculating the inverse would help?
Sorry, that was wrong and stupid. I thought inverting the equation would do the trick, but it doesn't. One would need the inverse of ##(H_F-\lambda I)## which is indeed unpleasant considering the polynomials in ##\lambda##. So, sorry for this. But the direct calculation doesn't seem to be too complicated. I wrote it as
$$
\begin{bmatrix}a&b&0&0\\c&a&d&0\\0&d&0&b\\0&0&c&0\end{bmatrix}\cdot \begin{bmatrix}x_1\\x_2\\x_3\\x_4\end{bmatrix} = \begin{bmatrix}\lambda x_1\\\lambda x_2\\\lambda x_3\\\lambda x_4\end{bmatrix}
$$
which was easy to solve, especially if we may assume all variables to be nonzero and that ##a,b,c,d## share common factors.
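To spell out the back-substitution (assuming ##b##, ##d## and ##\lambda## are all nonzero, and normalizing ##x_1 = 1##): rows 1, 2 and 4 give, in turn,
$$
x_2 = \frac{\lambda - a}{b}, \qquad x_3 = \frac{(\lambda - a)\,x_2 - c}{d} = \frac{(\lambda-a)^2 - bc}{bd}, \qquad x_4 = \frac{c\,x_3}{\lambda} = \frac{c\left[(\lambda-a)^2 - bc\right]}{bd\,\lambda},
$$
while row 3, ##d\,x_2 + b\,x_4 = \lambda x_3##, becomes the consistency condition (the characteristic equation) that ##\lambda## must satisfy. This is exactly an eigenvector written as a function of the symbol ##\lambda##, as asked for in the original post.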
 
  • #5
No worries! I appreciate you taking a look at it. Maybe it's best to just solve it by hand using the equation you posted. I was hoping there would be a simple way to plug it into Mathematica.
 
  • #6
DeathbyGreen said:
No worries! I appreciate you taking a look at it. Maybe it's best to just solve it by hand using the equation you posted. I was hoping there would be a simple way to plug it into Mathematica.
I don't know Mathematica (anymore), so maybe you could do it in the notation with ##a,b,c,d##; it's at least shorter. But in this case, playing around with the program will probably take longer than working through the few equations by hand. The last row gets rid of ##x_4## immediately, so there are only three variables left. The same goes for the first row.
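In that spirit, here is a quick numerical sanity check of the back-substitution, in Python rather than Mathematica. The values a=0, b=c=2, d=3 are hypothetical stand-ins for the symbols, chosen so that ##\lambda = 4## is an exact eigenvalue:

```python
from fractions import Fraction

# Hypothetical numeric stand-ins for a, b, c, d from the matrix above.
a, b, c, d = 0, 2, 2, 3
lam = Fraction(4)  # an exact eigenvalue for these numbers

H = [[a, b, 0, 0],
     [c, a, d, 0],
     [0, d, 0, b],
     [0, 0, c, 0]]

# Back-substituted eigenvector with x1 normalized to 1:
x1 = Fraction(1)
x2 = (lam - a) / b * x1                        # from row 1
x3 = ((lam - a) ** 2 - b * c) / (b * d) * x1   # from row 2
x4 = c * x3 / lam                              # from row 4
v = [x1, x2, x3, x4]

# Row 3 is the consistency condition; it holds because lam is an eigenvalue.
Hv = [sum(H[i][j] * v[j] for j in range(4)) for i in range(4)]
print(Hv == [lam * vi for vi in v])  # True
```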
 
  • #7
DeathbyGreen said:
I would like the output to list the eigenvector as a function of a variable which represents the eigenvalue. Is there any way to do this? A code I was using is (with some variable substitutions for easier entry):
Strictly speaking, no, since there is a discrete set of eigenvectors corresponding to a discrete set of eigenvalues. Trying to write this as a function implies a continuous set of eigenvalues. But maybe I have misunderstood your question.
DeathbyGreen said:
If I just use ##H_F## and use Eigenvectors[##H_F##] then I get a huge, essentially useless mess of variables.
This is not surprising since the general element-wise expression for the eigenvectors and eigenvalues of a 4x4 matrix is very large.
 
  • #8
Maybe I didn't explain it well enough. I rewrite the 4x4 matrix in a new basis (formed from the eigenvectors corresponding to degenerate eigenvalues), which reduces it to a 2x2 matrix. I have solved for the eigenvalues of that 2x2 matrix. What I want to do is take those 2x2 eigenvalues and plug them into the 4x4 eigenvector equation to get a two-state solution. I wanted to leave the eigenvalues represented as [itex]\epsilon[/itex] during the solution process to make the algebra easier, because even the 2x2 eigenvalues are pretty nasty.
 
  • #9
DeathbyGreen said:
So I rewrite the 4x4 matrix in a new basis (formed from eigenvectors corresponding to degenerate eigenvalues) which is a 2x2 matrix.
Are you saying you found eigenvectors of a 2x2 matrix and constructed a 4x4 matrix from them?
 
  • #10
No, I solved for the eigenvalues of the 2x2 and want to find the eigenvectors of the 4x4 with them:

1) Set [itex]A_0=0[/itex] in the 4x4 matrix and solve for the eigensystem
2) Take the two degenerate branches (degenerate at a k value of [itex]k=k_0=\frac{\Omega}{2v_F}[/itex], with k in polar coordinates) and rewrite the 4x4 as a 2x2 with [itex]A_0\neq0[/itex]
3) Solve for the eigenvalues of the 2x2
4) Solve for the eigenvectors of the 4x4 matrix again by using the eigenvalues of the 2x2 with [itex]A_0\neq0[/itex]
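As a quick consistency check on the ##k_0## quoted in step 2 (assuming ##k_\pm = k_x \pm i k_y##, so that ##k_+k_- = |k|^2##): with ##A_0 = 0## the matrix is block-diagonal,
$$
\begin{bmatrix}\hbar\Omega & \hbar v_f k_-\\ \hbar v_f k_+ & \hbar\Omega\end{bmatrix}
\oplus
\begin{bmatrix}0 & \hbar v_f k_-\\ \hbar v_f k_+ & 0\end{bmatrix},
$$
with eigenvalues ##\hbar\Omega \pm \hbar v_f|k|## and ##\pm\hbar v_f|k|##. The branches ##\hbar\Omega - \hbar v_f|k|## and ##+\hbar v_f|k|## cross when
$$
\hbar\Omega - \hbar v_f|k| = \hbar v_f|k| \quad\Longrightarrow\quad |k| = \frac{\Omega}{2v_f} = k_0.
$$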
 
  • #11
Alright, but I don't think the eigenvalues will be the same for the cases ##A_{0}=0## and ##A_{0}\neq 0##. We know the eigenvalues can be determined by solving
$$\text{det}(H_{F}-\lambda I)=0$$
This determinant will be a lengthy expression, but it should still depend on the value of ##A_{0}##, so I imagine the eigenvalues will also depend on ##A_{0}##.
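A quick way to see this concretely is to evaluate the determinant at a fixed ##\lambda## for two values of the coupling. A sketch in Python rather than Mathematica, using the shorthand of the RowReduce call in post #1 (where m stands for the ##\frac{v_fe}{c}A_0## coupling; the numbers hw=0, hvx=hvy=2 and l=5 are hypothetical):

```python
from fractions import Fraction

def det(mat):
    """Determinant via cofactor expansion along the first row."""
    if len(mat) == 1:
        return mat[0][0]
    return sum((-1) ** j * mat[0][j]
               * det([row[:j] + row[j + 1:] for row in mat[1:]])
               for j in range(len(mat)))

def charmat(hw, hvx, hvy, m, l):
    """H_F - l*I, with m playing the role of the (v_f e / c) A_0 coupling."""
    return [[hw - l, hvx, 0, 0],
            [hvy, hw - l, m, 0],
            [0, m, -l, hvx],
            [0, 0, hvy, -l]]

# Same hypothetical numbers and fixed l, with and without the coupling:
without_A0 = det(charmat(0, 2, 2, 0, Fraction(5)))  # A_0 = 0
with_A0    = det(charmat(0, 2, 2, 3, Fraction(5)))  # A_0 != 0 (m = 3)

print(without_A0 != with_A0)  # True: the characteristic polynomial shifts with A_0
```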
 

FAQ: Eigenvectors 4x4 Matrix in Mathematica

What is an Eigenvector in a 4x4 matrix?

An Eigenvector of a 4x4 matrix is a nonzero vector whose direction is unchanged when it is multiplied by the matrix; it is only scaled by a constant factor, known as the Eigenvalue. In other words, an Eigenvector represents a stable direction of the linear transformation that the matrix describes.
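The definition amounts to the equation ##H\,v = \lambda v##, which can be checked directly. A small illustration in Python (the matrix and vector here are illustrative, not the ##H_F## from the thread):

```python
# A direct check of H.v == lambda * v for a simple 4x4 matrix.
H = [[2, 1, 0, 0],
     [1, 2, 0, 0],
     [0, 0, 3, 0],
     [0, 0, 0, 5]]
v = [1, 1, 0, 0]   # an eigenvector of H
lam = 3            # its eigenvalue

# Matrix-vector product, computed row by row.
Hv = [sum(H[i][j] * v[j] for j in range(4)) for i in range(4)]
print(Hv == [lam * x for x in v])  # True: direction unchanged, length scaled by 3
```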

How can I find Eigenvectors of a 4x4 matrix in Mathematica?

To find the Eigenvectors of a 4x4 matrix in Mathematica, you can use the built-in function Eigenvectors[m]. This function takes a matrix m and returns a list of Eigenvectors corresponding to the Eigenvalues of the matrix. The companion function Eigensystem[m] returns the Eigenvalues and Eigenvectors together.

Can a 4x4 matrix have more than 4 Eigenvectors?

No, a 4x4 matrix can have at most 4 linearly independent Eigenvectors. This is because the Eigenvectors live in a 4-dimensional space, which contains at most 4 linearly independent vectors. (There are infinitely many Eigenvectors in total, since any nonzero multiple of an Eigenvector is again an Eigenvector.)

Can two Eigenvectors of a 4x4 matrix have the same Eigenvalue?

Yes, it is possible for two linearly independent Eigenvectors of a 4x4 matrix to share the same Eigenvalue; such an Eigenvalue is called degenerate. In that case, any linear combination of the two Eigenvectors is again an Eigenvector with the same Eigenvalue.

Why are Eigenvectors and Eigenvalues important in 4x4 matrices?

Eigenvectors and Eigenvalues are important in 4x4 matrices as they provide valuable information about the behavior and properties of the matrix. They can be used to simplify calculations, solve differential equations, and identify important patterns within the data represented by the matrix.
