Eigenvectors/eigenvalues under row flipping

In summary: after performing the eigendecomposition of the matrix below, the question is what happens to the eigenvectors and eigenvalues of the matrix obtained by flipping (reordering) its rows. According to an online matrix calculator, the eigendecomposition data of the original and the flipped version have no relation. However, their SVDs appear related: the right singular vectors and singular values of the original and the flipped matrix are the same, and they equal the eigenvectors and eigenvalues of the original.
  • #1
onako
After performing the eigendecomposition of the following matrix, I wonder what happens to the eigenvectors and eigenvalues of the matrix obtained by flipping rows of the original. Say the original is
0 5 7 8
5 0 2 9
7 2 0 3
8 9 3 0
and the flipped version is:
5 0 2 9
8 9 3 0
7 2 0 3
0 5 7 8
At first glance, the online matrix calculator suggests that the eigendecomposition data of the original and the flipped version have no relation. But the SVDs of the original and the flipped matrix appear to be related: the right singular vectors and singular values of the two are the same, and they are equal to the eigenvectors and eigenvalues of the original. (The left singular vectors of the flipped matrix are also flipped.) I would like to hear the reasoning behind this behaviour.
More precisely, what is the relation between the eigendecomposition of the original and that of the flipped version, and how might this be related to the SVD?

Thanks
 
  • #2
Generally speaking, flipping one row with another can be described by a matrix acting from the left on another matrix, i.e.

A (the flipping matrix) * original_matrix = matrix_with_flipped_rows
(likewise, original_matrix * A = matrix_with_flipped_columns)
In this particular case A is simply a permutation matrix: an identity matrix with its rows reordered.

So in the case of the SVD, the U vectors absorb the row flipping applied to your original matrix: since your matrix is M = U*E*V^T, the flipped matrix is A*M = (A*U)*E*V^T, which is again a valid SVD because A*U is still orthogonal.

For the eigendecomposition, though, U and V have to be the same set of vectors, so U cannot simply absorb the change on its own, and a new set of eigenvectors has to be found.
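
To see this numerically, here is a minimal numpy sketch (not from the thread; the permutation indices encode the row reordering from post #1). It checks that the singular values and right singular vectors of the flipped matrix match the original, while the left singular vectors absorb the flip:

```python
import numpy as np

# The symmetric matrix from post #1 and its row reordering
# (new rows are the original rows 2, 4, 3, 1).
A = np.array([[0., 5., 7., 8.],
              [5., 0., 2., 9.],
              [7., 2., 0., 3.],
              [8., 9., 3., 0.]])
P = np.eye(4)[[1, 3, 2, 0]]   # permutation matrix: a reordered identity
B = P @ A                     # the flipped matrix

U1, s1, Vt1 = np.linalg.svd(A)
U2, s2, Vt2 = np.linalg.svd(B)

print(np.allclose(s1, s2))                      # True: same singular values
print(np.allclose(np.abs(Vt1), np.abs(Vt2)))    # True: same right vectors (up to sign)
print(np.allclose(np.abs(U2), np.abs(P @ U1)))  # True: left vectors absorb the flip
```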
 
  • #3
So, the eigenvectors and eigenvalues of the original and the flipped matrix are not related (meaning that the eigendecomposition of the flipped matrix needs to be computed anew)?
 
  • #4
Short answer: yes, it needs to be recomputed.

Start with the basic eigenvalue equation:

$A x = \lambda x$,

where $A$ is your original matrix, $x$ is an eigenvector, and $\lambda$ is the corresponding eigenvalue.

Now apply an invertible matrix $R$ (a rotation, say, or a row flip) on the left.

Now:
$R A x = \lambda R x$

To turn the $x$ on the left-hand side into the $R x$ that appears on the right (so that both sides involve the same vector), insert an identity matrix between $A$ and $x$ and, assuming $R$ is invertible, write it as $R^{-1} R$.

So you have
$(R A R^{-1}) (R x) = \lambda (R x)$

In that case the eigen data would be easily related: $R A R^{-1}$ has the same eigenvalues $\lambda$, with eigenvectors $R x$. But in your case no $R^{-1}$ is applied on the right, i.e. the flipped matrix is not similar to the original, and it will have its own eigenvalues/eigenvectors.
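
As a quick numerical check of this argument (a numpy sketch, not from the thread; R is the permutation from post #1):

```python
import numpy as np

# R*A*R^{-1} is similar to A (same eigenvalues); R*A alone is not.
A = np.array([[0., 5., 7., 8.],
              [5., 0., 2., 9.],
              [7., 2., 0., 3.],
              [8., 9., 3., 0.]])
R = np.eye(4)[[1, 3, 2, 0]]   # the row-flipping permutation

def spectrum(M):
    # Eigenvalues sorted for comparison (complex-safe for R @ A).
    return np.sort_complex(np.linalg.eigvals(M))

print(np.allclose(spectrum(R @ A @ np.linalg.inv(R)), spectrum(A)))  # True
print(np.allclose(spectrum(R @ A), spectrum(A)))                     # False
```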

 
  • #5
Thanks.
 

FAQ: Eigenvectors/eigenvalues under row flipping

What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are concepts from linear algebra that describe the properties of a linear transformation. An eigenvector is a nonzero vector that, when the transformation is applied, only changes in scale, not direction. The corresponding eigenvalue is the scalar that gives the amount of scaling.
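
As a minimal illustration (an arbitrary 2x2 example, not from the thread):

```python
import numpy as np

# Applying M to one of its eigenvectors only rescales it.
M = np.array([[2., 1.],
              [1., 2.]])
vals, vecs = np.linalg.eig(M)
v = vecs[:, 0]                          # first eigenvector (a column)
print(np.allclose(M @ v, vals[0] * v))  # True: M v = lambda v
```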

What is row flipping?

Row flipping is a matrix operation in which the rows of a matrix are reordered, for example reversed top to bottom or swapped pairwise. It is equivalent to multiplying the matrix on the left by a permutation matrix: the entries are preserved, but their positions change, so the result is in general a different matrix with different properties.
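
A small numpy sketch of the top-to-bottom case (illustrative, not from the thread):

```python
import numpy as np

# Reversing the rows of A equals left-multiplying by the
# identity matrix with its rows reversed (a permutation matrix).
A = np.arange(9.).reshape(3, 3)
P = np.eye(3)[::-1]
print(np.allclose(P @ A, np.flipud(A)))  # True
```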

How does row flipping affect eigenvectors and eigenvalues?

In general, they are not preserved. Flipping the rows of a matrix A amounts to forming PA for some permutation matrix P, and PA is usually not similar to A, so it has its own eigenvalues and eigenvectors. Eigenvalues are preserved only under a similarity transform such as P A P^{-1}. The singular values, by contrast, are preserved, since (PA)^T (PA) = A^T A.
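
A short check of this distinction, using the matrix from the thread (numpy sketch; for a permutation matrix, P^{-1} = P^T):

```python
import numpy as np

# P A P^T is a similarity transform, so it preserves the spectrum;
# P A alone generally does not (compare the traces: 22 vs 0 here).
A = np.array([[0., 5., 7., 8.],
              [5., 0., 2., 9.],
              [7., 2., 0., 3.],
              [8., 9., 3., 0.]])
P = np.eye(4)[[1, 3, 2, 0]]

print(np.allclose(np.linalg.eigvalsh(P @ A @ P.T),
                  np.linalg.eigvalsh(A)))   # True
```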

What are the practical applications of studying eigenvectors and eigenvalues?

The study of eigenvectors and eigenvalues is important in many fields, such as physics, engineering, and computer science. These concepts are used in data analysis, image processing, and machine learning to understand and analyze complex systems. They also have applications in quantum mechanics, fluid dynamics, and many other areas of science and engineering.

How can I calculate eigenvectors and eigenvalues?

There are various methods for calculating eigenvectors and eigenvalues, such as the power method, the QR algorithm, and the Jacobi method. These methods manipulate the matrix algebraically or numerically to find the values that satisfy the eigenvalue equation. There are also software packages, such as MATLAB and Mathematica, that can calculate eigenvectors and eigenvalues for you.
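
As an illustration, here is a minimal sketch of the power method (a simplified textbook version in numpy; it assumes a unique dominant eigenvalue and is not a production implementation):

```python
import numpy as np

def power_method(A, iters=1000, seed=0):
    # Iterate x -> A x / ||A x||; x converges to the eigenvector of the
    # eigenvalue largest in magnitude (assumed unique).
    x = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)
    lam = x @ A @ x   # Rayleigh quotient (x has unit norm)
    return lam, x

# The matrix from the thread.
A = np.array([[0., 5., 7., 8.],
              [5., 0., 2., 9.],
              [7., 2., 0., 3.],
              [8., 9., 3., 0.]])
lam, x = power_method(A)
print(np.allclose(A @ x, lam * x))  # True, up to iteration tolerance
```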
