What do the directions of eigenvalues represent?

In summary, the individual eigenvectors obtained from principal component analysis have arbitrary sign and length, but the angles between the subspaces they span can still be used to compare the two data sets. The significance of an eigenvector's direction depends on the application; in some cases, such as when studying Markov processes, the sign of the eigenvector's components does matter.
  • #1
preet
Background: I'm having trouble using principal component analysis to align two data sets.
I have two sets of 3D point data, and I can use PCA to get the principal axes of each set. I do this by finding the eigenvectors of the covariance matrix for each set, which gives me two sets of three principal axes (six eigenvectors in total). Each eigenvalue tells me how strongly the data varies along the axis that runs parallel to the corresponding eigenvector and through the centroid of the data. An eigenvector could therefore point in the opposite direction and the axis would still be the same. I want to know the significance of the direction in which the eigenvector points (what is the difference between an eigenvector and that eigenvector * -1?).

Problem: I'm trying to align the two data sets using PCA -- but I can't do this if the corresponding direction vectors are allowed to point in opposite directions. So if one of the axes 'x' from data set 1 is pointing toward a higher elevation, and axis 'x' from data set 2 is pointing the opposite way, I'm in trouble... I hope this makes some sense.
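As a concrete illustration of the sign ambiguity described above, here is a minimal numpy sketch (the synthetic point cloud and all variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 3D point cloud, elongated along one direction.
points = rng.normal(size=(500, 3)) * np.array([5.0, 2.0, 0.5])

# Principal axes: eigenvectors of the covariance matrix, sorted by eigenvalue.
centered = points - points.mean(axis=0)
cov = np.cov(centered.T)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
axes = eigvecs[:, order]                 # columns are the principal axes

# Sign ambiguity: v and -v are both unit eigenvectors for the same eigenvalue,
# so PCA alone cannot tell you which way the axis "points".
v = axes[:, 0]
lam = eigvals[order][0]
assert np.allclose(cov @ v, lam * v)
assert np.allclose(cov @ (-v), lam * (-v))
```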
 
  • #2
In determining eigenvectors, the choice of sign and length is arbitrary, since every nonzero vector along the axis you described is an eigenvector. So you want to compare the axes, not the specific eigenvectors (i.e. the angles between the subspaces spanned by the eigenvectors).

The situation gets more complicated if there are repeated eigenvalues (e.g. the 2x2 identity matrix), where the choice of direction is even more arbitrary: {(1,0),(0,1)} and {(1,1),(1,-1)} are both valid pairs of eigenvectors for the identity matrix. However, subspace angles should still be helpful in this situation.

Hope this helps. Out of curiosity what is the procedure to align the two data sets?
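The subspace-angle comparison suggested above can be sketched in numpy; the `principal_angles` helper below is illustrative, not a standard API (scipy also provides `scipy.linalg.subspace_angles` for the same purpose):

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians) between the column spaces of A and B."""
    Qa, _ = np.linalg.qr(A)  # orthonormal basis for span(A)
    Qb, _ = np.linalg.qr(B)  # orthonormal basis for span(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Flipping an eigenvector's sign changes the vector but not the subspace,
# so the angle between span{v} and span{-v} is zero.
v = np.array([[1.0], [2.0], [3.0]])
assert np.allclose(principal_angles(v, -v), 0.0)
```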
 
  • #3
You mean, of course, the directions of the eigenvectors, not eigenvalues.

What they "mean" depends on the application. For example, I could write the quadratic form [itex]x^2+ 4xy+ 3y^2[/itex] as a matrix formula:
[tex]\begin{bmatrix}x & y \end{bmatrix}\begin{bmatrix}1 & 2 \\ 2 & 3\end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}[/tex]
Because that is a symmetric matrix, it has two independent (in fact orthogonal) eigenvectors; their directions are the axes of symmetry of the conic -- for example, the major and minor axes if the conic is an ellipse.
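A short numpy check of this example: the eigenvectors of the symmetric matrix above are orthogonal, and rotating into their basis diagonalizes the quadratic form (eliminates the cross term):

```python
import numpy as np

# Matrix of the quadratic form x^2 + 4xy + 3y^2 from the post above.
M = np.array([[1.0, 2.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eigh(M)

# Eigenvectors of a symmetric matrix are orthogonal: these are the
# symmetry axes of the conic x^2 + 4xy + 3y^2 = const.
assert abs(eigvecs[:, 0] @ eigvecs[:, 1]) < 1e-12

# In the eigenvector coordinates the cross term vanishes:
D = eigvecs.T @ M @ eigvecs
assert np.allclose(D, np.diag(eigvals))
```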
 
  • #4
I can think of one case where the sign of the eigenvector's components matters. If you are studying Markov processes, and your matrix represents the transition probabilities, then all the physically meaningful vectors will represent probabilities. Therefore, their components satisfy 0 <= p <= 1 and always sum to unity.

One of the eigenvalues of such a matrix is always one, and the corresponding eigenvector is best represented as a vector with all non-negative components that sum to one. Now, one could argue that the negative of that vector is also an eigenvector, but it wouldn't be very useful.
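A small numpy sketch of this (the transition matrix is made up for illustration): renormalizing the eigenvector for eigenvalue one fixes both its sign and its scale, giving the stationary distribution.

```python
import numpy as np

# A hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

# The stationary distribution is the left eigenvector for eigenvalue 1,
# i.e. an ordinary eigenvector of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()   # dividing by the sum fixes both the sign and the scale

assert np.all(pi >= 0) and np.isclose(pi.sum(), 1.0)
assert np.allclose(pi @ P, pi)   # stationary: pi P = pi
```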
 

Related to What do the directions of eigenvalues represent?

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are mathematical concepts from linear algebra used to identify special directions in a given system. An eigenvector is a vector whose direction is preserved (or exactly reversed) when a given matrix is applied to it; the corresponding eigenvalue is the scalar factor by which that vector is stretched or shrunk.
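A two-line numpy check of the definition (the matrix and vector are illustrative):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([0.0, 1.0])   # an eigenvector of A
lam = 3.0                  # its eigenvalue

# A v points along v, scaled by the eigenvalue: A v = lambda v.
assert np.allclose(A @ v, lam * v)
```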

2. What do the directions of eigenvalues represent?

Strictly speaking, eigenvalues are scalars and have no direction; the question refers to the directions of the associated eigenvectors. These directions give the principal axes of a system: for a symmetric matrix (such as a covariance matrix) they are mutually orthogonal and represent the main directions of variation or change in the system.

3. How are eigenvalues and eigenvectors calculated?

Eigenvalues and eigenvectors can be calculated by solving the characteristic equation of the matrix, which is obtained by subtracting an unknown scalar value from the diagonal elements of the matrix and setting the determinant of the resulting matrix to zero. The resulting solutions are the eigenvalues, and the corresponding eigenvectors can be found by substituting the eigenvalues back into the original equation.
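For a 2x2 matrix this procedure can be carried out directly; a numpy sketch (the matrix is illustrative):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For a 2x2 matrix, det(A - lambda I) = lambda^2 - tr(A) lambda + det(A).
tr, det = np.trace(A), np.linalg.det(A)
lams = np.roots([1.0, -tr, det])   # eigenvalues = roots of the characteristic polynomial
assert np.allclose(sorted(lams), sorted(np.linalg.eigvals(A)))

# Back-substitute one eigenvalue to find its eigenvector: solve (A - lam I) v = 0.
# From the first row, with v0 = 1: v1 = (lam - A[0,0]) / A[0,1]  (requires A[0,1] != 0).
lam = max(lams)
v = np.array([1.0, (lam - A[0, 0]) / A[0, 1]])
v = v / np.linalg.norm(v)
assert np.allclose(A @ v, lam * v)
```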

4. What do negative, zero, and complex eigenvalues represent?

Negative eigenvalues represent a flip or reflection of the system along the corresponding eigenvector direction, combined with scaling by the eigenvalue's magnitude. A zero eigenvalue means the system collapses the corresponding direction entirely (the matrix is singular). Eigenvalues with magnitude less than one correspond to a contraction, and magnitude greater than one to an expansion, along the eigenvector direction. Complex eigenvalues represent a rotation (combined with scaling by the magnitude) within a plane; in that case there is no real eigenvector direction.
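These cases can be checked numerically; a numpy sketch with illustrative matrices:

```python
import numpy as np

# Reflection across the x-axis: eigenvalue -1 flips the y direction.
R = np.diag([1.0, -1.0])
assert np.allclose(sorted(np.linalg.eigvals(R)), [-1.0, 1.0])

# A rank-deficient map collapses a direction: it has a zero eigenvalue.
S = np.array([[1.0, 0.0],
              [0.0, 0.0]])
assert np.isclose(np.min(np.abs(np.linalg.eigvals(S))), 0.0)

# A rotation by theta has complex eigenvalues e^{+i theta}, e^{-i theta}.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(np.sort_complex(np.linalg.eigvals(Q)), np.sort_complex(expected))
```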

5. How are eigenvalues and eigenvectors used in real-world applications?

Eigenvalues and eigenvectors have various applications in fields such as physics, engineering, and data analysis. They are used to solve systems of differential equations, identify important features in data sets, and analyze the stability of physical systems. They are also used in image processing and pattern recognition algorithms.
