nomadreid
- TL;DR Summary
- The following argument is obviously wrong somewhere. If a is an eigenvalue of the matrix M, then for each of its eigenvectors v, Mv=av. But then for any nonzero scalar k, M(k*v)= (a/k)(k*v), so that a/k is an eigenvalue of M with eigenvector k*v. This would lead to the absurdity that if an eigenvalue exists, then every number is an eigenvalue of the matrix.
The fallacy in this summary is not covered on the sites discussing eigenvalues, so there must be something blindingly and embarrassingly obvious that I am missing. I would be grateful if someone would point it out (a small numerical check of the claim is sketched below). Thanks.
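For anyone who wants to test the claim concretely, here is a minimal numerical sketch, assuming Python with NumPy; the matrix M, the choice of eigenpair, and the scalar k are arbitrary illustrations, not part of the original argument. It compares M(k*v) against a*(k*v) and (a/k)*(k*v):

```python
import numpy as np

# A small example matrix (arbitrary choice for illustration).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Take one eigenvalue a and a corresponding eigenvector v.
eigvals, eigvecs = np.linalg.eig(M)
a = eigvals[0]
v = eigvecs[:, 0]

k = 5.0      # any nonzero scalar
w = k * v    # the rescaled vector from the argument

print("M @ w       =", M @ w)
print("a * w       =", a * w)        # compare with M @ w
print("(a/k) * w   =", (a / k) * w)  # compare with M @ w
```

Running it shows directly which of the two candidate values actually satisfies the eigenvalue equation for k*v.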