Why can't we define an eigenvalue of a matrix as any scalar value?

In summary, we cannot simply declare ##\lambda = 1## and call ##1## an eigenvalue of the matrix, because the same column vector must appear on both sides of the equation ##A\mathbf x = \lambda \mathbf x## for ##(\lambda, \mathbf x)## to be an eigenvalue/eigenvector pair. In the given example, the vector produced on the right is not a scalar multiple of the vector on the left, so ##1## is not a valid eigenvalue.
  • #1
member 731016
Homework Statement
Please see below
Relevant Equations
Please see below
For this,
[Attached image: ##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = \begin{bmatrix}1 \\ 5 \end{bmatrix}##]

Dose anybody please know why we cannot say ##\lambda = 1## and then ##1## would be the eigenvalue of the matrix?

Many thanks!
 
  • #2
The result of the multiplication is ##\begin{bmatrix} 1 \\ 5 \end{bmatrix}##, not ##\begin{bmatrix} \lambda \\ 0 \end{bmatrix}##, so it doesn't matter what the value of ##\lambda## is.
 
  • #3
ChiralSuperfields said:
Dose anybody please know why we cannot say λ=1 and then 1 would be the eigenvalue of the matrix?
"Dose" -- an amount of medicine.
"Does" -- third person singular conjugation of the infinitive verb "to do."

An eigenvalue ##\lambda## is a number such that for an eigenvector x, ##A\mathbf x = \lambda \mathbf x##.

For the matrix you asked about ##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = \begin{bmatrix}1 \\5 \end{bmatrix} \ne \lambda \begin{bmatrix}1 \\ 0 \end{bmatrix}## for any value of ##\lambda##.
 
  • #4
Thank you for your replies @FactChecker and @Mark44!

Sorry, I still don't understand. I'll try to explain my understanding so that any misconception can be exposed. ##\lambda## is the constant that is factored out in front of the column vector ##\vec x##, and that constant is called the eigenvalue. For Examples 1 and 2 below, the constants multiplied by the column vectors are ##\lambda = 7## and ##\lambda = -4##, respectively.
[Attached image: worked Examples 1 and 2 from the textbook, with eigenvalues ##\lambda = 7## and ##\lambda = -4##]


However, for this example,

##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = \begin{bmatrix}1 \\5 \end{bmatrix}##, why can't we factor out a 1 from the column vector to get ##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = 1 \begin{bmatrix}1 \\5 \end{bmatrix}##?

According to the textbook, ##\lambda## can be any real number, so why can't ##1## be an eigenvalue?

Many thanks!
 
  • #5
Mark44 said:
An eigenvalue ##\lambda## is a number such that for an eigenvector x, ##A\mathbf x = \lambda \mathbf x##.
You didn't read what I wrote in my previous post carefully enough. An eigenvalue is closely associated with a specific eigenvector. In the equation above, x is an eigenvector that appears on both sides of the equation. For an eigenvalue/eigenvector pair, multiplication of the vector by the matrix produces a value that is a scalar multiple (i.e., the eigenvalue) of that same vector.
ChiralSuperfields said:
However, for this example,
##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = \begin{bmatrix}1 \\5 \end{bmatrix}##, why can't we factor out a 1 from the column vector to get ##\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} \begin{bmatrix}1 \\ 0 \end{bmatrix} = 1 \begin{bmatrix}1 \\5 \end{bmatrix}##.
Because ##\begin{bmatrix}1 \\ 0 \end{bmatrix}## isn't the vector that appears on both sides of the equation.
 
  • #6
Mark44 said:
You didn't read what I wrote in my previous post carefully enough. An eigenvalue is closely associated with a specific eigenvector. In the equation above, x is an eigenvector that appears on both sides of the equation. For an eigenvalue/eigenvector pair, multiplication of the vector by the matrix produces a value that is a scalar multiple (i.e., the eigenvalue) of that same vector.

Because ##\begin{bmatrix}1 \\ 0 \end{bmatrix}## isn't the vector that appears on both sides of the equation.
Oh, thank you @Mark44! I see now. Sorry, I forgot that the column vector has to be the same on both sides.
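The check discussed in the posts above can be sketched in Python with NumPy (a minimal illustration added for clarity; the thread itself contains no code):

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])
x = np.array([1, 0])

Ax = A @ x  # matrix-vector product: [1, 5]

# For x to be an eigenvector, Ax would have to equal lambda * x for some
# scalar lambda, i.e. a vector of the form [lambda, 0]. It is not.
print(Ax)                      # [1 5]
print(np.allclose(Ax, 1 * x))  # False: 1 is not an eigenvalue for this x
```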
 

FAQ: Why can't we define an eigenvalue of a matrix as any scalar value?

What is an eigenvalue of a matrix?

An eigenvalue of a matrix is a scalar value λ such that there exists a non-zero vector v (called an eigenvector) where the matrix A, when multiplied by v, equals λ times v. In mathematical terms, this is expressed as Av = λv.
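As a quick numerical check of the definition Av = λv, using the matrix from the thread (a sketch with NumPy's `np.linalg.eig`; the choice of tooling is an assumption, not part of the FAQ):

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])

# eig returns the eigenvalues and the corresponding column eigenvectors of A.
eigvals, eigvecs = np.linalg.eig(A)

# Every returned pair satisfies A v = lambda v, up to floating-point error.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigvals))  # the eigenvalues of this matrix are -4 and 7
```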

Why can't any scalar value be an eigenvalue of a matrix?

Not every scalar value can be an eigenvalue of a matrix because eigenvalues are specific to the matrix's properties. They are determined by solving the characteristic equation det(A - λI) = 0, where A is the matrix, λ is the eigenvalue, and I is the identity matrix. Only the values of λ that satisfy this equation are considered eigenvalues.
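For the matrix from the thread, the characteristic equation can be worked out explicitly:

##\det\left(\begin{bmatrix}1 & 6 \\ 5 & 2\end{bmatrix} - \lambda I\right) = (1-\lambda)(2-\lambda) - 30 = \lambda^2 - 3\lambda - 28 = (\lambda - 7)(\lambda + 4) = 0,##

so the only eigenvalues are ##\lambda = 7## and ##\lambda = -4##; no other scalar, including ##\lambda = 1##, satisfies the equation.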

What does the characteristic equation represent?

The characteristic equation det(A - λI) = 0 represents a polynomial equation derived from the matrix A. The roots of this polynomial are the eigenvalues of the matrix. This equation ensures that the eigenvalue λ satisfies the condition Av = λv for some non-zero vector v.
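This can be sketched with NumPy (using `np.poly` to get the characteristic-polynomial coefficients and `np.roots` to solve it; an illustrative sketch, not part of the original FAQ):

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])

# Coefficients of det(A - lambda*I) = lambda^2 - 3*lambda - 28,
# highest degree first.
coeffs = np.poly(A)
print(coeffs)  # [  1.  -3. -28.]

# The roots of the characteristic polynomial are the eigenvalues.
print(np.sort(np.roots(coeffs)))  # [-4.  7.]
```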

Can a matrix have multiple eigenvalues?

Yes, a matrix can have multiple eigenvalues. The number of eigenvalues a matrix has (counting multiplicities) is equal to the dimension of the matrix (i.e., the number of rows or columns). These eigenvalues can be real or complex numbers.
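For instance, a real matrix can have complex eigenvalues; a 90-degree rotation matrix is a standard example (sketched here in NumPy, as an addition to the FAQ):

```python
import numpy as np

# 90-degree rotation in the plane: no real vector keeps its direction,
# so there are no real eigenvalues.
R = np.array([[0, -1],
              [1,  0]])

eigvals, _ = np.linalg.eig(R)
print(eigvals)                   # the complex pair i and -i
print(np.iscomplexobj(eigvals))  # True
```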

What happens if a scalar value does not satisfy the characteristic equation?

If a scalar value does not satisfy the characteristic equation det(A - λI) = 0, it means that there is no non-zero vector v such that Av = λv for that scalar value. Therefore, it cannot be considered an eigenvalue of the matrix A.
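Continuing the thread's example, λ = 1 fails this test for the matrix above while λ = 7 passes it (a NumPy sketch added for illustration):

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])
I = np.eye(2)

# det(A - 1*I) is nonzero, so (A - I)v = 0 has only the zero solution
# and 1 is not an eigenvalue.
print(round(np.linalg.det(A - 1 * I)))  # -30

# det(A - 7*I) is zero (up to rounding), so 7 is an eigenvalue.
print(round(np.linalg.det(A - 7 * I)))  # 0
```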
