Almost all matrices have n distinct eigenvalues

In summary, every $n \times n$ complex matrix is similar to an upper triangular matrix, and perturbing the diagonal entries of that triangular form by arbitrarily small pairwise distinct amounts yields a nearby matrix with $n$ distinct eigenvalues. Therefore, almost all $n \times n$ complex matrices have $n$ distinct eigenvalues.
  • #1
Nono713
Gold Member
MHB
Consider the following problem:

Prove that almost all $n \times n$ complex matrices have $n$ distinct eigenvalues. That is, show that given an arbitrary $n \times n$ complex matrix $A$, for every $\epsilon > 0$ there exists a matrix $A'$ such that
$$\max_{1 \leq i, j \leq n} \left \lvert A_{ij} - A'_{ij} \right \rvert < \epsilon$$
and $A'$ has $n$ distinct eigenvalues.

I would like to solve this problem using only elementary techniques, along with the fact that almost all $n \times n$ complex matrices are invertible (obviously we don't use the fact that almost all matrices are diagonalizable). Here is my approach:


Let $A$ be an $n \times n$ complex matrix and $\epsilon > 0$. Assume $n \geq 2$, since the case $n = 1$ is trivial. If $A$ already has $n$ distinct eigenvalues, we are done. If not, choose complex numbers $\epsilon_1, \epsilon_2, \cdots, \epsilon_n$ (to be determined below) with $\lvert \epsilon_i \rvert < \epsilon$, and let:
$$A' = A + \left [ \begin{matrix} \epsilon_1 &\epsilon_2 &\cdots &\epsilon_n \\ 0 &0 &\cdots &0 \\ \vdots &\vdots &\vdots &\vdots \end{matrix} \right ] = \left [ \begin{matrix} A_{11} + \epsilon_1 &A_{12} + \epsilon_2 &\cdots &A_{1n} + \epsilon_n \\ A_{21} &A_{22} &\cdots &A_{2n} \\ \vdots &\vdots &\vdots &\vdots \\ A_{n1} &A_{n2} &\cdots &A_{nn} \end{matrix} \right ]$$
By Laplace expansion along the first row, the characteristic polynomial of $A$ is given by
$$p_A(z) = \det(zI - A) = \sum_{i = 1}^n (-1)^{1 + i} \left( z \, \delta_{1i} - A_{1i} \right) q_i(z)$$
where $\delta_{1i}$ is the Kronecker delta and $q_i(z)$ denotes the determinant of the $(n - 1) \times (n - 1)$ matrix obtained from $zI - A$ by deleting the first row and the $i$th column. (Note that for $i \geq 2$ this minor is not itself of the form $zI - B$, but each $q_i$ is still a fixed polynomial in $z$ determined by the last $n - 1$ rows of $A$.) Observe that since $A$ and $A'$ are identical in all entries outside the first row, $zI - A$ and $zI - A'$ have the same first-row minors, and by construction $A'_{1i} = A_{1i} + \epsilon_i$, hence
$$p_{A'}(z) = \sum_{i = 1}^n (-1)^{1 + i} \left( z \, \delta_{1i} - A_{1i} - \epsilon_i \right) q_i(z) = p_A(z) - \sum_{i = 1}^n (-1)^{1 + i} \epsilon_i \, q_i(z)$$
Denote the $n$ roots of $p_A$ (counted with multiplicity) by $z_1, z_2, \cdots, z_n$, so that $p_A(z_j) = 0$ for all $1 \leq j \leq n$, and select $\delta_1, \delta_2, \cdots, \delta_n$ such that the points $z_j + \delta_j$ are pairwise distinct. Then we have
$$p_{A'}(z_j + \delta_j) = p_A(z_j + \delta_j) - \sum_{i = 1}^n (-1)^{1 + i} \epsilon_i \, q_i(z_j + \delta_j)$$
Because almost all matrices are invertible, there exists a suitable choice of $(\delta_j)$ such that
$$Q = \left [ \begin{matrix} q_1(z_1 + \delta_1) &q_2(z_1 + \delta_1) &\cdots &q_n(z_1 + \delta_1) \\ q_1(z_2 + \delta_2) &\ddots & &\vdots \\ \vdots & &\ddots &\vdots \\ q_1(z_n + \delta_n) &\cdots &\cdots &q_n(z_n + \delta_n) \end{matrix} \right ]$$
is invertible. Then it suffices to set
$$\left [ \begin{matrix} \epsilon_1 \\ -\epsilon_2 \\ \vdots \\ (-1)^{n + 1} \epsilon_n \end{matrix} \right ] = Q^{-1} \left [ \begin{matrix} p_A(z_1 + \delta_1) \\ p_A(z_2 + \delta_2) \\ \vdots \\ p_A(z_n + \delta_n) \end{matrix} \right ]$$
so that $p_{A'}(z_j + \delta_j) = 0$ for every $j$. Now observe that $p_A(z_j + \delta_j) \to 0$ as $\delta_j \to 0$ by continuity, since $p_A(z_j) = 0$, so by continuity of the solution of this linear system we can make the $(\delta_j)$ sufficiently small (while keeping $Q$ invertible) to achieve $\lvert \epsilon_i \rvert < \epsilon$. At this point $p_{A'}$ is a monic polynomial of degree $n$ vanishing at the $n$ distinct points $z_j + \delta_j$, hence $p_{A'}$ has $n$ distinct roots, and the proof is complete.
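The expansion step, at least, is easy to sanity-check numerically. A quick sketch (random matrix, arbitrary size and tolerance): writing $q_i(z)$ for the determinant of $zI - A$ with its first row and $i$th column deleted, perturbing only the first row of $A$ should change the characteristic polynomial by $p_{A'}(z) = p_A(z) - \sum_{i=1}^n (-1)^{1+i} \epsilon_i q_i(z)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
eps = 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# A' differs from A only in the first row: A'_{1i} = A_{1i} + eps_i.
Ap = A.copy()
Ap[0, :] += eps

def q(A, z, i):
    # Determinant of zI - A with the first row and column i deleted (0-based i).
    M = z * np.eye(len(A)) - A
    return np.linalg.det(np.delete(np.delete(M, 0, axis=0), i, axis=1))

# Check p_{A'}(z) = p_A(z) - sum_i (-1)^(1+i) eps_i q_i(z) at random points.
# With 0-based i, the sign (-1)^(1+i) becomes (-1)^i.
for z in rng.standard_normal(5) + 1j * rng.standard_normal(5):
    lhs = np.linalg.det(z * np.eye(n) - Ap)
    rhs = np.linalg.det(z * np.eye(n) - A) - sum(
        (-1) ** i * eps[i] * q(A, z, i) for i in range(n)
    )
    assert abs(lhs - rhs) < 1e-9 * (1 + abs(lhs) + abs(rhs))
```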


Does this solution make sense? And is there a more direct proof (elementary, i.e. using only basic linear algebra techniques) available? Thanks.
 
  • #2


Your overall strategy makes sense, but one step deserves more care: "almost all matrices are invertible" does not immediately give a suitable choice of $(\delta_j)$, because the entries of the matrix you invert are not free parameters — they are values of fixed polynomials determined by $A$. Note also that as $\delta_j \to 0$, rows of that matrix corresponding to a repeated eigenvalue become equal, so its inverse blows up exactly in the limit you need, and the bound $\lvert \epsilon_i \rvert < \epsilon$ requires justification.

There is a more direct elementary proof. Every $n \times n$ complex matrix can be triangularized: $A = P T P^{-1}$ with $T$ upper triangular. (This follows by induction on $n$, splitting off one eigenvector at a time; no Jordan form is needed.) The eigenvalues of $A$ are exactly the diagonal entries $T_{11}, \ldots, T_{nn}$, since $\det(zI - T) = \prod_i (z - T_{ii})$.

Now choose complex numbers $\eta_1, \ldots, \eta_n$, as small as we like, such that the numbers $T_{ii} + \eta_i$ are pairwise distinct (only finitely many coincidences must be avoided), and set
$$A' = P \left( T + \operatorname{diag}(\eta_1, \ldots, \eta_n) \right) P^{-1} = A + P \operatorname{diag}(\eta_1, \ldots, \eta_n) P^{-1}.$$
Then $A'$ is similar to an upper triangular matrix with pairwise distinct diagonal entries, so it has $n$ distinct eigenvalues. Furthermore, since $(P \operatorname{diag}(\eta) P^{-1})_{ij} = \sum_k P_{ik} \eta_k (P^{-1})_{kj}$, we get entrywise
$$\max_{1 \leq i, j \leq n} \lvert A_{ij} - A'_{ij} \rvert \leq C \max_k \lvert \eta_k \rvert, \qquad C = n \max_{i,j} \lvert P_{ij} \rvert \cdot \max_{i,j} \lvert (P^{-1})_{ij} \rvert.$$
Taking $\max_k \lvert \eta_k \rvert < \epsilon / C$ completes the proof.
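This argument is easy to sanity-check numerically. A minimal sketch, using `scipy.linalg.schur` for a unitary triangularization $A = Z T Z^H$ (the example matrix, the value of $\epsilon$, and the tolerances are arbitrary illustrative choices):

```python
import numpy as np
from scipy.linalg import schur

# A defective matrix: a 3x3 Jordan block, eigenvalue 2 repeated three times.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]], dtype=complex)
n = A.shape[0]
eps = 1e-2

# Triangularize: A = Z T Z^H with Z unitary and T upper triangular.
T, Z = schur(A, output="complex")

# Shift the diagonal of T by small pairwise-distinct amounts eta_i.
eta = (eps / (2 * n)) * np.arange(n)
Ap = Z @ (T + np.diag(eta)) @ Z.conj().T

# A' stays entrywise within eps of A (Z is unitary, so the constant C is tame)...
assert np.abs(A - Ap).max() < eps
# ...and now has n pairwise distinct eigenvalues.
w = np.linalg.eigvals(Ap)
gaps = [abs(w[i] - w[j]) for i in range(n) for j in range(i)]
assert min(gaps) > eps / (4 * n)
```

Using a unitary triangularization keeps the constant $C$ small; the elementary proof works with any invertible $P$.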
 

FAQ: Almost all matrices have n distinct eigenvalues

What does it mean for a matrix to have n distinct eigenvalues?

Having n distinct eigenvalues means there are n different scalars λ for which the equation Av = λv has a nonzero solution v; such a vector is scaled by λ without changing the line it spans. These scalars are known as eigenvalues and are important in understanding the behavior of a matrix.

Why is it important for a matrix to have distinct eigenvalues?

Distinct eigenvalues are important because they guarantee that the matrix has n linearly independent eigenvectors and is therefore diagonalizable, which simplifies calculations and makes it easier to understand the behavior of the matrix. Additionally, the eigenvalues provide insight into the geometric and algebraic properties of the matrix.
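This can be illustrated with a small numerical example (the matrix below is an arbitrary choice with eigenvalues 1 and 2):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])   # characteristic polynomial (z - 1)(z - 2)
w, V = np.linalg.eig(A)       # columns of V are eigenvectors

# Distinct eigenvalues -> linearly independent eigenvectors -> V invertible,
# and conjugating by V diagonalizes A.
assert abs(np.linalg.det(V)) > 1e-9
assert np.allclose(np.linalg.inv(V) @ A @ V, np.diag(w), atol=1e-9)
```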

How can you determine the number of distinct eigenvalues a matrix has?

The number of distinct eigenvalues of an n x n matrix is at most n, but it can be smaller: the 2x2 identity matrix, for example, has only the single eigenvalue 1. To determine the number, find the matrix's characteristic polynomial and count its distinct roots. (The point of this thread is that for almost all complex matrices, the count is exactly n.)
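In floating point, "distinct" requires a tolerance. One hedged way to count distinct eigenvalues numerically (the function name and tolerance are illustrative choices):

```python
import numpy as np

def num_distinct_eigenvalues(M, tol=1e-9):
    # Sort eigenvalues (lexicographically by real, then imaginary part)
    # so equal values are adjacent, then count gaps larger than tol.
    w = np.sort_complex(np.linalg.eigvals(M))
    return 1 + int(np.sum(np.abs(np.diff(w)) > tol))

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])   # eigenvalue 2 repeated
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # eigenvalues 2 and 3

assert num_distinct_eigenvalues(A) == 1
assert num_distinct_eigenvalues(B) == 2
```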

Are there any special types of matrices that always have distinct eigenvalues?

No familiar structural class guarantees distinct eigenvalues: even the identity matrix, which is diagonal, has a single repeated eigenvalue. The implication runs the other way: a matrix with n distinct eigenvalues is always diagonalizable. Similarly, real symmetric and Hermitian matrices are always diagonalizable with real eigenvalues, but those eigenvalues need not be distinct.

What happens if a matrix does not have distinct eigenvalues?

A matrix with repeated eigenvalues may still be diagonalizable (the identity matrix is an example). If, however, it does not have n linearly independent eigenvectors, it is known as a defective matrix. A defective matrix cannot be diagonalized, and its behavior is more difficult to understand; one must work with generalized eigenvectors (the Jordan form) instead.
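For instance (a sketch; the 2x2 Jordan block below is the standard defective example):

```python
import numpy as np

# Defective: eigenvalue 2 repeated, but only one eigenvector direction.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
w, V = np.linalg.eig(J)

assert np.allclose(w, [2.0, 2.0])
# The two returned eigenvector columns are (numerically) parallel,
# so V is singular: no basis of eigenvectors exists.
assert abs(np.linalg.det(V)) < 1e-6
```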
