Diagonalizability of Invertible Matrices in Z_p

In summary, the conversation discusses determining if a matrix is diagonalizable if its order divides p-1. Suggestions were given to use Fermat's Little Theorem and properties of diagonal matrices to prove both directions of the theorem. It was also mentioned that the minimal polynomial of A divides x^t - 1, but it does not necessarily equal it. Further considerations were given to the multiplicities of eigenvalues and how this relates to Fermat's Little Theorem.
  • #1
Treadstone 71
"Let A be an invertible matrix with entries in Z_p. Show that A is diagonalizable if and only if its order (the least t such that A^t=1 in GL_n(Z_p)) divides p-1."

I got the => direction, but I'm having trouble with the backwards direction. Any hints?
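
Not part of the original question, but here is a small brute-force check of the statement for p = 3 and 2x2 matrices. It is only a sketch: the helper functions and parameter choices below are made up for this illustration, not taken from the thread.

```python
# Brute-force check of the statement for p = 3 and 2x2 matrices:
# an invertible A over Z_p is diagonalizable iff its order in GL_n(Z_p)
# divides p - 1. A sketch only; all helpers are written for this example.
from itertools import product

def matmul(A, B, p):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) % p for j in range(n)]
            for i in range(n)]

def identity(n):
    return [[int(i == j) for j in range(n)] for i in range(n)]

def rank_mod_p(M, p):
    """Rank over Z_p by Gaussian elimination (p prime)."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] % p), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        inv = pow(M[r][c], p - 2, p)          # inverse via Fermat's Little Theorem
        M[r] = [(x * inv) % p for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(M[i][j] - f * M[r][j]) % p for j in range(cols)]
        r += 1
    return r

def order(A, p, bound):
    """Least t >= 1 with A^t = I (bound must exceed the true order)."""
    B, I = A, identity(len(A))
    for t in range(1, bound + 1):
        if B == I:
            return t
        B = matmul(B, A, p)
    return None

def is_diagonalizable(A, p):
    """Diagonalizable over Z_p iff the eigenspaces for eigenvalues in Z_p
    span Z_p^n, i.e. the nullities of A - lam*I sum to n."""
    n = len(A)
    total = 0
    for lam in range(p):
        M = [[(A[i][j] - (lam if i == j else 0)) % p for j in range(n)]
             for i in range(n)]
        total += n - rank_mod_p(M, p)
    return total == n

p, n = 3, 2
for entries in product(range(p), repeat=n * n):
    A = [list(entries[i * n:(i + 1) * n]) for i in range(n)]
    if rank_mod_p(A, p) < n:                  # skip non-invertible matrices
        continue
    t = order(A, p, bound=p ** (n * n))       # p^(n^2) exceeds |GL_n(Z_p)|
    assert is_diagonalizable(A, p) == ((p - 1) % t == 0)
print("checked: all invertible 2x2 matrices over Z_3 agree with the claim")
```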
 
  • #2
EDIT: I originally made three posts, but I'll put them all in one:

----------------------------------------------------------------

POST 1:

Suppose A is diagonalizable but its order does not divide p-1. Let D = (d_ij) be the corresponding diagonal matrix. Then use:
- Fermat's Little Theorem
- the fact that D is diagonal
- the fact that A and D are similar
- then use the division algorithm together with the assumption that the order of A does not divide p-1 to derive a contradiction which essentially says "if t is the order of A, i.e. if t is the least positive natural such that A^t = 1, then there exists a t' such that 0 < t' < t but such that A^{t'} = 1" (a sketch of this argument follows below)
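
To see how those ingredients fit together (a sketch in the notation above, with P denoting a change-of-basis matrix that the post does not name): write A = P D P^{-1} with D = diag(d_1, ..., d_n). Since A is invertible, every d_i lies in Z_p^*, so d_i^{p-1} = 1 by Fermat's Little Theorem, hence D^{p-1} = I and

A^{p-1} = P D^{p-1} P^{-1} = I.

Now write p-1 = qt + r with 0 <= r < t by the division algorithm. Then A^r = A^{p-1} (A^t)^{-q} = I, and if t did not divide p-1 we would have 0 < r < t, contradicting the minimality of t.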

EDIT TO POST 1: Oops, I guess that's the direction you already proved. I'll have to think some more.

----------------------------------------------------------------

POST 2:

Just throwing out some ideas:

1) the characteristic polynomial of a diagonal matrix splits (its irreducible factors are all linear)
2) matrices with order dividing p-1 form a normal subgroup of GL_n(Z_p) - maybe the orbit-stabilizer theorem or the class equation can be used here (you want to show that the conjugacy class of every matrix whose order divides p-1 contains a diagonal matrix).

----------------------------------------------------------------

POST 3:

I'm rusty on the linear algebra, but how about this:

The order of A divides p-1
implies
The minimal polynomial of A is x^t - 1, where t is the order of A
implies
The minimal polynomial of A splits (since x^t - 1 = 0 has solutions in Z_p iff t | p-1)
implies
The char poly of A splits
implies
A is diagonalizable (I think there's a theorem showing that the char poly splits iff A is diagonalizable).

EDIT TO POST 3: Actually, it wouldn't surprise me if the "implies"s can be changed to "iff"s, but at the same time, it wouldn't surprise me if some of the "implies"s were wrong altogether. It's been well over a year since I did any linear algebra, especially anything to do with diagonalization. And I've never really done any linear algebra over finite fields. So check your theorems in your book, and see if the above proof a) is correct, and b) can be strengthened so the "implies"s can become "iff"s, which would then prove both directions of the theorem simultaneously, and then get back to me about it.
 
  • #3
That the order of A divides p-1 does not necessarily imply that the minimal polynomial of A is x^t - 1, but it does imply that the minimal polynomial of A DIVIDES x^t - 1. I'm not sure if your implications still follow; thinking.
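
A quick example to illustrate the distinction (made up for illustration, not from the thread): over Z_5, take A = diag(1, 2). Since 2 has order 4 mod 5, the order of A is t = 4, but the minimal polynomial of A is (x-1)(x-2), which divides x^4 - 1 = (x-1)(x-2)(x-3)(x-4) in Z_5[x] without being equal to it.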
 
  • #4
We know the minimal poly divides x^{p-1}-1

This tells us everything we need to know about the minimal poly. Think about what the multiplicities of the eigenvalues can be (think Fermat's Little Theorem). Think back to the other question you posted too, about multiplicity one eigenvalues.
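
One way to spell out where this hint seems to point (a sketch only; check the details): by Fermat's Little Theorem every nonzero a in Z_p satisfies a^{p-1} = 1, so the degree-(p-1) polynomial x^{p-1} - 1 has the p-1 distinct roots 1, 2, ..., p-1, giving

x^{p-1} - 1 = (x - 1)(x - 2) ... (x - (p-1)) in Z_p[x].

The minimal polynomial of A divides this, so it is a product of distinct linear factors, and a matrix is diagonalizable over a field exactly when its minimal polynomial is a product of distinct linear factors over that field.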
 

FAQ: Diagonalizability of Invertible Matrices in Z_p

What does it mean for a matrix to be diagonalizable in Z_p?

Diagonalizability in Z_p means that a square matrix can be transformed into a diagonal matrix through a similarity transformation using elements from the finite field Z_p. This is important because it simplifies calculations and makes it easier to find solutions to certain problems.

How do you determine if a matrix is invertible in Z_p?

A matrix is invertible in Z_p if its determinant is not zero in the finite field Z_p. This means that the matrix has a unique inverse that can be calculated using modular arithmetic in Z_p. If the determinant is zero, the matrix is not invertible in Z_p.
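
As an illustration (a minimal sketch using sympy, with made-up example entries):

```python
# Check invertibility of a matrix over Z_p via its determinant mod p,
# and compute the modular inverse when it exists.
from sympy import Matrix

p = 7
A = Matrix([[2, 5], [1, 3]])          # example entries chosen arbitrarily

if A.det() % p != 0:
    A_inv = A.inv_mod(p)              # inverse with entries reduced mod p
    print("invertible mod", p, "- inverse:", A_inv.tolist())
else:
    print("not invertible mod", p)
```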

Can all invertible matrices in Z_p be diagonalized?

No, not all invertible matrices in Z_p are diagonalizable. In order for a matrix to be diagonalizable, it must have a complete set of eigenvectors with entries in Z_p (a basis of Z_p^n consisting of eigenvectors). Not all matrices have this property, and in such cases the matrix cannot be diagonalized in Z_p.
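
For example (an illustration, not taken from the thread): over Z_p the matrix A = [[1, 1], [0, 1]] is invertible, but its only eigenvalue is 1 and the corresponding eigenspace is one-dimensional, so it is not diagonalizable. Consistently with the result discussed in the thread, its order is p (since A^k = [[1, k], [0, 1]]), and p does not divide p-1.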

Are there any advantages to diagonalizing invertible matrices in Z_p?

Yes, diagonalizing invertible matrices in Z_p can have several advantages. It simplifies calculations and makes it easier to solve certain problems, such as finding eigenvalues and eigenvectors. It also allows for easier manipulation and transformation of the matrix, which can be useful in applications.

Can the diagonalization of invertible matrices in Z_p be used in real-world applications?

Yes, the diagonalization of invertible matrices in Z_p has many real-world applications, particularly in fields such as cryptography and coding theory. It can also be used in linear algebra and other areas of mathematics to simplify calculations and solve problems.
