Tricky Linear Algebra Question. To show that an operator is 'cyclic'.

In summary, the conversation is about a problem in linear algebra, specifically on finding an element of order $p^n-1$ in the general linear group $GL_n(F_p)$ over the field of $p$ elements. The solution involves using the concept of a primitive element in a finite field and constructing a matrix with a certain minimal polynomial to show that such an element exists.
  • #1
caffeinemachine
Hello MHB,
I am stuck at this problem for quite a long time now.

Problem.
Let $F_p$ denote the field of $p$ elements, where $p$ is prime. Let $n$ be a positive integer. Let $V$ be the vector space $(F_p)^n$ over the field $F_p$. Let $GL_n(F_p)$ denote the set of all the invertible linear transformations on $V$. Then note that each element $T\in GL_n(F_p)$ is a permutation of the elements in $V$. Show that there is an element $T\in GL_n(F_p)$ such that the cycle decomposition of $T$ has a cycle of length $p^n-1$.

Attempt.
Write $s=p^n=|V|$. Let $T\in\mathcal L(V)$ and $v\in V$. We say that $T$ is good with $v$ if $\{Tv,\ldots,T^{s-1}(v)\}=V\setminus\{0\}$.

Claim 1: If $T\in \mathcal L(V)$ is good with a vector $v\in V$ then $T\in GL_n(F_p)$.
Proof: Let $K=\{v,Tv,\ldots,T^{s-2}(v)\}$. Then $T(K)=\{Tv,\ldots,T^{s-1}v\}=V\setminus\{0\}$ since $T$ is good with $v$. Thus the image of $T$ contains $V\setminus\{0\}$, and since $T(0)=0$ as well, $T$ is surjective and hence invertible.

Claim 2: If $T$ is good with a vector $v\in V$ then $T^kv=v\Rightarrow k\geq s-1$.
Proof: Let $k$ be the smallest positive integer with the property that $T^kv=v$. Now $\{Tv,\ldots,T^{s-1}v\}\subseteq\{v,Tv,\ldots,T^{k-1}v\}$. The LHS has cardinality $s-1$ while the RHS has cardinality at most $k$. Hence $k\geq s-1$ and we are done.

Claim 3: If $T$ is good with a vector $v$ then $T^{s-1}v=v$.
Proof: Since $T$ is good with $v$ we have $v\neq 0$, and $T$ is invertible by Claim 1, so $T^sv\in V\setminus\{0\}$. So let $T^sv=T^iv$ for some $i$ in $\{1,\ldots,s-1\}$. Cancelling, this gives $T^{s-i}v=v$, and from the above claim $s-i\geq s-1$, hence $i=1$, i.e. $T^{s-1}v=v$.

Claim 4: If $T$ is good with $v\in V$ then $T$ is good with each $u\in V\setminus\{0\}$.
Proof: Let $u\in V\setminus\{0\}$. Since $T$ is good with $v$, we have $u=T^iv$ for some $i \in \{1,\ldots,s-1\}$. Let $k$ be the smallest positive integer with $T^ku=u$ (such a $k$ exists since $T$ has finite order). Then $T^{i+k}v=T^{i}v$, and cancelling gives $T^k v=v$, so $k\geq s-1$ by Claim 2. On the other hand, the orbit $\{u,Tu,\ldots,T^{k-1}u\}$ consists of $k$ distinct nonzero vectors, so $k\leq s-1$. Hence $k=s-1$, this orbit is all of $V\setminus\{0\}$, and since $T^{s-1}u=u$ it equals $\{Tu,\ldots,T^{s-1}u\}$. This settles that $T$ is good with $u$.

Claim 5: $T\in \mathcal L(V)$ is good with a vector $v\in V$ if and only if $T$ is invertible and $T^k-I$ is invertible for each $k\in\{1,\ldots,s-2\}$.
Proof: Let $T$ be good with a vector $v\in V$; then $T$ is invertible by Claim 1. Suppose $T^k-I$ is not invertible for some $k\in\{1,\ldots,s-2\}$. Then there is a vector $u\in V\setminus\{0\}$ such that $(T^k-I)u=0$, i.e. $T^ku=u$ with $k<s-1$, so by Claim 2 $T$ is not good with $u$. By Claim 4 this implies that $T$ isn't good with $v$ either, a contradiction, and this direction of the claim is settled. Conversely, suppose $T$ is invertible and $T^k-I$ is invertible for each $k\in\{1,\ldots,s-2\}$, and let $v\in V\setminus\{0\}$. The vectors $Tv,\ldots,T^{s-1}v$ are all nonzero, and no two of them coincide, for $T^iv=T^jv$ with $i<j$ would give $T^{j-i}v=v$ with $j-i\leq s-2$. So they are $s-1$ distinct elements of $V\setminus\{0\}$ and hence exhaust it, i.e. $T$ is good with $v$.

Now coming back to the problem.
The question asks us to establish that there is some $T\in GL_n(F_p)$ such that $T$ is good with some vector $v\in V$ (note that if $T$ is good with $v$ then, by the definition together with Claim 3, $T$ permutes the nonzero vectors of $V$ in a single cycle of length $s-1=p^n-1$). By Claim 5 this is equivalent to showing that there is an invertible $T$ in $\mathcal L(V)$ such that $T^k-I$ is invertible for each $k\in\{1,\ldots,s-2\}$. To do this maybe it will help to use the fact that an operator is invertible if and only if its matrix with respect to any basis has nonzero determinant. I am unable to make progress here. Can anybody help?
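For small parameters this notion of goodness can be tested by brute force. Here is a minimal Python sketch (not part of the argument above; the helper names are just illustrative) that searches all $2\times 2$ matrices over $F_3$ for one that is good with $(1,0)$; by Claim 1 any matrix it finds is automatically invertible.

```python
from itertools import product

p, n = 3, 2                      # small example: V = (F_3)^2, s = 9
s = p ** n

def mat_vec(T, v):
    """Multiply an n x n matrix by a column vector over F_p."""
    return tuple(sum(T[i][j] * v[j] for j in range(n)) % p for i in range(n))

def is_good(T, v):
    """Check whether {Tv, T^2 v, ..., T^(s-1) v} equals the set of nonzero vectors."""
    orbit, w = set(), v
    for _ in range(s - 1):
        w = mat_vec(T, w)
        orbit.add(w)
    nonzero = {u for u in product(range(p), repeat=n) if any(u)}
    return orbit == nonzero

def all_matrices():
    """Yield every n x n matrix over F_p as a tuple of row tuples."""
    for entries in product(range(p), repeat=n * n):
        yield tuple(entries[i * n:(i + 1) * n] for i in range(n))

v0 = (1,) + (0,) * (n - 1)
good_T = next(T for T in all_matrices() if is_good(T, v0))
print(good_T)                    # one matrix that is good with (1, 0)
```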
 
  • #2
This is an interesting problem, and my solution is a bit circuitous (I apologize for that).

I will show that it suffices to exhibit an element \(\displaystyle T \in GL_n(F_p)\), of order \(\displaystyle p^n - 1\).

First of all, it should be clear from your opening remarks that we can consider \(\displaystyle GL_n(F_p)\) as a subgroup of \(\displaystyle S_{p^n - 1}\) (the symmetric group on \(\displaystyle p^n - 1\) letters), so that ANY such element \(\displaystyle \ T\) must be a \(\displaystyle (p^n - 1)\)-cycle.

Now, consider a primitive element \(\displaystyle \xi \in (F_{p^n})^{\ast}\). Let its minimal polynomial (necessarily of degree n) be:

\(\displaystyle m(x) = c_0 + c_1x + \cdots + c_{n-1}x^{n-1} + x^n \in F_p[x]\).

We have the matrix:

\(\displaystyle T = \begin{bmatrix}0&0&\dots&0&-c_0\\1&0&\dots&0&-c_1\\0&1&\dots&0&-c_2\\ \vdots&\vdots&\ddots&\vdots&\vdots\\0&0&\dots&1&-c_{n-1} \end{bmatrix}\)

with \(\displaystyle m(T) = 0\) (see: Companion matrix - Wikipedia, the free encyclopedia).

Since \(\displaystyle \xi\) is an eigenvalue of \(\displaystyle T\), we have for any eigenvector \(\displaystyle v \in (F_{p^n})^n\):

\(\displaystyle T^kv = \xi^kv \neq v\) for \(\displaystyle k = 1,2,\dots,p^n-2\), since \(\displaystyle \xi\) has multiplicative order \(\displaystyle p^n - 1\).

This shows that the order of \(\displaystyle T\) is at least \(\displaystyle p^n - 1\).

Since the order of \(\displaystyle T\) by Lagrange divides \(\displaystyle (p^n - 1)!\), we conclude that:

\(\displaystyle |T| = p^n - 1\)

(In all fairness, this proof just turns the problem into a different one: that the field with \(\displaystyle p^n\) elements has a primitive element, for example see: http://www.math.wm.edu/~vinroot/430S13MultFiniteField.pdf ).
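Before turning to the concrete example below, here is a short Python sketch of the same construction in another small case, assuming (as is standard) that \(\displaystyle x^4 + x + 1\) is primitive over \(\displaystyle F_2\): it builds the companion matrix of that polynomial and checks that its multiplicative order is \(\displaystyle 2^4 - 1 = 15\). The helper names are just illustrative.

```python
p = 2
coeffs = [1, 1, 0, 0]            # c_0, c_1, c_2, c_3 of m(x) = x^4 + x + 1
n = len(coeffs)

def companion(c, p):
    """Companion matrix of x^n + c[n-1] x^(n-1) + ... + c[0] over F_p."""
    n = len(c)
    T = [[0] * n for _ in range(n)]
    for i in range(1, n):
        T[i][i - 1] = 1                      # subdiagonal of 1s
    for i in range(n):
        T[i][n - 1] = (-c[i]) % p            # last column holds -c_i
    return T

def mat_mul(A, B, p):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) % p
             for j in range(len(B[0]))] for i in range(len(A))]

def order(T, p):
    """Multiplicative order of T (assumes T is invertible)."""
    I = [[int(i == j) for j in range(len(T))] for i in range(len(T))]
    M, k = T, 1
    while M != I:
        M = mat_mul(M, T, p)
        k += 1
    return k

T = companion(coeffs, p)
print(order(T, p), p ** n - 1)   # both print 15
```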

As a concrete example: consider \(\displaystyle p = 3, n = 2\). We have:

\(\displaystyle F_9 \cong \Bbb Z_3(u)\), where \(\displaystyle u\) is a root of \(\displaystyle x^2 + 1\) (in other words, \(\displaystyle u\) is a square root of 2). A primitive element for \(\displaystyle \Bbb Z_3(u)\) is \(\displaystyle u+1\):

\(\displaystyle (u+1)^2 = u^2 + 2u + 1 = 2 + 2u + 1 = 2u\),
\(\displaystyle (u+1)^4 = (2u)^2 = 2^2u^2 = u^2 = 2\),
\(\displaystyle (u+1)^8 = 2^2 = 1\).

The minimal polynomial for \(\displaystyle u+1\) over \(\displaystyle \Bbb Z_3\) is:

\(\displaystyle m(x) = x^2 + x + 2\).
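Indeed, using \(\displaystyle (u+1)^2 = 2u\) from above, one checks directly that

\(\displaystyle m(u+1) = (u+1)^2 + (u+1) + 2 = 2u + u + 1 + 2 = 3u + 3 = 0\) in \(\displaystyle \Bbb Z_3(u)\).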

The companion matrix in this case is:

\(\displaystyle T = \begin{bmatrix}0&1\\1&2 \end{bmatrix}\), and indeed:

\(\displaystyle T^2 + T + 2I = \begin{bmatrix}0&1\\1&2 \end{bmatrix}^2 + \begin{bmatrix}0&1\\1&2 \end{bmatrix} + \begin{bmatrix}2&0\\0&2 \end{bmatrix}\)

\(\displaystyle = \begin{bmatrix}1&2\\2&2 \end{bmatrix} + \begin{bmatrix}0&1\\1&2 \end{bmatrix} + \begin{bmatrix}2&0\\0&2 \end{bmatrix} = \begin{bmatrix}0&0\\0&0 \end{bmatrix}\).

Identifying \(\displaystyle u+1\) with the vector \(\displaystyle (1,1)\) (in the basis: \(\displaystyle \{1,u\}\)), we see that:

\(\displaystyle T(1,1) = (1,0)\)
\(\displaystyle T(1,0) = (0,1)\)
\(\displaystyle T(0,1) = (1,2)\)
\(\displaystyle T(1,2) = (2,2)\)
\(\displaystyle T(2,2) = (2,0)\)
\(\displaystyle T(2,0) = (0,2)\)
\(\displaystyle T(0,2) = (2,1)\)
\(\displaystyle T(2,1) = (1,1)\)

So listing the elements of \(\displaystyle (F_9)^{\ast}\) as \(\displaystyle \{(1,0),(2,0),(0,1),(1,1),(2,1),(0,2),(1,2),(2,2) \}\) we may identify \(\displaystyle T\) with the 8-cycle:

\(\displaystyle (1\ 3\ 7\ 8\ 2\ 6\ 5\ 4) \in S_8\)

and \(\displaystyle \langle T \rangle\) is clearly a cyclic subgroup of order 8 in \(\displaystyle GL(2,3)\).
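This cycle is easy to verify mechanically; the following few lines of Python (added here only as a check, not part of the argument) iterate \(\displaystyle T\) on \(\displaystyle (1,1)\) over \(\displaystyle F_3\) and print the orbit.

```python
p = 3
T = [[0, 1], [1, 2]]

def apply(T, v):
    """Compute T v over F_p."""
    return tuple(sum(T[i][j] * v[j] for j in range(2)) % p for i in range(2))

v = (1, 1)
cycle, w = [v], apply(T, v)
while w != v:                    # follow the orbit until it returns to (1,1)
    cycle.append(w)
    w = apply(T, w)
print(cycle)
# [(1, 1), (1, 0), (0, 1), (1, 2), (2, 2), (2, 0), (0, 2), (2, 1)]
```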
 
  • #3
I had to rush off while composing this, and reflecting upon it during my shopping, I realized there is an error.

For some reason, I had thought \(\displaystyle p^n - 1\) was prime, which is clearly not so, so it was a mistake to invoke Lagrange here: that

\(\displaystyle |T|\) divides \(\displaystyle (p^n - 1)!\) does not imply \(\displaystyle |T| \leq p^n - 1\).

Fortunately, this is easily remedied:

Consider the algebra \(\displaystyle F_p[T]\). Since \(\displaystyle T\) satisfies a polynomial of degree n, \(\displaystyle F_p[T]\) is spanned by \(\displaystyle I, T, \dots, T^{n-1}\), so it has at most \(\displaystyle p^n\) elements, of which at most \(\displaystyle p^n - 1\) are non-zero. The distinct powers of \(\displaystyle T\) are invertible, hence non-zero, elements of \(\displaystyle F_p[T]\), and since \(\displaystyle T\) lies in the finite group \(\displaystyle GL_n(F_p)\) it has finite order. We conclude that the order of \(\displaystyle T\) can be at most \(\displaystyle p^n - 1\), and therefore that \(\displaystyle |T| = p^n - 1\), as before.
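As a numerical illustration of this counting (a small sketch using the \(\displaystyle 2\times 2\) example over \(\displaystyle F_3\) from the previous post; the helper names are just illustrative): \(\displaystyle F_3[T]\) is spanned by \(\displaystyle I\) and \(\displaystyle T\), so it has \(\displaystyle 9\) elements, \(\displaystyle 8\) of them nonzero, and the \(\displaystyle 8\) distinct powers of \(\displaystyle T\) all lie among them.

```python
p = 3
T = ((0, 1), (1, 2))
I = ((1, 0), (0, 1))

def mat_mul(A, B):
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p
                       for j in range(2)) for i in range(2))

def lin_comb(a, A, b, B):
    """Return a*A + b*B over F_p."""
    return tuple(tuple((a * A[i][j] + b * B[i][j]) % p for j in range(2))
                 for i in range(2))

algebra = {lin_comb(a, I, b, T) for a in range(p) for b in range(p)}
powers, M = set(), I
for _ in range(p ** 2):          # more than enough iterations
    powers.add(M)
    M = mat_mul(M, T)

print(len(algebra))              # 9: all of F_3[T], including the zero matrix
print(powers <= algebra)         # True: every power of T lies in F_3[T]
print(len(powers))               # 8 = p^n - 1, the order of T
```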
 
  • #4
I think you are not working in $(F_{p})^n$ as a vector space over $F_p$, but rather in $F_{p^n}$ as a vector space over $F_p$. Showing the existence of a linear operator $T:F_{p^n}\to F_{p^n}$ which is good with a vector in $F_{p^n}$ (see my original post for the definition of goodness) suffices.
For this solution to make sense it must be true that $F_p$ is embedded in $F_{p^n}$. I had not known this fact, since I have not studied finite fields explicitly yet, but I confirmed that it is indeed true. If you have an easy proof of this then please post it. Meanwhile I'll try to prove it myself.
If I am wrong here then please stop me right away, because then I have clearly misunderstood your solution.

Deveno said:
This is an interesting problem, and my solution is a bit circuitous (I apologize for that).

I will show that it suffices to exhibit an element \(\displaystyle T \in GL_n(F_p)\), of order \(\displaystyle p^n - 1\).

First of all, it should be clear from your opening remarks that we can consider \(\displaystyle GL_n(F_p)\) as a subgroup of \(\displaystyle S_{p^n - 1}\) (the symmetric group on \(\displaystyle p^n - 1\) letters), so that ANY such element \(\displaystyle \ T\) must be a \(\displaystyle (p^n - 1)\)-cycle.

Now, consider a primitive element \(\displaystyle \xi \in (F_{p^n})^{\ast}\). Let its minimal polynomial (necessarily of degree n) be:

\(\displaystyle m(x) = c_0 + c_1x + \cdots + c_{n-1}x^{n-1} + x^n \in F_p[x]\).
So here you are using the fact that $F_p$ is embedded in $F_{p^n}$ right?

Deveno said:
We have the matrix:

\(\displaystyle T = \begin{bmatrix}0&0&\dots&0&-c_0\\1&0&\dots&0&-c_1\\0&1&\dots&0&-c_2\\ \vdots&\vdots&\ddots&\vdots&\vdots\\0&0&\dots&1&-c_{n-1} \end{bmatrix}\)

with \(\displaystyle m(T) = 0\) (see: Companion matrix - Wikipedia, the free encyclopedia).
Okay...

Deveno said:
Since \(\displaystyle \xi\) is an eigenvalue of \(\displaystyle T\)...
An eigenvalue of any linear operator must come from the ground field of the vector space, doesn't it? Here our ground field is $F_p$. I don't understand how an element of $F_{p^n}$ can be an eigenvalue.

Deveno said:
... we have for any eigenvector \(\displaystyle v \in (F_{p^n})^n\):
I think you mean $v\in F_{p^n}$ and not $(F_{p^n})^n$. Am I right?

Deveno said:
\(\displaystyle T^kv = \xi^kv \neq v\) for \(\displaystyle k = 1,2,\dots,p^n-2\), since \(\displaystyle \xi\) has multiplicative order \(\displaystyle p^n - 1\).

This shows that the order of \(\displaystyle T\) is at least \(\displaystyle p^n - 1\).

Since the order of \(\displaystyle T\) by Lagrange divides \(\displaystyle (p^n - 1)!\), we conclude that:

\(\displaystyle |T| = p^n - 1\)
I don't understand this because of the doubt I had above about $\xi$ being an eigenvalue and also a little bit unsure about $v$.

Deveno said:
(In all fairness, this proof just turns the problem into a different one: that the field with \(\displaystyle p^n\) elements has a primitive element, for example see: http://www.math.wm.edu/~vinroot/430S13MultFiniteField.pdf ).
Yes. I am comfortable with the cyclic nature of the group of units of any finite field.

Deveno said:
As a concrete example: consider \(\displaystyle p = 3, n = 2\). We have:

\(\displaystyle F_9 \cong \Bbb Z_3(u)\), where \(\displaystyle u\) is a root of \(\displaystyle x^2 + 1\) (in other words, \(\displaystyle u\) is a square root of 2). A primitive element for \(\displaystyle \Bbb Z_3(u)\) is \(\displaystyle u+1\):

\(\displaystyle (u+1)^2 = u^2 + 2u + 1 = 2 + 2u + 1 = 2u\),
\(\displaystyle (u+1)^4 = (2u)^2 = 2^2u^2 = u^2 = 2\),
\(\displaystyle (u+1)^8 = 2^2 = 1\).

The minimal polynomial for \(\displaystyle u+1\) over \(\displaystyle \Bbb Z_3\) is:

\(\displaystyle m(x) = x^2 + x + 2\).

The companion matrix in this case is:

\(\displaystyle T = \begin{bmatrix}0&1\\1&2 \end{bmatrix}\), and indeed:

\(\displaystyle T^2 + T + 2I = \begin{bmatrix}0&1\\1&2 \end{bmatrix}^2 + \begin{bmatrix}0&1\\1&2 \end{bmatrix} + \begin{bmatrix}2&0\\0&2 \end{bmatrix}\)

\(\displaystyle = \begin{bmatrix}1&2\\2&2 \end{bmatrix} + \begin{bmatrix}0&1\\1&2 \end{bmatrix} + \begin{bmatrix}2&0\\0&2 \end{bmatrix} = \begin{bmatrix}0&0\\0&0 \end{bmatrix}\).

Identifying \(\displaystyle u+1\) with the vector \(\displaystyle (1,1)\) (in the basis: \(\displaystyle \{1,u\}\)), we see that:

\(\displaystyle T(1,1) = (1,0)\)
\(\displaystyle T(1,0) = (0,1)\)
\(\displaystyle T(0,1) = (1,2)\)
\(\displaystyle T(1,2) = (2,2)\)
\(\displaystyle T(2,2) = (2,0)\)
\(\displaystyle T(2,0) = (0,2)\)
\(\displaystyle T(0,2) = (2,1)\)
\(\displaystyle T(2,1) = (1,1)\)

So listing the elements of \(\displaystyle (F_9)^{\ast}\) as \(\displaystyle \{(1,0),(2,0),(0,1),(1,1),(2,1),(0,2),(1,2),(2,2) \}\) we may identify \(\displaystyle T\) with the 8-cycle:

\(\displaystyle (1\ 3\ 7\ 8\ 2\ 6\ 5\ 4) \in S_8\)

and \(\displaystyle \langle T \rangle\) is clearly a cyclic subgroup of order 8 in \(\displaystyle GL(2,3)\).
I am trying to understand this. I'll get back in some time.

Thanks.
 
  • #5
Hi,
I think once you know a little about finite fields, this is an easy problem.
Let $GF(p^n)$ be the Galois field with $p^n$ elements. Then $GF(p^n)$ is a vector space $V$ of dimension $n$ over the field $GF(p)$. So $GL(n,K)$ with $K=GF(p)$ is your linear group. Furthermore, the multiplicative group of $GF(p^n)$ is cyclic with generator, say, $a$. The order of $a$ in this multiplicative group is then $p^n-1$. Define the linear transformation $T$ of $V$ by $T(v)=av$ for all $v$ in $V$. Then clearly, when $T$ is viewed as a permutation on $V$, $T$ is the product of a $1$-cycle (fixing $0$) and a $(p^n-1)$-cycle.
I took this solution directly from B. Huppert, Endliche Gruppen I, p. 187.
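As a concrete check of this construction for $p=3$, $n=2$, here is a small Python sketch (not from Huppert; it reuses the generator $a=1+u$ of $GF(9)^\ast$ exhibited earlier in the thread, with $GF(9)$ realized as $F_3[x]/(x^2+1)$).

```python
p, n = 3, 2

def field_mul(f, g):
    """Multiply in F_3[x]/(x^2 + 1); elements are pairs (a_0, a_1), with u^2 = 2 = -1."""
    return ((f[0] * g[0] - f[1] * g[1]) % p,
            (f[0] * g[1] + f[1] * g[0]) % p)

a = (1, 1)                       # the element 1 + u, a generator of GF(9)*

def T(v):
    """The linear map T(v) = a v."""
    return field_mul(a, v)

one = (1, 0)
orbit, w = [one], T(one)
while w != one:
    orbit.append(w)
    w = T(w)

print(len(orbit), p ** n - 1)    # 8 8: a single (p^n - 1)-cycle on GF(9)*
print(T((0, 0)))                 # (0, 0): zero is fixed, giving the 1-cycle
```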
 
  • #6
johng said:
Hi,
I think once you know a little about finite fields, this is an easy problem.
Let $GF(p^n)$ be the Galois field with $p^n$ elements. Then $GF(p^n)$ is a vector space $V$ of dimension $n$ over the field $GF(p)$. So $GL(n,K)$ with $K=GF(p)$ is your linear group. Furthermore, the multiplicative group of $GF(p^n)$ is cyclic with generator, say, $a$. The order of $a$ in this multiplicative group is then $p^n-1$. Define the linear transformation $T$ of $V$ by $T(v)=av$ for all $v$ in $V$. Then clearly, when $T$ is viewed as a permutation on $V$, $T$ is the product of a $1$-cycle (fixing $0$) and a $(p^n-1)$-cycle.
I took this solution directly from B. Huppert, Endliche Gruppen I, p. 187.
I understand this solution. Thanks.

Can you look into my approach and see if you can make it work?

It's hard for me to abandon my approach because I had made good progress with it.
 
  • #7
caffeinemachine said:
I think you are not working in $(F_{p})^n$ as a vector space over $F_p$, but rather in $F_{p^n}$ as a vector space over $F_p$. Showing the existence of a linear operator $T:F_{p^n}\to F_{p^n}$ which is good with a vector in $F_{p^n}$ (see my original post for the definition of goodness) suffices.

You have a good eye...In fact, when working with eigenvalues and eigenvectors, it is often necessary to enlarge the base field...for example the matrix:

\(\displaystyle \begin{bmatrix}\cos \theta&-\sin\theta \\ \sin\theta&\cos\theta \end{bmatrix}\)

has no real eigenvectors (except when \(\displaystyle \theta = 0\) or \(\displaystyle \pi\)), but it DOES have complex eigenvectors.
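For instance, the characteristic polynomial of this matrix is

\(\displaystyle \det\begin{bmatrix}\cos\theta - \lambda&-\sin\theta \\ \sin\theta&\cos\theta - \lambda \end{bmatrix} = \lambda^2 - 2\cos\theta\,\lambda + 1\),

with roots \(\displaystyle \lambda = \cos\theta \pm i\sin\theta = e^{\pm i\theta}\), which are non-real unless \(\displaystyle \sin\theta = 0\), i.e. unless \(\displaystyle \theta = 0\) or \(\displaystyle \pi\).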

caffeinemachine said:
For this solution to make sense it must be true that $F_p$ is embedded in $F_{p^n}$. I had not known this fact, since I have not studied finite fields explicitly yet, but I confirmed that it is indeed true. If you have an easy proof of this then please post it. Meanwhile I'll try to prove it myself.
If I am wrong here then please stop me right away, because then I have clearly misunderstood your solution.

\(\displaystyle F_{p^n}\) can be realized as the unique (up to isomorphism) splitting field of the polynomial \(\displaystyle x^{p^n} - x \in F_p[x]\); in particular, it contains (a copy of) \(\displaystyle F_p\) as a subfield.
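A quick numerical sanity check of this for \(\displaystyle p = 3,\ n = 2\) (a sketch only, with \(\displaystyle F_9\) realized as \(\displaystyle F_3[x]/(x^2+1)\) as in the earlier example): every element satisfies \(\displaystyle a^9 = a\), and the elements satisfying \(\displaystyle a^3 = a\) are exactly the embedded copy of \(\displaystyle F_3\).

```python
from itertools import product

p = 3

def mul(f, g):
    """Multiply in F_3[x]/(x^2 + 1); elements are pairs (a_0, a_1)."""
    return ((f[0] * g[0] - f[1] * g[1]) % p,
            (f[0] * g[1] + f[1] * g[0]) % p)

def power(f, k):
    """Compute f^k by repeated multiplication."""
    r = (1, 0)
    for _ in range(k):
        r = mul(r, f)
    return r

elements = list(product(range(p), repeat=2))
print(all(power(a, 9) == a for a in elements))          # True: x^9 - x splits
print(sorted(a for a in elements if power(a, 3) == a))  # [(0,0), (1,0), (2,0)]
```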
caffeinemachine said:
So here you are using the fact that $F_p$ is embedded in $F_{p^n}$ right?

Okay...

An eigenvalue of any linear operator must come from the ground field of the vector space, doesn't it? Here our ground field is $F_p$. I don't understand how an element of $F_{p^n}$ can be an eigenvalue.

No, this is not true (see my example above). I used the larger field to show that if \(\displaystyle T^k \neq I\) when \(\displaystyle T\) is regarded as a matrix over the larger field, then certainly \(\displaystyle T^k \neq I\) over the base field, so \(\displaystyle T\) does not have order \(\displaystyle k\) there either.
caffeinemachine said:
I think you mean $v\in F_{p^n}$ and not $(F_{p^n})^n$. Am I right?

No, I meant in the larger field...the eigenvalues in this case don't exist in the smaller field (because of the irreducibility of m(x)).
caffeinemachine said:
I don't understand this because of the doubt I had above about $\xi$ being an eigenvalue and also a little bit unsure about $v$.

One can prove that the roots of m(x) are the eigenvalues of the companion matrix using induction on n; for example, see here:

homework - characteristic polynomial of companion matrix - Mathematics Stack Exchange

caffeinemachine said:
Yes. I am comfortable with the cyclic nature of the group of units of any finite field.

I am trying to understand this. I'll get back in some time.

Thanks.

I would like to mention here that johng's comment is also dead on the money...what I did is show how one can specifically pull this back "to matrix form". The key point is that multiplication in an extension field \(\displaystyle E\) over a base field \(\displaystyle F\) is \(\displaystyle F\)-linear, due to the field axioms. One generally has this result for any extension ring over a central sub-ring (a fact that is put to good use in studying algebras).
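Concretely, writing \(\displaystyle \mu_a\) for the map \(\displaystyle x \mapsto ax\) (with \(\displaystyle a \in E\) fixed), the field axioms give

\(\displaystyle \mu_a(x+y) = ax + ay = \mu_a(x) + \mu_a(y)\) and \(\displaystyle \mu_a(cx) = a(cx) = c(ax) = c\,\mu_a(x)\) for all \(\displaystyle c \in F\),

which is precisely \(\displaystyle F\)-linearity.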

The interplay of linear algebra, polynomials, and fields in the finite case is almost magical to me. The finite general linear groups are fascinating, and somewhat mysterious.
 

Related to Tricky Linear Algebra Question. To show that an operator is 'cyclic'.

1. How can I prove that an operator is cyclic?

To prove that an operator T is cyclic, you must show that there exists a vector v in the vector space whose iterates v, Tv, T^2 v, ... span the whole space.

2. What is the significance of an operator being cyclic?

An operator being cyclic means that the iterates of a single vector under repeated application of the operator span the whole vector space. This can make solving linear algebra problems easier, since in the basis of those iterates the operator is represented by a companion matrix.

3. Can an operator be cyclic in any vector space?

Cyclic operators exist on any finite-dimensional vector space (a companion matrix gives one), but not every operator is cyclic: the identity operator on a space of dimension at least two has no cyclic vector. In infinite-dimensional vector spaces the question of which operators are cyclic is more delicate.

4. How does one go about finding a cyclic vector for an operator?

A standard criterion is that an operator has a cyclic vector if and only if its minimal polynomial equals its characteristic polynomial, so comparing these two polynomials (the Cayley-Hamilton theorem guarantees the minimal polynomial divides the characteristic one) is the usual way to decide whether a cyclic vector exists; in small examples one can then find a cyclic vector by direct search.

5. Are there any practical applications for understanding cyclic operators?

Yes, understanding cyclic operators can be useful in many areas, such as signal processing, control theory, and quantum mechanics. It can also simplify calculations in solving differential equations and understanding the behavior of linear systems.
