I am reading An Introduction to Rings and Modules With K-Theory in View by A.J. Berrick and M.E. Keating (B&K).
I need help with Exercise 1.2.8 (a) (Chapter 1: Basics, page 30) concerning \(\displaystyle K^n\) as a \(\displaystyle K[T]\)-module ... ...
First, so that MHB readers will understand the relevant notation for the construction of \(\displaystyle K^n\) as a \(\displaystyle K[T]\)-module, I am presenting the relevant text from B&K as follows: https://www.physicsforums.com/attachments/2997
Exercise 1.2.8 (a) (page 30) reads as follows: https://www.physicsforums.com/attachments/2998
https://www.physicsforums.com/attachments/2999

So we are given that:
\(\displaystyle A\) is an \(\displaystyle n \times n\) matrix over a field \(\displaystyle K\)
\(\displaystyle U\) is a subspace of \(\displaystyle K^n\)
\(\displaystyle M\) is the \(\displaystyle K[T]\)-module obtained from the vector space of column vectors \(\displaystyle K^n\)
So we regard \(\displaystyle M=K^n\) as a right module over \(\displaystyle K[T]\)
Addition in \(\displaystyle M=K^n\) is the usual column vector addition \(\displaystyle x+y\), where \(\displaystyle x, y \in K^n\), making \(\displaystyle M=K^n\) an abelian group, as required
Given an \(\displaystyle n \times n\) matrix \(\displaystyle A\) over \(\displaystyle K\), a right action of \(\displaystyle f(T) \in K[T]\) on \(\displaystyle M\) is defined as follows:
\(\displaystyle xf(T) = xf_0 + Axf_1 + \cdots + A^rxf_r\)
where
\(\displaystyle f(T) = f_0 + f_1T + \cdots + f_rT^r\)
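Just to check that I am reading this definition correctly (the following small computation is my own, not from B&K): taking \(\displaystyle f(T) = T\), so that \(\displaystyle f_0 = 0\) and \(\displaystyle f_1 = 1\), gives
\(\displaystyle xT = x \cdot 0 + Ax \cdot 1 = Ax\)
so the action of \(\displaystyle T\) itself is just multiplication by the matrix \(\displaystyle A\), and similarly \(\displaystyle xT^k = A^kx\).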
We are required to show that:
------------------------------------------------
... a subspace \(\displaystyle U\) of \(\displaystyle K^n\) is a submodule \(\displaystyle L\) of \(\displaystyle M\) ...
if and only if
... \(\displaystyle AU \subseteq U\)
------------------------------------------------
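(So, as I understand it, here \(\displaystyle AU\) denotes the set \(\displaystyle \{ Au \ | \ u \in U \}\), and the condition \(\displaystyle AU \subseteq U\) says that \(\displaystyle U\) is invariant under multiplication by \(\displaystyle A\).)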
Just reviewing the definitions of subspace and submodule in this context ...

\(\displaystyle U\) is a subspace of \(\displaystyle K^n\) if \(\displaystyle U\) is a subset of \(\displaystyle K^n\) such that:
(1) \(\displaystyle 0 \in U\)
(2) \(\displaystyle x, y \in U \Longrightarrow x + y \in U \)
(3) \(\displaystyle \lambda \in K , x \in U \Longrightarrow \lambda x \in U \)

\(\displaystyle L\) is a submodule of \(\displaystyle M = K^n\) if \(\displaystyle L\) is a subset of \(\displaystyle K^n\) such that:
(1) \(\displaystyle 0 \in L\)
(2) \(\displaystyle x,y \in L \Longrightarrow x + y \in L\)
(3) \(\displaystyle x \in L\) and \(\displaystyle f(T) \in K[T] \Longrightarrow xf(T) \in L\)
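Note (my own observation, just to relate the two definitions): a constant polynomial \(\displaystyle f(T) = \lambda\) with \(\displaystyle \lambda \in K\) acts by
\(\displaystyle x\lambda = \lambda x\)
so condition (3) for a submodule, applied to constant polynomials, already gives closure under scalar multiplication ... hence any submodule of \(\displaystyle M\) is automatically a subspace of \(\displaystyle K^n\).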
===========================
Now assume that the subspace \(\displaystyle U\) is a submodule \(\displaystyle L\) of \(\displaystyle M\) ...
Then we have that \(\displaystyle xf(T) \in U\) for all \(\displaystyle x \in U\) and \(\displaystyle f(T) \in K[T] \)
We need to show \(\displaystyle AU \subseteq U\)
So let \(\displaystyle x \in AU\), say \(\displaystyle x = Au\) for some \(\displaystyle u \in U\)
Therefore
\(\displaystyle x = Au = \begin{pmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2n} \\ a_{31} & a_{32} & a_{33} & \cdots & a_{3n} \\ \vdots & \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & a_{n3} & \cdots & a_{nn} \end{pmatrix} \begin{pmatrix} u_1 \\ u_2 \\ u_3 \\ \vdots \\ u_n \end{pmatrix}\)
Therefore
\(\displaystyle x = Au = \begin{pmatrix} a_{11}u_1 + a_{12}u_2 + a_{13}u_3 + \cdots + a_{1n}u_n \\ a_{21}u_1 + a_{22}u_2 + a_{23}u_3 + \cdots + a_{2n}u_n \\ a_{31}u_1 + a_{32}u_2 + a_{33}u_3 + \cdots + a_{3n}u_n \\ \vdots \\ a_{n1}u_1 + a_{n2}u_2 + a_{n3}u_3 + \cdots + a_{nn}u_n \end{pmatrix}\)
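In other words (just writing the product compactly), the \(\displaystyle i\)-th entry of \(\displaystyle x = Au\) is \(\displaystyle \sum_{j=1}^{n} a_{ij}u_j\).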
Thus \(\displaystyle x = Au \in U\) if each term \(\displaystyle a_{ij}u_j \in U\), since the entries of \(\displaystyle Au\) are sums of such terms, and a subspace will contain these sums if the individual summands belong to it ...
But we are given that \(\displaystyle U\) is also a submodule \(\displaystyle L\) ...
Thus \(\displaystyle uf(T) \in L\) for every \(\displaystyle u \in L\) and \(\displaystyle f(T) \in K[T]\) ...
... ... I was going to try to use this to show that each term \(\displaystyle a_{ij}u_j \in U\) ... ... BUT ... I feel I have lost my way ...
Can someone please help ... ...
Peter