How Do You Prove a Matrix Ring is Right Artinian But Not Left Artinian?

  • #1
Math Amateur
I am reading Paul E. Bland's book: Rings and Their Modules and am currently focused on Section 4.2 Noetherian and Artinian Modules ... ...

I need help with fully understanding Example 6 "Right Artinian but not Left Artinian" ... in Section 4.2 ... ...

Example 6 reads as follows:

[Image: Bland, Example 6 "Right Artinian but not Left Artinian", Section 4.2 — see attachments]
My problem is how to prove, explicitly and formally, that the matrix ring

##\begin{pmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & \mathbb{R} \end{pmatrix}##

is right Artinian but not left Artinian.

A related problem is that I am trying to explicitly determine/calculate the form of all the ideals of the above matrix ring ... but without success ... help with this issue would be appreciated as well ...
Reading around this problem leads me to believe that showing the above matrix ring to be right Artinian but not left Artinian involves showing that all descending chains of right ideals terminate ... but that this does not hold for descending chains of left ideals ...
Note that the conditions regarding descending chains of right (left) ideals for right (left) Artinian rings are not given explicitly in Bland's definition of right (left) Artinian rings, but I believe his definition implies them ... is that correct? ... see Bland's definition of Artinian rings and modules below ...
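To be explicit, the chain condition I have in mind is the standard one (my own formulation, not quoted from Bland):

```latex
% A ring R is right Artinian if and only if every descending chain of
% right ideals of R,
A_1 \supseteq A_2 \supseteq A_3 \supseteq \cdots ,
% becomes stationary, i.e. there is an index n with
A_n = A_{n+1} = A_{n+2} = \cdots .
% The left Artinian condition is the same statement for left ideals.
```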
Bland's definition of Artinian rings and modules is as follows:
[Image: Bland, definition of Artinian module and ring, Section 4.2 — see attachments]

Hope someone can help,

Peter
 

Attachments

  • Bland - Example - Section 4.2 ....png
  • Bland - Defn of Artininan Module and Ring - Section 4.2 ....png
  • #2
Multiply with elementary matrices. These are matrices where only one entry is 1 and the others are 0. For example ##\left(\begin{array}{cc}1 & 0\\ 0 & 0\end{array}\right)##.

Try that and then tell me what you got.
 
  • #3
Thanks micromass ... will now work with your idea ...

Peter
 
  • #4
micromass said:
Multiply with elementary matrices. These are matrices where only one entry is 1 and the others are 0. For example ##\left(\begin{array}{cc}1 & 0\\ 0 & 0\end{array}\right)##.

Try that and then tell me what you got.
First ... thanks to micromass for his help ...

After some hints/help from MHB and micromass, I am at the following point ...

We need to examine the right and left ideals of the matrix ring and determine whether every descending chain of right (left) ideals terminates ...

-----------------------------------------------------------------------------------------------------------------------

So ... let ##I## be a left ideal of the ring ##S = \begin{pmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & \mathbb{R} \end{pmatrix}## ... and suppose ##x \in I## and ##r \in S## ... then ...

##\begin{pmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{pmatrix} \begin{pmatrix} x_{11} & x_{12} \\ 0 & x_{22} \end{pmatrix} \ = \begin{pmatrix} r_{11} x_{11} & r_{11} x_{12} + r_{12} x_{22} \\ 0 & r_{22} x_{22} \end{pmatrix}##

Going through the same exercise for a right ideal ... let ##J## be a right ideal of the ring ##S## and suppose ##x \in J## and ##r \in S## ... then ...

##\begin{pmatrix} x_{11} & x_{12} \\ 0 & x_{22} \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{pmatrix} \ = \begin{pmatrix} x_{11} r_{11} & x_{11} r_{12} + x_{12} r_{22} \\ 0 & x_{22} r_{22} \end{pmatrix}##

BUT ... where to from here?

------------------------------------------------------------------------------------------------------------------

I am not quite sure of the exact approach that micromass is suggesting when he says "multiply with elementary matrices" ... BUT ... maybe he is implying that I investigate ideals generated by the appropriate elementary matrices ...

Suppose ##K## is the right ideal generated by ##\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}##

... ... then investigate this ideal ...

An element of ##K## will have the form:

##\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{pmatrix} \ = \ \begin{pmatrix} r_{11} & r_{12} \\ 0 & 0 \end{pmatrix} = k##

Now ... ##K## must be closed under addition and under multiplication on the right by an element of ##S## ...

Let a general element of ##S## be ##s = \begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix}##

then ##ks## must belong to ##K## ... ...

and ##ks = \begin{pmatrix} r_{11} & r_{12} \\ 0 & 0 \end{pmatrix} \begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix} \ = \ \begin{pmatrix} r_{11} s_{11} & r_{11} s_{12} + r_{12} s_{22} \\ 0 & 0 \end{pmatrix}##

So ##ks \in K## ... and ##K## is indeed a right ideal ...

... BUT ... how do we proceed from here? Is the above analysis correct? ... and where is this all going?
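For what it is worth, the computation above already identifies ##K## completely; as a sketch (my reading of where this is headed, not a quote from Bland):

```latex
K \;=\; \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} S
  \;=\; \left\{ \begin{pmatrix} q & r \\ 0 & 0 \end{pmatrix} \,:\, q \in \mathbb{Q},\ r \in \mathbb{R} \right\}
  \;=\; \begin{pmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & 0 \end{pmatrix}.
```

This set is closed under addition, and under right multiplication since ##\begin{pmatrix} q & r \\ 0 & 0 \end{pmatrix} \begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix} = \begin{pmatrix} q s_{11} & q s_{12} + r s_{22} \\ 0 & 0 \end{pmatrix}## has the same form, so ##K## is a right ideal.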
-----------------------------------------------------------------------------------------------------------------------------

ALTERNATIVELY ... again trying to fathom micromass' suggestion ... we could try to use elementary matrices constructively as follows ...
We have the matrix ring ##S = \begin{pmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & \mathbb{R} \end{pmatrix}##

Now ... suppose we have a right ideal ##I## and suppose further that the element ##s = \begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix}## belongs to ##I## ...

... then since ##\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \in S## we have

##\begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \ = \ \begin{pmatrix} s_{11} & 0 \\ 0 & 0 \end{pmatrix}##

Similarly

##\begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \ = \ \begin{pmatrix} 0 & s_{11} \\ 0 & 0 \end{pmatrix}##

and again ...

##\begin{pmatrix} s_{11} & s_{12} \\ 0 & s_{22} \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \ = \ \begin{pmatrix} 0 & s_{12} \\ 0 & s_{22} \end{pmatrix}##

... BUT ... where does this lead us ...
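As a quick sanity check on the three products above, here is a small numeric verification; it is only an illustration in plain Python (nothing from the thread — the sample float values stand in for the ring entries):

```python
# Numeric check of the three elementary-matrix products computed above.
# s is a sample element of S (the top-left slot plays the rational entry).

def matmul(a, b):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s11, s12, s22 = 3.0, 2.5, -1.25
s = [[s11, s12], [0.0, s22]]

e11 = [[1.0, 0.0], [0.0, 0.0]]   # elementary matrix with 1 in position (1,1)
e12 = [[0.0, 1.0], [0.0, 0.0]]   # ... with 1 in position (1,2)
e22 = [[0.0, 0.0], [0.0, 1.0]]   # ... with 1 in position (2,2)

# Right multiplication picks out entries exactly as in the thread:
assert matmul(s, e11) == [[s11, 0.0], [0.0, 0.0]]
assert matmul(s, e12) == [[0.0, s11], [0.0, 0.0]]
assert matmul(s, e22) == [[0.0, s12], [0.0, s22]]
print("all three products check out")
```

Note the asymmetry this exposes: right multiplication can move the ##(1,1)## entry around but can never mix the ##\mathbb{R}## entries back into the ##\mathbb{Q}## slot.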

Can someone please help further ...

Peter
 
  • #5
To show that ##R = \left\{ \begin{bmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & \mathbb{R} \end{bmatrix} \right\}## is right Artinian, we have to show that the right ##R##-module ##R## satisfies the descending chain condition, i.e. that any chain of right ##R##-submodules ##R = M_0 \supseteq M_1 \supseteq M_2 \supseteq \dots## becomes stationary.
I only see the brute-force way here, i.e. running through the cases. We are done if one of our ##M_n## contains an invertible element, and likewise if one of them is zero.

Do you know why?

So otherwise there will be an element ##\mu = \begin{bmatrix} m_1 & m_2 \\ 0 & m_3 \end{bmatrix}## in some ##M_n## with ##m_1 \cdot m_3 = 0## and not all ##m_i = 0##. The question is what happens if we multiply ##\mu## from the right by an element ##\alpha = \begin{bmatrix} a_1 & a_2 \\ 0 & a_3 \end{bmatrix}##, i.e. form ##\mu \cdot \alpha##, and then ##\mu \cdot \alpha \cdot \beta## with ##\beta = \begin{bmatrix} b_1 & b_2 \\ 0 & b_3 \end{bmatrix}##, and so on. Will the chain become stationary?

In order to show that ##R## is not left Artinian, it is enough to find a chain of left submodules of ##R## that does not become stationary, i.e. ##R = M_0 \supset M_1 \supset M_2 \supset \dots## forever.

Can you find such a chain?

Hint: Consider the modules generated by ##\mu_n = \begin{bmatrix} 0 & m_n \\ 0 & 0 \end{bmatrix}## and look at what can be gained if you now multiply only from the left with ##\alpha = \begin{bmatrix} a_1 & a_2 \\ 0 & a_3 \end{bmatrix}##. Remember that ##a_1 \in \mathbb{Q}##. What could you take as the ##m_n##, which you are free to define?

And once you have found a suitable descending chain of left ##R##-submodules of ##R## that does not become stationary, why doesn't the same argument, i.e. the same chain, work with right ##R##-modules? That is, why does the chain terminate if its members are considered as right ##R##-modules?
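For readers who want to see the hint carried out, here is one concrete choice (my own, not fresh_42's; any family of reals linearly independent over ##\mathbb{Q}## works, and below I use the powers of ##\pi##, which are ##\mathbb{Q}##-linearly independent because ##\pi## is transcendental):

```latex
% Q-subspaces of R give left submodules: for n = 0, 1, 2, ... set
V_n = \operatorname{span}_{\mathbb{Q}}\{\pi^{k} : k > n\},
\qquad
M_n = \begin{bmatrix} 0 & V_n \\ 0 & 0 \end{bmatrix}.
% Each M_n is a left ideal, since left multiplication gives
\begin{bmatrix} a_1 & a_2 \\ 0 & a_3 \end{bmatrix}
\begin{bmatrix} 0 & v \\ 0 & 0 \end{bmatrix}
=
\begin{bmatrix} 0 & a_1 v \\ 0 & 0 \end{bmatrix},
\qquad a_1 \in \mathbb{Q}, \ \ a_1 v \in V_n .
% The chain M_0 \supset M_1 \supset M_2 \supset ... never becomes
% stationary, because pi^{n+1} lies in V_n but not in V_{n+1}.
```

As a chain of right ideals this collapses immediately: ##\begin{bmatrix} 0 & v \\ 0 & 0 \end{bmatrix} \begin{bmatrix} a_1 & a_2 \\ 0 & a_3 \end{bmatrix} = \begin{bmatrix} 0 & v a_3 \\ 0 & 0 \end{bmatrix}## with ##a_3## ranging over all of ##\mathbb{R}##, so any single nonzero element of this form already generates the whole of ##\begin{bmatrix} 0 & \mathbb{R} \\ 0 & 0 \end{bmatrix}## as a right ideal.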
 
  • #6
Sorry for late reply, fresh_42 ... been traveling ... but now have secure Internet connection ...

Thanks so much for your help ... really needed some detailed help on this problem!

Will work through your post shortly ...

Thanks again!

Peter
 
  • #7
Thanks again for your help, fresh_42 ...

You wrote:

"... We are done, if any of our ##M_n## contains an invertible element and likewise if it is zero.

Do you know why? ..."

I can see that if any of the ##M_n## is ##\{ 0 \}## then the descending chain becomes stationary, as in ##\{ 0 \} \supseteq \{ 0 \} \supseteq \{ 0 \} \supseteq \dots## ... the chain cannot possibly 'descend' any further.

I am, however, not so sure of the case where ##M_n##, say, contains a unit ...

If a submodule contains a unit then, as is the case for an ideal of a ring,

##M_n = R = \left\{ \begin{bmatrix} \mathbb{Q} & \mathbb{R} \\ 0 & \mathbb{R} \end{bmatrix} \right\}##

and then the only descending chain that is possible from ##M_0## to ##M_n## is

##R \supseteq R \supseteq R \supseteq \dots \supseteq R## (is this your argument ... and indeed, is it correct?)

BUT ... surely we may find a submodule that does not contain a unit ... say ##M_{n+1}## ... and the descending chain may continue on as follows:

##R \supseteq R \supseteq R \supseteq \dots \supseteq R \supseteq M_{n+1} \supseteq M_{n+2} \supseteq \dots##

... BUT ... I think you are arguing that this is impossible and that the chain is stationary as

##R \supseteq R \supseteq R \supseteq \dots \supseteq R## ... but why can it not continue as I have argued?

Can you please clarify ... ...

Peter
 
  • #8
Math Amateur said:
##R \supseteq R \supseteq R \supseteq \dots \supseteq R \supseteq M_{n+1} \supseteq M_{n+2} \supseteq \dots##
Yes, this might happen. But unless somewhere in the chain ##R \supset M_n## is a proper inclusion, there is nothing to prove, because the chain is obviously stationary. Therefore we may assume that some ##M_n## contains no invertible elements. The left part of the chain is simply not of interest: whether we renumber the entire chain when it is ##R = R = R = \dots = R \supset M_n##, or simply start with ##M_n##, makes little difference.

(This argument isn't allowed in general, since submodules need not be ideals. Here, however, all elements are ring elements and our ring has a ##1##, so it is legitimate to speak of invertible elements.)
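To spell out the invertible-element step (a standard argument, stated here as a sketch): if ##u \in M_n## is invertible in ##R##, then since ##M_n## is closed under right multiplication by elements of ##R##,

```latex
r \;=\; u \,\bigl(u^{-1} r\bigr) \;\in\; M_n
\qquad \text{for every } r \in R ,
```

so ##M_n = R##; a term of the chain can be a proper submodule only if it contains no invertible element.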
 
  • #9
Thank you for your help fresh_42 ... appreciate it ...

Peter
 

FAQ: How Do You Prove a Matrix Ring is Right Artinian But Not Left Artinian?

What are Noncommutative Artinian Rings?

Noncommutative Artinian Rings are algebraic structures consisting of a set of elements together with two binary operations, addition and multiplication. They are noncommutative, meaning that the order of the factors in a product can affect the result. A ring is right (left) Artinian if it satisfies the descending chain condition on right (left) ideals: every descending chain of right (left) ideals eventually becomes stationary.

What are the properties of Noncommutative Artinian Rings?

A right (left) Artinian ring satisfies the descending chain condition on its right (left) ideals. The two conditions are genuinely different: as the example discussed in this thread shows, a ring can be right Artinian without being left Artinian. A ring that is both left and right Artinian is simply called Artinian.

How are Noncommutative Artinian Rings different from Commutative Artinian Rings?

The main difference is that in a Noncommutative ring the multiplication operation need not be commutative, so left ideals and right ideals can behave very differently, and right Artinian does not imply left Artinian. In a Commutative Artinian Ring the multiplication is commutative, so left and right ideals coincide and there is only one notion of Artinian.

What are the applications of Noncommutative Artinian Rings in mathematics?

Noncommutative Artinian Rings have several important applications in mathematics, including in the study of group rings, representation theory, and noncommutative algebraic geometry. They also have connections to other areas of mathematics, such as number theory and topology.

How are Noncommutative Artinian Rings relevant in other fields?

Noncommutative Artinian Rings have applications in other fields, such as physics and computer science. In physics, they are used in the study of quantum mechanics and quantum field theory. In computer science, they are used in coding theory and cryptography. Additionally, they have applications in engineering, economics, and other areas of science and technology.
