Left/Right Multiplication Maps on Algebras .... Bresar

In summary, this thread concerns the proof of Lemma 1.24 in Matej Bresar's book "Introduction to Noncommutative Algebra". The questions ask how certain expressions and equations in the proof were derived, with clarifications of notation and definitions; the book's first two pages on multiplication algebras are provided for context.
  • #1
Math Amateur
I am reading Matej Bresar's book, "Introduction to Noncommutative Algebra" and am currently focussed on Chapter 1: Finite Dimensional Division Algebras ... ...

I need help with the proof of Lemma 1.24 ...

Lemma 1.24 reads as follows:
[Image: Bresar - Lemma 1.24]


My questions regarding the proof of Lemma 1.24 are as follows.

Question 1

In the above proof by Bresar, we read:

" ... ... Since ##A## is simple, the ideal generated by ##b_n## is equal to ##A##.

That is, ##\sum_{ j = 1 }^m w_j b_n z_j = 1## for some ##w_j , z_j \in A##. ... "

My question is: how/why does the fact that the ideal generated by ##b_n## is equal to ##A## imply that ##\sum_{ j = 1 }^m w_j b_n z_j = 1## for some ##w_j , z_j \in A## ...?

Question 2

In the above proof by Bresar, we read:

" ... ##0 = \sum_{ j = 1 }^m R_{ z_j } \left( \sum_{ i = 1 }^n L_{ a_i } R_{ b_i } \right) R_{ w_j } = \sum_{ i = 1 }^n L_{ a_i } \left( \sum_{ j = 1 }^m R_{ w_j b_i z_j } \right) = \sum_{ i = 1 }^n L_{ a_i } R_{ c_i }## ... "
My questions are

(a) Can someone help me to understand how ##\sum_{ j = 1 }^m R_{ z_j } \left( \sum_{ i = 1 }^n L_{ a_i } R_{ b_i } \right) R_{ w_j } = \sum_{ i = 1 }^n L_{ a_i } \left( \sum_{ j = 1 }^m R_{ w_j b_i z_j } \right)## ...?

(b) Can someone help me to understand how ##\sum_{ i = 1 }^n L_{ a_i } \left( \sum_{ j = 1 }^m R_{ w_j b_i z_j } \right) = \sum_{ i = 1 }^n L_{ a_i } R_{ c_i }## ...?

Help will be appreciated ...

Peter

=========================================================================

*** NOTE *** So that readers of the above post will be able to understand the context and notation of the post, I am providing Bresar's first two pages on Multiplication Algebras, as follows:
[Image: Bresar - Section 1.5 Multiplication Algebra - PART 1]

[Image: Bresar - Section 1.5 Multiplication Algebra - PART 2]
 

  • #2
Q1:
What does it mean to be an ideal ##I## of ##A##?
For a (two-sided!) ideal it has to hold that ##A\cdot I \subseteq I## and ##I\cdot A \subseteq I##. Since ##b_n \in I##, all left and right multiples of ##b_n## must also lie in ##I##. So all elements of the form ##w_jb_nz_j## are, together with ##b_n##, elements of ##I##.
Furthermore, an ideal is closed under addition, so any sum of elements of ##I## is again in ##I##, in particular any sum ##\sum_{j=1}^{m} w_jb_nz_j##. These finite sums are in fact the most general form of an element of the ideal ##I = \langle b_n \rangle## generated by ##b_n##.
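Putting this together, the remaining step can be displayed as follows (a sketch; the notation ##\langle b_n \rangle## for the generated ideal is mine, not Bresar's):

```latex
% The finite sums \sum_j w_j b_n z_j form a two-sided ideal of A containing
% b_n = 1 \cdot b_n \cdot 1, and b_n \neq 0, so simplicity of A forces this
% ideal to be all of A; in particular it contains 1.
\[
\langle b_n \rangle
  = \Bigl\{\, \sum_{j=1}^{m} w_j b_n z_j \;\Bigm|\; m \ge 1,\ w_j, z_j \in A \,\Bigr\}
  = A
\quad\Longrightarrow\quad
1 = \sum_{j=1}^{m} w_j b_n z_j \ \text{ for some } w_j, z_j \in A .
\]
```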

Q2:
We have ##\sum_{i=1}^{n} L_{a_i}R_{b_i} = 0 \; (^*) \;## by assumption.
Then let us define ##c_i := \sum_{j=1}^{m} w_jb_iz_j \; (^{**}) \;##, simply as an abbreviation for these sums ##c_1, \ldots , c_n##.
Because all sums are finite, we need not worry about the order of summation.
At last let us assume we have an arbitrary element ##x \in A##.

Now calculate ##\left( \sum_{j=1}^{m} R_{z_j} \left( \sum_{i=1}^{n} L_{a_i}R_{b_i} \right) R_{w_j} \right)(x)## by using ##(^*)## and ##(^{**})##.
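For what it is worth, here is one way the suggested computation can be written out (a sketch using only the definitions ##L_a(x) = ax##, ##R_a(x) = xa## and the abbreviations ##(^*)##, ##(^{**})## above):

```latex
\[
0 \;\overset{(^*)}{=}\;
\Bigl( \sum_{j=1}^{m} R_{z_j} \Bigl( \sum_{i=1}^{n} L_{a_i} R_{b_i} \Bigr) R_{w_j} \Bigr)(x)
= \sum_{j=1}^{m} \sum_{i=1}^{n} \bigl( a_i \, (x w_j) \, b_i \bigr) z_j
= \sum_{i=1}^{n} a_i \, x \Bigl( \sum_{j=1}^{m} w_j b_i z_j \Bigr)
\;\overset{(^{**})}{=}\; \sum_{i=1}^{n} a_i \, x \, c_i
= \Bigl( \sum_{i=1}^{n} L_{a_i} R_{c_i} \Bigr)(x).
\]
% The middle equality uses associativity to regroup a_i (x w_j) b_i z_j
% as a_i x (w_j b_i z_j); since x \in A was arbitrary, this shows
% \sum_i L_{a_i} R_{c_i} = 0 as an operator on A.
```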
 
  • #3
Thanks again for your assistance, fresh_42 ...

Most helpful ...

Peter
 

FAQ: Left/Right Multiplication Maps on Algebras .... Bresar

What is a multiplication map on an algebra?

In this setting, the multiplication maps attached to an element ##a## of an algebra ##A## are linear operators on ##A## built from the algebra's product: left multiplication sends ##x## to ##ax## and right multiplication sends ##x## to ##xa##. The subalgebra of linear operators on ##A## generated by all of these is called the multiplication algebra of ##A##.

What is the purpose of left/right multiplication maps on algebras?

Left/right multiplication maps let one study the multiplication of an algebra ##A## by means of linear operators on ##A##: statements about products of elements become statements about the operators ##L_a## and ##R_a##, where tools from linear algebra apply. They are defined so as to be compatible with the algebra's other properties, such as associativity and distributivity.

How do you define a left/right multiplication map on an algebra?

For a fixed element ##a## of the algebra ##A##, the left multiplication map is ##L_a : A \to A##, ##L_a(x) = ax##, and the right multiplication map is ##R_a : A \to A##, ##R_a(x) = xa##. Both are linear maps on ##A##.
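As a concrete illustration (a sketch, not from Bresar's text; the helper names `matmul`, `L` and `R` are mine), here are these maps on the algebra of 2x2 integer matrices:

```python
# A minimal sketch of left/right multiplication maps on the algebra of
# 2x2 matrices (entries as nested lists); matmul, L and R are hypothetical
# helper names, not from the book.

def matmul(x, y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def L(a):
    """Left multiplication map L_a : x -> a*x."""
    return lambda x: matmul(a, x)

def R(a):
    """Right multiplication map R_a : x -> x*a."""
    return lambda x: matmul(x, a)

a = [[1, 2], [0, 1]]
x = [[3, 0], [1, 4]]

print(L(a)(x))  # a*x = [[5, 8], [1, 4]]
print(R(a)(x))  # x*a = [[3, 6], [1, 6]]
```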

What are some properties of left/right multiplication maps on algebras?

Both maps are additive in the subscript: ##L_{a+b} = L_a + L_b## and ##R_{a+b} = R_a + R_b##, and each ##L_a## and ##R_a## is a linear map on ##A##. Composition follows the product of the algebra: ##L_{ab} = L_a L_b##, while ##R_{ab} = R_b R_a## (note the reversed order). Associativity of ##A## is expressed by the fact that every left multiplication commutes with every right multiplication: ##L_a R_b = R_b L_a##. If ##A## is unital with identity ##1##, then ##L_1 = R_1## is the identity map on ##A##.
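These identities can be sanity-checked numerically (a sketch on 2x2 integer matrices; `matmul`, `L` and `R` are hypothetical helper names, not library functions):

```python
# Check L_{ab} = L_a∘L_b, R_{ab} = R_b∘R_a and L_a R_b = R_b L_a on
# 2x2 integer matrices; all three follow from associativity of the product.

def matmul(x, y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def L(a):
    return lambda x: matmul(a, x)

def R(a):
    return lambda x: matmul(x, a)

a = [[1, 2], [3, 4]]
b = [[0, 1], [5, 2]]
x = [[2, 7], [1, 1]]

ab = matmul(a, b)
assert L(ab)(x) == L(a)(L(b)(x))        # L_{ab} = L_a ∘ L_b
assert R(ab)(x) == R(b)(R(a)(x))        # R_{ab} = R_b ∘ R_a (order reverses)
assert L(a)(R(b)(x)) == R(b)(L(a)(x))   # L_a commutes with R_b
print("all identities hold")
```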

How are left/right multiplication maps used in algebraic structures?

Left/right multiplication maps are used to translate questions about an algebra into questions about linear operators on it. They are used to prove properties and theorems about the algebra (Lemma 1.24 above is an example) and to construct further algebraic structures, such as the multiplication algebra generated by all the maps ##L_a## and ##R_b##.
