Is \( R[x,y] \) Isomorphic to \( R[y,x] \)?

  • #1
Math Amateur
I am reading Joseph Rotman's book Advanced Modern Algebra.

I need help with Problem 2.31 on page 100.

Problem 2.31 reads as follows:

--------------------------------------------------------------------

If R is a commutative ring, prove that \(\displaystyle R[x,y] \cong R[y,x] \).

--------------------------------------------------------------------

Can someone please help me get started on this problem.

Peter
 
  • #2
Define $\phi(f(x,y)) = f(y,x)$.

This is a ring homomorphism because $R$ is commutative and $x$ and $y$ commute (it would be an additive group homomorphism in any case, so it's the multiplicative property that matters, here).

Suppose:

$f(x,y) = a_0 + a_1x + b_1y + a_2x^2 + c_1xy + b_2y^2 + \cdots + a_mx^m +\cdots + b_ny^n$

Now if $\phi(f(x,y)) = f(y,x) = 0$, we have:

$a_0 + b_1x + a_1y + b_2x^2 + c_1xy + a_2y^2 + \cdots + b_nx^n + \cdots + a_my^m = 0$

so that:

$a_0 = a_1 = \dots = a_m = b_1 = \dots = b_n = c_1 = \dots = c_k = 0$

(we are using the fact that $\{1,x,y,x^2,xy,y^2,x^3,x^2y,xy^2,y^3,\dots\}$ is a basis for $R[x,y]$ OR $R[y,x]$ considered as $R$-modules. The fact that $x$ and $y$ commute is ESSENTIAL, here).

Hence $\phi$ is injective.

Given $g(y,x) \in R[y,x]$, clearly $g(x,y)$ is in the pre-image under $\phi$, so $\phi$ is surjective.

Alternatively:

$R[x,y] \cong (R[x])[y] \cong (R[x])[z] \cong (R[y])[z] \cong R[y,z] \cong R[y,x]$

There are polynomial rings defined in such a way that $x$ and $y$ do NOT commute (so $xy \neq yx$) over non-commutative rings. These are much more complicated. Be happy.
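The map above can be sanity-checked mechanically; the following is a minimal sketch (mine, not from the thread), assuming sympy is available, taking $R = \mathbb{Z}$ and two arbitrary sample polynomials:

```python
from sympy import symbols, expand

# Deveno's map: phi sends f(x, y) to f(y, x), here realized by
# swapping the symbols through a temporary symbol t.
x, y, t = symbols('x y t')

def phi(f):
    """f(x, y) -> f(y, x)."""
    return f.subs(x, t).subs(y, x).subs(t, y)

f = 1 + 2*x + 3*y + x*y + x**2
g = 5 - x + y**2

assert expand(phi(f + g)) == expand(phi(f) + phi(g))   # additive
assert expand(phi(f * g)) == expand(phi(f) * phi(g))   # multiplicative
assert expand(phi(phi(f))) == expand(f)                # involution
assert phi(f) != f                                     # but not the identity map
```

The involution check is the quickest route to bijectivity: a map that is its own inverse is automatically injective and surjective.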
 
  • #3
Given the confidence and guidance of your post, I decided to try to formulate a rigorous proof of the proposition that \(\displaystyle \phi \) as defined by you is actually a ring homomorphism ... ...

So ... ... for \(\displaystyle \phi \) to be a ring homomorphism we require that:

(1) \(\displaystyle \phi (1) = 1 \)

(2) \(\displaystyle \phi ( (f+g)(x,y) ) = \phi (f(x,y)) + \phi (g(x,y)) \)

(3) \(\displaystyle \phi ( (fg)(x,y) ) = \phi ( f(x,y) ) \phi ( g(x,y) ) \)

Now we proceed as follows:

(1) \(\displaystyle \phi (1) = 1 \) by definition, since when \(\displaystyle f(x,y)\) is any constant polynomial \(\displaystyle c\), there is no \(\displaystyle x\) or \(\displaystyle y\) to 'swap', so \(\displaystyle \phi (c) = c \)

(2) \(\displaystyle \phi ( (f+g)(x,y) ) = (f+g)(y,x) \) by definition of \(\displaystyle \phi \)

\(\displaystyle = f(y,x) + g(y,x) \) by definition of the sum of functions

\(\displaystyle = \phi ( f(x,y)) + \phi ( g(x,y)) \) by definition of \(\displaystyle \phi \)

(3) \(\displaystyle \phi ( (fg)(x,y) ) = (fg)(y,x) \) by definition of \(\displaystyle \phi \)

\(\displaystyle = f(y,x) g(y,x) \) by definition of the product of functions

\(\displaystyle = \phi ( f(x,y)) \phi ( g(x,y)) \) by definition of \(\displaystyle \phi \)

... ... ... ... So ... ... the above seems to be a proof (rigorous, I hope ... ... ) that the function \(\displaystyle \phi \) defined by you is a ring homomorphism ...

However ... ... the question I am now puzzling over is the following:

You have written:

"This is a ring homomorphism because $R$ is commutative and $x$ and $y$ commute ... ... "

BUT ... where exactly in the above proof do we need commutativity of R and commutativity of x and y?

Thinking ... ...

Presumably, commutativity is needed when we actually calculate \(\displaystyle (fg)(y,x) \) and the product \(\displaystyle f(y,x) g(y,x) \) and seek to confirm or validate that they are equal ... ... maybe this cannot be achieved without both the commutativity of \(\displaystyle R\) and of \(\displaystyle x\) and \(\displaystyle y\)?

A related question is the following: how exactly do we define the multiplication of two polynomials in two variables? This would give us the exact form of the product before we begin to manipulate terms via the use of commutativity, so the definition would presumably have a bearing on whether we need commutativity or not. Further, we would have to know whether x and y commute ... but anyway, you indicate this is a matter of definition.

Can someone please clarify the above?

Peter
 
  • #4
Hi Deveno,

I have been further reflecting on your post and now have an additional question ... ...

You write:

"Now if $\phi(f(y,x)) = f(y,x) = 0$, we have:

$a_0 + b_1x + a_1y + b_2x^2 + c_1xy + a_2y^2 + \cdots + b_nx^n + \cdots + a_my^m = 0$

so that:

$a_0 = a_1 = \dots = a_m = b_1 = \dots = b_n = c_1 = \dots = c_k = 0$

(we are using the fact that $\{1,x,y,x^2,xy,y^2,x^3,x^2y,xy^2,y^3,\dots\}$ is a basis for $R[x,y]$ OR $R[y,x]$ considered as $R$-modules. The fact that $x$ and $y$ commute is ESSENTIAL, here)."

I do not see why the reference to a basis, and indeed to modules, is necessary here ... nor do I fully understand your reference to commutativity ... so I really do not understand the issues here ... hope you can help ...

In simple terms it seems to me that if you want

$a_0 + b_1x + a_1y + b_2x^2 + c_1xy + a_2y^2 + \cdots + b_nx^n + \cdots + a_my^m = 0$

then the coefficients must be zero ... simple?

Obviously I am missing something ... so I hope you can help ... ...

Peter
 
  • #5
If you have a basis, then the $R$-linear coefficients are unique for that basis.

This is what (in $R[x]$, for example) allows us to conclude that if:

$a_0 + a_1x + a_2x^2 = 3 + 2x$

that $a_0 = 3, a_1 = 2, a_2 = 0$.

The definition for "two-variable" polynomial multiplication is a little convoluted:

If $\displaystyle f(x,y) = \sum_{i = 0}^m \left(\sum_{j = 0}^i a_{ij}x^{i-j}y^j\right)$

and $\displaystyle g(x,y) = \sum_{i = 0}^n \left(\sum_{j = 0}^i b_{ij}x^{i-j}y^j\right)$

then:

$\displaystyle f(x,y)g(x,y) = \sum_{i = 0}^{m+n} \left(\sum_{j = 0}^i c_{ij}x^{i-j}y^j\right)$

where:

$\displaystyle c_{ij} = \sum_{k = 0}^i \sum_{l = 0}^j a_{kl}\,b_{(i-k)(j-l)}$

and we take $a_{kl} = 0$ if $l > k$, and $b_{(i-k)(j-l)} = 0$ if $j - l > i - k$ (a coefficient is zero whenever its second index exceeds its first, since the power of $y$ in a monomial cannot exceed the monomial's total degree).

(think of the first index as tagging the degree of a monomial, and the second as tagging the power of $y$ that occurs).

The "easier" way to think of this is just apply the distributive law, for example:

$(x + y)(2x + y) = x(2x + y) + y(2x + y) = x(2x) + xy + y(2x) + y(y) = 2x^2 + xy + 2yx + y^2$

at this point, we NEED commutativity to say:

$= 2x^2 + xy + 2xy + y^2 = 2x^2 + 3xy + y^2$
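Where commutativity enters can be made explicit with a small formal-multiplication sketch (my illustration, not from the thread): keep each monomial as a *word* in x and y, so that xy and yx stay distinct, and observe that sorting each word — i.e. imposing $xy = yx$ — is exactly the step that collects the middle terms.

```python
# Monomials as words in "x", "y"; polynomials as dicts {word: coeff}.
# In the free (non-commutative) product, "xy" and "yx" are different monomials;
# sorting each word first is precisely the step that uses x*y == y*x.

def mul(f, g, commutative):
    out = {}
    for w1, a in f.items():
        for w2, b in g.items():
            w = w1 + w2                      # concatenate words: (x)(y) -> "xy"
            if commutative:
                w = "".join(sorted(w))       # identify "yx" with "xy"
            out[w] = out.get(w, 0) + a * b
    return {w: c for w, c in out.items() if c != 0}

f = {"x": 1, "y": 1}        # x + y
g = {"x": 2, "y": 1}        # 2x + y

# Without commutativity: 2x^2 + xy + 2yx + y^2  (xy and yx stay distinct)
assert mul(f, g, commutative=False) == {"xx": 2, "xy": 1, "yx": 2, "yy": 1}
# With commutativity: 2x^2 + 3xy + y^2
assert mul(f, g, commutative=True) == {"xx": 2, "xy": 3, "yy": 1}
```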

**********************

Your proof that $\phi$ is a homomorphism isn't quite right: $R[x,y]$ is a ring of "polynomial expressions"; the related "polynomial functions", where (for some commutative ring $S$):

$\tilde{f}: S \times S \to S$ is given by $\tilde{f}(s_1,s_2) = f(s_1,s_2)$, are slightly different animals.

We are not evaluating, at any time, an expression like $x^2 + 3xy + y^2$ "at" some pair of numbers, and multiplication and addition of two polynomials are not defined "point-wise"; these are FORMAL sums and products.
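A standard example (my addition, not from the thread) makes the expression/function distinction concrete: over $R = \mathbb{Z}/2\mathbb{Z}$, the polynomial $x^2 + x$ is nonzero as an expression, yet it is the zero function on $R$.

```python
# Over R = Z/2Z, the polynomial x**2 + x is NOT the zero polynomial
# (its coefficients are not all zero), yet as a function Z/2 -> Z/2
# it sends every element to 0: expressions and functions differ.

def as_function(coeffs, p):
    """Turn a one-variable coefficient list [a0, a1, ...] over Z/p into a function."""
    return lambda s: sum(a * pow(s, i, p) for i, a in enumerate(coeffs)) % p

coeffs = [0, 1, 1]                 # x + x^2, nonzero as an expression
fbar = as_function(coeffs, 2)

assert any(c % 2 != 0 for c in coeffs)         # nonzero polynomial expression
assert all(fbar(s) == 0 for s in (0, 1))       # but the zero function on Z/2
```

This is why the map from polynomial expressions to polynomial functions need not be injective, and why Peter's "definition of the product of functions" step does not apply to formal polynomials.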

I'll try to post more on this when I'm not so tired.
 
  • #6
Thanks Deveno, but ... ...

... ... when you write:

"Your proof that $\phi$ is a homomorphism isn't quite right: $R[x,y]$ is a ring of "polynomial expressions"; the related "polynomial functions" where (for some commutative ring $S$):

$\tilde{f}: S \times S \to S$ is given by $\tilde{f}(s_1,s_2) = f(s_1,s_2)$, are slightly different animals."

I do not follow you ... HOWEVER ... ... I know enough to be aware that you are saying something rather fundamental ... I would therefore welcome some further clarification ... when you can ... ...

On a much more trivial level ... you write:

"... ... we NEED commutativity to say:

$= 2x^2 + xy + 2xy + y^2 = 2x^2 + 3xy + y^2$"

Can you help me to see exactly where commutativity is required in this simplification ... sorry if I am being a bit slow ... :-(

Peter
 
  • #7

Hi Deveno,

I did some reading and found that Beachy and Blair describe both polynomial expressions and polynomial functions quite clearly. Unlike Rotman, they take the coefficients to come from a field, which is less general than Rotman's setting, where the coefficients come from a commutative ring.

The relevant text from Beachy and Blair is as follows:

View attachment 2672
View attachment 2673

Rotman defines polynomials as a special type of power series as follows (note: the coefficients come from a commutative ring):

View attachment 2674

Puzzlingly (for me, anyway) he seems to allude to power series, and hence, by implication, to polynomials, as functions when he says the following (see the text above):

" ... ... let us recognize that a sequence \(\displaystyle \sigma \) is really a function \(\displaystyle \sigma \ : \ \mathbb{N} \to R \), where \(\displaystyle \mathbb{N} \) is the set of natural numbers, with \(\displaystyle \sigma (i) = s_i \text{ for all } i \ge 0 \) ... ... "

However, it seems that \(\displaystyle \sigma \) is a function of a different nature than the function that results when the indeterminate(s) are replaced by elements of R.

I hope I am now on the right track ...

Peter
 
  • #8
This time take \(\displaystyle R[x,y] \) as a ring of polynomial expressions as defined by Beachy and Blair - see previous post.

Then taking sums and products of polynomial expressions is defined as follows: (by Beachy and Blair)

View attachment 2675

Now for \(\displaystyle \phi \) to be a ring homomorphism we require that:

(1) \(\displaystyle \phi (1) = 1 \)

(2) \(\displaystyle \phi ( (f+g)(x,y) ) = \phi (f(x,y)) + \phi (g(x,y)) \)

(3) \(\displaystyle \phi ( (fg)(x,y) ) = \phi ( f(x,y) ) \phi ( g(x,y) ) \)

Now (1) is trivial.

I also wrote out a general calculation for (2), and my conclusion is that (2) follows pretty much from the definition of the sum of two polynomial expressions; commutativity of R, and of x and y, is not needed.

Now the general case for (3) is somewhat tedious so I thought I would work through your example:

So, let us, for example, take \(\displaystyle R = \mathbb{Z} \) and (following your example) take the following polynomial expressions:

\(\displaystyle f(x,y) = x + y \)

\(\displaystyle g(x,y) = 2x + y \)

Now proceed to demonstrate that for these polynomial expressions we have

\(\displaystyle \phi ( f(x,y) g(x,y) ) = \phi ( f(x,y) ) \phi ( g(x,y) ) \)

Proceed as follows:

\(\displaystyle \phi ( f(x,y) g(x,y) ) \)

\(\displaystyle = \phi ((x + y) (2x + y)) \)

\(\displaystyle = \phi ( x(2x + y) + y(2x + y) ) \) by distributivity of the ring \(\displaystyle R[x,y]\)

\(\displaystyle = \phi ( x \cdot 2x + xy + y \cdot 2x + y^2 ) \) ... ... now at this point, to proceed we seem to need commutativity of x and y with the elements of R - rather than commutativity of the elements of R with each other - in order that, for example, \(\displaystyle x \cdot 2x = 2 \cdot x \cdot x = 2x^2 \). But Rotman specifies that R is commutative, NOT that the indeterminates and the elements of R commute with each other?

\(\displaystyle = \phi (2x^2 + xy + 2yx + y^2) \)

\(\displaystyle = 2y^2 + yx + 2xy + x^2 \) ... ... ... (*)
Now consider \(\displaystyle \phi(f(x,y)) \phi(g(x,y)) \)

Now we proceed as follows:

\(\displaystyle \phi(f(x,y)) \phi(g(x,y)) \)

\(\displaystyle = \phi (x+y) \phi (2x + y) \)

\(\displaystyle = (y+x) ( 2y + x) \)

\(\displaystyle = y( 2y + x) + x( 2y + x) \)

\(\displaystyle = y \cdot 2y + yx + x \cdot 2y + x^2 \) ... ... so to take the next step, we need y and 2 to commute and x and 2 to commute - again, the indeterminates and the elements of R need to commute, rather than the elements of R with each other?

\(\displaystyle = 2y^2 + yx + 2xy + x^2 \)

\(\displaystyle = \phi ( (x+y) (2x + y) ) = \phi ( f(x,y) g(x,y) ) \)

Can you please confirm that the above analysis makes sense.

Could you please clarify the following issue: we specify that R needs to be a commutative ring, but the commutativity that we seem to need is commutativity between the indeterminates and the elements of R. Further, we do not seem to need the commutativity of the elements of R with each other - is this just because we are demonstrating the homomorphism on a simple example? Would a more complicated pair of polynomial expressions actually show the need for commutativity of the elements of R?

Would appreciate help.

Peter
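Peter's computation above can be checked mechanically; here is a sketch (mine, not from the thread), using a dict-of-exponents representation for $f = x + y$ and $g = 2x + y$ over $\mathbb{Z}$:

```python
# Polynomials in R[x, y] as dicts {(i, j): coeff} for the monomial x**i * y**j.
# Using the exponent pair (i, j) as the key already identifies xy with yx,
# which is where commutativity of the indeterminates is baked in.

def mul(f, g):
    out = {}
    for (i1, j1), a in f.items():
        for (i2, j2), b in g.items():
            k = (i1 + i2, j1 + j2)           # exponents add coordinatewise
            out[k] = out.get(k, 0) + a * b
    return {k: c for k, c in out.items() if c != 0}

def phi(f):
    """Swap x and y: send x**i * y**j to x**j * y**i."""
    return {(j, i): c for (i, j), c in f.items()}

f = {(1, 0): 1, (0, 1): 1}    # x + y
g = {(1, 0): 2, (0, 1): 1}    # 2x + y

# phi(f*g) = 2y^2 + 3xy + x^2, i.e. line (*) after collecting yx + 2xy = 3xy
assert phi(mul(f, g)) == {(0, 2): 2, (1, 1): 3, (2, 0): 1}
# and it agrees with phi(f) * phi(g), as Peter's derivation concludes
assert phi(mul(f, g)) == mul(phi(f), phi(g))
```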
 

