Rational Functions - Polynomials Over a Field - Rotman Proposition 3.70

In summary, Rotman defines the gcd of two polynomials as a monic polynomial, making it unique. A constant common divisor of two polynomials is a unit, hence an associate of 1, so it cannot prevent the monic gcd from being 1 and the polynomials from being relatively prime. Therefore Rotman only needs to rule out a nonconstant common divisor, which he shows leads to a contradiction.
  • #1
Math Amateur
I am reading Joseph J. Rotman's book: A First Course in Abstract Algebra with Applications (Third Edition) ...

I am currently focused on Section 3.5 From Numbers to Polynomials ...

I need help with an aspect of the proof of Proposition 3.70 ...

The relevant text from Rotman's book is given in the attachment. In the proof of the above proposition we read the following:

" ... ... Moreover, \(\displaystyle f'\) and \(\displaystyle g'\) are relatively prime, for if \(\displaystyle h\) were a nonconstant common divisor of \(\displaystyle f '\) and \(\displaystyle g'\), then ... ... "
My question is as follows:

Why is Rotman worrying only about a nonconstant common divisor ... surely a constant common divisor different from \(\displaystyle 1\) would also be a worry, since relatively prime polynomials need to have a gcd of \(\displaystyle 1\) ... In other words, if \(\displaystyle h\) were a constant common divisor different from \(\displaystyle 1\), then \(\displaystyle f'\) and \(\displaystyle g'\) would not be relatively prime ... wouldn't that be a problem? ... how is Rotman avoiding this ...

Can someone please explain the focus on a nonconstant common divisor in the statement that I have quoted ...

Help will be appreciated ...

Peter

*** NOTE ***

Rotman's definitions of relatively prime and lowest terms are relevant to the above proposition ... so I am providing them in the attachments.
 
  • #2
Peter said:
I am reading Joseph J. Rotman's book: A First Course in Abstract Algebra with Applications (Third Edition) ...

I am currently focused on Section 3.5 From Numbers to Polynomials ...

I need help with an aspect of the proof of Proposition 3.70 ...

The relevant text from Rotman's book is as follows: In the proof of the above proposition we read the following:

" ... ... Moreover, \(\displaystyle f'\) and \(\displaystyle g'\) are relatively prime, for if \(\displaystyle h\) were a nonconstant common divisor of \(\displaystyle f '\) and \(\displaystyle g'\), then ... ... "
My question is as follows:

Why is Rotman worrying only about a nonconstant common divisor ... surely a constant common divisor different from \(\displaystyle 1\) would also be a worry, since relatively prime polynomials need to have a gcd of \(\displaystyle 1\) ... In other words, if \(\displaystyle h\) were a constant common divisor different from \(\displaystyle 1\), then \(\displaystyle f'\) and \(\displaystyle g'\) would not be relatively prime ... wouldn't that be a problem? ... how is Rotman avoiding this ...

Can someone please explain the focus on a nonconstant common divisor in the statement that I have quoted ...

Help will be appreciated ...

Peter

*** NOTE ***

Rotman's definitions of relatively prime and lowest terms are relevant to the above proposition ... so I am providing them as follows:
Hello Peter. In any ring $R$ (take commutative with identity to be safe), the gcd of two elements $a$ and $b$ is defined as follows:

We say that $d$ is a gcd of $a$ and $b$ if 1) $d|a$ and $d|b$ and 2) If $d'|a$ and $d'|b$ for some $d'\in R$, then $d'|d$.

Note that a gcd of two elements may not exist in an arbitrary ring, though it is sure to exist if the ring is a PID or, more generally, a UFD.
Also note that if $d$ is a gcd of $a$ and $b$ then $ud$ too is a gcd if $u$ is a unit. Further, if $d$ and $d'$ are gcds of $a$ and $b$, then there is a unit $u$ such that $d=ud'$ provided $R$ is an integral domain.

Polynomial rings in one variable over a field are PIDs. So a gcd of two polynomials in $k[x]$ always exists if $k$ is a field. Thus when we say that the gcd of two polynomials $p(x)$ and $q(x)$ in $k[x]$ is $1$, we must note that a gcd may just as well be any other nonzero element of $k$.

This superficial non-uniqueness of the gcd may be done away with if we insist that the gcd of two polynomials is a monic polynomial, which I believe is what Rotman has done. Note that this insistence on having gcds be monic works only in polynomial rings over a field. If we take the polynomials $p(x)=2x-2$ and $q(x)=2x^2-2$ in $\mathbf Z[x]$, then the gcd of these two polynomials exists but is not monic: it is $2x-2$ itself.
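The point that a gcd in $k[x]$ is only determined up to a nonzero constant, and becomes unique once normalized to be monic, can be checked computationally. Below is a minimal Python sketch (helper names are my own, not anything from Rotman): polynomials over $\Bbb Q$ are coefficient lists with the constant term first, and the gcd is computed with the Euclidean algorithm, then divided by its leading coefficient.

```python
from fractions import Fraction

# Polynomials over Q as coefficient lists, constant term first,
# e.g. [Fraction(-2), Fraction(2)] is 2x - 2.  Illustrative helpers only.

def trim(p):
    """Drop trailing zero coefficients (normalize the representation)."""
    while p and p[-1] == 0:
        p = p[:-1]
    return p

def poly_rem(f, g):
    """Remainder of f divided by g in Q[x] (g nonzero)."""
    r, g = trim(list(f)), trim(list(g))
    while len(r) >= len(g):
        shift = len(r) - len(g)
        c = r[-1] / g[-1]          # cancel the leading term of r
        for i, gc in enumerate(g):
            r[i + shift] -= c * gc
        r = trim(r)
    return r

def monic_gcd(f, g):
    """Euclidean algorithm in Q[x]; result normalized to be monic."""
    f, g = trim(list(f)), trim(list(g))
    while g:
        f, g = g, poly_rem(f, g)
    return [c / f[-1] for c in f]  # divide by the leading coefficient

# gcd(x^2 - 1, 2x - 2) and gcd(5x^2 - 5, 6x - 6) are both x - 1 once monic,
# even though the inputs differ by unit (nonzero constant) factors.
x2m1 = [Fraction(-1), Fraction(0), Fraction(1)]
assert monic_gcd(x2m1, [Fraction(-2), Fraction(2)]) == [Fraction(-1), Fraction(1)]
assert monic_gcd([5 * c for c in x2m1], [Fraction(-6), Fraction(6)]) == [Fraction(-1), Fraction(1)]
```

Without the final division by the leading coefficient, the raw Euclidean output for the two pairs would be different constant multiples of $x-1$, which is exactly the ambiguity the monic convention removes.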
 
  • #3
caffeinemachine said:
Hello Peter. In any ring $R$ (take commutative with identity to be safe), the gcd of two elements $a$ and $b$ is defined as follows:

We say that $d$ is a gcd of $a$ and $b$ if 1) $d|a$ and $d|b$ and 2) If $d'|a$ and $d'|b$ for some $d'\in R$, then $d'|d$.

Note that gcd of two elements may not exist in an arbitrary ring though it is sure to exist if the ring is a PID or more generally a UFD.
Also note that if $d$ is a gcd of $a$ and $b$ then $ud$ too is a gcd if $u$ is a unit. Further, if $d$ and $d'$ are gcds of $a$ and $b$, then there is a unit $u$ such that $d=ud'$ provided $R$ is an integral domain.

Polynomial rings in one variable over a field are PIDs. So gcd of two polynomials in $k[x]$ always exists if $k$ is a field. So when we say that gcd of two polynomials $p(x)$ and $q(x)$ in $k[x]$ is $1$, we must note that gcd may very well be any other nonzero element of $k$.

This superficial non-uniqueness of the gcd may be done away with if we insist that the gcd of two polynomials is a monic polynomial, which I believe is what Rotman has done. Note that this insistence on having gcds be monic works only in polynomial rings over a field. If we take the polynomials $p(x)=2x-2$ and $q(x)=2x^2-2$ in $\mathbf Z[x]$, then the gcd of these two polynomials exists but is not monic: it is $2x-2$ itself.
Thanks caffeinemachine,

Yes, you are right in your surmise that Rotman defines the gcd of two polynomials as a monic polynomial, thus making the gcd of two polynomials unique ...

This means that if \(\displaystyle h\) were a constant common divisor of \(\displaystyle f\) and \(\displaystyle g\), then \(\displaystyle h\) would be a unit, hence an associate of \(\displaystyle 1\), so the monic gcd of \(\displaystyle f\) and \(\displaystyle g\) would still be \(\displaystyle 1\) and \(\displaystyle f\) and \(\displaystyle g\) would therefore be relatively prime ... so Rotman only has to deal with the nonconstant common divisor case, which he shows leads to a contradiction ... ...

Thanks again for your help ...

Peter
 
  • #4
There is something special going on here, which most algebra texts fail to highlight:

Recall that a ring (with unity) is two structures in one:

$(R,+)$ is an abelian group

$(R,\cdot)$ is a monoid.

(We also insist that $\cdot$ be $\Bbb Z$-bilinear; this is just the distributive laws, for "compatibility".)

In short, we have a "forgetful function" that takes any ring $R$, and "forgets" we can add, leaving us with just a monoid.

The group of units of $R,\ U(R)$ is a sub-monoid of $R$.

For now, we restrict our attention to when $(R,\cdot)$ is a commutative monoid ("commutative rings with unity").

Define a relation $\sim$ on $(R,\cdot)$ by: $a \sim b$ if $a = ub$, with $u \in U(R)$ ("$a$ and $b$ are associates").

1). $\sim$ is reflexive: $a = 1a$ and since $1\cdot 1 = 1$, $1 \in U(R)$.

2). $\sim$ is symmetric: Suppose $a = ub$. Since $u \in U(R)$, there exists $v \in U(R)$ such that $uv = vu = 1$.

Thus $va = v(ub) = (vu)b = (uv)b = 1b = b$, and $b \sim a$.

3) $\sim$ is transitive: Suppose $a = ub$ and $b = vc$, for $u,v \in U(R)$. Then $a = u(vc) = (uv)c$, and $uv \in U(R)$ (if $u$ has the inverse $u'$, and $v$ has the inverse $v'$, it is clear that $v'u'$ is an inverse for $uv$), so $a \sim c$.

Thus association is an EQUIVALENCE relation on the multiplicative monoid of $R$.

We can say more:

$\sim$ is a congruence on the monoid $(R,\cdot)$. By this, we mean:

If $a \sim a'$ and $b \sim b'$, then $ab \sim a'b'$.

The proof is straight-forward: suppose $a \sim a'$ and $b \sim b'$. Thus $a = ua'$ and $b = vb'$ for $u,v \in U(R)$.

Thus $ab = (ua')(vb') = u(a'v)b' = u(va')b' = (uv)a'b'$ (note that here is where commutativity is finally used).

In short, $R/\sim$ is a QUOTIENT MONOID of $(R,\cdot)$.

In the special case where $R = F[x]$, show the following:

1. the element $m(x) \in [p(x)]_{\sim}$ such that $m$ is monic is *unique*.

2. Conclude gcd's in $F[x]/\sim$ are unique (N.B.: one cannot use Bezout's identity, here, since that uses addition. One has to define the gcd $[d(x)]$ of $[f(x)],[g(x)]$ like so:

a) $d(x)|f(x)$ and $d(x)|g(x)$
b) if $c(x)|f(x)$ and $c(x)|g(x)$ then $c(x)|d(x)$).

This is the underlying rationale of computing ring elements (in a commutative ring with identity) "only up to associates". In the quotient monoid, the units are all identified with $1$, the "canonical unit" (in $F[x]$ the canonical representative is monic). This is akin to factoring integers using only "positive primes" (we get all sorts of "non-unique factorizations" if we allow $1$ and $-1$).

What is $\Bbb Z/\sim$? Have you encountered this monoid before?
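The claims above (that $\sim$ is an equivalence relation and a congruence) can be sampled concretely in $\Bbb Z$, whose only units are $\pm 1$, so that $a \sim b$ simply means $a = \pm b$. A small Python check of the congruence property over a finite sample (my own helper, purely for illustration):

```python
# In Z the units are 1 and -1, so a ~ b ("a and b are associates")
# exactly when a = b or a = -b.
def associates(a, b):
    return a == b or a == -b

# Congruence check on a finite sample: if a ~ a' and b ~ b',
# then a*b ~ a'*b'.
sample = range(-5, 6)
for a in sample:
    for b in sample:
        for ua in (1, -1):          # a' = ua * a is an associate of a
            for ub in (1, -1):      # b' = ub * b is an associate of b
                assert associates(a * b, (ua * a) * (ub * b))
```

Of course a finite sample proves nothing by itself; the general proof is the two-line unit computation given in the post.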
 
  • #5
Deveno said:
There is something special going on here, which most algebra texts fail to highlight:

Recall that a ring (with unity) is two structures in one:

$(R,+)$ is an abelian group

$(R,\cdot)$ is a monoid.

(We also insist that $\cdot$ be $\Bbb Z$-bilinear-this is just the distributive laws, for "compatibility").

In short, we have a "forgetful function" that takes any ring $R$, and "forgets" we can add, leaving us with just a monoid.

The group of units of $R,\ U(R)$ is a sub-monoid of $R$.

For now, we restrict our attention to when $(R,\cdot)$ is a commutative monoid ("commutative rings with unity").

Define a relation $\sim$ on $(R,\cdot)$ by: $a \sim b$ if $a = ub$, with $u \in U(R)$ ("$a$ and $b$ are associates").

1). $\sim$ is reflexive: $a = 1a$ and since $1\cdot 1 = 1$, $1 \in U(R)$.

2). $\sim$ is symmetric: Suppose $a = ub$. Since $u \in U(R)$, there exists $v \in U(R)$ such that $uv = vu = 1$.

Thus $va = v(ub) = (vu)b = (uv)b = 1b = b$, and $b \sim a$.

3) $\sim$ is transitive: Suppose $a = ub$ and $b = vc$, for $u,v \in U(R)$. Then $a = u(vc) = (uv)c$, and $uv \in U(R)$ (if $u$ has the inverse $u'$, and $v$ has the inverse $v'$, it is clear that $v'u'$ is an inverse for $uv$), so $a \sim c$.

Thus association is an EQUIVALENCE relation on the multiplicative monoid of $R$.

We can say more:

$\sim$ is a congruence on the monoid $(R,\cdot)$. By this, we mean:

If $a \sim a'$ and $b \sim b'$, then $ab \sim a'b'$.

The proof is straight-forward: suppose $a \sim a'$ and $b \sim b'$. Thus $a = ua'$ and $b = vb'$ for $u,v \in U(R)$.

Thus $ab = (ua')(vb') = u(a'v)b' = u(va')b' = (uv)a'b'$ (note that here is where commutativity is finally used).

In short, $R/\sim$ is a QUOTIENT MONOID of $(R,\cdot)$.

In the special case where $R = F[x]$, show the following:

1. the element $m(x) \in [p(x)]_{\sim}$ such that $m$ is monic is *unique*.

2. Conclude gcd's in $F[x]/\sim$ are unique (N.B.: one cannot use Bezout's identity, here, since that uses addition. One has to define the gcd $[d(x)]$ of $[f(x)],[g(x)]$ like so:

a) $d(x)|f(x)$ and $d(x)|g(x)$
b) if $c(x)|f(x)$ and $c(x)|g(x)$ then $c(x)|d(x)$).

This is the underlying rationale of computing ring elements (in a commutative ring with identity) "only up to associates". In the quotient monoid, the units all are identified with $1$ (the "canonical unit", since in $F[x]$ it is monic-this is akin to factoring integers only considering "positive primes" (we get all sorts of "non-unique factorizations" when we consider $1$ and $-1$)).

What is $\Bbb Z/\sim$? Have you encountered this monoid before?
Thanks for an interesting and informative post, Deveno ...

Just working through the post in detail, now ...

Thanks again for your help ...

Peter
 
  • #6
Deveno said:
There is something special going on here, which most algebra texts fail to highlight:

Recall that a ring (with unity) is two structures in one:

$(R,+)$ is an abelian group

$(R,\cdot)$ is a monoid.

(We also insist that $\cdot$ be $\Bbb Z$-bilinear-this is just the distributive laws, for "compatibility").

In short, we have a "forgetful function" that takes any ring $R$, and "forgets" we can add, leaving us with just a monoid.

The group of units of $R,\ U(R)$ is a sub-monoid of $R$.

For now, we restrict our attention to when $(R,\cdot)$ is a commutative monoid ("commutative rings with unity").

Define a relation $\sim$ on $(R,\cdot)$ by: $a \sim b$ if $a = ub$, with $u \in U(R)$ ("$a$ and $b$ are associates").

1). $\sim$ is reflexive: $a = 1a$ and since $1\cdot 1 = 1$, $1 \in U(R)$.

2). $\sim$ is symmetric: Suppose $a = ub$. Since $u \in U(R)$, there exists $v \in U(R)$ such that $uv = vu = 1$.

Thus $va = v(ub) = (vu)b = (uv)b = 1b = b$, and $b \sim a$.

3) $\sim$ is transitive: Suppose $a = ub$ and $b = vc$, for $u,v \in U(R)$. Then $a = u(vc) = (uv)c$, and $uv \in U(R)$ (if $u$ has the inverse $u'$, and $v$ has the inverse $v'$, it is clear that $v'u'$ is an inverse for $uv$), so $a \sim c$.

Thus association is an EQUIVALENCE relation on the multiplicative monoid of $R$.

We can say more:

$\sim$ is a congruence on the monoid $(R,\cdot)$. By this, we mean:

If $a \sim a'$ and $b \sim b'$, then $ab \sim a'b'$.

The proof is straight-forward: suppose $a \sim a'$ and $b \sim b'$. Thus $a = ua'$ and $b = vb'$ for $u,v \in U(R)$.

Thus $ab = (ua')(vb') = u(a'v)b' = u(va')b' = (uv)a'b'$ (note that here is where commutativity is finally used).

In short, $R/\sim$ is a QUOTIENT MONOID of $(R,\cdot)$.

In the special case where $R = F[x]$, show the following:

1. the element $m(x) \in [p(x)]_{\sim}$ such that $m$ is monic is *unique*.

2. Conclude gcd's in $F[x]/\sim$ are unique (N.B.: one cannot use Bezout's identity, here, since that uses addition. One has to define the gcd $[d(x)]$ of $[f(x)],[g(x)]$ like so:

a) $d(x)|f(x)$ and $d(x)|g(x)$
b) if $c(x)|f(x)$ and $c(x)|g(x)$ then $c(x)|d(x)$).

This is the underlying rationale of computing ring elements (in a commutative ring with identity) "only up to associates". In the quotient monoid, the units all are identified with $1$ (the "canonical unit", since in $F[x]$ it is monic-this is akin to factoring integers only considering "positive primes" (we get all sorts of "non-unique factorizations" when we consider $1$ and $-1$)).

What is $\Bbb Z/\sim$? Have you encountered this monoid before?
Thanks Deveno ... appreciate the interesting post ...

You write:

" ... ... In short, $R/\sim$ is a QUOTIENT MONOID of $(R,\cdot)$.

In the special case where $R = F[x]$, show the following:

1. the element $m(x) \in [p(x)]_{\sim}$ such that $m$ is monic is *unique*. ... ... "
So to show 1. ... ... I think we could proceed as follows:

Let \(\displaystyle p(x) = a_0 + a_1x + a_2x^2 + \ ... \ ... \ + a_nx^n\)

... so, then the elements of \(\displaystyle [p(x)]_{\sim}\) in $F[x]/\sim$ are of the form:

\(\displaystyle bp(x) = ba_0 + ba_1x + ba_2x^2 + \ ... \ ... \ + ba_nx^n\)

where \(\displaystyle b\) is a nonzero element of \(\displaystyle F\) (that is, \(\displaystyle b\) is a unit of \(\displaystyle F[x]\)).

The monic element \(\displaystyle m(x)\) is obtained by taking \(\displaystyle b = a_n^{-1}\) ... ... Since inverses in \(\displaystyle F\) are unique, \(\displaystyle a_n^{-1}\) is unique, and so therefore \(\displaystyle m(x)\) is unique ...

... ... ... I am still thinking over question 2.

Considering the nature of \(\displaystyle \mathbb{Z}/\sim\), it seems to me that the elements would be of the form:

\(\displaystyle [a]_{ \sim} = \{ a, -a \} \text{ for } a \in \mathbb{Z}.\)

So ... typical elements of \(\displaystyle \mathbb{Z}/ \sim\) would be \(\displaystyle [3]_{ \sim} = \{ 3, -3 \}\), \(\displaystyle [8]_{ \sim} = \{ 8, -8 \}\) and so on ...

So, it seems that \(\displaystyle \mathbb{Z}/ \sim\) would be isomorphic to the positive integers ... ... or the negative integers, for that matter ...

Can you please confirm that my thoughts above are correct, or, alternatively, point out the deficiencies and/or errors in my analysis.

Peter
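Peter's argument (divide by the leading coefficient; inverses in $F$ are unique) can be illustrated in Python over $F = \Bbb Q$, with polynomials as coefficient lists, constant term first (my own convention, not notation from the thread):

```python
from fractions import Fraction

def monic_rep(p):
    """The unique monic polynomial in the associate class [p] in Q[x]."""
    return [c / p[-1] for c in p]   # b = a_n^{-1}, applied to every coefficient

# Every associate b*p(x), for b a nonzero constant, has the SAME monic
# representative, which is why the monic element of the class is unique.
p = [Fraction(6), Fraction(0), Fraction(3)]               # 3x^2 + 6
for b in (Fraction(2), Fraction(-1, 3), Fraction(7)):
    assert monic_rep([b * c for c in p]) == [Fraction(2), Fraction(0), Fraction(1)]  # x^2 + 2
```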
 
  • #7
As a *set*, yes $\Bbb Z/\sim$ might be isomorphic to either the positive, or the negative integers (or even to some "weird hybrid" say where the multiples of $7$ are positive, and everything else, negative).

As a MONOID, however, we must have "operational closure", so I ask you, does:

negative x negative = negative?
 
  • #8
Peter said:
Thanks Deveno ... appreciate the interesting post ...

You write:

" ... ... In short, $R/\sim$ is a QUOTIENT MONOID of $(R,\cdot)$.

In the special case where $R = F[x]$, show the following:

1. the element $m(x) \in [p(x)]_{\sim}$ such that $m$ is monic is *unique*. ... ... "
So to show 1. ... ... I think we could proceed as follows:

Let \(\displaystyle p(x) = a_0 + a_1x + a_2x^2 + \ ... \ ... \ + a_nx^n\)

... so, then the elements of \(\displaystyle [p(x)]_{\sim}\) in $F[x]/\sim$ are of the form:

\(\displaystyle bp(x) = ba_0 + ba_1x + ba_2x^2 + \ ... \ ... \ + ba_nx^n\)

where \(\displaystyle b\) is a nonzero element of \(\displaystyle F\) (that is, \(\displaystyle b\) is a unit of \(\displaystyle F[x]\)).

The monic element \(\displaystyle m(x)\) is obtained by taking \(\displaystyle b = a_n^{-1}\) ... ... Since inverses in \(\displaystyle F\) are unique, \(\displaystyle a_n^{-1}\) is unique, and so therefore \(\displaystyle m(x)\) is unique ...

... ... ... I am still thinking over question 2.

Considering the nature of \(\displaystyle \mathbb{Z}/\sim\), it seems to me that the elements would be of the form:

\(\displaystyle [a]_{ \sim} = \{ a, -a \} \text{ for } a \in \mathbb{Z}.\)

So ... typical elements of \(\displaystyle \mathbb{Z}/ \sim\) would be \(\displaystyle [3]_{ \sim} = \{ 3, -3 \}\), \(\displaystyle [8]_{ \sim} = \{ 8, -8 \}\) and so on ...

So, it seems that \(\displaystyle \mathbb{Z}/ \sim\) would be isomorphic to the positive integers ... ... or the negative integers, for that matter ...

Can you please confirm that my thoughts above are correct, or, alternatively, point out the deficiencies and/or errors in my analysis.

Peter

Hi Deveno ... Are you able to help with question 2 on GCDs ...

Question 2 seems to me to be an important issue ... but I need some help with it ...

Peter

 
  • #9
I am unsure which "question 2" you mean. Specifically you refer to: "question 2 on GCD's" but in your original post, I see only one question. It is possible that you are referring to question 2 on the monoid $\Bbb Z/\sim$, which does not mention GCD's.

With regard to the latter, your first intuition was correct, it is not hard to show that:

$\phi: \Bbb Z/\sim \to \Bbb N$ (taking $\Bbb N$ to include $0$, so that $[0]_{\sim}$ has an image) given by:

$\phi([k]_{\sim}) = |k|$

is a *multiplicative* monoid homomorphism (things go horribly wrong if we try to use addition).

There are the usual concerns of proving $\phi$ is well-defined (which amounts to showing $|k| = |-k|$), and the homomorphism property rests on the fact that:

$[k]_{\sim} \cdot [m]_{\sim} = [km]_{\sim}$ and $|km| = |k|\cdot |m|$.

It's pretty obvious $\phi$ is bijective.

As for your original concern about GCD's, another definition of the GCD of two polynomials is that:

$d(x) = \gcd(f(x),g(x)) \iff \exists a(x),b(x) \in F[x]: d(x) = a(x)f(x) + b(x)g(x)$, and $d$ has minimal degree among all such polynomials.

Again, if $d'(x)$ also has this property, then $d|d'$ and $d'|d$ so that:

$d'(x) = s(x)d(x) = s(x)(t(x)d'(x)) \implies s(x)t(x) = 1 \implies s,t \in U(F[x]) \implies s,t \in F^{\ast}$;

that is, $d$ and $d'$ are associates.

As I indicated in my last post, we have a unique monic $d$ in $[d(x)]_{\sim}$ (it really is a cumbersome notation).

Choosing the monic as *the* gcd is useful, but we don't *have* to (not all authors do). We can still define $f,g$ to be co-prime if there exists $a(x),b(x)$ with:

$a(x)f(x) + b(x)g(x) = 1$.

Indeed, we can make an even stranger definition:

Let $(f)$ be the principal ideal generated by $f(x)$, and similarly for $g$. We say $f,g$ are co-prime if:

$(f) + (g) = F[x]$, where $(f) + (g) = \{h(x) \in F[x]: h(x) = u(x) + v(x), u \in (f), v \in (g)\}$.

It does no harm to replace "1" in this definition of co-primality by an arbitrary unit, $u$, for if:

$a(x)f(x) + b(x)g(x) = u$, then surely:

$u^{-1}a(x)f(x) + u^{-1}b(x)g(x) = 1$, and all we do is replace $a(x)$ with $a_1(x) = u^{-1}a(x)$, and so on for $b$.

This business of "factoring out a unit" gets tedious to type over and over again, using "monic representatives" saves us a little trouble.
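The Bezout characterization $d(x) = a(x)f(x) + b(x)g(x)$ above can be computed explicitly with the extended Euclidean algorithm in $F[x]$. Here is a self-contained Python sketch over $F = \Bbb Q$ (helper names are mine; the algorithm is the standard one, not anything specific to Rotman's text), which returns a monic $d$:

```python
from fractions import Fraction

# Polynomials over Q as coefficient lists, constant term first.

def trim(p):
    while p and p[-1] == 0:
        p = p[:-1]
    return p

def add(p, q):
    n = max(len(p), len(q))
    return trim([(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
                 for i in range(n)])

def mul(p, q):
    if not p or not q:
        return []                   # [] is the zero polynomial
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return trim(out)

def quo(f, g):
    """Quotient of f by g in Q[x] (g nonzero)."""
    r = trim(list(f))
    q = [Fraction(0)] * max(len(r) - len(g) + 1, 1)
    while len(r) >= len(g):
        shift = len(r) - len(g)
        c = r[-1] / g[-1]
        q[shift] = c
        r = add(r, [Fraction(0)] * shift + [-c * x for x in g])
    return trim(q)

def ext_gcd(f, g):
    """Return (d, a, b) with a*f + b*g = d, and d monic."""
    r0, r1 = trim(list(f)), trim(list(g))
    a0, a1 = [Fraction(1)], []
    b0, b1 = [], [Fraction(1)]
    while r1:
        q = quo(r0, r1)
        r0, r1 = r1, add(r0, [-c for c in mul(q, r1)])   # remainder step
        a0, a1 = a1, add(a0, [-c for c in mul(q, a1)])
        b0, b1 = b1, add(b0, [-c for c in mul(q, b1)])
    u = r0[-1]                      # leading coefficient is a unit of F[x]
    return [c / u for c in r0], [c / u for c in a0], [c / u for c in b0]

# x + 1 and x - 1 are co-prime: the monic gcd is 1, with explicit
# Bezout coefficients a(x), b(x).
f, g = [Fraction(1), Fraction(1)], [Fraction(-1), Fraction(1)]
d, a, b = ext_gcd(f, g)
assert d == [Fraction(1)]
assert add(mul(a, f), mul(b, g)) == [Fraction(1)]
```

Dividing the final remainder (and the Bezout coefficients) by the leading coefficient $u$ is exactly the "factoring out a unit" step described above.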
 
  • #10
Deveno said:
I am unsure which "question 2" you mean. Specifically you refer to: "question 2 on GCD's" but in your original post, I see only one question. It is possible that you are referring to question 2 on the monoid $\Bbb Z/\sim$, which does not mention GCD's.

With regard to the latter, your first intuition was correct, it is not hard to show that:

$\phi: \Bbb Z/\sim \to \Bbb N$ (taking $\Bbb N$ to include $0$, so that $[0]_{\sim}$ has an image) given by:

$\phi([k]_{\sim}) = |k|$

is a *multiplicative* monoid homomorphism (things go horribly wrong if we try to use addition).

There are the usual concerns of proving $\phi$ is well-defined (which amounts to showing $|k| = |-k|$), and the homomorphism property rests on the fact that:

$[k]_{\sim} \cdot [m]_{\sim} = [km]_{\sim}$ and $|km| = |k|\cdot |m|$.

It's pretty obvious $\phi$ is bijective.

As for your original concern about GCD's, another definition of the GCD of two polynomials is that:

$d(x) = \gcd(f(x),g(x)) \iff \exists a(x),b(x) \in F[x]: d(x) = a(x)f(x) + b(x)g(x)$, and $d$ has minimal degree among all such polynomials.

Again, if $d'(x)$ also has this property, then $d|d'$ and $d'|d$ so that:

$d'(x) = s(x)d(x) = s(x)(t(x)d'(x)) \implies s(x)t(x) = 1 \implies s,t \in U(F[x]) \implies s,t \in F^{\ast}$;

that is, $d$ and $d'$ are associates.

As I indicated in my last post, we have a unique monic $d$ in $[d(x)]_{\sim}$ (it really is a cumbersome notation).

Choosing the monic as *the* gcd is useful, but we don't *have* to (not all authors do). We can still define $f,g$ to be co-prime if there exists $a(x),b(x)$ with:

$a(x)f(x) + b(x)g(x) = 1$.

Indeed, we can make an even stranger definition:

Let $(f)$ be the principal ideal generated by $f(x)$, and similarly for $g$. We say $f,g$ are co-prime if:

$(f) + (g) = F[x]$, where $(f) + (g) = \{h(x) \in F[x]: h(x) = u(x) + v(x), u \in (f), v \in (g)\}$.

It does no harm to replace "1" in this definition of co-primality by an arbitrary unit, $u$, for if:

$a(x)f(x) + b(x)g(x) = u$, then surely:

$u^{-1}a(x)f(x) + u^{-1}b(x)g(x) = 1$, and all we do is replace $a(x)$ with $a_1(x) = u^{-1}a(x)$, and so on for $b$.

This business of "factoring out a unit" gets tedious to type over and over again, using "monic representatives" saves us a little trouble.

Thanks Deveno ... will now work through your post in detail ...

Sorry I was not clear ... I was referring to your Question 2 ... as follows:

" ... ... 2. Conclude gcd's in $F[x]/\sim$ are unique (N.B.: one cannot use Bezout's identity, here, since that uses addition. One has to define the gcd $[d(x)]$ of $[f(x)],[g(x)]$ like so:

a) $d(x)|f(x)$ and $d(x)|g(x)$
b) if $c(x)|f(x)$ and $c(x)|g(x)$ then $c(x)|d(x)$). ... ... "

Apologies for not being clearer ...

Peter
 
  • #11
Deveno said:
As a *set*, yes $\Bbb Z/\sim$ might be isomorphic to either the positive, or the negative integers (or even to some "weird hybrid" say where the multiples of $7$ are positive, and everything else, negative).

As a MONOID, however, we must have "operational closure", so I ask you, does:

negative x negative = negative?
Hi Deveno,

You ask: in $\Bbb Z/\sim$ does negative x negative = negative? Hmm ... very puzzling ...

We have, as an example: \(\displaystyle [-3]_{\sim} \times [-2]_{\sim} = [-6]_{\sim}\)

... so, maybe, yes ... ...

... BUT ...

\(\displaystyle [-6]_{\sim} = [6]_{\sim} = \{ -6, 6 \}\) ...

so it seems that one can also claim that

\(\displaystyle [-3]_{\sim} \times [-2]_{\sim} = [6]_{\sim}\)

... ...

I must say that I am not sure of my analysis above and I am certainly not sure of what it shows ... unless it means that 'negative' and 'positive' have meaning when talking about the integers, but not when talking about the elements of $\Bbb Z/\sim$ ...

Can you please clarify the situation ...

Peter
 
  • #12
That's fine.

All this is saying is: if $d$ is "any" gcd of $f$ and $g$, show that $d = um$ where $u$ is a unit, and $m$ is the unique monic element of $[d(x)]_{\sim}$.
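Over $F = \Bbb Q$, the unit $u$ is just the leading coefficient of $d(x)$, so the factorization $d = um$ is a one-liner. A minimal sketch (coefficient lists with constant term first; the helper is my own):

```python
from fractions import Fraction

def unit_times_monic(d):
    """Factor d(x) as u * m(x): u a unit (the leading coefficient), m monic."""
    u = d[-1]
    return u, [c / u for c in d]

# d(x) = 2x^2 - 6x + 4 factors as 2 * (x^2 - 3x + 2).
u, m = unit_times_monic([Fraction(4), Fraction(-6), Fraction(2)])
assert u == 2
assert m == [Fraction(2), Fraction(-3), Fraction(1)]        # monic: x^2 - 3x + 2
assert [u * c for c in m] == [Fraction(4), Fraction(-6), Fraction(2)]
```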
 

FAQ: Rational Functions - Polynomials Over a Field - Rotman Proposition 3.70

What is a rational function?

A rational function is a function that can be written as the quotient of two polynomials. It is also known as a ratio of two polynomials.

What is a polynomial over a field?

A polynomial over a field is a polynomial whose coefficients come from a field: a set in which addition, subtraction, multiplication, and division by nonzero elements are always possible.

What is Rotman Proposition 3.70?

Rotman's Proposition 3.70 states that every rational function over a field can be written in lowest terms, that is, as a quotient of two polynomials whose gcd is 1.

How is Rational Functions - Polynomials Over a Field - Rotman Proposition 3.70 useful?

This proposition is useful for putting rational functions into a canonical lowest-terms form, and it rests on the same gcd machinery used in proving unique factorization for polynomials over a field.

Can Rational Functions - Polynomials Over a Field - Rotman Proposition 3.70 be applied to any field?

Yes. The proposition applies over any field, since every field allows division by nonzero elements.
