Tensor Products - The free Z-module construction

In summary: the conversation discusses tensor products as developed by Dummit and Foote in Section 10.4, specifically the early subsection devoted to the "extension of scalars". It works through the construction of the free $\mathbb{Z}$-module on the set $S \times N$, with a focus on the fact that its elements are formal sums of pairs rather than componentwise sums, and touches on the free abelian group on two generators along the way.
  • #1
Math Amateur
I am trying (struggling! :() to understand tensor products as developed by Dummit and Foote in Section 10.4 - specifically the early section devoted to the "extension of scalars".

I have been reflecting on my attempts to understand the material of Dummit and Foote, pages 359 -362 (see attachment) and wondering if I really fully understand the basic notions. Thus I am now examining my understanding of the (free) \(\displaystyle \mathbb{Z} \)-module on the set \(\displaystyle S \times N \) (see D&F page 360 - see attached) - as this concept is critical to the development. I hope that some MHB member will confirm the correctness of my construction.

My understanding of the nature of the (free) \(\displaystyle \mathbb{Z} \)-module on the set \(\displaystyle S \times N \) follows - I am carefully following D&F's definition of a module on page 337 (see attachment).

Let \(\displaystyle R = \mathbb{Z} \) and \(\displaystyle M = S \times N \).

Then ...

(1) \(\displaystyle M = S \times N \) is an abelian group under + where + operates as follows:

\(\displaystyle (s_1, n_1) + (s_2, n_2) = (s_1 + s_2, n_1 + n_2) \)

(2) we have an action of \(\displaystyle \mathbb{Z} \) on \(\displaystyle M\); that is a map

\(\displaystyle \mathbb{Z} \times M \to M \), that is

\(\displaystyle \mathbb{Z} \times (S \times N) \to (S \times N) \)

denoted by \(\displaystyle xm = x(s,n) \) where \(\displaystyle x \in \mathbb{Z}, (s,n) \in M = S \times N \)

which satisfies

(a) \(\displaystyle (x + y)(s,n) = x(s,n) + y(s,n) \) where \(\displaystyle x, y \in \mathbb{Z}, (s,n) \in M = S \times N \)

(b) \(\displaystyle (xy)(s,n) = x(y(s,n)) \) where \(\displaystyle x, y \in \mathbb{Z}, (s,n) \in M = S \times N \)

(c) \(\displaystyle x((s_1, n_1) + (s_2, n_2)) = x(s_1, n_1) + x(s_2, n_2) \) where \(\displaystyle x \in \mathbb{Z}, (s_1, n_1), (s_2, n_2) \in M = S \times N \)

and, since \(\displaystyle \mathbb{Z} \) has a 1, we impose the additional axiom:

(d) \(\displaystyle 1(s,n) = (s,n) \)

--------------------------------------------------------------------------

Now looking at (2) more carefully we take

\(\displaystyle x(s,n) = (s,n) + (s,n) + \ ... \ + (s,n) \) (x times)

So essentially the \(\displaystyle \mathbb{Z} \)-module is made up of sums of \(\displaystyle (s_i, n_i) \) of the form (for example)

\(\displaystyle 3(s_1, n_1) + 8(s_2, n_2) + 25(s_7, n_7) + 2(s_{10}, n_{10}) \)

Can someone please confirm that the above construction and reasoning is correct or indicate otherwise ...

I am uncertain of the construction because on pages 360 and 361 at various points D&F seem to indicate or imply at least that the elements of the free module are of the form:

\(\displaystyle \sum s_in_i = s_1n_1 + s_2n_2 + ... \ ... + s_kn_k \)

Another issue on which I would like a comment is that in the above I have not explicitly discussed the free nature of the module being constructed - basically because I think that all \(\displaystyle \mathbb{Z} \)-modules are free. Is that correct?

I would welcome some help.

Peter
 
  • #2
Your equation (1) is incorrect, we cannot sum the pairs like you have shown. In a free module, with basis elements $m_1 = (s,n_1),m_2 = (s,n_2)$ we cannot say, for example, that:

$m_1 + m_2 = m_3 = (s,n_1+n_2)$

because that contradicts the $\Bbb Z$-linear independence of $m_1,m_2,m_3$ since we have:

$1m_1 + 1m_2 + (-1)m_3 = 0$.

I think you are underestimating the complexity of a free abelian group. A better way of seeing such a group is as "words" in an alphabet:

$A^{k_1}B^{k_2}C^{k_3}\dots$

where the number of distinct letters in each word is finite and the $k_j$ can be any integer. If our "alphabet" is the set $B$, the free abelian group generated by $B$ is provably isomorphic to:

$\Bbb Z^{|B|}$ if $B$ is finite

$\Bbb Z^{(B)}$ if $B$ is infinite (here this notation refers to the set of functions:

$f: B \to \Bbb Z$ with finite support.).

Your "examples" of typical elements ARE correct, insofar as they go. You just need to remember that when constructing $F(S \times N)$ that $S \times N$ is just being used as a set, nothing more (forget, for the time being, it has any structure at all). The only thing we can combine is the "integers in front" when they precede the "same basis element".

It is absolutely UNTRUE that all abelian groups are free. For example, the free abelian group generated by a singleton set is isomorphic to $\Bbb Z$, but not ALL abelian groups with a single generator are free: for example in $\Bbb Z_4$ we have:

$4(1) = 0$

which implies $\{1\}$ is not a basis, since it fails to be $\Bbb Z$-linearly independent.

In other words there is no UNIQUE integer $k$ such that $k(1) = a$ for any $a \in \Bbb Z_4$, there are MANY such integers.

In general, any $\Bbb Z$-module that has torsion elements is NOT FREE (the torsion elements introduce "relations amongst members" that are above and beyond just the abelian group axioms).

Let me give you a simple idea of how we build a free abelian group from a 2-element set. Let's call this set $\{x,y\}$.

We define multiplication on this set to be concatenation, that is:

$x\ast x = xx$
$x \ast y = xy$
$y \ast x = yx$ etc.

so $(xxyxyxy)\ast(yxxy) = xxyxyxyyxxy$.

This is clearly associative. So far, all we have is a semigroup. We "invent" a multiplicative identity, by declaring it to be "the empty word" (often represented by a "blank space"...but be careful, it doesn't take up any room!).

Now we have a monoid.

We still need inverses, so we take a SECOND copy of $\{x,y\}$ and declare them to be inverses, by putting "primes" on them:

$\{x',y'\}$.

Now we introduce a relation on our (free) monoid by:

$xx' =\ $ (the empty word),

$x'x = \ $

$yy' = \ $

$y'y = \ $

So now a typical word may look like:

$xxy'x'yy$

If we multiply this word by:

$y'y'xxy$, we get:

$xxy'x'yyy'y'xxy = xxy'x'yy'xxy = xxy'x'xxy = xxy'xy$

We call this a REDUCED word (we've canceled all the inverses we can).
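
Here is a rough computational sketch of that reduction step (in Python; not from D&F, just an illustration): words are lists of symbols from $\{x, x', y, y'\}$, and adjacent inverse pairs are cancelled with a stack.

[CODE]
# Rough sketch (not from D&F): reduce a word over {x, x', y, y'} by cancelling
# adjacent inverse pairs such as x x' or y' y, using a stack.

INVERSE = {"x": "x'", "x'": "x", "y": "y'", "y'": "y"}

def reduce_word(word):
    """Return the reduced form of a word given as a list of symbols."""
    stack = []
    for symbol in word:
        if stack and stack[-1] == INVERSE[symbol]:
            stack.pop()              # cancel, e.g. ... x x' ... becomes ...
        else:
            stack.append(symbol)
    return stack                     # [] plays the role of the empty word

# The example above: xxy'x'yy times y'y'xxy reduces to xxy'xy
print(reduce_word(["x", "x", "y'", "x'", "y", "y", "y'", "y'", "x", "x", "y"]))
# -> ['x', 'x', "y'", 'x', 'y']
[/CODE]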

Finally, we take the QUOTIENT of this group modulo the (normal) subgroup generated by all words of the form:

$aba'b'$ for any strings (words) $a,b$ (these are called commutators).

Now by the group axioms, we have:

$xyx'y' = \ $
$(xyx'y')y = y$
$xyx'(yy') = y$
$xyx' = y$ (reducing the word on the left)
$(xyx')x = yx$
$(xy)(x'x) = yx$
$xy = yx$ (perhaps this explains why these are called commutators)

and it is (hopefully) not hard to see that any word in $\{x,x',y,y'\}$ can first be reduced and then the above relation $xy = yx$ used to "move all the $x$'s to the front" giving a word:

$x^ky^m$ (where $k,m \in \Bbb Z$)

(here $x^k = xx\cdots x$ ($k$ times) if $k > 0$,
$x^0 = \ $ (the empty word), and
$x^{-k} = x'x'\cdots x'$ ($k$ times) if $k > 0$),

after a last reduction.

In fact, the "rules of exponents" (which hold for ANY group, not just free ones) mean we can regard this element as:

$(k,m) \in \Bbb Z^2$ (so we didn't even need $x$ and $y$ after all!).
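
A tiny sketch of that last identification (my own illustration, not D&F's): once $xy = yx$, an element $x^ky^m$ is determined by its exponent pair, and the group operation is just addition of exponents.

[CODE]
# Sketch: the free abelian group on {x, y} stored as exponent pairs (k, m),
# i.e. x^k y^m.  Multiplying words just adds exponents.

def multiply(a, b):
    (k1, m1), (k2, m2) = a, b
    return (k1 + k2, m1 + m2)

def inverse(a):
    k, m = a
    return (-k, -m)                  # x^-k y^-m

identity = (0, 0)                    # the empty word

print(multiply((2, -1), (-3, 4)))            # (-1, 3), i.e. x^-1 y^3
print(multiply((2, -1), inverse((2, -1))))   # (0, 0), i.e. the empty word
[/CODE]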

By way of contrast, a given abelian group with two generators need not be infinite nor free, the standard example being the abelian group

$G = \langle a,b \mid a^2 = b^2 = e\rangle,$

which is isomorphic to the quotient

$FA(x,y)/N$, where $N$ is the subgroup generated by $\{xx,yy\}$.

Convince yourself that this quotient has only 4 elements, and that:

$\phi:FA(x,y) \to G$ given by:

$\phi(x) = a$
$\phi(y) = b$ is indeed a group homomorphism (in fact, what D&F try to impress upon you is that this group homomorphism is UNIQUE (up to another group isomorphism, of course)).

What is $\text{ker}(\phi)$?
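
A sketch of the quotient (again just an illustration, not part of the exercise): writing elements of $FA(x,y)$ as exponent pairs $(k,m)$, passing to $FA(x,y)/N$ amounts to reducing both exponents mod 2, which makes it easy to see that there are exactly 4 cosets.

[CODE]
# Sketch: the quotient FA(x, y)/N, with N generated by xx and yy.
# In exponent coordinates, x^k y^m lands in the coset given by (k mod 2, m mod 2).

def project(a):
    k, m = a
    return (k % 2, m % 2)

cosets = {project((k, m)) for k in range(-3, 4) for m in range(-3, 4)}
print(sorted(cosets))    # [(0, 0), (0, 1), (1, 0), (1, 1)] -- four elements
[/CODE]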
 
  • #3
Deveno,

Thanks so much for the help ... pity textbook writers cannot be so clear ...

But just one issue ...

You write:

"Your equation (1) is incorrect, we cannot sum the pairs like you have shown. In a free module, with basis elements $m_1 = (s,n_1),m_2 = (s,n_2)$ we cannot say, for example, that:

$m_1 + m_2 = m_3 = (s,n_1+n_2)$

because that contradicts the $\Bbb Z$-linear independence of $m_1,m_2,m_3$ since we have:

$1m_1 + 1m_2 + (-1)m_3 = 0$."

... OK, yes ... but we do require for a \(\displaystyle \mathbb{Z} \)-module that \(\displaystyle M = S \times N \) is an abelian group under addition, which we can notate as + ...

So then according to what you have written (if I am following you correctly) we have that:

\(\displaystyle (s_1, n_1) + (s_2, n_2) \) just remains as \(\displaystyle (s_1, n_1) + (s_2, n_2) \)

... we cannot combine different elements in any way ... we can only say things like:

\(\displaystyle (s_1, n_1) + (s_1, n_1) = 2(s_1, n_1) \) ... but this is really the multiplication of the \(\displaystyle \mathbb{Z} \)-module!

... and of course we have that

\(\displaystyle (s_1, n_1) + (s_2, n_2) = (s_2, n_2) + (s_1, n_1) \) since + is commutative ...

Is the above correct?

But we need associativity of +, inverses and identity for the abelian group \(\displaystyle M = S \times N \).

For associativity we require:

\(\displaystyle [(s_1, n_1) + (s_2, n_2)] + (s_3, n_3) = (s_1, n_1) + [(s_2, n_2) + (s_3, n_3)] \)

... but since we cannot combine different elements in any way, both these expressions above are just equal to \(\displaystyle (s_1, n_1) + (s_2, n_2) + (s_3, n_3) \)

For inverses, we would have (I think) that \(\displaystyle (-s_1, -n_1) \) would be the inverse of \(\displaystyle (s_1, n_1) \) since:

\(\displaystyle (s_1, n_1) + (-s_1, -n_1) = (s_1, n_1) + (-1)(s_1, n_1) = (s_1, n_1) - (s_1, n_1) = 0 \)

Now for the identity of the abelian group \(\displaystyle M = S \times N \) ... ... ? ... ... we could have (0,0) as the identity, since we could (? could we ?) say:

\(\displaystyle (s_1, n_1) + (0,0) = (s_1, n_1) \)

BUT, it looks a bit like what we are saying is as follows:

\(\displaystyle (s_1, n_1) + (0,0) = (s_1 + 0 , n_1 + 0 ) \)

and this seems to contradict what you have said about the free nature of the free \(\displaystyle \mathbb{Z} \)-module - especially that I should forget that the set \(\displaystyle S \times N \) has any structure?

Can you clarify?

Peter
 
  • #4
It is perhaps more profitable to think of the identity element as:

$0(s_1,n_1)$ for any pair $s_1,n_1$.

This does not violate $\Bbb Z$-linearity, as the $\Bbb Z$-coefficient is 0.

"0" is "special": the identity laws give it properties that do not hold, in general, for other elements. It doesn't HAVE an associated basis element, just like the origin in the plane doesn't have any "length" (it points in all directions simultaneously, or, if you like, it doesn't point in ANY direction).

It also does not make much sense to say:

$-(s,n) = (-s,-n)$.

$(s,n)$ and $(-s,-n)$ are two different basis elements, we can't "combine" them to get $(0,0)$, except by taking the 0-linear combination.

You are still trying to "simplify" a free object in terms of other objects you understand already. This is backwards: other objects you know already are simplifications of free objects, which are, by and large, beastly and unwieldy things.
 
  • #5
Think of $F(S\times N)$, the free abelian group generated by $S\times N$, as
$$ F = \bigg\{ f : S\times N \to \mathbb{Z} ~ \bigg| ~ f(s,n) = 0 \text{ for all but finitely many }(s,n) \in S\times N \bigg\} $$

Pick $(s_0,n_0) \in S\times N$. Consider the following two elements of $F$,
(i) $f:S\times N \to \mathbb{Z}$ defined by $f(s_0,n_0) = 1$ and $f(s,n) = 0$ for all $(s,n) \not = (s_0,n_0)$
(ii) $g:S\times N \to \mathbb{Z}$ defined by $g(-s_0,-n_0) = 1$ and $g(s,n) = 0$ for all $(s,n)\not = (-s_0,-n_0)$

Is it true that $f+g = 0$, here "$0$" being the zero-function in $F$? No.

Now, as a matter of simplified notation we denote the function $f$ by simply writing $1\cdot (s_0,n_0)$. All that means is that $f$ takes value $1$ on $(s_0,n_0)$, and is zero everywhere else. In the same way we denote $g$ by simply writing $1\cdot (-s_0,-n_0)$. All that means is that $g$ takes value $1$ on $(-s_0,-n_0)$, and is zero everywhere else. But $f+g \not = 0$, so in this simplified notation, we are saying that $(s_0,n_0) + (-s_0,-n_0) \not = 0$.

What is $f+g$? It is rather the function $h\in F$ so that $h = 1$ on $(s_0,n_0)$ and $(-s_0,-n_0)$ and it is zero everywhere else. The simplified notation for $h$ is, $1\cdot (s_0,n_0) + 1\cdot (-s_0,-n_0)$.

You have to stop combining the sums together as if this were a direct product construction. You leave them alone and do not combine them in any way.
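
For what it is worth, here is a minimal computational sketch of that behaviour (my own illustration; the labels "s0", "n0" and so on are made up): an element of $F$ is stored as a dictionary sending basis pairs to their nonzero integer coefficients, and addition only ever combines coefficients attached to the SAME pair.

[CODE]
# Minimal sketch (labels like "s0", "n0" are just placeholders): an element of
# F(S x N) is a dict {basis pair: nonzero integer coefficient}, i.e. a function
# S x N -> Z with finite support.

def add(f, g):
    """Add two formal sums: coefficients combine only on equal basis pairs."""
    h = dict(f)
    for pair, coeff in g.items():
        h[pair] = h.get(pair, 0) + coeff
        if h[pair] == 0:
            del h[pair]              # drop zero coefficients to keep finite support
    return h

f = {("s0", "n0"): 1}                # the element written 1*(s0, n0)
g = {("-s0", "-n0"): 1}              # the element written 1*(-s0, -n0)

print(add(f, g))   # {('s0', 'n0'): 1, ('-s0', '-n0'): 1} -- NOT the zero function
print(add(f, {("s0", "n0"): -1}))    # {} -- this IS the zero function
[/CODE]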
 
  • #6
Now I see the beauty in what Keith Conrad did in his notes.

He replaced the basis elements with:

$\delta_{(s,n)}$ instead of $(s,n)$

to deliberately "break free" of the notion that $(s,n)$ was our "usual" coordinate pair.

Obviously, he is intending $\delta_{(s,n)}$ to be the function:

$\delta_{(s,n)}: S \times N \to R$ given by:

$\delta_{(s,n)}(s',n') = \left\{ \begin{array}{l}1,\text{ if }(s,n) = (s',n')\\0,\text{ otherwise } \end{array}\right.$

Pretty cool.
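
In code, $\delta_{(s,n)}$ is just an indicator-function factory; a throwaway sketch (again with made-up labels):

[CODE]
# Throwaway sketch: delta_{(s, n)} as an indicator function on S x N.
def delta(s, n):
    return lambda s2, n2: 1 if (s2, n2) == (s, n) else 0

d = delta("s0", "n0")
print(d("s0", "n0"), d("s1", "n1"))   # 1 0
[/CODE]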
 
  • #7

Deveno, ThePerfectHacker,

Thanks so much for your extensive guidance on tensor products and the free-group underpinnings of them ... still reflecting on your posts ...

The characterization of \(\displaystyle F ( S \times N ) \) as

\(\displaystyle F = \bigg\{ f : S\times N \to \mathbb{Z} ~ \bigg| ~ f(s,n) = 0 \text{ for all but finitely many }(s,n) \in S\times N \bigg\} \)

was particularly helpful ...

... ... still thinking over the identity and inverses in F as an additive abelian group ...

Would a suitable definition of the identity i be as follows:

\(\displaystyle i: \ S \times N \to \mathbb{Z} \ \text{defined by} \ i(s,n) = 0 \ \text{for all} \ (s,n) \in S \times N \)

Is that OK ... can you confirm?


Regarding inverses ... if

\(\displaystyle f: \ S \times N \to \mathbb{Z} \ \text{ defined by } \ f(s_0,n_0) = 1 \ \text{ and } \ f(s,n) = 0 \ \text{ for all } \ (s,n) \in S \times N \ \text{ with } \ (s,n) \ne (s_0,n_0) \)

would the inverse of f, notation \(\displaystyle f^{-1} \) be as follows:

\(\displaystyle f^{-1}: \ S \times N \to \mathbb{Z} \ \text{ defined by } \ f^{-1}(s_0,n_0) = -1 \ \text{ and } \ f^{-1}(s,n) = 0 \ \text{ for all } \ (s,n) \in S \times N \ \text{ with } \ (s,n) \ne (s_0,n_0) \)

Then

\(\displaystyle (f + f^{-1})(s_0, n_0) = 1 + (-1) = 0 \), and \(\displaystyle (f + f^{-1})(s,n) = 0 \) elsewhere, so

\(\displaystyle f + f^{-1} = 0\)

Can you confirm that the above is OK (or otherwise) ?

Peter
PS I still, when reasoning about the elements of F(S x N) in tensor products, find it necessary to think of the elements of F as \(\displaystyle (s_0 ,n_0) \) even though \(\displaystyle (s_0 ,n_0) \) is really f where

\(\displaystyle f: \ S \times N \to \mathbb{Z} \ \text{ defined by } \ f(s_0,n_0) = 1 \ \text{ and } \ f(s,n) = 0 \ \text{ for all } \ (s,n) \in S \times N \ \text{ with } \ (s,n) \ne (s_0,n_0) \)

and \(\displaystyle -(s_0 ,n_0) \) is really, say, \(\displaystyle f^{-1} \) where

\(\displaystyle f^{-1}: \ S \times N \to \mathbb{Z} \ \text{ defined by } \ f^{-1}(s_0,n_0) = -1 \ \text{ and } \ f^{-1}(s,n) = 0 \ \text{ for all } \ (s,n) \in S \times N \ \text{ with } \ (s,n) \ne (s_0,n_0) \)

Mind you, if instead of the functions you think of the simplified notation, say writing \(\displaystyle f = 1 \cdot (s_0, n_0) \) simply as \(\displaystyle (s_0, n_0) \), then that is pretty reasonable.
 
  • #8
You are correct about the additive identity for $F(S\times N)$, defined by,
$$ f:S\times N\to \mathbb{Z} \text{ by }f(s,n) = 0 \text{ for all }(s,n)\in S\times N$$

You are not exactly correct about the additive inverse. What you wrote down is the additive inverse for the function $f(s_0,n_0) = 1$ and $0$ elsewhere. Then $f^{-1}$ would be the function such that $f^{-1}(s_0,n_0) = -1$ and $0$ elsewhere.

However, not all elements of $F(S\times N)$ are of this form. For example, consider $g:S\times N \to \mathbb{Z}$ defined by $g(s_0,n_0) = 1$ and $g(s_1,n_1) = 1$ (and $g = 0$ elsewhere), where $(s_0,n_0)$ and $(s_1,n_1)$ are distinct elements of $S\times N$. So what you wrote down is only a very special inverse.

More generally, the inverse is given by $f^{-1}:S\times N \to \mathbb{Z}$, $f^{-1}(x) = -f(x)$. It is easy to verify that this function is zero for all but finitely many inputs (do you see why?) and furthermore $f^{-1}(x) + f(x) = 0$.
 
  • #9
Thanks ThePerfectHacker! ... you have given me some further insights into the construction of the tensor product, especially into the nature of the abelian group (in fact the \(\displaystyle \mathbb{Z} \)-module) constructed on - indeed, I should say, generated by the set \(\displaystyle S \times N \).

We have:

\(\displaystyle F = \bigg\{ f : S\times N \to \mathbb{Z} ~ \bigg| ~ f(s,n) = 0 \text{ for all but finitely many }(s,n) \in S\times N \bigg\} \)

I will now indicate what I believe follows from what you have said ... please correct me if I am not making sense ...

I will move from some definitions of basic elements to some elements generated from them ... as follows:

Consider the element:

\(\displaystyle f_{(s_0, n_0)} \ : \ S \times N \to \mathbb{Z} \text{ defined by } f_{(s_0, n_0)} (s_0, n_0) =1 \text{ and } f_{(s_0, n_0)} (s, n) = 0 \text{ for all } (s,n) \ne (s_0, n_0) \)

However \(\displaystyle f_{(s_0, n_0)} \) is a generator for other elements. For example, for \(\displaystyle r_0 \in \mathbb{Z} \) we have

\(\displaystyle f_{ r_0(s_0, n_0)} = r_0 f_{(s_0, n_0)} \).

At \(\displaystyle (s_0, n_0) \), the value of \(\displaystyle f_{ r_0(s_0, n_0)} \) is computed as follows:

\(\displaystyle f_{ r_0(s_0, n_0)}(s_0, n_0) = r_0 f_{(s_0, n_0)}(s_0, n_0) = f_{(s_0, n_0)}(s_0, n_0) + f_{(s_0, n_0)}(s_0, n_0) + \ ... \ + f_{(s_0, n_0)}(s_0, n_0) \ \ (r_0 \text{ times}) \)

\(\displaystyle = 1 + 1 + \ ... \ + 1 = r_0 \), while \(\displaystyle f_{ r_0(s_0, n_0)} \) is zero elsewhere.

Similarly we can define \(\displaystyle f_{ r_1(s_1, n_1)}, f_{ r_2(s_2, n_2)}, ... \ ..., f_{ r_k(s_k, n_k)} \)

Then we can generate elements of the free \(\displaystyle \mathbb{Z} \)-module as follows:

\(\displaystyle f = f_{ r_0(s_0, n_0)} + f_{ r_1(s_1, n_1)} + f_{ r_2(s_2, n_2)} + ... \ ... + f_{ r_k(s_k, n_k)} \)

\(\displaystyle = r_0 f_{(s_0, n_0)} + r_1f_{(s_1, n_1)} + r_2 f_{(s_2, n_2)} + ... \ ... + r_k f_{(s_k, n_k)} \)

The above function \(\displaystyle f \in F \) is \(\displaystyle r_0 \text{ at } (s_0, n_0) \), \(\displaystyle r_1 \text{ at } (s_1, n_1) \), \(\displaystyle r_2 \text{ at } (s_2, n_2) \), ... ... and \(\displaystyle r_k \text{ at } (s_k, n_k) \) (and \(\displaystyle 0 \) elsewhere).

Thus presumably (?) we can identify f with the "sum" \(\displaystyle r_0(s_0, n_0) + r_1(s_1, n_1) + r_2(s_2, n_2) + ... \ ... + r_k(s_k, n_k) \)

Does the above reasoning make sense?

Is this the correct way to construct the free \(\displaystyle \mathbb{Z} \)-module on the set \(\displaystyle S \times N\)?

Can someone please help?

Peter

PS I am really unsure of identifying f with the "sum" \(\displaystyle r_0(s_0, n_0) + r_1(s_1, n_1) + r_2(s_2, n_2) + ... \ ... + r_k(s_k, n_k) \) since I think I am assuming some isomorphism ... but I cannot make sense of Dummit and Foote any other way ...
 
  • #10
Yes. That is all correct.

All that $(s_1,n_1)$ means here is shorthand notation for the function that takes the value $1$ at $(s_1,n_1)$ and $0$ everywhere else.

It is also easy to check that the inverse of $a_1(s_1,n_1) + ... + a_k(s_k,n_k)$ is $-a_1(s_1,n_1) - ... - a_k(s_k,n_k)$.

This is what "all formal sums" means. Basically, $F(S\times N)$ consists of all such expressions: sums of integer multiples of pairs. But they are "formal sums"; you just leave them alone, you do not add componentwise.
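
Continuing the dictionary picture from post #5 (just a sketch, with made-up labels): a formal sum is simply the printed form of such a dictionary, and the inverse negates every coefficient.

[CODE]
# Sketch: print a finitely-supported dict as a "formal sum", and negate it.

def as_formal_sum(f):
    return " + ".join(f"{coeff}*({s}, {n})" for (s, n), coeff in f.items())

def negate(f):
    return {pair: -coeff for pair, coeff in f.items()}

f = {("s1", "n1"): 3, ("s2", "n2"): 8}
print(as_formal_sum(f))            # 3*(s1, n1) + 8*(s2, n2)
print(as_formal_sum(negate(f)))    # -3*(s1, n1) + -8*(s2, n2)
[/CODE]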
 
  • #11
In fact, some texts actually view vector spaces this way (vector spaces are $F$-modules, and every $F$-module is free...the proof of this is a bit abstruse for arbitrary vector spaces, but for finitely-generated ones (that is, vector spaces of finite dimension) it is straightforward).

It follows that we can make a similar tensor product of two vector spaces over the underlying field of both:

$V \otimes_F W$

which is a construction that is VERY useful in differential geometry.

So, if you finally "get" how this works for modules over a ring, the vector space analogues will come very easy to you (they are a bit easier to comprehend, since fields are very "nice" commutative rings).

To get an idea of how "big" a tensor product is compared to the "factors" the vector space version is useful, because vector spaces have a natural indication of size: their dimension. If $V,W$ are finite-dimensional, then:

$\dim_F(V\otimes_F W) = \dim_F(V) \cdot \dim_F(W)$

Note the analogy with:

$\dim_F(V\oplus W) = \dim_F(V) + \dim_F(W)$
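
As a quick sanity check of the two formulas (just a counting sketch, with hypothetical basis labels):

[CODE]
# Sketch: if dim V = 2 and dim W = 3, the direct sum has a basis of size 2 + 3
# while the tensor product has a basis {v_i (x) w_j} of size 2 * 3.
from itertools import product

V_basis = ["v1", "v2"]
W_basis = ["w1", "w2", "w3"]

direct_sum_basis = V_basis + W_basis
tensor_basis = [f"{v}(x){w}" for v, w in product(V_basis, W_basis)]

print(len(direct_sum_basis), len(tensor_basis))   # 5 6
[/CODE]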

Tensor products are basically a way to "multiply modules" like direct sums are a way to "add modules". In ordinary ring-multiplication, bilinearity is just "the distributive laws", together with associativity and commutativity.

This means that given a suitable set of $R$-modules, we can think of this set itself as a ring, which turns out to be sort of handy.
 

FAQ: Tensor Products - The free Z-module construction

1. What is a tensor product?

The tensor product is a construction that takes two modules (for example, two vector spaces) over a ring and produces a new module, $M \otimes_R N$. Its defining property is that bilinear maps on $M \times N$ correspond to linear maps on $M \otimes_R N$.

2. How is a tensor product different from a regular product?

A tensor product differs from the direct (componentwise) product: its elements are formal sums of pairs $m \otimes n$, and the only relations imposed are the bilinearity relations, so pairs are not added coordinate-by-coordinate.

3. What is the purpose of using a tensor product in mathematics?

Tensor products are used to extend scalars (for example, turning an abelian group into a module over a larger ring) and to reduce the study of bilinear maps to the study of linear maps. As noted in the thread, they give a way to "multiply" modules, just as direct sums give a way to "add" them.

4. How is the free Z-module construction related to tensor products?

The free $\mathbb{Z}$-module on a set such as $S \times N$ is the first step in Dummit and Foote's construction of the tensor product: one forms all formal $\mathbb{Z}$-linear combinations of the pairs $(s,n)$, and the tensor product is then obtained as a quotient of this free module by the subgroup encoding the bilinearity relations.

5. What are some applications of tensor products in science and engineering?

Tensor products have many applications in science and engineering, including in quantum mechanics, electromagnetism, and signal processing. They are also used in areas such as computer graphics and machine learning.
