Proof of Existence of Tensor Product: Cooperstein Theorem 10.1

  • #1
Math Amateur
I am reading Bruce N. Cooperstein's book: Advanced Linear Algebra (Second Edition) ... ...

I am focused on Section 10.1 Introduction to Tensor Products ... ...

I need help with another aspect of the proof of Theorem 10.1 regarding the existence of a tensor product ... ... The relevant part of Theorem 10.1 reads as follows:

https://www.physicsforums.com/attachments/5429
View attachment 5430
View attachment 5431
View attachment 5432

In the above text we read the following:

" ... ... Recall that \(\displaystyle V_1 \times \ ... \ \times V_m = X\) and that \(\displaystyle Z\) is a vector space based on \(\displaystyle X\). Since \(\displaystyle W\) is a vector space and \(\displaystyle f\) is a map from \(\displaystyle X\) to \(\displaystyle W\), by the universal property of \(\displaystyle Z\) there exists a unique linear transformation \(\displaystyle S \ : \ Z \longrightarrow W\) such that \(\displaystyle S\) restricted to \(\displaystyle X\) is \(\displaystyle f\). ... ..."


Now I have summarised the mappings involved in Theorem 10.1 in Figure 1 below ... ...

View attachment 5433

My question is as follows:

Why does \(\displaystyle Z\) have a universal mapping property ...? ... ... and indeed if \(\displaystyle Z\) has one, why doesn't \(\displaystyle V\) ... ... giving us the relationship \(\displaystyle T \gamma = f\) that we want ... what is special about \(\displaystyle Z\)?
Hope someone can help ...

Peter

*** NOTE ***

... ... oh no! ... ... I think I have just realized the answer to my question ... hmm ... embarrassingly simple ... ... I think that \(\displaystyle Z\) has a UMP because \(\displaystyle (Z, \iota )\) is assumed to be the vector space based on the set \(\displaystyle X\) ... and vector spaces based on a set have a UMP ... is that right? ... see Cooperstein Definition 10.1 on the first page of Section 10.1 provided below ...

Can someone confirm that this is the reason Z has a Universal Mapping Property ...

Peter

==========================================================

*** NOTE ***

It may help readers of the above post to be able to read Cooperstein's introduction to Section 10.1 where he covers, among other things, the notion of a vector space being based on a set and the idea of the universal mapping problem ... ... so I am providing this text as follows:

View attachment 5434
View attachment 5435
View attachment 5431
View attachment 5432
 
  • #2
The funny thing about this is that it is a 3-layered cake, of UMP built upon UMP built upon UMP.

Let's start with the top level.

A tensor product of vector spaces $V_1,\dots,V_n$ is a vector space $V$ along with a multilinear (that is, linear in each $v_i \in V_i$, or $n$-linear, if you prefer) map $\mu: V_1 \times \cdots \times V_n \to V$ such that:

If $W$ is ANY vector space, and $M: V_1 \times \cdots \times V_n \to W$ is ANY multilinear map, there exists a UNIQUE linear map:

$L: V \to W$ such that $L \circ \mu = M$.
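(A quick sanity check of my own, not from Cooperstein: take $n = 2$ and $V_1 = V_2 = F$. The map $\mu: F \times F \to F$, $\mu(a,b) = ab$, is bilinear, and any bilinear $M: F \times F \to W$ satisfies $M(a,b) = ab\,M(1,1)$, so the unique linear map with $L \circ \mu = M$ is $L(c) = c\,M(1,1)$. In other words, $F$ together with multiplication serves as $F \otimes F$.)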

This property characterizes the tensor product, up to isomorphism (or "up to iso" as the cool kids say), but it does not assure us that such a thing even exists. For example, I could say:

A "Shirlharvey" is a pregnant man, which defines what a Shirlharvey is, but does not tell us if there are actually any such entities. To do so, I have to exhibit one.

So, our task is now to define SOME vector space, and SOME multilinear map, so that we have at least ONE entity that satisfies the UMP (there might be many others, but all are isomorphic).

This takes us to the second (middle layer).

We begin by defining the free vector space defined on a set, $X$, which I will call $FV(X)$. This in turn, is defined by yet ANOTHER UMP.

The free vector space UMP says that we have a vector space $FV(X)$ along with an injective (set) mapping $\iota:X \to FV(X)$, such that if $U$ is ANY vector space, and $f: X \to U$ is any (set) function, there exists a unique linear map:

$T:FV(X) \to U$, with $T \circ \iota = f$.

Again, this UMP characterizes the free vector space "based on $X$" *up to iso*, but of course, now we need to show that THIS construction exists (that is, we can make "one example"). The existence of the tensor product will have to wait a bit.

Again, the specific example given here is $\mathcal{M}_{fin}(X)$, the set of functions $X \to F$ (where $F$ is the (fixed) underlying field we wish our scalars to live in) with finite support, which we can regard as linear combinations of the characteristic functions $\chi_x$ of the elements of $X$ (and these are in one-to-one correspondence with the elements of $X$). This construction involves a clever use of $1$ and $0$ carrying two meanings simultaneously: as field elements, and as "indicator values" for the characteristic functions.
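If it helps to see this in very concrete terms, here is a rough computational sketch of $\mathcal{M}_{fin}(X)$ (my own illustration, not Cooperstein's notation; the names chi, add, scale, extend are made up for this post). A finitely supported function $X \to F$ is just a finite table of nonzero coefficients, so we can model it as a Python dictionary, here over $F = \mathbb{Q}$:

from fractions import Fraction

def chi(x):
    # characteristic function of x, i.e. the basis vector iota(x) in FV(X)
    return {x: Fraction(1)}

def add(u, v):
    # pointwise sum; drop zero coefficients so the support stays finite
    out = dict(u)
    for x, c in v.items():
        s = out.get(x, Fraction(0)) + c
        if s == 0:
            out.pop(x, None)
        else:
            out[x] = s
    return out

def scale(a, u):
    # scalar multiple a*u
    return {x: a * c for x, c in u.items() if a * c != 0}

def extend(f):
    # the unique linear map T: FV(X) -> F with T(chi(x)) = f(x)
    # (target taken to be F itself, just to keep the sketch small)
    def T(u):
        return sum((c * f(x) for x, c in u.items()), Fraction(0))
    return T

# example: the formal linear combination 3*chi('a') - 2*chi('c')
v = add(scale(Fraction(3), chi('a')), scale(Fraction(-2), chi('c')))
T = extend(lambda x: Fraction(1))   # f sends every element of X to 1
print(T(v))                         # 3*1 + (-2)*1 = 1

This is only meant to make "finite support" and "linear extension" tangible; the actual proof never needs to compute anything.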

Essentially, this involves setting elements of $X$ as "formal basis elements" (which is why we use the functions $\chi_x$), and taking "formal linear combinations" of these. Note well, linear combinations are what vector spaces are all about.
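For instance (a small example of my own): if $X = \{a,b,c\}$, a typical element of $FV(X)$ is a formal combination such as $3\chi_a - 2\chi_c$, the map $\iota$ sends $x \mapsto \chi_x$, and for any set function $f: X \to U$ the linear map $T$ of the UMP is forced on the basis and extended linearly:

$T\left(\sum_{x \in X} a_x \chi_x\right) = \sum_{x \in X} a_x f(x)$,

which is exactly why $T$ exists and is unique.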

Now, granting that we "understand" what $FV(X)$ is, we use a PARTICULAR set $X$, namely:

$X = V_1 \times \cdots \times V_n$ (this is a BIG set, usually). Since $FV(V_1 \times \cdots \times V_n)$ is even bigger, we have wandered into the realm of "stop trying to imagine how big it is; you can't".

But, we're not there yet: $FV(V_1 \times \cdots \times V_n)$ is NOT the vector space we're looking for; what we are looking for is a QUOTIENT SPACE:

$[FV(V_1\times\cdots\times V_n)]/V_0$.

And, as luck would have it, quotient spaces are defined by yet ANOTHER UMP.

That is to say, a quotient space of a vector space $V$ by a subspace $U$ is a vector space $W$ along with a surjective linear map:

$q: V \to W$

such that if $W'$ is ANY vector space, and $h:V \to W'$ any linear map such that $h(U) = 0$, then there exists a UNIQUE linear map $g: W \to W'$ with $h = g\circ q$.

Again, this defines the quotient space $V/U$ "up to iso", but there is no guarantee such a thing even exists.

However, we can take:

$W = \{v + U: v \in V\}$, where $v + U = \{v + u: u \in U\}$

And define $(v + U) + (v' + U) = (v+v') + U$, and $a(v+U) = av + U$. It is routine that this defines a vector space, and moreover:

$q:v \mapsto v + U$ is thus a surjective linear map $V \to W = V/U$.
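(A tiny example of my own, just for orientation: take $V = F^2$ and $U = \{(a,0) : a \in F\}$. Then $(x,y) + U = (x',y') + U$ exactly when $y = y'$, so a coset is determined by its second coordinate, $V/U \cong F$, and $q$ is essentially "forget the first coordinate".)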

So now we verify that "this" $W$ satisfies our UMP.

Let $W'$ be a vector space, and $h: V \to W'$ be a linear map such that $h(u) = 0_{W'}$, for all $u \in U$. We define:

$g: V/U \to W'$ by:

$g(v + U) = h(v)$.

First, we must ensure $g$ depends only on the coset $v + U$, and not on some particular element $v$ in it. In other words, we need to show that if $v+U = v' + U$ then $h(v) = h(v')$.

Now if $v + U = v' + U$, we have $v + U - (v' + U) = 0_V + U = U$. But:

$v + U - (v' + U) = v - v' + U$. If this equals $U$, we must have $v - v' \in U$, that is: $v - v' = u$, so $v = v' + u$, for some $u \in U$.

Since $h$ is linear:

$h(v) = h(v'+u) = h(v') + h(u)$

Since, by stipulation, $h(u) = 0_{W'}$, this becomes:

$h(v) = h(v') + 0_{W'} = h(v')$, and so $g$ is "well-defined".

Thus $h(v) = g(v + U) = g(q(v)) = (g \circ q)(v)$, for any $v \in V$, so we conclude that indeed, $h = g \circ q$, and it should be clear $g$ is the ONLY linear map that does this.

For suppose we have $g' \circ q = h$, as well. Consider the map:

$(g' - g)$ defined by: $(g' - g)(v+U) = g'(v+U) - g(v+U)$.

Then $((g' - g) \circ q)(v) = g'(v+U) - g(v+U) = h(v) - h(v) = 0$.

Thus, since $q$ is surjective, for any $v+U \in V/U$ we have $g(v+U) = g'(v+U)$, that is: $g = g'$.

We say $q: V \to V/U$ is "universal among homomorphisms out of $V$ that kill $U$."

Now, we watch the dominoes fall...

Let $V_0$ be the subspace of $FV(V_1 \times \cdots \times V_n)$ generated by all elements of the form:

$(v_1,\dots,v_i+v'_i,\dots,v_n) - (v_1,\dots,v_i,\dots,v_n) - (v_1,\dots,v'_i,\dots,v_n)$ for each $i = 1,\dots,n$ and:

$(v_1,\dots,cv_i,\dots,v_n) - c(v_1,\dots,v_i,\dots,v_n)$ (again for each $i$).

Then the quotient map $q: FV(V_1\times\cdots \times V_n) \to [FV(V_1\times\cdots \times V_n)]/V_0$ sends all such elements above to the $0$-vector of the quotient space.

Hence the map $\mu = q \circ \iota : V_1 \times \cdots \times V_n \to [FV(V_1\times\cdots \times V_n)]/V_0$ is multilinear (by the way we defined $V_0$).
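Spelled out: the element $(v_1,\dots,v_i+v'_i,\dots,v_n) - (v_1,\dots,v_i,\dots,v_n) - (v_1,\dots,v'_i,\dots,v_n)$ lies in $V_0$, so applying $q$ gives

$\mu(v_1,\dots,v_i+v'_i,\dots,v_n) = \mu(v_1,\dots,v_i,\dots,v_n) + \mu(v_1,\dots,v'_i,\dots,v_n)$

in the quotient, i.e. $\mu$ is additive in the $i$-th slot; the second family of generators gives $\mu(v_1,\dots,cv_i,\dots,v_n) = c\,\mu(v_1,\dots,v_i,\dots,v_n)$ in the same way.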

Now it is trivial to verify that if $M:V_1 \times \cdots \times V_n \to W$ is multilinear, we can define:

$L:[FV(V_1\times\cdots \times V_n)]/V_0 \to W$ by:

$L((v_1,\dots,v_n) + V_0) = M(v_1,\dots,v_n)$

(Again, with the usual caveats, of showing $L$ is well-defined, linear and uniquely determined by $M$).
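To spell out how the dominoes fall (writing $\widehat{M}$ for a name of my own): given a multilinear $M: V_1 \times \cdots \times V_n \to W$, the free UMP first gives a unique linear $\widehat{M}: FV(V_1 \times \cdots \times V_n) \to W$ with $\widehat{M} \circ \iota = M$; multilinearity of $M$ says precisely that $\widehat{M}$ kills every generator of $V_0$, so $\widehat{M}(V_0) = 0$; and the quotient UMP then gives the unique linear $L$ with $\widehat{M} = L \circ q$. Hence

$L \circ \mu = L \circ q \circ \iota = \widehat{M} \circ \iota = M$,

which is exactly the tensor product UMP for $[FV(V_1\times\cdots\times V_n)]/V_0$ together with $\mu$.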
 
  • #3
Thanks so much for the post above, Deveno ... ... most instructive and helpful ... ... just working through it in detail now ... and reflecting ...

Just a simple preliminary question ... you write:

" ... ... This takes us to the second (middle layer).

We begin by defining the free vector space defined on a set, $X$, which I will call $FV(X)$. This in turn, is defined by yet ANOTHER UMP. ... ... "


Now Keating in his book "A First Course in Module Theory" defines a free module as a module that has a basis ... but clearly this definition does not work for vector spaces ... ... so what is a free vector space?

Hope you can help ...

Peter
 
  • #4
Peter said:
Thanks so much for the post above, Deveno ... ... most instructive and helpful ... ... just working through it in detail now ... and reflecting ...

Just a simple preliminary question ... you write:

" ... ... This takes us to the second (middle layer).

We begin by defining the free vector space defined on a set, $X$, which I will call $FV(X)$. This in turn, is defined by yet ANOTHER UMP. ... ... "


Now Keating in his book "A First Course in Module Theory" defines a free module as a module that has a basis ... but clearly this definition does not work for vector spaces ... ... so what is a free vector space?

Hope you can help ...

Peter

Sure it does: the truth is, considered as an $F$-module, EVERY VECTOR SPACE IS FREE. In fact, given any basis $B$ of a vector space $V$, we have $FV(B) = V$.

This is part of why linear algebra is so "clean".
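For instance (a quick illustration of my own): if $V = F^2$ and $B = \{e_1, e_2\}$ is the standard basis, the map $FV(B) \to V$ sending $a_1\chi_{e_1} + a_2\chi_{e_2} \mapsto a_1 e_1 + a_2 e_2$ is an isomorphism; it is just the statement that every vector has unique coordinates with respect to $B$.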
 
  • #5
Deveno said:
Sure it does: the truth is, considered as an $F$-module, EVERY VECTOR SPACE IS FREE. In fact, given any basis $B$, of a vector space $V$, we have $FV(B) = V$.

This is part of why linear algebra is so "clean".
Thanks Deveno ... ... BUT ... then when you wrote:

" ... ... We begin by defining the free vector space defined on a set, $X$, which I will call $FV(X)$. This in turn, is defined by yet ANOTHER UMP. ... ... "

I do not understand why you wrote about a "free" vector space at all ... it is just any vector space ... so why not write:

" ... ... We begin by defining a vector space V defined on a set, $X$, which I will call $V(X)$. ... ..."

... ... or something like that ... essentially dropping the descriptor "free" ...

Peter
 
  • #6
Peter said:
Thanks Deveno ... ... BUT ... then when you wrote:

" ... ... We begin by defining the free vector space defined on a set, $X$, which I will call $FV(X)$. This in turn, is defined by yet ANOTHER UMP. ... ... "

I do not understand why you wrote about a "free" vector space at all ... it is just any vector space ... so why not write:

" ... ... We begin by defining a vector space V defined on a set, $X$, which I will call $V(X)$. ... ..."

... ... or something like that ... essentially dropping the descriptor "free" ...

Peter

There are a couple of reasons. One is to distinguish it as a distinct item from $V_1 \times \cdots \times V_n$, which has a direct sum vector space structure. Another is to reinforce the analogy with other free constructions, such as free groups, free abelian groups, and free modules, all of which are defined in essentially the same way, by a universal mapping property. Finally, there is a difference in algebra between an "internal construction" (such as when we START with a vector space $V$, find a basis $B$, and then re-cast $V$ as $FV(B)$), and an "external construction", where we just start with the set $X$, and "add stuff to it" until we have enough parts to make a vector space.

"Freeness" generally indicates a completely general way to make a $Y$-structure out of $X$. Because of its construction, a tensor product is a free multilinear map.

Note that in this entire discussion, the emphasis is on mappings. Elements of the structures we discuss take a back-seat supporting role. In fact, elements of a set $X$ can be identified with the mappings $1 \to X$ (here "1" is the unique (up to a unique set bijection) set with 1 element). The idea is that, insofar as *algebraic* properties go, "we don't care" which elements we apply them to; we care about the properties themselves.

Of course, applied mathematicians, physicists, engineers, et al., would beg to differ; their work often depends on the "particular" nature of a certain situation, and the properties are only useful insofar as they help to produce a result. This "dual nature" of the structures investigated by mathematics is what keeps it from just being "general abstract nonsense".
 

FAQ: Proof of Existence of Tensor Product: Cooperstein Theorem 10.1

1. What is the Cooperstein Theorem 10.1 and why is it important in tensor product proofs?

The Cooperstein Theorem 10.1 is a theorem that proves the existence of the tensor product of a finite collection of vector spaces. It is important in tensor product proofs because it provides a rigorous and systematic way of constructing the tensor product, which is a fundamental concept in linear algebra and functional analysis.

2. How does the Cooperstein Theorem 10.1 differ from other theorems in tensor product proofs?

The Cooperstein Theorem 10.1 differs from other theorems in tensor product proofs because it is a constructive theorem, meaning it provides a specific method for constructing the tensor product. Other theorems may prove the existence of the tensor product without providing a specific construction method.

3. What are the main components of the proof of the Cooperstein Theorem 10.1?

The main components of the proof of the Cooperstein Theorem 10.1 are the universal property of the free vector space on a set, the universal property of quotient spaces, and the universal mapping property characterizing multilinear maps out of the tensor product: one forms the free vector space on the Cartesian product of the given spaces and passes to the quotient by the subspace that encodes multilinearity.

4. What are the key applications of the Cooperstein Theorem 10.1 in mathematics and other fields?

The Cooperstein Theorem 10.1 has various applications in mathematics, including in the study of multilinear algebra, functional analysis, and differential geometry. It also has applications in other fields such as physics and computer science, where the concept of tensor product is used extensively.

5. Are there any limitations or extensions of the Cooperstein Theorem 10.1?

While the Cooperstein Theorem 10.1 provides a powerful and widely applicable method for constructing tensor products, it is limited to vector spaces over fields. There have been extensions of this theorem to more general algebraic structures, such as modules over rings. Additionally, there are variations of this theorem that deal with infinite-dimensional vector spaces or non-commutative algebras.
