Constructing Norms on Tensor Products of Finite Dimensional Vector Spaces

In summary, the tensor product space V\otimes W can be equipped with a norm \|\cdot\|_\otimes such that for pure tensors it holds that \|v\otimes w\|_\otimes=\|v\|_V\|w\|_W; such norms are called crossnorms. One explicit construction, when V and W carry 1-norms, is to take the sum of the absolute values of the coefficients of an element with respect to the product basis. A more general construction using the singular value decomposition and a convex function was proposed in the thread, but it is doubtful that it always yields a norm. The projective and injective tensor norms are further options, though they are less intuitive and less explicit.
  • #1
Pere Callahan
I was wondering about useful norms on tensor products of finite dimensional vector spaces.

Let V,W be two such vector spaces with bases [itex]\{v_1,\ldots,v_{d_1}\}[/itex] and [itex]\{w_1,\ldots,w_{d_2}\}[/itex]. We further assume that V and W are equipped with norms [itex]\|\cdot\|_V[/itex] and [itex]\|\cdot\|_W[/itex], respectively.

Then the tensor product space [itex]V\otimes W[/itex] is the vector space with basis [itex]\{v_i\otimes w_j:1\leq i\leq d_1, 1\leq j\leq d_2\}[/itex].

I have read a lot about norming the tensor product of two Banach spaces, and it seems a lot of different choices can be made there. For the finite-dimensional case I would be interested to know how one defines a norm [itex]\|\cdot\|_\otimes[/itex] on [itex]V\otimes W[/itex] such that for pure tensors it holds that [itex]\|v\otimes w\|_\otimes=\|v\|_V\|w\|_W[/itex]. Such norms are called crossnorms in the Banach space context (or so it seems), but I have not seen a construction of such a norm even for the finite-dimensional case.

This is not homework.

Thanks,
Pere

EDIT:

It appears that if [itex]\|\sum_{i=1}^{d_1}{x_iv_i}\|_V[/itex] is defined as [itex]\sum_{i=1}^{d_1}{|x_i|}[/itex], and similarly for W, then one could define
[tex]
\|\sum_{i,j}\gamma_{ij}v_i\otimes w_j\|_\otimes = \sum_{i,j}{|\gamma_{ij}|}.
[/tex]
This would be a norm and would satisfy the crossnorm condition because
[tex]
\left\|v\otimes w\right\|_\otimes=\left\|\left(\sum_{i=1}^{d_1}x_iv_i\right)\otimes\left(\sum_{j=1}^{d_2}y_jw_j\right)\right\|_\otimes = \left\|\sum_{ij}x_iy_j v_i\otimes w_j\right\|_\otimes = \sum_{ij}|x_i||y_j|=\|v\|_V\|w\|_W.
[/tex]
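
As a quick sanity check, here is a minimal numerical sketch of the identity above (just NumPy with made-up coordinates; the variable names are mine): the coefficient matrix of a pure tensor is the outer product of the coordinate vectors, and its entrywise 1-norm factors accordingly.

[code]
# Minimal sketch: check the l1 crossnorm identity above numerically.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(3)   # coordinates of v in the basis {v_i}
y = rng.standard_normal(4)   # coordinates of w in the basis {w_j}

gamma = np.outer(x, y)       # gamma_ij = x_i * y_j, coefficients of v (x) w

lhs = np.abs(gamma).sum()                # ||v (x) w||_otimes
rhs = np.abs(x).sum() * np.abs(y).sum()  # ||v||_V * ||w||_W
assert np.isclose(lhs, rhs)
[/code]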

However, there should be a more general construction for arbitrary norms on V and W.
 
  • #2
This is late so I may be wrong, but here's what I think.

In the general case, you can take

[tex]\sum_{i,j}\gamma_{ij}v_i\otimes w_j[/tex]

and apply singular value decomposition to [itex]\gamma[/itex], thus rewriting your element in the form

[tex] \sum_i \gamma_i v'_i \otimes w'_i[/tex]

such that

[tex]\|v'_i\|_V = \|w'_i\|_W = 1[/tex]

and then use any convex function [itex]F(\gamma_1,\ldots,\gamma_r)[/itex] such that [itex]F(1,0,\ldots,0)=1[/itex], ..., [itex]F(\alpha\gamma) = |\alpha|\,F(\gamma)[/itex] to define

[tex]
\|\sum_{i,j}\gamma_{ij}v_i\otimes w_j\|_\otimes = F(\gamma_1,\ldots,\gamma_r).
[/tex]

The crossnorm condition is satisfied by construction. It is a bit tricky to prove that it is a norm; the triangle inequality looks particularly challenging... Maybe try the special cases [itex]F(\gamma) = \sum_i |\gamma_i|[/itex] and [itex]F(\gamma) = \sqrt{\sum_i \gamma_i^2}[/itex].
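
To make the decomposition step concrete, here is a sketch of it in NumPy, under the simplifying assumption that the coordinates refer to orthonormal bases, so "unit" means Euclidean unit; for other norms one would rescale each [itex]v'_k,w'_k[/itex] to unit length and absorb the factors into [itex]\gamma_k[/itex].

[code]
# Sketch (Euclidean norms assumed): rewrite sum_ij gamma_ij v_i (x) w_j
# as sum_k gamma_k v'_k (x) w'_k via the SVD of the coefficient matrix.
import numpy as np

rng = np.random.default_rng(1)
gamma = rng.standard_normal((3, 4))  # coefficients of a general element

# gamma = U diag(s) Vt, so gamma_ij = sum_k s_k U[i,k] Vt[k,j]; the element
# becomes sum_k s_k (sum_i U[i,k] v_i) (x) (sum_j Vt[k,j] w_j).
U, s, Vt = np.linalg.svd(gamma, full_matrices=False)

rebuilt = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
assert np.allclose(gamma, rebuilt)

# The factors v'_k and w'_k have Euclidean norm 1 by construction.
assert np.allclose(np.linalg.norm(U, axis=0), 1.0)
assert np.allclose(np.linalg.norm(Vt, axis=1), 1.0)
[/code]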
 
  • #3
On further thought, I began to doubt that it's a norm for any, let alone all, F. Besides, for some norms on V and W the decomposition may not be unique, so F might not even be well defined.

Oh well, I don't know then...
 
  • #4
Thanks for your thoughts, hamster. I also don't see how the general construction with the SVD and a convex function F would give a norm. Well, the example I gave in my first post certainly extends to all p-norms on V and W, and maybe that's enough.
 
  • #5
Have you seen this?

http://en.wikipedia.org/wiki/Topological_tensor_product

There is a largest cross norm π called the projective cross norm, given by

[tex] \pi(x) = \inf\left\{ \sum_{i=1}^n \|a_i\|\,\|b_i\| : x = \sum_{i=1}^n a_i \otimes b_i \right\}[/tex]

where [itex]x \in A \otimes B.[/itex]
 
  • #6
Yes, I knew about the projective and injective tensor norms. I just found them not very intuitive and was hoping for a more explicit construction in the finite-dimensional case. But I think for my purposes the p-norms will do.
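
For what it's worth, in the special case where V and W carry Euclidean norms the two norms do become explicit: the projective crossnorm of an element is the nuclear norm (sum of singular values) of its coefficient matrix, and the injective crossnorm is the spectral norm (largest singular value). A minimal NumPy sketch of the crossnorm property in that case, with made-up data:

[code]
# Sketch (Euclidean norms on V and W): for a pure tensor the coefficient
# matrix has rank one, so its nuclear and spectral norms both equal
# ||v||_V * ||w||_W, illustrating the crossnorm property for both norms.
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(3)
y = rng.standard_normal(4)
gamma = np.outer(x, y)                      # pure tensor v (x) w

s = np.linalg.svd(gamma, compute_uv=False)  # singular values of gamma
pure = np.linalg.norm(x) * np.linalg.norm(y)

assert np.isclose(s.sum(), pure)  # projective (nuclear) norm
assert np.isclose(s.max(), pure)  # injective (spectral) norm
[/code]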
 

FAQ: Constructing Norms on Tensor Products of Finite Dimensional Vector Spaces

What is the tensor product of two vectors?

The tensor product of two vectors v ∈ V and w ∈ W is the pure (or simple) tensor v ⊗ w, an element of the tensor product space V ⊗ W. The operation ⊗ is bilinear and combines the two vectors into an element of a larger space, whose dimension is the product of the dimensions of V and W.

How is the norm on tensor product defined?

A crossnorm on the tensor product is a norm that satisfies ‖v ⊗ w‖ = ‖v‖·‖w‖ for pure tensors, i.e. the norm of a pure tensor is the product (not the maximum) of the norms of the factors. In general many such norms exist; the largest of them is the projective cross norm quoted above.

What is the purpose of the norm on tensor product?

A norm on the tensor product measures the size or length of an element of V ⊗ W, just as the norms on V and W do for their own elements. A crossnorm in particular is compatible with the norms on the factors, which makes it the natural notion of size when working with tensor products of normed spaces.

Can the norm on tensor product be negative?

No. Like any norm, a crossnorm is nonnegative, and it vanishes only on the zero tensor. It represents the magnitude or size of an element and therefore cannot be negative.

How is the norm on tensor product related to the concept of inner product?

If the norms on V and W come from inner products, then V ⊗ W carries an induced inner product, and the norm of an element is the square root of the inner product of the element with itself; this induced norm is a crossnorm. For norms that do not come from an inner product (such as the 1-norms used above), no such relationship holds.
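
For instance, a short worked derivation of that case: on pure tensors the induced inner product is defined by

[tex]
\langle v\otimes w,\, v'\otimes w'\rangle_\otimes = \langle v,v'\rangle_V\,\langle w,w'\rangle_W,
[/tex]

extended bilinearly, so that

[tex]
\|v\otimes w\|_\otimes=\sqrt{\langle v,v\rangle_V\,\langle w,w\rangle_W}=\|v\|_V\,\|w\|_W,
[/tex]

which is exactly the crossnorm condition.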
