A problem in multilinear algebra

  • Thread starter: steenis
  • Tags: Algebra
In summary, the problem in multilinear algebra involves proving that the map ##g:W \rightarrow V^*## defined by ##g(w)(v)=L(w,v)## is an isomorphism, given that ##L## is a non-degenerate bilinear map. After showing that ##g## is obviously linear and injective, it remains to prove that it is surjective. This can be done by considering the map ##h:V \rightarrow W^*## defined in a similar manner, which is likewise injective. Together the two injections force ##\dim W=\dim V=\dim V^*##, so ##g## is indeed an isomorphism. This equality of dimensions, however, is not given a priori; it has to be deduced.
  • #1
steenis
TL;DR Summary
A one-to-one correspondence between non-degenerate bilinear maps and invertible linear maps
I have the following problem in multilinear algebra:
Let ##W## and ##V## be real finite-dimensional vector spaces, and let ##V^*## be the dual space of ##V##
Let ##L:W \times V \rightarrow \mathbb{R}## be a non-degenerate bilinear map
Define ##g:W \rightarrow V^*## by ##g(w)(v)=L(w,v)##
To prove: ##g## is an isomorphism

The map ##g## is obviously linear
##g## is injective, because if ##g(w)=0## for ##w \in W##, then for all ##v \in V## we have that ##g(w)(v)=L(w,v)=0##, so, because ##L## is non-degenerate, we have ##w=0##
It remains to prove that ##g## is surjective

If ##\dim W=\dim V=n##, then we are done, because in that case ##\dim W=\dim V^*=n##, and an injective linear map between spaces of the same finite dimension is automatically surjective. Conversely, if ##g## is bijective, then ##g## is an isomorphism and ##\dim W=\dim V=\dim V^*##. However, it is not given a priori that ##W## and ##V## have the same dimension

Can anybody help me with this problem? Can the statement be proven, or is it false?
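For concreteness, here is the same setup in coordinates (a minimal sketch of my own; the bases and the matrix ##A## below are illustrative assumptions, not part of the problem):

```python
import numpy as np

# Pick bases e_1,...,e_m of W and f_1,...,f_n of V.  Then L is represented by
# the m x n matrix A with A[i, j] = L(e_i, f_j), so that L(w, v) = w^T A v.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])           # an illustrative example with m = n = 2

def L(w, v):
    return w @ A @ v                  # the bilinear map in coordinates

def g(w):
    # g(w) is the functional v -> L(w, v); in the dual basis of V* its
    # coordinate vector is A^T w (equivalently, the row vector w^T A).
    return A.T @ w

# g is linear in w, and g(w) = 0 exactly when w^T A = 0, i.e. when w lies in
# the left kernel of A -- so injectivity of g is non-degeneracy in the first slot.
w = np.array([1.0, -1.0])
v = np.array([2.0, 5.0])
print(L(w, v), g(w) @ v)              # both print -3.0
```

In this picture ##g## is just the linear map ##w \mapsto A^{\top}w##, so the question is whether non-degeneracy forces ##A## to be square and invertible.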
 
  • #2
You are correct. The difficulty is that we can only conclude ##\dim W \leq \dim V##.
What happens if you consider ##h\, : \,V\longrightarrow W^*## with ##h(v)(w):=L(w,v)##?
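In the same coordinate sketch as in post #1 (again purely illustrative, with ##A## the hypothetical matrix of ##L##), ##h## is just multiplication by ##A##:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])           # the same illustrative matrix of L as in post #1

def h(v):
    # h(v) is the functional w -> L(w, v); in the dual basis of W* its
    # coordinate vector is A v.
    return A @ v

# h(v) = 0 exactly when A v = 0, so injectivity of h is non-degeneracy of L in
# its second slot and gives dim V <= dim W* = dim W.
v = np.array([1.0, -1.0])
print(h(v))                           # [-1. -3.]
```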
 
  • #3
In the same way, ##h## is injective, and therefore ##\dim V \leq \dim W^*=\dim W##. Combined with ##\dim W \leq \dim V## this gives ##\dim W=\dim V=\dim V^*##, thus the injective map ##g## is also surjective and therefore an isomorphism (##h## is an isomorphism as well)
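A quick numerical illustration of why the dimensions must agree (my own sketch; the rectangular matrix ##A## is a hypothetical stand-in for ##L## in coordinates):

```python
import numpy as np

# If dim W = m were strictly smaller than dim V = n, the m x n matrix A of L
# would have rank <= m < n, hence a nonzero vector v0 with A v0 = 0, and then
# L(w, v0) = w^T A v0 = 0 for every w: L would be degenerate in its second
# slot.  The symmetric argument with the left kernel rules out m > n.
rng = np.random.default_rng(0)
m, n = 2, 3
A = rng.standard_normal((m, n))       # a generic rectangular example

_, _, Vt = np.linalg.svd(A)           # the last row of Vt spans the kernel of A
v0 = Vt[-1]
print(np.allclose(A @ v0, 0.0))       # True: L(., v0) = 0, so such an L is degenerate
```

So non-degeneracy forces the matrix of ##L## to be square and invertible, which is exactly the statement that ##g## (and ##h##) are isomorphisms.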
 
  • Likes fresh_42

FAQ: A problem in multilinear algebra

1. What is multilinear algebra?

Multilinear algebra is the branch of mathematics that studies vector spaces together with maps that are linear in each of several arguments (multilinear maps). It extends the concepts of linear algebra from maps of one vector variable to maps of several, and it provides the framework for tensors, which in a chosen basis can be represented as multidimensional arrays of numbers.

2. What are some applications of multilinear algebra?

Multilinear algebra has many applications in various fields, including physics, engineering, computer science, and statistics. It is used to model physical systems with multiple variables, such as fluid dynamics and electromagnetism. In computer science, it is used in machine learning and data analysis. In statistics, it is used for multivariate data analysis and regression analysis.

3. What is a problem in multilinear algebra?

One common problem in multilinear algebra is finding the eigenvalues and eigenvectors of a tensor. In the simplest, order-two case this is the familiar matrix problem of finding the scalars ##\lambda## and nonzero vectors ##x## that satisfy ##Ax = \lambda x##; higher-order tensors have analogous (but harder) eigenvalue problems. This problem is important in many applications, such as principal component analysis and image processing.
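For instance, the order-two case can be solved numerically as follows (a small sketch using numpy; the matrix is an arbitrary example):

```python
import numpy as np

# The order-two case of the tensor eigenvalue problem: solve A x = lambda x
# for a matrix A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # an arbitrary symmetric example
eigenvalues, eigenvectors = np.linalg.eigh(A)   # eigh is for symmetric matrices
print(eigenvalues)                              # [1. 3.]

# each column of `eigenvectors` satisfies A x = lambda x
x, lam = eigenvectors[:, 0], eigenvalues[0]
print(np.allclose(A @ x, lam * x))              # True
```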

4. How is multilinear algebra different from linear algebra?

The main difference is that linear algebra studies maps that are linear in a single vector argument, while multilinear algebra studies maps that are linear in each of several arguments. Multilinear algebra also introduces tensors, which do not appear in a first course on linear algebra. As a result, its operations and bookkeeping (indices, tensor products, contractions) are more involved.

5. What are some useful techniques for solving problems in multilinear algebra?

Some useful techniques for solving problems in multilinear algebra include tensor decomposition, which breaks a tensor into simpler components, and tensor contraction, which sums over paired indices to produce a lower-order tensor. Other techniques include flattening ("unfolding") a tensor into a matrix and then applying standard linear-algebra tools such as eigenvalue decomposition and singular value decomposition.
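As a small illustration of the last two techniques (a sketch with arbitrary example tensors; the unfolding convention is one common choice):

```python
import numpy as np

# Tensor contraction with einsum: contract T_{ijk} with M_{kl} over the shared
# index k to get S_{ijl}.
T = np.arange(24.0).reshape(2, 3, 4)
M = np.ones((4, 5))
S = np.einsum('ijk,kl->ijl', T, M)
print(S.shape)                                   # (2, 3, 5)

# Matrix tools apply after "unfolding" a tensor into a matrix, e.g. an SVD of
# the mode-1 unfolding (a building block of Tucker/HOSVD-style decompositions).
unfolded = T.reshape(2, 12)                      # mode-1 unfolding: 2 x (3*4)
U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
print(U.shape, s.shape, Vt.shape)                # (2, 2) (2,) (2, 12)
```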
