Electric and magnetic constants are tensors

In summary, a tensor is a multilinear map (a map that is linear in each variable separately) from copies of a vector space and its dual to the reals. It takes in some number of vectors and covectors and produces a scalar. Examples include vectors, matrices, and the dot product. The components of a tensor transform in a specific way when the coordinate system changes, and that transformation follows a definite set of rules.
  • #36
but to rephrase Hurkyl's very clear explanation: a tensor is just a way of assigning a number to a sequence of vectors and covectors that is linear in each variable separately, i.e. it is some way of multiplying them.

so any time you encounter a quantity that depends on several tangent vectors and cotangent vectors, and is linear in each one separately, i.e. is somehow a product of them of some kind, maybe yielding a number or another vector or covector, or even yielding another linear or multilinear map, it seems to be [representable as] a tensor.
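This description can be made concrete with a few lines of numpy (my own sketch, not from the thread): the ordinary dot product is a (0,2) tensor, a function of two vectors that is linear in each slot separately.

```python
import numpy as np

# A sketch: the dot product as a (0,2) tensor, i.e. a map taking
# two vectors to a number, linear in each slot separately.
def g(u, v):
    return float(np.dot(u, v))

u, v, w = np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 4.0])
a, b = 2.0, -3.0

# Multilinearity: linear in the first argument, and likewise the second.
assert np.isclose(g(a * u + b * w, v), a * g(u, v) + b * g(w, v))
assert np.isclose(g(u, a * v + b * w), a * g(u, v) + b * g(u, w))
```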
 
  • #37
it would be very illuminating now to go back and take the various physical examples that are said to be tensors, and analyze exactly how they fit into this paradigm.

i.e. how does stress appear as a multilinear function on some sequence of vectors and covectors? etc...

then we will actually be talking to and communicating with each other.
 
  • #38
let me indulge in a few more points that cause confusion and controversy. people here have debated whether a certain thing IS or IS NOT a tensor. e.g. is a vector a tensor? is a matrix a tensor?

well if you accept hurkyl's description of a tensor as a multilinear real valued function on sequences of vectors and covectors, then no, neither a vector nor a matrix is strictly such an object. But as recently debated in the media in another context, it all depends on your definition of the word "is".

I.e. there is a natural map from a vector space V to V** = linear functions on linear functions. if v is a vector, then it defines a linear function on linear functions by evaluation, i.e. v takes f to f(v), which is being thought of as v(f).

this map is very natural and injective, hence in finite dimensions it is an isomorphism, although not in infinite dimensions, where it has no such natural inverse.

but this allows one, by means of this uniquely natural isomorphism, to say that a vector "is" a linear function on covectors and hence a tensor.
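A small numpy sketch of this evaluation map (my own illustration; the function names are made up): model a covector by its component row, and let a vector act on covectors by v(f) := f(v).

```python
import numpy as np

# Sketch of the natural map V -> V**.
# A covector f is modeled by its components; f(v) is a dot product.
def apply_covector(f, v):
    return float(np.dot(f, v))

# The image of v in V**: a linear function on covectors, by evaluation.
def double_dual(v):
    return lambda f: apply_covector(f, v)

v = np.array([1.0, -2.0, 3.0])
f = np.array([4.0, 0.5, -1.0])
assert np.isclose(double_dual(v)(f), apply_covector(f, v))
```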

similarly there is a natural map from V⊗V* to Hom(V,V), taking a basic tensor of the form v⊗f to the linear map sending w to f(w)·v.

This too is an isomorphism in finite dimensions, and hence permits a matrix, or linear endomorphism in Hom(V,V), to be thought of as a tensor belonging to V⊗V*.
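In coordinates the basic tensor v⊗f is just the outer product of the component arrays, which is exactly the matrix of the map w ↦ f(w)·v. A quick numpy check (my own sketch):

```python
import numpy as np

# Sketch: the natural map V (x) V* -> Hom(V,V),
# sending a basic tensor v (x) f to the linear map w |-> f(w) v.
# In components this is the outer product of v with f.
v = np.array([1.0, 2.0])
f = np.array([3.0, -1.0])   # components of a covector
M = np.outer(v, f)          # matrix of the linear map w |-> f(w) v

w = np.array([0.5, 4.0])
assert np.allclose(M @ w, np.dot(f, w) * v)
```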

Since this map is completely natural and unique, there is really only one way to represent a matrix this way as a tensor, so there can be "no confusion" (haha) in saying a matrix is, in this sense, a tensor of type (1,1). anyway mathematicians tend to assume that all statements are made up to natural isomorphism, unless they are trying to win an argument.

so any time somebody says such and such IS something else, ask yourself if there is any natural way to interpret that as true.
 
  • #39
if i have the indices right, by the conventions above a vector "is" a tensor of type (1,0) and a matrix "is" a tensor of type (1,1).

however because there is no natural isomorphism between V and V*, even in finite dimensions, although there are many unnatural ones, we cannot naturally identify tensors of type (1,0) with those of type (0,1).

now i am getting beyond what i have thought about thoroughly here, but a riemannian metric does allow vectors in V to be identified with vectors in V*. nonetheless, the transformation laws get screwed up i believe, so although a field of elements of the various V's can thus be changed into a field of elements of the V*'s, they will not transform correctly if one uses the other transformation rules.

thus I guess even in the presence of a metric one must distinguish the types of vector fields, but i have made mistakes on this before by claiming otherwise.

the problem seems to be that when one replaces a linear function by dotting with a vector, one still has to transform by the transpose of the matrix used for transforming vectors, not that matrix itself, and in the opposite direction.

so if you replace a linear function by a vector, such as replacing a differential by a gradient vector, I think now that transforming that vector by the vector transformation laws will not transform the linear function correctly.

i apologize if i have misled people on this point before.

i.e. keep in mind that one does not really replace the differential by the gradient, but by the operation of dotting with the gradient.
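A deterministic numpy sketch of this point (my own conventions, not the poster's): under a change of basis with matrix A, vector components transform by the inverse of A while covector components transform by the transpose, and only then is the pairing f(v) basis-independent. Mixing the rules breaks it unless A is orthogonal, which is why the distinction hides in Cartesian coordinates.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])      # an invertible change-of-basis matrix
v = np.array([1.0, 2.0, 3.0])        # vector components, old basis
f = np.array([4.0, -1.0, 0.5])       # covector components, old basis

v_new = np.linalg.inv(A) @ v         # vector rule
f_new = A.T @ f                      # covector rule
# The pairing f(v) is basis-independent:
assert np.isclose(f_new @ v_new, f @ v)

# Using the *vector* rule on the covector does not preserve the pairing
# (it would for an orthogonal A):
f_wrong = np.linalg.inv(A) @ f
assert not np.isclose(f_wrong @ v_new, f @ v)
```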
 
  • #40
a bit more on when things may be considered the same.

a natural construction is also called a "functor", to frighten children.

thus "changing" V to V is a functor, the identity functor; changing V to V* is a functor, the dual functor; V to Hom(V,V) is a functor; V to V⊗V* is a functor; and so on. some constructions, i.e. some functors, are essentially equivalent, or at least related, to other functors; such relationships are expressed by families of maps called natural transformations.

there is a natural transformation from the identity functor to the double dual functor sending V to V** for every V, and it is an equivalence for finite dimensional V.

there is a natural transformation from V⊗V* to Hom(V,V) and it is an equivalence for finite dimensional V.

there is no natural equivalence from V to V*. in any case, the right definition of the word "is" above might be "is naturally equivalent to" or "defines naturally equivalent functors", to frighten children and adults.
 
  • #41
Tensor is a general vector.
 
  • #42
i highly recommend the following text on tensors, by a math prof and a prof of mechanical engineering: book #60 by Bowen and Wang on the free site:
http://www.math.gatech.edu/~cain/tex...linebooks.html . in particular their introduction includes this:

In preparing this two volume work our intention is to present to Engineering and Science
students a modern introduction to vectors and tensors. Traditional courses on applied mathematics
have emphasized problem solving techniques rather than the systematic development of concepts.
As a result, it is possible for such courses to become terminal mathematics courses rather than
courses which equip the student to develop his or her understanding further.

As Engineering students our courses on vectors and tensors were taught in the traditional
way. We learned to identify vectors and tensors by formal transformation rules rather than by their
common mathematical structure. The subject seemed to consist of nothing but a collection of
mathematical manipulations of long equations decorated by a multitude of subscripts and
superscripts. Prior to our applying vector and tensor analysis to our research area of modern
continuum mechanics, we almost had to relearn the subject. Therefore, one of our objectives in
writing this book is to make available a modern introductory textbook suitable for the first in-depth
exposure to vectors and tensors. Because of our interest in applications, it is our hope that this
book will aid students in their efforts to use vectors and tensors in applied areas.

in particular they explain such things as the natural isomorphisms of V and V**, and of Hom(V,V) with tensors of type (1,1) on V.
 
  • #43
chapter 11.5 of dummit and foote, 2nd edition, discusses tensors, symmetric and alternating, and the isomorphism between the symmetric algebra of an n-dimensional vector space and the ring of polynomials in n variables.
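A concrete low-degree instance of that isomorphism (my own sketch, not from the book): a symmetric (0,2)-tensor S on R^n corresponds to the homogeneous degree-2 polynomial p(x) = xᵀSx, the degree-2 piece of the correspondence between the symmetric algebra and polynomials in n variables.

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [2.0, 5.0]])           # components of a symmetric (0,2)-tensor

def p(x):
    # the corresponding homogeneous quadratic polynomial x^T S x
    return float(x @ S @ x)

x = np.array([3.0, -1.0])
# expanding: p(x) = 1*x1^2 + 4*x1*x2 + 5*x2^2
assert np.isclose(p(x), 1 * 3**2 + 4 * (3 * -1) + 5 * (-1)**2)
```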
 
  • #44
"In classical physics it is customary to define a tensor T_ijk... by generalizing

[tex]V_i= \sum_{j} R_{ij} V_j [/tex]

(which is his definition of a vector: a quantity whose components transform like that) as follows

[tex]
T_{ijk...}= \sum_{i'}\sum_{j'}\sum_{k'} ... R_{ii'} R_{jj'} R_{kk'} ... T_{i'j'k'...}
[/tex]
under a rotation specified by the 3×3 orthogonal matrix R"

(from Modern Quantum Mechanics by J.J. Sakurai)

I understand that a tensor takes N vectors to a scalar, or the components of N vectors to a scalar. And this can be generalized to include covectors (written with lower indices), which are nothing more than row vectors.

And the tensor above is a Cartesian tensor. Now I am trying to learn about spherical tensors. Can someone tell me about them? And is what I wrote above about Cartesian tensors correct?
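The quoted rank-2 rule can be checked numerically (my own sketch, using numpy's einsum for the index sums): in matrix form it reads T' = R T Rᵀ, and a tensor built as an outer product of two vectors transforms consistently with the vectors themselves.

```python
import numpy as np

theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])      # a rotation about the z axis

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])
T = np.outer(u, v)                   # a rank-2 tensor built from two vectors

# Index form of the rule: T'_{ij} = sum_{i'j'} R_{ii'} R_{jj'} T_{i'j'}
T_prime = np.einsum('ik,jl,kl->ij', R, R, T)
# Matrix form of the same rule:
assert np.allclose(T_prime, R @ T @ R.T)
# Consistency: transforming T agrees with transforming u and v first.
assert np.allclose(T_prime, np.outer(R @ u, R @ v))
```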
 
  • #45
arggggh! actually there is no such thing as a tensor. the terminology is a joke on the community. try to calm down and forget about wanting to know what a tensor is, babble babble babble...
 
  • #46
mathwonk said:
arggggh! actually there is no such thing as a tensor. the terminology is a joke on the community. try to calm down and forget about wanting to know what a tensor is, babble babble babble...

Are you suggesting that Sakurai is wrong, that I am wrong, or both of us? :)
 
  • #47
Michael_McGovern said:
No, that is not right. The dot product is not a tensor, nor is the result of a dot product a (0,2) tensor; it is a (0,0) tensor, a.k.a. a scalar.
Of course it is. The dot product maps two vectors to a scalar, i.e. g(A,B) -> real number. By the very definition of a tensor this is truly a tensor (of second rank).

Pete
 
  • #48
mathwonk said:
now i am getting beyond what i have thought about thoroughly here, but a riemannian metric does allow vectors in V to be identified with vectors in V*. nonetheless, the transformation laws get screwed up i believe, so although a field of elements of the various V's can thus be changed into a field of elements of the V*'s, they will not transform correctly if one uses the other transformation rules.

In a slight abuse of notation, let [itex]g[/itex] be the metric-induced natural isomorphism from [itex]V[/itex] to [itex]V*[/itex], and denote the image under [itex]g[/itex] of [itex]w[/itex] by [itex]\tilde{w}[/itex]. Let [itex]L[/itex] be an isometry, with [itex]w' = Lw[/itex]. Let [itex]\tilde{L} = gLg^{-1}[/itex] be the isomorphism on [itex]V*[/itex] that makes

[insert appropriate diagram that I can't seem to copy and paste from my latex editor]

commute for all [itex]v[/itex].

Introduce a basis for [itex]V[/itex], and the matrix representation of [itex]L[/itex] gives the transformation law for vectors. The matrix representation of [itex]\tilde{L}[/itex] with respect to the corresponding dual basis of [itex]V*[/itex] is the "correct" transformation law for covectors.
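A numerical sketch of this diagram (my own choices of metric and isometry, not the poster's): take g to be "lower an index with the metric G", so g(v) = Gv in components, and let L be an isometry of G. The induced map on covectors is then G L G⁻¹, which for an isometry equals the inverse transpose of L, the standard covector rule.

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Ghalf = np.diag([np.sqrt(2.0), np.sqrt(3.0)])
G = Ghalf @ Ghalf                     # a (diagonal) Riemannian metric
L = np.linalg.inv(Ghalf) @ R @ Ghalf  # an isometry of G: L^T G L = G
assert np.allclose(L.T @ G @ L, G)

g = lambda v: G @ v                   # lower an index with the metric
L_tilde = G @ L @ np.linalg.inv(G)    # the induced map on covectors

v = np.array([1.5, -2.0])
# Commutativity: lower then transform = transform then lower.
assert np.allclose(L_tilde @ g(v), g(L @ v))
# For an isometry this is exactly the inverse-transpose covector rule:
assert np.allclose(L_tilde, np.linalg.inv(L).T)
```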
 
  • #49
let's keep this simple: tensor = multiplication.

hence pete is certainly right; the dot product is one of our favorite tensors.
 
  • #50
So are tensors operators or operands?
 
  • #51
or operatoids? or operatives?

is that a diamonoid or a diamonelle? to paraphrase cleo on the cosby show.
 
