TimeRip496
In the Einstein summation convention, summation occurs over an index that is repeated, appearing once as an upper index and once as a lower index. However, I have some confusion.
1) $${\displaystyle v=v^{i}e_{i}={\begin{bmatrix}e_{1}&e_{2}&\cdots &e_{n}\end{bmatrix}}{\begin{bmatrix}v^{1}\\v^{2}\\\vdots \\v^{n}\end{bmatrix}},\ \qquad w=w_{i}e^{i}={\begin{bmatrix}w_{1}&w_{2}&\cdots &w_{n}\end{bmatrix}}{\begin{bmatrix}e^{1}\\e^{2}\\\vdots \\e^{n}\end{bmatrix}}}$$
Won't each of the above give me a scalar? Yet most texts, including Wikipedia, label the $v$ here as a vector. I understand that the vector components are labelled $v^i$ and the coordinate basis vectors $e_i$, or is the definition of a vector different in the Einstein convention? (See the sketch after 1a below.)
In addition, how does the transpose of the above then work?
E.g. $$v^T=v^ie^i$$
Does it only change the coordinate basis but not the coefficients?
1a) For the transpose of a matrix, we just need to switch the two indices around. What about the transpose of a vector? Does it remain the same?
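To make question 1 concrete, here is a minimal numpy sketch of what I think the expression computes (the basis and components are made-up values): each $e_i$ is itself a whole vector, so $v^i e_i$ is a linear combination of vectors, not a scalar.

```python
import numpy as np

# Made-up example: row e[i] plays the role of the basis vector e_i,
# and v_up[i] plays the role of the component v^i.
e = np.eye(3)
v_up = np.array([2.0, -1.0, 3.0])

# Sum over the repeated index i: v = v^i e_i
v = np.einsum('i,ij->j', v_up, e)
print(v)   # [ 2. -1.  3.] -- a vector, one number per basis direction
```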
2) Inner product of vectors
To take the inner product of two vectors, I first need to convert one of them into a covector, right? In that case, the inner product of vectors should be expressed as
$$v\cdot u=v^iu_i=g_{ij}v^iu^j$$
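To check my expression, here is a minimal numpy sketch (assuming a Euclidean metric $g_{ij}=\delta_{ij}$ and made-up components as an example):

```python
import numpy as np

g = np.eye(3)                        # example metric g_ij
v_up = np.array([1.0, 2.0, 3.0])     # contravariant components v^i
u_up = np.array([4.0, 5.0, 6.0])     # contravariant components u^j

u_down = np.einsum('ij,j->i', g, u_up)       # lower the index: u_i = g_ij u^j
dot1 = np.einsum('i,i->', v_up, u_down)      # v^i u_i
dot2 = np.einsum('ij,i,j->', g, v_up, u_up)  # g_ij v^i u^j
print(dot1, dot2)                    # both 32.0: the two forms agree
```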
3) For repeated indices on a 4th-order tensor, can I rewrite it as a 2nd-order tensor without it losing its meaning?
E.g.
$$R^{\mu}{}_{\nu\mu\kappa}=R_{\nu\kappa}$$
Is the above equivalence valid? It doesn't seem correct to me, since the repeated index requires summation, and removing it removes the summation, so the right side seems to contain less information.
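Here is a minimal numpy sketch of what I understand the contraction to do (random made-up entries): the summation over $\mu$ is actually carried out, and its result is stored in the 2nd-order tensor.

```python
import numpy as np

rng = np.random.default_rng(0)
R4 = rng.standard_normal((4, 4, 4, 4))   # R[mu, nu, rho, kappa], made-up values

# R_{nu kappa} = sum_mu R[mu, nu, mu, kappa]
R2 = np.einsum('mnmk->nk', R4)
R2_loop = sum(R4[m, :, m, :] for m in range(4))   # the same sum written out
print(np.allclose(R2, R2_loop))          # True
```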
4) For matrix-vector or matrix-matrix multiplication, do the repeated upper and lower indices from each tensor have to be side by side?
E.g. $$u_i=A^{j}{}_{i}v_j=v_jA^{j}{}_{i}$$
But this multiplication is not possible, right? $$u_i\neq A_{ij}v_j$$
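A minimal numpy sketch of what I mean in 4 (made-up entries): the two orderings give the same result, since in index notation each symbol is just a number and the factors commute.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))   # A[j, i] standing for A^j_i
v = rng.standard_normal(3)        # v[j] standing for v_j

u1 = np.einsum('ji,j->i', A, v)   # u_i = A^j_i v_j
u2 = np.einsum('j,ji->i', v, A)   # u_i = v_j A^j_i, the same contraction
print(np.allclose(u1, u2))        # True: the order of factors is irrelevant
```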
5) As for derivatives, can the partial derivative operator be arranged anywhere in the equation?
E.g. $$A_{ij}\partial_{\mu}\partial_{\nu}f(x)=\partial_{\nu}A_{ij}\partial_{\mu}f(x)$$
It should be possible based on the commutative property of partial derivatives, unless it is a covariant derivative.
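A minimal sympy sketch of that commutativity (a made-up $f$, and assuming $A_{ij}$ is constant):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x * y) + x**3 * y**2   # made-up smooth function

d_xy = sp.diff(f, x, y)           # first d/dx, then d/dy
d_yx = sp.diff(f, y, x)           # first d/dy, then d/dx
print(sp.simplify(d_xy - d_yx))   # 0: the mixed partials agree
```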