Tensor gradient and Maxwell's equations

In summary, the original poster is trying to figure out how to find the gradient of a tensor, but is unsure about the extra term that appears in the gradient formula. He asks the community for help and for a worked example based on the electromagnetic field tensor.
  • #1
steveurkell
Hi everyone,
I am having difficulty using the formula for the gradient of a tensor. The following is the field-strength tensor:
[tex]
F_{\alpha\beta} =
\begin{bmatrix}
0 & -E_x & -E_y & -E_z \\
E_x & 0 & B_z & -B_y \\
E_y & -B_z & 0 & B_x \\
E_z & B_y & -B_x & 0
\end{bmatrix}
[/tex]
My question is: how do we derive the Maxwell equations div(B) = 0 and curl(E) + dB/dt = 0 from this field tensor using the gradient of a tensor?
I am not quite sure how to use the formula for the tensor gradient, in particular because of the additional argument, beyond the vectors and 1-forms that the tensor itself takes,
e.g.
[tex] \nabla_{\vec{\zeta}} S = \frac{\partial S}{\partial x^{\delta}}\,\zeta^{\delta} [/tex]
What does the ζ term do here?
I would be indebted if someone here could give a real example of how to find the gradient of a tensor, using the tensor given above.
Thank you for any help
regards,
 
  • #2
pervect
This board has "tex" notation; you can click on any of the equations below to see the "tex" source, which makes it a lot easier to communicate.

I think you are asking about the covariant derivative operator

[tex]\nabla_a [/tex]

What this does is it takes any tensor of rank p, and produces a tensor of rank p+1. A scalar counts as a tensor of rank 0 for this purpose.

If you think about it, this should make sense - the derivative of a scalar is a rank-1 object (a one-form). The 'a' represents the extra index in the new tensor: you start with a rank p tensor and end up with a rank p+1 tensor, which means you add one index to the tensor, and that added index is represented by the symbol 'a'.
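For example, acting on a vector field with components V^b, the covariant derivative has the standard coordinate expression

[tex]\nabla_a V^b = \partial_a V^b + \Gamma^b{}_{ac}\,V^c[/tex]

where the Γ's are the Christoffel symbols; the free index a is the one added by the derivative, so the result is a rank-2 (mixed) tensor.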

If you have a nice orthonormal coordinate system, the covariant derivative operator reduces to the ordinary derivative.

Thus

[tex]\nabla_a f = (\frac{\partial f}{\partial t}, \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}) [/tex]

where I have labeled the index values a = 0, 1, 2, 3 by t, x, y, z to make the point clearer.

Note that we started out with a scalar, and wound up with a one-form. The index a represents the components of that one form (the scalar didn't have any indices at all).
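As a concrete (made-up) illustration: if f = x² + t in these coordinates, then

[tex]\nabla_a f = (1,\; 2x,\; 0,\; 0)[/tex]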

The covariant derivative is only equivalent to the partial derivative in coordinate systems where the Christoffel symbols are all zero, though.

I hope I've understood your question properly, I'm not positive I have. I realize I haven't answered your question about Maxwell's equations yet, but I want to see if the notation I'm using is the notation you are using.

Sometimes people use other notations and notions (like the exterior derivative) - but that would be symbolized by

dF
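In that language the source-free half of Maxwell's equations is simply dF = 0. In components (using the normalized antisymmetrization convention) this is

[tex](dF)_{\mu\nu\rho} = \partial_\mu F_{\nu\rho} + \partial_\nu F_{\rho\mu} + \partial_\rho F_{\mu\nu} = 3\,\partial_{[\mu}F_{\nu\rho]} = 0[/tex]

which is the same equation that appears in the next post.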
 
  • #3
IIUC, you're trying to get from the covariant formalism to the noncovariant one. The equations

[tex] \nabla\cdot\vec{B}=0 [/tex](1)

[tex] \nabla\times\vec{E}+\frac{\partial\vec{B}}{\partial t}=\vec{0} [/tex] (2-4)...

can be found by simply letting the subscripts run over all 4 values in the field equation

[tex] \partial_{[\mu}F_{\nu\rho]}=0 [/tex]

(Pervect mentioned the covariant derivative, but in this "fortunate" case

[tex] \partial_{[\mu}F_{\nu\rho]}\equiv \nabla_{[\mu}F_{\nu\rho]}=0 [/tex] )

So do it...
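(As an illustration of one such choice: taking (μ, ν, ρ) = (1, 2, 3) and reading the components off the matrix in post #1 gives

[tex]\partial_x F_{23} + \partial_y F_{31} + \partial_z F_{12} = \partial_x B_x + \partial_y B_y + \partial_z B_z = \nabla\cdot\vec{B} = 0[/tex]

which is (1); the index choices that include 0, e.g. (0, 1, 2), produce the three components of (2-4) in the same way.)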

Daniel.

P.S. Report any problems.
 

FAQ: Tensor gradient and Maxwell's equations

What is a tensor gradient?

The tensor gradient (covariant derivative) is a mathematical operation used in tensor calculus to describe the rate of change of a tensor field from point to point. Applied to a tensor field of rank p, it produces a tensor field of rank p+1, with the extra index corresponding to the direction of differentiation.
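For illustration, the standard coordinate formula for a rank-2 covariant tensor T_{bc} is

[tex]\nabla_a T_{bc} = \partial_a T_{bc} - \Gamma^{d}{}_{ab}\,T_{dc} - \Gamma^{d}{}_{ac}\,T_{bd}[/tex]

which reduces to the plain partial derivative when the Christoffel symbols vanish.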

How is the tensor gradient related to Maxwell's equations?

In the covariant formulation of electromagnetism, Maxwell's equations are written as derivative equations for the field-strength tensor, which packages the electric and magnetic fields together. This allows for the prediction and analysis of electromagnetic phenomena, such as the behavior of light, electricity, and magnetism.
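Explicitly (units and sign conventions vary between textbooks; SI units are assumed here), the two tensor equations are

[tex]\partial_{[\mu}F_{\nu\rho]} = 0, \qquad \partial_{\mu}F^{\mu\nu} = \mu_0 J^{\nu}[/tex]

where the first reproduces div B = 0 and curl E + ∂B/∂t = 0, and the second reproduces Gauss's law and the Ampère-Maxwell law.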

What are the applications of tensor gradient and Maxwell's equations?

The applications of tensor gradient and Maxwell's equations are vast and cover a wide range of fields including physics, engineering, and telecommunications. They are used in the design of electronic devices, electromagnetic simulation software, and the development of new technologies such as wireless communication and satellite systems.

Are there any limitations to the use of tensor gradient and Maxwell's equations?

While tensor gradients and Maxwell's equations are powerful tools for understanding and predicting electromagnetic phenomena, they do have limitations. Maxwell's equations are classical field equations and do not capture quantum effects such as photon emission and absorption. Additionally, the simple macroscopic form assumes linear media, so strongly non-linear or extreme situations may require extended models.

How can I learn more about tensor gradient and Maxwell's equations?

There are many resources available for learning about tensor gradient and Maxwell's equations, including textbooks, online courses, and research papers. It is recommended to have a strong foundation in mathematics, particularly in vector calculus and differential equations, before diving into these concepts. Consulting with a professor or expert in the field can also be helpful in understanding the complexities of these equations.
