So I followed the derivation of the macroscopic Maxwell equations: averaging the fields and equations, then doing a Taylor expansion to separate the induced (bound) charges and currents from the free ones. But why does light now "suddenly" travel slower in dielectric media? Sure, it comes out of the macroscopic equations, but what is actually happening here?
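For reference, here is the macroscopic route I mean, as a sketch (assuming a linear, isotropic, non-magnetic dielectric with no free charges or currents):

$$\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{H} = \frac{\partial \mathbf{D}}{\partial t}, \qquad \mathbf{D} = \varepsilon \mathbf{E}, \qquad \mathbf{B} = \mu_0 \mathbf{H},$$

which combine (via the curl of Faraday's law, using $\nabla \cdot \mathbf{E} = 0$ for uniform $\varepsilon$) into

$$\nabla^2 \mathbf{E} = \mu_0 \varepsilon \,\frac{\partial^2 \mathbf{E}}{\partial t^2},$$

a wave equation with phase velocity $v = 1/\sqrt{\mu_0 \varepsilon} = c/n$, where $n = \sqrt{\varepsilon/\varepsilon_0}$.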
If you think about it from a microscopic point of view, the response of the medium, i.e. its polarisation (or the radiation resulting from that polarisation), has to somehow cancel out the original wave traveling at $c$ and create a "new" one going at $c/n$. Right?
How does this just drop out of the macroscopic equations, without having to worry about what is actually happening microscopically? Is it the averaging? Is there a better way to think about this?
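Schematically, the microscopic picture I have in mind (a sketch, not a derivation):

$$\mathbf{E}_\text{total} = \underbrace{\mathbf{E}_\text{inc}}_{\text{travels at } c} \;+\; \underbrace{\mathbf{E}_\text{rad}}_{\text{radiated by the induced dipoles}},$$

where the dipole radiation must conspire to exactly cancel the incident wave inside the medium and replace it with one propagating at $c/n$ (as far as I understand, this is the content of the Ewald-Oseen extinction theorem).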