- #1
facenian
Hello, I have a problem understanding wave propagation in a dispersive medium, because the prescription for solving the wave equation for electromagnetic waves is this:
a) suppose [itex]\mu,\epsilon[/itex] are functions of frequency only
b) solve the wave equation [itex]\Delta\textbf{E}-\mu\epsilon\frac{\partial^2\textbf{E}}{\partial t^2}=0[/itex] for each Fourier component
c) use superposition (sum the individual solutions) to obtain the general solution, i.e.,
[tex]\textbf{E}(\textbf{r},t)=\int_{-\infty}^{\infty}\textbf{E}(\textbf{r},\omega)e^{-i\omega t}\,d\omega[/tex]
How can this superposition be valid when there is no linearity? Or, in any case, what wave equation is this prescribed expression a solution to?
It must be simple and clear, since this is the usual way to do it, but I'm not seeing it.
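To make the prescription concrete, here is a minimal numerical sketch of steps a)–c) in Python. It propagates a pulse a distance z by Fourier-analyzing it, giving each frequency component its own wavenumber [itex]k(\omega)[/itex], and superposing with an inverse FFT. The Lorentz-model permittivity, the normalized units (c = 1), and every parameter value are illustrative assumptions, not part of the original question:

```python
import numpy as np

# Sketch of the prescription: Fourier-analyze a pulse, propagate each
# frequency component with its own k(w), then superpose (inverse FFT).
# The Lorentz permittivity and all parameter values are hypothetical.

def eps(w):
    # Lorentz-oscillator model with small damping (made-up parameters);
    # eps(-w) = conj(eps(w)), so the synthesized field stays real
    wp2, w0, gamma = 0.2, 5.0, 0.05
    return 1.0 + wp2 / (w0**2 - w**2 - 1j * gamma * w)

# initial pulse at z = 0: Gaussian envelope on a carrier (units with c = 1)
t = np.linspace(-50.0, 50.0, 4096)
E0 = np.exp(-t**2 / 20.0) * np.cos(2.0 * t)

# step (a)/(b): Fourier components E(0, w) on the FFT frequency grid
Ew = np.fft.fft(E0)
w = 2.0 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])

# each monochromatic component just acquires a phase exp(i k(w) z),
# with k(w) = w * sqrt(eps(w)) (complex k gives absorption near resonance)
z = 30.0
k = w * np.sqrt(eps(w))
Ez = np.fft.ifft(Ew * np.exp(1j * k * z)).real  # step (c): superposition

print(Ez.shape)
```

Note that the "wave equation" being solved here is really one Helmholtz-type equation per frequency component, each with its own [itex]\mu(\omega)\epsilon(\omega)[/itex]; the final field is just their sum.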