- #1
Coffee_
- 259
- 2
At the moment I'm revising some interference and diffraction basics, and there is something that bothers me slightly and I can't quite figure it out.
The intensity of a wave over some area ##dA## is, in general, ##I=\frac{1}{dA} \frac{dE}{dt}##. Clearly, for an electromagnetic wave falling on a surface, the part ##\frac{dE}{dt}## is not constant and depends on time, so the intensity should be a function of time.
In every text I encounter, they seem to DEFINE the intensity as the average over one period of ##c|A(t)|^{2}##, where ##A## is the deviation of the wave at the area of interest. Are they simply using a more practical definition, so that technically I'm correct above in a general sense, OR am I missing something?
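As an illustrative sketch of what I mean (the monochromatic plane wave with field ##E(t)=E_0\cos(\omega t)## at the surface is just my own assumption here, for concreteness):
$$I(t)\;\propto\; c\,|A(t)|^{2} \;=\; c\,E_0^{2}\cos^{2}(\omega t),\qquad
\langle I\rangle_{T} \;=\; \frac{1}{T}\int_{0}^{T} c\,E_0^{2}\cos^{2}(\omega t)\,dt \;=\; \tfrac{1}{2}\,c\,E_0^{2}.$$
So the instantaneous rate of energy delivery oscillates (at ##2\omega##), while its average over one period is constant; the latter seems to be what the texts call the intensity.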