Suppose that we have two light sources emitting monochromatic light with the same wavelength and amplitude, but opposite phases. Suppose also that the separation between the sources is small compared to the common wavelength (this is probably unrealistic for light, so we can instead consider two antennas emitting radio waves, separated by a distance small compared to the wavelength). Interference will then cancel most of the radiation, and only very little will remain (in the limit where the separation tends to 0, the waves cancel completely). Since the energy carried by the field is proportional to the square of the field amplitude, almost no energy is carried away by the field, even though the energy emitted by the two sources should be twice what one of them would emit if the other were not present.
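To make the cancellation quantitative, here is a rough sketch, treating the two emitters as scalar point sources of opposite sign separated by a distance $d$:
$$
E(r) \;\propto\; \frac{e^{ikr_1}}{r_1} - \frac{e^{ikr_2}}{r_2} \;\approx\; \frac{e^{ikr}}{r}\left(e^{ik\Delta/2} - e^{-ik\Delta/2}\right) \;=\; \frac{e^{ikr}}{r}\,2i\sin\!\left(\frac{k\Delta}{2}\right),
$$
where $\Delta = r_2 - r_1 \le d$ is the path difference to a distant observation point. Since $kd \ll 1$, the radiated power scales like $\sin^2(k\Delta/2) \approx (k\Delta/2)^2$, which goes to zero as $d \to 0$, so the field indeed carries far less energy than two independent sources would emit.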
How can this be explained?