The last couple of days I’ve been troubled with a specific part of electromagnetism. How will electric field lines be affected by an oscillating charge? More specifically, what happens to the “amplitude” of a wave in an electric field line as the wave propagates away from the charge?
1. Will the amplitude be unchanged as the distance increases?
2. Will the amplitude increase as the distance increases?

When such wave propagation is described in figures/illustrations, I actually find both cases depicted (hence the confusion). Below you will find a short video that simulates case no. 1 (no increase in amplitude with distance) using Excel. I suspect, however, that this is an incorrect description.
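For anyone who wants to reproduce the picture without Excel, here is a minimal sketch of what the simulation in the video shows, assuming case no. 1: a sinusoidal transverse displacement whose amplitude is the same at every distance r from the charge. The amplitude, wavelength and period values are arbitrary placeholders, and this is only a visual stand-in for the animation, not a solution of Maxwell's equations.

```python
# Illustrative sketch of "case no. 1": a transverse wave whose amplitude
# does NOT change with distance from the source. Parameter values are
# arbitrary placeholders chosen only for plotting.

import numpy as np
import matplotlib.pyplot as plt

A = 1.0                      # constant amplitude (case 1: independent of r)
wavelength = 2.0             # placeholder wavelength
period = 1.0                 # placeholder period
k = 2 * np.pi / wavelength   # wavenumber
omega = 2 * np.pi / period   # angular frequency

r = np.linspace(0, 10, 1000)  # distance from the oscillating charge

fig, ax = plt.subplots()
for t in (0.0, 0.25, 0.5):
    # Transverse displacement of the "field line": same amplitude at every r
    y = A * np.sin(k * r - omega * t)
    ax.plot(r, y, label=f"t = {t}")

ax.set_xlabel("distance r from the charge")
ax.set_ylabel("transverse displacement")
ax.legend()
plt.show()
```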