-sandro-
OK, so we all know that surface wave amplitudes decrease with distance at roughly 1/√r from geometric spreading; the decay is slower than for P- and S-waves (which fall off at about 1/r), but it's there.
So how do you explain that, when I look at seismograms of the same earthquake recorded at stations at different distances, the surface-wave amplitudes sometimes appear to increase as you get away from the source?
Example: http://www.sciencebuddies.org/Files/3030/3/Geo_img046.gif
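For reference, the geometric-spreading part of the question can be sketched numerically. This is a minimal illustration of the standard decay laws only (1/√r for surface waves, 1/r for body waves); the reference amplitude and distances are made-up numbers, and real records also include attenuation, dispersion, radiation pattern, and site effects:

```python
import math

def body_wave_amp(a0, r0, r):
    """Body-wave (P, S) amplitude from geometric spreading ~ 1/r.

    a0 is the amplitude observed at reference distance r0 (hypothetical values).
    """
    return a0 * (r0 / r)

def surface_wave_amp(a0, r0, r):
    """Surface-wave amplitude from geometric spreading ~ 1/sqrt(r).

    Surface waves spread over an expanding ring rather than a sphere,
    so energy ~ 1/r and amplitude ~ 1/sqrt(r).
    """
    return a0 * math.sqrt(r0 / r)

# Compare the two decay laws at a few (made-up) epicentral distances in km.
for r in (100, 400, 1600):
    print(f"r={r:5d} km  body={body_wave_amp(1.0, 100, r):.3f}  "
          f"surface={surface_wave_amp(1.0, 100, r):.3f}")
```

The printout shows that quadrupling the distance halves the surface-wave amplitude but cuts the body-wave amplitude to a quarter, so on a seismogram the surface waves dominate more and more at large distances, even though both are decaying in absolute terms.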