Do electromagnetic waves fade with distance in vacuum?

  • #1
yashraj
TL;DR Summary
Does an electromagnetic wave get weaker with increasing distance from its source as it travels?
I want to know whether the electromagnetic wave produced when a charged particle accelerates loses strength, i.e. fades, with distance. If so, what happens to its frequency? And if electromagnetic waves do fade with increasing distance from the source, how do physicists detect signals from outer space that have travelled thousands of light years?
 
  • #2
Baluncore
Welcome to PF.

As an EM wave radiates out into space, the energy is distributed over the surface of a radially growing sphere. That causes the EM wave energy density to be reduced by the inverse square law.
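In symbols (a sketch assuming an isotropic source of total radiated power ##P##, so the energy spreads evenly over a sphere of radius ##r##):

$$S(r) = \frac{P}{4\pi r^2}$$

Here ##S(r)## is the power per unit area (flux density) at distance ##r##: doubling the distance cuts the flux to a quarter.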

The frequency of the EM wave remains the same.

The signals detected from space start out with enormous power. After the inverse square law has been applied, enough energy still enters the aperture of the telescope for the signals to be detected.
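As a rough numerical illustration (the source power, distance, and dish area below are made-up values, not data for any real object), the power a telescope collects is just the flux density at the dish multiplied by its collecting area:

```python
import math

def received_power(source_power_w, distance_m, dish_area_m2):
    """Power collected by a dish from an isotropic source (inverse square law)."""
    flux = source_power_w / (4 * math.pi * distance_m**2)  # W per square metre at the dish
    return flux * dish_area_m2

# Hypothetical example: a 1e20 W radio source 1000 light years away,
# observed with a 100 m^2 collecting area.
LIGHT_YEAR_M = 9.461e15
p = received_power(1e20, 1000 * LIGHT_YEAR_M, 100.0)
print(f"Received power: {p:.3e} W")  # on the order of 1e-17 W: tiny, but within reach of sensitive receivers
```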
 
  • #3
Baluncore said:
Welcome to PF.

As an EM wave radiates out into space, the energy is distributed over the surface of a radially growing sphere. That causes the EM wave energy density to be reduced by the inverse square law.

The frequency of the EM wave remains the same.

The signals detected from space start out with enormous power. After the inverse square law has been applied, enough energy still enters the aperture of the telescope for the signals to be detected.
Thank you for the answer. I was actually thinking almost the same thing, but I wanted to make sure I was right.
 
