Does the use of microwave communication heat water particles?

  • Thread starter: ronaldcaius
  • Tags: Microwave
  • #1
ronaldcaius
TL;DR Summary
Have there been studies that measure the heat gain of airborne water particles in the path of microwave communication systems?
As microwave communication systems send designated frequencies through the atmosphere, are water particles within the path of these systems agitated enough to increase their temperature?
 
  • #2
renormalize
ronaldcaius said:
TL;DR Summary: Have there been studies that measure the heat gain of airborne water particles in the path of microwave communication systems?

As microwave communication systems send designated frequencies through the atmosphere, are water particles within the path of these systems agitated enough to increase their temperature?
Let me quote from the following PF post: https://www.physicsforums.com/threa...y-high-in-this-one-room.1053147/#post-6904755
"Even at ##100\text{% RH}##, the attenuation at ##2.45## and ##5\text{ GHz}## is less than ##10^{-2}\text{ dB/km}## ..."
The propagation loss of microwaves is due both to scattering and absorption (heating). So even if a 1 kilowatt microwave beam could somehow avoid scattering and attenuate only by the heating of atmospheric moisture, only about 3 watts at most would be absorbed by that moisture in a 1 kilometer distance. Bottom line: there is negligible heat gain by atmospheric water even from high-power microwave propagation.
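For anyone who wants to check that estimate, here is a minimal sketch of the standard dB-attenuation arithmetic, using the ##1\text{ kW}##, ##1\text{ km}##, and ##10^{-2}\text{ dB/km}## figures above:

```python
# Power absorbed from a 1 kW beam over 1 km at 1e-2 dB/km attenuation,
# assuming (worst case) that all of the loss goes into heating moisture.
P0 = 1000.0    # transmit power, W
alpha = 1e-2   # attenuation, dB/km
d = 1.0        # path length, km

P_left = P0 * 10 ** (-alpha * d / 10)  # standard dB attenuation law
print(f"absorbed over {d:.0f} km: {P0 - P_left:.2f} W")  # ~2.3 W
```

That works out to roughly 2.3 W per kilometre, consistent with the "3 watts at most" bound above.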
 
  • Like
  • Informative
Likes DaveE, russ_watters and berkeman
  • #3
renormalize said:
Bottom line: there is negligible heat gain by atmospheric water even from high-power microwave propagation.
Interesting. I did not know it was that low, but it makes sense. Thanks.
 
  • #5
sophiecentaur
Baluncore said:
The authors need to specify temperature or absolute humidity, or they don't know what they are doing.
Losses in a microwave link can vary, but even ten times the quoted value would not involve significant thermal dissipation into a 'cone' of radiated RF power - except right next to the dish. That quoted PF post was quite appropriate when you bear in mind that microwave transmitters are relatively low power. This link quotes 1.6 kW for a link transferring usable RF power with the intention of minimising link losses, of course. That ain't going to warm up anything much on the way.
 
  • #6
Baluncore
sophiecentaur said:
That ain't going to warm up anything much on the way.
So where does the lost microwave energy go in the atmosphere, if it is not into heating dissolved water molecules, mist, raindrops, or snow?

I am not saying you could make tea with it, or even clear the fog at an airport, but the use of microwaves must increase the average temperature of the water "particles" en route.
 
  • Like
Likes tech99
  • #7
sophiecentaur
Baluncore said:
So where does the lost microwave energy go in the atmosphere,

I was not challenging the concept of Energy Conservation - just addressing the actual order of magnitude of any effect. Hence the choice of the phrase "warm up", rather than temperature rise. Ignoring the unimportant is standard practice for Engineers. You are well aware of how hard it is to transport (not transmit) useful energy via RF.
 
  • Like
Likes DaveE
  • #8
Baluncore
sophiecentaur said:
You are well aware of how hard it is to transport (not transmit) useful energy via RF.
It is not hard at all. If you want to transport microwave energy, you would not radiate it, you would use a waveguide. Think inside the box.
 
  • Wow
Likes Vanadium 50
  • #9
sophiecentaur
Baluncore said:
you would use a waveguide
To the next mountain? Perhaps I should have helped you all by including the words "free space". That's the only time there's enough power transmitted to knock the skin off a rice pudding.
 
  • #10
tech99
It is proposed from time to time to use microwaves to transfer energy from a solar array in space down to the Earth's surface.
Incidentally, a typical microwave link uses transmitter powers of just a few watts, not kilowatts, and the antenna gain does not give an increase in energy dissipated in the atmosphere. I should also mention that rain has little effect at frequencies lower than about 10 GHz due to the size of the droplets.
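To put those few watts in context, here is the same attenuation arithmetic for a whole hop; the 5 W transmit power and 50 km path length are illustrative assumptions, with the worst-case ##10^{-2}\text{ dB/km}## figure quoted in post #2:

```python
# Total power lost by a typical few-watt link across a whole hop.
P0 = 5.0       # transmit power, W (illustrative assumption)
alpha = 1e-2   # attenuation, dB/km (worst case from post #2)
d = 50.0       # hop length, km (illustrative assumption)

P_left = P0 * 10 ** (-alpha * d / 10)
print(f"lost over {d:.0f} km: {P0 - P_left:.2f} W")  # ~0.54 W
```

About half a watt in total, and that is the combined scattering and absorption loss, spread along the entire beam volume.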
 
  • Like
Likes berkeman
  • #11
tech99 said:
It is proposed from time to time to use microwaves to transfer energy from a solar array in space down to the Earth's surface.
This link has a load of examples of experimental power transmission from space, but the overall efficiency of such systems is never very high. The problem is that once your efficiency drops below 50%, you should consider using Earth-based systems instead: just suck up the fact that they only work for half the time, and position them around the equator at low latitudes. Transmission by DC takes a lot of beating, and it's pretty safe.
 
  • Like
Likes tech99
  • #12
Vanadium 50
tech99 said:
It is proposed from time to time to use microwaves to transfer energy from a solar array in space down to the Earth's surface.
Not by anyone with a lick of sense.

The ISS averages ~100 kW at $3B/year. That is 20,000x more expensive than I pay for electricity. Not 20x. 20,000x.
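A rough sketch of that arithmetic; the ~$0.15/kWh retail price here is an assumed figure, since the post does not state a local rate:

```python
# Effective price of ISS electricity vs. an assumed retail rate.
iss_power_kw = 100.0      # average ISS electrical power, kW (from the post)
iss_cost = 3e9            # operating cost, $/year (from the post)
retail = 0.15             # retail price, $/kWh (assumed)

kwh = iss_power_kw * 24 * 365        # ~876,000 kWh/year
price = iss_cost / kwh               # ~$3,400/kWh
print(f"ISS: ${price:,.0f}/kWh, {price / retail:,.0f}x retail")  # ~20,000x
print(f"value of output: ${kwh * retail:,.0f}/yr")  # ~$130K/yr, cf. post #14
```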

This idea isn't extrapolating. It's pretending.
 
  • Like
Likes russ_watters
  • #13
Vanadium 50 said:
Not by anyone with a lick of sense.

The ISS averages ~100 kW at $3B/year. That is 20,000x more expensive than I pay for electricity. Not 20x. 20,000x.

This idea isn't extrapolating. It's pretending.
Using a manned space station as the cost estimate for energy production is a straw man argument.
 
  • #14
Vanadium 50
Pick some other thing then. You are not looking to save a factor of 2 - you're looking to save a factor of 20,000. Put another way, an ISS-sized unit generates $100K of power per year. Can you build, launch, and operate it for that?

Putting solar panels in space gets you maybe a factor of 4 in power. It costs at least a factor of 1000 more than leaving it on the ground.

This idea isn't extrapolating. It's pretending.
 
  • Like
Likes russ_watters