Impavid
Okay, so I am a first-time poster; hello to my fellow scientists.
I need an expert opinion on a simple question I can't get around. I understand that carbon dioxide absorbs more energy from whatever radiation it encounters than, say, oxygen does. With that in mind, I picture the light (and other radiation/particles) entering the atmosphere, striking a CO2 molecule and transferring heat to it, then hitting the ground and transferring heat again, then heading back up and hitting another CO2 molecule, transferring heat once more, and MAYBE, by chance, getting reflected, so the overall efficiency of absorption is increased... however...
If a molecule 10 miles up in the atmosphere absorbs more heat, we have to remember that energy is conserved: whatever energy the CO2 takes in is energy the light no longer carries. So by the time the light hits the ground, it has less energy than it would have had if it had never touched any CO2. After passing through a greenhouse atmosphere, what gets through is weakened, and the ground itself absorbs less heat. Now here's where it all comes together...
Maybe CO2 increases the total amount of heat the Earth absorbs when we tally the numbers, but wouldn't heat transferred to a molecule 10 miles above the surface do virtually nothing to heat the material 10 miles below it? Wouldn't it be better for warming the Earth, simply because of the insulation of miles of atmosphere, for the light to deposit the bulk of its energy as deep under the Earth's skin as possible? (That way the heat actually has to escape through the atoms around it, which is difficult high in the atmosphere because the molecules there are so incredibly far apart by comparison.)
Therefore, wouldn't CO2 cause global cooling??
Someone enlighten me. Much appreciated.
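Edit: to put rough numbers on the kind of bookkeeping I'm trying to do, here's a toy one-layer energy-balance sketch, nothing authoritative; the ~30% albedo, the "half up / half down" re-emission, and the eps values are just assumptions I picked so the arithmetic works out.

```python
# Toy single-layer "gray atmosphere" energy balance, back-of-envelope only.
# Assumptions (mine, for illustration, not measured properties of the real atmosphere):
#   - sunlight passes straight through the layer and is absorbed by the ground
#   - the layer absorbs a fraction eps of the infrared the ground radiates upward
#   - the layer re-emits whatever it absorbs, half upward and half downward
# Balancing energy at the top of the layer and at the surface gives:
#   sigma * Ts**4 = S_abs / (1 - eps / 2)

SIGMA = 5.67e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
S_ABS = 342.0 * (1 - 0.3)    # absorbed sunlight per m^2, assuming ~30% albedo

def surface_temp(eps: float) -> float:
    """Surface temperature (K) when the layer absorbs fraction eps of surface infrared."""
    return (S_ABS / (SIGMA * (1 - eps / 2))) ** 0.25

for eps in (0.0, 0.5, 0.8):
    print(f"eps = {eps:.1f}  ->  surface ~ {surface_temp(eps):.0f} K")

# Prints roughly:
#   eps = 0.0  ->  surface ~ 255 K   (no absorbing layer)
#   eps = 0.8  ->  surface ~ 290 K   (layer absorbs most of the surface's infrared)
```

In this toy setup the absorbing layer ends up making the surface warmer, not cooler, because half of what it absorbs gets radiated back down toward the ground. That's exactly the piece I can't square with my attenuation argument above, so that's probably where someone needs to set me straight.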