RichMacf
It takes energy to compress an ideal gas, and that same amount of energy is output as heat. But we also gain the potential energy of the compressed gas. It seems to me that we are doubling up our energy.
Can anybody explain in simple, practical terms (to a humble mech engineer) how this works? It’s like having our cake and eating it! I know the theory and calculations, but I still can’t get my brain around the principles.
Isothermal and adiabatic compression both produce heat, but we remove it either during the compression or by cooling the hot gas afterwards. If our motor were driving a simple friction brake we would just get heat, but if it’s driving a compressor we get the heat and the potential energy.

I’m working on compressed air energy storage, and these are my calculations. For 1 kg of air, isothermal compression to 100 bar takes 387 kJ of work, and the same 387 kJ of heat is output. For adiabatic compression the work is 573 kJ, and the intercooler would take out 573 kJ in cooling the gas from 1092 K back to ambient.
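For reference, here is a rough sketch of the arithmetic behind those figures, assuming ideal-gas air (R = 287 J/kg·K, cv = 718 J/kg·K, γ = 1.4), an ambient temperature of about 293 K, and cooling the stored gas back to ambient at constant volume:

```python
# Rough check of the figures above (ideal-gas air; values are approximate).
import math

R = 287.0        # specific gas constant of air, J/(kg*K)
cv = 718.0       # specific heat at constant volume, J/(kg*K)
gamma = 1.4      # ratio of specific heats
T1 = 293.0       # assumed ambient temperature, K
p_ratio = 100.0  # compression from 1 bar to 100 bar
m = 1.0          # mass of air, kg

# Isothermal compression: work in equals heat rejected.
W_iso = m * R * T1 * math.log(p_ratio)            # ~387 kJ

# Adiabatic compression: outlet temperature, then work = m * cv * dT.
T2 = T1 * p_ratio ** ((gamma - 1.0) / gamma)      # ~1092 K
W_ad = m * cv * (T2 - T1)                         # ~573 kJ

# Heat removed cooling the stored gas back to ambient at constant volume.
Q_cool = m * cv * (T2 - T1)                       # ~573 kJ

print(f"Isothermal work / heat out: {W_iso / 1e3:.0f} kJ")
print(f"Adiabatic outlet temp:      {T2:.0f} K")
print(f"Adiabatic work:             {W_ad / 1e3:.0f} kJ")
print(f"Heat removed in cooling:    {Q_cool / 1e3:.0f} kJ")
```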