- #1
Rhannmah
- 17
- 0
I'm under the impression that if you increase the frequency of a circuit coupled to an antenna, then for the same number of photons emitted, the frequency of those photons increases, so they carry more total energy, thus increasing the amount of energy required to power the circuit. Am I wrong in thinking this? How does this relate to the amplitude of the electromagnetic wave?
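To make the premise concrete, here is a minimal sketch of the arithmetic behind it, using the photon energy relation E = hf. The 1 W radiated power and the two frequencies are arbitrary example values, not anything from a specific circuit:

```python
# Minimal sketch: photon energy E = h*f, and the photon emission rate
# N = P / (h*f) that a fixed radiated power P implies.
# The 1 W power and the two frequencies are assumed example values.

h = 6.626e-34  # Planck's constant, J*s
P = 1.0        # assumed radiated power, W

for f in (100e6, 1e9):  # example frequencies: 100 MHz and 1 GHz
    E_photon = h * f     # energy per photon, J
    N = P / E_photon     # photons emitted per second at fixed power P
    print(f"f = {f:.3e} Hz: E_photon = {E_photon:.3e} J, "
          f"rate = {N:.3e} photons/s")
```

As the numbers show, at a fixed radiated power the energy per photon goes up with frequency while the photon emission rate goes down in proportion, which is the trade-off the question is asking about.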
Are there any counter-examples to this in radio or other telecommunication circuits where increasing the frequency does not require extra energy? If so, how is this reconciled with the fact that higher frequency photons carry more energy?
Thanks in advance.