This question is about the transmission line that I use between my ham radio and the antenna.
We have a transmission line made of two parallel conductors close together. The current in one conductor is exactly the same as the current in the other conductor, but flows in the opposite direction. The transmission line's impedance is 50 ohms and it is terminated in a 50 ohm load. Every text says that the fields of the two conductors cancel each other, so there is no radiation when observed at some distance d that is much larger than the separation between the conductors.
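If I try to make the cancellation concrete by treating a short segment of the two conductors as two opposite-phase point sources (my own simplification, with $s$ the conductor spacing, $k = 2\pi/\lambda$ the wavenumber, and $\theta$ the angle from the line joining the conductors), the far field would superpose as:

$$E \propto \frac{e^{-jkr_1}}{r} - \frac{e^{-jkr_2}}{r}, \qquad r_{1,2} \approx r \mp \frac{s}{2}\cos\theta$$

$$E \propto \frac{e^{-jkr}}{r}\, 2j \sin\!\left(\frac{k s \cos\theta}{2}\right) \approx \frac{e^{-jkr}}{r}\, j k s \cos\theta \quad (k s \ll 1)$$

By this sketch the cancellation is never exact: a residual field of order $2\pi s/\lambda$ survives relative to a single source, so the radiated power scales as $(2\pi s/\lambda)^2$, which is tiny when the spacing is a small fraction of a wavelength.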
That sounds logical, but we still have two wave sources, one from each wire. Even if their fields cancel each other at a distance, each wire must still lose power to radiation.
So why doesn't the transmission line radiate all the power away?
There are no standing waves on the transmission line. Do we have to have standing waves in order to radiate anything? If so, and we left the transmission line unterminated, would it radiate, since standing waves would then form on it?